Softmax

{{#addbodyclass:tag_math}}
{{stub}}


'''softmax''' (sometimes called softargmax {{comment|(except it's not really like [[argmax]])}}, normalized exponential function, and other things)
* takes a vector of numbers  
:: (any scale)
* returns a same-length vector of probabilities
:: all in 0 .. 1
:: that sum to 1.0
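Concretely, the usual definition is <math>\operatorname{softmax}(x)_i = \frac{e^{x_i}}{\sum_j e^{x_j}}</math>. A minimal sketch in plain Python (not from any particular library; subtracting the maximum is the common trick to keep <code>exp()</code> from overflowing, and does not change the result):

<syntaxhighlight lang="python">
import math

def softmax(xs):
    """Map a vector of arbitrary real numbers to probabilities in 0..1 that sum to 1.0."""
    m = max(xs)                                # shift by the max for numerical stability
    exps = [math.exp(x - m) for x in xs]       # exponentiate each (shifted) entry
    total = sum(exps)
    return [e / total for e in exps]           # normalize the exponentiated values

print(softmax([1.0, 2.0, 3.0]))   # ~[0.090, 0.245, 0.665], sums to 1.0
</syntaxhighlight>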




Note that it is ''not'' just normalization, nor is it just a way to bring out the strongest answer.
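As a rough illustration of the difference (a sketch under the same assumptions as above, not library code): plain normalization just divides by the sum, while softmax exponentiates first, which exaggerates differences and ignores a constant shift of all the inputs.

<syntaxhighlight lang="python">
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def normalize(xs):
    # plain normalization: divide by the sum (only sensible here for positive inputs)
    total = sum(xs)
    return [x / total for x in xs]

print(normalize([1.0, 2.0, 3.0]))        # [0.167, 0.333, 0.5]
print(softmax([1.0, 2.0, 3.0]))          # ~[0.090, 0.245, 0.665]  -- differences are exaggerated
print(softmax([101.0, 102.0, 103.0]))    # same as the line above: a constant shift changes nothing
print(normalize([101.0, 102.0, 103.0]))  # ~[0.330, 0.333, 0.337]  -- plain normalization flattens out
</syntaxhighlight>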



https://en.wikipedia.org/wiki/Softmax_function