Softmax
Latest revision as of 23:15, 21 April 2024
✎ This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.
softmax (sometimes called softargmax (though it is not really like argmax), normalized exponential function, and other names)
- takes a vector of numbers
- (any scale)
- returns a same-length vector of probabilities
- all in 0 .. 1
- that sum to 1.0
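The standard definition is softmax(x)_i = exp(x_i) / Σ_j exp(x_j). A minimal Python sketch of the above (the function name and the max-subtraction stability trick are illustrative choices, not from this article):

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating: exp() of large inputs
    # overflows, and shifting all inputs by a constant does not
    # change the result (it cancels in the normalization).
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Any scale of input works; outputs are probabilities in 0..1
# that sum to 1.0, with larger inputs getting larger shares.
probs = softmax([1.0, 2.0, 3.0])
```

Note that softmax preserves the ordering of the inputs, so the largest input always gets the largest probability.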