# softmax Algorithm

In mathematics, the softmax function, also known as softargmax or the normalized exponential function, takes as input a vector of K real numbers and normalizes it into a probability distribution consisting of K probabilities proportional to the exponentials of the input numbers; larger input components therefore correspond to larger probabilities. We are concerned with feed-forward non-linear networks (multi-layer perceptrons, or MLPs) with multiple outputs. The use of the softmax in decision theory is credited to Luce (1959), who used the axiom of independence of irrelevant alternatives in rational choice theory to derive the softmax as a model of relative preferences (Luce's choice axiom). We look for appropriate output non-linearities and appropriate criteria for adapting the parameters of the network (e.g. its weights).
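
The normalization described above can be sketched as follows. This is a minimal illustration, not the repository's official implementation; it assumes NumPy is available, and the function name `softmax` is our own. Subtracting the maximum input before exponentiating is a standard trick for numerical stability and does not change the result, since softmax is invariant to adding a constant to every component.

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability; softmax(z) == softmax(z - c)
    # for any constant c, so the output is unchanged.
    shifted = z - np.max(z)
    exps = np.exp(shifted)
    # Normalize so the K exponentials sum to 1, giving a probability
    # distribution proportional to exp(z_i).
    return exps / np.sum(exps)

probs = softmax(np.array([1.0, 2.0, 3.0]))
print(probs)  # components sum to 1; larger inputs get larger probabilities
```

Note that larger inputs always map to larger probabilities, as the exponential is strictly increasing.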

### softmax source code, pseudocode and analysis

COMING SOON!