ReLU Algorithm

It was first demonstrated in 2011 to enable better training of deeper networks than the activation functions in wide use before then, e.g., the logistic sigmoid (which is inspired by probability theory; see logistic regression) and its more practical counterpart, the hyperbolic tangent. A unit employing the rectifier is also called a rectified linear unit (ReLU). Rectified linear units find applications in computer vision and speech recognition using deep neural nets.
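For concreteness, the rectifier is the function f(x) = max(0, x), applied elementwise to a layer's pre-activations. A minimal sketch in Python (NumPy assumed; the function name relu and the sample values are illustrative, not from the original text):

    import numpy as np

    def relu(x):
        """Rectified linear unit: elementwise max(0, x)."""
        return np.maximum(0, x)

    # Example: negative pre-activations are zeroed, positive ones pass through.
    z = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
    print(relu(z))  # [0.  0.  0.  1.5 3. ]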

Steps and Performance

COMING SOON!