# Functions
CrossEntropy implements a cross-entropy loss function.
CrossEntropySeq calculates the CrossEntropy loss on the given sequence.
Distance is a loss function that calculates the distance between target and x.
FocalLoss implements a variant of the CrossEntropy loss that reduces the loss contribution from "easy" examples and increases the importance of correcting misclassified examples.
MAE measures the mean absolute error (a.k.a. L1 loss) between each element in the input x and target y.

MAESeq calculates the MAE loss on the given sequence.
MSE measures the mean squared error (squared L2 norm) between each element in the input x and target y.
MSESeq calculates the MSE loss on the given sequence.
NLL returns the negative log-likelihood loss of the input x with respect to the target y.
Norm2Quantization is a loss function that is minimized when norm2(x) = 1.
OneHotQuantization is a loss function that pushes the x vector towards being one-hot.
Perplexity computes the perplexity, implemented as the exponential of the cross-entropy.
SPG (Softmax Policy Gradient) is a policy-gradient method used in reinforcement learning.
WeightedCrossEntropy implements a weighted cross-entropy loss function.
WeightedFocalLoss implements the same variant of the cross-entropy loss as FocalLoss, with an additional per-class weight.
ZeroOneQuantization is a loss function that is minimized when each component of x satisfies x(i) ≡ [x]i ∈ {0, 1}.
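The three quantization penalties fit together: a common realization (assumed here, any scaling factor omitted) uses Σᵢ xᵢ²(1-xᵢ)² for the zero-one term, (‖x‖²-1)² for the norm term, and their sum to push x towards one-hot, since a vector whose components are all 0 or 1 with unit squared norm has exactly one component equal to 1.

```go
package main

import "fmt"

// zeroOneQuantization is minimized when every x[i] is exactly 0 or 1:
// sum_i x[i]^2 * (1-x[i])^2.
func zeroOneQuantization(x []float64) float64 {
	sum := 0.0
	for _, v := range x {
		sum += v * v * (1 - v) * (1 - v)
	}
	return sum
}

// norm2Quantization is minimized when norm2(x) = 1: (sum_i x[i]^2 - 1)^2.
func norm2Quantization(x []float64) float64 {
	sq := 0.0
	for _, v := range x {
		sq += v * v
	}
	return (sq - 1) * (sq - 1)
}

// oneHotQuantization combines the two penalties; together they are zero
// only for one-hot vectors.
func oneHotQuantization(x []float64) float64 {
	return zeroOneQuantization(x) + norm2Quantization(x)
}

func main() {
	oneHot := []float64{0, 1, 0}
	soft := []float64{0.3, 0.6, 0.1}
	fmt.Println(oneHotQuantization(oneHot), oneHotQuantization(soft))
}
```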