Categorical Cross Entropy loss vs Sparse Categorical Cross Entropy loss vs Kullback-Leibler Divergence loss

Introduction

How do we say whether a model is good or bad? We need a measure of performance to assess the model. Depending on the context, this measure is either minimized or maximized to optimize the model. A measure that is minimized to optimize the model is called a "loss". Several factors, such as the type of data and the learning algorithm used, come into play when choosing a loss function. In this post, let's explore a few loss functions that are based on entropy.

Cross entropy

Entropy is a measure of the "uncertainty" in the possible outcomes of a random variable. In the context of classification, if all the classes are equally probable then the system has high entropy, and vice versa. It is mathematically given as

H(X) = -Σₓ p(x) log p(x)

What does this entropy mean in terms of assessing a model's performance? Let's understand this with the help of the famous CIFAR-10 dataset. This dataset contains 60,000 32x32 colour images belonging to 10 mutually exclusive classes. Here's an example - Unique classes in Dat...
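To make the "equally probable means high entropy" intuition concrete, here is a minimal sketch (the helper name `entropy` and the example distributions are my own, not from the post) that computes H(X) for a uniform distribution over 10 classes, as in CIFAR-10, and for a confident, peaked prediction:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) * log p(x), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Uniform over 10 classes: maximum uncertainty, H = log(10) ≈ 2.3026 nats.
uniform = [0.1] * 10

# A confident prediction with most mass on one class: much lower entropy.
peaked = [0.91] + [0.01] * 9

print(entropy(uniform))  # ≈ 2.3026
print(entropy(peaked))   # noticeably smaller than the uniform case
```

A model whose predicted class distribution looks like `peaked` is far more certain than one whose predictions look like `uniform`, which is exactly the property that entropy-based losses exploit.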