
In machine learning, entropic loss is a specific kind of loss function. A loss function measures the error between a predicted result and the desired result. The entropic loss is built from the logistic function: it measures the error (or "loss") between the true answer and the predicted answer.

A general loss function is defined here:

L_φ(ŷ, y) = ∫ (φ(z) − y) dz, where the bounds of the integral are φ⁻¹(y) (on the bottom) and φ⁻¹(ŷ) (on the top). Here ŷ is the estimated (predicted) result, y is the desired result, and φ is an increasing transfer function.
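As a sketch of this construction (the function names and the midpoint-rule quadrature are my own, not part of the answer), the integral can be evaluated numerically for any invertible transfer function. With the identity transfer function φ(z) = z, it reduces to half the squared error, which is easy to verify:

```python
def matching_loss(phi, phi_inv, y_hat, y, steps=10_000):
    """Numerically integrate (phi(z) - y) dz from phi_inv(y) to phi_inv(y_hat)
    using the midpoint rule."""
    a, b = phi_inv(y), phi_inv(y_hat)
    h = (b - a) / steps
    return sum((phi(a + (i + 0.5) * h) - y) * h for i in range(steps))

# Identity transfer function: integral of (z - y) dz from y to y_hat
# equals (y_hat - y)^2 / 2, i.e. half the squared error.
loss = matching_loss(lambda z: z, lambda p: p, 0.8, 0.2)
```

Here `loss` should come out to (0.8 − 0.2)² / 2 = 0.18 up to quadrature error.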

The entropic loss function is defined here:

L_φ(ŷ, y) = ∫ (1/(1 + exp(−z)) − y) dz, with the same bounds φ⁻¹(y) and φ⁻¹(ŷ), where now φ⁻¹ is the logit function ln(p/(1 − p)). Evaluating this integral gives the relative entropy y ln(y/ŷ) + (1 − y) ln((1 − y)/(1 − ŷ)), which is where the name "entropic loss" comes from.

As you can see, the entropic loss is a specific instance of the general loss function: all that has been done is to choose the transfer function φ(z) to be the logistic function 1/(1 + exp(−z)).
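A minimal numeric check of this instantiation (function names are mine): integrating the logistic integrand between logit(y) and logit(ŷ) agrees with the closed-form relative entropy given above.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def logit(p):
    return math.log(p / (1.0 - p))

def entropic_loss(y_hat, y, steps=20_000):
    """Midpoint-rule integral of (sigmoid(z) - y) dz from logit(y) to logit(y_hat)."""
    a, b = logit(y), logit(y_hat)
    h = (b - a) / steps
    return sum((sigmoid(a + (i + 0.5) * h) - y) * h for i in range(steps))

def relative_entropy(y_hat, y):
    """Closed form: y*ln(y/y_hat) + (1-y)*ln((1-y)/(1-y_hat))."""
    return y * math.log(y / y_hat) + (1 - y) * math.log((1 - y) / (1 - y_hat))

diff = abs(entropic_loss(0.9, 0.3) - relative_entropy(0.9, 0.3))
```

The two values should agree to well within quadrature error, so `diff` is tiny.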

The logistic function is a common choice because the resulting loss is a convex function of the prediction, and therefore has only one local minimum (which is also the global minimum). When we minimize the loss, we want a single minimum so that setting the derivative to zero is guaranteed to find what we want.
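The single-minimum claim can be illustrated with a coarse scan over predictions (the grid and names here are my own sketch): the entropic loss, in its closed relative-entropy form, should bottom out exactly at ŷ = y and have non-negative second differences everywhere on the grid.

```python
import math

def relative_entropy(y_hat, y):
    """Entropic loss in closed form."""
    return y * math.log(y / y_hat) + (1 - y) * math.log((1 - y) / (1 - y_hat))

y = 0.3
grid = [i / 1000 for i in range(1, 1000)]          # y_hat values in (0, 1)
losses = [relative_entropy(p, y) for p in grid]

best = grid[losses.index(min(losses))]              # location of the minimum
# Discrete convexity check: second differences should be non-negative.
convex = all(losses[i - 1] - 2 * losses[i] + losses[i + 1] >= 0
             for i in range(1, len(losses) - 1))
```

The scan finds its minimum at the true value y = 0.3, and `convex` confirms there is no other dip.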


Wiki User

13y ago


Q: What does entropic loss mean?