In machine learning, entropic loss is a specific kind of loss function. A loss function measures the error between a model's predicted results and the desired (true) results. Entropic loss uses the logistic function to measure the error (or "loss") between the true answer and the predicted answer.

A general loss function is defined here:

Lφ(yhat, y) = ∫ (φ(z) − y) dz, where the bounds of the integral are φ⁻¹(y) (on the bottom) and φ⁻¹(yhat) (on the top). Here yhat is the estimated (predicted) result, y is the desired result, and φ is the transfer function that maps the model's raw output z to a prediction.
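This definition can be sketched directly in Python (a minimal sketch; the function names and the midpoint-rule integrator are mine). As a sanity check, choosing the identity transfer φ(z) = z makes the integral ∫ from y to yhat of (z − y) dz, which works out to half the squared error:

```python
import math

def matching_loss(phi, phi_inv, yhat, y, steps=100_000):
    # L_phi(yhat, y) = integral from phi_inv(y) to phi_inv(yhat) of (phi(z) - y) dz,
    # approximated with the midpoint rule.
    a, b = phi_inv(y), phi_inv(yhat)
    h = (b - a) / steps
    return sum((phi(a + (i + 0.5) * h) - y) * h for i in range(steps))

# Identity transfer: phi(z) = z, phi_inv(p) = p.
# Integral from 0.3 to 0.9 of (z - 0.3) dz = (0.9 - 0.3)**2 / 2 = 0.18.
print(matching_loss(lambda z: z, lambda p: p, 0.9, 0.3))  # ≈ 0.18
```

The same `matching_loss` function accepts any invertible transfer φ, which is what makes the definition general.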

The entropic loss function is defined here:

Lφ(yhat, y) = ∫ (1/(1 + exp(−z)) − y) dz, with the same bounds: from φ⁻¹(y) (on the bottom) to φ⁻¹(yhat) (on the top), where φ⁻¹ is now the logit function, φ⁻¹(p) = ln(p/(1 − p)).

As you can see, the entropic loss is a specific instance of the general loss function: all I've done is choose the transfer function φ(z) to be the logistic function 1/(1 + exp(−z)).
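A short Python check of this instance (a sketch; function names are mine). Carrying out the integration by hand gives a closed form, the binary relative entropy y·ln(y/yhat) + (1−y)·ln((1−y)/(1−yhat)), which is where the name "entropic" comes from; the numeric integral and the closed form should agree:

```python
import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def logit(p):
    # Inverse of the logistic function: logit(logistic(z)) == z.
    return math.log(p / (1.0 - p))

def entropic_loss_numeric(yhat, y, steps=100_000):
    # Integrate (logistic(z) - y) dz from logit(y) to logit(yhat), midpoint rule.
    a, b = logit(y), logit(yhat)
    h = (b - a) / steps
    return sum((logistic(a + (i + 0.5) * h) - y) * h for i in range(steps))

def entropic_loss_closed(yhat, y):
    # Binary relative entropy between Bernoulli(y) and Bernoulli(yhat).
    return y * math.log(y / yhat) + (1 - y) * math.log((1 - y) / (1 - yhat))

print(entropic_loss_numeric(0.9, 0.3))
print(entropic_loss_closed(0.9, 0.3))
```

Both values agree to many decimal places, confirming that the integral definition and the entropy form describe the same loss.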

The logistic function is a common choice because the resulting loss is a convex function of the prediction, and therefore has only one local minimum (which is therefore also the global minimum). When we try to minimize loss, we want only one minimum so that when we set the derivative to zero, we know we've found what we want.
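This single minimum lands exactly where we want it, at yhat = y. A quick numerical illustration (a sketch; the grid search and names are mine, and the loss is written in its binary relative entropy form, assumed equal to the integral above):

```python
import math

def entropic_loss(yhat, y):
    # Entropic loss in closed form: relative entropy between Bernoulli(y)
    # and Bernoulli(yhat). Nonnegative, and zero only when yhat == y.
    return y * math.log(y / yhat) + (1 - y) * math.log((1 - y) / (1 - yhat))

y = 0.3
grid = [i / 1000 for i in range(1, 1000)]  # candidate predictions in (0, 1)
best = min(grid, key=lambda yhat: entropic_loss(yhat, y))
print(best)  # the minimizer sits at yhat = y = 0.3
```

Because the loss is convex in yhat, gradient-based minimization converges to this same point rather than getting stuck elsewhere.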

Wiki User

13y ago

Q: What does entropic loss mean?