
Loss function computation problem #4

Open
speeding-motor opened this issue Aug 7, 2019 · 1 comment

@speeding-motor

Hi, when I read your code, I got confused here, in the ListNet loss function:

def get_loss(self, x_t, y_t):
    # ---- start loss calculation ----
    ...
    # top-1 probabilities of the ground truth: softmax over the relevance labels
    p_true = F.softmax(F.reshape(y_t, (y_t.shape[0], y_t.shape[1])))
    # per-row statistics of the predicted scores
    xm = F.max(pred, axis=1, keepdims=True)
    logsumexp = F.logsumexp(pred, axis=1)
    logsumexp = F.broadcast_to(logsumexp, (xm.shape[0], pred.shape[1]))
    # loss summed over all queries and documents
    loss = -1 * F.sum(p_true * (pred - logsumexp))
    ...

Here you use p_true * (pred - logsumexp). Why compute the loss like this? You don't exactly follow the paper's loss function, right?
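
For comparison, this is what I expected from the paper's definition of the top-1 ListNet loss, L = -sum_j P_true(j) * log P_pred(j). It is just a minimal NumPy sketch of my own reading (the function name, the toy scores, and the labels below are made up for illustration, not taken from your repository):

import numpy as np

def listnet_top1_loss(pred, y_true):
    # Top-1 ListNet cross entropy as I read it from the paper:
    # L = -sum_j P_true(j) * log(P_pred(j)),
    # where both distributions are softmaxes over the documents of a query.
    def softmax(x):
        x = x - x.max(axis=1, keepdims=True)  # subtract row max for numerical stability
        e = np.exp(x)
        return e / e.sum(axis=1, keepdims=True)

    p_true = softmax(y_true)
    p_pred = softmax(pred)
    return -np.sum(p_true * np.log(p_pred))

# toy example: 2 queries with 3 documents each (made-up numbers)
pred = np.array([[0.5, 1.2, -0.3], [2.0, 0.1, 0.4]])
y_true = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
print(listnet_top1_loss(pred, y_true))

Is the pred - logsumexp form in your code supposed to be equivalent to this?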

@speeding-motor
Author


But why do you calculate the loss function in this way? Does it work well?
