I think there is no need to put a LogSoftmax layer at the end of the network if you are going to use CrossEntropyLoss, because CrossEntropyLoss combines nn.LogSoftmax() and nn.NLLLoss() in one single class.
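This equivalence can be checked numerically; a minimal sketch, assuming PyTorch is installed (the tensor shapes here are illustrative, not taken from the repo):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 10)          # raw network outputs, no LogSoftmax applied
target = torch.tensor([1, 3, 5, 7])  # class indices for a batch of 4

# CrossEntropyLoss applied directly to raw logits ...
ce = nn.CrossEntropyLoss()(logits, target)

# ... equals the two-step version: LogSoftmax followed by NLLLoss.
nll = nn.NLLLoss()(F.log_softmax(logits, dim=1), target)

assert torch.allclose(ce, nll)
```

So if the network already ends in LogSoftmax, feeding its output into CrossEntropyLoss would apply the log-softmax twice.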
@Frusciante @activatedgeek I think so. But I don't understand this:

x = self.c2_1(output)
output = self.c2_2(output)
output += x
It seems that the C3 layer is not implemented this way.
LeNet-5/lenet.py, line 37 (commit b9d7f39)
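For reference, the three quoted lines run two parallel branches over the same input and sum them element-wise, which is not the sparse C3 connection table from LeCun's original paper. A minimal sketch of that pattern, with hypothetical stand-ins for c2_1 and c2_2 (channel counts and kernel sizes are assumptions, not taken from lenet.py):

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the repo's c2_1 / c2_2 sub-layers.
c2_1 = nn.Conv2d(6, 16, kernel_size=5, padding=2)
c2_2 = nn.Conv2d(6, 16, kernel_size=5, padding=2)

inp = torch.randn(1, 6, 14, 14)  # feature map entering the block

# The quoted pattern: both branches see the SAME input, and the
# results are summed, so the two branch outputs must match in shape.
x = c2_1(inp)
out = c2_2(inp)
out = out + x
```

For the in-place `output += x` in the quoted code to work, both branch outputs must have identical channel counts and spatial dimensions.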