Why does AlexNet train only layers 4 and 5 instead of the earlier layers? #354
-
Hi, I have a question: when loading the pretrained model, AlexNet only trains layers 4 and 5. Why aren't the earlier layers trained as well — is there a particular reason for this?
Answered by
ZhiyuanChen
May 2, 2020
Replies: 1 comment
-
You can, but there's no need to. The early convolutional layers of a pretrained AlexNet capture generic low-level features (edges, textures) that transfer well across tasks, so fine-tuning them adds compute for little gain.
Answer selected by
ZhiyuanChen