Unit 4.5 - Is the activation function necessary for using MLP as a regression model? #114

Hi there,

here I meant removing the activation function after the last layer. Otherwise, you would restrict the range of the model's outputs (a sigmoid, for example, would squash all predictions into (0, 1), which is a problem if the targets can take arbitrary values). So, I wouldn't remove any of the ReLU functions in the middle of the network.
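
To make this concrete, here is a minimal PyTorch sketch (the class name, layer sizes, and feature dimension are just placeholders, not the course code): the hidden layers keep their ReLU activations, but the final `Linear` layer has no activation, so the regression output is not squashed into a fixed range.

```python
import torch
import torch.nn as nn


class MLPRegressor(nn.Module):
    """Hypothetical MLP for regression: ReLU in hidden layers, no output activation."""

    def __init__(self, num_features, hidden_dim=32):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(num_features, hidden_dim),
            nn.ReLU(),                    # keep ReLU between hidden layers
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),     # no activation here: output can be any real number
        )

    def forward(self, x):
        return self.layers(x)


if __name__ == "__main__":
    model = MLPRegressor(num_features=4)
    x = torch.randn(8, 4)        # a batch of 8 examples with 4 features each
    y_pred = model(x)            # shape (8, 1); values are not restricted to a fixed range
    print(y_pred.shape)
```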

Answer selected by melrefaey