
activation function option #10

Open
targos opened this issue Jul 7, 2017 · 2 comments
targos commented Jul 7, 2017

Currently we can only use the same activation function for all layers. Is that normal for an FNN, or is the option just missing?

jajoe commented Jul 18, 2017

The option is missing, but from what I've seen the same activation function is generally used for all layers in an FNN (that's not the case with other types of neural networks), so it's not a huge problem for the FNN.
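For context, a per-layer activation option would mean each layer carries its own function through the forward pass. A minimal sketch (not the library's actual API; all names here are assumptions):

```javascript
// Hypothetical per-layer activation sketch (not ml.js's real API):
// each layer object names its own activation instead of the whole
// network sharing one.
const activations = {
  tanh: (x) => Math.tanh(x),
  identity: (x) => x,
};

// One dense layer: weight matrix, bias vector, activation name.
function forwardLayer(input, { weights, biases, activation }) {
  const act = activations[activation];
  return weights.map((row, i) => {
    const sum = row.reduce((acc, w, j) => acc + w * input[j], biases[i]);
    return act(sum);
  });
}

// Network forward pass: each layer may use a different activation.
function forward(input, layers) {
  return layers.reduce((x, layer) => forwardLayer(x, layer), input);
}

// Example: a single 2-input layer with the identity activation.
const layers = [{ weights: [[1, 1]], biases: [0], activation: 'identity' }];
console.log(forward([2, 3], layers)); // → [ 5 ]
```

The backward pass would need the matching derivative per layer, which is where most of the implementation effort would go.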

yawetse commented Feb 21, 2018

@targos could you add the feature to specify a different activation function for the output layer?
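One hedged way this could surface in the options object (option names here are assumptions, not the library's API): a network-wide `activation` for hidden layers plus an `outputActivation` override.

```javascript
// Hypothetical option resolution (names are assumptions, not ml.js's API):
// a global `activation` applies to all layers, and `outputActivation`
// overrides it for the last layer only.
function resolveActivations(numLayers, options = {}) {
  const { activation = 'tanh', outputActivation = activation } = options;
  return Array.from({ length: numLayers }, (_, i) =>
    i === numLayers - 1 ? outputActivation : activation
  );
}

// Example: ReLU hidden layers, linear output (common for regression).
console.log(resolveActivations(3, { activation: 'relu', outputActivation: 'identity' }));
// → [ 'relu', 'relu', 'identity' ]
```

Defaulting `outputActivation` to `activation` keeps the change backward-compatible with the current single-activation behavior.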

@maasencioh maasencioh self-assigned this Feb 21, 2018