add relu in nnsetup
stone8oy committed Apr 24, 2015
1 parent e7fdd58 commit 4758895
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion NN/nnsetup.m
@@ -6,7 +6,8 @@
 nn.size = architecture;
 nn.n = numel(nn.size);
 
-nn.activation_function = 'tanh_opt'; % Activation functions of hidden layers: 'sigm' (sigmoid) or 'tanh_opt' (optimal tanh).
+nn.activation_function = 'tanh_opt'; % Activation functions of hidden layers: 'sigm' (sigmoid) or 'tanh_opt' (optimal tanh) or 'relu'.
+nn.neg_slope = 0; % negative slope for leaky relu (0 gives a standard relu)
 nn.learningRate = 2; % learning rate Note: typically needs to be lower when using 'sigm' activation function and non-normalized inputs.
 nn.momentum = 0.5; % Momentum
 nn.scaling_learningRate = 1; % Scaling factor for the learning rate (each epoch)
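For context, here is a minimal sketch (not part of this commit) of how the new 'relu' option and nn.neg_slope might be consumed in the feedforward pass. The helper name activate is hypothetical; nn.neg_slope is assumed to be the slope applied to negative pre-activations, so 0 reduces to a plain ReLU and a small positive value (e.g. 0.01) gives a leaky ReLU. The sigm and tanh_opt branches follow the toolbox's existing definitions.

    function a = activate(z, fun, neg_slope)
    % Sketch only: apply a hidden-layer activation to pre-activation z.
    % neg_slope is consulted by the 'relu' branch only.
    switch fun
        case 'sigm'
            a = 1 ./ (1 + exp(-z));            % logistic sigmoid
        case 'tanh_opt'
            a = 1.7159 * tanh(2/3 * z);        % LeCun's scaled ("optimal") tanh
        case 'relu'
            % max(z, 0) is a standard ReLU; the neg_slope * min(z, 0) term
            % lets negative inputs leak through (leaky ReLU when > 0).
            a = max(z, 0) + neg_slope * min(z, 0);
        otherwise
            error('unknown activation function %s', fun);
    end
    end

A call site in the feedforward loop might then look like a = activate(nn.a{i - 1} * nn.W{i - 1}', nn.activation_function, nn.neg_slope);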
