ES-HyperNEAT readme
WARNING: This ES-HyperNEAT implementation is highly experimental and has not yet been validated by independent researchers. Any problems or failed experiments should not be blamed on the algorithm, but on this particular implementation.
Changelog:
-----------------------------------------
v2-16-07-2015: Revised ES-HyperNEAT to improve performance
- Substrate link threshold and max weight are now useful
- Replaced most vectors with unordered maps to speed up searches
- Made use of .reserve to further improve speed
- Changed the way variance is calculated for a node: instead of traversing the whole tree,
  the variance check now only considers the node's direct children. It seems to work fine.
- Finalized the retina task example.
- In utilities: added a method to save data from an experiment to a file, and another to save a .png
  of a neural network or a pattern drawn by a CPPN.
-----------------------------------------
Overview:
This extension introduces Evolvable-Substrate HyperNEAT (ES-HyperNEAT) and
multiobjective optimization via the NSGA-II non-dominated sort.
To generate a network via ES-HyperNEAT, simply call
Build_ES_Phenotype on the genome that you want to use.
Just like BuildHyperNeatPhenotype, it takes as arguments an empty
neural network, a substrate and parameters. The substrate can be the same
one used for HyperNEAT, although any hidden nodes and their substrate settings
will be ignored.
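
For orientation, here is a minimal C++ sketch of the call described above. Only the
argument order of Build_ES_Phenotype (empty network, substrate, parameters) is taken
from this readme; the header names, the NEAT namespace and the Substrate constructor
taking input/hidden/output coordinate lists are assumptions, so check your checkout
for the exact signatures.

    #include <vector>
    #include "Genome.h"
    #include "NeuralNetwork.h"
    #include "Substrate.h"
    #include "Parameters.h"

    void BuildESNet(NEAT::Genome& genome)
    {
        // 2D coordinates for two inputs and one output; the hidden list is left
        // empty on purpose, since ES-HyperNEAT discovers hidden nodes on its own
        std::vector<std::vector<double>> inputs  = { {-1.0, -1.0}, {1.0, -1.0} };
        std::vector<std::vector<double>> hidden;
        std::vector<std::vector<double>> outputs = { {0.0, 1.0} };

        NEAT::Substrate substrate(inputs, hidden, outputs);
        NEAT::Parameters params;      // ES-HyperNEAT settings live here (see Parameters below)
        NEAT::NeuralNetwork net;      // empty phenotype to be filled in

        genome.Build_ES_Phenotype(net, substrate, params);
    }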
To use the Link Expression Output (LEO), set the LEO parameter to true. Doing so adds an additional CPPN output
with a Step activation function. If you want to add a Gaussian seed to the genome, set the GaussianSeed
parameter to true. You also need to use the new Genome constructor for
this to work. The new constructor also offers the option to build an
empty genome with only biases connected to the outputs.
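
A hedged sketch of turning these two features on. Leo appears in the Parameters list
below; GaussianSeed is assumed here to also be a member of NEAT::Parameters (this
readme only calls it "a parameter", so it may instead belong to the new constructor).

    NEAT::Parameters params;
    params.Leo = true;           // adds the extra Step-activated CPPN output (LEO)
    params.GaussianSeed = true;  // assumed member name; requires the new Genome constructor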
There is also a small Python module, utilities.py, that contains an updated method for visualizing neural
networks with cv2 (the original one does not remove nodes that have no connections, and ES-HyperNEAT
leaves some of these around), and a few other visualisation utilities based on Matplotlib.
Finally, there is an XOR example in the examples folder.
##########
# Genome #
##########
- New constructor: builds a fully connected genome, with the option for empty genomes, and for adding LEO
  and a Gaussian seed:
Genome(unsigned int a_ID,
       unsigned int a_NumInputs,
       unsigned int a_NumOutputs,
       bool,
       ActivationFunction a_OutputActType,
       const Parameters& a_Parameters);
- Build_ES_Phenotype() method that generates a neural network as per ES-HyperNEAT, and its associated methods for
  quad-tree traversal, connection pruning and calculating the variance of a node.
- A GetPoints method that returns what the ES algorithm generates for a
  single input point.
##############
# Parameters #
##############
Qtree_X, Qtree_Y - the center of the quadtree
Width - the total width of the tree.
InitialDepth - the minimum depth the tree will search for nodes.
MaxDepth - the maximum depth the tree can reach if there is enough variance in
the pattern generated by the CPPN
DivisionThreshold - How deep the quadtree will search. Default is 0.3
VarianceThreshold - how much variance a region needs before nodes/connections are expressed in it
BandThreshold - the value used for pruning nodes.
IterationLevel - in essence, the number of hidden layers
that ES-HyperNEAT looks for in addition to the initial one.
Thus an IterationLevel of one means two hidden layers
CPPN_Bias - the bias value used by the CPPN
Leo - bool; whether to use LEO or not
LeoThreshold - float; LEO output values above this number lead to an expressed connection.
Useful only if you change the LEO activation function. For now it is hardcoded as
a Step function, but in the future it will be added to the parameters.
A short sketch of setting these parameters is shown below.
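
A minimal sketch of filling in these settings, assuming they are all public members of
NEAT::Parameters under the names listed above (the values are illustrative, not
recommended defaults, except for DivisionThreshold's stated default of 0.3):

    NEAT::Parameters params;

    // Quadtree geometry
    params.Qtree_X = 0.0;             // center of the quadtree
    params.Qtree_Y = 0.0;
    params.Width   = 1.0;             // total width of the tree

    // Search depth
    params.InitialDepth = 3;          // minimum depth searched for nodes
    params.MaxDepth     = 4;          // reached only if the CPPN pattern has enough variance

    // Expression and pruning
    params.DivisionThreshold = 0.3;   // default stated above
    params.VarianceThreshold = 0.03;  // illustrative value
    params.BandThreshold     = 0.3;   // illustrative value

    // IterationLevel = 1 means two hidden layers
    params.IterationLevel = 1;

    params.CPPN_Bias = 1.0;           // bias value fed to the CPPN

    // Link Expression Output
    params.Leo          = true;
    params.LeoThreshold = 0.3;        // only relevant if the LEO activation is changed from Step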