Specifications.txt

Initial states:
x1 = [-0.77, -0.75]
x2 = [-0.45, -0.43]
x3 = [0.51, 0.54]
x4 = [-0.3, -0.28]
t = 5 seconds
steps = 10 (control period = 0.5s)
Goal states:
x1 = [-0.1, 0.2]
x2 = [-0.9, -0.6]
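
For reference, the interval specification above can be encoded directly. The following is a minimal Python sketch; names such as INIT_SET, GOAL_SET, and in_goal are illustrative and not part of the benchmark files:

    # Sketch of the specification as interval bounds (illustrative only).
    INIT_SET = {  # initial state intervals
        "x1": (-0.77, -0.75),
        "x2": (-0.45, -0.43),
        "x3": (0.51, 0.54),
        "x4": (-0.3, -0.28),
    }
    GOAL_SET = {  # goal intervals, constraining x1 and x2 only
        "x1": (-0.1, 0.2),
        "x2": (-0.9, -0.6),
    }
    T_HORIZON = 5.0                      # seconds
    STEPS = 10                           # control steps
    CONTROL_PERIOD = T_HORIZON / STEPS   # = 0.5 s

    def in_goal(state):
        """Check whether a concrete state (dict of floats) satisfies the goal."""
        return all(lo <= state[v] <= hi for v, (lo, hi) in GOAL_SET.items())
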
Neural network specification:
nn_tora_relu_tanh.txt:
4 # number of inputs
1 # number of outputs
3 # number of hidden layers
20 # number of neurons in the first hidden layer
20 # number of neurons in the second hidden layer
20 # number of neurons in the third hidden layer
# weights and bias of the neural network
-0.12919427454471588
...
0 # offset of the neural network
11 # scalar of the neural network
The hidden layers use the ReLU activation function; the output layer uses the tanh activation function.
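
A minimal sketch of a reader for this file format follows. It assumes one value per line with trailing "#" comments, and that after the header the parameters are listed layer by layer, with each neuron's incoming weights followed by its bias, ending with the offset and the scalar. That ordering is an assumption common to controller files of this shape and should be verified against the tool that produced the file:

    import numpy as np

    def load_network(path):
        """Parse the plain-text network format described above.

        Assumed ordering (verify against your tool): after the header,
        values appear layer by layer; for each neuron, its incoming
        weights and then its bias. The final two values are the offset
        and the scalar.
        """
        with open(path) as f:
            # Strip trailing '#' comments and drop blank/comment-only lines.
            vals = [line.split("#")[0].strip() for line in f]
            vals = [v for v in vals if v]
        n_in, n_out, n_hidden = int(vals[0]), int(vals[1]), int(vals[2])
        sizes = [n_in] + [int(vals[3 + i]) for i in range(n_hidden)] + [n_out]
        data = [float(v) for v in vals[3 + n_hidden:]]
        layers, k = [], 0
        for prev, cur in zip(sizes, sizes[1:]):
            W = np.empty((cur, prev))
            b = np.empty(cur)
            for i in range(cur):
                W[i, :] = data[k:k + prev]
                k += prev
                b[i] = data[k]
                k += 1
            layers.append((W, b))
        offset, scalar = data[k], data[k + 1]
        return layers, offset, scalar
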
nn_tora_sigmoid.txt:
4 # number of inputs
1 # number of outputs
3 # number of hidden layers
20 # number of neurons in the first hidden layer
20 # number of neurons in the second hidden layer
20 # number of neurons in the third hidden layer
# weights and bias of the neural network
-0.00012612085265573114
...
0 # offset of the neural network
11 # scalar of the neural network
All layers of this network use the sigmoid activation function.
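
Given a parsed network, the control input can be evaluated as sketched below. The post-processing convention u = (y - offset) * scalar is an assumption, though it is consistent with offset 0 and scalar 11 yielding u in [-11, 11] for the tanh-output controller; confirm it against the verification tool in use:

    import numpy as np

    relu = lambda z: np.maximum(z, 0.0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    def evaluate(layers, offset, scalar, x, hidden_act, out_act):
        """Forward pass through the parsed layers, then post-process.

        hidden_act / out_act: relu and np.tanh for nn_tora_relu_tanh.txt,
        or sigmoid for both in nn_tora_sigmoid.txt.
        Assumed convention: u = (y - offset) * scalar.
        """
        y = np.asarray(x, dtype=float)
        for i, (W, b) in enumerate(layers):
            y = W @ y + b
            y = out_act(y) if i == len(layers) - 1 else hidden_act(y)
        return (y - offset) * scalar

    # Example usage (state values are illustrative):
    # layers, offset, scalar = load_network("nn_tora_relu_tanh.txt")
    # u = evaluate(layers, offset, scalar,
    #              [-0.76, -0.44, 0.52, -0.29], relu, np.tanh)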