Experiment 2
John Martinsson edited this page Dec 1, 2016
The dataset is a subset of 20 randomly chosen species from the BirdCLEF2016 dataset, split per species into 1/3 validation and 2/3 training.
The loss function is categorical cross-entropy, the optimizer is Adadelta, and the metric is accuracy. The task is treated as a single-label problem: only the main (foreground) species is optimized for.
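As a reference for the loss used above, here is a minimal NumPy sketch of categorical cross-entropy for the single-label case (one-hot targets); the function name and the 3-class example are hypothetical, not taken from the experiment code:

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-7):
    """Mean cross-entropy between one-hot targets and predicted class probabilities."""
    y_pred = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# Hypothetical 3-class example: one sample whose true class is 0
y_true = np.array([[1.0, 0.0, 0.0]])
y_pred = np.array([[0.7, 0.2, 0.1]])
loss = categorical_crossentropy(y_true, y_pred)  # equals -log(0.7) ≈ 0.357
```

With a one-hot target only the log-probability of the true class contributes, which is why optimizing this loss ignores any background species in the recording.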
Wed, 30 Nov 2016 14:13:37 +0000
mini-batch: 1 / 10
Train on 1963 samples, validate on 1667 samples
Epoch 1/10
1963/1963 [==============================] - 24s - loss: 2.4857 - acc: 0.3123 - val_loss: 2.6788 - val_acc: 0.2178
Epoch 2/10
1963/1963 [==============================] - 22s - loss: 1.4970 - acc: 0.5695 - val_loss: 2.9184 - val_acc: 0.1962
Epoch 3/10
1963/1963 [==============================] - 22s - loss: 1.0138 - acc: 0.6964 - val_loss: 2.6968 - val_acc: 0.2460
Epoch 4/10
1963/1963 [==============================] - 22s - loss: 0.7037 - acc: 0.7891 - val_loss: 3.3490 - val_acc: 0.2825
Epoch 5/10
1963/1963 [==============================] - 24s - loss: 0.5174 - acc: 0.8456 - val_loss: 3.6442 - val_acc: 0.2975
Epoch 6/10
1963/1963 [==============================] - 23s - loss: 0.3879 - acc: 0.8844 - val_loss: 3.2274 - val_acc: 0.3503
Epoch 7/10
1963/1963 [==============================] - 24s - loss: 0.3134 - acc: 0.9007 - val_loss: 3.5478 - val_acc: 0.3551
Epoch 8/10
1963/1963 [==============================] - 24s - loss: 0.2173 - acc: 0.9333 - val_loss: 3.4909 - val_acc: 0.3953
Epoch 9/10
1963/1963 [==============================] - 24s - loss: 0.1977 - acc: 0.9414 - val_loss: 3.7732 - val_acc: 0.3653
Epoch 10/10
1963/1963 [==============================] - 24s - loss: 0.1686 - acc: 0.9465 - val_loss: 3.8895 - val_acc: 0.3665
mini-batch: 2 / 10
Train on 2182 samples, validate on 1667 samples
Epoch 1/10
2182/2182 [==============================] - 25s - loss: 1.2483 - acc: 0.7324 - val_loss: 2.5404 - val_acc: 0.4313
Epoch 2/10
2182/2182 [==============================] - 25s - loss: 0.4059 - acc: 0.8863 - val_loss: 3.1214 - val_acc: 0.3893
Epoch 3/10
2182/2182 [==============================] - 26s - loss: 0.2295 - acc: 0.9326 - val_loss: 3.3917 - val_acc: 0.3857
Epoch 4/10
2182/2182 [==============================] - 26s - loss: 0.1908 - acc: 0.9459 - val_loss: 3.1463 - val_acc: 0.3881
Epoch 5/10
2182/2182 [==============================] - 26s - loss: 0.1187 - acc: 0.9661 - val_loss: 3.5850 - val_acc: 0.4139
Epoch 6/10
2182/2182 [==============================] - 26s - loss: 0.1115 - acc: 0.9656 - val_loss: 3.5532 - val_acc: 0.4277
Epoch 7/10
2182/2182 [==============================] - 26s - loss: 0.1321 - acc: 0.9620 - val_loss: 3.6318 - val_acc: 0.3797
Epoch 8/10
2182/2182 [==============================] - 26s - loss: 0.0784 - acc: 0.9794 - val_loss: 3.5892 - val_acc: 0.4529
Epoch 9/10
2182/2182 [==============================] - 26s - loss: 0.0631 - acc: 0.9794 - val_loss: 3.4168 - val_acc: 0.4385
Epoch 10/10
2182/2182 [==============================] - 26s - loss: 0.0692 - acc: 0.9826 - val_loss: 3.6942 - val_acc: 0.4589
mini-batch: 3 / 10
Train on 2155 samples, validate on 1667 samples
Epoch 1/10
2155/2155 [==============================] - 24s - loss: 1.1383 - acc: 0.7875 - val_loss: 2.8149 - val_acc: 0.3683
Epoch 2/10
2155/2155 [==============================] - 26s - loss: 0.3236 - acc: 0.9109 - val_loss: 3.0819 - val_acc: 0.3797
Epoch 3/10
2155/2155 [==============================] - 26s - loss: 0.1883 - acc: 0.9452 - val_loss: 3.1149 - val_acc: 0.3515
Epoch 4/10
2155/2155 [==============================] - 26s - loss: 0.1621 - acc: 0.9568 - val_loss: 3.8599 - val_acc: 0.3743
Epoch 5/10
2155/2155 [==============================] - 26s - loss: 0.1125 - acc: 0.9712 - val_loss: 4.1774 - val_acc: 0.3719
Epoch 6/10
2155/2155 [==============================] - 26s - loss: 0.0936 - acc: 0.9745 - val_loss: 4.1143 - val_acc: 0.3461
Epoch 7/10
2155/2155 [==============================] - 26s - loss: 0.0749 - acc: 0.9800 - val_loss: 4.6720 - val_acc: 0.3545
Epoch 8/10
2155/2155 [==============================] - 26s - loss: 0.0732 - acc: 0.9805 - val_loss: 4.1372 - val_acc: 0.3737
Epoch 9/10
2155/2155 [==============================] - 26s - loss: 0.0712 - acc: 0.9810 - val_loss: 4.3063 - val_acc: 0.3527
Epoch 10/10
2155/2155 [==============================] - 26s - loss: 0.0482 - acc: 0.9856 - val_loss: 4.4243 - val_acc: 0.3641
mini-batch: 4 / 10
Train on 2127 samples, validate on 1667 samples
Epoch 1/10
2127/2127 [==============================] - 24s - loss: 0.8496 - acc: 0.8275 - val_loss: 2.9989 - val_acc: 0.4457
Epoch 2/10
2127/2127 [==============================] - 25s - loss: 0.2136 - acc: 0.9384 - val_loss: 3.2634 - val_acc: 0.4385
Epoch 3/10
2127/2127 [==============================] - 26s - loss: 0.1279 - acc: 0.9614 - val_loss: 3.9961 - val_acc: 0.4475
Epoch 4/10
2127/2127 [==============================] - 26s - loss: 0.1030 - acc: 0.9690 - val_loss: 3.6325 - val_acc: 0.4601
Epoch 5/10
2127/2127 [==============================] - 26s - loss: 0.0898 - acc: 0.9718 - val_loss: 3.5076 - val_acc: 0.4691
Epoch 6/10
2127/2127 [==============================] - 26s - loss: 0.0570 - acc: 0.9817 - val_loss: 3.6601 - val_acc: 0.4367
Epoch 7/10
2127/2127 [==============================] - 26s - loss: 0.0365 - acc: 0.9901 - val_loss: 4.0761 - val_acc: 0.4415
Epoch 8/10
2127/2127 [==============================] - 25s - loss: 0.0743 - acc: 0.9793 - val_loss: 4.0615 - val_acc: 0.4295
Epoch 9/10
2127/2127 [==============================] - 26s - loss: 0.0582 - acc: 0.9835 - val_loss: 3.6759 - val_acc: 0.4445
Epoch 10/10
2127/2127 [==============================] - 26s - loss: 0.0394 - acc: 0.9901 - val_loss: 3.9827 - val_acc: 0.4535
mini-batch: 5 / 10
Train on 2302 samples, validate on 1667 samples
Epoch 1/10
2302/2302 [==============================] - 26s - loss: 0.7962 - acc: 0.8401 - val_loss: 2.9373 - val_acc: 0.4571
Epoch 2/10
2302/2302 [==============================] - 27s - loss: 0.1970 - acc: 0.9513 - val_loss: 3.1720 - val_acc: 0.4661
Epoch 3/10
2302/2302 [==============================] - 28s - loss: 0.1398 - acc: 0.9579 - val_loss: 3.6390 - val_acc: 0.4367
Epoch 4/10
2302/2302 [==============================] - 27s - loss: 0.1279 - acc: 0.9666 - val_loss: 3.9785 - val_acc: 0.4451
Epoch 5/10
2302/2302 [==============================] - 27s - loss: 0.0776 - acc: 0.9778 - val_loss: 3.0758 - val_acc: 0.4649
Epoch 6/10
2302/2302 [==============================] - 27s - loss: 0.0654 - acc: 0.9805 - val_loss: 3.3092 - val_acc: 0.4853
Epoch 7/10
2302/2302 [==============================] - 27s - loss: 0.0649 - acc: 0.9813 - val_loss: 3.6358 - val_acc: 0.4949
Epoch 8/10
2302/2302 [==============================] - 27s - loss: 0.0637 - acc: 0.9805 - val_loss: 3.5548 - val_acc: 0.4697
Epoch 9/10
2302/2302 [==============================] - 27s - loss: 0.0633 - acc: 0.9839 - val_loss: 4.7260 - val_acc: 0.4085
Epoch 10/10
2302/2302 [==============================] - 27s - loss: 0.0520 - acc: 0.9878 - val_loss: 3.8758 - val_acc: 0.4931
mini-batch: 6 / 10
Train on 2302 samples, validate on 1667 samples
Epoch 1/10
2302/2302 [==============================] - 25s - loss: 0.7176 - acc: 0.8697 - val_loss: 2.9733 - val_acc: 0.4763
Epoch 2/10
2302/2302 [==============================] - 27s - loss: 0.1757 - acc: 0.9513 - val_loss: 3.4087 - val_acc: 0.4937
Epoch 3/10
2302/2302 [==============================] - 28s - loss: 0.1003 - acc: 0.9726 - val_loss: 3.4030 - val_acc: 0.4799
Epoch 4/10
2302/2302 [==============================] - 27s - loss: 0.0664 - acc: 0.9809 - val_loss: 4.0968 - val_acc: 0.4439
Epoch 5/10
2302/2302 [==============================] - 28s - loss: 0.0576 - acc: 0.9865 - val_loss: 3.9286 - val_acc: 0.4733
Epoch 6/10
2302/2302 [==============================] - 27s - loss: 0.0505 - acc: 0.9874 - val_loss: 4.0753 - val_acc: 0.4517
Epoch 7/10
2302/2302 [==============================] - 27s - loss: 0.0481 - acc: 0.9865 - val_loss: 4.5374 - val_acc: 0.4169
Epoch 8/10
2302/2302 [==============================] - 27s - loss: 0.0255 - acc: 0.9913 - val_loss: 4.4632 - val_acc: 0.4517
Epoch 9/10
2302/2302 [==============================] - 27s - loss: 0.0334 - acc: 0.9909 - val_loss: 4.5556 - val_acc: 0.4109
Epoch 10/10
2302/2302 [==============================] - 27s - loss: 0.0280 - acc: 0.9922 - val_loss: 4.4173 - val_acc: 0.4367
mini-batch: 7 / 10
Train on 1870 samples, validate on 1667 samples
Epoch 1/10
1870/1870 [==============================] - 21s - loss: 0.8480 - acc: 0.8503 - val_loss: 2.6427 - val_acc: 0.4991
Epoch 2/10
1870/1870 [==============================] - 22s - loss: 0.2693 - acc: 0.9390 - val_loss: 2.4147 - val_acc: 0.5105
Epoch 3/10
1870/1870 [==============================] - 23s - loss: 0.1438 - acc: 0.9615 - val_loss: 2.8575 - val_acc: 0.4943
Epoch 4/10
1870/1870 [==============================] - 23s - loss: 0.0938 - acc: 0.9706 - val_loss: 2.8597 - val_acc: 0.5081
Epoch 5/10
1870/1870 [==============================] - 23s - loss: 0.0834 - acc: 0.9802 - val_loss: 2.8569 - val_acc: 0.5183
Epoch 6/10
1870/1870 [==============================] - 23s - loss: 0.0770 - acc: 0.9775 - val_loss: 2.8868 - val_acc: 0.5225
Epoch 7/10
1870/1870 [==============================] - 23s - loss: 0.0592 - acc: 0.9856 - val_loss: 2.9457 - val_acc: 0.5255
Epoch 8/10
1870/1870 [==============================] - 23s - loss: 0.0489 - acc: 0.9872 - val_loss: 2.8433 - val_acc: 0.4967
Epoch 9/10
1870/1870 [==============================] - 23s - loss: 0.0465 - acc: 0.9856 - val_loss: 3.0872 - val_acc: 0.4559
Epoch 10/10
1870/1870 [==============================] - 23s - loss: 0.0391 - acc: 0.9882 - val_loss: 3.2565 - val_acc: 0.4637
mini-batch: 8 / 10
Train on 2168 samples, validate on 1667 samples
Epoch 1/10
2168/2168 [==============================] - 25s - loss: 0.6560 - acc: 0.8898 - val_loss: 3.3276 - val_acc: 0.4577
Epoch 2/10
2168/2168 [==============================] - 26s - loss: 0.1562 - acc: 0.9663 - val_loss: 3.3955 - val_acc: 0.4307
Epoch 3/10
2168/2168 [==============================] - 26s - loss: 0.0992 - acc: 0.9751 - val_loss: 3.6586 - val_acc: 0.4115
Epoch 4/10
2168/2168 [==============================] - 26s - loss: 0.0745 - acc: 0.9783 - val_loss: 3.6203 - val_acc: 0.4427
Epoch 5/10
2168/2168 [==============================] - 26s - loss: 0.0835 - acc: 0.9806 - val_loss: 4.1321 - val_acc: 0.4121
Epoch 6/10
2168/2168 [==============================] - 26s - loss: 0.0475 - acc: 0.9862 - val_loss: 4.0812 - val_acc: 0.4157
Epoch 7/10
2168/2168 [==============================] - 26s - loss: 0.0607 - acc: 0.9843 - val_loss: 3.7399 - val_acc: 0.4379
Epoch 8/10
2168/2168 [==============================] - 26s - loss: 0.0280 - acc: 0.9917 - val_loss: 3.4404 - val_acc: 0.4379
Epoch 9/10
2168/2168 [==============================] - 26s - loss: 0.0250 - acc: 0.9917 - val_loss: 4.2537 - val_acc: 0.4247
Epoch 10/10
2168/2168 [==============================] - 26s - loss: 0.0266 - acc: 0.9931 - val_loss: 3.8694 - val_acc: 0.4511
mini-batch: 9 / 10
Train on 2188 samples, validate on 1667 samples
Epoch 1/10
2188/2188 [==============================] - 25s - loss: 0.7216 - acc: 0.8743 - val_loss: 3.1471 - val_acc: 0.4505
Epoch 2/10
2188/2188 [==============================] - 26s - loss: 0.2036 - acc: 0.9479 - val_loss: 3.2019 - val_acc: 0.4799
Epoch 3/10
2188/2188 [==============================] - 26s - loss: 0.1558 - acc: 0.9657 - val_loss: 3.7009 - val_acc: 0.4319
Epoch 4/10
2188/2188 [==============================] - 26s - loss: 0.0945 - acc: 0.9771 - val_loss: 4.0865 - val_acc: 0.4655
Epoch 5/10
2188/2188 [==============================] - 26s - loss: 0.0714 - acc: 0.9817 - val_loss: 3.8362 - val_acc: 0.4637
Epoch 6/10
2188/2188 [==============================] - 26s - loss: 0.0829 - acc: 0.9813 - val_loss: 3.5526 - val_acc: 0.5021
Epoch 7/10
2188/2188 [==============================] - 26s - loss: 0.0665 - acc: 0.9849 - val_loss: 3.4500 - val_acc: 0.4913
Epoch 8/10
2188/2188 [==============================] - 26s - loss: 0.0324 - acc: 0.9904 - val_loss: 3.6835 - val_acc: 0.4697
Epoch 9/10
2188/2188 [==============================] - 26s - loss: 0.0390 - acc: 0.9890 - val_loss: 3.6621 - val_acc: 0.4535
Epoch 10/10
2188/2188 [==============================] - 26s - loss: 0.0262 - acc: 0.9922 - val_loss: 3.5114 - val_acc: 0.4631
mini-batch: 10 / 10
Train on 1923 samples, validate on 1667 samples
Epoch 1/10
1923/1923 [==============================] - 22s - loss: 0.6398 - acc: 0.8861 - val_loss: 2.8830 - val_acc: 0.4439
Epoch 2/10
1923/1923 [==============================] - 23s - loss: 0.1554 - acc: 0.9636 - val_loss: 2.7958 - val_acc: 0.4367
Epoch 3/10
1923/1923 [==============================] - 24s - loss: 0.0670 - acc: 0.9808 - val_loss: 3.0888 - val_acc: 0.4595
Epoch 4/10
1923/1923 [==============================] - 24s - loss: 0.0777 - acc: 0.9792 - val_loss: 3.3226 - val_acc: 0.4007
Epoch 5/10
1923/1923 [==============================] - 24s - loss: 0.0725 - acc: 0.9813 - val_loss: 3.3113 - val_acc: 0.4085
Epoch 6/10
1923/1923 [==============================] - 24s - loss: 0.0545 - acc: 0.9854 - val_loss: 3.2378 - val_acc: 0.4433
Epoch 7/10
1923/1923 [==============================] - 24s - loss: 0.0483 - acc: 0.9886 - val_loss: 3.7581 - val_acc: 0.3977
Epoch 8/10
1923/1923 [==============================] - 24s - loss: 0.0392 - acc: 0.9886 - val_loss: 3.6613 - val_acc: 0.4325
Epoch 9/10
1923/1923 [==============================] - 24s - loss: 0.0462 - acc: 0.9912 - val_loss: 3.9375 - val_acc: 0.4247
Epoch 10/10
1923/1923 [==============================] - 24s - loss: 0.0267 - acc: 0.9922 - val_loss: 4.1165 - val_acc: 0.4313
Wed, 30 Nov 2016 15:02:37 +0000
The weights have been saved in: ./weights/2016_11_30_14:13:35_cuberun.h5
Top-k results on the 200 test predictions (count of samples whose ground truth appears among the top k predicted species):
- Top 1: 114 (57.0%)
- Top 2: 140 (70.0%)
- Top 3: 159 (79.5%)
- Top 4: 160 (80.0%)
- Top 5: 163 (81.5%)
Total predictions: 200
Predicted | Ground Truth |
---|---|
[ 2 18 3 11 14] | 2 |
[17 5 13 14 11] | 17 |
[18 11 4 12 14] | 18 |
[ 4 8 5 11 13] | 8 |
[ 3 5 1 13 9] | 3 |
[15 8 1 11 4] | 8 |
[ 4 2 12 6 10] | 2 |
[ 6 1 13 10 8] | 6 |
[11 13 1 6 10] | 11 |
[ 4 1 11 2 17] | 4 |
[ 4 11 12 10 18] | 12 |
[ 1 4 18 3 6] | 1 |
[15 1 11 8 10] | 18 |
[ 4 15 1 12 18] | 15 |
[11 1 17 8 13] | 11 |
[10 11 2 6 18] | 11 |
[18 10 4 1 13] | 10 |
[ 3 1 12 10 4] | 9 |
[12 10 17 11 4] | 12 |
[15 11 1 10 18] | 15 |
[ 2 12 19 4 1] | 19 |
[18 11 10 4 12] | 11 |
[11 13 4 3 1] | 11 |
[ 5 19 18 1 2] | 5 |
[18 15 9 7 10] | 18 |
[ 1 4 14 17 2] | 8 |
[ 4 18 5 3 1] | 14 |
[13 4 16 3 1] | 4 |
[ 4 13 10 8 14] | 4 |
[ 1 13 8 15 10] | 1 |
[ 1 4 10 8 18] | 13 |
[13 1 11 5 2] | 4 |
[ 1 10 11 4 2] | 19 |
[ 4 14 13 1 16] | 16 |
[ 0 5 17 1 2] | 0 |
[ 4 5 18 13 2] | 7 |
[ 6 5 13 1 4] | 6 |
[ 5 4 11 14 1] | 5 |
[ 1 3 10 17 5] | 1 |
[10 1 13 2 11] | 7 |
[10 19 18 4 11] | 10 |
[ 4 13 11 1 10] | 4 |
[10 4 5 1 6] | 10 |
[11 1 13 14 10] | 11 |
[18 4 1 10 13] | 18 |
[18 4 10 12 2] | 18 |
[10 1 4 11 13] | 10 |
[ 5 13 4 3 18] | 5 |
[17 18 13 19 4] | 8 |
[17 10 11 14 4] | 17 |
[11 1 5 10 4] | 12 |
[ 1 4 3 5 12] | 3 |
[17 1 13 10 8] | 17 |
[10 1 13 8 17] | 10 |
[12 2 1 13 17] | 19 |
[ 2 1 4 10 13] | 2 |
[ 4 1 11 15 10] | 4 |
[11 3 1 10 6] | 3 |
[3 5 4 2 1] | 3 |
[ 9 15 1 10 7] | 9 |
[15 1 13 17 8] | 15 |
[14 11 13 4 1] | 14 |
[ 4 1 15 9 11] | 15 |
[ 5 4 17 19 10] | 12 |
[ 0 19 10 5 17] | 19 |
[17 13 11 3 1] | 17 |
[ 1 13 5 4 14] | 13 |
[11 17 4 1 14] | 17 |
[10 12 4 11 18] | 10 |
[ 5 4 12 2 11] | 2 |
[ 1 4 18 5 2] | 1 |
[18 11 10 14 15] | 18 |
[ 6 2 4 12 1] | 6 |
[ 1 4 12 10 11] | 12 |
[15 8 4 1 13] | 17 |
[13 18 11 8 1] | 11 |
[11 4 1 13 2] | 11 |
[11 18 15 14 13] | 7 |
[ 4 2 16 11 10] | 14 |
[17 1 3 11 13] | 17 |
[ 2 18 6 10 11] | 2 |
[ 4 2 1 18 17] | 2 |
[ 6 15 11 9 4] | 6 |
[12 11 18 1 2] | 11 |
[ 3 5 4 11 9] | 3 |
[ 4 14 13 1 11] | 4 |
[11 15 1 10 4] | 11 |
[ 3 11 4 1 10] | 11 |
[ 8 18 4 13 1] | 8 |
[ 4 1 11 13 5] | 4 |
[18 11 4 10 14] | 18 |
[17 4 14 11 18] | 17 |
[17 13 1 3 5] | 17 |
[ 4 11 1 13 10] | 4 |
[ 3 0 1 13 19] | 6 |
[ 4 1 5 3 17] | 2 |
[ 2 18 14 12 3] | 2 |
[16 4 11 1 13] | 16 |
[ 5 1 8 13 15] | 5 |
[ 2 6 12 11 1] | 2 |
[11 1 14 12 4] | 11 |
[16 4 2 14 8] | 16 |
[ 4 5 1 6 13] | 2 |
[ 3 16 13 1 4] | 16 |
[11 14 4 17 3] | 11 |
[10 4 5 14 8] | 10 |
[10 1 13 8 17] | 10 |
[ 3 5 4 18 1] | 3 |
[14 11 16 4 1] | 14 |
[ 8 13 1 5 10] | 7 |
[ 4 1 10 2 6] | 9 |
[ 2 17 12 10 1] | 2 |
[ 3 1 14 2 18] | 14 |
[13 1 3 17 4] | 13 |
[18 10 11 1 14] | 18 |
[15 5 1 4 16] | 7 |
[10 11 1 15 13] | 10 |
[11 13 4 1 10] | 4 |
[ 4 13 16 8 11] | 4 |
[12 1 13 3 11] | 12 |
[ 1 13 5 11 17] | 1 |
[ 4 8 1 17 10] | 4 |
[11 17 1 10 15] | 17 |
[ 3 14 11 4 5] | 1 |
[ 4 14 11 18 5] | 4 |
[ 4 5 17 13 1] | 8 |
[ 4 1 2 13 18] | 8 |
[ 4 5 17 3 1] | 17 |
[11 2 4 1 13] | 2 |
[14 5 13 11 12] | 14 |
[ 4 18 14 17 1] | 12 |
[ 1 2 6 17 10] | 14 |
[ 5 1 13 17 2] | 5 |
[18 1 4 5 2] | 1 |
[11 8 4 13 1] | 11 |
[13 4 5 18 1] | 13 |
[ 3 17 11 13 5] | 4 |
[ 1 17 11 13 8] | 1 |
[ 5 2 14 4 13] | 14 |
[ 2 18 4 6 3] | 2 |
[14 4 19 11 17] | 19 |
[ 3 11 1 16 4] | 11 |
[10 15 18 3 12] | 10 |
[10 5 19 18 11] | 10 |
[ 4 2 5 6 12] | 4 |
[ 1 4 13 10 11] | 9 |
[10 11 1 12 2] | 9 |
[16 1 13 11 8] | 16 |
[5 4 1 2 6] | 5 |
[16 14 17 13 8] | 16 |
[ 4 13 8 11 1] | 8 |
[ 4 17 1 2 11] | 4 |
[11 3 12 13 17] | 12 |
[ 4 11 17 2 10] | 17 |
[10 12 1 11 18] | 16 |
[18 1 17 14 4] | 18 |
[0 6 2 5 4] | 0 |
[ 3 2 5 18 4] | 2 |
[ 3 11 1 14 13] | 3 |
[17 13 1 4 11] | 17 |
[15 8 9 4 1] | 9 |
[ 3 16 4 11 9] | 2 |
[0 1 2 4 6] | 0 |
[ 1 11 13 15 17] | 1 |
[11 1 2 10 8] | 16 |
[10 17 3 19 5] | 10 |
[4 6 2 3 1] | 2 |
[17 1 4 14 12] | 12 |
[ 1 4 2 5 14] | 1 |
[ 3 4 0 1 18] | 3 |
[ 2 13 6 1 3] | 2 |
[ 4 14 18 16 11] | 2 |
[ 2 4 18 10 11] | 2 |
[12 19 11 8 15] | 12 |
[13 11 8 4 1] | 8 |
[14 3 2 5 6] | 4 |
[ 5 17 4 14 1] | 5 |
[ 4 8 1 18 17] | 8 |
[11 4 2 6 1] | 2 |
[ 8 13 17 11 19] | 8 |
[ 4 10 5 14 7] | 10 |
[13 1 8 18 4] | 10 |
[13 5 16 11 3] | 5 |
[ 5 17 1 12 11] | 17 |
[ 6 11 9 10 18] | 6 |
[11 18 10 13 4] | 11 |
[ 4 1 17 13 8] | 4 |
[11 12 1 13 8] | 11 |
[16 13 4 0 8] | 4 |
[ 2 3 18 11 1] | 2 |
[ 4 6 2 12 8] | 11 |
[ 4 17 5 2 13] | 17 |
[15 11 10 14 18] | 15 |
[ 1 13 2 3 17] | 1 |
[15 18 14 1 11] | 11 |
[18 10 11 4 14] | 18 |
[ 4 18 14 11 6] | 4 |
[13 1 3 14 11] | 13 |
[10 4 18 2 1] | 10 |
[ 5 1 17 13 8] | 5 |
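The top-k counts above can be recomputed from rows in the table's format (ranked predictions, then ground truth). A small sketch, assuming the rows are available as Python lists; the function name and the four sample rows are illustrative, taken from the first rows of the table:

```python
def top_k_counts(ranked_preds, truths, max_k=5):
    """For each k in 1..max_k, count how often the ground truth
    appears among the first k ranked predictions."""
    return [
        sum(truth in row[:k] for row, truth in zip(ranked_preds, truths))
        for k in range(1, max_k + 1)
    ]

# First four rows of the prediction table above
ranked_preds = [
    [2, 18, 3, 11, 14],
    [17, 5, 13, 14, 11],
    [18, 11, 4, 12, 14],
    [4, 8, 5, 11, 13],
]
truths = [2, 17, 18, 8]
print(top_k_counts(ranked_preds, truths))  # [3, 4, 4, 4, 4]
```

Dividing each count by the number of predictions (200 here) gives the top-k accuracies reported above.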