Random sampling and optimization for GP
woodRock committed Sep 30, 2024
1 parent 5ccf26b commit af41014
Showing 37 changed files with 3,700 additions and 1,937 deletions.
Binary file modified code/gp/checkpoints/embedded-gp.pth
4 changes: 2 additions & 2 deletions code/gp/data.py
@@ -23,9 +23,9 @@ def load_dataset(
logger = logging.getLogger(__name__)

# Path for university computers
-    # path = ["/", "vol", "ecrg-solar", "woodj4", "fishy-business", "data", "REIMS_data.xlsx"]
+    path = ["/", "vol", "ecrg-solar", "woodj4", "fishy-business", "data", "REIMS_data.xlsx"]
     # Path for home computer
-    path = ["~/", "Desktop", "fishy-business", "data", "REIMS_data.xlsx"]
+    # path = ["~/", "Desktop", "fishy-business", "data", "REIMS_data.xlsx"]

path = os.path.join(*path)

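This change switches the active dataset location from the home-computer path to the university (ecrg-solar) mount; only one path assignment should be uncommented at a time. A minimal sketch of how the joined path resolves, assuming the spreadsheet is read with pandas (a hypothetical detail; the actual load_dataset body may differ):

import os
import pandas as pd  # assumption: the .xlsx file is read with pandas; load_dataset may differ

# Active path for university computers (home-computer path stays commented out)
path = ["/", "vol", "ecrg-solar", "woodj4", "fishy-business", "data", "REIMS_data.xlsx"]
path = os.path.join(*path)
# -> "/vol/ecrg-solar/woodj4/fishy-business/data/REIMS_data.xlsx"

df = pd.read_excel(path)  # hypothetical read; only works where the ecrg-solar mount is available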
Binary file modified code/gp/figures/tree-0.pdf
Binary file modified code/gp/figures/tree-1.pdf
Binary file modified code/gp/figures/tsne.png
2 changes: 1 addition & 1 deletion code/gp/grid.sh
@@ -38,4 +38,4 @@ python3 main.py \
--dataset "${DATASET}" \
--file-path "checkpoints/run_${i}.pth" \
--output "logs/${DATASET}/${NAME}/run" --run "${i}" \
-  --generations 100 --beta 1;
+  --generations 500 --beta 1;
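The only change here raises --generations from 100 to 500 for each grid run. A minimal sketch of how main.py might accept these flags, assuming an argparse-based CLI (hypothetical; the repository's actual parser may use different names or defaults):

import argparse

# Hypothetical parser mirroring the flags grid.sh passes to main.py.
parser = argparse.ArgumentParser(description="GP runs over the REIMS dataset")
parser.add_argument("--dataset", type=str, help="dataset name, e.g. instance-recognition")
parser.add_argument("--file-path", type=str, help="checkpoint path, e.g. checkpoints/run_1.pth")
parser.add_argument("--output", type=str, help="log output prefix")
parser.add_argument("--run", type=int, help="grid run index")
parser.add_argument("--generations", type=int, default=500)  # raised from 100 in this commit
parser.add_argument("--beta", type=float, default=1.0)
args = parser.parse_args()
print(f"Running {args.generations} generations on {args.dataset}")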
132 changes: 20 additions & 112 deletions code/gp/logs/instance-recognition/tmp/run_1.log
@@ -3,115 +3,23 @@ INFO:__main__:No model found. Train from scratch.
INFO:gp: fitness size
------------------------------------------------------------------------- -----------------------------------------------
gen nevals avg gen max min nevals std avg gen max min nevals std
0 1023 0.499896 0 0.729926 0.278353 1023 0.0631878 3.47996 0 7 2 1023 1.33797
INFO:gp:1 785 0.551801 1 0.729926 0.285014 785 0.0650061 3.71065 1 12 2 785 1.47523
INFO:gp:2 805 0.587259 2 0.77563 0.376567 805 0.0724732 4.04106 2 13 2 805 1.63308
INFO:gp:3 754 0.620295 3 0.822639 0.321003 754 0.0794282 4.49169 3 11 2 754 1.83314
INFO:gp:4 749 0.646424 4 0.826476 0.336167 749 0.0873193 4.78104 4 11 2 749 2.05329
INFO:gp:5 770 0.672003 5 0.853044 0.273277 770 0.0879586 4.91398 5 14 1 770 2.19446
INFO:gp:6 772 0.691333 6 0.855272 0.351144 772 0.0970087 5.22972 6 14 2 772 2.40784
INFO:gp:7 752 0.706595 7 0.887111 0.255612 752 0.108274 5.58162 7 14 2 752 2.41714
INFO:gp:8 762 0.730524 8 0.887111 0.309611 762 0.112222 5.64614 8 15 2 762 2.20855
INFO:gp:9 781 0.748268 9 0.891581 0.379341 781 0.124223 5.76442 9 14 2 781 1.81291
INFO:gp:10 777 0.768843 10 0.891612 0.182184 777 0.129422 5.95308 10 16 2 777 1.63112
INFO:gp:11 786 0.791127 11 0.891612 0.315671 786 0.126525 5.74096 11 13 3 786 1.66569
INFO:gp:12 736 0.812931 12 0.902041 0.387927 736 0.116605 5.47703 12 12 3 736 1.75565
INFO:gp:13 784 0.790891 13 0.902041 0.356561 784 0.138161 4.97752 13 14 3 784 1.57028
INFO:gp:14 774 0.787596 14 0.902041 0.406704 774 0.140267 4.85924 14 11 3 774 1.42788
INFO:gp:15 762 0.784126 15 0.902041 0.45421 762 0.148206 4.87683 15 12 3 762 1.38575
INFO:gp:16 790 0.767919 16 0.902041 0.394293 790 0.161311 4.81134 16 11 3 790 1.32816
INFO:gp:17 771 0.775909 17 0.902041 0.36569 771 0.158981 4.88368 17 11 3 771 1.54476
INFO:gp:18 776 0.775508 18 0.902041 0.38804 776 0.159153 4.73607 18 11 3 776 1.49706
INFO:gp:19 767 0.776436 19 0.902921 0.356761 767 0.157556 4.72825 19 14 3 767 1.54706
INFO:gp:20 784 0.764664 20 0.903957 0.427594 784 0.162711 5.10068 20 14 3 784 1.51713
INFO:gp:21 774 0.775559 21 0.903957 0.46528 774 0.161416 5.22287 21 14 3 774 1.39093
INFO:gp:22 764 0.787506 22 0.903957 0.476466 764 0.154038 5.28739 22 14 3 764 1.33217
INFO:gp:23 792 0.780598 23 0.903957 0.413152 792 0.160149 5.31769 23 14 3 792 1.42652
INFO:gp:24 787 0.77228 24 0.903957 0.419493 787 0.163477 5.43793 24 13 3 787 1.61427
INFO:gp:25 772 0.7851 25 0.903957 0.402249 772 0.15685 5.5523 25 11 3 772 1.6938
INFO:gp:26 781 0.785279 26 0.90438 0.454422 781 0.156967 5.66569 26 12 3 781 1.80681
INFO:gp:27 781 0.795155 27 0.90438 0.454817 781 0.153898 6.1437 27 14 3 781 1.74635
INFO:gp:28 763 0.778379 28 0.904828 0.465863 763 0.164713 6.75562 28 15 3 763 2.02692
INFO:gp:29 771 0.803268 29 0.915668 0.441353 771 0.14838 7.56305 29 17 3 771 2.3571
INFO:gp:30 757 0.791363 30 0.915668 0.49811 757 0.158352 8.59629 30 16 3 757 2.48756
INFO:gp:31 783 0.786466 31 0.915668 0.442115 783 0.15957 9.06354 31 19 3 783 2.62086
INFO:gp:32 791 0.798125 32 0.916515 0.477159 791 0.152607 8.96383 32 25 3 791 2.99195
INFO:gp:33 760 0.808712 33 0.916515 0.394553 760 0.147836 8.84555 33 20 3 760 3.18225
INFO:gp:34 780 0.800468 34 0.917121 0.37862 780 0.15466 9.61193 34 21 3 780 3.28731
INFO:gp:35 768 0.806743 35 0.917774 0.444445 768 0.153214 10.6676 35 24 3 768 3.45803
INFO:gp:36 747 0.815084 36 0.917774 0.425593 747 0.150938 11.2776 36 22 3 747 3.65003
INFO:gp:37 762 0.805015 37 0.917774 0.470463 762 0.157473 11.8915 37 22 3 762 3.88205
INFO:gp:38 762 0.815959 38 0.91811 0.494399 762 0.153014 12.4125 38 29 3 762 3.76171
INFO:gp:39 778 0.804355 39 0.918727 0.477109 778 0.161105 12.5777 39 31 3 778 3.3516
INFO:gp:40 744 0.814563 40 0.918727 0.421569 744 0.154071 12.4897 40 23 3 744 3.17179
INFO:gp:41 735 0.820401 41 0.919245 0.498874 735 0.149553 12.1408 41 24 3 735 2.89161
INFO:gp:42 748 0.809638 42 0.919245 0.43804 748 0.156432 11.8993 42 27 3 748 2.7748
INFO:gp:43 760 0.815925 43 0.92088 0.442004 760 0.154297 11.8837 43 23 3 760 2.98172
INFO:gp:44 783 0.808706 44 0.92088 0.37383 783 0.156972 12.1075 44 25 3 783 2.96314
INFO:gp:45 757 0.810498 45 0.92088 0.425659 757 0.155846 12.3421 45 24 3 757 2.88832
INFO:gp:46 807 0.806136 46 0.92088 0.497953 807 0.1579 12.6149 46 26 3 807 3.15241
INFO:gp:47 775 0.800199 47 0.92088 0.358201 775 0.162218 12.9404 47 27 3 775 3.50023
INFO:gp:48 779 0.818596 48 0.92088 0.442004 779 0.153381 13.9814 48 31 3 779 3.50375
INFO:gp:49 783 0.812757 49 0.92088 0.471795 783 0.158625 15.0958 49 31 3 783 3.64594
INFO:gp:50 797 0.810127 50 0.92088 0.452573 797 0.160969 15.6872 50 31 3 797 3.37519
INFO:gp:51 787 0.815338 51 0.920889 0.497655 787 0.158264 15.912 51 31 3 787 3.31678
INFO:gp:52 793 0.815281 52 0.920889 0.468073 793 0.155715 16.0166 52 31 3 793 3.2598
INFO:gp:53 765 0.808734 53 0.920889 0.458882 765 0.161774 16.1584 53 31 3 765 3.352
INFO:gp:54 797 0.821923 54 0.922018 0.479821 797 0.154776 16.1779 54 31 3 797 3.32172
INFO:gp:55 779 0.813846 55 0.922018 0.48392 779 0.157753 16.1183 55 31 3 779 3.32041
INFO:gp:56 766 0.814719 56 0.922018 0.497403 766 0.157206 16.1496 56 31 3 766 3.2666
INFO:gp:57 784 0.814296 57 0.922613 0.405596 784 0.157644 16.2454 57 31 3 784 3.4612
INFO:gp:58 758 0.821684 58 0.922613 0.420947 758 0.152866 16.3226 58 32 3 758 3.25483
INFO:gp:59 780 0.825515 59 0.922613 0.457244 780 0.150834 16.2737 59 31 3 780 3.71299
INFO:gp:60 753 0.81846 60 0.922957 0.497423 753 0.156388 16.3715 60 26 3 753 3.14754
INFO:gp:61 769 0.805389 61 0.922957 0.438971 769 0.164625 16.5914 61 31 3 769 3.24123
INFO:gp:62 796 0.819933 62 0.922957 0.477783 796 0.153439 16.6305 62 33 3 796 3.70653
INFO:gp:63 777 0.820593 63 0.922957 0.498156 777 0.154968 17.0489 63 32 3 777 3.23434
INFO:gp:64 778 0.821147 64 0.922957 0.45629 778 0.152297 16.9326 64 31 3 778 3.12814
INFO:gp:65 777 0.810072 65 0.922957 0.473568 777 0.159908 16.3978 65 33 3 777 3.84081
INFO:gp:66 781 0.802886 66 0.922957 0.500044 781 0.165093 15.958 66 31 3 781 3.47329
INFO:gp:67 796 0.816385 67 0.922957 0.448235 796 0.154776 15.5875 67 29 3 796 3.41177
INFO:gp:68 770 0.816612 68 0.922957 0.430903 770 0.157546 15.5865 68 27 3 770 2.93229
INFO:gp:69 773 0.824161 69 0.922986 0.476993 773 0.150116 15.4848 69 29 3 773 3.52196
INFO:gp:70 776 0.82461 70 0.922986 0.424497 776 0.149877 15.3558 70 29 3 776 3.43327
INFO:gp:71 763 0.829958 71 0.922986 0.489627 763 0.146498 15.433 71 29 3 763 3.29744
INFO:gp:72 766 0.814624 72 0.923311 0.45761 766 0.156706 15.4389 72 29 3 766 3.34961
INFO:gp:73 763 0.821233 73 0.923311 0.498051 763 0.154049 15.4907 73 29 3 763 3.17672
INFO:gp:74 764 0.82622 74 0.923311 0.479651 764 0.148043 15.1887 74 29 3 764 3.56185
INFO:gp:75 738 0.827625 75 0.923311 0.498947 738 0.149131 15.3705 75 29 3 738 3.31125
INFO:gp:76 783 0.827134 76 0.923311 0.476575 783 0.146819 15.4242 76 29 3 783 3.13743
INFO:gp:77 807 0.820368 77 0.923311 0.396266 807 0.147946 15.4203 77 29 3 807 3.1167
INFO:gp:78 770 0.824512 78 0.923589 0.47691 770 0.144311 15.3627 78 27 3 770 3.1425
INFO:gp:79 778 0.822427 79 0.923589 0.485568 778 0.146216 15.4907 79 27 3 778 2.84216
INFO:gp:80 799 0.830577 80 0.923957 0.498294 799 0.143233 15.3324 80 29 3 799 3.21759
INFO:gp:81 783 0.826847 81 0.923957 0.500109 783 0.143779 15.4751 81 29 3 783 3.28227
INFO:gp:82 800 0.822555 82 0.923957 0.469638 800 0.14745 15.3656 82 27 3 800 3.21003
INFO:gp:83 784 0.830225 83 0.923957 0.49219 784 0.141352 15.3675 83 29 3 784 3.28476
INFO:gp:84 782 0.824755 84 0.923957 0.450296 782 0.147844 15.3128 84 27 3 782 3.44399
INFO:gp:85 770 0.822191 85 0.923957 0.4955 770 0.150371 15.5152 85 29 3 770 2.98178
INFO:gp:86 784 0.823876 86 0.923957 0.472628 784 0.150861 15.2405 86 29 3 784 3.40875
INFO:gp:87 786 0.822881 87 0.923957 0.478847 786 0.151234 15.3998 87 29 3 786 3.26441
INFO:gp:88 774 0.826019 88 0.924319 0.499171 774 0.149194 15.2561 88 27 3 774 3.21666
INFO:gp:89 761 0.817204 89 0.924319 0.498709 761 0.155806 15.2571 89 29 3 761 3.18927
INFO:gp:90 765 0.80818 90 0.924319 0.471696 765 0.160777 15.5093 90 29 3 765 3.05596
INFO:gp:91 775 0.816553 91 0.924369 0.451294 775 0.155724 15.3627 91 29 3 775 3.22812
INFO:gp:92 782 0.81918 92 0.924681 0.48183 782 0.152258 15.2766 92 29 3 782 3.35526
INFO:gp:93 759 0.816147 93 0.924681 0.415834 759 0.154289 15.4203 93 29 3 759 3.17204
INFO:gp:94 781 0.828013 94 0.924681 0.382163 781 0.145992 15.2434 94 29 3 781 3.28157
INFO:gp:95 800 0.823131 95 0.924681 0.454791 800 0.151609 15.1398 95 29 3 800 3.34626
INFO:gp:96 773 0.817946 96 0.924681 0.474667 773 0.155595 15.2981 96 29 3 773 3.2642
INFO:gp:97 782 0.828821 97 0.924681 0.464878 782 0.147034 15.3324 97 29 3 782 3.13854
INFO:gp:98 780 0.821998 98 0.924681 0.49848 780 0.152802 15.2551 98 27 3 780 2.9793
INFO:gp:99 789 0.829959 99 0.924681 0.447873 789 0.14947 15.347 99 29 3 789 3.18013
INFO:gp:100 744 0.837835 100 0.924681 0.428076 744 0.144555 15.303 100 27 3 744 3.00538
INFO:__main__:Saving model to file: checkpoints/run_1.pth
INFO:util:Train accuracy: 1.0 +- 0.0
INFO:util:Val accuracy: 1.0 +- 0.0
INFO:util:Test accuracy: 1.0 +- 0.0
INFO:util:Train intra-class: 0.1305769204334566, Train Inter-class: 0.3773865582725305
INFO:util:Val intra-class: 0.1123204351434035, Val Inter-class: 0.31193940042327534
INFO:util:Test intra-class: 0.14803360666984905, Test inter-class: 0.38580807192647343
INFO:plot:Saving t-SNE to file: figures/tsne.png
INFO:plot:Saving t-SNE to file: figures/tsne.png
INFO:plot:Saving tree to file: figures/tree-0.pdf
INFO:plot:Saving tree to file: figures/tree-1.pdf
0 1023 0.501868 0 0.613544 0.407531 1023 0.0299968 3.47996 0 7 2 1023 1.33797
INFO:gp:1 765 0.519066 1 0.629001 0.379208 765 0.0333159 3.54057 1 10 2 765 1.40244
INFO:gp:2 802 0.527865 2 0.64198 0.420283 802 0.0367809 3.72825 2 12 2 802 1.51643
INFO:gp:3 758 0.540622 3 0.64198 0.431184 758 0.0400244 3.95406 3 13 2 758 1.75722
INFO:gp:4 767 0.548524 4 0.659949 0.408758 767 0.0440856 4.05279 4 14 2 767 1.80668
INFO:gp:5 774 0.55574 5 0.659949 0.407226 774 0.0463681 4.08993 5 14 2 774 1.81601
INFO:gp:6 770 0.562576 6 0.666257 0.402521 770 0.0478876 4.07038 6 12 2 770 1.66175
INFO:gp:7 774 0.567189 7 0.679979 0.415782 774 0.0495709 3.92278 7 11 1 774 1.48334
INFO:gp:8 759 0.575886 8 0.679979 0.434176 759 0.0488082 3.82111 8 11 1 759 1.39271
INFO:gp:9 771 0.579987 9 0.679979 0.422666 771 0.049332 3.652 9 9 1 771 1.3346
INFO:gp:10 775 0.58472 10 0.679979 0.423257 775 0.051532 3.65787 10 11 2 775 1.31249
INFO:gp:11 783 0.58861 11 0.679979 0.425068 783 0.0493266 3.66862 11 9 2 783 1.31264
INFO:gp:12 770 0.595577 12 0.679979 0.43038 770 0.0481819 3.69501 12 13 2 770 1.40827
INFO:gp:13 766 0.592787 13 0.68333 0.413487 766 0.0521365 3.61095 13 9 2 766 1.2534
INFO:gp:14 801 0.596988 14 0.684711 0.423177 801 0.0473713 3.71261 14 11 2 801 1.37265
INFO:gp:15 756 0.600406 15 0.684711 0.41957 756 0.049307 3.75758 15 9 2 756 1.344
INFO:gp:16 781 0.60127 16 0.684711 0.438588 781 0.0489692 3.81329 16 11 2 781 1.44679
INFO:gp:17 777 0.601838 17 0.710934 0.450248 777 0.0492369 3.94721 17 10 1 777 1.50566
INFO:gp:18 768 0.604534 18 0.710934 0.405953 768 0.0498751 3.80547 18 13 1 768 1.4525
INFO:gp:19 777 0.604765 19 0.710934 0.410136 777 0.0490415 3.8348 19 12 2 777 1.49521
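The per-generation table above (avg/gen/max/min/nevals/std for both fitness and size) matches the logbook layout printed by DEAP's MultiStatistics. A minimal sketch of how such a log is typically produced, assuming DEAP's eaSimple loop; the population, toolbox, and hall of fame are placeholders, and the repository's actual evolutionary loop and operator rates may differ:

import numpy as np
from deap import algorithms, tools

# Track fitness values and tree sizes, mirroring the two column groups in the log.
stats_fit = tools.Statistics(key=lambda ind: ind.fitness.values)
stats_size = tools.Statistics(key=len)
mstats = tools.MultiStatistics(fitness=stats_fit, size=stats_size)
mstats.register("avg", np.mean)
mstats.register("std", np.std)
mstats.register("min", np.min)
mstats.register("max", np.max)

# Hypothetical call: pop, toolbox, and hof are assumed to be built elsewhere
# (e.g. a population of 1023 individuals, per the nevals column at generation 0);
# cxpb/mutpb are illustrative values only.
pop, logbook = algorithms.eaSimple(
    pop, toolbox, cxpb=0.8, mutpb=0.2, ngen=500,   # --generations now 500
    stats=mstats, halloffame=hof, verbose=True,    # verbose=True prints the table above
)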