PANDORA ABC

Intro

Simple readme explaining how to run ABC on various architectures (at least MN4, tentatively Nord3 too) with the ceeculture module of Pandora.

Dependencies

Internal Tools

  • Obviously you need pandora and ceeculture.
  • You will need the script that computes the score of the experiment:
    • for this version of ceeculture we use an Rscript included in ceeculture:
       ln -s ~/ceeculture/AnalyseTools/computeScore.R .
  • We don't need anything else: greasy also works well with SLURM, so mn4_manual_scheduling is no longer needed (it used to be linked with ln -s ~/mn_tools/mn4_manual_scheduling.sh .).

Usage example

So more or less you have everything. Obviously that won't work yet, but let's imagine it will; we will run the experiments in a new folder. Don't forget that ABC needs a lot of space, as it will store millions of folders, config files, and results. It may be wise to use a place like $SCRATCH for your experiments (if you're on MareNostrum; if not, do what you want, it's your hard drive).

let's do a new folder:

mkdir $SCRATCH/test 
cd $SCRATCH/test

If you haven't done it yet you can git clone the tools here:

git clone https://framagit.org/sc/abc-pandora/
cd abc-pandora
git checkout mn-dev #We suppose you are on marenostrum
cd ..
ln -s abc-pandora/* .
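Symlinking with a relative source like this only works if the links are created from the experiment folder itself, so it can be worth checking that each link actually resolves. A minimal self-contained sketch of that check, run in a throwaway directory (main_abc.py here is just a stand-in file, not the real script):

```shell
# Demonstrate the symlink pattern from above in a throwaway directory.
tmp=$(mktemp -d)
cd "$tmp"
mkdir abc-pandora
touch abc-pandora/main_abc.py   # stand-in for the real repository contents
ln -s abc-pandora/* .
# A link is usable only if it is a symlink AND its target exists:
for link in *; do
  if [ -L "$link" ] && [ -e "$link" ]; then echo "ok: $link"; fi
done
```

If a link prints nothing here, it is dangling and the experiment scripts will fail when they try to use it.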

If you have already installed everything somewhere else:

ln -s ${HOME}/pandora_abc/* .

We are mostly ready; in theory you can now run ./main_abc.py. If you are on MareNostrum, don't run it on the login{1-3} nodes, or your process will quickly be killed. You can ask for a node or run an interactive job:

salloc -p interactive  
python ./main_abc.py numParticule numpart  numproc_node epsilon

where

  • numParticule: total number of particles (aka thetas, aka parameter sets) used to draw your distribution (the bigger, the better)
  • numpart: number of particles generated and checked at the same time
  • numproc_node: number of parallel tasks run on one MareNostrum node (should be < numpart: the numpart particles generated by the Python script are split and run on separate nodes)
  • epsilon: the maximum (or minimum) score accepted for a particle, i.e. all particles should have a score < epsilon
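To make the constraint between those arguments concrete, here is a hedged sketch of a launch wrapper that enforces numproc_node < numpart before calling the script (the values 1000/48/16/0.05 are made up for illustration; main_abc.py's real argument handling may differ):

```shell
# Hypothetical argument values, for illustration only.
numParticule=1000; numpart=48; numproc_node=16; epsilon=0.05
# Enforce the constraint stated above before launching anything.
if [ "$numproc_node" -ge "$numpart" ]; then
  echo "error: numproc_node ($numproc_node) must be < numpart ($numpart)" >&2
  exit 1
fi
# echo instead of actually launching, so the sketch runs anywhere:
echo "python ./main_abc.py $numParticule $numpart $numproc_node $epsilon"
```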

Now that greasy turns out to be available, you can just use the 2mn4.sh script in the folder and do:

sbatch main.job-mn4 numParticule numpart  numproc_node epsilon

Test the results

Once you are satisfied with a list of particles with epsilon < N, you can rerun more experiments using those thetas. To do so, use the script rerun.py:

python rerun.py numParticule numpart_pernode pref epsilon

This will create numParticule/numpart_pernode taskfiles and a folder pref-rerun with all the experiments listed in the taskfiles.
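For the numbers used in the full setup below (10000 particles, 768 per node), that count works out as follows; note the ceiling division is my assumption about how rerun.py rounds the remainder:

```shell
# Taskfile count = ceil(numParticule / numpart_pernode).
# The rounding-up is an assumption; rerun.py may handle the remainder differently.
numParticule=10000
numpart_pernode=768
ntasks=$(( (numParticule + numpart_pernode - 1) / numpart_pernode ))
echo "$ntasks taskfiles"
```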

You can then simply submit the taskfiles on MareNostrum via greasy using rerun.job-mn4.

The full setup would be:

python rerun.py 10000 768 48 "tenthousand" result_0.0101072379299.csv #generate a bunch of taskfiles
for i in `seq 1 10`; do sbatch rerun.job-mn4 rerun-tenthousand.task.$i ; done #submit them all in one shot
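A slightly more robust variant of that loop globs whatever taskfiles rerun.py actually produced instead of hard-coding seq 1 10; sketched here in a throwaway directory, with echo standing in for sbatch so it runs anywhere (the rerun-<pref>.task.<i> naming follows the example above):

```shell
# Throwaway demo directory; pretend rerun.py produced two taskfiles.
tmp=$(mktemp -d); cd "$tmp"
touch rerun-tenthousand.task.1 rerun-tenthousand.task.2
# Submit each taskfile that actually exists (echo in place of sbatch):
for f in rerun-tenthousand.task.*; do
  [ -f "$f" ] && echo "sbatch rerun.job-mn4 $f"
done
```

This way nothing is submitted for taskfile numbers that were never generated, and you don't have to know the count in advance.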
