Documentation for repository structure #956
It certainly is possible to run the Minigo pipeline locally, though I have personally never done so :) Your understanding of the codebase is correct: the selfplay is all done in C++ by `concurrent_selfplay`.
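If it helps, the Python wrappers essentially just shell out to that C++ binary. Below is a minimal sketch of the idea; the binary path and every flag name in it are illustrative assumptions rather than the actual `concurrent_selfplay` interface, so check the flag definitions under `cc/` before relying on them.

```python
# Rough sketch of how a Python wrapper can drive the C++ selfplay binary.
# The binary path and flag names below are illustrative assumptions, not the
# actual concurrent_selfplay interface; check the flag definitions in cc/.
import subprocess

SELFPLAY_BINARY = "bazel-bin/cc/concurrent_selfplay"  # assumed build output path


def run_selfplay(model_path, output_dir, num_games=16):
    """Launch one batch of C++ selfplay games and wait for them to finish."""
    cmd = [
        SELFPLAY_BINARY,
        f"--model={model_path}",       # assumed flag name
        f"--output_dir={output_dir}",  # assumed flag name
        f"--num_games={num_games}",    # assumed flag name
    ]
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    run_selfplay("models/000000-bootstrap", "data/selfplay/000000")
```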
Please let us know how you get on or if you have any questions; we'll be happy to help. Good luck!
So, just to confirm: if I run the instructions from … with … (found in …), in theory I would at least be barking up the right tree?
Yep, that looks like the correct tree. I do recommend trying 9x9 before 19x19 though; it's around 10x faster.
You may also want to change …
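For what it's worth, here is a rough sketch of what switching to 9x9 could look like. I believe the C++ code fixes the board size at compile time via a bazel define and the Python code picks it up from a `BOARD_SIZE` environment variable, but treat both as assumptions and double-check the README and the flag files.

```python
# Sketch of configuring the pipeline for 9x9 (roughly 10x faster than 19x19).
# Both the bazel define and the BOARD_SIZE environment variable are assumptions
# about how the board size is selected; verify them against the repo docs.
import os
import subprocess

BOARD_SIZE = 9

# C++ side: the board size is assumed to be fixed at compile time.
subprocess.run(
    [
        "bazel",
        "build",
        f"--define=board_size={BOARD_SIZE}",  # assumed define name
        "//cc:concurrent_selfplay",
    ],
    check=True,
)

# Python side: the training and selfplay code is assumed to read the board
# size from the environment.
os.environ["BOARD_SIZE"] = str(BOARD_SIZE)
```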
Hi
Firstly, thanks for all the hard work on the AlphaZero reproduction!
Is it possible to run the C++ implementation locally (without a Google Cloud cluster)?
I'm trying to run Minigo C++ locally, mostly for learning purposes. After exploring the repo, it's not completely clear to me what the purpose of the different folders is, how the Python and C++ implementations interact, and what the main entry point for training locally would be.
So far I have figured out (perhaps incorrectly):

- `cc` - is this a fully standalone implementation? Can it be used to train a model by using `concurrent_selfplay` as an entry point? Does it talk to Python in any way?
- `cluster` - looks like Kubernetes stuff; can it be ignored when running locally?
- `ml_perf` - the script `start_selfplay.sh` calls the C++ `concurrent_selfplay`, but `train.py` calls Python? I guess this is an MLPerf wrapper and is not required when running locally?
- `rl_loop` - looks like some kind of wrapper?
- `.py` files in the `minigo` folder - looks like the Python implementation? Is it fully independent, or does it talk to C++ in any way?

My current overall theory is that self-play (and test games?) can be run with either C++ or Python, but the neural network can only be trained in Python; the wrappers take care of switching between the two at the right times?
It seems I need to bootstrap once, then self-play and train in a loop. The follow-up issue is that there are multiple bootstrap, selfplay, and train scripts across the repository, some of them wrappers around others, and it is not obvious to me which folder contains the "master" training loop for local C++ end-to-end execution (if that is possible at all).
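To make the question concrete, the loop I imagine looks roughly like the sketch below. Every script name, path, and flag in it is a guess on my part from skimming the repo, not a verified interface, so corrections are very welcome.

```python
# Hypothetical outline of a local end-to-end loop: bootstrap once, then
# alternate C++ selfplay with Python training.  All names and flags here are
# guesses, not the repository's actual interfaces.
import subprocess


def run(cmd):
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)


# 1. Bootstrap an initial random model (Python).
run(["python", "bootstrap.py", "--export_path=models/000000"])

for generation in range(1, 10):
    prev_model = f"models/{generation - 1:06d}"    # model from the last generation
    games_dir = f"data/selfplay/{generation:06d}"  # where the new games land

    # 2. Selfplay with the C++ engine built from cc/.
    run([
        "bazel-bin/cc/concurrent_selfplay",
        f"--model={prev_model}",
        f"--output_dir={games_dir}",
    ])

    # 3. Train the next network in Python on the fresh selfplay games.
    run(["python", "train.py", games_dir, f"--export_path=models/{generation:06d}"])
```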
Thanks in advance; I will keep digging in the meantime.