A brand-new Codespace environment will:
- install VS Code extensions and Python modules
- create an Ollama container with a single SMALL ;-) Large Language Model
- call the Ollama completion service and save the output to a file
- stop the container
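The steps above can be sketched as a shell script. This is a minimal illustration, not the repository's actual setup script: the container name, port, model (`tinyllama`), and prompt are all assumptions.

```shell
#!/bin/sh
# Sketch of the Codespace lifecycle: run Ollama, pull a small model,
# request one completion, save it, then stop the container.
# All names here (ollama, tinyllama, OLLAMA.md) are illustrative assumptions.

docker run -d --name ollama -p 11434:11434 ollama/ollama

# Pull a small model inside the container
docker exec ollama ollama pull tinyllama

# Call the completion endpoint and extract the "response" field as plain text
curl -s http://localhost:11434/api/generate \
  -d '{"model": "tinyllama", "prompt": "Say hello", "stream": false}' \
  | python3 -c 'import json,sys; print(json.load(sys.stdin)["response"])' \
  > OLLAMA.md

# Stop the container once the output is saved
docker stop ollama
```

With `"stream": false`, Ollama returns a single JSON object, so the response can be parsed in one pass instead of line by line.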
How to test it:
- fork this repository
- create a new Codespace; give it 4 cores to make it faster
- press Ctrl+Shift+P (Cmd+Shift+P on macOS) and run "Codespaces: View Creation Log" to check the test progress
- wait until the OLLAMA.md file appears (it should within ~3 minutes)
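If you prefer the terminal over the creation log, you can also wait for the file from a shell. A small sketch (the 3-minute figure above is approximate, so the loop simply polls until the file exists):

```shell
#!/bin/sh
# Poll until OLLAMA.md is written by the setup steps, then print it.
until [ -f OLLAMA.md ]; do
  sleep 5
done
cat OLLAMA.md
```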