
Tutorial/examples for using docker-compose code as a python module in a python script #4542

Closed
pdonorio opened this issue Feb 28, 2017 · 14 comments

@pdonorio
Hi there,
I've been looking for this quite a lot and can't find any answer, in the documentation or anywhere else, so I'm opening an issue to ask for directions.

I need to write a good Python script as a helper to get started with my framework, which is heavily based on Docker containers. I've been looking into docker-py, which is quite great since it also integrates with swarm mode.

That said, the problem starts when using docker-compose YAML files inside the framework and allowing developers to add and customize their own.
docker-py doesn't read YAML at all, and it doesn't help with validating the files or extracting pieces of them in a standard way, as Docker specifies.

The idea that came to mind was to use docker-compose library functions to do what I need. I wrote some code a few months ago based on that approach, and it kind of worked. Shouldn't there be some guidelines/examples for those interested in wrapping, extending or using docker-compose Python functions inside their own code?
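(As a stopgap for the YAML-reading gap mentioned above, a compose file can at least be loaded and minimally inspected with PyYAML. A minimal sketch, assuming PyYAML is installed; note this does no compose-schema validation the way docker-compose itself does, and the file contents are illustrative.)

```python
# Sketch: inspecting a compose file without docker-py, using PyYAML.
# Assumption: PyYAML is installed. No compose-schema validation is done.
import yaml

COMPOSE_YAML = """
version: '2'
services:
  web:
    image: nginx:alpine
    ports:
      - "8080:80"
  db:
    image: postgres:9.6
"""

config = yaml.safe_load(COMPOSE_YAML)

# Pull out the service names and their images.
services = config.get("services", {})
for name, spec in services.items():
    print(name, spec.get("image"))
```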

Thanks for your great work and any support.

@shin-

shin- commented Feb 28, 2017

Hi @pdonorio ,

The reason there is no tutorial or documentation about using docker-compose as a library is that we do not officially support this use case or make any guarantees of a stable API. If you choose to do so anyway, please be mindful of the risk it entails.

@pdonorio
Author

pdonorio commented Mar 1, 2017

Hi, thanks for the clarification.

Maybe it's just my opinion, but isn't the risk you underlined kind of part of the core of the Docker ecosystem? By trying to stay innovative, Docker has broken logic and behaviour a few times already; I've learned that the hard way.

Since Compose uses versioning for normal usage of the tool, wouldn't it be a solution to do the same for this use case? You could provide an example of loading a project into Python with your library and warn developers to pin a package version to avoid disruption.

If that's not what you have in mind, at least consider adding your statement to the docs in a "for developers" section ;)

@matheus1lva

Even a "for developers" section would be great; being able to directly "compose up" a service would be amazing.

@fruch

fruch commented Mar 28, 2017

Here is how I've used it.
I guess there are better ways to do it; comments are welcome:
https://github.com/fruch/doorman/blob/master/tests/integration/conftest.py

@jairov4

jairov4 commented Jun 14, 2017

Just basic functionality access would be great. We could embrace a tighter integration rather than forking system processes and parsing their stdout.
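(For reference, the "fork and parse stdout" approach being discussed can be sketched in a few lines. The helper names `build_compose_cmd` and `compose_ps` are hypothetical, not part of any library, and running `compose_ps` of course requires docker-compose to be installed.)

```python
# Sketch of the subprocess approach: build the argument list, run it,
# parse stdout. Helper names are illustrative, not a real library's API.
import subprocess

def build_compose_cmd(subcommand, *args, compose_file="docker-compose.yml"):
    """Assemble a docker-compose invocation as an argument list."""
    return ["docker-compose", "-f", compose_file, subcommand, *args]

def compose_ps(compose_file="docker-compose.yml"):
    """Return the container IDs of the project's services (one per line)."""
    out = subprocess.check_output(
        build_compose_cmd("ps", "-q", compose_file=compose_file)
    )
    return out.decode().split()
```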

@pdonorio
Author

Hi all, just for the record,
I opened this issue since we have many projects depending on Docker AND docker-compose, so I was trying to create a Python package to control these environments easily for us, but most of all for our users (the idea being that they don't necessarily need to know how to interact with Docker; we take that on our shoulders).

Since this piece of code is starting to work (thank you @fruch!), I thought I'd share it here in case it helps. It works with Python 3.4+; here's a quick start:

pip3 install rapydo-controller==0.5.2
# very beta release, stick with this version as new ones may change API

Then if you run from a folder where you have a docker-compose.yml:

>>> from controller.compose import Compose

>>> dc = Compose(files=['docker-compose.yml'])
>>> options = {'-q': None}
>>> dc.command('ps', options)

2017-07-28 15:44:31,428 [INFO    controller.compose:89] Requesting within compose: 'ps'
Name   Command   State   Ports
------------------------------
2017-07-28 15:44:31,464 [VERY_VERBOSE controller.compose:103] Executed compose ps w/{'-q': None, 'SERVICE': []}

You can change the command and the options for any docker-compose command. Beware that the options have no defaults: you must specify each of them for each command...
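(The "no defaults" caveat comes from docker-compose's docopt-style command handlers, which expect every option key to be present in the dict. One way to cope is to keep a per-command defaults map and merge overrides onto it; a minimal sketch, where the `'-q'` and `'SERVICE'` keys are the ones visible in the log output above, and any further keys would have to be copied from the CLI's usage strings.)

```python
# Sketch: keep a full defaults dict per compose command and merge user
# overrides onto it, since the handlers expect every docopt key present.
# Keys below ('-q', 'SERVICE') are taken from the log output above.
PS_DEFAULTS = {"-q": None, "SERVICE": []}

def ps_options(**overrides):
    """Merge overrides onto the complete set of expected 'ps' options."""
    return {**PS_DEFAULTS, **overrides}
```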

@fruch

fruch commented Jul 28, 2017

Cool, nice to see I've helped in some way.
The defaults part was really a pain; I moved to doing

subprocess.check_output("docker-compose ps".split())

which is easier to understand and easier for adding/trying stuff.

If Compose is treated as a command-line-only tool by the development team, we might want to stick to that...

@fruch

fruch commented Oct 18, 2017

@shin- can we have a bit more info on why this one was closed?

@shin-

shin- commented Oct 18, 2017

The original question was answered here: #4542 (comment)

As this is a use case we don't want to provide support for going forward, I consider this to be resolved. Of course, Compose remains an open source project and you're welcome to leverage it in any way you find useful!

@fruch

fruch commented Oct 18, 2017

Fair enough, thank you.

nfk referenced this issue in nfk/pytest-docker Mar 1, 2018
This patch introduces the use of the docker-compose library to perform
the `up` and `down` commands. It avoids forking subprocesses, and the
pytest fixture can use the compose.Project object to deal with the
docker-compose.yml configuration and your containers.

Replacing the subprocess calls with the docker-compose library requires
updating the current tests. This patch replaces the mock of subprocess
with a direct call to `docker-compose`. I hope that the GitHub CI
supports Docker so these tests can run.

Tests for DockerComposeExecutor are removed; the next patch will
rework the DockerComposeExecutor and Services classes.
@ns-cweber

@shin- It seems like it would be a pretty minimal inconvenience to support this use case. You already have a public interface; it seems like it could be relatively straightforward to expose it as a library (with the CLI calling directly into it), and the library could have the same semantics and stability guarantees as the CLI. Of course, I'm not a contributor; just throwing this out there as something to think about.

@gabrieldemarmiesse

For anyone interested, I made a lib that allows you to call docker commands. Behind the scenes, it calls subprocess.

A minimal example:

from python_on_whales import docker

docker.compose.build()
docker.compose.up(detach=True)

list_of_containers = docker.compose.ps()

docker.compose.down()

The docs: https://gabrieldemarmiesse.github.io/python-on-whales

@kchawla-pi

Hi @shin-. It has been a while, but I am very curious about why this is not supported, though docker and swarm are.

@schollii

Now that Docker Compose is a subcommand of docker (i.e. we run "docker compose ..." instead of "docker-compose"), it seems reasonable to expect the docker-py module to provide a "compose" API. Any plans?
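(Until such an API exists, Compose v2 can still be driven from Python through the docker CLI; `docker compose ps --format json` emits machine-readable output. A sketch, assuming Docker with the Compose v2 plugin; note the format has varied across Compose releases between a JSON array and one JSON object per line, so the parser below accepts both. The `sample` string is fabricated for illustration.)

```python
# Sketch: drive Compose v2 via the docker CLI and parse its JSON output.
# Assumes a Docker installation with the Compose v2 plugin.
import json
import subprocess

def compose_ps_v2():
    out = subprocess.check_output(
        ["docker", "compose", "ps", "--format", "json"]
    ).decode()
    # Newer Compose releases emit one JSON object per line (NDJSON);
    # older ones emit a single JSON array. Accept either shape.
    try:
        data = json.loads(out)
        return data if isinstance(data, list) else [data]
    except json.JSONDecodeError:
        return [json.loads(line) for line in out.splitlines() if line.strip()]

# The parsing side can be exercised without a Docker daemon:
sample = '[{"Name": "web-1", "State": "running"}]'
containers = json.loads(sample)
```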
