Tutorial/examples for using docker-compose code as a python module in a python script #4542
Comments
Hi @pdonorio, the reason there is no tutorial or documentation about using docker-compose as a library is that we do not officially support this use case, or make any guarantees of a stable API of any kind. If you choose to do so despite this, please be mindful of the risk it entails.
Hi, thanks for the clarification. Maybe it is just my opinion, but that risk you underlined is kind of part of the Docker ecosystem core, right? By trying to always be innovative, Docker has broken logic and behaviour a few times already; I know that from my own experience. Since compose uses versioning for normal usage of the tool, wouldn't it be a solution to do the same for this use case? You could provide an example of loading a project into Python with your library and warn developers to stick to a package version to avoid disruption. If that is not what you think, at least consider adding your statement to the docs in a "for developers" section ;)
Even a "for developers" section would be great, being able to directly """"compose up"""" a service would be amazing. |
Here is how I've used it:
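A rough sketch of this kind of direct use of the compose internals (not the exact snippet; unsupported, compose 1.x era, and `get_project` / `Project` are internal names that may change between releases):

```python
# Sketch only: these are docker-compose internals, not a supported API.
from compose.cli.command import get_project

# Build a Project from the docker-compose.yml in the current directory.
project = get_project(project_dir=".")

# Create and start the containers, roughly what `docker-compose up -d` does.
project.up()

# List running containers belonging to this project.
for container in project.containers():
    print(container.name, container.human_readable_state)
```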
Just basic functionality access would be great. We could embrace a tighter integration than just ugly forking of system processes and parsing their stdout.
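For reference, the subprocess workaround being described looks roughly like this (a minimal sketch using the standard CLI flags):

```python
import subprocess

# Shell out to the docker-compose binary and parse its stdout:
# `ps -q` prints one container ID per line.
result = subprocess.run(
    ["docker-compose", "-f", "docker-compose.yml", "ps", "-q"],
    capture_output=True, text=True, check=True,
)
container_ids = result.stdout.split()
print(container_ids)
```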
Hi all, just for the record: since this piece of code is starting to work (thank you @fruch!) I thought I'd share it here in case it helps. It works with:

```
pip3 install rapydo-controller==0.5.2
# very beta release, stick with this version as new ones may change API
```

Then, if you run from a folder where you have a docker-compose.yml file:

```python
>>> from controller.compose import Compose
>>> dc = Compose(files=['docker-compose.yml'])
>>> options = {
        '-q': None
    }
>>> dc.command('ps', options)
2017-07-28 15:44:31,428 [INFO controller.compose:89] Requesting within compose: 'ps'
Name   Command   State   Ports
------------------------------
2017-07-28 15:44:31,464 [VERY_VERBOSE controller.compose:103] Executed compose ps w/{'-q': None, 'SERVICE': []}
```

You can change the command and the options for any compose command.
Cool, nice to see I've helped in some way. It would make it easier to understand and to add/try stuff. If compose is treated as a command-line-only tool by the development team, we might want to stick to that...
@shin- can we have a bit more info on why this one was closed?
The original question was answered here: #4542 (comment). As this is a use case we don't want to provide support for going forward, I consider this to be resolved. Of course, Compose remains an open source project and you're welcome to leverage it in any way you find useful!
Fair enough, thank you.
This patch introduces the usage of the docker-compose library to perform the `up` and `down` commands. It avoids forking subprocesses, and the pytest fixture can use the compose.Project object to deal with the docker-compose.yml configuration and your containers. In order to replace subprocess calls with calls into the docker-compose library, the current tests need to be updated. This patch replaces the mock of subprocess with direct calls to docker-compose. I hope that the GitHub CI supports Docker so that these tests can run. Tests for DockerComposeExecutor are removed; the next patch will rework the DockerComposeExecutor and Services classes.
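A hypothetical sketch of what such a fixture could look like, assuming the unsupported compose 1.x internals (`get_project`, `Project.up`, `Project.down`, `ImageType`); the actual patch may differ:

```python
import pytest

from compose.cli.command import get_project
from compose.const import ImageType


@pytest.fixture(scope="session")
def compose_project():
    # Load docker-compose.yml from the current directory (an assumption for this sketch).
    project = get_project(project_dir=".")
    project.up()  # roughly what `docker-compose up -d` does
    yield project
    # Tear down containers and networks; keep images and volumes.
    project.down(ImageType.none, include_volumes=False)
```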
@shin- It seems like it would be a pretty minimal inconvenience to support this use case. You already have a public interface; it seems like it could be relatively straightforward to expose it as a library (with the CLI calling directly into it), and the library could have all of the same semantics and stability guarantees as the CLI. Of course, I'm not a contributor; just throwing this out there as something to think about.
For anyone interested, I made a lib that allows you to call docker commands. Behind the scenes, it calls subprocess. A minimal example:
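A sketch along the lines of the project's documentation (parameter names may have changed between releases):

```python
from python_on_whales import DockerClient

# Point the client at a compose file; it shells out to the docker CLI underneath.
docker = DockerClient(compose_files=["./docker-compose.yml"])

docker.compose.up(detach=True)
print(docker.compose.ps())
docker.compose.down()
```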
The docs: https://gabrieldemarmiesse.github.io/python-on-whales |
Hi @shin-. It has been a while, but I am very curious about why this is not supported, though docker and swarm are.
Now that docker compose is a subcommand of docker (i.e., we have to run "docker compose ..." instead of "docker-compose"), it seems reasonable to expect that the docker-py module would provide a "compose" API. Any plans?
Hi there,
I've been looking quite a lot for this and can't find any answer, neither in the documentation nor anywhere else, so I'm opening an issue to ask for directions.
I need to write a good Python script as a helper to get started with my framework, which is heavily based on Docker containers. I am looking into docker-py, which is quite great since it also integrates with swarm mode. That said, the problem starts when using docker-compose YAML files inside the framework and allowing the developer to add and customize theirs.
So docker-py doesn't read any YAML, and it doesn't help in validating it or getting pieces of it in a standard way, as docker specifies.
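To make the gap concrete, docker-py works well at the container level, but nothing in it takes a compose file as input (a minimal sketch of typical docker-py usage):

```python
import docker

client = docker.from_env()

# Container-level operations are well covered by docker-py...
output = client.containers.run("alpine", ["echo", "hello"], remove=True)
print(output)

# ...but there is no client.compose or YAML loader: services defined in a
# docker-compose.yml have to be translated into these calls by hand.
```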
The idea that came to mind was to use docker-compose library functions to do what I need. I wrote some code a few months ago based on the above, and it kinda worked. Shouldn't there be some guidelines/examples for those interested in wrapping, extending, or using docker-compose Python functions inside their own Python code?
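For instance, loading and validating a compose file through the library might look roughly like this (a sketch against the compose 1.x internals; `config.find` / `config.load` are not a public, stable API and their signatures may change):

```python
# Sketch only: compose internals, unsupported and unversioned as an API.
from compose.config import config
from compose.config.environment import Environment

base_dir = "."
environment = Environment.from_env_file(base_dir)

# Locate and parse/validate the compose file the same way the CLI does;
# an invalid file raises compose.config.errors.ConfigurationError.
details = config.find(base_dir, ["docker-compose.yml"], environment)
cfg = config.load(details)

for service in cfg.services:
    print(service["name"], service.get("image"))
```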
Thanks for your great work and any support.