A client library for accessing Longship API
First, create a client:
```python
from longship_api_client import Client

client = Client(base_url="https://api.example.com")
```
If the endpoints you're going to hit require authentication, use `AuthenticatedClient` instead:
```python
from longship_api_client import AuthenticatedClient

client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")
```
Now call your endpoint and use your models:
```python
from longship_api_client.models import MyDataModel
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.types import Response

my_data: MyDataModel = get_my_data_model.sync(client=client)
# or if you need more info (e.g. status_code)
response: Response[MyDataModel] = get_my_data_model.sync_detailed(client=client)
```
Or do the same thing with an async version:
```python
from longship_api_client.models import MyDataModel
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.types import Response

my_data: MyDataModel = await get_my_data_model.asyncio(client=client)
response: Response[MyDataModel] = await get_my_data_model.asyncio_detailed(client=client)
```
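The `await` calls above must run inside a coroutine. Here is a minimal sketch of driving the async variant with the standard library, reusing the placeholder endpoint module `get_my_data_model` from the examples above:

```python
import asyncio

from longship_api_client import AuthenticatedClient
from longship_api_client.api.my_tag import get_my_data_model


async def main() -> None:
    client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")
    # asyncio() returns the parsed model (or None), just like sync()
    my_data = await get_my_data_model.asyncio(client=client)
    print(my_data)


asyncio.run(main())
```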
By default, when you're calling an HTTPS API it will attempt to verify that SSL is working correctly. Using certificate verification is highly recommended most of the time, but sometimes you may need to authenticate to a server (especially an internal server) using a custom certificate bundle.
```python
client = AuthenticatedClient(
    base_url="https://internal_api.example.com",
    token="SuperSecretToken",
    verify_ssl="/path/to/certificate_bundle.pem",
)
```
You can also disable certificate validation altogether, but beware that this is a security risk.
```python
client = AuthenticatedClient(
    base_url="https://internal_api.example.com",
    token="SuperSecretToken",
    verify_ssl=False,
)
```
There are more settings on the generated `Client` class which let you control more runtime behavior; check out the docstring on that class for more info.
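As an illustration, clients generated by openapi-python-client usually accept keyword arguments such as `timeout` and `headers`; the parameter names below are assumptions, so confirm them against the docstring of your generated `Client` before relying on them:

```python
import httpx

from longship_api_client import AuthenticatedClient

# NOTE: `timeout` and `headers` are assumed parameter names based on typical
# openapi-python-client output; check the generated Client docstring for the
# exact set of supported options.
client = AuthenticatedClient(
    base_url="https://api.example.com",
    token="SuperSecretToken",
    timeout=httpx.Timeout(30.0),
    headers={"User-Agent": "my-app/1.0"},
)
```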
Things to know:
- Every path/method combo becomes a Python module with four functions (see the sketch after this list):
  - `sync`: Blocking request that returns parsed data (if successful) or `None`
  - `sync_detailed`: Blocking request that always returns a `Response`, optionally with `parsed` set if the request was successful.
  - `asyncio`: Like `sync` but async instead of blocking
  - `asyncio_detailed`: Like `sync_detailed` but async instead of blocking
- All path/query params, and bodies become method arguments.
- If your endpoint had any tags on it, the first tag will be used as a module name for the function (`my_tag` above)
- Any endpoint which did not have a tag will be in `longship_api_client.api.default`
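For example, here is a sketch of the `*_detailed` flow referenced in the list above, assuming the generated `Response` exposes `status_code`, `parsed`, and `content` (as openapi-python-client clients typically do) and reusing the placeholder endpoint from earlier:

```python
from longship_api_client import AuthenticatedClient
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.models import MyDataModel
from longship_api_client.types import Response

client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")

# sync_detailed always returns a Response, even when the request fails
response: Response[MyDataModel] = get_my_data_model.sync_detailed(client=client)
if response.parsed is not None:
    my_data = response.parsed  # the parsed MyDataModel
else:
    # inspect the raw status and body when parsing did not succeed
    print(response.status_code, response.content)
```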
This project uses Poetry to manage dependencies and packaging. Here are the basics:
- Update the metadata in pyproject.toml (e.g. authors, version)
- If you're using a private repository, configure it with Poetry:
  - `poetry config repositories.<your-repository-name> <url-to-your-repository>`
  - `poetry config http-basic.<your-repository-name> <username> <password>`
- Publish the client with `poetry publish --build -r <your-repository-name>` or, if for public PyPI, just `poetry publish --build`
If you want to install this client into another project without publishing it (e.g. for development) then:
- If that project is using Poetry, you can simply do `poetry add <path-to-this-client>` from that project
- If that project is not using Poetry:
  - Build a wheel with `poetry build -f wheel`
  - Install that wheel from the other project: `pip install <path-to-wheel>`
This client was generated with openapi-python-client. To update it to the latest version of the Longship API, download the Swagger configuration to a JSON file and run the following in a directory one level above this repo:

`openapi-python-client update --path longship-api-client/fixtures/longship_24-06-23.json`