LLM-based modification of documentation in a code repository or file.
Step 1: Start an instance of the MLflow AI Gateway
- Open a new terminal window and cd into the directory referenced above. This terminal window will serve the gateway on localhost and must remain open while you are using the service.
- Install MLflow with the gateway extras via
pip install 'mlflow[gateway]'
- Expose your OpenAI API key as an environment variable via
export OPENAI_API_KEY=XYZ
- Start the gateway
mlflow deployments start-server --config-path ./gateway/config.yaml
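The config file passed above defines which routes the gateway serves; its exact contents live in ./gateway/config.yaml in this example. Once the server reports that it is running, one quick way to confirm it picked up the configuration is to list its endpoints with MLflow's Python deployments client. This is a minimal sketch, assuming the server is reachable at its default address, http://localhost:5000 (adjust if your config changes the host or port):

    # Sanity check: list the endpoints the gateway loaded from ./gateway/config.yaml.
    # Assumes the default server address http://localhost:5000.
    from mlflow.deployments import get_deploy_client

    client = get_deploy_client("http://localhost:5000")
    for endpoint in client.list_endpoints():
        print(endpoint)

If the route you intend to use (e.g. chat-gpt-4 in the next step) does not appear in the output, check the config file before continuing.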
Step 2: Run the documentation-modification script
- With the gateway still running, open a separate terminal window and point the script at the file to modify and the gateway route it should use:
python cli.py --read-file-path=/path/to/file --overwrite=true --gateway-route-name=chat-gpt-4
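Under the hood, the script sends the file's contents to the route named by --gateway-route-name and writes the model's rewrite back out. The snippet below is a hypothetical illustration of such a query using MLflow's deployments client, not the actual cli.py implementation; it assumes chat-gpt-4 is a chat-type route defined in ./gateway/config.yaml and that the gateway is on http://localhost:5000:

    # Hypothetical illustration of querying the gateway route; cli.py may differ.
    from mlflow.deployments import get_deploy_client

    client = get_deploy_client("http://localhost:5000")
    response = client.predict(
        endpoint="chat-gpt-4",  # route name passed via --gateway-route-name
        inputs={
            "messages": [
                {"role": "user", "content": "Rewrite this documentation for clarity: ..."}
            ]
        },
    )
    print(response)

Routing the request through the gateway keeps the OpenAI credentials on the server side, so the script itself only needs the gateway's address and a route name.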