
[bug] Bug Report: 404 Error Using Guardrails Server with Azure OpenAI API #1170

Closed
OrenRachmil opened this issue on Nov 18, 2024 · 1 comment
Labels: bug (Something isn't working), duplicate (This issue or pull request already exists)

OrenRachmil commented:
Describe the bug
I am trying to use the Guardrails AI server framework with the Azure OpenAI API to validate LLM output. Despite adapting the example code for Azure OpenAI, I hit a 404 - {'detail': 'Not Found'} error. My setup, code, and debugging steps are below.

I started the Guardrails server with the following command:
guardrails start --config config.py
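For reference, config.py follows the pattern from the server docs: it defines and names the guard that the /guards/gibberish_guard/... route is derived from. A minimal sketch (the GibberishText hub validator here is a stand-in for whichever validator is actually installed):

from guardrails import Guard
from guardrails.hub import GibberishText

# The guard's name determines the server route:
# /guards/<name>/openai/v1/...
guard = Guard(name="gibberish_guard")
guard.use(GibberishText(on_fail="exception"))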
I used the following Python code to connect to the Guardrails server and proxy my Azure OpenAI API request:

from openai import AzureOpenAI
# Set up Azure OpenAI client with Guardrails server acting as the endpoint
client = AzureOpenAI(
    azure_endpoint="http://localhost:8000/guards/gibberish_guard/openai/v1",
    api_key="api-key",
    api_version="2024-09-01-preview"
)

# Send request through the Guardrails proxy
response = client.chat.completions.create(
    model="model name",
    messages=[{
        "role": "user",
        "content": "Make up some gibberish for me please!"
    }]
)

# Access the validated response
print(response.choices[0].message.content)
print(response.guardrails['validation_passed'])

When running this code, I get the following error:
openai.NotFoundError: Error code: 404 - {'detail': 'Not Found'}
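
As an additional debugging step, I probed the server directly to see which paths respond. A rough sketch (the /guards listing route and the exact chat-completions path are my guesses based on the endpoint format in the docs):

import requests

base = "http://localhost:8000"

# List registered guards (assumed route) to confirm gibberish_guard exists.
print(requests.get(f"{base}/guards").status_code)

# Hit the OpenAI-compatible route directly, bypassing the SDK entirely.
print(requests.post(
    f"{base}/guards/gibberish_guard/openai/v1/chat/completions",
    json={
        "model": "model name",
        "messages": [{"role": "user", "content": "hi"}],
    },
).status_code)

My suspicion is that the AzureOpenAI client rewrites the request path (it appends /openai/deployments/<model>/chat/completions to azure_endpoint), so the request never reaches the guard route above.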

Request for Assistance

  1. Is there an example of using Guardrails with the Azure OpenAI API, or are specific adjustments required to make them compatible?
  2. How can I configure Guardrails to handle Azure OpenAI properly with custom validation routes?
  3. I would also like to integrate the server with litellm to access models beyond OpenAI. Is there documentation or guidance for such an integration? (A sketch of what I have in mind follows this list.)
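
For context, the kind of direct, litellm-backed call I am hoping to make (this follows litellm's azure/<deployment> model-string convention and assumes Guardrails routes model strings through litellm; whether this is supported is exactly my question):

import os

from guardrails import Guard
from guardrails.hub import GibberishText

# litellm reads Azure credentials from these environment variables.
os.environ["AZURE_API_KEY"] = "api-key"
os.environ["AZURE_API_BASE"] = "https://<resource>.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2024-09-01-preview"

guard = Guard().use(GibberishText(on_fail="exception"))

# Calling the guard directly; "azure/<deployment>" is litellm's
# convention for Azure OpenAI deployments.
result = guard(
    model="azure/<deployment-name>",
    messages=[{"role": "user", "content": "Make up some gibberish for me please!"}],
)
print(result.validation_passed)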
OrenRachmil added the bug label on Nov 18, 2024.
CalebCourier added the duplicate label on Nov 21, 2024.
CalebCourier (Collaborator) commented:

Duplicate of #1159

@OrenRachmil since you're already active in the comments on the above issue, let's continue the conversation over there.
