
Problem accessing azure openai #60

Open · mladencucakSYN opened this issue Jan 21, 2024 · 16 comments

@mladencucakSYN

I am trying to use your package, but it is not possible for me because my company runs a private Azure OpenAI instance. It would be very helpful if you could adapt the way URLs are created; it should be possible to support different API versions. In your implementation, the problem seems to be the [URL construction](https://github.com/JamesHWade/gpttools/blob/42f140acd7d91c439bd04dad7f5e23ac67c7fc42/R/azure_openai.R).

The code below shows how I construct the URL for chat (I have not implemented the version for embeddings yet).

    # Connection details for our private Azure OpenAI deployment
    endpoint    <- Sys.getenv("GENAI_SYN_GPT4TURBO_ENDPOINT")
    deployment  <- Sys.getenv("GENAI_SYN_GPT4TURBO_DEPLOYMENT_NAME")
    api_version <- Sys.getenv("GENAI_SYN_GPT4TURBO_API_VERSION")

    # `api_key` is passed in as a function argument; fall back to the env var
    if (is.null(api_key)) {
      api_key <- Sys.getenv("GENAI_SYN_GPT4TURBO_KEY")
    }

    url <- glue::glue(
      "https://{endpoint}.openai.azure.com/openai/deployments/{deployment}/chat/completions?api-version={api_version}"
    )
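
For completeness, an embeddings URL would presumably follow the same pattern. This is an untested sketch: the `embeddings` task segment and the `GENAI_SYN_EMBEDDING_DEPLOYMENT_NAME` variable name are my assumptions, not something verified against our instance.

    # Sketch only: same construction, with the task segment swapped for embeddings.
    # The embeddings deployment env var name is hypothetical.
    embed_deployment <- Sys.getenv("GENAI_SYN_EMBEDDING_DEPLOYMENT_NAME")
    embed_url <- glue::glue(
      "https://{endpoint}.openai.azure.com/openai/deployments/{embed_deployment}/embeddings?api-version={api_version}"
    )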

JamesHWade mentioned this issue Jan 22, 2024
@JamesHWade (Owner)

Please let me know if you are able to get Azure OpenAI working with the latest version. You'll need to change the names of the environment variables in your example to match those listed here: https://jameshwade.github.io/gpttools/articles/azure.html.
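
A minimal sketch of that renaming, assuming the `AZURE_OPENAI_*` names that appear later in this thread are the documented ones (set them in `.Renviron` or via `Sys.setenv()`; the mapping below just mirrors the variables from the example above):

    # Sketch: map the GENAI_SYN_* variables onto the names gpttools expects.
    Sys.setenv(
      AZURE_OPENAI_ENDPOINT        = Sys.getenv("GENAI_SYN_GPT4TURBO_ENDPOINT"),
      AZURE_OPENAI_DEPLOYMENT_NAME = Sys.getenv("GENAI_SYN_GPT4TURBO_DEPLOYMENT_NAME"),
      AZURE_OPENAI_API_VERSION     = Sys.getenv("GENAI_SYN_GPT4TURBO_API_VERSION"),
      AZURE_OPENAI_KEY             = Sys.getenv("GENAI_SYN_GPT4TURBO_KEY"),
      AZURE_OPENAI_TASK            = "chat/completions"
    )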

@mladencucakSYN (Author)

Thanks James. Do you have a direct way (a function) to check access, please?

@JamesHWade (Owner)

The best approach for now is to use `gptstudio::gptstudio_sitrep()`. The setup is the same for gpttools, so if it works for gptstudio, it should work for gpttools.

Please let me know if it doesn't work. It's difficult for me to do much testing with Azure OpenAI since I don't have constant access to it.
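
A quick check along those lines, assuming the installed gptstudio version actually exports `gptstudio_sitrep()` (older releases may not, as seen further down this thread):

    # Print the situation report and confirm the Azure env vars are visible to R
    gptstudio::gptstudio_sitrep()
    Sys.getenv(c(
      "AZURE_OPENAI_ENDPOINT",
      "AZURE_OPENAI_DEPLOYMENT_NAME",
      "AZURE_OPENAI_API_VERSION",
      "AZURE_OPENAI_TASK"
    ))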

@mladencucakSYN (Author)

Hey @JamesHWade, thanks. This slipped my mind for a day or two.
I have aligned the names of the env variables as recommended, but no luck yet.
I understand your problem (the lack of access). Not sure how to proceed with debugging this. Perhaps if you link the function and its internals for constructing the API call here, I can inspect them? Or should I just fetch the repo? It would be nice to make this work, as you are doing a lot of interesting development here.

@mladencucak

Any progress on this?

@JamesHWade (Owner)

Not really. I don't have the ability to test Azure OpenAI, so it's very difficult for me to debug.

I'm working on getting access but don't have it yet.

@mladencucak

Sure. Maybe I can help if there is something specific you need for testing, or you can point me in some direction. I don't really have the time to figure out all the ins and outs of the package.
I'm sure they would sponsor API access for a package developer...

@JamesHWade (Owner)

Can you please give it another try? It now works for me (I got access to Azure OpenAI).

@mladencucak

I tried these functions:

    > gptstudio::gptstudio_sitrep()
    Error: 'gptstudio_sitrep' is not an exported object from 'namespace:gptstudio'

    > gpttools::gpt_sitrep()
    ── OpenAI API Status ──────────────────────────────────────────────────────────
    ✖ OpenAI API not validated.
    ── RStudio API ────────────────────────────────────────────────────────────────
    ✔ Addins likely with work, rstudioapi is available.
    ── Settings for gpttools ──────────────────────────────────────────────────────
    ℹ Max tokens set to
    ℹ Code style is set to

Then I tried running the addin, but it did not work:

    Loading required package: shiny
    Warning: package 'shiny' was built under R version 4.3.2

    Listening on http://127.0.0.1:4475
    Warning: Error in : Model name is not a valid character scalar
      85:
      78: gptstudio_skeleton_build
      76: observe
      75:
       4: shiny::runApp
       3: eval
       2: eval
       1: .rs.sourceWithProgress

I suppose you have specific model names that are considered valid strings? The name of our deployment is not the same as those on Azure's list.

@JamesHWade (Owner)

Can you share your environment variable names?

For Azure OpenAI we use environment variables to define the model name.

Thanks for your persistence! This is tricky to debug.
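
Since the addin error was "Model name is not a valid character scalar", one quick sanity check is to confirm that each variable resolves to a single non-empty string. The names below are the ones used elsewhere in this thread; whether gpttools reads exactly these is an assumption on my part.

    # Each value should be a length-one, non-empty character string
    vars <- c(
      "AZURE_OPENAI_ENDPOINT",
      "AZURE_OPENAI_DEPLOYMENT_NAME",
      "AZURE_OPENAI_API_VERSION",
      "AZURE_OPENAI_TASK",
      "AZURE_OPENAI_KEY"
    )
    vals <- Sys.getenv(vars)
    print(nzchar(vals))  # FALSE indicates an unset or empty variable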

@mladencucak

No, thank you! Does your Azure setup work for you? If this does not work, I might dig into the source code myself; it is hard for you to debug given our unusual naming, and who knows what is happening behind the scenes, since it is our private Azure instance. See my setup below.

    AZURE_OPENAI_TASK="chat/completions"
    AZURE_OPENAI_ENDPOINT="cprdit-openai-gpt4t"
    AZURE_OPENAI_DEPLOYMENT_NAME="cprdit-gpt4t"
    AZURE_OPENAI_KEY=""
    AZURE_OPENAI_API_VERSION="2023-07-01-preview"

@JamesHWade (Owner)

I think the issue might be with your endpoint environment variable. That should be a URL, something like this: https://YOUR_RESOURCE_NAME.openai.azure.com/

I'm also realizing that I don't have embedding models set up to use Azure OpenAI. The only two options are OpenAI and local embeddings.

@mladencucak

Like this?

    AZURE_OPENAI_ENDPOINT="https://cprdit-openai-gpt4t.openai.azure.com/"

I get the same warning. How about adding some print warnings to the Shiny app and publishing that in a dev branch?
Could you point me to the piece where you make the Azure calls?

@JamesHWade (Owner)

That is much longer than my URL, but I don't have access to see the backend at all, unfortunately.

You can see the httr2 calls here:

https://github.com/JamesHWade/gpttools/blob/main/R/stream-azure-openai.R

You can try something like this to help you debug. It should print the call to the console just before making it.

    debug_azure_openai <- function(prompt = NULL,
                                   use_token = as.logical(
                                     Sys.getenv("AZURE_OPENAI_USE_TOKEN", unset = "FALSE")
                                   )) {
      # Single user message in the chat-completions message format
      messages <- list(
        list(
          role = "user",
          content = prompt
        )
      )

      body <- list(
        stream = TRUE,
        messages = messages
      )

      # Build the request from the same env vars gpttools uses
      response <-
        httr2::request(Sys.getenv("AZURE_OPENAI_ENDPOINT")) |>
        httr2::req_url_path_append("openai/deployments") |>
        httr2::req_url_path_append(Sys.getenv("AZURE_OPENAI_DEPLOYMENT_NAME")) |>
        httr2::req_url_path_append(Sys.getenv("AZURE_OPENAI_TASK")) |>
        httr2::req_url_query("api-version" = Sys.getenv("AZURE_OPENAI_API_VERSION")) |>
        httr2::req_headers(
          "api-key" = Sys.getenv("AZURE_OPENAI_KEY"),
          "Content-Type" = "application/json"
        )

      # Optional Azure AD token auth; retrieve_azure_token() is internal to gpttools
      if (isTRUE(use_token)) {
        token <- retrieve_azure_token()
        response <- response |> httr2::req_auth_bearer_token(token = token)
      }

      response <-
        response |>
        httr2::req_body_json(data = body) |>
        httr2::req_retry(max_tries = 3) |>
        httr2::req_error(is_error = function(resp) FALSE)

      # Print the request without sending it, then perform it verbosely
      response |> httr2::req_dry_run()

      response <- response |> httr2::req_verbose() |> httr2::req_perform()

      invisible(response)
    }
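
Usage would then be something like this (my sketch; the prompt text is arbitrary and the function above must already be sourced into the session):

    # Dry-runs the request, then performs it and returns the httr2 response invisibly
    resp <- debug_azure_openai(prompt = "Say hello in one short sentence.")
    httr2::resp_status(resp)        # e.g. 200 on success, 400/401 on misconfiguration
    httr2::resp_content_type(resp)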

@mladencucak commented Apr 26, 2024

Apologies for the long delay.
I can see a problem here:

    token <-
      try(
        AzureRMR::get_azure_login(
          tenant = Sys.getenv("AZURE_OPENAI_TENANT_ID"),
          app = Sys.getenv("AZURE_OPENAI_CLIENT_ID"),
          scopes = ".default",
          auth_type = "client_credentials"
        )
      )

    if (inherits(token, "try-error")) {
      token <- AzureRMR::create_azure_login(
        tenant = Sys.getenv("AZURE_OPENAI_TENANT_ID"),
        app = Sys.getenv("AZURE_OPENAI_CLIENT_ID"),
        password = Sys.getenv("AZURE_OPENAI_CLIENT_SECRET"),
        host = "https://cognitiveservices.azure.com/",
        scopes = ".default"
      )
    }

I don't have those variables.
Also:

    element_callback = create_handler("openai")
    Error in create_handler("openai") : could not find function "create_handler"

I did make a call:

    >  ( response <-
    +     httr2::request(Sys.getenv("AZURE_OPENAI_ENDPOINT")) |>
    +     httr2::req_url_path_append("openai/deployments") |>
    +     httr2::req_url_path_append(Sys.getenv("AZURE_OPENAI_DEPLOYMENT_NAME")) |>
    +     httr2::req_url_path_append(Sys.getenv("AZURE_OPENAI_TASK")) |>
    +     httr2::req_url_query("api-version" = Sys.getenv("AZURE_OPENAI_API_VERSION")) |>
    +     httr2::req_headers(
    +       "api-key" = Sys.getenv("AZURE_OPENAI_KEY"),
    +       "Content-Type" = "application/json"
    +     ))
    <httr2_request>
    GET
    https://cprdit-openai-gpt4t.openai.azure.com/openai/deployments/cprdit-gpt4t/chat/completions?api-version=2023-07-01-preview
    Headers:
    • api-key: '......'
    • Content-Type: 'application/json'
    Body: empty
    >  ( response <-
    +     response |>
    +     httr2::req_body_json(data = body) |>
    +     httr2::req_retry(max_tries = 3) |>
    +     httr2::req_error(is_error = function(resp) FALSE))
    <httr2_request>
    POST
    https://cprdit-openai-gpt4t.openai.azure.com/openai/deployments/cprdit-gpt4t/chat/completions?api-version=2023-07-01-preview
    Headers:
    • api-key: 'f78be09ece5741bd9edecf397557c8a5'
    • Content-Type: 'application/json'
    Body: json encoded data
    Policies:
    • retry_max_tries: 3
    • error_is_error: a function
    >   response |> httr2::req_dry_run()
    POST /openai/deployments/cprdit-gpt4t/chat/completions?api-version=2023-07-01-preview HTTP/1.1
    Host: cprdit-openai-gpt4t.openai.azure.com
    User-Agent: httr2/1.0.0 r-curl/5.2.1 libcurl/8.3.0
    Accept: */*
    Accept-Encoding: deflate, gzip
    api-key: f78be09ece5741bd9edecf397557c8a5
    Content-Type: application/json
    Content-Length: 33

    {"stream":true,"messages":"test"}
    >   response <- response |> httr2::req_verbose() |> httr2::req_perform()
    -> POST /openai/deployments/cprdit-gpt4t/chat/completions?api-version=2023-07-01-preview HTTP/1.1
    -> Host: cprdit-openai-gpt4t.openai.azure.com
    -> User-Agent: httr2/1.0.0 r-curl/5.2.1 libcurl/8.3.0
    -> Accept: */*
    -> Accept-Encoding: deflate, gzip
    -> api-key: f78be09ece5741bd9edecf397557c8a5
    -> Content-Type: application/json
    -> Content-Length: 33
    ->
    <- HTTP/1.1 400 model_error
    <- Date: Fri, 26 Apr 2024 08:34:41 GMT
    <- Content-Type: text/plain; charset=utf-8
    <- Content-Length: 22
    <- Connection: keep-alive
    <- x-content-type-options: nosniff
    <- x-ms-rai-invoked: true
    <- x-request-id: 9264c456-d3fd-4d52-9554-0e50374b439d
    <- ms-azureml-model-error-reason: model_error
    <- ms-azureml-model-error-statuscode: 400
    <- x-ms-client-request-id: 6aa38352-1822-4d91-af43-c72d1a9a0378
    <- x-ms-region: Sweden Central
    <- azureml-model-session: d034-20240328085946
    <- apim-request-id: 6aa38352-1822-4d91-af43-c72d1a9a0378
    <- Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
    <- x-ratelimit-remaining-requests: 39
    <- x-ratelimit-remaining-tokens: 38720
    <-

The problem is again the URL construction:

    url <- glue::glue('https://{endpoint}.openai.azure.com/openai/deployments/{deployment}/chat/completions?api-version={api_version}')

@mladencucakSYN (Author)

Just a kind reminder on this issue. :)
