This repository has been archived by the owner on Jul 12, 2023. It is now read-only.

Error 429 still exists #303

Closed
sherifkozman opened this issue Aug 21, 2020 · 10 comments

@sherifkozman (Contributor) commented Aug 21, 2020

We are running off commit 691f9f5, but the rate-limiting error 429 still persists.

[Edit by @sethvargo - removed a screenshot that contained identifiable information]

@sherifkozman sherifkozman added the kind/bug Something is malfunctioning. label Aug 21, 2020
@sethvargo (Member)

Hi @sherifkozman - can you share the server logs that correspond to this request? Since the reset time is Jan 1, 1970 (time.At(0)), there's likely an error connecting to Redis, which you should see in the server or apiserver logs.
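For context on that diagnosis: a rate-limit reset timestamp that falls back to zero Unix seconds renders as the epoch. A minimal Go sketch of that behavior (illustrative only, not taken from the server code):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// A rate limiter typically reports the reset time as Unix seconds.
	// If the Redis lookup fails and the value defaults to 0, formatting
	// it as a time yields the epoch: Jan 1, 1970 UTC.
	var resetUnix int64 // zero value when the backing store is unreachable
	reset := time.Unix(resetUnix, 0).UTC()
	fmt.Println(reset.Format(time.RFC1123)) // Thu, 01 Jan 1970 00:00:00 UTC
}
```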

@sethvargo sethvargo self-assigned this Aug 21, 2020
@sethvargo (Member)

Also, I deleted your screenshot. It had your cookie value and domain, which someone could easily have used to log in and access the server as you.

@sherifkozman (Contributor, Author)

@sethvargo this one is actually on the "server" service, not the apiserver.

[screenshot: Screen Shot 2020-08-21 at 8 28 53 AM]

@sethvargo (Member)

That's the request log. The entry I'm looking for would be a few lines before or after it, and should be named ratelimit or similar.

[screenshot: screenshot-20200821-115116@2x]

@sherifkozman (Contributor, Author)

There is nothing before or after. Is there a debug env var we should enable?

The only thing before it is this set of logs. Do we need to update the server code to anything after August 17th?

[screenshot: Screen Shot 2020-08-21 at 9 05 52 AM]

@sethvargo (Member)

Would it be possible to give me temporary access to your logs for this project? github-username at google dot com. It's challenging to debug via screenshots 😄. Did you deploy with LOG_DEBUG=true and DB_DEBUG=true?
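For readers following along, those flags are environment variables read at startup. A rough sketch of the pattern (the variable handling below is illustrative, not the repository's actual configuration code):

```go
package main

import (
	"log"
	"os"
	"strconv"
)

// debugEnabled reports whether an environment variable such as
// LOG_DEBUG or DB_DEBUG is set to a truthy value ("true", "1", etc.).
func debugEnabled(name string) bool {
	v, ok := os.LookupEnv(name)
	if !ok {
		return false
	}
	b, err := strconv.ParseBool(v)
	return err == nil && b
}

func main() {
	if debugEnabled("LOG_DEBUG") {
		log.Println("verbose application logging enabled")
	}
	if debugEnabled("DB_DEBUG") {
		log.Println("database query logging enabled")
	}
}
```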

@sethvargo (Member)

Responded on chat with a few suggestions.

@sethvargo (Member)

Just an update on this, since @sherifkozman and I are chatting offline: we believe the issue is related to scale up/down events not properly closing connections and eventually exhausting the pool. We're going to investigate some service-level changes and will report back.
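As a rough illustration of that failure mode, the client-side remedy usually amounts to closing the Redis client during graceful shutdown so pooled connections are released rather than abandoned. A sketch using the go-redis client; this is an assumption for illustration, not necessarily the library or shutdown path this server uses:

```go
package main

import (
	"context"
	"log"
	"os"
	"os/signal"
	"syscall"

	"github.com/go-redis/redis/v8"
)

func main() {
	// One shared client per process; go-redis manages a connection pool internally.
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	// On SIGTERM (e.g. a scale-down event), close the client so pooled
	// connections are returned instead of lingering on the Redis server.
	stop := make(chan os.Signal, 1)
	signal.Notify(stop, syscall.SIGTERM, os.Interrupt)
	go func() {
		<-stop
		if err := rdb.Close(); err != nil {
			log.Printf("closing redis client: %v", err)
		}
		os.Exit(0)
	}()

	// ... serve traffic; rate-limit checks would use rdb ...
	if err := rdb.Ping(context.Background()).Err(); err != nil {
		log.Printf("redis ping failed: %v", err)
	}
	select {} // block forever in this sketch
}
```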

@mikehelmick (Contributor)

#391 should fix this issue.

/close

@google-oss-robot

@mikehelmick: Closing this issue.

In response to this:

#391 should fix this issue.

/close

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Oct 6, 2020