
K8s Provider uses a context deadline with 0 second timeout #652

Closed

orktes opened this issue Apr 21, 2022 · 1 comment
Comments

@orktes (Contributor) commented Apr 21, 2022

Describe the bug
When using the k8s cluster provider, registration of the member times out because the context deadline is set to 0. It seems to be caused by the monitor actor using ReceiveTimeout to derive the timeout, but ReceiveTimeout is never set for the actor.

https://github.com/asynkron/protoactor-go/blob/dev/cluster/clusterproviders/k8s/k8s_cluster_monitor.go#L21
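For illustration, here is a minimal sketch (not the actual protoactor-go code) of why an unset ReceiveTimeout breaks registration: a zero duration passed to context.WithTimeout produces a context whose deadline has already passed, so any call made with it fails with "context deadline exceeded".

```go
// Minimal sketch, assuming the timeout comes from an actor whose
// ReceiveTimeout was never set (i.e. the zero value).
package main

import (
	"context"
	"fmt"
	"time"
)

func main() {
	// Zero value, mirroring an actor whose ReceiveTimeout was never configured.
	var receiveTimeout time.Duration

	ctx, cancel := context.WithTimeout(context.Background(), receiveTimeout)
	defer cancel()

	select {
	case <-ctx.Done():
		// Printed immediately: the deadline is already in the past.
		fmt.Println("context already expired:", ctx.Err()) // context deadline exceeded
	case <-time.After(time.Second):
		fmt.Println("never reached")
	}
}
```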

To Reproduce

  • Set up a pod with the correct roles to access the Kubernetes APIs
  • Use the k8s provider when configuring the cluster
  • Logs are flooded with 10:42:55 ERROR [CLUSTER] [KUBERNETES] Failed to register service to k8s, will retry error="unable to get own pod information for example-54f696bc49-r9npg: context deadline exceeded"

I added some debug logs and it is indeed caused by the linked line.

Expected behavior
Member is registered correctly.

@DamnWidget (Contributor) commented

Thanks for the detailed information, will take a look when I am back

DamnWidget added a commit to DamnWidget/protoactor-go that referenced this issue May 17, 2022
    * Used a slice of anonymous structs as payload while sending the patch
      operation to the k8s cluster
    * Fixed the watch mechanism to not cancel the operation immediately
    * Made sure a timeout is set while registering members on k8s
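A rough sketch of the fix direction described in the commit message above (the names defaultRegistrationTimeout and registrationContext are illustrative assumptions, not the actual protoactor-go change): fall back to a default timeout when the actor's ReceiveTimeout is the zero value, so the registration context gets a usable deadline.

```go
// Sketch only: names are hypothetical, not the real protoactor-go API.
package k8ssketch

import (
	"context"
	"time"
)

const defaultRegistrationTimeout = 10 * time.Second

// registrationContext returns a context with a usable deadline even when the
// actor's ReceiveTimeout was never set (zero value).
func registrationContext(receiveTimeout time.Duration) (context.Context, context.CancelFunc) {
	if receiveTimeout <= 0 {
		receiveTimeout = defaultRegistrationTimeout
	}
	return context.WithTimeout(context.Background(), receiveTimeout)
}
```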
rogeralsing added a commit that referenced this issue May 26, 2022