Add client side backoff retries for pubsub messages #1218
Conversation
/retest

1 similar comment

/retest
It looks like a retry policy is not yet set anywhere?
cmd/broker/retry/main.go
```diff
@@ -58,6 +58,9 @@ type envConfig struct {

 	// Max to 10m.
 	TimeoutPerEvent time.Duration `envconfig:"TIMEOUT_PER_EVENT"`

+	MinRetryBackoff time.Duration `envconfig:"MIN_RETRY_BACKOFF" default:"1s"`
+	MaxRetryBackoff time.Duration `envconfig:"MAX_RETRY_BACKOFF" default:"30s"`
```
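For readers skimming this thread, here is a minimal sketch of how these two settings could drive a client-side backoff. It is not the code from this PR: the `retryDelay` helper and the `main` wiring are hypothetical, and it assumes the struct tags are processed with github.com/kelseyhightower/envconfig, mirroring the fields in the diff above.

```go
package main

import (
	"log"
	"time"

	"github.com/kelseyhightower/envconfig"
)

// envConfig mirrors the fields shown in the diff above.
type envConfig struct {
	// Max to 10m.
	TimeoutPerEvent time.Duration `envconfig:"TIMEOUT_PER_EVENT"`

	MinRetryBackoff time.Duration `envconfig:"MIN_RETRY_BACKOFF" default:"1s"`
	MaxRetryBackoff time.Duration `envconfig:"MAX_RETRY_BACKOFF" default:"30s"`
}

// retryDelay is a hypothetical helper: it doubles the minimum backoff once per
// failed attempt and caps the result at the configured maximum.
func retryDelay(env envConfig, attempt int) time.Duration {
	d := env.MinRetryBackoff
	for i := 0; i < attempt; i++ {
		d *= 2
		if d >= env.MaxRetryBackoff {
			return env.MaxRetryBackoff
		}
	}
	if d > env.MaxRetryBackoff {
		return env.MaxRetryBackoff
	}
	return d
}

func main() {
	var env envConfig
	if err := envconfig.Process("", &env); err != nil {
		log.Fatalf("failed to process env config: %v", err)
	}
	// Print the delay a handler might wait before retrying delivery of an event.
	for attempt := 0; attempt < 8; attempt++ {
		log.Printf("attempt %d: back off %v", attempt, retryDelay(env, attempt))
	}
}
```

Running this with, say, `MAX_RETRY_BACKOFF=1m` set in the environment would show the delays doubling from 1s up to the 1m cap.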
I was wondering if this should be a bit longer, like a few minutes?
How long would you suggest? I don't have a preference; I simply don't want to block the queue for too long because of the limitation we have today.
Changed to 1m now.
I just think that if the backoff reaches 30s, the issue is very likely not transient, and a longer backoff is probably better.
But that's just a guess; we can always tune this later. 1m seems good to me.
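To put the 30s-versus-1m question in perspective, here is a small illustration. It assumes a plain doubling strategy starting from the 1s default minimum, which is an assumption for this example, not something the PR states: the delay hits a 30s cap after roughly five failed attempts and a 1m cap one attempt later.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Compare how quickly a doubling backoff starting at 1s reaches each cap.
	for _, maxBackoff := range []time.Duration{30 * time.Second, time.Minute} {
		d := time.Second
		fmt.Printf("cap %v:", maxBackoff)
		for attempt := 0; attempt < 8; attempt++ {
			fmt.Printf(" %v", d)
			d *= 2
			if d > maxBackoff {
				d = maxBackoff
			}
		}
		fmt.Println()
	}
}
```

Either way, the cap only matters for events that keep failing, which supports the point that such failures are probably not transient.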
/lgtm
[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: liu-cong, yolocs

The full list of commands accepted by this bot can be found here. The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing `/approve` in a comment.
/retest

1 similar comment

/retest
The following is the coverage report on the affected files.
/lgtm
/retest

> On Mon, Jun 8, 2020 at 4:06 PM Knative test reporter robot wrote:
>
> The following jobs failed:
>
> | Test name | Triggers | Retries |
> | --- | --- | --- |
> | pull-google-knative-gcp-integration-tests | | 0/3 |
>
> Failed non-flaky tests preventing automatic retry of pull-google-knative-gcp-integration-tests:
>
> - github.com/google/knative-gcp/test/e2e.TestEventTransformationForSubscription
> - github.com/google/knative-gcp/test/e2e.TestChannelChain
> - github.com/google/knative-gcp/test/e2e.TestChannelTracing
> - github.com/google/knative-gcp/test/e2e.TestChannelTracing/includes_incoming_trace_id
> - github.com/google/knative-gcp/test/e2e.TestChannelChain/Channel-messaging.cloud.google.com/v1alpha1
> - github.com/google/knative-gcp/test/e2e.TestChannelChain/Channel-messaging.cloud.google.com/v1beta1
> - github.com/google/knative-gcp/test/e2e.TestEventTransformationForSubscription/Channel-messaging.cloud.google.com/v1alpha1
> - github.com/google/knative-gcp/test/e2e.TestEventTransformationForSubscription/Channel-messaging.cloud.google.com/v1beta1
The following jobs failed:

Automatically retrying due to test flakiness...
Fixes #1100

Additional Changes

- Changes to the `pool` package, as it feels a bit unnatural to have a deeper package `pkg/broker/handler/pool` depend on a parent package `pkg/broker/handler` implementation; no functional change involved.

Release Note

Docs