feat: enhance the capping of max spans per trace at global level #290
Conversation
Codecov Report
@@ Coverage Diff @@
## main #290 +/- ##
============================================
+ Coverage 79.44% 79.60% +0.15%
- Complexity 1265 1268 +3
============================================
Files 112 112
Lines 4934 4937 +3
Branches 452 451 -1
============================================
+ Hits 3920 3930 +10
+ Misses 812 806 -6
+ Partials 202 201 -1
@@ -28,6 +28,7 @@ span.window.store.retention.time.mins = ${?SPAN_WINDOW_STORE_RETENTION_TIME_MINS}
 span.window.store.segment.size.mins = 20
 span.window.store.segment.size.mins = ${?SPAN_WINDOW_STORE_SEGMENT_SIZE_MINS}

+upper.max.span.count = 6
As we already have max.span.count per tenant, I didn't go with max.span.count.upper.limit as the config name. Let me know if you have a better name.
I think we should call this default.max.span.count, and the section below can then be used as a per-tenant override.
This makes sense. Made changes.
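For reference, after the rename the added config line in the hunk above would read something like this (a sketch; default.max.span.count is the name agreed on in this thread, and the value 6 carries over from the original diff):

+default.max.span.count = 6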
        ? maxSpanCountMap.get(key.getTenantId())
        : Long.MAX_VALUE;

    if (inFlightSpansPerTrace >= defaultMaxSpanCountLimit
I thought the tenant-specific config acts as an override of the default, but the condition here doesn't seem to indicate that: we are picking the lesser of the two.
What @ravisingal and I had discussed offline was having a global upper limit, hence my earlier notion of upper.max.span.count. But what you are saying is that for a specific tenant we may want a higher limit, which makes sense. Let me know @avinashkolluru @ravisingal if we want to modify the PR and go with that approach, if it makes more sense.
I'd suggest making the tenant-specific config an override. It shouldn't matter whether it is lower or higher than the default.
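For illustration, a minimal sketch of the override semantics being suggested here (the class and method names are hypothetical; maxSpanCountMap, defaultMaxSpanCountLimit, and inFlightSpansPerTrace follow the diff fragment above):

import java.util.Map;

public class MaxSpanCountResolver {

  private final Map<String, Long> maxSpanCountMap; // per-tenant overrides
  private final long defaultMaxSpanCountLimit;     // global default

  public MaxSpanCountResolver(
      Map<String, Long> maxSpanCountMap, long defaultMaxSpanCountLimit) {
    this.maxSpanCountMap = maxSpanCountMap;
    this.defaultMaxSpanCountLimit = defaultMaxSpanCountLimit;
  }

  // Override semantics: a tenant-specific limit wins outright, whether it is
  // lower or higher than the default; tenants without an entry fall back to
  // the default. This differs from the diff above, which effectively applies
  // the lesser of the two limits.
  long resolveMaxSpanCount(String tenantId) {
    return maxSpanCountMap.getOrDefault(tenantId, defaultMaxSpanCountLimit);
  }

  boolean exceedsLimit(String tenantId, long inFlightSpansPerTrace) {
    return inFlightSpansPerTrace >= resolveMaxSpanCount(tenantId);
  }
}

With this shape, the condition in the diff would reduce to a single comparison against the resolved limit instead of checking both values.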
The RawSpanGrouper stitches spans into traces. To cap the number of spans per trace, we currently have a config that controls the limit at the per-tenant level (the existing max.span.count config). As part of this PR, we are enhancing this scheme by adding a global default limit. As shown in the example config below, tenant1 applies a 5 spans/trace limit, while other tenants (say tenant2) fall back to the 6 spans/trace default.
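A sketch of what that example config could look like, in the service's HOCON format (default.max.span.count and the 5/6 values come from the discussion above; the exact shape of the per-tenant block is an assumption for illustration):

# Global default: any tenant without an override is capped at 6 spans/trace.
default.max.span.count = 6

# Per-tenant override: tenant1 is capped at 5 spans/trace. tenant2 has no
# entry here, so it falls back to the default of 6.
max.span.count = [
  {
    tenantId = "tenant1"
    limit = 5
  }
]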