
fix: BatchExporter should export continuously when batch size is reached #3958

Merged

Conversation

@nordfjord (Contributor) commented Jun 29, 2023

Which problem is this PR solving?

The batch export processor should trigger an export when either of these conditions is met:

  • It has been more than scheduledDelayMillis since the last export
  • The queue has maxExportBatchSize items in it

Example from the dotnet implementation: https://github.com/open-telemetry/opentelemetry-dotnet/blob/main/src/OpenTelemetry/BatchExportProcessor.cs#L265

Relevant portion of the specification: https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/trace/sdk.md?plain=1#L610

Fixes #3094
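
For illustration only, here is a minimal TypeScript sketch of the two trigger conditions described above. `EagerBatcher` and all of its member names are hypothetical; this is not the SDK's implementation, just the shape of the intended behavior:

```ts
// Hypothetical sketch of the intended trigger logic (not the SDK's code).
class EagerBatcher<T> {
  private queue: T[] = [];
  private timer?: ReturnType<typeof setTimeout>;

  constructor(
    private readonly maxExportBatchSize: number,
    private readonly scheduledDelayMillis: number,
    private readonly exportBatch: (batch: T[]) => void
  ) {}

  add(item: T): void {
    this.queue.push(item);
    if (this.queue.length >= this.maxExportBatchSize) {
      // Condition 2: a full batch is queued, so export immediately.
      this.flush();
      return;
    }
    if (this.timer === undefined) {
      // Condition 1: otherwise export scheduledDelayMillis after the
      // first unexported item arrives.
      this.timer = setTimeout(() => this.flush(), this.scheduledDelayMillis);
    }
  }

  private flush(): void {
    if (this.timer !== undefined) {
      clearTimeout(this.timer);
      this.timer = undefined;
    }
    this.exportBatch(this.queue.splice(0, this.maxExportBatchSize));
  }
}
```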

@nordfjord nordfjord requested a review from a team June 29, 2023 13:53
@nordfjord nordfjord marked this pull request as draft June 29, 2023 14:02
codecov bot commented Jun 29, 2023

Codecov Report

Merging #3958 (55b94b1) into main (5ce32c0) will decrease coverage by 0.01%.
The diff coverage is 100.00%.

❗ Current head 55b94b1 differs from the pull request's most recent head dd742ed. Consider uploading reports for commit dd742ed to get more accurate results.

@@            Coverage Diff             @@
##             main    #3958      +/-   ##
==========================================
- Coverage   92.25%   92.25%   -0.01%     
==========================================
  Files         331      331              
  Lines        9465     9473       +8     
  Branches     1997     1999       +2     
==========================================
+ Hits         8732     8739       +7     
- Misses        733      734       +1     
Files                                                   Coverage Δ
...dk-trace-base/src/export/BatchSpanProcessorBase.ts   93.38% <100.00%> (+0.46%) ⬆️

... and 1 file with indirect coverage changes

@nordfjord nordfjord marked this pull request as ready for review June 29, 2023 15:09
@dyladan (Member) left a comment

While this works, it has a major issue which I also ran into while trying to solve this problem: if many spans are created quickly, the number of concurrent exports is unbounded. Please take a look at #3828 for a solution which takes this into account and limits concurrency. Right now the biggest TODOs there are related to testing, but I would appreciate comments on the main body of the code if you have any.
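
For context, one possible shape of such a concurrency limit is sketched below. The names are hypothetical, and this is not necessarily how #3828 or this PR implements it; it only illustrates the idea of draining queued batches through a single in-flight export at a time:

```ts
// Hypothetical sketch: bound export concurrency to one in-flight export.
class SingleFlightExporter<T> {
  private inFlight = false;
  private pending: T[][] = [];

  constructor(private readonly doExport: (batch: T[]) => Promise<void>) {}

  submit(batch: T[]): void {
    this.pending.push(batch);
    void this.drain();
  }

  private async drain(): Promise<void> {
    if (this.inFlight) return; // the running drain loop will pick this batch up
    this.inFlight = true;
    try {
      while (this.pending.length > 0) {
        await this.doExport(this.pending.shift()!);
      }
    } finally {
      this.inFlight = false;
    }
  }
}
```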

@dyladan dyladan self-requested a review July 5, 2023 16:53
@dyladan (Member) left a comment

Dismissed my previous rejection. Can we add a test to ensure the concurrency is actually limited?

It looks like you've modified some existing tests without adding any new ones, though one of the tests you modified does cover the eager behavior. There are also some test changes whose purpose isn't immediately obvious; I've left comments on those specific tests.
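
For reference, one way such a test could be structured is with an exporter that records its peak number of overlapping `export()` calls. This is only a sketch: `ConcurrencyTrackingExporter` is a hypothetical name, and it assumes the `SpanExporter` interface from @opentelemetry/sdk-trace-base and `ExportResult` from @opentelemetry/core:

```ts
import { ExportResultCode, type ExportResult } from '@opentelemetry/core';
import type { ReadableSpan, SpanExporter } from '@opentelemetry/sdk-trace-base';

// Records the peak number of concurrent export() calls.
class ConcurrencyTrackingExporter implements SpanExporter {
  inFlight = 0;
  maxInFlight = 0;

  export(_spans: ReadableSpan[], done: (result: ExportResult) => void): void {
    this.inFlight++;
    this.maxInFlight = Math.max(this.maxInFlight, this.inFlight);
    // Simulate a slow backend so overlapping exports would be observable.
    setTimeout(() => {
      this.inFlight--;
      done({ code: ExportResultCode.SUCCESS });
    }, 10);
  }

  shutdown(): Promise<void> {
    return Promise.resolve();
  }
}
```

A test could then end well over maxExportBatchSize spans through the processor and assert that maxInFlight never exceeds the intended bound.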

@dyladan (Member) commented Jul 5, 2023

/cc @martinkuba. It looks like this actually doesn't have the same concurrency issue. He limited it in a slightly different way than I did, but it requires a lot less change, so I might actually prefer this solution.

@nordfjord nordfjord changed the title fix: BathExporter should export continuously when batch size is reached fix: BatchExporter should export continuously when batch size is reached Jul 6, 2023
@Flarna mentioned this pull request Jul 12, 2023
github-actions bot commented Sep 4, 2023

This PR is stale because it has been open 60 days with no activity. Remove the stale label or comment, or this will be closed in 14 days.

CHANGELOG.md (outdated; conversation resolved)
  // The batch is full: export immediately instead of waiting for the timer.
  return flush();
}
// A delayed flush is already scheduled; don't start a second timer.
if (this._timer !== undefined) return;
this._timer = setTimeout(() => flush(), this._scheduledDelayMillis);
A contributor commented:

Nice to have: it would be "nice" to clear (set to undefined) this._timer, just to avoid calling clearTimeout() with an expired timer. This would also allow (in a future PR) slightly better performance by not always calling _clearTimeout().
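
To illustrate the suggestion, here is a sketch only; the names mirror the snippet above, and this is not the merged code:

```ts
// Reset the handle when the timer fires, so later code can tell a live
// timer from an expired one and skip clearTimeout() for the latter.
class FlushTimer {
  private _timer?: ReturnType<typeof setTimeout>;

  schedule(flush: () => void, scheduledDelayMillis: number): void {
    if (this._timer !== undefined) return;
    this._timer = setTimeout(() => {
      this._timer = undefined; // the timer fired; nothing left to clear
      flush();
    }, scheduledDelayMillis);
  }

  clear(): void {
    if (this._timer === undefined) return; // avoid a pointless clearTimeout()
    clearTimeout(this._timer);
    this._timer = undefined;
  }
}
```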

Successfully merging this pull request may close these issues: Eager exporting for BatchSpanProcessor.