
Add batch log processor implementation with test coverage #434

Merged
merged 16 commits into open-telemetry:master from the logs-sdk3 branch on Dec 22, 2020

Conversation

@kxyr (Contributor) commented Dec 9, 2020

This PR adds a batch log processor for the Logging Prototype, based on the batch span processor and following the simple log processor. The batch log processor has a main thread that is responsible for batching the LogRecords, and a worker thread that sends these records to an exporter. Note: std::cerr messages are included here as a placeholder until error diagnostics are available.

cc @MarkSeufert @alolita
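
For illustration, here is a minimal sketch of the structure described above, using a hypothetical Exporter interface and Record type rather than the actual opentelemetry-cpp classes: callers hand records to OnReceive on the application thread, and a worker thread drains the buffer to the exporter once a batch fills up or a timeout elapses.

```cpp
// Hypothetical sketch only; names and parameters are illustrative, not the PR's API.
#include <atomic>
#include <chrono>
#include <condition_variable>
#include <memory>
#include <mutex>
#include <string>
#include <thread>
#include <vector>

struct Record { std::string body; };
struct Exporter { void Export(const std::vector<Record> & /*batch*/) { /* send batch */ } };

class BatchLogProcessorSketch
{
public:
  BatchLogProcessorSketch(std::shared_ptr<Exporter> exporter,
                          size_t max_batch_size           = 512,
                          std::chrono::milliseconds delay = std::chrono::milliseconds(5000))
      : exporter_(std::move(exporter)),
        max_batch_size_(max_batch_size),
        delay_(delay),
        worker_(&BatchLogProcessorSketch::Run, this)
  {}

  // Called on the application ("main") thread for every finished LogRecord.
  void OnReceive(Record record)
  {
    size_t buffered = 0;
    {
      std::lock_guard<std::mutex> lk(mu_);
      buffer_.push_back(std::move(record));
      buffered = buffer_.size();
    }
    // Wake the worker early once a full batch is available.
    if (buffered >= max_batch_size_) cv_.notify_one();
  }

  void Shutdown()
  {
    shutdown_.store(true);
    cv_.notify_one();
    if (worker_.joinable()) worker_.join();
  }

  ~BatchLogProcessorSketch() { Shutdown(); }

private:
  // Worker thread: wake on timeout, a full batch, or shutdown, then export.
  void Run()
  {
    while (true)
    {
      std::vector<Record> batch;
      {
        std::unique_lock<std::mutex> lk(mu_);
        cv_.wait_for(lk, delay_, [this] {
          return shutdown_.load() || buffer_.size() >= max_batch_size_;
        });
        batch.swap(buffer_);
      }
      if (!batch.empty()) exporter_->Export(batch);
      if (shutdown_.load()) return;
    }
  }

  std::shared_ptr<Exporter> exporter_;
  size_t max_batch_size_;
  std::chrono::milliseconds delay_;
  std::mutex mu_;
  std::condition_variable cv_;
  std::vector<Record> buffer_;
  std::atomic<bool> shutdown_{false};
  std::thread worker_;  // declared last so it starts after the other members exist
};
```

The actual processor also needs to handle ForceFlush and buffer-capacity limits, which this sketch omits.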

@kxyr requested a review from a team on December 9, 2020, 22:18
@codecov bot commented Dec 9, 2020

Codecov Report

Merging #434 (5a507bb) into master (b80dccb) will increase coverage by 0.06%.
The diff coverage is 97.32%.


@@            Coverage Diff             @@
##           master     #434      +/-   ##
==========================================
+ Coverage   94.37%   94.43%   +0.06%     
==========================================
  Files         185      187       +2     
  Lines        8047     8234     +187     
==========================================
+ Hits         7594     7776     +182     
- Misses        453      458       +5     
Impacted Files                               Coverage Δ
sdk/src/logs/batch_log_processor.cc          93.82% <93.82%> (ø)
sdk/test/logs/batch_log_processor_test.cc    100.00% <100.00%> (ø)

@lalitb (Member) left a comment


LGTM in general. A few minor changes will be needed in this PR (as mentioned in the comments) once the Recordable interface PR is reviewed and merged. As I understand it, @xukaren is working on that PR. We can hold off merging this PR until then.

@kxyr force-pushed the logs-sdk3 branch 2 times, most recently from 5ba22ce to f642a60, on December 14, 2020, 20:10
@kxyr changed the base branch from master to pr-merge on December 14, 2020, 20:16
@kxyr changed the base branch from pr-merge to master on December 14, 2020, 20:16
@kxyr force-pushed the logs-sdk3 branch 2 times, most recently from a55d098 to 13bc179, on December 14, 2020, 20:31
Karen Xu added 2 commits December 15, 2020 11:45
Replace while loop with cv.wait with predicate

Remove while loop around cv.notify

Replace operator= with .store() for clarity
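
The commit titles above describe two small synchronization cleanups. A minimal, hypothetical illustration of both patterns (not the actual diff from this PR):

```cpp
// Illustrative only: the patterns named in the commit messages above.
#include <atomic>
#include <condition_variable>
#include <mutex>

std::mutex mu;
std::condition_variable cv;
bool ready = false;
std::atomic<bool> is_shutdown{false};

void Waiter()
{
  std::unique_lock<std::mutex> lk(mu);
  // Before: while (!ready) { cv.wait(lk); }
  // After: the predicate overload handles spurious wakeups the same way,
  // but states the wait condition in one place.
  cv.wait(lk, [] { return ready; });
}

void Notifier()
{
  {
    std::lock_guard<std::mutex> lk(mu);
    ready = true;
  }
  // A single notify suffices; no loop around notify_one is needed.
  cv.notify_one();
}

void Shutdown()
{
  // Before: is_shutdown = true;  After: .store() makes the atomic write explicit.
  is_shutdown.store(true);
}
```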
@jsuereth (Contributor) left a comment


I still think the concurrency portions of this may need some help.

Given the use of locks plus a condition variable at the core of this class, it may make sense to just use a real lock/mutex instead of atomics for now.
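
A rough sketch of what that suggestion could look like, with illustrative names: the shutdown flag becomes a plain bool guarded by the same mutex the condition variable already uses, rather than a separate std::atomic.

```cpp
// Hedged sketch of the reviewer's suggestion, not code from this PR.
#include <condition_variable>
#include <mutex>

class ProcessorState
{
public:
  void SignalShutdown()
  {
    {
      std::lock_guard<std::mutex> lk(mu_);
      is_shutdown_ = true;  // plain bool; the mutex already provides the ordering
    }
    cv_.notify_all();
  }

  void WaitUntilShutdown()
  {
    std::unique_lock<std::mutex> lk(mu_);
    cv_.wait(lk, [this] { return is_shutdown_; });
  }

private:
  std::mutex mu_;
  std::condition_variable cv_;
  bool is_shutdown_ = false;
};
```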

@kxyr (Contributor, Author) commented Dec 18, 2020

@jsuereth, please let me know what you think of these changes! :)
@maxgolov would love a final stamp as well!

@kxyr (Contributor, Author) commented Dec 19, 2020

I have reverted this log processor to match the batch span processor. This way there won't be any concurrency issues if this PR is merged.

@jsuereth (Contributor) left a comment


I think keeping this the same as the Trace batch processor (for now) is definitely the right call. Threading logic looks good.

@lalitb requested a review from @maxgolov on December 21, 2020, 19:21
@maxgolov (Contributor) commented:

Logged a separate issue (#477) to make this code more readable and/or to avoid the while loops waiting for variable updates.

@maxgolov added the pr:please-merge label (ready to be merged by a Maintainer: rebased, CI passed, has enough valid approvals, etc.) on Dec 21, 2020
@lalitb merged commit 5e946f9 into open-telemetry:master on Dec 22, 2020
kxyr pushed a commit to open-o11y/opentelemetry-cpp that referenced this pull request Dec 22, 2020
kxyr pushed a commit to open-o11y/opentelemetry-cpp that referenced this pull request Dec 22, 2020