
common::MakeAttributes creates garbled output for std::string #2651

Open
t-b opened this issue Apr 29, 2024 · 8 comments
Labels: bug (Something isn't working), Stale, triage/accepted (Indicates an issue or PR is ready to be actively worked on.)

Comments

@t-b
Contributor

t-b commented Apr 29, 2024

Describe your environment
OS: Debian bookworm
GCC: 12.2.0
Version: 054b0dc ([RELEASE] Release opentelemetry-cpp version 1.15.0 (#2639), 2024-04-21), also present in 1.11.0.

Steps to reproduce

  • Download patch to ostream test 0001-Show-error-with-garbled-output.zip
  • unzip 0001-Show-error-with-garbled-output.patch.zip
  • git am 0001-Show-error-with-garbled-output.patch
  • cmake -S . -B build -DBUILD_TESTING=ON -DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_POSITION_INDEPENDENT_CODE=ON -DBUILD_SHARED_LIBS=ON -DWITH_OTLP_GRPC=ON -DWITH_OTLP_HTTP=ON -DWITH_ABSEIL=ON -DWITH_BENCHMARK=OFF -DCMAKE_CXX_STANDARD=17
  • cmake --build build --target ostream_log_test && ctest --test-dir build -R IntegrationTestWithEventId --verbose

Outputs:

...
539: {
539:   timestamp          : 0
539:   observed_timestamp : 1714407256398793342
539:   severity_num       : 5
539:   severity_text      : DEBUG
539:   body               : Hello {key1} {key2}
539:   resource           : 
539:     service.name: unknown_service
539:     telemetry.sdk.version: 1.15.0
539:     telemetry.sdk.name: opentelemetry
539:     telemetry.sdk.language: cpp
539:   attributes         : 
539:     key2: �fA�U
539: 
539: ��A����abcd
539:     key1: 123
539:   event_id           : 12345678
539:   event_name         : test_event_id
539:   trace_id           : 00000000000000000000000000000000
539:   span_id            : 0000000000000000
539:   trace_flags        : 00
539:   scope              : 
539:     name             : opentelelemtry_library
539:     version          : 1.15.0
539:     schema_url       : https://opentelemetry.io/schemas/1.11.0
539:     attributes       : 
539:       scope.attr.key: 123
539: }
...

If I use the SimpleLogRecordProcessor there is no garbled output. And if I pass in a const char * via someString.c_str() and use the batch processor, I also get garbled output.
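Roughly, the two setups compared above look like this (a sketch based on the logs SDK factory helpers as of 1.15, not the exact code of the patched ostream_log_test):

#include <utility>

#include "opentelemetry/exporters/ostream/log_record_exporter_factory.h"
#include "opentelemetry/sdk/logs/batch_log_record_processor_factory.h"
#include "opentelemetry/sdk/logs/batch_log_record_processor_options.h"
#include "opentelemetry/sdk/logs/logger_provider_factory.h"
#include "opentelemetry/sdk/logs/simple_log_record_processor_factory.h"

namespace logs_sdk      = opentelemetry::sdk::logs;
namespace logs_exporter = opentelemetry::exporter::logs;

// Simple processor: the record is exported synchronously inside the log call,
// so the attribute values are still alive -> output looks fine.
auto MakeSimpleProvider()
{
  auto exporter  = logs_exporter::OStreamLogRecordExporterFactory::Create();
  auto processor = logs_sdk::SimpleLogRecordProcessorFactory::Create(std::move(exporter));
  return logs_sdk::LoggerProviderFactory::Create(std::move(processor));
}

// Batch processor: the record is exported later on a worker thread, after the
// caller's strings may already have been destroyed -> garbled output.
auto MakeBatchProvider()
{
  auto exporter = logs_exporter::OStreamLogRecordExporterFactory::Create();
  logs_sdk::BatchLogRecordProcessorOptions options{};
  auto processor =
      logs_sdk::BatchLogRecordProcessorFactory::Create(std::move(exporter), options);
  return logs_sdk::LoggerProviderFactory::Create(std::move(processor));
}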

What is the expected behavior?
That the strings I passed in are output properly.

What is the actual behavior?
Garbled output.

Additional context

I would also be interested to know if the bug is limited to the ostream exporter or also applies to other exporters; we want to use OTLP with gRPC and HTTP in production.

This looks like a lifetime issue regarding common::AttributeValue.

@t-b t-b added the bug Something isn't working label Apr 29, 2024
@github-actions github-actions bot added the needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. label Apr 29, 2024
@lalitb
Member

lalitb commented Apr 29, 2024

This looks like a lifetime issue regarding common::AttributeValue.

You are correct.
The ostream exporter uses ReadWriteLogRecord as its recordable implementation, and the attributes are stored as:

  std::unordered_map<std::string, opentelemetry::common::AttributeValue> attributes_map_;

where AttributeValue is a nostd::variant with non-owning value types:

using AttributeValue =
    nostd::variant<bool,
                   int32_t,
                   int64_t,
                   uint32_t,
                   double,
                   const char *,
                   nostd::string_view,
                   nostd::span<const bool>,
                   nostd::span<const int32_t>,
                   nostd::span<const int64_t>,
                   nostd::span<const uint32_t>,
                   nostd::span<const double>,
                   nostd::span<const nostd::string_view>,
                   // Not currently supported by the specification, but reserved for future use.
                   // Added to provide support for all primitive C++ types.
                   uint64_t,
                   // Not currently supported by the specification, but reserved for future use.
                   // Added to provide support for all primitive C++ types.
                   nostd::span<const uint64_t>,
                   // Not currently supported by the specification, but reserved for future use.
                   // See https://github.com/open-telemetry/opentelemetry-specification/issues/780
                   nostd::span<const uint8_t>>;

So, while using the ostream exporter, you need to ensure that the attribute value remains valid until the log is exported. I understand this is difficult to ensure with the batch processor as the export is async, but since the ostream exporter is only for testing scenarios, I think we should be good with this expectation.
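To make the failure mode concrete, a contrived sketch (not the actual test code; MakeAttributes as provided by the API's key_value_iterable_view.h):

#include <string>

#include "opentelemetry/common/key_value_iterable_view.h"
#include "opentelemetry/logs/logger.h"

void EmitLog(opentelemetry::logs::Logger &logger)
{
  std::string value = "abcd";

  // The AttributeValue stored in the recordable is only a nostd::string_view
  // pointing into `value`'s buffer.
  logger.Debug("Hello {key2}",
               opentelemetry::common::MakeAttributes({{"key2", value}}));

  // Simple processor: the record was already written out above, while `value`
  // was alive. Batch processor: the export happens later on a worker thread,
  // after `value` has been destroyed, so the ostream exporter prints whatever
  // now lives at that address -> the garbled bytes shown in the report.
}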

I would also be interested to know if the bug is limited to the ostream exporter or also applies to other exporters; we want to use OTLP with gRPC and HTTP in production.

The issue is specific to the ostream exporter with the batch processor. The OTLP log exporter directly serializes the attributes into the proto format, so the lifetime issue doesn't come into the picture.

@t-b
Contributor Author

t-b commented Apr 29, 2024

@lalitb Thanks for the quick reply and explanation. Glad this is only present with the ostream exporter.

But even for testing-only scenarios with the ostream exporter, I would prefer a compile-time/runtime error.

@lalitb
Member

lalitb commented Apr 29, 2024

Agreed, the fix should be straightforward: avoid the crash by using the owning types for attributes (as done within span-data):

opentelemetry::sdk::common::AttributeMap attribute_map_;

I will pick up the issue after a couple of weeks if it is not fixed by then.
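Roughly, the direction would be something like this inside ReadWriteLogRecord (a trimmed sketch assuming the existing sdk::common::AttributeMap from attribute_utils.h, not an actual patch):

#include "opentelemetry/common/attribute_value.h"
#include "opentelemetry/nostd/string_view.h"
#include "opentelemetry/sdk/common/attribute_utils.h"

class ReadWriteLogRecord  // heavily trimmed sketch
{
public:
  void SetAttribute(opentelemetry::nostd::string_view key,
                    const opentelemetry::common::AttributeValue &value) noexcept
  {
    // AttributeMap::SetAttribute copies the incoming value into an owning
    // OwnedAttributeValue (std::string instead of string_view, std::vector
    // instead of span), so the record no longer depends on the caller's
    // buffers once this call returns.
    attributes_map_.SetAttribute(key, value);
  }

private:
  opentelemetry::sdk::common::AttributeMap attributes_map_;
};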

@lalitb
Member

lalitb commented Apr 29, 2024

@t-b Also, apart from the current issue I described, if your code has something like:

  auto attributes = opentelemetry::common::MakeAttributes(
      {{"key1", some_value_1},
       {"key2", some_value_2}});
  logger->Debug("Logging attributes", attributes);

you need to ensure that some_value_1 and some_value_2 are not released before the call to logger->Debug(). This applies to all the exporters.
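In other words, a pattern along these lines satisfies that requirement (variable names are illustrative):

#include <string>

#include "opentelemetry/common/key_value_iterable_view.h"
#include "opentelemetry/logs/logger.h"
#include "opentelemetry/nostd/shared_ptr.h"

void LogWithAttributes(opentelemetry::nostd::shared_ptr<opentelemetry::logs::Logger> logger)
{
  // Keep the backing strings alive at least until Debug() returns.
  std::string some_value_1 = "123";
  std::string some_value_2 = "abcd";

  logger->Debug("Logging attributes",
                opentelemetry::common::MakeAttributes(
                    {{"key1", some_value_1}, {"key2", some_value_2}}));

  // Note: until the ostream exporter's recordable owns its attributes (the
  // fix discussed above), the batch processor additionally requires the
  // strings to stay alive until the asynchronous export has run.
}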

@marcalff
Member

marcalff commented May 6, 2024

Accepted - fix the ostream exporter to behave the same as OTLP.

@marcalff marcalff added triage/accepted Indicates an issue or PR is ready to be actively worked on. and removed needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. labels May 6, 2024

github-actions bot commented Jul 7, 2024

This issue was marked as stale due to lack of activity.

@github-actions github-actions bot added the Stale label Jul 7, 2024
@t-b
Contributor Author

t-b commented Jul 8, 2024

Unstale.

@github-actions github-actions bot removed the Stale label Jul 11, 2024
github-actions bot commented Sep 11, 2024

This issue was marked as stale due to lack of activity.

@github-actions github-actions bot added the Stale label Sep 11, 2024