S3 Exporter - Sumologic IC marshaler panics when enabled with any other exporter #33501

Open
bgrouxupgrade opened this issue Jun 11, 2024 · 4 comments

bgrouxupgrade commented Jun 11, 2024

Component(s)

exporter/awss3

What happened?

Description

The Sumologic IC (sumo_ic) marshaler for the AWS S3 exporter panics when the exporter is used in conjunction with any other exporter, because it mutates data in the pipeline without setting MutatesData.
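
To make the failure mode concrete, here is a minimal standalone sketch of the mechanism, not the exporter's actual code: when a payload is fanned out to more than one consumer, the shared copy is marked read-only, and a later mutation such as the attribute removal done by the sumo_ic marshaler trips pdata's AssertMutable check. The attribute key and value below are illustrative only.

```go
package main

import "go.opentelemetry.io/collector/pdata/plog"

func main() {
	// Build a tiny Logs payload with one resource attribute.
	logs := plog.NewLogs()
	rl := logs.ResourceLogs().AppendEmpty()
	rl.Resource().Attributes().PutStr("_sourceCategory", "demo") // illustrative key/value

	// The fanout consumer marks the shared payload read-only before handing it
	// to consumers that do not declare MutatesData.
	logs.MarkReadOnly()

	// Removing an attribute now triggers pdata's AssertMutable check:
	// panic: invalid access to shared data
	rl.Resource().Attributes().Remove("_sourceCategory")
}
```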

Steps to Reproduce

  • create a simple otel pipeline for logging
  • add an s3 exporter, use the sumo ic marshaler
  • add another exporter, doesn't matter which kind
  • try to push logs through the pipeline

Example aws s3 config with sumo_ic (a fuller repro config is sketched after this snippet):

  exporters:
      awss3/test:
        marshaler: "sumo_ic"
        s3uploader:
          region: "us-west-2"
          s3_bucket:  "<my bucket arn>"
          s3_prefix: "someprefx/"
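
For reference, a fuller sketch of a configuration that should reproduce the panic. The otlp receiver and the debug exporter are arbitrary choices (any receiver and any second exporter in the same logs pipeline should do), and the bucket and prefix values are placeholders:

```yaml
receivers:
  otlp:
    protocols:
      grpc:

exporters:
  awss3/test:
    marshaler: "sumo_ic"
    s3uploader:
      region: "us-west-2"
      s3_bucket: "<my bucket>"
      s3_prefix: "someprefx/"
  debug: {}   # any second exporter alongside awss3/test triggers the panic

service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [awss3/test, debug]
```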

Expected Result

No boom

Actual Result

Boom

```
panic: invalid access to shared data
goroutine 118 [running]:
go.opentelemetry.io/collector/pdata/internal.(*State).AssertMutable(...)
	go.opentelemetry.io/collector/pdata@v1.1.0/internal/state.go:20
go.opentelemetry.io/collector/pdata/pcommon.Map.Remove({0x400af9cc18?, 0x400b746ee8?}, {0x6f008a3?, 0x4002f78260?})
	go.opentelemetry.io/collector/pdata@v1.1.0/pcommon/map.go:76 +0x1a4
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/awss3exporter.sumoMarshaler.MarshalLogs({}, {0x40047cbcc8?, 0x400b746ee8?})
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/awss3exporter@v0.94.0/sumo_marshaler.go:112 +0x288
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/awss3exporter.(*s3Marshaler).MarshalLogs(0xcfde320?, {0x40047cbcc8?, 0x400b746ee8?})
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/awss3exporter@v0.94.0/s3_marshaler.go:26 +0x30
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/awss3exporter.(*s3Exporter).ConsumeLogs(0x4002fad9b0, {0x7dbfbd8, 0x40030eecb0}, {0x40047cbcc8?, 0x400b746ee8?})
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/awss3exporter@v0.94.0/exporter.go:63 +0x48
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export(0x7dbfb30?, {0x7dbfbd8?, 0x40030eecb0?})
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:59 +0x40
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send(0x400307a390?, {0x7dbfb30?, 0x40044fd3b0?}, {0x7d76878, 0x40116e3518})
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/timeout_sender.go:49 +0xa0
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send(0x4002d06e00?, {0x7dbfb30?, 0x40044fd3b0?}, {0x7d76878?, 0x40116e3518?})
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:35 +0x38
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send(0x4002fad9e0, {0x7dbfb30?, 0x400307a390?}, {0x7d76878?, 0x40116e3518?})
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:171 +0x74
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send(0x0?, {0x7dbfb30?, 0x400307a390?}, {0x7d76878?, 0x40116e3518?})
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:35 +0x38
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseExporter).send(0x4002e5f040, {0x7dbfb30?, 0x400307a390?}, {0x7d76878?, 0x40116e3518?})
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:199 +0x50
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsExporter.func1({0x7dbfb30, 0x400307a390}, {0x40047cbcc8?, 0x400b746ee8?})
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:99 +0xb4
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs(...)
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs(0x400b746c7c?, {0x7dbfb30, 0x400307a390}, {0x40047cbcc8?, 0x400b746ee8?})
	go.opentelemetry.io/collector@v0.94.1/internal/fanoutconsumer/logs.go:73 +0xb4
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs(...)
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
github.com/open-telemetry/opentelemetry-collector-contrib/connector/routingconnector.(*logsConnector).ConsumeLogs(0x400307a270, {0x7dbfb30, 0x400307a390}, {0x400472b548?, 0x4006144d80?})
	github.com/open-telemetry/opentelemetry-collector-contrib/connector/routingconnector@v0.94.0/logs.go:100 +0x1bc
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs(0x400307a2d0, {0x7dbfb30, 0x400307a390}, {0x400472b548?, 0x4006144d80?})
	go.opentelemetry.io/collector@v0.94.1/internal/fanoutconsumer/logs.go:62 +0x208
go.opentelemetry.io/collector/processor/batchprocessor.(*batchLogs).export(0x4003072480, {0x7dbfb30, 0x400307a390}, 0x0?, 0x0)
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:489 +0x168
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).sendItems(0x40030724c0, 0x4003523f2c?)
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:256 +0x54
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).start(0x40030724c0)
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:218 +0x16c
created by go.opentelemetry.io/collector/processor/batchprocessor.(*batchProcessor).newShard in goroutine 1
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:160 +0x19c
```
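
The frames above show the sumo_ic path (sumo_marshaler.go:112) calling pcommon.Map.Remove on the payload that the fanout consumer shares read-only across exporters. As a sketch of the kind of change the report points at, and assuming the exporterhelper API at this collector version, the factory could declare MutatesData so the exporter receives its own copy. The package, function, and pushLogs names below are hypothetical, not the actual awss3exporter code:

```go
// Hypothetical sketch, not the actual awss3exporter factory code.
package awss3exporterfix

import (
	"context"

	"go.opentelemetry.io/collector/component"
	"go.opentelemetry.io/collector/consumer"
	"go.opentelemetry.io/collector/exporter"
	"go.opentelemetry.io/collector/exporter/exporterhelper"
	"go.opentelemetry.io/collector/pdata/plog"
)

// pushLogs stands in for the real s3Exporter.ConsumeLogs.
func pushLogs(_ context.Context, _ plog.Logs) error { return nil }

// newLogsExporter shows where the capability would be declared: because
// sumo_ic removes attributes from the incoming plog.Logs, the exporter
// should advertise MutatesData so the fanout consumer hands it a copy
// rather than the shared, read-only payload.
func newLogsExporter(ctx context.Context, set exporter.CreateSettings, cfg component.Config) (exporter.Logs, error) {
	return exporterhelper.NewLogsExporter(
		ctx, set, cfg,
		pushLogs,
		exporterhelper.WithCapabilities(consumer.Capabilities{MutatesData: true}),
	)
}
```

An alternative would be for the marshaler itself to deep-copy the incoming plog.Logs (CopyTo into a fresh plog.NewLogs()) and strip attributes only from the copy, leaving the shared payload untouched.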

### Collector version

v0.94

### Environment information

_No response_

### OpenTelemetry Collector configuration

_No response_

### Log output

_No response_

### Additional context

_No response_
bgrouxupgrade added the bug and needs triage labels on Jun 11, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

atoulme commented Oct 2, 2024

That's a problem for sumo_ic.

atoulme added the comp:sumologic label and removed the Stale and exporter/awss3 labels on Oct 2, 2024
atoulme removed the needs triage label on Oct 2, 2024

github-actions bot commented Dec 2, 2024

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

github-actions bot added the Stale label on Dec 2, 2024