
Metric attributes not being exported from collector #5247

Closed
martinxluptak opened this issue Apr 22, 2022 · 2 comments
Labels
bug Something isn't working

Comments

@martinxluptak

Describe the bug
The attributes (labels) of metrics collected in my application, which uses opentelemetry-collector, are not being exported.

I am making a simple node.js request counter:

import { DiagConsoleLogger, DiagLogLevel, diag } from '@opentelemetry/api';
import { OTLPMetricExporter } from '@opentelemetry/exporter-metrics-otlp-grpc';
import { MeterProvider } from '@opentelemetry/sdk-metrics-base';
import { Resource } from '@opentelemetry/resources';
import { SemanticResourceAttributes } from '@opentelemetry/semantic-conventions';

// Optional and only needed to see the internal diagnostic logging (during development)
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);

const exporter = new OTLPMetricExporter();

const meter = new MeterProvider({
    exporter,
    interval: 10 * 1000,
    resource: new Resource({
        [SemanticResourceAttributes.SERVICE_NAME]: 'my-distinguishing-metric-service',
    }),
}).getMeter('example-exporter-collector');

const requestCounter = meter.createCounter('requests', {
    description: 'Example of a Counter',
});

const userId = 'myUserId';
const statusCode = 500;

const labels = { userId, statusCode };

setInterval(() => {
    // @ts-ignore -- the 0.27 label types only accept string values, and
    // statusCode above is a number
    requestCounter.add(1, labels);
}, 1000);

Steps to reproduce
I can verify that the metric itself is exported in both logging and OTLP exporters. However, the labels are not exported via OTLP and are also missing in the logging exporter with logLevel: debug.
Log from logging exporter:

otel-collector             |
otel-collector             | 2022-04-22T10:25:22.359Z   INFO    loggingexporter/logging_exporter.go:56  MetricsExporter {"#metrics": 1}
otel-collector             | 2022-04-22T10:25:22.359Z   DEBUG   loggingexporter/logging_exporter.go:66  ResourceMetrics #0
otel-collector             | Resource SchemaURL:
otel-collector             | Resource labels:
otel-collector             |      -> service.name: STRING(my-distinguishing-metric-service)
otel-collector             |      -> telemetry.sdk.language: STRING(nodejs)
otel-collector             |      -> telemetry.sdk.name: STRING(opentelemetry)
otel-collector             |      -> telemetry.sdk.version: STRING(1.0.1)
otel-collector             | ScopeMetrics #0
otel-collector             | ScopeMetrics SchemaURL:
otel-collector             | InstrumentationScope example-exporter-collector
otel-collector             | Metric #0
otel-collector             | Descriptor:
otel-collector             |      -> Name: requests
otel-collector             |      -> Description: Example of a Counter
otel-collector             |      -> Unit: 1
otel-collector             |      -> DataType: Sum
otel-collector             |      -> IsMonotonic: true
otel-collector             |      -> AggregationTemporality: AGGREGATION_TEMPORALITY_CUMULATIVE
otel-collector             | NumberDataPoints #0
otel-collector             | StartTimestamp: 2022-04-22 10:20:42.327000064 +0000 UTC
otel-collector             | Timestamp: 2022-04-22 10:25:21.634371328 +0000 UTC
otel-collector             | Value: 279.000000
otel-collector             |


What did you expect to see?
I expected the attributes to be displayed in logging and OTLP exporters.

What did you see instead?
I only see the metric itself without the attributes.

What config did you use?

otel-config.yaml:

extensions:
  health_check: {}
receivers:
  otlp:
    protocols:
      grpc:
processors:
  batch: # compresses data and reduces number of outgoing connections.
exporters:
  logging:
    logLevel: debug
  otlp:
    endpoint: https://otlp.nr-data.net:4317
    headers:
      api-key: $NEW_RELIC_API_KEY
service:
  extensions: [health_check]
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, otlp]
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, otlp]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, otlp]

Environment

  • WSL2/Ubuntu
  • Node v14.18.3
  • otel/opentelemetry-collector:0.49.0 docker image.

package.json excerpt:

...
        "@opentelemetry/api": "^1.0.4",
        "@opentelemetry/exporter-metrics-otlp-grpc": "^0.27.0",
        "@opentelemetry/resources": "^1.0.1",
        "@opentelemetry/sdk-metrics-base": "^0.27.0",
        "@opentelemetry/semantic-conventions": "^1.0.1",
...

Additional context

  • I have tried removing the batch processor, no change.
  • I haven't tried other SDKs (Python, Go, ...). It seems likely this is caused by a (mis)configuration in opentelemetry-collector, or perhaps an unsupported feature, but I haven't found it documented anywhere (see the console-exporter sketch after this list for one way to narrow it down).
  • Clone my repository to reproduce the bug
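
One way to narrow this down is to export from the same process to the console, taking OTLP and the collector out of the picture entirely. A minimal sketch, assuming @opentelemetry/sdk-metrics-base 0.27 exposes ConsoleMetricExporter (adjust the import if it lives elsewhere in your version): if the labels show up on stdout, the SDK records them correctly and they are being lost on the wire or in the collector.

import { MeterProvider, ConsoleMetricExporter } from '@opentelemetry/sdk-metrics-base';

// Same counter setup as the reproduction above, but exporting to stdout.
const meter = new MeterProvider({
    exporter: new ConsoleMetricExporter(),
    interval: 10 * 1000,
}).getMeter('example-exporter-collector');

const requestCounter = meter.createCounter('requests', {
    description: 'Example of a Counter',
});

setInterval(() => {
    // String values satisfy the 0.27 label types without @ts-ignore.
    requestCounter.add(1, { userId: 'myUserId', statusCode: '500' });
}, 1000);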
martinxluptak added the bug label on Apr 22, 2022
@svetlanabrennan

@martinxluptak

This issue is happening because the JS OTLP exporters are using an outdated version of the OTLP protobuf definitions. They are still sending the old labels field over OTLP instead of the newer attributes field, but there is an open issue to get the protobuf version updated.
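
Roughly, the shape difference looks like this (a hand-written sketch, not the actual generated proto types):

// Old OTLP data point: string-only "labels", since deprecated.
interface OldNumberDataPoint {
    labels: { key: string; value: string }[];
    startTimeUnixNano: string;
    timeUnixNano: string;
    asDouble?: number;
}

// New OTLP data point: typed "attributes" (KeyValue/AnyValue),
// matching how span attributes are encoded. Collectors that read
// only this field see no attributes on the old payloads.
interface NewNumberDataPoint {
    attributes: { key: string; value: { stringValue?: string; intValue?: number; boolValue?: boolean } }[];
    startTimeUnixNano: string;
    timeUnixNano: string;
    asDouble?: number;
}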

Since there is an open issue about this in the otel-js repo, we can close this issue here.

@martinxluptak
Author

Closed; tracking the opentelemetry-js issue instead.
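
For anyone landing here later, a minimal sketch of the reworked JS metrics SDK API (0.29+, later published as @opentelemetry/sdk-metrics), which sends attributes rather than labels; treat the exact package name and reader API as version-dependent:

import { MeterProvider, PeriodicExportingMetricReader } from '@opentelemetry/sdk-metrics';
import { OTLPMetricExporter } from '@opentelemetry/exporter-metrics-otlp-grpc';
import { Resource } from '@opentelemetry/resources';

const meterProvider = new MeterProvider({
    resource: new Resource({ 'service.name': 'my-distinguishing-metric-service' }),
});

// A periodic reader replaces the old { exporter, interval } constructor options.
meterProvider.addMetricReader(new PeriodicExportingMetricReader({
    exporter: new OTLPMetricExporter(),
    exportIntervalMillis: 10 * 1000,
}));

const meter = meterProvider.getMeter('example-exporter-collector');
const requestCounter = meter.createCounter('requests', {
    description: 'Example of a Counter',
});

// Attribute values may be numbers here; no @ts-ignore needed.
setInterval(() => requestCounter.add(1, { userId: 'myUserId', statusCode: 500 }), 1000);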
