
Network metrics are not coming #34900

Open

mohitgoyal201617 opened this issue Aug 28, 2024 · 5 comments

mohitgoyal201617 commented Aug 28, 2024

Component(s)

No response

What happened?

Description

Steps to Reproduce

Apply the collector config below:

receivers:
  otlp:
    protocols:
      grpc:
      http:
  docker_stats:
    endpoint: unix:///var/run/docker.sock
    collection_interval: 30s
    timeout: 10s
    api_version: 1.24
    metrics:
      container.uptime:
        enabled: true
      container.restarts:
        enabled: true
      container.network.io.usage.rx_errors:
        enabled: true
      container.network.io.usage.tx_errors:
        enabled: true
      container.network.io.usage.rx_packets:
        enabled: true
      container.network.io.usage.tx_packets:
        enabled: true
processors:
  batch:
    send_batch_size: 1000
    timeout: 10s
  resourcedetection:
    detectors: [env, system]
    timeout: 2s
    system:
      hostname_sources: [os]
exporters:
  logging:
    verbosity: normal
 
service:
  pipelines:
    metrics:
      receivers: [otlp, docker_stats]
      processors: [resourcedetection, batch]
      exporters: [logging]

Expected Result

The output should contain the network-related metrics.

Actual Result

No network-related metrics are emitted.

Collector version

v0.106.0

Environment information

Environment

OS: (e.g., "Ubuntu 20.04")
Compiler(if manually compiled): (e.g., "go 14.2")

OpenTelemetry Collector configuration

No response

Log output

No response

Additional context

No response

mohitgoyal201617 added the bug (Something isn't working) and needs triage (New item requiring triage) labels Aug 28, 2024
Pinging code owners for receiver/dockerstats: @rmfitzpatrick @jamesmoessis. See Adding Labels via Comments if you do not have permissions to add labels yourself.

jamesmoessis (Contributor) commented Aug 29, 2024

@mohitgoyal201617 I need more information on your setup. What Docker API version and Docker Engine version are you using? Are you using the latest version of the collector?

Are you able to provide a sample output from the debug exporter, so we can see what is and is not exported?

This hasn't been reported anywhere else, so until you can provide some evidence this is an issue on the docker stats receiver, I can't do much to help you.
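
For reference, a minimal sketch of how the requested debug output could be captured, assuming the debug exporter bundled with recent collector releases; the verbosity level and pipeline wiring here are illustrative, not taken from the reporter's setup:

exporters:
  debug:
    # "detailed" prints every resource, scope, and data point, so the
    # presence or absence of container.network.io.usage.* metrics is visible
    verbosity: detailed

service:
  pipelines:
    metrics:
      receivers: [docker_stats]
      processors: [batch]
      exporters: [debug]

Restricting the pipeline to the docker_stats receiver keeps the debug output focused on what that receiver actually emits.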

mohitgoyal201617 (Author) commented
Docker Engine Version: 20.10.8
API Version: 1.41
Go Version: go1.16.6
Otel Collector: V0.106.0
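
One configuration detail that might be worth ruling out (an assumption on my part, not a confirmed cause): the reproduction config pins api_version: 1.24, while the engine above reports API version 1.41. A sketch of aligning the receiver with the engine's API version, quoting the value so YAML keeps it as a string rather than a float:

receivers:
  docker_stats:
    endpoint: unix:///var/run/docker.sock
    collection_interval: 30s
    # hypothetical troubleshooting change: request the API version the
    # engine actually reports, quoted so it is not parsed as a YAML float
    api_version: "1.41"

Whether this affects the network metrics is untested here; it simply removes one difference between the configured and the reported API versions.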

mohitgoyal201617 (Author) commented

Any ideas?

github-actions bot commented
This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label Dec 12, 2024