New component: Splunk Enterprise Receiver #12667

Closed
MovieStoreGuy opened this issue Jul 25, 2022 · 21 comments
@MovieStoreGuy
Contributor

The purpose and use-cases of the new component

For those using self-hosted Splunk (Splunk Enterprise), the indexer cluster exposes a few endpoints that can be scraped to help monitor the platform.

Example configuration for the component

splunkenterprise:
  hostname: <host url>
  collection_period: <time>
  metrics:
    splunk.enterprise.cluster.peer.status:
      enabled: true
    # ... I am sure there are more, but we'll save that for more feedback from the Splunk crew
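
For illustration, on the Go side a config like this would typically be backed by a struct decoded via `mapstructure` tags. The sketch below is hypothetical and just mirrors the field names in the YAML above, not any final schema:

package splunkenterprisereceiver

import "time"

// Config is a hypothetical sketch of how the YAML above could map onto a
// receiver configuration struct; field names mirror the proposal, not the
// shipped component.
type Config struct {
	// Hostname is the base URL of the Splunk Enterprise instance to scrape.
	Hostname string `mapstructure:"hostname"`

	// CollectionPeriod controls how often the exposed endpoints are polled.
	CollectionPeriod time.Duration `mapstructure:"collection_period"`

	// Metrics toggles individual metrics on or off.
	Metrics map[string]MetricSettings `mapstructure:"metrics"`
}

// MetricSettings enables or disables a single metric.
type MetricSettings struct {
	Enabled bool `mapstructure:"enabled"`
}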

Telemetry data types supported

Metrics

Is this a vendor-specific component? If so, are you proposing to contribute this as a representative of the vendor?

This is a vendor-specific component, but I am happy for it to be a community-supported receiver.

Sponsor (Optional)

It would be nice to have someone from Splunk (@dmitryax, @bogdandrutu) as a sponsor for this.

@mx-psi added the comp:splunk and Sponsor Needed labels on Jul 25, 2022
@dmitryax
Member

dmitryax commented Jul 25, 2022

I don't work on Splunk Enterprise, but I can sponsor this receiver

@dmitryax removed the Sponsor Needed label on Jul 25, 2022
@MovieStoreGuy
Contributor Author

Thank you @dmitryax.

We have a few custom checks that we run with the SignalFx agent; however, I don't believe what we are doing is unique to Atlassian, so I am happy to upstream our work.

@dmitryax
Member

We have a few custom checks that we run with the SignalFx agent

What do you mean? Is there a SignalFx agent monitor for Splunk Enterprise?

@MovieStoreGuy
Contributor Author

MovieStoreGuy commented Jul 25, 2022

As in, we've written a custom Python monitor that currently monitors our hosted Splunk platform using the REST endpoints it exposes.

(Sorry, still trying to wake up)

@dmitryax
Member

Oh I see. Makes sense. It would be nice to have a native OTel collector receiver for that

@mx-psi added the Accepted Component label on Jul 27, 2022
@github-actions
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions bot added the Stale label on Nov 10, 2022
@atoulme
Contributor

atoulme commented Nov 16, 2022

I'd be interested to see this one happen. How can I help?

@MovieStoreGuy
Contributor Author

Hey @atoulme ,

Sorry, I've been meaning to come back to this one. What I want to do is take the existing internal Python scripts we use to monitor Splunk and upstream them for everyone to use. We are currently using Splunk 8 and looking forward to Splunk 9.

What would be really helpful is any external documentation on monitoring endpoints and associated runbooks; help validating whether this could work with other deployments of Splunk would also be amazing.

@atoulme
Contributor

atoulme commented Nov 16, 2022

Ah, I'm not sure I have a comprehensive guide. I'm always in favor of small iterations for this type of thing - we can start with just one or two metrics. Splunk publishes an introspection API we can call: https://docs.splunk.com/Documentation/Splunk/9.0.1/RESTREF/RESTintrospect#server.2Fhealth.2Fdeployment

Starting small with just one call.
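
For example, that one call could look roughly like this - a rough sketch only, assuming the default management port 8089, basic auth, and `output_mode=json`; the host and credentials are placeholders:

package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// One call against the introspection API linked above; the host and
	// credentials are placeholders, and 8089 is Splunk's default management port.
	req, err := http.NewRequest(http.MethodGet,
		"https://splunk.example.com:8089/services/server/health/deployment?output_mode=json", nil)
	if err != nil {
		panic(err)
	}
	req.SetBasicAuth("admin", "changeme")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Decode and print the JSON response; a real scraper would map the
	// reported health fields onto metric data points instead.
	var body map[string]any
	if err := json.NewDecoder(resp.Body).Decode(&body); err != nil {
		panic(err)
	}
	fmt.Println(body)
}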

@MovieStoreGuy
Contributor Author

If you're happy to wait out the weekend, my goal this week is to port our internal stuff to the collector and then share it upstream.

Also happy to talk more on Slack.

@atoulme
Contributor

atoulme commented Nov 16, 2022

No rush here.

@atoulme
Contributor

atoulme commented Jan 7, 2023

Happy new year! Interested if there's more to share now.

@atoulme removed the Stale label on Feb 11, 2023
@shalper2
Contributor

I'm happy to work with @MovieStoreGuy on this

@MovieStoreGuy
Contributor Author

I have somewhat of a base PR made for this, but I haven't worked out the kinks in it yet. Let me open it up as a draft so that others can extend what I've done.

@github-actions
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions bot added the Stale label on Jun 26, 2023
@dmitryax removed the Stale label on Jun 26, 2023
@dmitryax
Member

dmitryax commented Jun 26, 2023

@shalper2 are you planning to continue working on this? If so, can this be assigned to you?

@shalper2
Contributor

hey, yes sorry this should have been assigned to me!

@github-actions
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions bot added the Stale label on Aug 28, 2023
@atoulme removed the Stale label on Aug 28, 2023
dmitryax pushed a commit that referenced this issue Sep 13, 2023
**Description:** First pass at implementing the component. This PR is
primarily focused on implementing the component's architecture, with less
focus on the collection of actual Splunk performance data. This was done
to keep the PR relatively short. Considerable work has, however, been done
to implement the receiver logic and accompanying tests.

**Link to tracking Issue:**
[12667](#12667)
MovieStoreGuy pushed a commit that referenced this issue Sep 29, 2023
**Description:** 
- Adds additional metrics to Splunk Enterprise receiver obtained from
API endpoints
- Updates tests and docs for these additional metrics

**Link to tracking Issue:**
#12667

**Testing:** 
- Additional tests for new endpoints and metrics added to `scraper_test`
- Verify current tests also pass

**Documentation:**
- Newly generated docs entries for additional metrics
@github-actions
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions bot added the Stale label on Oct 30, 2023
@atoulme
Contributor

atoulme commented Oct 30, 2023

This is done! Closing as complete.

@atoulme closed this as completed on Oct 30, 2023
jmsnll pushed a commit to jmsnll/opentelemetry-collector-contrib that referenced this issue Nov 12, 2023