
[receiver/splunkenterprise] fix flaky search for iops metrics #35081

Closed

shalper2 opened this issue Sep 9, 2024 · 4 comments

Comments

@shalper2
Contributor

shalper2 commented Sep 9, 2024

Component(s)

receiver/splunkenterprise

What happened?

Description

The search that generates the iops metrics is flaky because it references a directory that may not be present on the Splunk deployment's host machine.
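For illustration, here is a minimal sketch (in Go, mirroring how the receiver embeds its SPL searches as string constants) of the kind of introspection query involved and how it can avoid depending on a hard-coded path. The index, sourcetype, field names, and the search itself are assumptions made for this example, not the receiver's actual query:

```go
// Hypothetical sketch only, not the receiver's actual code or search.
// The idea: aggregate IOStats introspection events by component rather than
// filtering on a hard-coded directory (e.g. a specific data.mount_point),
// so the search does not break on hosts where that path is absent.
// Index, sourcetype, and field names below are assumptions.
package main

import "fmt"

const avgIOPSSearchSketch = `search index=_introspection sourcetype=splunk_resource_usage component=IOStats
| eval iops='data.reads_ps' + 'data.writes_ps'
| stats avg(iops) AS avg_iops BY host`

func main() {
	fmt.Println(avgIOPSSearchSketch)
}
```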

Collector version

v0.108.0

Environment information

No response

OpenTelemetry Collector configuration

No response

Log output

No response

Additional context

This is not an issue directly related to the OpenTelemetry Collector, or even to the splunkenterprise receiver's own internal logic/implementation; the flakiness comes from the Splunk search itself.

@shalper2 shalper2 added the bug (Something isn't working) and needs triage (New item requiring triage) labels Sep 9, 2024
Contributor

github-actions bot commented Sep 9, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@crobert-1
Member

Makes sense to me, and was filed by a code owner. Removing needs triage.

@crobert-1 crobert-1 removed the needs triage (New item requiring triage) label Sep 9, 2024
mx-psi pushed a commit that referenced this issue Sep 11, 2024
…35082)

**Description:** 
Quick bugfix for a flaky Splunk search related to gathering average iops
metrics in the splunkenterprise receiver.

**Link to tracking Issue:** 

[35081](#35081)

**Testing:**
Tested the amended search in Splunk Enterprise deployments and received proper results.

**Documentation:**
No new documentation provided.

---------

Co-authored-by: Curtis Robert <[email protected]>
jriguera pushed a commit to springernature/opentelemetry-collector-contrib that referenced this issue Oct 4, 2024
…pen-telemetry#35082)

Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Nov 11, 2024
Contributor

This issue has been closed as inactive because it has been stale for 120 days with no activity.

@github-actions github-actions bot closed this as not planned (Won't fix, can't repro, duplicate, stale) Jan 10, 2025