
Unit test failing in CI: OtlpHttpExporterTestPeer.ExportJsonIntegrationTest #1049

Closed
lalitb opened this issue Nov 2, 2021 · 9 comments · Fixed by #1080, #1171 or #1185
Labels: bug (Something isn't working)
lalitb (Member) commented Nov 2, 2021

One of the failure instances:
https://github.com/open-telemetry/opentelemetry-cpp/runs/4083453037?check_suite_focus=true

[ RUN      ] OtlpHttpExporterTestPeer.ExportJsonIntegrationTest
exporters/otlp/test/otlp_http_exporter_test.cc:240: Failure
Value of: waitForRequests(2, old_count + 1)
  Actual: false
Expected: true
[  FAILED  ] OtlpHttpExporterTestPeer.ExportJsonIntegrationTest (4446 ms)
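
For context, `waitForRequests` blocks until the test's HTTP server has seen the expected number of requests or a deadline expires; a return value of `false` means the deadline hit first. Below is a minimal sketch of such a helper, assuming the first argument is a timeout in seconds and the second the expected request count (the real helper in `otlp_http_exporter_test.cc` may differ):

```cpp
#include <chrono>
#include <condition_variable>
#include <mutex>

// Hypothetical sketch of a waitForRequests-style helper: block until
// `expected` requests have been counted or `timeout_sec` seconds elapse.
class RequestCounter
{
public:
  // Called by the test HTTP handler for every request it receives.
  void OnRequest()
  {
    std::lock_guard<std::mutex> lk(mu_);
    ++count_;
    cv_.notify_all();
  }

  // Returns false if the predicate is still false when the timeout expires,
  // which is what the CI failure above reports.
  bool waitForRequests(unsigned timeout_sec, unsigned expected)
  {
    std::unique_lock<std::mutex> lk(mu_);
    return cv_.wait_for(lk, std::chrono::seconds(timeout_sec),
                        [&] { return count_ >= expected; });
  }

private:
  std::mutex mu_;
  std::condition_variable cv_;
  unsigned count_ = 0;
};
```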
lalitb added the bug label Nov 2, 2021
ThomsonTan (Contributor) commented:

Wondering why this was not caught by our PR build.

lalitb (Member, Author) commented Nov 2, 2021

> Wondering why this was not caught by our PR build.

I don't think it fails consistently, so it wasn't caught in our PR build.

esigo (Member) commented Nov 3, 2021

This was happening here as well #1044 (comment).

lalitb self-assigned this Nov 8, 2021
lalitb (Member, Author) commented Nov 12, 2021

Haven't seen this happening this week; probably a transient issue with the CI pipeline. Will watch for a while longer before closing.

owent (Member) commented Nov 21, 2021

I find the tests take a long time to finish when run under valgrind. This test alone takes almost 6 seconds even in a successful run.
Maybe just increasing the timeout would help?
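
Illustration only: with the RequestCounter sketch above, giving the wait more headroom is just a larger first argument (the deadline actually used by the test may differ).

```cpp
// 30 seconds instead of 2: slow environments (e.g. valgrind runs) still pass
// if the request eventually arrives; fast environments return as soon as it does.
bool WaitWithGenerousDeadline(RequestCounter &counter, unsigned expected)
{
  return counter.waitForRequests(30, expected);
}
```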

esigo (Member) commented Jan 26, 2022

It happened again. I'll try to raise a PR with a mock in the coming days.

ThomsonTan reopened this Jan 26, 2022
owent (Member) commented Jan 27, 2022

> It happened again. I'll try to raise a PR with a mock in the coming days.

Thanks. Maybe it's also better to use multi_handle instead of starting a new thread for each asynchronous HTTP request when using libcurl. Should I start another issue?

lalitb (Member, Author) commented Jan 27, 2022

> Thanks. Maybe it's also better to use multi_handle instead of starting a new thread for each asynchronous HTTP request when using libcurl. Should I start another issue?

Thanks @owent. Which thread invocation are we talking about here? The only place a thread may get created internally for libcurl is here:

`result_ = std::async(std::launch::async, [this, callback] {`

Do we want to remove it, and instead use multi_handle?

esigo self-assigned this Jan 28, 2022
esigo mentioned this issue Jan 30, 2022
owent (Member) commented Feb 19, 2022

> > Thanks. Maybe it's also better to use multi_handle instead of starting a new thread for each asynchronous HTTP request when using libcurl. Should I start another issue?
>
> Thanks @owent. Which thread invocation are we talking about here? The only place a thread may get created internally for libcurl is here:
>
> `result_ = std::async(std::launch::async, [this, callback] {`
>
> Do we want to remove it, and instead use multi_handle?

Yes, I can create another issue and try to use multi_handle after #1209 is merged. It can reduce the cost when there are a lot of sessions.
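
For reference, a sketch of the curl_multi pattern being discussed: a single thread drives several easy handles, instead of one `std::async` task per request. This is not the exporter's actual code; the URL and handle count are placeholders, and error handling is omitted.

```cpp
#include <curl/curl.h>

// Drive two transfers from one thread with a curl multi handle,
// rather than launching a separate thread/async task per request.
int main()
{
  curl_global_init(CURL_GLOBAL_DEFAULT);

  CURLM *multi = curl_multi_init();
  CURL *easy1  = curl_easy_init();
  CURL *easy2  = curl_easy_init();
  curl_easy_setopt(easy1, CURLOPT_URL, "http://localhost:4318/v1/traces");
  curl_easy_setopt(easy2, CURLOPT_URL, "http://localhost:4318/v1/traces");
  curl_multi_add_handle(multi, easy1);
  curl_multi_add_handle(multi, easy2);

  int still_running = 0;
  do
  {
    // Progress all transfers that are ready, then wait (up to 1s) for
    // activity on any of them before the next iteration.
    curl_multi_perform(multi, &still_running);
    curl_multi_wait(multi, nullptr, 0, 1000, nullptr);
  } while (still_running > 0);

  curl_multi_remove_handle(multi, easy1);
  curl_multi_remove_handle(multi, easy2);
  curl_easy_cleanup(easy1);
  curl_easy_cleanup(easy2);
  curl_multi_cleanup(multi);
  curl_global_cleanup();
  return 0;
}
```

A single multi handle like this can be reused across many in-flight sessions, which is where the per-session cost reduction mentioned above would come from.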
