
context-propagation-quickstart fails after "Use MethodHandle" changes #45346

Closed
rsvoboda opened this issue Jan 3, 2025 · 11 comments · Fixed by #45406

@rsvoboda
Member

rsvoboda commented Jan 3, 2025

Describe the bug

context-propagation-quickstart fails after "Use MethodHandle" changes

This quickstart has been failing since https://github.com/quarkusio/quarkus-quickstarts/actions/runs/12424359109, which has a December 20, 2024 03:24 timestamp.

I bisected the commits and identified the last commit where the quickstart worked: 069c5a0


It fails after the "Use MethodHandle" changes.

When I used main and reverted 38d776a and e430fbb, context-propagation-quickstart was still failing.

There must be other changes that trigger the hang of context-propagation-quickstart. cc @geoand

The quickstart hangs and prints this in the logs after one minute:

2025-01-03 08:41:01,465 WARN  [com.arj.ats.arjuna] (Transaction Reaper) ARJUNA012117: TransactionReaper::check processing TX 0:ffff0a141eec:d97b:67779451:0 in state  RUN
2025-01-03 08:41:01,467 WARN  [com.arj.ats.arjuna] (Transaction Reaper Worker 0) ARJUNA012121: TransactionReaper::doCancellations worker Thread[#180,Transaction Reaper Worker 0,5,main] successfully canceled TX 0:ffff0a141eec:d97b:67779451:0

Expected behavior

context-propagation-quickstart works with Quarkus main

Actual behavior

context-propagation-quickstart fails with Quarkus main

How to Reproduce?

Run context-propagation-quickstart from the development branch of https://github.com/quarkusio/quarkus-quickstarts against Quarkus main.

mvn clean verify -pl context-propagation-quickstart -Dquarkus.platform.version=3.17.5 works well
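
For completeness, a sketch of the corresponding run against Quarkus main (assuming main has been built locally first, which installs 999-SNAPSHOT artifacts; the -Dquickly shortcut comes from the Quarkus contributing guide, exact steps may differ):

git clone https://github.com/quarkusio/quarkus.git
cd quarkus && ./mvnw -Dquickly
cd ../quarkus-quickstarts
mvn clean verify -pl context-propagation-quickstart -Dquarkus.platform.version=999-SNAPSHOT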

Output of uname -a or ver

macOS

Output of java -version

Java 21

Quarkus version or git rev

Quarkus main

Build tool (i.e. output of mvnw --version or gradlew --version)

No response

Additional information

No response

rsvoboda added the kind/bug (Something isn't working) label on Jan 3, 2025

quarkus-bot bot commented Jan 3, 2025

/cc @FroMage (context-propagation), @manovotn (context-propagation)

@geoand
Contributor

geoand commented Jan 3, 2025

I could not reproduce this.

@rsvoboda
Member Author

rsvoboda commented Jan 3, 2025

@geoand here is the latest GH Actions run: https://github.com/quarkusio/quarkus-quickstarts/actions/runs/12591744899/job/35095307992

2025-01-03 03:12:11,384 INFO  [io.quarkus] (main) Installed features: [agroal, cdi, hibernate-orm, hibernate-orm-panache, jdbc-postgresql, kafka-client, messaging, messaging-kafka, narayana-jta, rest, rest-jackson, smallrye-context-propagation, vertx]
[Hibernate] 
    select
        p1_0.id,
        p1_0.value 
    from
        Price p1_0
2025-01-03 03:13:13,087 WARN  [com.arj.ats.arjuna] (Transaction Reaper) ARJUNA012117: TransactionReaper::check processing TX 0:ffff0a01005c:a7bf:6777558d:0 in state  RUN
2025-01-03 03:13:13,088 WARN  [com.arj.ats.arjuna] (Transaction Reaper Worker 0) ARJUNA012121: TransactionReaper::doCancellations worker Thread[Transaction Reaper Worker 0,5,main] successfully canceled TX 0:ffff0a01005c:a7bf:6777558d:0
@QuarkusTest has detected a hang, as there has been no test activity in PT10M
To configure this timeout use the quarkus.test.hang-detection-timeout config property
A stack trace is below to help diagnose the potential hang
....
...
2025-01-03 03:42:13,161 INFO  [org.apa.htt.imp.exe.RetryExec] (sse-event-source(http://localhost:8081/prices)-thread-1) I/O exception (org.apache.http.NoHttpResponseException) caught when processing request to {}->http://localhost:8081: The target server failed to respond
2025-01-03 03:42:13,161 INFO  [org.apa.htt.imp.exe.RetryExec] (sse-event-source(http://localhost:8081/prices)-thread-1) Retrying request to {}->http://localhost:8081
2025-01-03 03:42:13,162 WARN  [com.arj.ats.arjuna] (vert.x-eventloop-thread-1) ARJUNA012077: Abort called on already aborted atomic action 0:ffff0a01005c:a7bf:6777558d:0
2025-01-03 03:42:13,162 ERROR [io.qua.mut.run.MutinyInfrastructure] (vert.x-eventloop-thread-1) Mutiny had to drop the following exception: java.lang.RuntimeException: jakarta.transaction.RollbackException: ARJUNA016102: The transaction is not active! Uid is 0:ffff0a01005c:a7bf:6777558d:0
	at io.quarkus.narayana.jta.runtime.interceptor.TransactionalInterceptorBase.lambda$handleAsync$8(TransactionalInterceptorBase.java:305)
	at io.smallrye.context.impl.wrappers.SlowContextualRunnable.run(SlowContextualRunnable.java:19)
	at io.smallrye.mutiny.groups.MultiOnTerminate.lambda$invoke$0(MultiOnTerminate.java:52)
	at io.smallrye.mutiny.operators.multi.MultiOnTerminationInvoke$MultiOnTerminationInvokeProcessor.execute(MultiOnTerminationInvoke.java:72)
	at io.smallrye.mutiny.operators.multi.MultiOnTerminationInvoke$MultiOnTerminationInvokeProcessor.cancel(MultiOnTerminationInvoke.java:63)
	at org.jboss.resteasy.reactive.server.handlers.PublisherResponseHandler$AbstractMultiSubscriber.lambda$new$0(PublisherResponseHandler.java:268)
	at org.jboss.resteasy.reactive.server.vertx.VertxResteasyReactiveRequestContext$2.handle(VertxResteasyReactiveRequestContext.java:97)
	at org.jboss.resteasy.reactive.server.vertx.VertxResteasyReactiveRequestContext$2.handle(VertxResteasyReactiveRequestContext.java:94)
	at io.vertx.core.impl.ContextInternal.dispatch(ContextInternal.java:270)
	at io.vertx.core.http.impl.Http1xServerResponse.handleClosed(Http1xServerResponse.java:710)
	at io.vertx.core.http.impl.Http1xServerResponse.handleException(Http1xServerResponse.java:677)
	at io.vertx.core.http.impl.Http1xServerRequest.handleException(Http1xServerRequest.java:711)
	at io.vertx.core.http.impl.Http1xServerConnection.lambda$handleClosed$7(Http1xServerConnection.java:501)
	at io.vertx.core.impl.ContextImpl.execute(ContextImpl.java:327)
	at io.vertx.core.impl.DuplicatedContext.execute(DuplicatedContext.java:158)
	at io.vertx.core.impl.ContextInternal.execute(ContextInternal.java:207)
	at io.vertx.core.http.impl.Http1xServerConnection.handleClosed(Http1xServerConnection.java:500)
	at io.vertx.core.net.impl.VertxHandler.channelInactive(VertxHandler.java:143)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:303)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:281)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:274)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:81)
	at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:280)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:303)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:281)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:274)
	at io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:377)
	at io.quarkus.narayana.jta.runtime.interceptor.TransactionalInterceptorBase.lambda$handleAsync$8(TransactionalInterceptorBase.java:305)
	at io.smallrye.context.impl.wrappers.SlowContextualRunnable.run(SlowContextualRunnable.java:19)
	at io.smallrye.mutiny.groups.MultiOnTerminate.lambda$invoke$0(MultiOnTerminate.java:52)
	at io.smallrye.mutiny.operators.multi.MultiOnTerminationInvoke$MultiOnTerminationInvokeProcessor.execute(MultiOnTerminationInvoke.java:72)
	at io.smallrye.mutiny.operators.multi.MultiOnTerminationInvoke$MultiOnTerminationInvokeProcessor.cancel(MultiOnTerminationInvoke.java:63)
	at org.jboss.resteasy.reactive.server.handlers.PublisherResponseHandler$AbstractMultiSubscriber.lambda$new$0(PublisherResponseHandler.java:268)
	at org.jboss.resteasy.reactive.server.vertx.VertxResteasyReactiveRequestContext$2.handle(VertxResteasyReactiveRequestContext.java:97)
	at org.jboss.resteasy.reactive.server.vertx.VertxResteasyReactiveRequestContext$2.handle(VertxResteasyReactiveRequestContext.java:94)
	at io.vertx.core.impl.ContextInternal.dispatch(ContextInternal.java:270)
	at io.vertx.core.http.impl.Http1xServerResponse.handleClosed(Http1xServerResponse.java:710)
	at io.vertx.core.http.impl.Http1xServerResponse.handleException(Http1xServerResponse.java:677)
	at io.vertx.core.http.impl.Http1xServerRequest.handleException(Http1xServerRequest.java:711)
	at io.vertx.core.http.impl.Http1xServerConnection.lambda$handleClosed$7(Http1xServerConnection.java:501)
	at io.vertx.core.impl.ContextImpl.execute(ContextImpl.java:327)
	at io.vertx.core.impl.DuplicatedContext.execute(DuplicatedContext.java:158)
	at io.vertx.core.impl.ContextInternal.execute(ContextInternal.java:207)
	at io.vertx.core.http.impl.Http1xServerConnection.handleClosed(Http1xServerConnection.java:500)
	at io.vertx.core.net.impl.VertxHandler.channelInactive(VertxHandler.java:143)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:303)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:281)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:274)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:81)
	at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:280)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:303)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:281)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:274)
	at io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:377)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:303)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:281)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:274)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1352)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:301)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:281)
	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:850)
	at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:811)
	at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:173)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:166)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: jakarta.transaction.RollbackException: ARJUNA016102: The transaction is not active! Uid is 0:ffff0a01005c:a7bf:6777558d:3
	at com.arjuna.ats.internal.jta.transaction.arjunacore.TransactionImple.commitAndDisassociate(TransactionImple.java:1297)
	at com.arjuna.ats.internal.jta.transaction.arjunacore.BaseTransaction.commit(BaseTransaction.java:104)
	at io.quarkus.narayana.jta.runtime.NotifyingTransactionManager.commit(NotifyingTransactionManager.java:70)
	at io.quarkus.narayana.jta.runtime.interceptor.TransactionalInterceptorBase.endTransaction(TransactionalInterceptorBase.java:406)
	at io.quarkus.narayana.jta.runtime.interceptor.TransactionalInterceptorBase.lambda$handleAsync$7(TransactionalInterceptorBase.java:300)
	at io.quarkus.narayana.jta.runtime.interceptor.TransactionalInterceptorBase.doInTransaction(TransactionalInterceptorBase.java:324)
	at io.quarkus.narayana.jta.runtime.interceptor.TransactionalInterceptorBase.lambda$handleAsync$8(TransactionalInterceptorBase.java:300)
	... 43 more
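
For orientation, a hypothetical minimal sketch (not the quickstart's actual code) of the kind of endpoint these frames point at: a @Transactional JAX-RS method streaming a Multi over SSE. Per the trace, the JTA interceptor ends the transaction in an onTermination callback (TransactionalInterceptorBase.lambda$handleAsync$8), and that callback runs as a contextual runnable wrapped by SmallRye Context Propagation (SlowContextualRunnable), which is consistent with the "transaction is not active" failure once the reaper has already aborted the transaction.

// Hypothetical sketch only -- not the actual quickstart code.
package org.acme.context;

import io.smallrye.mutiny.Multi;
import jakarta.transaction.Transactional;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/prices")
public class PriceResource {

    @GET
    @Produces(MediaType.SERVER_SENT_EVENTS)
    @Transactional
    public Multi<Double> prices() {
        // Placeholder stream; the real quickstart emits prices arriving via Kafka.
        // The transaction opened by @Transactional stays open until this Multi
        // terminates or the SSE client disconnects, at which point the
        // commit/rollback callback seen in the stack trace above is invoked.
        return Multi.createFrom().items(1.0, 2.0, 3.0);
    }
}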

@rsvoboda
Member Author

rsvoboda commented Jan 3, 2025

I could not reproduce this.

Have you run the quickstart against the latest Quarkus main / development branch?

@geoand
Contributor

geoand commented Jan 3, 2025

Yup

@rsvoboda
Member Author

rsvoboda commented Jan 3, 2025

Pretty strange; the GH Actions runs for this quickstart have been failing since December 20, and I was able to reproduce it locally too ...

@geoand
Contributor

geoand commented Jan 3, 2025

I'll have another look next week

@rsvoboda
Member Author

rsvoboda commented Jan 6, 2025

@geoand I have an update for you :) I found the true regression commit. I got fooled by the "Merge pull request" commits when I was looking at the commit history on GitHub, which doesn't necessarily follow the same commit tree. The PRs for the commits mentioned in the description were merged on Dec 21st.

The commit that causes context-propagation-quickstart to fail is f0e1e4e9c92, which came in via PR #44903.
When I took today's Quarkus main and reverted f0e1e4e9c92, the problematic quickstart passed.

Should the PR be reverted, or will you look into further adjustments?

@geoand
Contributor

geoand commented Jan 6, 2025

That's interesting, thanks for the update.

I was off today so I still haven't looked at this issue.

@geoand
Contributor

geoand commented Jan 7, 2025

@rsvoboda I looked into this and indeed #44903 contains a subtle change that could break some cases (like the one implicitly exercised by the quickstart), so I opened #45406.
