
IndexOutOfBoundsException when adding a header to Response on WildFly #371

Closed
eyalkoren opened this issue Dec 5, 2018 · 14 comments
Labels: bug

@eyalkoren
Contributor

Related to https://discuss.elastic.co/t/apm-java-agent-indexoutofboundsexception/159419.
A user has WildFly containers that recently started throwing IndexOutOfBoundsException.
Working with Java agent 1.0.1 and APM Server 6.5.1.

12:25:47.699 ERROR [io.undertow.request] (default task-15) UT005023: Exception handling request to /api/households/5380009081/sync: java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
at java.util.ArrayList.rangeCheck(ArrayList.java:657)
at java.util.ArrayList.get(ArrayList.java:433)
at co.elastic.apm.util.PotentiallyMultiValuedMap.add(PotentiallyMultiValuedMap.java:57)
at co.elastic.apm.impl.context.Response.addHeader(Response.java:64)
at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.service(HttpServletDispatcher.java:52)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:85)
at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62)
at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36)
at org.wildfly.extension.undertow.security.SecurityContextAssociationHandler.handleRequest(SecurityContextAssociationHandler.java:78)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:131)
at io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46)
at io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64)
at io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60)
at io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77)
at io.undertow.security.handlers.NotificationReceiverHandler.handleRequest(NotificationReceiverHandler.java:50)
at io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at org.wildfly.extension.undertow.security.jacc.JACCContextIdHandler.handleRequest(JACCContextIdHandler.java:61)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:284)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:263)
at io.undertow.servlet.handlers.ServletInitialHandler.access$000(ServletInitialHandler.java:81)
at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:174)
at io.undertow.server.Connectors.executeRootHandler(Connectors.java:202)
at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:793)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
@asafs133

asafs133 commented Dec 5, 2018

We experienced it in the middle of a load-testing phase.

@eyalkoren
Contributor Author

Can you share the code in your app that handles this request (/api/households/5380009081/sync), or at least some details about how it is written in general?

@asafs133

asafs133 commented Dec 5, 2018

@eyalkoren,
We made some changes relative to the original code, but the pattern looks like the code below:

     @GET
     @Path("sync")
     @ValidateToken
     public void SyncHousehold(@SuppressWarnings("unused") @Context HttpServletRequest request,
              @Suspended final AsyncResponse response, @PathParam("bla") String householdId) {
          if (Strings.IsEmpty(householdId)) {
              // resume with a built error response and return, otherwise resume() is called twice
              response.resume(Response.serverError().build());
              return;
          }
          HttpResponse h = // get from Redis
          response.resume(Response.status(h.Status).entity(h.Result).build());
     }


ValidationTokenInterceptor:

     @AroundInvoke
     public Object Intercept(InvocationContext ctx) throws Exception {
          Object[] params = ctx.getParameters();
          HttpServletRequest request = (HttpServletRequest) params[0];

          Token token = (Token) request.getAttribute("token"); // attribute name "token" assumed
          if (token == null) {
              throw aaaException.UNAUTHORIZED();
          }
          Header header = new Header();
          header.id = token.id;
          header.role = token.role;
          String role = Converter.ToJson(header);
          // removing '\n' because when it is passed along, the last '}' disappears
          role = role.replaceAll("\n", "");

          request.setAttribute("role", role);

          return ctx.proceed();
     }

@eyalkoren
Contributor Author

@asafs133 I think that will be very useful. Thanks for the effort!

@asafs133

asafs133 commented Dec 6, 2018

@eyalkoren,
We have just stopped using the APM Java agent following the exception we received.
Is anyone working on solving the problem?
I don't want to sound nervous, but we're stuck :(

Thanks.

@eyalkoren
Contributor Author

@asafs133 It's high priority for us, but a fix won't land before next week. I hope we can issue a bugfix next week that takes care of this error as well, but unfortunately I can't promise.

@asafs133

asafs133 commented Dec 7, 2018

@eyalkoren, sure, thanks!

@eyalkoren
Contributor Author

@asafs133 What version of RESTEasy are you using?

@asafs133

@eyalkoren,
RESTEasy 3.5.1.Final

@eyalkoren
Contributor Author

eyalkoren commented Dec 13, 2018

I created an app that uses AsyncResponse with RESTEasy 3.5.1.Final, deployed it on WildFly and ran some load tests (12 threads invoking 300 requests each in less than 30 seconds). I was able to reproduce the race condition, though not the exact same symptom (in WildFly 8 through 11; 12-13 behave a bit differently...). Here, two threads are concurrently adding a header to the response: one adds a key and, before it adds the corresponding value, another thread sees the partial change and fails. In my case, two threads managed to add the header before the transaction was reported (so headers were reported with two equal values). As far as I'm concerned, these are two symptoms of the same race condition, in which two threads modify the same transaction object.
To make a long story short, the reason for that is in this usage of HttpServletRequest.isAsyncStarted(), which may cause a race condition. There is a bit more to it that requires changes in our HttpServletRequest.startAsync instrumentation mechanism, which RESTEasy's implementation of @Suspended and AsyncResponse relies on, so I will include the full explanation in the related PR and link it from here.
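To illustrate the failure mode, here is a minimal hypothetical sketch (not the agent's actual PotentiallyMultiValuedMap code; the class name and structure are assumptions) of a multi-valued map that keeps keys and values in two parallel, unsynchronized lists:

     import java.util.ArrayList;
     import java.util.List;

     // Hypothetical sketch only: an unsynchronized map backed by two parallel lists.
     class ParallelListHeaderMap {
          private final List<String> keys = new ArrayList<>();
          private final List<String> values = new ArrayList<>();

          // Not thread-safe: a concurrent caller can observe keys and values out of sync.
          public void add(String key, String value) {
               int index = keys.indexOf(key);
               if (index == -1) {
                    keys.add(key);
                    // If thread A is preempted right here, thread B adding the same key
                    // finds it at index 0 while 'values' is still empty, so values.get(0)
                    // throws java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
                    values.add(value);
               } else {
                    String existing = values.get(index);        // this is where thread B fails
                    values.set(index, existing + ", " + value); // append to the existing header value
               }
          }
     }

Under load, the original servlet thread and the thread that resumes the AsyncResponse can both end up writing to the same transaction object, which is enough to hit this window.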
@asafs133 Bottom line: working on it :)

@asafs133


@eyalkoren,
Thank you, I appreciate the hard work you are doing to solve the issue.

@eyalkoren
Contributor Author

Should be fixed in #385

@asafs133


@eyalkoren,
Thank you very much!
I appreciate the work you did and will let you know about the results once the pull request is confirmed.

@eyalkoren
Contributor Author

@asafs133 Release 1.2.0 is ready and should fix this issue.
