
[BUG] java.util.ConcurrentModificationException with plugins.security.cache.ttl_minutes=0 #2263

Closed
olddanmer opened this issue Nov 16, 2022 · 11 comments
Assignees
Labels
bug Something isn't working triaged Issues labeled as 'Triaged' have been reviewed and are deemed actionable. v2.10.0 Issues targeting release v2.10.0

Comments

@olddanmer

What is the bug?
With the option plugins.security.cache.ttl_minutes: 0 set, OpenSearch sometimes raises a java.util.ConcurrentModificationException, and Logstash receives a 500 error at that moment.

I use Dashboards with OpenID integration, for which the documentation recommends plugins.security.cache.ttl_minutes: 0. Without that setting, Dashboards does not work properly with OpenID (see bug opensearch-project/security-dashboards-plugin#1138 (comment)).

How can one reproduce the bug?
Steps to reproduce the behavior:

  1. Set up OpenSearch 2.3.0 and logstash-oss 7.12.1.
  2. Set the OpenSearch setting plugins.security.cache.ttl_minutes: 0.
  3. Logstash performs bulk writes (?).
  4. See the errors in the logs.

OpenSearch logs:

Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: [2022-11-15T15:15:07,061][ERROR][o.o.s.f.SecurityFilter   ] [es-logs-1.srvroom.labs.intellij.net] Unexpected exception java.util.ConcurrentModificationException
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: java.util.ConcurrentModificationException: null
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at java.util.HashMap$HashIterator.nextNode(HashMap.java:1597) ~[?:?]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at java.util.HashMap$KeyIterator.next(HashMap.java:1620) ~[?:?]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at java.util.Collections$UnmodifiableCollection$1.next(Collections.java:1054) ~[?:?]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at java.util.AbstractCollection.addAll(AbstractCollection.java:335) ~[?:?]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at java.util.HashSet.<init>(HashSet.java:121) ~[?:?]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.securityconf.ConfigModelV7$RoleMappingHolder.map(ConfigModelV7.java:1200) ~[opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.securityconf.ConfigModelV7.mapSecurityRoles(ConfigModelV7.java:1268) ~[opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.privileges.PrivilegesEvaluator.mapRoles(PrivilegesEvaluator.java:488) ~[opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.privileges.PrivilegesEvaluator.evaluate(PrivilegesEvaluator.java:220) ~[opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.filter.SecurityFilter.apply0(SecurityFilter.java:303) [opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.filter.SecurityFilter.apply(SecurityFilter.java:149) [opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:216) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.action.support.TransportAction.execute(TransportAction.java:188) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.action.support.TransportAction.execute(TransportAction.java:107) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.client.node.NodeClient.executeLocally(NodeClient.java:110) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.client.node.NodeClient.doExecute(NodeClient.java:97) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.client.support.AbstractClient.execute(AbstractClient.java:426) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.client.support.AbstractClient.bulk(AbstractClient.java:502) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.action.document.RestBulkAction.lambda$prepareRequest$0(RestBulkAction.java:111) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.BaseRestHandler.handleRequest(BaseRestHandler.java:125) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.filter.SecurityRestFilter$1.handleRequest(SecurityRestFilter.java:127) [opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.RestController.dispatchRequest(RestController.java:312) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.RestController.tryAllHandlers(RestController.java:398) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.RestController.dispatchRequest(RestController.java:241) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.ssl.http.netty.ValidatingDispatcher.dispatchRequest(ValidatingDispatcher.java:63) [opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.AbstractHttpServerTransport.dispatchRequest(AbstractHttpServerTransport.java:366) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.AbstractHttpServerTransport.handleIncomingRequest(AbstractHttpServerTransport.java:445) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.AbstractHttpServerTransport.incomingRequest(AbstractHttpServerTransport.java:356) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:55) [transport-netty4-client-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:41) [transport-netty4-client-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.netty4.Netty4HttpPipeliningHandler.channelRead(Netty4HttpPipeliningHandler.java:71) [transport-netty4-client-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:299) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) [netty-handler-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1373) [netty-handler-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1236) [netty-handler-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1285) [netty-handler-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:510) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:449) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:623) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:586) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at java.lang.Thread.run(Thread.java:833) [?:?]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: [2022-11-15T15:15:07,067][WARN ][r.suppressed             ] [es-logs-1.srvroom.domain.net] path: /_bulk, params: {}
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: org.opensearch.OpenSearchSecurityException: Unexpected exception indices:data/write/bulk
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.filter.SecurityFilter.apply0(SecurityFilter.java:374) [opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.filter.SecurityFilter.apply(SecurityFilter.java:149) [opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:216) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.action.support.TransportAction.execute(TransportAction.java:188) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.action.support.TransportAction.execute(TransportAction.java:107) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.client.node.NodeClient.executeLocally(NodeClient.java:110) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.client.node.NodeClient.doExecute(NodeClient.java:97) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.client.support.AbstractClient.execute(AbstractClient.java:426) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.client.support.AbstractClient.bulk(AbstractClient.java:502) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.action.document.RestBulkAction.lambda$prepareRequest$0(RestBulkAction.java:111) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.BaseRestHandler.handleRequest(BaseRestHandler.java:125) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.filter.SecurityRestFilter$1.handleRequest(SecurityRestFilter.java:127) [opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.RestController.dispatchRequest(RestController.java:312) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.RestController.tryAllHandlers(RestController.java:398) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.rest.RestController.dispatchRequest(RestController.java:241) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.security.ssl.http.netty.ValidatingDispatcher.dispatchRequest(ValidatingDispatcher.java:63) [opensearch-security-2.3.0.0.jar:2.3.0.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.AbstractHttpServerTransport.dispatchRequest(AbstractHttpServerTransport.java:366) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.AbstractHttpServerTransport.handleIncomingRequest(AbstractHttpServerTransport.java:445) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.AbstractHttpServerTransport.incomingRequest(AbstractHttpServerTransport.java:356) [opensearch-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:55) [transport-netty4-client-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:41) [transport-netty4-client-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at org.opensearch.http.netty4.Netty4HttpPipeliningHandler.channelRead(Netty4HttpPipeliningHandler.java:71) [transport-netty4-client-2.3.0.jar:2.3.0]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:299) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) [netty-handler-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1373) [netty-handler-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1236) [netty-handler-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1285) [netty-handler-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:510) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:449) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279) [netty-codec-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:623) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:586) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) [netty-transport-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.79.Final.jar:4.1.79.Final]
Nov 15 15:15:07 srvroom-es-logs-1 docker[1142]: at java.lang.Thread.run(Thread.java:833) [?:?]

Logstash logs:

Nov 15 15:15:07 srvroom-es-logs-1 docker[6711]: [2022-11-15T15:15:07,069][ERROR][logstash.outputs.elasticsearch][main][1f047d8c251176835ae470d6323b5986e9ec9d34e8a425b30855ce135ee014a5] Encountered a retryable error. Will Retry with exponential backoff  {:code=>500, :url=>"https://es-logs-1.srvroom.domain.net:9200/_bulk", :content_length=>79914}

What is the expected behavior?
OpenSearch indexes data from Logstash without errors.

What is your host/environment?

  • OS: Amazon Linux 2
  • OpenSearch 2.3.0
  • logstash-oss 7.12.1
  • OpenSearch and Logstash run in Docker

opensearch.yml

cluster.name: es-logs
node.name: es-logs-1.srvroom.domain.net
node.attr.node_type: hot
node.master: true
node.data: true
network.bind_host: "0.0.0.0"
network.publish_host: es-logs-1.srvroom.domain.net
indices.memory.index_buffer_size: 24%
indices.query.bool.max_clause_count: 2048
discovery.seed_hosts: ["es-logs-1.srvroom.domain.net","es-logs-2.srvroom.domain.net","es-logs-3.srvroom.domain.net","es-logs-4.srvroom.domain.net","es-logs-5.srvroom.domain.net","es-logs-warm-1.srvroom.domain.net"]
path.data: /var/opensearch/data
cluster.routing.allocation.disk.watermark.low: "30gb"
cluster.routing.allocation.disk.watermark.high: "8gb"
cluster.routing.allocation.disk.watermark.flood_stage: "4gb"
plugins.security.ssl.transport.pemcert_filepath: es-logs-1.srvroom.domain.net.crt
plugins.security.ssl.transport.pemkey_filepath: es-logs-1.srvroom.domain.net.key
plugins.security.ssl.transport.pemtrustedcas_filepath: es-logs-1.srvroom.domain.net.crt
plugins.security.ssl.transport.enforce_hostname_verification: false
plugins.security.ssl.transport.enabled: true
plugins.security.ssl.http.pemcert_filepath: es-logs-1.srvroom.domain.net.crt
plugins.security.ssl.http.pemkey_filepath: es-logs-1.srvroom.domain.net.key
plugins.security.ssl.http.pemtrustedcas_filepath: es-logs-1.srvroom.domain.net.crt
plugins.security.ssl.http.enabled: true
plugins.security.allow_unsafe_democertificates: false
plugins.security.allow_default_init_securityindex: true
plugins.security.authcz.admin_dn:
 - CN=es-logs-admin.srvroom.domain.net
plugins.security.ssl_cert_reload_enabled: true
plugins.security.nodes_dn:
- "CN=es-logs-1.srvroom.domain.net"
- "CN=es-logs-2.srvroom.domain.net"
- "CN=es-logs-3.srvroom.domain.net"
- "CN=es-logs-4.srvroom.domain.net"
- "CN=es-logs-5.srvroom.domain.net"
- "CN=es-logs-warm-1.srvroom.domain.net"
plugins.security.enable_snapshot_restore_privilege: true
plugins.security.check_snapshot_restore_write_privileges: true
plugins.security.restapi.roles_enabled: ["all_access", "security_rest_api_access"]
plugins.security.audit.type: internal_opensearch
plugins.security.audit.config.disabled_rest_categories: NONE
plugins.security.audit.config.disabled_transport_categories: NONE
plugins.security.cache.ttl_minutes: 0
opendistro_security.audit.ignore_users:
 - kibanaserver
 - elastalert
 - logwriter
 - exporter
compatibility.override_main_response_version: true

opensearch-security/config.yml

---
_meta:
  type: "config"
  config_version: 2
config:
  dynamic:
    http:
      anonymous_auth_enabled: true
    authc:
      basic_internal_auth_domain:
        http_enabled: true
        transport_enabled: false
        order: 0
        http_authenticator:
          type: basic
          challenge: false
        authentication_backend:
          type: internal
      openid_auth_domain:
        http_enabled: true
        transport_enabled: false
        order: 1
        http_authenticator:
          type: openid
          challenge: false
          config:
            subject_key: email
            openid_connect_url: https://accounts.google.com/.well-known/openid-configuration
        authentication_backend:
          type: noop
@olddanmer olddanmer added bug Something isn't working untriaged Require the attention of the repository maintainers and may need to be prioritized labels Nov 16, 2022
@peternied (Member)

Thanks for filing, @olddanmer, and for the great details on this issue.

@RyanL1997 RyanL1997 added triaged Issues labeled as 'Triaged' have been reviewed and are deemed actionable. and removed untriaged Require the attention of the repository maintainers and may need to be prioritized labels Nov 28, 2022
@RyanL1997 (Collaborator)

[Triaged] Thanks for filing this issue. We will take a look into it.

@RyanL1997 RyanL1997 added untriaged Require the attention of the repository maintainers and may need to be prioritized and removed triaged Issues labeled as 'Triaged' have been reviewed and are deemed actionable. labels Apr 17, 2023
@stephen-crawford (Contributor)

[Triage] Hi @RyanL1997, thank you for bringing this back to our collective attention. If you could try to reproduce this issue on the newer versions, please follow up with what you find. Thank you.

@stephen-crawford stephen-crawford added triaged Issues labeled as 'Triaged' have been reviewed and are deemed actionable. and removed untriaged Require the attention of the repository maintainers and may need to be prioritized labels Apr 17, 2023
@RyanL1997 (Collaborator) commented Jun 1, 2023

Hi @olddanmer, sorry for the late reply. I have tried to reproduce this issue and here are some findings from my end:

For this one:

I use dashboards with OpenID integration, so option plugins.security.cache.ttl_minutes: 0 is recommended by documentation. Without plugins.security.cache.ttl_minutes: 0 dashboards doesn't work properly with OpenID (see bug opensearch-project/security-dashboards-plugin#1138 (comment))

I think the original issue runs into that error because of a misconfiguration of opensearch_dashboards.yml (see my comment here). However, in my experiments I tried both setups, with and without plugins.security.cache.ttl_minutes: 0, and both work properly on my end. So may I ask: are you having trouble setting up OpenSearch Core + Dashboards without the cache TTL setting?

Here are the specs of my experiment (with the cache TTL config):

  • OpenSearch Core + Dashboards version: 2.3.0
  • logstash-oss 8.6.1
  • opensearch.yml:
plugins.security.cache.ttl_minutes: 0

plugins.security.ssl.transport.pemcert_filepath: esnode.pem
plugins.security.ssl.transport.pemkey_filepath: esnode-key.pem
plugins.security.ssl.transport.pemtrustedcas_filepath: root-ca.pem
plugins.security.ssl.transport.enforce_hostname_verification: false
plugins.security.ssl.http.enabled: true
plugins.security.ssl.http.pemcert_filepath: esnode.pem
plugins.security.ssl.http.pemkey_filepath: esnode-key.pem
plugins.security.ssl.http.pemtrustedcas_filepath: root-ca.pem
plugins.security.allow_unsafe_democertificates: true
plugins.security.allow_default_init_securityindex: true
plugins.security.authcz.admin_dn:
  - CN=kirk,OU=client,O=client,L=test, C=de

plugins.security.audit.type: internal_opensearch
plugins.security.enable_snapshot_restore_privilege: true
plugins.security.check_snapshot_restore_write_privileges: true
plugins.security.restapi.roles_enabled: ["all_access", "security_rest_api_access"]
plugins.security.system_indices.enabled: true
plugins.security.system_indices.indices: [".plugins-ml-model", ".plugins-ml-task", ".opendistro-alerting-config", ".opendistro-alerting-alert*", ".opendistro-anomaly-results*", ".opendistro-anomaly-detector*", ".opendistro-anomaly-checkpoints", ".opendistro-anomaly-detection-state", ".opendistro-reports-*", ".opensearch-notifications-*", ".opensearch-notebooks", ".opensearch-observability", ".opendistro-asynchronous-search-response*", ".replication-metadata-store"]
node.max_local_storage_nodes: 3
  • .../opensearch-security/config.yml's OIDC config:
    authc:
      ...
      ...
      openid_auth_domain:
        http_enabled: true
        transport_enabled: true
        order: 1
        http_authenticator:
          type: openid
          challenge: false
          config:
            subject_key: preferred_username
            roles_key: roles
            openid_connect_url: http://localhost:8080/auth/realms/master/.well-known/openid-configuration
        authentication_backend:
          type: noop
        ...
        ...
  • opensearch_dashboards.yml:
opensearch.hosts: ["https://localhost:9200"]
opensearch.ssl.verificationMode: none
opensearch.username: "admin"
opensearch.password: "admin"
opensearch.requestHeadersWhitelist: [authorization, securitytenant]

opensearch_security.multitenancy.enabled: false
# opensearch_security.multitenancy.enabled: true
# opensearch_security.multitenancy.tenants.preferred: [Private, Global]
opensearch_security.readonly_mode.roles: ["kibana_read_only"]
opensearch_security.cookie.secure: false
server.host: "0.0.0.0"

# Enable OpenID authentication
opensearch_security.auth.type: "openid"
#opensearch_security.auth.multiple_auth_enabled: true

# The IdP metadata endpoint
opensearch_security.openid.connect_url: "http://localhost:8080/auth/realms/master/.well-known/openid-configuration"

# The ID of the OpenID Connect client in your IdP
opensearch_security.openid.client_id: "opensearch-dashboards-sso"

# The client secret of the OpenID Connect client
opensearch_security.openid.client_secret: "xxxxxxxxxxxxxxxxxxxxxxxxxx"

opensearch_security.openid.base_redirect_url: http://localhost:5601

opensearch_security.openid.scope: openid profile email address phone microprofile-jwt

@cwperks, could you please help verify whether you can reproduce this issue? Thanks.

@cwperks (Member) commented Jun 2, 2023

@RyanL1997 I was looking at this this morning and was just about to give up, but eventually I was able to reproduce this locally. I reproduced it using the release candidate build for 2.8, so I have confirmed that it affects the latest version. My local setup is the demo configuration with plugins.security.cache.ttl_minutes: 0 added. At first I tried reproducing without the OIDC backend in the list of backends, to test a theory that this only impacts basic authentication (since Logstash uses basic auth to authenticate), but I was not able to reproduce with only a basic auth backend in the authc list (though it may be possible), so I ended up adding a dummy OIDC backend to the config:

authc:
      basic_internal_auth_domain:
        description: "Authenticate via HTTP Basic against internal users database"
        http_enabled: true
        transport_enabled: true
        order: 0
        http_authenticator:
          type: basic
          challenge: true
        authentication_backend:
          type: intern
      openid_auth_domain:
        http_enabled: true
        transport_enabled: false
        order: 1
        http_authenticator:
          type: openid
          challenge: false
          config:
            subject_key: preferred_username
            openid_connect_url: http://host.docker.internal:8080/auth/realms/master/.well-known/openid-configuration
        authentication_backend:
          type: noop

I was able to reproduce this by writing a Python script that uses the bulk API:

from opensearchpy import OpenSearch
from opensearchpy import helpers
import random as r

host = 'localhost'
port = 9200
auth = ('admin', 'admin') # For testing only. Don't store credentials in code.

# Create the client with SSL/TLS enabled, but hostname verification disabled.
client = OpenSearch(
    hosts = [{'host': host, 'port': port}],
    http_compress = True, # enables gzip compression for request bodies
    http_auth = auth,
    use_ssl = True,
    verify_certs = False,
    ssl_assert_hostname = False,
    ssl_show_warn = False
)

index_name = 'movies'

try:
    response = client.indices.create(index_name)
    print('\nCreating index:')
    print(response)
except Exception:
    # Ignore failures such as "index already exists" on re-runs.
    pass

title_choices = ['Titanic', 'Jurassic Park', 'Star Wars']
year_choices = ['2013', '1992', '2023', '2001', '1985']

lorem = [
    'Lorem ipsum dolor sit amet, consectetur adipiscing elit.',
    'Quisque vitae varius ex, eu volutpat orci.',
    'Aenean ullamcorper orci et vulputate fermentum.',
    'Cras erat dui, finibus vel lectus ac, pharetra dictum odio.',
    'Nullam tempus scelerisque purus, sed mattis elit condimentum nec.',
    'Etiam risus sapien, auctor eu volutpat sit amet, porta in nunc.',
    'Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas.',
    'Proin ipsum purus, laoreet quis dictum a, laoreet sed ligula.',
    'Integer ultricies malesuada quam.',
    'Cras vel elit sed mi placerat pharetra eget vel odio.',
    'Duis ac nulla varius diam ultrices rutrum.'
]

# Index 100,000 random documents in batches of 100 via the bulk helper.
bulk = []
for i in range(100000):
    document = {
        'title': r.choice(title_choices),
        'year': r.choice(year_choices),
        'description': r.choice(lorem)
    }

    bulk.append({"_index": index_name, "_id": f"{i}", "_source": document})

    if (i + 1) % 100 == 0:
        print(f'\nAdding documents {i - 99} - {i}')
        helpers.bulk(client, bulk)
        bulk = []

It took a few minutes after running to see the error.

@cwperks (Member) commented Jun 2, 2023

I haven't tested it out, but one thing I would try is checking if the set of security roles already contains a role before adding it to the set in this block: https://github.com/opensearch-project/security/blob/main/src/main/java/org/opensearch/security/user/User.java#L275-L284

It would be helpful to understand why the securityRoles set is being written to while simultaneously another thread is trying to instantiate an unmodifiableSet from that data structure.
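For reference, here is a minimal standalone sketch (not the plugin's code) of the failure mode behind the first stack trace above: one thread copies a live Collections.unmodifiableSet view into a new HashSet, as ConfigModelV7.RoleMappingHolder.map does, while another thread mutates the backing set. All names are illustrative.

import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

public class UnmodifiableViewCopyRace {
    public static void main(String[] args) {
        Set<String> securityRoles = new HashSet<>();
        // A live, read-only view over the same HashSet -- not a snapshot.
        Set<String> view = Collections.unmodifiableSet(securityRoles);

        // Writer thread: keeps structurally modifying the backing set.
        Thread writer = new Thread(() -> {
            for (int i = 0; ; i++) {
                securityRoles.add("role" + (i % 1000));
                securityRoles.remove("role" + ((i + 500) % 1000));
            }
        });
        writer.setDaemon(true);
        writer.start();

        // Reader: copying the view iterates it, and that iteration can
        // observe the writer's structural modifications.
        while (true) {
            try {
                new HashSet<>(view);
            } catch (java.util.ConcurrentModificationException e) {
                System.out.println("Reproduced: " + e);
                return;
            }
        }
    }
}

Since HashMap's fail-fast checks are only best-effort, the exception shows up intermittently rather than on every copy, which matches the sporadic failures reported here.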

@robinsillem

I have the same issue with OpenSearch 2.6.0. It occurs on all our environments, with irregular timing, but about every 3 minutes on average. The logs are similar to the v2.3.0 logs above, but the exception is thrown from a different place in PrivilegesEvaluator, if that offers you any further insight (stack trace below).

java.util.ConcurrentModificationException: null
	at java.util.HashMap$HashIterator.nextNode(HashMap.java:1597) ~[?:?]
	at java.util.HashMap$KeyIterator.next(HashMap.java:1620) ~[?:?]
	at java.util.Collections$UnmodifiableCollection$1.next(Collections.java:1054) ~[?:?]
	at com.google.common.collect.Sets$1$1.computeNext(Sets.java:754) ~[guava-30.0-jre.jar:?]
	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:141) ~[guava-30.0-jre.jar:?]
	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:136) ~[guava-30.0-jre.jar:?]
	at java.lang.String.join(String.java:3325) ~[?:?]
	at org.opensearch.security.privileges.PrivilegesEvaluator.setUserInfoInThreadContext(PrivilegesEvaluator.java:191) ~[opensearch-security-2.6.0.0.jar:2.6.0.0]
	at org.opensearch.security.privileges.PrivilegesEvaluator.evaluate(PrivilegesEvaluator.java:237) ~[opensearch-security-2.6.0.0.jar:2.6.0.0]
	at org.opensearch.security.filter.SecurityFilter.apply0(SecurityFilter.java:303) [opensearch-security-2.6.0.0.jar:2.6.0.0]
	at org.opensearch.security.filter.SecurityFilter.apply(SecurityFilter.java:149) [opensearch-security-2.6.0.0.jar:2.6.0.0]
	at org.opensearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:216) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.action.support.TransportAction.execute(TransportAction.java:188) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.action.support.TransportAction.execute(TransportAction.java:107) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.client.node.NodeClient.executeLocally(NodeClient.java:110) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.client.node.NodeClient.doExecute(NodeClient.java:97) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.client.support.AbstractClient.execute(AbstractClient.java:465) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.client.support.AbstractClient.bulk(AbstractClient.java:541) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.rest.action.document.RestBulkAction.lambda$prepareRequest$0(RestBulkAction.java:111) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.rest.BaseRestHandler.handleRequest(BaseRestHandler.java:125) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.security.filter.SecurityRestFilter$1.handleRequest(SecurityRestFilter.java:127) [opensearch-security-2.6.0.0.jar:2.6.0.0]
	at org.opensearch.rest.RestController.dispatchRequest(RestController.java:312) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.rest.RestController.tryAllHandlers(RestController.java:398) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.rest.RestController.dispatchRequest(RestController.java:241) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.security.ssl.http.netty.ValidatingDispatcher.dispatchRequest(ValidatingDispatcher.java:63) [opensearch-security-2.6.0.0.jar:2.6.0.0]
	at org.opensearch.http.AbstractHttpServerTransport.dispatchRequest(AbstractHttpServerTransport.java:366) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.http.AbstractHttpServerTransport.handleIncomingRequest(AbstractHttpServerTransport.java:445) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.http.AbstractHttpServerTransport.incomingRequest(AbstractHttpServerTransport.java:356) [opensearch-2.6.0.jar:2.6.0]
	at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:55) [transport-netty4-client-2.6.0.jar:2.6.0]
	at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:41) [transport-netty4-client-2.6.0.jar:2.6.0]
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at org.opensearch.http.netty4.Netty4HttpPipeliningHandler.channelRead(Netty4HttpPipeliningHandler.java:71) [transport-netty4-client-2.6.0.jar:2.6.0]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346) [netty-codec-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318) [netty-codec-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) [netty-handler-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1373) [netty-handler-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1236) [netty-handler-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1285) [netty-handler-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:529) [netty-codec-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:468) [netty-codec-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290) [netty-codec-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:689) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:652) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) [netty-transport-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.87.Final.jar:4.1.87.Final]
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.87.Final.jar:4.1.87.Final]
	at java.lang.Thread.run(Thread.java:833) [?:?]

Our /usr/share/opensearch/config/opensearch.yml is:


network.host: 0.0.0.0

s3:
  client:
    default:
      endpoint: s3.eu-west-2.amazonaws.com
      max_retries: 3
      protocol: https
      region: eu-west-2
      read_timeout: 50s
      use_throttle_retries: true
      identity_token_file: /usr/share/opensearch/plugins/repository-s3/token/token

plugins:
  security:
    cache:
      ttl_minutes: 0

    authcz:
      admin_dn:
        - CN=opensearch-admin.opensearch-preprod.svc.cluster.local

    ssl:
      transport:
        pemcert_filepath: "/usr/share/opensearch/config/tls/tls.crt"
        pemkey_filepath: "/usr/share/opensearch/config/tls/tls.key"
        pemtrustedcas_filepath: "/usr/share/opensearch/config/tls/ca.crt"
        enforce_hostname_verification: false
        resolve_hostname: false
      http:
        enabled: true
        pemcert_filepath: "/usr/share/opensearch/config/tls/tls.crt"
        pemkey_filepath: "/usr/share/opensearch/config/tls/tls.key"
        pemtrustedcas_filepath: "/usr/share/opensearch/config/tls/ca.crt"
        enabled_protocols:
          - "TLSv1.2"
          - "TLSv1.3"

    audit.type: internal_opensearch
    allow_unsafe_democertificates: false
    allow_default_init_securityindex: true
    nodes_dn:
      - CN=opensearch-cluster-master.opensearch-preprod.svc.cluster.local
      - CN=opensearch-cluster-data.opensearch-preprod.svc.cluster.local
      - CN=opensearch-cluster-master-headless.opensearch-preprod.svc.cluster.local
      - CN=opensearch-cluster-data-headless.opensearch-preprod.svc.cluster.local

    enable_snapshot_restore_privilege: true
    check_snapshot_restore_write_privileges: true
    restapi:
      roles_enabled: ["all_access", "security_rest_api_access"]

@RyanL1997
Copy link
Collaborator

RyanL1997 commented Jun 20, 2023

Hi guys, thank you for providing all the details. I have successfully reproduced the error by following @cwperks's procedure:

[Screenshot: reproduced java.util.ConcurrentModificationException stack trace, 2023-06-20]

The first thing I would like to try is making these two functions synchronized.
I will keep posting my findings here.

@cwperks
Copy link
Member

cwperks commented Jun 22, 2023

In testing, one of the primary places this issue arises is in PrivilegesEvaluator.setUserInfoInThreadContext.

A potential solution could be to wrap the user.getSecurityRoles() call with Set.copyOf(user.getSecurityRoles()) to get a copy of the data structure so that it's not susceptible to concurrent modification. Another option could be to make user.getSecurityRoles() itself return a copy of the roles. Right now it's returning Collections.unmodifiableSet(this.securityRoles), which does not return a copy of the securityRoles data structure.
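
To make the two options concrete, here is a minimal sketch against a stand-in `User` class (only `getSecurityRoles`/`addSecurityRoles` are mirrored; this is not the plugin's actual code):

```java
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

// Stand-in for the security plugin's User class.
class User {
    private final Set<String> securityRoles = new HashSet<>();

    public void addSecurityRoles(Set<String> roles) {
        securityRoles.addAll(roles);
    }

    // Current behavior: a live, unmodifiable *view* of securityRoles.
    public Set<String> getSecurityRoles() {
        return Collections.unmodifiableSet(securityRoles);
    }

    // Option 2: return a detached snapshot instead of a view.
    public Set<String> getSecurityRolesSnapshot() {
        return Set.copyOf(securityRoles);
    }
}

// Option 1 would instead copy at the call site, e.g.:
//   Set<String> snapshot = Set.copyOf(user.getSecurityRoles());
//   joiner.add(String.join(",", Sets.union(snapshot, mappedRoles)));
```

One caveat: `Set.copyOf` itself iterates the source set, so a write that races with the copy can still throw; the copy narrows the window rather than eliminating it unless reads and writes are also synchronized.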

@RyanL1997
Copy link
Collaborator

To add to @cwperks's comment, and to move some of our offline discussion over here:

The line Collections.unmodifiableSet(this.securityRoles) does not create a copy of the set; it returns an unmodifiable "view" of the original set. Changes made to the original set after the view is created are still visible through the view; the wrapper only blocks mutation through the view's own API and does not detach it from the backing set. That is exactly why iterating over the view while another thread modifies the backing set can throw a ConcurrentModificationException.
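
A few lines of plain JDK code make the view semantics concrete:

```java
import java.util.Collections;
import java.util.ConcurrentModificationException;
import java.util.HashSet;
import java.util.Set;

public class ViewSemanticsDemo {
    public static void main(String[] args) {
        Set<String> backing = new HashSet<>(Set.of("role1"));
        Set<String> view = Collections.unmodifiableSet(backing);

        backing.add("role2");
        System.out.println(view.contains("role2")); // true: the view sees the change

        // view.add("role3"); // would throw UnsupportedOperationException

        // The failure mode from this issue, shown single-threaded for determinism:
        try {
            for (String role : view) {
                backing.add("extra-" + role); // mutate the backing set mid-iteration
            }
        } catch (ConcurrentModificationException e) {
            System.out.println("CME: the view's iterator observed a backing-set change");
        }
    }
}
```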

@cwperks
Copy link
Member

cwperks commented Jun 23, 2023

@RyanL1997 Exactly. With the concurrent modification, imagine this:

Thread 1

joiner.add(String.join(",", Sets.union(user.getSecurityRoles(), mappedRoles)));

Thread 2

user.addSecurityRoles(Set.of("role1", "role2"));

This is the scenario where concurrent modification can happen: Sets.union iterates over user.getSecurityRoles() to union it with the mapped roles, and if another thread modifies the user's security roles during that iteration, the exception is thrown. This is one of the places where it surfaces.
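
For anyone who wants to reproduce this outside the plugin, here is a self-contained sketch of that two-thread scenario (stand-in types again, not the plugin's code; the exception is timing-dependent but usually shows up within a second):

```java
import java.util.Collections;
import java.util.ConcurrentModificationException;
import java.util.HashSet;
import java.util.Set;

public class RoleRaceDemo {
    private static final Set<String> securityRoles = new HashSet<>();

    // Mirrors User.getSecurityRoles(): a live view, not a copy.
    static Set<String> getSecurityRoles() {
        return Collections.unmodifiableSet(securityRoles);
    }

    public static void main(String[] args) throws InterruptedException {
        // Thread 1: mirrors Sets.union(user.getSecurityRoles(), mappedRoles),
        // which iterates the view while building the union.
        Thread reader = new Thread(() -> {
            try {
                for (int i = 0; i < 1_000_000; i++) {
                    Set<String> union = new HashSet<>(getSecurityRoles());
                    union.add("mapped_role");
                }
                System.out.println("no exception this run (the race is timing-dependent)");
            } catch (ConcurrentModificationException e) {
                System.out.println("reader hit ConcurrentModificationException");
            }
        });

        // Thread 2: mirrors user.addSecurityRoles(...) mutating the backing set.
        Thread writer = new Thread(() -> {
            for (int i = 0; i < 1_000_000; i++) {
                securityRoles.add("role" + (i % 10));
                securityRoles.remove("role" + ((i + 5) % 10));
            }
        });

        reader.start();
        writer.start();
        reader.join();
        writer.join();
    }
}
```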

@davidlago davidlago added the v2.10.0 Issues targeting release v2.10.0 label Jul 31, 2023
opensearch-trigger-bot bot pushed a commit that referenced this issue Aug 7, 2023
…esolve ConcurrentModificationException on bulk request (#3094)

### Description

This PR changes the order of `setUserInfoInThreadContext` and
`addSecurityRoles` to ensure that `addSecurityRoles` is called before
`setUserInfoInThreadContext`. The ConcurrentModificationException would
arise in scenarios where the first thread was done with
`setUserInfoInThreadContext` and had moved on to `addSecurityRoles` to
add the mapped roles into the user's set of security roles. The first
call to `addSecurityRoles` may populate the set with data; any
subsequent call from other threads would be a no-op.

If another thread is executing `setUserInfoInThreadContext` while the
first thread is in `addSecurityRoles`, a ConcurrentModificationException
is thrown inside the `Sets.union(...)` call.

By calling `addSecurityRoles` before `setUserInfoInThreadContext`, it
is guaranteed that no ConcurrentModificationException can be thrown,
because the user's security roles will already be set and any thread
that attempts another call will hit a no-op.
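
For readers skimming the thread, a minimal sketch of the reordering (reusing the stand-in `User` from the sketch earlier in this thread; the real `PrivilegesEvaluator` code paths are more involved, so treat this as illustrative only):

```java
import java.util.Set;

// Order-of-operations sketch of the fix, not the actual diff.
class EvaluatorOrderSketch {
    // Before: the thread-context write iterates the roles view while a
    // peer thread may still be populating the same backing set.
    void evaluateBefore(User user, Set<String> mappedRoles) {
        setUserInfoInThreadContext(user);   // iterates user.getSecurityRoles()
        user.addSecurityRoles(mappedRoles); // mutates the set another thread may be iterating
    }

    // After (#3094): populate first. The first call fills the set; repeat
    // calls with the same mapped roles make no structural change, so nothing
    // mutates the set while setUserInfoInThreadContext iterates it.
    void evaluateAfter(User user, Set<String> mappedRoles) {
        user.addSecurityRoles(mappedRoles);
        setUserInfoInThreadContext(user);
    }

    private void setUserInfoInThreadContext(User user) {
        // The real method writes user info, including the joined roles, into
        // the ThreadContext; the join is what iterates the set.
        String joined = String.join(",", user.getSecurityRoles());
    }
}
```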

* Category (Enhancement, New feature, Bug fix, Test fix, Refactoring,
Maintenance, Documentation)

Bug fix

### Issues Resolved

#2263

### Testing

Tested by running a Python script to bulk-insert 100k documents and
using tmux to run the script in multiple shells at once. Before the
change the bug is reproducible regularly; after the change it cannot be
reproduced.

### Check List
- [ ] New functionality includes testing
- [ ] New functionality has been documented
- [ ] Commits are signed per the DCO using --signoff

By submitting this pull request, I confirm that my contribution is made
under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and
signing off your commits, please check
[here](https://github.com/opensearch-project/OpenSearch/blob/main/CONTRIBUTING.md#developer-certificate-of-origin).

---------

Signed-off-by: Craig Perkins <[email protected]>
Signed-off-by: Craig Perkins <[email protected]>
(cherry picked from commit cd699bb)
Signed-off-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
stephen-crawford pushed a commit to stephen-crawford/security that referenced this issue Aug 11, 2023
…esolve ConcurrentModificationException on bulk request (opensearch-project#3094)

opensearch-trigger-bot bot pushed a commit that referenced this issue Aug 14, 2023
…esolve ConcurrentModificationException on bulk request (#3094)

cwperks added a commit to cwperks/security that referenced this issue Aug 16, 2023
…esolve ConcurrentModificationException on bulk request (opensearch-project#3094)

cwperks added a commit to cwperks/security that referenced this issue Aug 16, 2023
…esolve ConcurrentModificationException on bulk request (opensearch-project#3094)
