[BUG] Scroll query is broken for multi node clusters for LDAP users #3582
Comments
[Triage] Please fill this issue out with a more detailed description of the problem.
Updated the description.
I cannot reproduce this issue. When running a multi-node cluster and creating a scroll, it will create the readerContext with an instance of LdapUser regardless of the node that creates the context. (The node that creates the context will be a node containing a shard for the index.) The receiving node serializes the authenticated LdapUser and uses the transport layer to forward the request to create a scroll to a node with a shard, where it is deserialized as an LdapUser and stored. When a user then uses the scroll id to fetch more results, they authenticate as an LdapUser, and it matches what is stored in the reader context. Can you provide more information about how to reproduce this issue? There could be edge cases, like an internal user and an LDAP user with the same username, or a user may use impersonation to impersonate another user in the system.
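The round trip described above can be sketched with plain Java serialization. This is a schematic illustration only, not the plugin's actual transport code; the `User`/`LdapUser` classes here are hypothetical stand-ins:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class TransportRoundTripSketch {
    // Stand-in for the base user class.
    static class User implements Serializable {
        final String name;
        User(String name) { this.name = name; }
    }

    // Stand-in for the LDAP-authenticated subclass.
    static class LdapUser extends User {
        LdapUser(String name) { super(name); }
    }

    public static void main(String[] args) throws Exception {
        // "Serialize" the authenticated LdapUser on the sending node...
        User original = new LdapUser("jsmith");
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(original);
        }
        // ...and "deserialize" it on the node holding the shard: the
        // concrete type survives the round trip, so the readerContext
        // stores an LdapUser, matching later scroll fetches.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            User restored = (User) in.readObject();
            System.out.println(restored.getClass().getSimpleName()); // LdapUser
        }
    }
}
```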
I reproduced the issue using user impersonation. For my LDAP configuration I use the following ldif file, directory.ldif:

```ldif
# --- OUs -------------------------------------
dn: ou=Groups,dc=example,dc=org

dn: ou=People,dc=example,dc=org

# --- People ----------------------------------
dn: cn=jsmith,ou=People,dc=example,dc=org

# --- Groups ----------------------------------
dn: cn=Administrator,ou=Groups,dc=example,dc=org
```

and in
Create the scroll query with an LDAP user and the reader context stores an instance of LDAP user. Then use user impersonation to impersonate the LDAP user:
When impersonating
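A command-level sketch of that repro (the host, index, and credentials are hypothetical; `opendistro_security_impersonate_as` is assumed here to be the impersonation header accepted by the security plugin):

```sh
# 1. Open the scroll as the LDAP user (readerContext stores an LdapUser):
curl -u jsmith:<password> \
  "https://localhost:9200/test-index/_search?scroll=1m" \
  -H 'Content-Type: application/json' \
  -d '{"size": 10, "query": {"match_all": {}}}'

# 2. Fetch the next page while impersonating jsmith from another
#    account that is allowed to impersonate:
curl -u admin:<password> \
  -H "opendistro_security_impersonate_as: jsmith" \
  "https://localhost:9200/_search/scroll" \
  -H 'Content-Type: application/json' \
  -d '{"scroll": "1m", "scroll_id": "<scroll_id from step 1>"}'
```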
Closing this issue. #3805 has been merged and backported.
What is the issue?
The security plugin supports different user types, such as InjectedUser and LDAPUser, for the various authentication mechanisms. After AuthN/AuthZ on the coordinating node, these users are serialized and passed to other nodes to avoid re-running AuthN/AuthZ.
PR #2765 was added to improve performance by avoiding serialization when the request stays on the same node.
However, a corner case was missed: scroll queries. Only the user who creates a scroll query may access it; the security plugin enforces this by creating a reader context for the scroll query and storing it locally, which also stores the user information. Code: https://github.com/opensearch-project/security/blob/2.9/src/main/java/org/opensearch/security/OpenSearchSecurityPlugin.java#L653,L679.
Now consider a two-node cluster, say Node-1 and Node-2. When the create-scroll request lands on Node-1, Node-1's readerContext holds the Injected/LDAP user with its concrete class type, while Node-2's readerContext holds a user of type User.class. If the scroll get request then lands on Node-1, it works, because the current user and the readerContext user have the same type. But if the request lands on Node-2, it serializes the current user and sends it to Node-1; the current user is now of type User.class while the readerContext holds an Injected/LDAP user, and the check fails because the two users have different class types.
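The class-type mismatch can be illustrated with a minimal sketch. The `User`/`LdapUser` classes are hypothetical stand-ins, and it assumes the stored-user comparison is class-sensitive, as described above:

```java
public class ScrollUserMismatchDemo {
    static class User {
        final String name;
        User(String name) { this.name = name; }
        @Override
        public boolean equals(Object o) {
            // Class-sensitive comparison: LdapUser("jsmith") != User("jsmith").
            if (o == null || getClass() != o.getClass()) return false;
            return name.equals(((User) o).name);
        }
        @Override
        public int hashCode() { return name.hashCode(); }
    }

    static class LdapUser extends User {
        LdapUser(String name) { super(name); }
    }

    public static void main(String[] args) {
        // Stored in the readerContext on the node that created the scroll:
        User stored = new LdapUser("jsmith");
        // Deserialized as a plain User when the request is forwarded
        // from the other node:
        User current = new User("jsmith");
        System.out.println(stored.equals(current)); // false -> access denied
    }
}
```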
How can one reproduce the bug?
A simple scroll request on a multi-node cluster should reproduce the issue.
What is the expected behavior?
Scroll queries should work on a multi-node cluster for all user types.
What is your host/environment?
OS v2.9
Security Plugin
Do you have any additional context?
Code where the security plugin matches the user for a scroll query: https://github.com/opensearch-project/security/blob/2.9/src/main/java/org/opensearch/security/OpenSearchSecurityPlugin.java#L692.
Exit criteria: