[FEA] Shim followup - investigate version matching logic #409

Closed

tgravescs opened this issue Jul 23, 2020 · 3 comments
Labels
feature request (New feature or request), P1 (Nice to have for release)

Comments

@tgravescs
Collaborator

The ShimLoader logic for matching versions is currently very strict: the Spark version must match exactly, and we throw an exception when a matching shim layer is not found.

Investigate whether we want to relax that in case users modify the version, e.g. a build that reports 3.0.0.1 but is really compatible with 3.0.0. Do we want to allow snapshots? That should generally be OK, unless we update our code and they are using an older snapshot.
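
For illustration, a minimal sketch of what a relaxed matcher could look like, assuming we only compare the major.minor.maintenance core of the version string and tolerate vendor suffixes or -SNAPSHOT qualifiers. The object and method names are hypothetical, not part of the actual ShimLoader:

```scala
// Hypothetical sketch of relaxed version matching (names are illustrative,
// not the actual ShimLoader API). Idea: strip -SNAPSHOT and any vendor/patch
// suffix beyond major.minor.maintenance before comparing.
object ShimVersionMatcher {
  private val BaseVersion = """^(\d+)\.(\d+)\.(\d+).*$""".r

  /** Reduce a full Spark version string to its major.minor.maintenance core. */
  def baseVersion(sparkVersion: String): Option[String] = sparkVersion match {
    case BaseVersion(major, minor, maint) => Some(s"$major.$minor.$maint")
    case _ => None
  }

  /** True if the running Spark version is "close enough" to a supported shim version. */
  def matches(runningVersion: String, shimVersion: String): Boolean =
    baseVersion(runningVersion).exists(baseVersion(shimVersion).contains)
}

// Example: "3.0.0.1" and "3.0.0-SNAPSHOT" both reduce to "3.0.0",
// so both would match the 3.0.0 shim under this relaxed policy.
```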

@tgravescs added the feature request (New feature or request) and ? - Needs Triage (Need team to review and classify) labels on Jul 23, 2020
@sameerz added the P1 (Nice to have for release) label and removed ? - Needs Triage (Need team to review and classify) on Jul 28, 2020
@jlowe
Member

jlowe commented Sep 14, 2020

I think minimally we need a plugin config that can override the detection logic and specify the shim that should be used. This helps cover the case where the user is using a custom version of Spark that we've never tested/seen but has a chance of working with a specific shim. At that point the user is a bit on their own since we've not tested that combination, but at least they have a possible path to move forward. cc: @sameerz for his thoughts on this.
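
As a sketch of that idea (the config key name and helper below are assumptions for illustration, not necessarily what the plugin actually uses):

```scala
// Hypothetical sketch of a config-based shim override. The config key name
// is an assumption, not the plugin's real key.
import org.apache.spark.SparkConf

object ShimSelector {
  // Illustrative config key letting the user force a specific shim.
  val ShimOverrideKey = "spark.rapids.shimLoader.override"

  def selectShim(conf: SparkConf, detectedVersion: String,
      knownShims: Map[String, String]): String = {
    conf.getOption(ShimOverrideKey) match {
      case Some(forced) =>
        // User explicitly chose a shim; honor it even though the combination is untested.
        forced
      case None =>
        knownShims.getOrElse(detectedVersion,
          throw new IllegalArgumentException(
            s"No shim found for Spark $detectedVersion; " +
              s"set $ShimOverrideKey to force one at your own risk"))
    }
  }
}
```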

@sameerz
Collaborator

sameerz commented Sep 15, 2020

@jlowe agreed, we can let the plugin run on an untested version of Spark as long as we log a message indicating that we are running on a Spark version the plugin has not been tested with.
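
A sketch of that warning, using standard SLF4J logging; the set of tested versions and the message text are assumptions for illustration:

```scala
// Sketch of the warning described above: log loudly when the plugin runs on
// a Spark version it was not tested against. The version list is illustrative.
import org.slf4j.LoggerFactory

object UntestedVersionCheck {
  private val log = LoggerFactory.getLogger(getClass)
  private val testedVersions = Set("3.0.0", "3.0.1", "3.1.0")

  def warnIfUntested(sparkVersion: String): Unit = {
    if (!testedVersions.contains(sparkVersion)) {
      log.warn(s"The RAPIDS Accelerator has not been tested with Spark " +
        s"version $sparkVersion; proceeding anyway, behavior may be incorrect")
    }
  }
}
```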

@tgravescs
Collaborator Author

Seems like the override has been working; we can close this for now until it becomes an issue again.
