[SPARK-43448][BUILD] Remove dummy dependency hadoop-openstack
#41133
Conversation
It's really interesting, because that kind of dependency deletion happened in a maintenance release (Apache Hadoop 3.3.5).
We emptied the jar but left the stub artifact there so that things which did explicitly pull it in wouldn't start breaking. Now that Spark is 3.3.5+ only, most of the hadoop-cloud-storage dependencies can be reworked down to just
@steveloughran does Hadoop 3.3.5 guarantee compatibility w/ previous versions? e.g. is it OK to use a Hadoop 3.3.5 client to access a Hadoop 3.3.0~3.3.4 server?
SPARK-42537 covers the full cleanup. w.r.t this patch: LGTM.
Thank you for the context, @pan3793 and @steveloughran .
Merged to master for Apache Spark 3.5.0.
Should be. IPC is all based on protobuf, and we try not to remove things, to avoid breaking existing code. HDFS compatibility across major versions is something which matters a lot; I believe webhdfs has the strongest guarantees. What does break, guaranteed, is mixing Hadoop libraries from different versions on the classpath. Avoid that. Oh, and for cloud stuff, openssl/wildfly is a source of extreme brittleness, even though when it works it's often faster than JVM ssl.
Got it, thanks @steveloughran |
### What changes were proposed in this pull request?

This PR aims to downgrade the Apache Hadoop dependency to 3.3.4 in `Apache Spark 3.5` in order to prevent any regression from `Apache Spark 3.4.x`. In other words, although `Apache Spark 3.5.x` will lose many bug fixes of Apache Hadoop 3.3.5 and 3.3.6, it will be in the same situation as `Apache Spark 3.4.x`.

- SPARK-44197 Upgrade Hadoop to 3.3.6 (#41744)
- SPARK-42913 Upgrade Hadoop to 3.3.5 (#39124)
- SPARK-43448 Remove dummy dependency `hadoop-openstack` (#41133)

On top of reverting SPARK-44197 and SPARK-42913, this PR has an additional dependency exclusion change due to the following.

- SPARK-43880 Organize `hadoop-cloud` in standard maven project structure (#41380)

### Why are the changes needed?

There is a community report of an S3A committer performance regression. Although it's a one-liner fix, there is no available Hadoop release with that fix at this time.

- HADOOP-18757: Bump corePoolSize of HadoopThreadPoolExecutor in s3a committer (apache/hadoop#5706)

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Pass the CIs.

Closes #42345 from dongjoon-hyun/SPARK-44678.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
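The "dependency exclusion change" mentioned above works through Maven's `exclusions` mechanism. As a rough illustration only (the artifact coordinates and version property here are assumptions, not taken from the actual Spark pom), an exclusion of a transitive artifact looks like:

```xml
<!-- Hypothetical sketch: excluding a transitive artifact pulled in by a
     hadoop-cloud-storage dependency. The real coordinates and placement
     in Spark's pom.xml may differ. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-cloud-storage</artifactId>
  <version>${hadoop.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-openstack</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

With an exclusion in place, Maven stops resolving the named artifact through that dependency's transitive graph, so it no longer lands in the packaged distribution.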
### What changes were proposed in this pull request?

Remove the dummy dependency `hadoop-openstack` from Spark binary artifacts.

### Why are the changes needed?

HADOOP-18442 removed the `hadoop-openstack` module and temporarily retained a dummy jar for the downstream projects which consume it.

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

Pass GA.
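Beyond CI, one can spot-check a built distribution to confirm the dummy jar no longer ships. This is a sketch under the assumption that the distribution's jars live in `dist/jars` (the layout produced by Spark's `dev/make-distribution.sh`); adjust the path for your build.

```shell
# Hypothetical check: look for any hadoop-openstack jar in the assumed
# distribution directory and report whether it is still present.
if ls dist/jars/hadoop-openstack-*.jar >/dev/null 2>&1; then
  echo "dummy jar still present"
else
  echo "hadoop-openstack removed from binary artifacts"
fi
```

Run from the root of an unpacked Spark distribution; an empty result from the `ls` glob means the artifact was successfully dropped.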