Cross-compile all shims from JDK17 to JDK8 #3

Merged
10 changes: 5 additions & 5 deletions dist/scripts/binary-dedupe.sh
@@ -173,12 +173,12 @@ function verify_same_sha_for_unshimmed() {
# TODO currently RapidsShuffleManager is "removed" from /spark* by construction in
# dist pom.xml via ant. We could delegate this logic to this script
# and make both simpler
if [[ ! "$class_file_quoted" =~ (com/nvidia/spark/rapids/spark[34].*/.*ShuffleManager.class|org/apache/spark/sql/rapids/shims/spark[34].*/ProxyRapidsShuffleInternalManager.class) ]]; then
if [[ ! "$class_file_quoted" =~ com/nvidia/spark/rapids/spark[34].*/.*ShuffleManager.class ]]; then
Owner: I don't understand why we got rid of the ProxyRapidsShuffleInternalManager

Author: tech debt, should have been done in NVIDIA#6030


if ! grep -q "/spark.\+/$class_file_quoted" "$SPARK_SHARED_TXT"; then
echo >&2 "$class_file is not bitwise-identical across shims"
exit 255
fi
fi
}

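The effect of the narrowed regex is that ProxyRapidsShuffleInternalManager loses its exemption: it must now pass the same bitwise-identical check as every other unshimmed class (the Owner/Author exchange above confirms this was intentional cleanup). A minimal sketch of the invariant being enforced, assuming one spark*/ directory per shim; SHIM_ROOT and the class path below are illustrative names, not values taken from the script:

#!/usr/bin/env bash
# Sketch only: a class deduplicates cleanly iff every shim's copy of it
# has the same SHA-1 digest.
set -euo pipefail

SHIM_ROOT="dist/target/parallel-world"    # assumed layout, not the script's input
CLASS="org/apache/spark/sql/rapids/shims/ProxyRapidsShuffleInternalManager.class"

# Count the distinct digests of this class across all spark*/ shim copies.
distinct=$(find "$SHIM_ROOT"/spark* -path "*/$CLASS" -print0 \
  | xargs -0 -r sha1sum | awk '{print $1}' | sort -u | wc -l)

if [[ "$distinct" -gt 1 ]]; then
  echo >&2 "$CLASS is not bitwise-identical across shims"
  exit 255
fi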
42 changes: 35 additions & 7 deletions jdk-profiles/pom.xml
@@ -31,17 +31,45 @@
<version>24.08.0-SNAPSHOT</version>
<profiles>
<profile>
<id>jdk9plus</id>
<properties>
<scala.plugin.version>4.6.1</scala.plugin.version>
<maven.compiler.source>${java.specification.version}</maven.compiler.source>
<maven.compiler.release>${maven.compiler.source}</maven.compiler.release>
<maven.compiler.target>${maven.compiler.source}</maven.compiler.target>
</properties>
<id>jdk8</id>
<activation>
<jdk>8</jdk>
</activation>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>${scala.plugin.version}</version>
<configuration>
<target>${java.major.version}</target>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</profile>
<profile>
<id>jdk9plus</id>
<activation>
<!-- activate for all Java versions 9 and higher -->
<jdk>[9,)</jdk>
</activation>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>${scala.plugin.version}</version>
<configuration>
<release>${java.major.version}</release>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</profile>
</profiles>
</project>
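The jdk8/jdk9plus split mirrors javac itself: --release only exists on JDK 9+ (JEP 247), so a build running on JDK 8 can merely pin the bytecode level with -target, whereas a JDK 9+ build can use --release 8, which also compiles against the JDK 8 class-library API and turns accidental use of newer methods (the classic trap is ByteBuffer.flip()'s covariant return type, added in 9) into compile errors rather than NoSuchMethodError on a JDK 8 runtime. Roughly, the plugin configuration above maps to these invocations:

# jdk8 profile: running on JDK 8, where --release does not exist yet.
javac -source 1.8 -target 1.8 Example.java

# jdk9plus profile: cross-compiling from JDK 9+ (e.g. 17) down to 8,
# with API usage checked against the JDK 8 class library as well.
javac --release 8 Example.java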
16 changes: 7 additions & 9 deletions pom.xml
@@ -505,8 +505,6 @@
</property>
</activation>
<properties>
<!-- Downgrade scala plugin version due to: https://github.com/sbt/sbt/issues/4305 -->
<scala.plugin.version>3.4.4</scala.plugin.version>
<spark.version.classifier>spark330db</spark.version.classifier>
<spark.version>${spark330db.version}</spark.version>
<spark.test.version>${spark330db.version}</spark.test.version>
@@ -531,8 +529,6 @@
</property>
</activation>
<properties>
<!-- Downgrade scala plugin version due to: https://github.com/sbt/sbt/issues/4305 -->
<scala.plugin.version>3.4.4</scala.plugin.version>
<spark.version.classifier>spark332db</spark.version.classifier>
<spark.version>${spark332db.version}</spark.version>
<spark.test.version>${spark332db.version}</spark.test.version>
@@ -556,8 +552,6 @@
</property>
</activation>
<properties>
<!-- Downgrade scala plugin version due to: https://github.com/sbt/sbt/issues/4305 -->
<scala.plugin.version>3.4.4</scala.plugin.version>
<spark.version.classifier>spark341db</spark.version.classifier>
<spark.version>${spark341db.version}</spark.version>
<spark.test.version>${spark341db.version}</spark.test.version>
@@ -755,7 +749,6 @@
<allowConventionalDistJar>false</allowConventionalDistJar>
<buildver>311</buildver>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<java.major.version>8</java.major.version>
<spark.version>${spark311.version}</spark.version>
<spark.test.version>${spark.version}</spark.test.version>
@@ -829,7 +822,8 @@
<spark351.version>3.5.1</spark351.version>
<spark400.version>4.0.0-SNAPSHOT</spark400.version>
<mockito.version>3.12.4</mockito.version>
<scala.plugin.version>4.3.0</scala.plugin.version>
<!-- same as Apache Spark 4.0.0 -->
<scala.plugin.version>4.7.1</scala.plugin.version>
<maven.install.plugin.version>3.1.1</maven.install.plugin.version>
<maven.jar.plugin.version>3.3.0</maven.jar.plugin.version>
<scalatest-maven-plugin.version>2.0.2</scalatest-maven-plugin.version>
@@ -1559,18 +1553,22 @@ This will force full Scala code rebuild in downstream modules.
<message>Minimum Maven version 3.6.x required</message>
<version>[3.6,)</version>
</requireMavenVersion>
<!-- #if scala-2.12 -->
<requireJavaVersion>
<message>Only Java 8, 11, and 17 are supported!</message>
<version>[1.8,1.9),[11,12),[17,18)</version>
</requireJavaVersion>
<!-- #if scala-2.12 -->
<requireProperty>
<property>buildver</property>
<regex>^(?!400).*$</regex>
<regexMessage>Spark 4.0.0 is only supported for Scala 2.13</regexMessage>
</requireProperty>
<!-- #endif scala-2.12 -->
<!-- #if scala-2.13 --><!--
<requireJavaVersion>
<message>Build for Scala 2.13 is only available with Java 17+</message>
<version>[17,)</version>
</requireJavaVersion>
<requireProperty>
<regexMessage>Unexpected buildver value ${buildver} for a Scala 2.13 build, only Apache Spark versions 3.3.0 (330) and higher are supported, no vendor builds such as 330db</regexMessage>
<property>buildver</property>
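Two notes on this enforcer block. The requireJavaVersion ranges use Maven version-range syntax: [1.8,1.9),[11,12),[17,18) admits exactly Java 8, 11, or 17, while the Scala 2.13 rule's [17,) means 17 or newer. And the #if scala-2.12 / #if scala-2.13 markers act as a comment-based preprocessor: the scala2.13/ copies of these POMs (further down) carry the same blocks with the comment wrappers flipped, so 2.12-only rules are disabled there and 2.13-only rules are live. The actual generator is not part of this diff; a toy sed sketch of the flip it performs:

# Illustration only: derive the Scala 2.13 POM by uncommenting
# "#if scala-2.13" blocks and commenting out "#if scala-2.12" blocks.
sed -e 's|<!-- #if scala-2.13 --><!--|<!-- #if scala-2.13 -->|' \
    -e 's|--><!-- #endif scala-2.13 -->|<!-- #endif scala-2.13 -->|' \
    -e 's|<!-- #if scala-2.12 -->$|<!-- #if scala-2.12 --><!--|' \
    -e 's|<!-- #endif scala-2.12 -->$|--><!-- #endif scala-2.12 -->|' \
    pom.xml > scala2.13/pom.xml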
42 changes: 35 additions & 7 deletions scala2.13/jdk-profiles/pom.xml
@@ -31,17 +31,45 @@
<version>24.08.0-SNAPSHOT</version>
<profiles>
<profile>
<id>jdk9plus</id>
<properties>
<scala.plugin.version>4.6.1</scala.plugin.version>
<maven.compiler.source>${java.specification.version}</maven.compiler.source>
<maven.compiler.release>${maven.compiler.source}</maven.compiler.release>
<maven.compiler.target>${maven.compiler.source}</maven.compiler.target>
</properties>
<id>jdk8</id>
<activation>
<jdk>8</jdk>
</activation>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>${scala.plugin.version}</version>
<configuration>
<target>${java.major.version}</target>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</profile>
<profile>
<id>jdk9plus</id>
<activation>
<!-- activate for all Java versions 9 and higher -->
<jdk>[9,)</jdk>
</activation>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>${scala.plugin.version}</version>
<configuration>
<release>${java.major.version}</release>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</profile>
</profiles>
</project>
16 changes: 7 additions & 9 deletions scala2.13/pom.xml
@@ -505,8 +505,6 @@
</property>
</activation>
<properties>
<!-- Downgrade scala plugin version due to: https://github.com/sbt/sbt/issues/4305 -->
<scala.plugin.version>3.4.4</scala.plugin.version>
<spark.version.classifier>spark330db</spark.version.classifier>
<spark.version>${spark330db.version}</spark.version>
<spark.test.version>${spark330db.version}</spark.test.version>
@@ -531,8 +529,6 @@
</property>
</activation>
<properties>
<!-- Downgrade scala plugin version due to: https://github.com/sbt/sbt/issues/4305 -->
<scala.plugin.version>3.4.4</scala.plugin.version>
<spark.version.classifier>spark332db</spark.version.classifier>
<spark.version>${spark332db.version}</spark.version>
<spark.test.version>${spark332db.version}</spark.test.version>
@@ -556,8 +552,6 @@
</property>
</activation>
<properties>
<!-- Downgrade scala plugin version due to: https://github.com/sbt/sbt/issues/4305 -->
<scala.plugin.version>3.4.4</scala.plugin.version>
<spark.version.classifier>spark341db</spark.version.classifier>
<spark.version>${spark341db.version}</spark.version>
<spark.test.version>${spark341db.version}</spark.test.version>
@@ -755,7 +749,6 @@
<allowConventionalDistJar>false</allowConventionalDistJar>
<buildver>311</buildver>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<java.major.version>8</java.major.version>
<spark.version>${spark330.version}</spark.version>
<spark.test.version>${spark.version}</spark.test.version>
@@ -829,7 +822,8 @@
<spark351.version>3.5.1</spark351.version>
<spark400.version>4.0.0-SNAPSHOT</spark400.version>
<mockito.version>3.12.4</mockito.version>
<scala.plugin.version>4.3.0</scala.plugin.version>
<!-- same as Apache Spark 4.0.0 -->
<scala.plugin.version>4.7.1</scala.plugin.version>
<maven.install.plugin.version>3.1.1</maven.install.plugin.version>
<maven.jar.plugin.version>3.3.0</maven.jar.plugin.version>
<scalatest-maven-plugin.version>2.0.2</scalatest-maven-plugin.version>
@@ -1559,18 +1553,22 @@ This will force full Scala code rebuild in downstream modules.
<message>Minimum Maven version 3.6.x required</message>
<version>[3.6,)</version>
</requireMavenVersion>
<!-- #if scala-2.12 --><!--
<requireJavaVersion>
<message>Only Java 8, 11, and 17 are supported!</message>
<version>[1.8,1.9),[11,12),[17,18)</version>
</requireJavaVersion>
<!-- #if scala-2.12 --><!--
<requireProperty>
<property>buildver</property>
<regex>^(?!400).*$</regex>
<regexMessage>Spark 4.0.0 is only supported for Scala 2.13</regexMessage>
</requireProperty>
--><!-- #endif scala-2.12 -->
<!-- #if scala-2.13 -->
<requireJavaVersion>
<message>Build for Scala 2.13 is only available with Java 17+</message>
<version>[17,)</version>
</requireJavaVersion>
<requireProperty>
<regexMessage>Unexpected buildver value ${buildver} for a Scala 2.13 build, only Apache Spark versions 3.3.0 (330) and higher are supported, no vendor builds such as 330db</regexMessage>
<property>buildver</property>
@@ -1,5 +1,5 @@
/*
* Copyright (c) 2019-2021, NVIDIA CORPORATION.
* Copyright (c) 2019-2024, NVIDIA CORPORATION.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,7 +25,7 @@ import org.apache.spark.internal.Logging
* The RAPIDS plugin for Spark.
* To enable this plugin, set the config "spark.plugins" to `com.nvidia.spark.SQLPlugin`
*/
class SQLPlugin extends SparkPlugin with Logging {
class SQLPlugin extends SparkPlugin {
override def driverPlugin(): DriverPlugin = ShimLoader.newDriverPlugin()

override def executorPlugin(): ExecutorPlugin = ShimLoader.newExecutorPlugin()
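As the scaladoc above notes, enabling the plugin is just a matter of the spark.plugins config; a hedged usage sketch, where the jar path and application are placeholders rather than anything from this PR:

spark-submit \
  --jars rapids-4-spark.jar \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  your_app.py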