Shim Layer to support multiple Spark versions #414

Merged (62 commits) on Jul 23, 2020
Commits (62)
688c757
Working 30 and 31
tgravescs Jul 14, 2020
e9b3268
Build with both spark3.0 and spark 3.1
tgravescs Jul 14, 2020
ab09032
minor fixes
tgravescs Jul 14, 2020
e511110
Formatting
tgravescs Jul 14, 2020
0d97687
put back building configs
tgravescs Jul 16, 2020
773edfd
Move GpuFileSourceScanExec to spark specific dirs
Jul 16, 2020
8b6ec0a
Fix order of params
tgravescs Jul 16, 2020
ad9f2e0
remove logging
tgravescs Jul 17, 2020
c48e918
move spark30 and spark31 into shims modules
tgravescs Jul 17, 2020
c98b38e
Add missing files
tgravescs Jul 17, 2020
918cbd4
Move packages and use serviceloader
tgravescs Jul 20, 2020
bcd5eb0
Move GpuFirst to shim
tgravescs Jul 20, 2020
edf01fa
First and Last moved
tgravescs Jul 20, 2020
af27493
Allow multiple serviceloaders in dist jar
tgravescs Jul 20, 2020
06c403d
Cleanup
tgravescs Jul 20, 2020
5d45aeb
Fixes
tgravescs Jul 21, 2020
e138385
Cleanup
tgravescs Jul 21, 2020
b77bf34
pom fixes to generate docs
tgravescs Jul 21, 2020
261bcc7
Fix Suite for shim classes and cleanup
tgravescs Jul 21, 2020
5731bb9
shim layer for Rapids Shuffle Manager
tgravescs Jul 21, 2020
46db449
Shim for shuffle manager
tgravescs Jul 21, 2020
af1d79d
add in getRapidsShuffleManagerClass
tgravescs Jul 21, 2020
7952d9f
Cleanup shuffle manager
tgravescs Jul 21, 2020
e495820
Changes for shuffle manager
tgravescs Jul 21, 2020
50bad9d
Cleanup
tgravescs Jul 21, 2020
350c34b
Change spark3.1 getGpuBuildSide
tgravescs Jul 21, 2020
0a9aeed
MapOutputTracker api
tgravescs Jul 21, 2020
9b611f4
shim for mapoutputTracker api
tgravescs Jul 21, 2020
27d786c
explicitly set version in shims
tgravescs Jul 21, 2020
7b7e26e
Merge remote-tracking branch 'origin/branch-0.2' into shimBranch0.2
tgravescs Jul 21, 2020
df7916d
Move ScalaUDF to Shim
tgravescs Jul 22, 2020
4632228
Remove unneeded use of GPUBuildSide
tgravescs Jul 22, 2020
b798a05
Revert some changes to joins
tgravescs Jul 22, 2020
456b784
Fix spacing in pom
tgravescs Jul 22, 2020
fa1b463
More join changes
tgravescs Jul 22, 2020
5bb4f99
more cleanup
tgravescs Jul 22, 2020
f9efe33
more cleanup
tgravescs Jul 22, 2020
6661e4a
more cleanup
tgravescs Jul 22, 2020
507fbe5
Merge remote-tracking branch 'origin/branch-0.2' into shimBranch0.2
tgravescs Jul 22, 2020
acece84
Fix merge issue
tgravescs Jul 22, 2020
ba377d4
Add newline
tgravescs Jul 22, 2020
8f26f12
fix line length
tgravescs Jul 22, 2020
b1b6155
Fix import order
tgravescs Jul 22, 2020
e637012
Remove unneeded changes in GpuFirst
tgravescs Jul 22, 2020
4aed4d0
Cleanup poms and versions check for 3.1
tgravescs Jul 22, 2020
9bddb28
move slf4j dep up
tgravescs Jul 22, 2020
c89bdf9
Change parent pom path
tgravescs Jul 22, 2020
787e12b
move rat exclude check to shim poms since parent changed
tgravescs Jul 22, 2020
9540447
Switch to use parent pom instead of aggregator module
tgravescs Jul 22, 2020
7686068
Add spark 3.0.1
Jul 22, 2020
41c132d
Spark 3.1 shim use GpuFirst and GpuLast from 3.0.1
Jul 22, 2020
48991ce
Change to have getExprs/getExecs return map that can be reused betwee…
tgravescs Jul 22, 2020
315fa3f
Fix up types
tgravescs Jul 23, 2020
b7c711e
Fix comment
tgravescs Jul 23, 2020
98ff237
Fix comments
tgravescs Jul 23, 2020
f5a22f3
Rename spark30 to spark300
tgravescs Jul 23, 2020
680864c
move spark 31 to spark310
tgravescs Jul 23, 2020
118b7b4
renames
tgravescs Jul 23, 2020
2ea4767
cleanup
tgravescs Jul 23, 2020
b6283dd
move RapidsShuffleManager 301
tgravescs Jul 23, 2020
716518a
Document profiles for unit tests
tgravescs Jul 23, 2020
cf1c60d
cleanup
tgravescs Jul 23, 2020
Files changed
38 changes: 38 additions & 0 deletions dist/pom.xml
@@ -41,6 +41,17 @@
<artifactId>rapids-4-spark-shuffle_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-aggregator_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<!-- required for conf generation script -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<scope>provided</scope>
</dependency>
</dependencies>

<build>
@@ -49,6 +60,9 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<configuration>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
<shadedArtifactAttached>false</shadedArtifactAttached>
<createDependencyReducedPom>true</createDependencyReducedPom>
<relocations>
@@ -94,6 +108,30 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<executions>
<execution>
<id>update_config</id>
<phase>verify</phase>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<configuration>
<launchers>
<launcher>
<id>update_rapids_config</id>
<mainClass>com.nvidia.spark.rapids.RapidsConf</mainClass>
<args>
<arg>${project.basedir}/../docs/configs.md</arg>
</args>
</launcher>
</launchers>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.rat</groupId>
<artifactId>apache-rat-plugin</artifactId>
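The ServicesResourceTransformer added above merges the META-INF/services registration files contributed by each shim module when the dist jar is shaded, so every per-version shim stays discoverable through java.util.ServiceLoader at runtime. A minimal sketch of that discovery pattern follows; the provider trait, its method names, and the version-matching logic are illustrative assumptions, not the plugin's actual ShimLoader implementation.

```scala
import java.util.ServiceLoader

import scala.collection.JavaConverters._

// Hypothetical provider interface: each shim module would register one
// implementation under META-INF/services, and the shaded jar keeps all of the
// registrations because of the ServicesResourceTransformer.
trait SparkShimServiceProvider {
  def matchesVersion(sparkVersion: String): Boolean
  def buildShims: SparkShims
}

trait SparkShims {
  def getRapidsShuffleManagerClass: String
}

object ShimLoaderSketch {
  // Pick the shim whose provider claims the running Spark version.
  def loadShims(sparkVersion: String): SparkShims = {
    val providers = ServiceLoader.load(classOf[SparkShimServiceProvider]).asScala
    providers.find(_.matchesVersion(sparkVersion))
      .map(_.buildShims)
      .getOrElse(throw new IllegalArgumentException(
        s"No shim found for Spark version $sparkVersion"))
  }
}
```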
8 changes: 6 additions & 2 deletions docs/get-started/getting-started.md
@@ -417,11 +417,15 @@ With `nv_peer_mem`, IB/RoCE-based transfers can perform zero-copy transfers dire
2) Install [UCX 1.8.1](https://github.com/openucx/ucx/releases/tag/v1.8.1).

3) You will need to configure your spark job with extra settings for UCX (we are looking to
simplify these settings in the near future):
simplify these settings in the near future). Choose the version of the shuffle manager
that matches your Spark version. Currently we support
Spark 3.0.0 (com.nvidia.spark.rapids.spark300.RapidsShuffleManager),
Spark 3.0.1 (com.nvidia.spark.rapids.spark301.RapidsShuffleManager), and
Spark 3.1.0 (com.nvidia.spark.rapids.spark310.RapidsShuffleManager):

```shell
...
--conf spark.shuffle.manager=com.nvidia.spark.RapidsShuffleManager \
--conf spark.shuffle.manager=com.nvidia.spark.rapids.spark300.RapidsShuffleManager \
--conf spark.shuffle.service.enabled=false \
--conf spark.rapids.shuffle.transport.enabled=true \
--conf spark.executorEnv.UCX_TLS=cuda_copy,cuda_ipc,rc,tcp \
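The documentation change above hard-codes the version-specific class name in spark.shuffle.manager. From Scala code the same class name can be obtained from the shim layer itself via getRapidsShuffleManagerClass, as the updated test suites later in this diff do. A small sketch, assuming the plugin jar is already on the classpath and that setting the property through the session builder is acceptable for the deployment in question:

```scala
import com.nvidia.spark.rapids.ShimLoader

import org.apache.spark.sql.SparkSession

object RapidsShuffleExample {
  def main(args: Array[String]): Unit = {
    // Resolve the shuffle manager class that matches the running Spark version
    // instead of hard-coding e.g. com.nvidia.spark.rapids.spark300.RapidsShuffleManager.
    val shuffleManager = ShimLoader.getSparkShims.getRapidsShuffleManagerClass

    val spark = SparkSession.builder()
      .appName("rapids-shuffle-example")
      .config("spark.shuffle.manager", shuffleManager)
      .config("spark.shuffle.service.enabled", "false")
      .getOrCreate()

    println(s"Using shuffle manager: $shuffleManager")
    spark.stop()
  }
}
```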
3 changes: 3 additions & 0 deletions docs/testing.md
@@ -41,6 +41,9 @@ They generally follow TPCH but are not guaranteed to be the same.

Unit tests exist in the tests directory. This is unconventional and is done so we can run the tests
on the final shaded version of the plugin. It also helps with how we collect code coverage.
You can run the unit tests against different versions of Spark using Maven profiles. The
default build runs against Spark 3.0.0, `-Pspark301tests` runs against Spark 3.0.1, and `-Pspark310tests`
runs against Spark 3.1.0.

## Integration tests

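Assuming a standard Maven invocation from the repository root, that means something like `mvn test -Pspark310tests` to exercise the unit tests against Spark 3.1.0, or `mvn test -Pspark301tests` for Spark 3.0.1; omitting the profile keeps the default Spark 3.0.0 build.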
28 changes: 28 additions & 0 deletions integration_tests/pom.xml
@@ -28,14 +28,42 @@
<artifactId>rapids-4-spark-integration-tests_2.12</artifactId>
<version>0.2.0-SNAPSHOT</version>

<properties>
<spark.test.version>3.0.0</spark.test.version>
</properties>
<profiles>
<profile>
<id>spark301tests</id>
<properties>
<spark.test.version>3.0.1-SNAPSHOT</spark.test.version>
</properties>
</profile>
<profile>
<id>spark310tests</id>
<properties>
<spark.test.version>3.1.0-SNAPSHOT</spark.test.version>
</properties>
</profile>
</profiles>

<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
<!-- runtime scope is appropriate, but causes SBT build problems -->
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${spark.test.version}</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
MortgageSparkSuite.scala
@@ -16,7 +16,7 @@

package com.nvidia.spark.rapids.tests.mortgage

import com.nvidia.spark.RapidsShuffleManager
import com.nvidia.spark.rapids.ShimLoader
import org.scalatest.FunSuite

import org.apache.spark.sql.SparkSession
@@ -34,7 +34,7 @@ class MortgageSparkSuite extends FunSuite {
.config("spark.rapids.sql.test.enabled", false)
.config("spark.rapids.sql.incompatibleOps.enabled", true)
.config("spark.rapids.sql.hasNans", false)
val rapidsShuffle = classOf[RapidsShuffleManager].getCanonicalName
val rapidsShuffle = ShimLoader.getSparkShims.getRapidsShuffleManagerClass
val prop = System.getProperty("rapids.shuffle.manager.override", "false")
if (prop.equalsIgnoreCase("true")) {
println("RAPIDS SHUFFLE MANAGER ACTIVE")
TpchLikeSparkSuite.scala
@@ -16,8 +16,8 @@

package com.nvidia.spark.rapids.tests.tpch

import com.nvidia.spark.RapidsShuffleManager
import com.nvidia.spark.rapids.{ColumnarRdd, ExecutionPlanCaptureCallback}
import com.nvidia.spark.rapids.ShimLoader
import org.scalatest.{BeforeAndAfterAll, FunSuite}

import org.apache.spark.sql.{DataFrame, SparkSession}
@@ -44,7 +44,7 @@ class TpchLikeSparkSuite extends FunSuite with BeforeAndAfterAll {
.config("spark.rapids.sql.explain", true)
.config("spark.rapids.sql.incompatibleOps.enabled", true)
.config("spark.rapids.sql.hasNans", false)
val rapidsShuffle = classOf[RapidsShuffleManager].getCanonicalName
val rapidsShuffle = ShimLoader.getSparkShims.getRapidsShuffleManagerClass
val prop = System.getProperty("rapids.shuffle.manager.override", "false")
if (prop.equalsIgnoreCase("true")) {
println("RAPIDS SHUFFLE MANAGER ACTIVE")
29 changes: 29 additions & 0 deletions pom.xml
@@ -76,6 +76,7 @@
<module>sql-plugin</module>
<module>tests</module>
<module>integration_tests</module>
<module>shims</module>
<module>api_validation</module>
</modules>

@@ -128,6 +129,12 @@
<rat.consoleOutput>true</rat.consoleOutput>
</properties>
</profile>
<profile>
<id>spark301tests</id>
</profile>
<profile>
<id>spark310tests</id>
</profile>
</profiles>

<properties>
@@ -152,6 +159,7 @@
<project.reporting.sourceEncoding>UTF-8</project.reporting.sourceEncoding>
<pytest.TEST_TAGS>not qarun</pytest.TEST_TAGS>
<rat.consoleOutput>false</rat.consoleOutput>
<slf4j.version>1.7.30</slf4j.version>
</properties>

<dependencyManagement>
@@ -168,6 +176,17 @@
<classifier>${cuda.version}</classifier>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
<version>${slf4j.version}</version>
<!-- runtime scope is appropriate, but causes SBT build problems -->
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
@@ -547,5 +566,15 @@
<enabled>true</enabled>
</snapshots>
</repository>
<repository>
<id>apache-snapshots-repo</id>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
</project>
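The new apache-snapshots-repo entry is what lets the spark301tests and spark310tests profiles resolve the 3.0.1-SNAPSHOT and 3.1.0-SNAPSHOT Spark artifacts referenced from integration_tests/pom.xml, since only snapshot builds of those Spark versions existed at the time of this change.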
55 changes: 55 additions & 0 deletions shims/aggregator/pom.xml
@@ -0,0 +1,55 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright (c) 2020, NVIDIA CORPORATION.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
<version>0.2.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-aggregator_2.12</artifactId>
<packaging>jar</packaging>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Shim Aggregator</name>
<description>The RAPIDS SQL plugin for Apache Spark Shim Aggregator</description>
<version>0.2.0-SNAPSHOT</version>

<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-spark310_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-spark301_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-spark300_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>compile</scope>
</dependency>
</dependencies>
</project>
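The aggregator module gives the dist jar a single artifact to depend on: it pulls in the spark300, spark301, and spark310 shim jars at compile scope, and the ServicesResourceTransformer in dist/pom.xml merges their ServiceLoader registrations when the shaded jar is built.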
72 changes: 72 additions & 0 deletions shims/pom.xml
@@ -0,0 +1,72 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright (c) 2020, NVIDIA CORPORATION.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>0.2.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
<packaging>pom</packaging>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Shims</name>
<description>The RAPIDS SQL plugin for Apache Spark Shims</description>
<version>0.2.0-SNAPSHOT</version>

<modules>
<module>spark300</module>
<module>spark301</module>
<module>spark310</module>
<module>aggregator</module>
</modules>
<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-sql_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>ai.rapids</groupId>
<artifactId>cudf</artifactId>
<classifier>${cuda.version}</classifier>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.rat</groupId>
<artifactId>apache-rat-plugin</artifactId>
<configuration>
<excludes>
<exclude>**/src/main/resources/META-INF/services/*</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</project>