Describe the bug
Running with a jar built with -DallowConventionalDistJar=true failed on startup:
22/05/23 17:53:19 ERROR SparkContext: Error initializing SparkContext.
java.lang.AssertionError: assertion failed: Classpath should contain the resource for META-INF/services/com.nvidia.spark.rapids.SparkShimServiceProvider
at scala.Predef$.assert(Predef.scala:223)
at com.nvidia.spark.rapids.ShimLoader$.detectShimProvider(ShimLoader.scala:307)
at com.nvidia.spark.rapids.ShimLoader$.findShimProvider(ShimLoader.scala:355)
at com.nvidia.spark.rapids.ShimLoader$.initShimProviderIfNeeded(ShimLoader.scala:100)
at com.nvidia.spark.rapids.ShimLoader$.getShimClassLoader(ShimLoader.scala:250)
at com.nvidia.spark.rapids.ShimLoader$.loadClass(ShimLoader.scala:383)
at com.nvidia.spark.rapids.ShimLoader$.newInstanceOf(ShimLoader.scala:389)
at com.nvidia.spark.rapids.ShimLoader$.newDriverPlugin(ShimLoader.scala:418)
at com.nvidia.spark.SQLPlugin.driverPlugin(SQLPlugin.scala:29)
Examining the contents of the jar shows the service provider file is present but empty:
$ jar tvf dist/target/rapids-4-spark_2.12-22.06.0-SNAPSHOT-cuda11.jar | grep SparkShimServiceProvider
0 Mon May 23 12:48:28 CDT 2022 META-INF/services/com.nvidia.spark.rapids.SparkShimServiceProvider
779 Mon May 23 12:47:10 CDT 2022 com/nvidia/spark/rapids/SparkShimServiceProvider.class
1392 Mon May 23 12:47:08 CDT 2022 com/nvidia/spark/rapids/shims/spark321/SparkShimServiceProvider$.class
2037 Mon May 23 12:47:08 CDT 2022 com/nvidia/spark/rapids/shims/spark321/SparkShimServiceProvider.class
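The shim provider is resolved from that META-INF/services file, so an empty file means no provider can be discovered and the assertion above fires. A quick way to confirm this (and to see what the file would normally list) is to print it straight from the jar. The spark321 provider entry shown in the comments below is inferred from the class files listed above, not copied from a working build:
$ unzip -p dist/target/rapids-4-spark_2.12-22.06.0-SNAPSHOT-cuda11.jar \
    META-INF/services/com.nvidia.spark.rapids.SparkShimServiceProvider
# expected to print one provider class per line, e.g.:
#   com.nvidia.spark.rapids.shims.spark321.SparkShimServiceProvider
# on the broken jar this prints nothing, matching the 0-byte entry above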
Steps/Code to reproduce bug
Build the jar for Spark 3.2.1 via: mvn clean package -Dbuildver=321 -DskipTests -DallowConventionalDistJar=true
Run the jar against Spark 3.2.1, for example:
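A launch along these lines reproduces the failure; --jars plus spark.plugins=com.nvidia.spark.SQLPlugin is the usual way to enable the RAPIDS Accelerator, but the Spark installation path here is illustrative:
$ /path/to/spark-3.2.1/bin/spark-shell \
    --jars dist/target/rapids-4-spark_2.12-22.06.0-SNAPSHOT-cuda11.jar \
    --conf spark.plugins=com.nvidia.spark.SQLPlugin
# SparkContext initialization fails with the assertion shown above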
Expected behavior
RAPIDS Accelerator starts up without an exception.
Environment details (please complete the following information)
Spark 3.2.1
jlowe changed the title from "[BUG] Shim service provider failure when using jar built with" to "[BUG] Shim service provider failure when using jar built with -DallowConventionalDistJar" on May 23, 2022
This PR closes #5596 by moving the truncate step to just before the shim service file concatenation generates a new provider list.
The empty ShimServiceProvider service file created during the init-properties phase is never populated via concat on the conventional-jar path, so it later overwrites the original file from the aggregator jar. It is also bad practice to have side effects while still initializing properties. This PR corrects both issues.
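As an illustration of the ordering the fix enforces (a hypothetical shell sketch, not the actual Maven/Ant build logic), the service file should only be truncated immediately before the shim provider entries are concatenated into it, so the aggregator's entries are never lost:
# hypothetical sketch; directory layout and file names are illustrative
SERVICE=META-INF/services/com.nvidia.spark.rapids.SparkShimServiceProvider
: > "dist/unpacked/$SERVICE"                                      # truncate only right before concatenation
cat "aggregator/unpacked/$SERVICE" >> "dist/unpacked/$SERVICE"    # then append the shim provider entries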
Signed-off-by: Gera Shegalov <[email protected]>