
NullPointerException #89

Open
wangwenGreentea opened this issue Nov 19, 2018 · 4 comments

Comments

@wangwenGreentea

I tested your example GpuDSArrayMult, but I get this error:

Caused by: java.lang.NullPointerException
at com.ibm.gpuenabler.CUDAManager.cachedLoadModule(CUDAManager.scala:72)
at com.ibm.gpuenabler.CUDAManager.getModule(CUDAManager.scala:62)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$JCUDAIteratorImpl.processGPU(Unknown Source)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$JCUDAIteratorImpl.hasNext(Unknown Source)
at com.ibm.gpuenabler.MAPGPUExec$$anonfun$doExecute$1.apply(CUDADSUtils.scala:152)
at com.ibm.gpuenabler.MAPGPUExec$$anonfun$doExecute$1.apply(CUDADSUtils.scala:73)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:844)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:844)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)

@asdfghjBennyHuang

It's actually the same as my issue #86. I need some help on this too.

If you just want to make it run, you can try disabling spark.gpuenabler.autocache. That eliminated the error for me and sort of resolved the issue, but we probably still need a proper fix.

@wangwenGreentea
Author

> If you just want to make it run, you can probably try disabling the spark.gpuenabler.autocache.

Thank you! Even when I run it with local[*], the same error still occurs. How do I disable spark.gpuenabler.autocache?

@asdfghjBennyHuang

You can add a line to your $SPARK_HOME/conf/spark-defaults.conf. See https://spark.apache.org/docs/latest/configuration.html for the other ways Spark properties can be set.
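For reference, a minimal sketch of both approaches, assuming spark.gpuenabler.autocache is a boolean property that takes the value false to disable auto-caching (the thread names the property but not its accepted values):

```shell
# Option 1: set it once for all jobs via spark-defaults.conf
echo "spark.gpuenabler.autocache false" >> "$SPARK_HOME/conf/spark-defaults.conf"

# Option 2: set it for a single job on the spark-submit command line
# (the jar and class names here are placeholders for your own application)
spark-submit \
  --conf spark.gpuenabler.autocache=false \
  --class your.main.Class your-app.jar
```

Properties passed with --conf override values from spark-defaults.conf for that submission, so Option 2 is convenient for verifying whether this setting actually makes the NullPointerException go away before changing the cluster-wide defaults.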
