(Py)Spark 3.0 / Java 11 fails with java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
#200
Comments
According to Netty, compatibility with Java 11 was fixed in a later release, but the connector still pins an older Netty version (see spark-bigquery-connector/build.sbt, line 86, at commit 5ff9a75).
We will update the Netty version in the next connector release; meanwhile, you can update the Netty version locally and build the connector jar yourself.
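A local override might look like the following. This is a hedged sketch only: the exact Netty artifact coordinates and the version to pin are assumptions for illustration, not taken from the project's build.

```scala
// build.sbt — hypothetical sketch of forcing a newer Netty before re-assembling the jar.
// The "netty-all" coordinates and the "4.1.x" version string are illustrative assumptions;
// check Netty's release notes for the first version with Java 11 fixes.
dependencyOverrides += "io.netty" % "netty-all" % "4.1.x"
```

After adding the override, rebuilding with `sbt assembly` would produce a fat jar containing the overridden dependency.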
Thanks for the quick update. I am unable to build the jar locally (and my knowledge of sbt-assembly is really not solid enough to dig into this). I'll test the next release!
Fixed by PR #204
Should I still be experiencing this error even though I downloaded the release?

export SPARK_SUBMIT_OPTS="--illegal-access=permit -Dio.netty.tryReflectionSetAccessible=true "
When using the connector with Spark 3.0.0 (on Java 11), trying to read a dataset from BigQuery fails with the error at the bottom.
This is a known problem with Java 9+ and Spark, as noted in the Spark documentation (https://spark.apache.org/docs/3.0.0/). Here is the pull request in question: apache/spark#26552
My Spark instance is launched with the
-Dio.netty.tryReflectionSetAccessible=true
flag enabled, and Pandas UDF/Arrow conversion works. I downloaded a sample dataset from BigQuery to test my code, and it works without any issues.

Steps to reproduce
With Spark 3.0.0 / Java 11:
1. Launch PySpark with the
--conf spark.driver.extraJavaOptions="-Dio.netty.tryReflectionSetAccessible=true" --conf spark.executor.extraJavaOptions="-Dio.netty.tryReflectionSetAccessible=true"
options.
2. Try to read a BigQuery dataset.
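The two steps above might look like the following. This is a sketch under stated assumptions: the connector jar filename and the BigQuery table name are placeholders, not taken from the report.

```shell
# Hypothetical reproduction. The jar path and table name are placeholders.
pyspark \
  --jars spark-bigquery-connector.jar \
  --conf spark.driver.extraJavaOptions="-Dio.netty.tryReflectionSetAccessible=true" \
  --conf spark.executor.extraJavaOptions="-Dio.netty.tryReflectionSetAccessible=true"

# Then, inside the PySpark shell:
# df = spark.read.format("bigquery").option("table", "project.dataset.table").load()
# df.show()   # fails with the UnsupportedOperationException below
```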
Stacktrace