[WIP][test-java11] Test Hadoop 2.7 with JDK 11 #26533
Conversation
Test build #113822 has finished for PR 26533 at commit
Test build #113824 has finished for PR 26533 at commit
retest this please
Test build #113830 has finished for PR 26533 at commit
Test build #113831 has finished for PR 26533 at commit
The test cases should be fixed as of 65a189c
Thank you @HyukjinKwon
All failed tests:
org.apache.spark.sql.kafka010.KafkaSinkMicroBatchStreamingSuite.(It is not a test it is a sbt.testing.SuiteSelector)
org.apache.spark.sql.kafka010.KafkaDelegationTokenSuite.(It is not a test it is a sbt.testing.SuiteSelector)

How to reproduce:
export JAVA_HOME=/usr/lib/jdk-11.0.3
export PATH=$JAVA_HOME/bin:$PATH
build/sbt "sql-kafka-0-10/test-only *.KafkaSinkMicroBatchStreamingSuite" -Phadoop-3.2 -Dhadoop.version=2.7.4 -Dcurator.version=2.7.1
build/sbt "sql-kafka-0-10/test-only *.KafkaDelegationTokenSuite" -Phadoop-3.2 -Dhadoop.version=2.7.4 -Dcurator.version=2.7.1
# Conflicts:
#	dev/deps/spark-deps-hadoop-3.2
Test build #113847 has finished for PR 26533 at commit
@HyukjinKwon @BryanCutler It seems the failed tests are related to https://issues.apache.org/jira/browse/ARROW-5412?
…Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
Right, I think for JDK >= 9 this needs "-Dio.netty.tryReflectionSetAccessible=true" to be set. This was due to a Netty API that Arrow started using. I can look into it and try to figure out what we should do for Spark.
Oh, I saw you just added it as a property in the pom. Hopefully that should do the trick.
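For context on why the flag matters: the error above comes from reflective access to JDK internals that is blocked by the module system on JDK 9+. A minimal standalone check (an illustration, not Spark or Netty source code) of whether `java.nio.DirectByteBuffer(long, int)` can be made accessible looks like this; Netty only attempts this kind of access when `-Dio.netty.tryReflectionSetAccessible=true` is set:

```java
import java.lang.reflect.Constructor;

public class DirectBufferCheck {
    public static void main(String[] args) {
        boolean accessible;
        try {
            // DirectByteBuffer is package-private, so it must be looked up by name.
            Class<?> cls = Class.forName("java.nio.DirectByteBuffer");
            Constructor<?> ctor = cls.getDeclaredConstructor(long.class, int.class);
            // On JDK 9+ this throws InaccessibleObjectException unless the JVM
            // was started with --add-opens java.base/java.nio=ALL-UNNAMED.
            ctor.setAccessible(true);
            accessible = true;
        } catch (Throwable t) {
            accessible = false;
        }
        System.out.println("DirectByteBuffer(long, int) accessible: " + accessible);
    }
}
```

The printed result depends on the JDK version and on whether `--add-opens` was passed, which is exactly why the same test suite can pass on JDK 8 and fail on JDK 11.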
Test build #113859 has finished for PR 26533 at commit
retest this please
@BryanCutler It seems it's not working:
export JAVA_HOME=/usr/lib/jdk-11.0.3
export PATH=$JAVA_HOME/bin:$PATH
build/sbt "sql/test-only *.ArrowConvertersSuite" -Phadoop-3.2 -Dio.netty.tryReflectionSetAccessible=true
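One plausible reason the command above has no effect: a `-D` flag on the `build/sbt` command line reaches the sbt launcher JVM, while the tests typically run in a separately forked JVM that takes its options from the build definition. For the Maven side, a sketch of passing the flag to the forked test JVMs via the Surefire plugin's `argLine` could look like the following (the exact placement in Spark's pom is an assumption; Spark may route this through a shared property instead):

```xml
<!-- Sketch: forward the Netty flag to forked test JVMs (illustrative only). -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>-Dio.netty.tryReflectionSetAccessible=true</argLine>
  </configuration>
</plugin>
```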
Test build #113865 has finished for PR 26533 at commit
@wangyum. Arrow broke all tests including PySpark/R. I'm working on the following PR. After fixing that, I'll ping you here. Please see the Scala stuff there, too.
Thank you @dongjoon-hyun |
retest this please |
Test build #113926 has finished for PR 26533 at commit
Test build #113935 has finished for PR 26533 at commit
Test build #113933 has finished for PR 26533 at commit
All failed tests:
org.apache.spark.sql.kafka010.KafkaSinkMicroBatchStreamingSuite.(It is not a test it is a sbt.testing.SuiteSelector)
org.apache.spark.sql.kafka010.KafkaDelegationTokenSuite.(It is not a test it is a sbt.testing.SuiteSelector)

How to reproduce:
export JAVA_HOME=/usr/lib/jdk-11.0.3
export PATH=$JAVA_HOME/bin:$PATH
build/sbt "sql-kafka-0-10/test-only *.KafkaSinkMicroBatchStreamingSuite" -Phadoop-3.2 -Dhadoop.version=2.7.4 -Dcurator.version=2.7.1
build/sbt "sql-kafka-0-10/test-only *.KafkaDelegationTokenSuite" -Phadoop-3.2 -Dhadoop.version=2.7.4 -Dcurator.version=2.7.1

The failed tests may be related to https://issues.apache.org/jira/browse/HADOOP-12911
Test build #114073 has finished for PR 26533 at commit
retest this please
Test build #114077 has finished for PR 26533 at commit
Test build #114338 has finished for PR 26533 at commit
# Conflicts:
#	pom.xml
Test build #114342 has finished for PR 26533 at commit
retest this please
Test build #114343 has finished for PR 26533 at commit
Test build #114349 has finished for PR 26533 at commit
Test build #114580 has finished for PR 26533 at commit
Test build #114624 has finished for PR 26533 at commit
The issue is fixed by #26594
Another issue:
More details: https://issues.apache.org/jira/browse/HADOOP-12760
What changes were proposed in this pull request?
This PR tests Hadoop 2.7 with JDK 11 on our Jenkins.