Error in Running test #950
Comments
Hi @Manoj-red-hat, by now we have resolved all Spark 3.1 & 3.2 compatibility issues in Gazelle's main code only. There are still a few pending compatibility issues in UT. Please use the spark 3.1.1 profile to run UT.
@PHILO-HE, when are we planning to merge the main code base into a release?
Hi @Manoj-red-hat, we have already kicked off preparation for the next official release, i.e., 1.4.0. For migration to Arrow 7.0.0, we have no plan in the short term. BTW, could you please tell us your company name? And what are your expectations for Gazelle in your production environment?
@PHILO-HE, I am working for a startup, zettabolt.
I am exploring Gazelle for heterogeneous (FPGA+CPU) Spark development. One more question, @PHILO-HE: with Columnar Projection with Filter and Columnar Sort, is the batch size still limited to 65536?
Exactly, the batch size is still limited to 65536 due to columnar sort. This limitation will NOT be changed in the upcoming release.
As clarified, we have no bandwidth to fix the compatibility issues in UT. In the latest 1.4.0 release, these issues are still NOT fixed. To run UT, please use the spark 3.1.1 profile.
Describe the bug
[INFO] --- scala-maven-plugin:4.3.0:testCompile (scala-test-compile) @ spark-columnar-core ---
[INFO] /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/src/test/java:-1: info: compiling
[INFO] /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/src/test/scala:-1: info: compiling
[INFO] Compiling 450 source files to /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/target/scala-2.12/test-classes at 1654155879277
[ERROR] /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/src/test/scala/com/intel/oap/execution/ArrowRowToColumnarExecSuite.scala:299: error: too many arguments (2) for method stringToDate: (s: org.apache.spark.unsafe.types.UTF8String)Option[Int]
[ERROR] Seq(DateTimeUtils.stringToDate(UTF8String.fromString("1970-1-1"), defaultZoneId).get,
[ERROR] ^
[ERROR] /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/src/test/scala/com/intel/oap/execution/ArrowRowToColumnarExecSuite.scala:300: error: too many arguments (2) for method stringToDate: (s: org.apache.spark.unsafe.types.UTF8String)Option[Int]
[ERROR] DateTimeUtils.stringToDate(UTF8String.fromString("1970-1-1"), defaultZoneId).get))))
[ERROR] ^
[ERROR] /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/src/test/scala/com/intel/oap/execution/ArrowRowToColumnarExecSuite.scala:311: error: too many arguments (2) for method stringToDate: (s: org.apache.spark.unsafe.types.UTF8String)Option[Int]
[ERROR] assert(DateTimeUtils.stringToDate(UTF8String.fromString("1970-1-1"), defaultZoneId).get ==
[ERROR] ^
[ERROR] /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/src/test/scala/com/intel/oap/execution/ArrowRowToColumnarExecSuite.scala:313: error: too many arguments (2) for method stringToDate: (s: org.apache.spark.unsafe.types.UTF8String)Option[Int]
[ERROR] assert(DateTimeUtils.stringToDate(UTF8String.fromString("1970-1-1"), defaultZoneId).get ==
[ERROR] ^
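The stringToDate failures above come from an API change: per the compiler message, `DateTimeUtils.stringToDate` in Spark 3.2 accepts a single `UTF8String` argument, while the test suite still passes a `ZoneId` as in Spark 3.1. The sketch below is a minimal, Spark-free illustration of the same "days since epoch" computation; the `StringToDateSketch` object and `daysSinceEpoch` helper are hypothetical names, not part of Spark.

```scala
import java.time.LocalDate
import java.time.format.DateTimeFormatter

object StringToDateSketch {
  // Accepts non-padded dates such as "1970-1-1", similar to the test input above.
  private val fmt = DateTimeFormatter.ofPattern("yyyy-M-d")

  // Spark 3.1.x signature (what the suite calls):
  //   DateTimeUtils.stringToDate(s: UTF8String, zoneId: ZoneId): Option[Int]
  // Spark 3.2.x signature (per the compiler error):
  //   DateTimeUtils.stringToDate(s: UTF8String): Option[Int]
  // Either way, the result is the number of days since the Unix epoch.
  def daysSinceEpoch(s: String): Option[Int] =
    try Some(LocalDate.parse(s, fmt).toEpochDay.toInt)
    catch { case _: Exception => None }

  def main(args: Array[String]): Unit = {
    // "1970-1-1" is the epoch itself, so the result is 0 days.
    println(daysSinceEpoch("1970-1-1")) // Some(0)
  }
}
```

In other words, dropping the `defaultZoneId` argument from the four failing call sites should satisfy the Spark 3.2 signature; the remaining errors (`writeIndexFileAndCommit`, `InMemoryPartitionTableCatalog`) are separate Spark 3.2 test-API incompatibilities.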
[ERROR] /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/src/test/scala/org/apache/spark/shuffle/ColumnarShuffleWriterSuite.scala:113: error: value writeIndexFileAndCommit is not a member of org.apache.spark.shuffle.IndexShuffleBlockResolver
[ERROR] possible cause: maybe a semicolon is missing before `value writeIndexFileAndCommit'?
[ERROR] .writeIndexFileAndCommit(anyInt, anyLong, any(classOf[Array[Long]]), any(classOf[File]))
[ERROR] ^
[ERROR] /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/src/test/scala/org/apache/spark/sql/CharVarcharTestSuite.scala:23: error: object InMemoryPartitionTableCatalog is not a member of package org.apache.spark.sql.connector
[ERROR] import org.apache.spark.sql.connector.{InMemoryPartitionTableCatalog, SchemaRequiredDataSource}
[ERROR] ^
[ERROR] /home/legion/gazelle_workspace/gazelle_plugin/native-sql-engine/core/src/test/scala/org/apache/spark/sql/CharVarcharTestSuite.scala:847: error: not found: type InMemoryPartitionTableCatalog
[ERROR] .set("spark.sql.catalog.testcat", classOf[InMemoryPartitionTableCatalog].getName)
To Reproduce
mvn -Phadoop-3.2,spark-3.2,full-scala-compiler package -Dbuild_arrow=ON -Dbuild_protobuf=ON -Dbuild_jemalloc=ON -DfailIfNoTests=false -DargLine="-Dspark.test.home=$spark_home" -Dexec.skip=true -Dmaven.test.failure.ignore=true
Expected behavior
All tests should compile and run.