This repository has been archived by the owner on Sep 18, 2023. It is now read-only.

concat_ws #984

Closed
jackylee-ch opened this issue Jun 21, 2022 · 1 comment
Labels
bug Something isn't working

Comments

@jackylee-ch
Contributor

Describe the bug
The problematic SQL: select concat_ws('', col_str) from table.
The resulting stack trace:

Caused by: java.util.NoSuchElementException: next on empty iterator
	at scala.collection.Iterator$$anon$2.next(Iterator.scala:41)
	at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)
	at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)
	at com.intel.oap.expression.ColumnarConcatWs.doColumnarCodeGen(ColumnarConcatOperator.scala:58)
	at com.intel.oap.expression.ColumnarRegExpExtract.doColumnarCodeGen(ColumnarTernaryOperator.scala:195)
	at com.intel.oap.expression.ColumnarAlias.doColumnarCodeGen(ColumnarNamedExpressions.scala:39)
	at com.intel.oap.expression.ColumnarConditionProjector$.$anonfun$init$3(ColumnarConditionProjector.scala:371)
	at scala.collection.immutable.List.map(List.scala:293)
	at com.intel.oap.expression.ColumnarConditionProjector$.init(ColumnarConditionProjector.scala:369)
	at com.intel.oap.expression.ColumnarConditionProjector$.create(ColumnarConditionProjector.scala:461)
	at com.intel.oap.execution.ColumnarConditionProjectExec.$anonfun$doExecuteColumnar$1(ColumnarBasicPhysicalOperators.scala:283)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2(RDD.scala:863)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2$adapted(RDD.scala:863)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:131)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:510)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1491)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:513)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
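The trace ends in `next on empty iterator` inside `ColumnarConcatWs.doColumnarCodeGen`, which suggests the columnar codegen pulls more operands from the child-expression iterator than a single-column `concat_ws` provides. The sketch below is a hypothetical Java reduction of that pattern, not the plugin's actual code: `buggyCodeGen` and `safeCodeGen` are illustrative names, and the guarded variant simply passes a lone column through, matching Spark's `concat_ws` semantics.

```java
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;

public class ConcatWsSketch {
    // Hypothetical reduction of the suspected failure mode: the codegen
    // assumes at least two value operands and exhausts the iterator when
    // concat_ws is called with a single column.
    static String buggyCodeGen(List<String> cols) {
        Iterator<String> it = cols.iterator();
        String left = it.next();
        String right = it.next(); // throws NoSuchElementException for one column
        return "concat(" + left + ", " + right + ")";
    }

    // Guarded variant: a single column is emitted as-is instead of being
    // forced through a two-operand concat.
    static String safeCodeGen(String sep, List<String> cols) {
        if (cols.size() == 1) {
            return cols.get(0);
        }
        return "concat_ws('" + sep + "', " + String.join(", ", cols) + ")";
    }

    public static void main(String[] args) {
        try {
            buggyCodeGen(List.of("col_str"));
        } catch (NoSuchElementException e) {
            System.out.println("buggy path: " + e.getClass().getSimpleName());
        }
        System.out.println("safe path: " + safeCodeGen("", List.of("col_str")));
    }
}
```

Under this reading, the fix would be a child-count check before consuming the iterator, rather than a change to the expression itself.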

To Reproduce

create table test(a string) using arrow;
insert overwrite test values("0");
select concat_ws('', a) from test;

Expected behavior
The query should return 0.
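For reference, Spark SQL's `concat_ws` joins its arguments with the separator, so with a single column the separator is irrelevant and the value passes through unchanged. A plain-Java analogue (using `String.join`, which has the same single-element behavior) illustrates why the expected result for the repro row is "0":

```java
import java.util.List;

public class ConcatWsSemantics {
    public static void main(String[] args) {
        // Joining a one-element list yields the element itself,
        // regardless of separator: concat_ws('', a) on row "0" -> "0".
        String result = String.join("", List.of("0"));
        System.out.println(result); // prints 0
    }
}
```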

Additional context
None

@jackylee-ch jackylee-ch added the bug Something isn't working label Jun 21, 2022

jackylee-ch commented Jun 21, 2022

We also hit a java.lang.UnsupportedOperationException with the SQL below.

select concat_ws(',', aa) as test from test_wolong_table group by test;

The stack trace:

java.lang.UnsupportedOperationException:  --> class org.apache.spark.sql.catalyst.expressions.ConcatWs | concat_ws(,, aa#551, da) is not currently supported.
	at com.intel.oap.expression.ColumnarExpressionConverter$.containsSubquery(ColumnarExpressionConverter.scala:579)
	at org.apache.spark.sql.execution.ColumnarCollapseCodegenStages.$anonfun$containsSubquery$1(ColumnarCollapseCodegenStages.scala:137)
	at org.apache.spark.sql.execution.ColumnarCollapseCodegenStages.$anonfun$containsSubquery$1$adapted(ColumnarCollapseCodegenStages.scala:137)
	at scala.collection.immutable.List.map(List.scala:293)
	at org.apache.spark.sql.execution.ColumnarCollapseCodegenStages.containsSubquery(ColumnarCollapseCodegenStages.scala:137)
	at org.apache.spark.sql.execution.ColumnarCollapseCodegenStages.existsJoins(ColumnarCollapseCodegenStages.scala:156)
	at org.apache.spark.sql.execution.ColumnarCollapseCodegenStages.$anonfun$existsJoins$5(ColumnarCollapseCodegenStages.scala:154)
	at org.apache.spark.sql.execution.ColumnarCollapseCodegenStages.$anonfun$existsJoins$5$adapted(ColumnarCollapseCodegenStages.scala:154)
	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
	at scala.collection.Iterator.foreach(Iterator.scala:943)
	at scala.collection.Iterator.foreach$(Iterator.scala:943)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
