HiveWarehouseDataSourceReader: Unable to read table schema #285

Open
jjhangu opened this issue Jun 10, 2020 · 0 comments
jjhangu commented Jun 10, 2020

I'm using hive-warehouse-connector_2.11-1.0.0.3.1.5.49-1.jar
Spark version: 2.3.2
Hive version: 3.1.0.3.1.0.0-78

show databases: OK
create table: OK
select from table: error occurred
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10011]: Invalid function get_llap_splits

I attached the full log below. Could you tell me why this happens? I searched Google for a few hours but couldn't find an exact answer.

Application code:

HiveWarehouseSessionImpl hiveWarehouseSession = HiveWarehouseBuilder.session(spark).build();
hiveWarehouseSession.setDatabase("dbname");
hiveWarehouseSession.showTables().show(100, false);                             // works
hiveWarehouseSession.executeQuery("select * from orc_test1").show(100, false);  // fails here
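For context, `get_llap_splits` is a UDF that is only registered on the LLAP-enabled interactive HiveServer2, so this error can appear when the HWC JDBC URL points at a regular (non-interactive) HiveServer2. Below is a sketch of the spark-submit settings HWC typically needs on HDP 3.x; all host names, the port, the ZooKeeper quorum, and the `@llap0` app name are placeholders, not values from this cluster -- substitute the values from your own Hive configuration:

```shell
# Hypothetical spark-submit for the test class above. Placeholder values are
# marked; take the real ones from your cluster's Hive interactive configs.
spark-submit \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.1.5.49-1.jar \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://hs2-interactive-host:10500/" \
  --conf spark.datasource.hive.warehouse.metastoreUri="thrift://metastore-host:9083" \
  --conf spark.datasource.hive.warehouse.load.staging.dir="/tmp" \
  --conf spark.hadoop.hive.llap.daemon.service.hosts="@llap0" \
  --conf spark.hadoop.hive.zookeeper.quorum="zk1:2181,zk2:2181,zk3:2181" \
  --class com.naver.mercury.sample.sparkjob.SparkHiveWarehouseConnectorTest \
  app.jar
```

The key line is `spark.sql.hive.hiveserver2.jdbc.url`: it must target the interactive HiveServer2 (port 10500 by default on HDP), not the batch HiveServer2 on 10000, since only the interactive instance running against LLAP knows the `get_llap_splits` function.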
20/06/10 14:05:17 ERROR LlapBaseInputFormat: Closing connection due to error
shadehive.org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10011]: Invalid function get_llap_splits
        at shadehive.org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:300)
        at shadehive.org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:286)
        at shadehive.org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:324)
        at shadehive.org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:265)
        at shadehive.org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:497)
        at org.apache.hadoop.hive.llap.LlapBaseInputFormat.getSplits(LlapBaseInputFormat.java:310)
        at com.hortonworks.spark.sql.hive.llap.readers.HiveWarehouseDataSourceReader.getTableSchema(HiveWarehouseDataSourceReader.java:152)
        at com.hortonworks.spark.sql.hive.llap.readers.HiveWarehouseDataSourceReader.readSchema(HiveWarehouseDataSourceReader.java:166)
        at com.hortonworks.spark.sql.hive.llap.readers.HiveWarehouseDataSourceReader.shouldEnableBatchRead(HiveWarehouseDataSourceReader.java:97)
        at com.hortonworks.spark.sql.hive.llap.readers.HiveWarehouseDataSourceReader.<init>(HiveWarehouseDataSourceReader.java:86)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseConnector.getDataSourceReader(HiveWarehouseConnector.java:92)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseConnector.createReader(HiveWarehouseConnector.java:51)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:206)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeQueryInternal(HiveWarehouseSessionImpl.java:134)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeSmart(HiveWarehouseSessionImpl.java:190)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeQuery(HiveWarehouseSessionImpl.java:116)
        at com.naver.mercury.sample.sparkjob.SparkHiveWarehouseConnectorTest.start(SparkHiveWarehouseConnectorTest.java:44)
        at com.naver.mercury.sample.sparkjob.SparkHiveWarehouseConnectorTest.main(SparkHiveWarehouseConnectorTest.java:23)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10011]: Invalid function get_llap_splits
        at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:335)
        at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:199)
        at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:262)
        at org.apache.hive.service.cli.operation.Operation.run(Operation.java:247)
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:575)
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:561)
        at sun.reflect.GeneratedMethodAccessor58.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
        at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
        at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
        at com.sun.proxy.$Proxy71.executeStatementAsync(Unknown Source)
        at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:315)
        at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:566)
        at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1557)
        at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1542)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
        at org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:647)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hive.ql.parse.SemanticException: Invalid function get_llap_splits
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1GetAllAggregations(SemanticAnalyzer.java:912)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1GetAggregationsFromSelect(SemanticAnalyzer.java:645)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1(SemanticAnalyzer.java:1562)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1(SemanticAnalyzer.java:1868)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1(SemanticAnalyzer.java:1868)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:12155)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12257)
        at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:360)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:289)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:664)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1869)
        at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1816)
        at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1811)
        at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:126)
        at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:197)
        ... 26 more
20/06/10 14:05:17 ERROR HiveWarehouseDataSourceReader: Unable to read table schema
Exception in thread "main" java.lang.RuntimeException: java.io.IOException: shadehive.org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10011]: Invalid function get_llap_splits
        at com.hortonworks.spark.sql.hive.llap.readers.HiveWarehouseDataSourceReader.readSchema(HiveWarehouseDataSourceReader.java:172)
        at com.hortonworks.spark.sql.hive.llap.readers.HiveWarehouseDataSourceReader.shouldEnableBatchRead(HiveWarehouseDataSourceReader.java:97)
        at com.hortonworks.spark.sql.hive.llap.readers.HiveWarehouseDataSourceReader.<init>(HiveWarehouseDataSourceReader.java:86)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseConnector.getDataSourceReader(HiveWarehouseConnector.java:92)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseConnector.createReader(HiveWarehouseConnector.java:51)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:206)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeQueryInternal(HiveWarehouseSessionImpl.java:134)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeSmart(HiveWarehouseSessionImpl.java:190)
        at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeQuery(HiveWarehouseSessionImpl.java:116)
        at com.naver.mercury.sample.sparkjob.SparkHiveWarehouseConnectorTest.start(SparkHiveWarehouseConnectorTest.java:44)
        at com.naver.mercury.sample.sparkjob.SparkHiveWarehouseConnectorTest.main(SparkHiveWarehouseConnectorTest.java:23)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: shadehive.org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10011]: Invalid function get_llap_splits
        at org.apache.hadoop.hive.llap.LlapBaseInputFormat.getSplits(LlapBaseInputFormat.java:350)
        at com.hortonworks.spark.sql.hive.llap.readers.HiveWarehouseDataSourceReader.getTableSchema(HiveWarehouseDataSourceReader.java:152)
        at com.hortonworks.spark.sql.hive.llap.readers.HiveWarehouseDataSourceReader.readSchema(HiveWarehouseDataSourceReader.java:166)
        ... 21 more
Caused by: shadehive.org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10011]: Invalid function get_llap_splits
        at shadehive.org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:300)
        at shadehive.org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:286)
        at shadehive.org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:324)
        at shadehive.org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:265)
        at shadehive.org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:497)
        at org.apache.hadoop.hive.llap.LlapBaseInputFormat.getSplits(LlapBaseInputFormat.java:310)
        ... 23 more
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10011]: Invalid function get_llap_splits
        at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:335)
        at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:199)
        at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:262)
        at org.apache.hive.service.cli.operation.Operation.run(Operation.java:247)
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:575)
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:561)
        at sun.reflect.GeneratedMethodAccessor58.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
        at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
        at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
        at com.sun.proxy.$Proxy71.executeStatementAsync(Unknown Source)
        at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:315)
        at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:566)
        at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1557)
        at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1542)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
        at org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:647)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hive.ql.parse.SemanticException: Invalid function get_llap_splits
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1GetAllAggregations(SemanticAnalyzer.java:912)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1GetAggregationsFromSelect(SemanticAnalyzer.java:645)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1(SemanticAnalyzer.java:1562)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1(SemanticAnalyzer.java:1868)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.doPhase1(SemanticAnalyzer.java:1868)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:12155)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12257)
        at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:360)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:289)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:664)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1869)
        at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1816)
        at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1811)
        at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:126)
        at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:197)
        ... 26 more
20/06/10 14:05:17 INFO SparkContext: Invoking stop() from shutdown hook
