fix test - have to use streaming relation to avoid EventTimeWatermark… #470
build_main.yml
on: push

Jobs:
- Run / Check changes (30s)
- Run / Breaking change detection with Buf (branch-3.5) (50s)
- Run / Run TPC-DS queries with SF=1 (47m 7s)
- Run / Run Docker integration tests (31m 44s)
- Run / Run Spark on Kubernetes Integration test (52m 49s)
- Matrix: Run / build
- Matrix: Run / java-other-versions
- Run / Build modules: sparkr (26m 0s)
- Run / Linters, licenses, dependencies and documentation generation (1h 16m)
- Matrix: Run / pyspark
Annotations
16 errors and 2 warnings
Run / Run Docker integration tests:
  Process completed with exit code 18.
Run / Run Spark on Kubernetes Integration test:
  HashSet() did not contain "decomtest-230bcd8bfb787c73-exec-1".
Run / Run Spark on Kubernetes Integration test:
  HashSet() did not contain "decomtest-f02a938bfb795bab-exec-1".
Run / Run Spark on Kubernetes Integration test:
  sleep interrupted
Run / Run Spark on Kubernetes Integration test:
  sleep interrupted
Run / Run Spark on Kubernetes Integration test:
  Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$683/0x00007f94785c8228@5e0f51e rejected from java.util.concurrent.ThreadPoolExecutor@5bebe4a3[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 299]
Run / Run Spark on Kubernetes Integration test:
  Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$683/0x00007f94785c8228@63fae1c9 rejected from java.util.concurrent.ThreadPoolExecutor@5bebe4a3[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 300]
Run / Run Spark on Kubernetes Integration test:
  HashSet() did not contain "decomtest-13d33e8bfb8a2c16-exec-1".
Run / Run Spark on Kubernetes Integration test:
  HashSet() did not contain "decomtest-fe6bfa8bfb8b0943-exec-1".
Run / Run Spark on Kubernetes Integration test:
  HashSet() did not contain "decomtest-8687218bfb8ea2cc-exec-1".
Run / Run Spark on Kubernetes Integration test:
  Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-97c641dc23454be5b0a397b09c6fa6af-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-97c641dc23454be5b0a397b09c6fa6af-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
Run / Build modules: pyspark-sql, pyspark-resource, pyspark-testing:
  Process completed with exit code 19.
OracleIntegrationSuite.(It is not a test it is a sbt.testing.SuiteSelector):
  OracleIntegrationSuite#L1
  org.scalatest.exceptions.TestFailedDueToTimeoutException: The code passed to eventually never returned normally. Attempted 418 times over 7.00294288005 minutes. Last failure message: ORA-12541: Cannot connect. No listener at host 10.1.0.64 port 41521. (CONNECTION_ID=n/7CJHzZQcWSGkeUQONQlw==)
  https://docs.oracle.com/error-help/db/ora-12541/.
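The `TestFailedDueToTimeoutException` above comes from ScalaTest's `eventually` helper, which re-runs a block of code until it succeeds or a timeout elapses; here it polled the Oracle container 418 times over ~7 minutes before giving up. A minimal Python analogue of that polling loop (a hypothetical helper for illustration, not Spark or ScalaTest code):

```python
import time

def eventually(condition, timeout=10.0, interval=0.1):
    """Retry `condition` until it returns truthy or `timeout` seconds pass.

    Mirrors the shape of ScalaTest's `eventually`: keep attempting, remember
    the last failure, and raise once the deadline is exceeded.
    """
    deadline = time.monotonic() + timeout
    attempts = 0
    last_error = None
    while time.monotonic() < deadline:
        attempts += 1
        try:
            result = condition()
            if result:
                return result, attempts
        except Exception as exc:  # remember the last failure, keep retrying
            last_error = exc
        time.sleep(interval)
    raise TimeoutError(
        f"condition never succeeded in {attempts} attempts; last failure: {last_error}"
    )

# A condition that fails twice, then succeeds on the third attempt:
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    return state["calls"] >= 3

result, attempts = eventually(flaky, timeout=5.0, interval=0.01)
print(attempts)  # 3
```

In the failing suite the condition never became true because the Oracle listener at 10.1.0.64:41521 never came up, so the deadline path was taken.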
python/pyspark/sql/tests/pandas/test_pandas_map.py.test_self_join:
  python/pyspark/sql/tests/pandas/test_pandas_map.py#L1
  [Errno 111] Connection refused
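`Errno 111` is `ECONNREFUSED` on Linux: the test tried to connect to a socket with no process listening, the same class of failure as the ORA-12541 "no listener" error above. A minimal reproduction, assuming nothing is listening on the chosen port:

```python
import socket

def probe(host, port):
    """Attempt a TCP connection; report whether it was refused.

    Connecting to a port with no listener fails fast with
    ConnectionRefusedError (errno 111 on Linux, 61 on macOS).
    """
    try:
        with socket.create_connection((host, port), timeout=1.0):
            return "connected"
    except ConnectionRefusedError as exc:
        return f"refused (errno {exc.errno})"

# Port 1 (tcpmux) is almost never bound on a test machine.
print(probe("127.0.0.1", 1))
```

In the CI failure the refused endpoint was presumably the Spark driver or a worker-side socket that had already shut down after the earlier job failures.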
Warnings:
Run / Build modules: pyspark-errors:
  No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
Run / Build modules: pyspark-core, pyspark-streaming:
  No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
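These two warnings are emitted by `actions/upload-artifact` when its `path` glob matches nothing, typically because the job failed before any test reports were written. The step's reaction is configurable through the `if-no-files-found` input; a hedged sketch, assuming the workflow uses `actions/upload-artifact` (step name and version are illustrative):

```yaml
- name: Upload test reports
  uses: actions/upload-artifact@v4
  with:
    name: test-reports
    path: "**/target/test-reports/*.xml"
    # warn (default, produces the annotations above) | error | ignore
    if-no-files-found: warn
```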
Artifacts
Produced during runtime

Name | Status | Size
---|---|---
site | Expired | 60.4 MB
test-results-api, catalyst, hive-thriftserver--17-hadoop3-hive2.3 | Expired | 2.81 MB
test-results-core, unsafe, kvstore, avro, utils, network-common, network-shuffle, repl, launcher, examples, sketch--17-hadoop3-hive2.3 | Expired | 133 KB
test-results-docker-integration--17-hadoop3-hive2.3 | Expired | 121 KB
test-results-hive-- other tests-17-hadoop3-hive2.3 | Expired | 922 KB
test-results-hive-- slow tests-17-hadoop3-hive2.3 | Expired | 857 KB
test-results-mllib-local, mllib, graphx--17-hadoop3-hive2.3 | Expired | 1.32 MB
test-results-pyspark-connect--17-hadoop3-hive2.3 | Expired | 411 KB
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3 | Expired | 1.1 MB
test-results-pyspark-pandas--17-hadoop3-hive2.3 | Expired | 1.46 MB
test-results-pyspark-pandas-connect-part0--17-hadoop3-hive2.3 | Expired | 1.32 MB
test-results-pyspark-pandas-connect-part1--17-hadoop3-hive2.3 | Expired | 1.42 MB
test-results-pyspark-pandas-connect-part2--17-hadoop3-hive2.3 | Expired | 953 KB
test-results-pyspark-pandas-connect-part3--17-hadoop3-hive2.3 | Expired | 530 KB
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3 | Expired | 2.86 MB
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3 | Expired | 42.4 KB
test-results-sparkr--17-hadoop3-hive2.3 | Expired | 280 KB
test-results-sql-- extended tests-17-hadoop3-hive2.3 | Expired | 3.01 MB
test-results-sql-- other tests-17-hadoop3-hive2.3 | Expired | 4.32 MB
test-results-sql-- slow tests-17-hadoop3-hive2.3 | Expired | 2.81 MB
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 | Expired | 791 KB
test-results-tpcds--17-hadoop3-hive2.3 | Expired | 21.8 KB
unit-tests-log-docker-integration--17-hadoop3-hive2.3 | Expired | 3.61 MB
unit-tests-log-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3 | Expired | 369 MB