SPARK-46840: added org.apache.spark.sql.execution.benchmark.Collation… #8
build_main.yml
on: push
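For context on what this run is exercising: the sketch below is not the code added by the PR, just a minimal, hypothetical illustration of a collation micro-benchmark built on Spark's test-scope benchmark utilities (org.apache.spark.benchmark.Benchmark and BenchmarkBase). The object name, the generated data, and the Collator-based case are all illustrative assumptions, not the PR's actual benchmark.

```scala
// Hypothetical sketch only; not the CollationBenchmark added by SPARK-46840.
// Assumes Spark's test-scope benchmark utilities are on the classpath.
package org.apache.spark.sql.execution.benchmark

import java.text.Collator
import java.util.Locale

import org.apache.spark.benchmark.{Benchmark, BenchmarkBase}

object CollationBenchmarkSketch extends BenchmarkBase {

  override def runBenchmarkSuite(mainArgs: Array[String]): Unit = {
    val numPairs = 100000
    // Generate string pairs that differ only by case.
    val data = (0 until numPairs).map(i => (s"value_$i", s"VALUE_$i"))

    runBenchmark("string comparison under different collation styles") {
      val benchmark = new Benchmark("collation compare", numPairs, output = output)

      // Plain code-point comparison, analogous to a binary collation.
      benchmark.addCase("binary compareTo") { _ =>
        data.foreach { case (a, b) => a.compareTo(b) }
      }

      // Locale-aware, case-insensitive comparison via java.text.Collator.
      benchmark.addCase("case-insensitive Collator") { _ =>
        val collator = Collator.getInstance(Locale.ROOT)
        collator.setStrength(Collator.SECONDARY)
        data.foreach { case (a, b) => collator.compare(a, b) }
      }

      benchmark.run()
    }
  }
}
```

Spark benchmarks of this kind are typically run through the build (for example via build/sbt Test/runMain with SPARK_GENERATE_BENCHMARK_FILES=1) so that the checked-in result files are regenerated alongside the benchmark class.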
Jobs

Job | Duration
---|---
Run / Check changes | 35s
Run / Protobuf breaking change detection and Python CodeGen check | 1m 18s
Run / Run TPC-DS queries with SF=1 | 46m 11s
Run / Run Docker integration tests | 40m 35s
Run / Run Spark on Kubernetes Integration test | 56m 35s
Run / Run Spark UI tests | 20s
Matrix: Run / build | -
Matrix: Run / maven-build | -
Run / Build modules: sparkr | 28m 24s
Run / Linters, licenses, dependencies and documentation generation | 55m 27s
Matrix: Run / pyspark | -
Annotations
11 errors and 2 warnings
Errors:
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-d365c88e4f320de4-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-1924ae8e4f32f471-exec-1".
- Run / Run Spark on Kubernetes Integration test: sleep interrupted
- Run / Run Spark on Kubernetes Integration test: sleep interrupted
- Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$695/0x00007284205c08e8@3dc7395a rejected from java.util.concurrent.ThreadPoolExecutor@4d4b5a5f[Shutting down, pool size = 4, active threads = 2, queued tasks = 0, completed tasks = 394]
- Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$695/0x00007284205c08e8@d8658e7 rejected from java.util.concurrent.ThreadPoolExecutor@4d4b5a5f[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 395]
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-5c23cf8e4f44d852-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-0031428e4f45b890-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-9639728e4f495de2-exec-1".
- Run / Run Spark on Kubernetes Integration test: Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-a90c42ff15f44d82a479ffc0e2c7b0e5-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-a90c42ff15f44d82a479ffc0e2c7b0e5-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={}).
- Run / Build modules: pyspark-connect: Process completed with exit code 19.

Warnings:
- Run / Protobuf breaking change detection and Python CodeGen check: Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: bufbuild/buf-lint-action@v1, bufbuild/buf-breaking-action@v1. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming: No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
Artifacts
Produced during runtime
Name | Size
---|---
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3-python3.9 (Expired) | 175 KB