
[DO NOT MERGE] Run all PostCommit and PreCommit Tests against Release Branch #3489
Triggered via pull request November 13, 2024 22:18
@damccorm
synchronize #33101
Status: Success
Total duration: 28m 56s
Artifacts

beam_PostCommit_Python_MongoDBIO_IT.yml

on: pull_request_target
Matrix: beam_PostCommit_Python_MongoDBIO_IT

Annotations

33 errors and 43 warnings
Error processing result file: sdks/java/io/cdap/build/test-results/test/TEST-org.apache.beam.sdk.io.cdap.CdapIOTest.xml#L0
CData section too big found, line 31422, column 46 (TEST-org.apache.beam.sdk.io.cdap.CdapIOTest.xml, line 31422)
Error processing result file: sdks/java/io/jdbc/build/test-results/integrationTest/TEST-org.apache.beam.sdk.io.jdbc.JdbcIOAutoPartitioningIT.xml#L0
CData section too big found, line 50946, column 11 (TEST-org.apache.beam.sdk.io.jdbc.JdbcIOAutoPartitioningIT.xml, line 50946)
Error processing result file: runners/samza/build/test-results/validatesRunner/TEST-org.apache.beam.sdk.transforms.ViewTest.xml#L0
Memory allocation failed : Huge input lookup, line 62637, column 88 (TEST-org.apache.beam.sdk.transforms.ViewTest.xml, line 62637)
Error processing result file: runners/samza/build/test-results/validatesRunner/TEST-org.apache.beam.sdk.io.TFRecordIOTest.xml#L0
CData section too big found, line 1797, column 9688374 (TEST-org.apache.beam.sdk.io.TFRecordIOTest.xml, line 1797)
Error processing result file: runners/samza/build/test-results/validatesRunner/TEST-org.apache.beam.sdk.io.TextIOReadTest$CompressedReadTest.xml#L0
Memory allocation failed : Huge input lookup, line 58054, column 259 (TEST-org.apache.beam.sdk.io.TextIOReadTest$CompressedReadTest.xml, line 58054)
Error processing result file: runners/samza/build/test-results/validatesRunner/TEST-org.apache.beam.sdk.testing.PAssertTest.xml#L0
Memory allocation failed : Huge input lookup, line 97211, column 68 (TEST-org.apache.beam.sdk.testing.PAssertTest.xml, line 97211)
Error processing result file: sdks/java/io/cdap/build/test-results/test/TEST-org.apache.beam.sdk.io.cdap.CdapIOTest.xml#L0
CData section too big found, line 25849, column 9091 (TEST-org.apache.beam.sdk.io.cdap.CdapIOTest.xml, line 25849)
Error processing result file: runners/samza/job-server/build/test-results/validatesPortableRunnerEmbedded/TEST-org.apache.beam.sdk.transforms.ViewTest.xml#L0
Memory allocation failed : Huge input lookup, line 47947, column 90 (TEST-org.apache.beam.sdk.transforms.ViewTest.xml, line 47947)
Error processing result file: runners/flink/1.19/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.ViewTest.xml#L0
Memory allocation failed : Huge input lookup, line 101750, column 34 (TEST-org.apache.beam.sdk.transforms.ViewTest.xml, line 101750)
Error processing result file: sdks/java/io/kafka/kafka-312/build/test-results/kafkaVersion312BatchIT/TEST-org.apache.beam.sdk.io.kafka.KafkaIOIT.xml#L0
CData section too big found, line 95731, column 99 (TEST-org.apache.beam.sdk.io.kafka.KafkaIOIT.xml, line 95731)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.metrics.MetricsTest$AttemptedMetricTests.xml#L0
CData section too big found, line 67640, column 97 (TEST-org.apache.beam.sdk.metrics.MetricsTest$AttemptedMetricTests.xml, line 67640)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.PipelineTest.xml#L0
CData section too big found, line 64984, column 118 (TEST-org.apache.beam.sdk.PipelineTest.xml, line 64984)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.ParDoTest$TimestampTests.xml#L0
CData section too big found, line 64704, column 325 (TEST-org.apache.beam.sdk.transforms.ParDoTest$TimestampTests.xml, line 64704)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.CombineTest$WindowingTests.xml#L0
CData section too big found, line 62500, column 24 (TEST-org.apache.beam.sdk.transforms.CombineTest$WindowingTests.xml, line 62500)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.GroupByKeyTest$BasicTests.xml#L0
CData section too big found, line 63739, column 9 (TEST-org.apache.beam.sdk.transforms.GroupByKeyTest$BasicTests.xml, line 63739)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.KeysTest.xml#L0
CData section too big found, line 65140, column 31 (TEST-org.apache.beam.sdk.transforms.KeysTest.xml, line 65140)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.ParDoSchemaTest.xml#L0
CData section too big found, line 66068, column 55 (TEST-org.apache.beam.sdk.transforms.ParDoSchemaTest.xml, line 66068)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.ParDoTest$MultipleInputsAndOutputTests.xml#L0
CData section too big found, line 63415, column 212 (TEST-org.apache.beam.sdk.transforms.ParDoTest$MultipleInputsAndOutputTests.xml, line 63415)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.WithTimestampsTest.xml#L0
CData section too big found, line 63450, column 155 (TEST-org.apache.beam.sdk.transforms.WithTimestampsTest.xml, line 63450)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.ParDoTest$BasicTests.xml#L0
CData section too big found, line 64714, column 164 (TEST-org.apache.beam.sdk.transforms.ParDoTest$BasicTests.xml, line 64714)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.windowing.WindowTest.xml#L0
CData section too big found, line 64098, column 12 (TEST-org.apache.beam.sdk.transforms.windowing.WindowTest.xml, line 64098)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.ParDoTest$LifecycleTests.xml#L0
CData section too big found, line 63448, column 108 (TEST-org.apache.beam.sdk.transforms.ParDoTest$LifecycleTests.xml, line 63448)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.ReifyTimestampsTest.xml#L0
CData section too big found, line 64610, column 100 (TEST-org.apache.beam.sdk.transforms.ReifyTimestampsTest.xml, line 64610)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.CreateTest.xml#L0
CData section too big found, line 64809, column 80 (TEST-org.apache.beam.sdk.transforms.CreateTest.xml, line 64809)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.values.PCollectionTupleTest.xml#L0
CData section too big found, line 64584, column 11 (TEST-org.apache.beam.sdk.values.PCollectionTupleTest.xml, line 64584)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.CombineTest$BasicTests.xml#L0
CData section too big found, line 63240, column 84 (TEST-org.apache.beam.sdk.transforms.CombineTest$BasicTests.xml, line 63240)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.ParDoLifecycleTest.xml#L0
CData section too big found, line 68970, column 161 (TEST-org.apache.beam.sdk.transforms.ParDoLifecycleTest.xml, line 68970)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.RedistributeTest.xml#L0
CData section too big found, line 64988, column 125 (TEST-org.apache.beam.sdk.transforms.RedistributeTest.xml, line 64988)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.FlattenTest.xml#L0
CData section too big found, line 64428, column 179 (TEST-org.apache.beam.sdk.transforms.FlattenTest.xml, line 64428)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.values.PCollectionRowTupleTest.xml#L0
CData section too big found, line 64346, column 41 (TEST-org.apache.beam.sdk.values.PCollectionRowTupleTest.xml, line 64346)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.testing.PAssertTest.xml#L0
CData section too big found, line 64760, column 27 (TEST-org.apache.beam.sdk.testing.PAssertTest.xml, line 64760)
Error processing result file: runners/spark/3/job-server/build/test-results/validatesPortableRunnerStreaming/TEST-org.apache.beam.sdk.transforms.ReshuffleTest.xml#L0
CData section too big found, line 63575, column 121 (TEST-org.apache.beam.sdk.transforms.ReshuffleTest.xml, line 63575)
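The "CData section too big found" and "Memory allocation failed : Huge input lookup" messages above are libxml2 parse errors: the test-result XML files contain text/CDATA nodes larger than libxml2's default safety limits, so the annotation step cannot read them at all. A sketch of the usual mitigation in report-processing tooling, using lxml's `huge_tree` option (which maps to libxml2's `XML_PARSE_HUGE`); the function name and file path here are illustrative, not part of the workflow:

```python
# Sketch: parse a JUnit-style result file whose CDATA sections exceed
# libxml2's default limits. huge_tree=True lifts those limits (at the
# cost of the memory-safety guard they provide), so oversized
# <system-out> captures no longer abort the whole file.
from lxml import etree


def parse_junit_report(path):
    """Parse a JUnit XML report, tolerating oversized text/CDATA nodes."""
    parser = etree.XMLParser(huge_tree=True)  # libxml2 XML_PARSE_HUGE
    return etree.parse(path, parser)
```

Alternatively, the producing side can truncate captured stdout/stderr before writing the report, which keeps the files parseable by default-configured readers.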
testBigtableWriteAndRead (org.apache.beam.it.gcp.bigtable.BigTableIOLT) failed: org.apache.beam.it.gcp.bigtable.BigTableIOLT#L0
it/google-cloud-platform/build/test-results/BigTablePerformanceTest/TEST-org.apache.beam.it.gcp.bigtable.BigTableIOLT.xml [took 5s]
classMethod (org.apache.beam.it.gcp.bigquery.BigQueryStreamingLT) failed: org.apache.beam.it.gcp.bigquery.BigQueryStreamingLT#L0
it/google-cloud-platform/build/test-results/BigQueryStorageApiStreamingPerformanceTest/TEST-org.apache.beam.it.gcp.bigquery.BigQueryStreamingLT.xml [took 0s]
testWriteBatchesToDynamicWithTimeout (org.apache.beam.sdk.io.aws2.sqs.SqsIOWriteBatchesTest) failed: org.apache.beam.sdk.io.aws2.sqs.SqsIOWriteBatchesTest#L0
sdks/java/io/amazon-web-services2/build/test-results/test/TEST-org.apache.beam.sdk.io.aws2.sqs.SqsIOWriteBatchesTest.xml [took 0s]
1 out of 11 runs failed: testWithResume (org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest): org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest#L0
runners/spark/3/build/test-results/sparkVersion331Test/TEST-org.apache.beam.runners.spark.translation.streaming.ResumeFromCheckpointStreamingTest.xml [took 32s]
testOnNewWorkerMetadata_correctlyRemovesStaleWindmillServers (org.apache.beam.runners.dataflow.worker.streaming.harness.FanOutStreamingEngineWorkerHarnessTest) failed: org.apache.beam.runners.dataflow.worker.streaming.harness.FanOutStreamingEngineWorkerHarnessTest#L0
runners/google-cloud-dataflow-java/worker/build/test-results/test/TEST-org.apache.beam.runners.dataflow.worker.streaming.harness.FanOutStreamingEngineWorkerHarnessTest.xml [took 5s]
beam_PostCommit_Python_MongoDBIO_IT (Run Python MongoDBIO_IT)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/setup-python@v4, actions/setup-java@v3, gradle/gradle-build-action@v2. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
beam_PostCommit_Python_MongoDBIO_IT (Run Python MongoDBIO_IT)
Could not find any files for **/pytest*.xml
testWindowedWordCountInStreamingStaticSharding (org.apache.beam.examples.WindowedWordCountIT) failed: org.apache.beam.examples.WindowedWordCountIT#L0
runners/google-cloud-dataflow-java/arm/build/test-results/examplesJavaRunnerV2IntegrationTestARM/TEST-org.apache.beam.examples.WindowedWordCountIT.xml [took 48s]
testWindowedWordCountInStreamingStaticSharding (org.apache.beam.examples.WindowedWordCountIT) failed: org.apache.beam.examples.WindowedWordCountIT#L0
runners/google-cloud-dataflow-java/arm/build/test-results/examplesJavaRunnerV2IntegrationTestARM/TEST-org.apache.beam.examples.WindowedWordCountIT.xml [took 56s]
testWindowedWordCountInStreamingStaticSharding (org.apache.beam.examples.WindowedWordCountIT) failed: org.apache.beam.examples.WindowedWordCountIT#L0
runners/google-cloud-dataflow-java/arm/build/test-results/examplesJavaRunnerV2IntegrationTestARM/TEST-org.apache.beam.examples.WindowedWordCountIT.xml [took 1m 1s]
testInvalidRowCaughtByBigquery[0] (org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT) failed: org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT#L0
sdks/java/io/google-cloud-platform/build/test-results/integrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.StorageApiSinkFailedRowsIT.xml [took 42s]
testAfterProcessingTimeContinuationTriggerUsingState (org.apache.beam.sdk.transforms.GroupByKeyTest$BasicTests) failed: org.apache.beam.sdk.transforms.GroupByKeyTest$BasicTests#L0
runners/flink/1.19/build/test-results/validatesRunnerStreaming/TEST-org.apache.beam.sdk.transforms.GroupByKeyTest$BasicTests.xml [took 2s]
testE2EJpms (org.apache.beam.sdk.jpmstests.JpmsIT) failed: org.apache.beam.sdk.jpmstests.JpmsIT#L0
sdks/java/testing/jpms-tests/build/test-results/dataflowRunnerIntegrationTest/TEST-org.apache.beam.sdk.jpmstests.JpmsIT.xml [took 1h 3m 44s]
test_row_coder (apache_beam.coders.coders_property_based_test.ProperyTestingCoders) failed: apache_beam.coders.coders_property_based_test.ProperyTestingCoders#L0
sdks/python/test-suites/tox/py39/build/srcs/sdks/python/pytest_py39-cloudcoverage.xml [took 1m 3s]
testE2EJpms (org.apache.beam.sdk.jpmstests.JpmsIT) failed: org.apache.beam.sdk.jpmstests.JpmsIT#L0
sdks/java/testing/jpms-tests/build/test-results/dataflowRunnerIntegrationTest/TEST-org.apache.beam.sdk.jpmstests.JpmsIT.xml [took 1h 3m 44s]
testBatchDynamicDestinations (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT) failed: org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT#L0
runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformRunnerV2IntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT.xml [took 1s]
testBatchFileLoadsWriteRead (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT) failed: org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT#L0
runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformRunnerV2IntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT.xml [took 0s]
testWriteUnboundedWithCustomBatchParameters (org.apache.beam.sdk.io.TextIOWriteTest) failed: org.apache.beam.sdk.io.TextIOWriteTest#L0
runners/direct-java/build/test-results/needsRunnerTests/TEST-org.apache.beam.sdk.io.TextIOWriteTest.xml [took 0s]
testBatchDynamicDestinations (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT) failed: org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT#L0
runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT.xml [took 1s]
testBatchFileLoadsWriteRead (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT) failed: org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT#L0
runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT.xml [took 0s]
testWindowedWordCountInStreamingStaticSharding (org.apache.beam.examples.WindowedWordCountIT) failed: org.apache.beam.examples.WindowedWordCountIT#L0
runners/google-cloud-dataflow-java/arm/build/test-results/examplesJavaRunnerV2IntegrationTestARM/TEST-org.apache.beam.examples.WindowedWordCountIT.xml [took 28s]
testE2EJpms (org.apache.beam.sdk.jpmstests.JpmsIT) failed: org.apache.beam.sdk.jpmstests.JpmsIT#L0
sdks/java/testing/jpms-tests/build/test-results/dataflowRunnerIntegrationTest/TEST-org.apache.beam.sdk.jpmstests.JpmsIT.xml [took 1h 3m 30s]
testE2EJpms (org.apache.beam.sdk.jpmstests.JpmsIT) failed: org.apache.beam.sdk.jpmstests.JpmsIT#L0
sdks/java/testing/jpms-tests/build/test-results/dataflowRunnerIntegrationTest/TEST-org.apache.beam.sdk.jpmstests.JpmsIT.xml [took 1h 3m 33s]
testBatchDynamicDestinations (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT) failed: org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT#L0
runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformRunnerV2IntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT.xml [took 1s]
testBatchFileLoadsWriteRead (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT) failed: org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT#L0
runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformRunnerV2IntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT.xml [took 0s]
testBatchDynamicDestinations (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT) failed: org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT#L0
runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT.xml [took 1s]
testBatchFileLoadsWriteRead (org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT) failed: org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT#L0
runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformLegacyWorkerIntegrationTest/TEST-org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryManagedIT.xml [took 0s]