Commit b8689ad (1 parent: 99403b1)
This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository.
Showing 33 changed files with 1,630 additions and 5 deletions.
@@ -0,0 +1,105 @@
# Copyright © 2023 Cask Data, Inc.
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

# This workflow will build a Java project with Maven
# For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven
# Note: Any changes to this workflow would be used only after merging into develop
name: Build e2e tests

on:
  push:
    branches: [ develop ]
  pull_request:
    branches: [ develop ]
    types: [ opened, synchronize, reopened, labeled ]
  workflow_dispatch:

jobs:
  build:
    runs-on: k8s-runner-e2e
    # We allow builds:
    # 1) When triggered manually
    # 2) When it's a merge into a branch
    # 3) For PRs that are labeled as build and
    #    - It's a code change
    #    - A build label was just added
    # A bit complex, but prevents builds when other labels are manipulated
    if: >
      github.event_name == 'workflow_dispatch'
      || github.event_name == 'push'
      || (contains(github.event.pull_request.labels.*.name, 'build')
      && (github.event.action != 'labeled' || github.event.label.name == 'build')
      )
    strategy:
      matrix:
        module: [wrangler-transform]
      fail-fast: false

    steps:
      # Pinned 1.0.0 version
      - uses: actions/checkout@v3
        with:
          path: plugin
          submodules: 'recursive'
          ref: ${{ github.event.workflow_run.head_sha }}

      - uses: dorny/paths-filter@b2feaf19c27470162a626bd6fa8438ae5b263721
        if: github.event_name != 'workflow_dispatch' && github.event_name != 'push'
        id: filter
        with:
          working-directory: plugin
          filters: |
            e2e-test:
              - '${{ matrix.module }}/**/e2e-test/**'
      - name: Checkout e2e test repo
        uses: actions/checkout@v3
        with:
          repository: cdapio/cdap-e2e-tests
          path: e2e

      - name: Cache
        uses: actions/cache@v3
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ github.workflow }}-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-${{ github.workflow }}
      - name: Run required e2e tests
        if: github.event_name != 'workflow_dispatch' && github.event_name != 'push' && steps.filter.outputs.e2e-test == 'false'
        run: python3 e2e/src/main/scripts/run_e2e_test.py --module ${{ matrix.module }} --testRunner TestRunnerRequired.java

      - name: Run all e2e tests
        if: github.event_name == 'workflow_dispatch' || github.event_name == 'push' || steps.filter.outputs.e2e-test == 'true'
        run: python3 e2e/src/main/scripts/run_e2e_test.py --module ${{ matrix.module }}

      - name: Upload report
        uses: actions/upload-artifact@v3
        if: always()
        with:
          name: Cucumber report - ${{ matrix.module }}
          path: ./**/target/cucumber-reports

      - name: Upload debug files
        uses: actions/upload-artifact@v3
        if: always()
        with:
          name: Debug files - ${{ matrix.module }}
          path: ./**/target/e2e-debug

      - name: Upload files to GCS
        uses: google-github-actions/upload-cloud-storage@v0
        if: always()
        with:
          path: ./plugin
          destination: e2e-tests-cucumber-reports/${{ github.event.repository.name }}/${{ github.ref }}
          glob: '**/target/cucumber-reports/**'
@@ -215,5 +215,4 @@
      </build>
    </profile>
  </profiles>
</project>
wrangler-transform/src/e2e-test/features/Wrangler/ParseAsCsv.feature (43 additions, 0 deletions)
@@ -0,0 +1,43 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Wrangler - Run time scenarios

  @BQ_SOURCE_CSV_TEST @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using parse csv directive
    Given Open Datafusion Project to configure pipeline
    Then Click on the Plus Green Button to import the pipelines
    Then Select the file for importing the pipeline for the plugin "Directive_parse_csv"
    Then Navigate to the properties page of plugin: "BigQueryTable"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Replace input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Replace input plugin property: "table" with value: "bqTargetTable"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Rename the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_csv"
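
The steps above are plain Gherkin; at run time they are matched against Cucumber step definitions supplied by the shared cdap-e2e-tests framework that the workflow checks out. The sketch below only illustrates how such a binding looks in Java; the package, class, and helper names are hypothetical and are not the framework's actual implementation.

// Illustrative sketch only: the real step bindings live in cdapio/cdap-e2e-tests.
// The package, class, and helper names below are hypothetical.
package io.example.wrangler.e2e;

import io.cucumber.java.en.Then;

public class PipelineStatusSteps {

  // Matches the feature step: Then Verify the pipeline status is "Succeeded"
  @Then("Verify the pipeline status is {string}")
  public void verifyPipelineStatus(String expectedStatus) {
    String actualStatus = fetchPipelineStatus(); // hypothetical helper
    if (!expectedStatus.equals(actualStatus)) {
      throw new AssertionError(
          "Expected pipeline status " + expectedStatus + " but found " + actualStatus);
    }
  }

  private String fetchPipelineStatus() {
    // Placeholder: a real implementation would poll the deployed pipeline's
    // status through the CDAP UI or REST API until it reaches a terminal state.
    return "Succeeded";
  }
}

When a runner such as TestRunnerRequired.java executes a scenario, Cucumber resolves each step's text against these expressions and invokes the matching method.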
wrangler-transform/src/e2e-test/features/Wrangler/ParseAsFixedLength.feature (43 additions, 0 deletions)
@@ -0,0 +1,43 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Wrangler - Run time scenarios

  @BQ_SOURCE_FXDLEN_TEST @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using parse fixedlength directive
    Given Open Datafusion Project to configure pipeline
    Then Click on the Plus Green Button to import the pipelines
    Then Select the file for importing the pipeline for the plugin "Directive_parse_Fixed_Length"
    Then Navigate to the properties page of plugin: "BigQueryTable"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Replace input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Replace input plugin property: "table" with value: "bqTargetTable"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Rename the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_FixedLength"
(The remaining changed files in this commit are not shown.)