Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

wrangler e2e tests #665

Merged
Show file tree
Hide file tree
Changes from all commits
Commits
File filter

Filter by extension

Filter by extension


Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
6 changes: 3 additions & 3 deletions .github/workflows/build.yml
Original file line number Diff line number Diff line change
Expand Up @@ -30,11 +30,11 @@ jobs:
steps:
# Pinned 1.0.0 version
- uses: haya14busa/action-workflow_run-status@967ed83efa565c257675ed70cfe5231f062ddd94
- uses: actions/checkout@v2.3.4
- uses: actions/checkout@v3
with:
ref: ${{ github.event.workflow_run.head_branch }}
- name: Cache
uses: actions/cache@v2.1.3
uses: actions/cache@v3
with:
path: ~/.m2/repository
key: ${{ runner.os }}-maven-${{ github.workflow }}-${{ hashFiles('**/pom.xml') }}
Expand All @@ -43,7 +43,7 @@ jobs:
- name: Build with Maven
run: mvn clean test -fae -T 2 -B -V -DcloudBuild -Dmaven.wagon.http.retryHandler.count=3 -Dmaven.wagon.httpconnectionManager.ttlSeconds=25
- name: Archive build artifacts
uses: actions/upload-artifact@v2.2.2
uses: actions/upload-artifact@v3
if: always()
with:
name: Build debug files
Expand Down
2 changes: 1 addition & 1 deletion pom.xml
Original file line number Diff line number Diff line change
Expand Up @@ -492,7 +492,7 @@
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
<!-- <goal>verify</goal>-->
</goals>
</execution>
</executions>
Expand Down
Original file line number Diff line number Diff line change
@@ -0,0 +1,71 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: datatype parsers

@BQ_SOURCE_TS_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse timestamp directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
Then Select the file for importing the pipeline for the plugin "Directive_parse_Timestamp"
Then Navigate to the properties page of plugin: "BigQueryTable"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Click on the Validate button
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Click on the Validate button
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_Timestamp"


@BQ_SOURCE_DATETIME_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse datetime directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
Then Select the file for importing the pipeline for the plugin "Directive_parse_Datetime"
Then Navigate to the properties page of plugin: "BigQueryTable"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Click on the Validate button
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Click on the Validate button
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_Datetime"
Original file line number Diff line number Diff line change
@@ -0,0 +1,43 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: parse as fixed length

@BQ_SOURCE_FXDLEN_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse fixedlength directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
Then Select the file for importing the pipeline for the plugin "Directive_parse_Fixed_Length"
Then Navigate to the properties page of plugin: "BigQueryTable"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Click on the Validate button
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Click on the Validate button
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_FixedLength"
Original file line number Diff line number Diff line change
@@ -0,0 +1,43 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: parse as HL7

@BQ_SOURCE_HL7_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse hl7 directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
Then Select the file for importing the pipeline for the plugin "Directive_parse_hl7"
Then Navigate to the properties page of plugin: "BigQueryTable"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Click on the Validate button
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Click on the Validate button
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_hl7"
Original file line number Diff line number Diff line change
Expand Up @@ -17,34 +17,31 @@
package io.cdap.plugin.common.stepsdesign;

import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.StorageException;
import io.cdap.e2e.utils.BigQueryClient;
import io.cdap.e2e.utils.PluginPropertyUtils;
import io.cdap.e2e.utils.StorageClient;
import io.cucumber.java.After;
import io.cucumber.java.Before;
import org.apache.commons.lang3.RandomStringUtils;
import org.apache.commons.lang3.StringUtils;
import org.junit.Assert;
import stepsdesign.BeforeActions;

import java.io.IOException;
import java.net.URISyntaxException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.SQLException;
import java.util.NoSuchElementException;
import java.util.UUID;

import static io.cdap.e2e.pages.locators.CdfGCSLocators.filePath;

/**
* Setup BQ for Wrangler tests.
*/
public class TestSetupHooks {

@Before(order = 1, value = "@BQ_SOURCE_CSV_TEST")
public static void createTempSourceBQTable() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQTableQueryFileCsv"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileCsv"));
}
@Before(order = 1, value = "@BQ_SINK_TEST")
public static void setTempTargetBQTableName() {
String bqTargetTableName = "E2E_TARGET_" + UUID.randomUUID().toString().replaceAll("-", "_");
Expand All @@ -71,10 +68,33 @@ public static void deleteTempTargetBQTable() throws IOException, InterruptedExce
/**
* Create BigQuery table.
*/
@Before(order = 1, value = "@BQ_SOURCE_CSV_TEST")
public static void createTempSourceBQTable() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQTableQueryFileCsv"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileCsv"));
@Before(order = 1, value = "@BQ_SOURCE_FXDLEN_TEST")
public static void createTempSourceBQTableFxdLen() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQDataQueryFileFxdLen"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileFxdLen"));
}
@Before(order = 1, value = "@BQ_SOURCE_HL7_TEST")
public static void createTempSourceBQTableHl7() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQDataQueryFileHl7"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileHl7"));
}
@Before(order = 1, value = "@BQ_SOURCE_TS_TEST")
public static void createTempSourceBQTableTimestamp() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQDataQueryFileTimestamp"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileTimestamp"));
}
@Before(order = 1, value = "@BQ_SOURCE_DATETIME_TEST")
public static void createTempSourceBQTableDateTime() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQDataQueryFileDatetime"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileDatetime"));
}

@After(order = 1, value = "@BQ_SOURCE_TEST")
public static void deleteTempSourceBQTable() throws IOException, InterruptedException {
String bqSourceTable = PluginPropertyUtils.pluginProp("bqSourceTable");
BigQueryClient.dropBqQuery(bqSourceTable);
BeforeActions.scenario.write("BQ source Table " + bqSourceTable + " deleted successfully");
PluginPropertyUtils.removePluginProp("bqSourceTable");
}

private static void createSourceBQTableWithQueries(String bqCreateTableQueryFile, String bqInsertDataQueryFile)
Expand Down
Original file line number Diff line number Diff line change
@@ -0,0 +1,3 @@
{"create_date":"2023","id":1,"timecolumn":"2006-03-18"}
{"create_date":"2023","id":2,"timecolumn":"2007-03-18"}
{"create_date":"2023","id":3,"timecolumn":"2008-04-19"}
Original file line number Diff line number Diff line change
@@ -0,0 +1,3 @@
{"create_date":"2021-01-21T00:00:00Z","diff_date":594345600000,"format_price":"$1.00","id":1.0,"price":"$1","time":"2018-09-07T14:57:51.892Z","update_date":"2002-03-23T00:00:00"}
{"create_date":"2022-01-22T00:00:00Z","diff_date":562723200000,"format_price":"$2.00","id":2.0,"price":"$2","time":"2018-09-07T14:57:51.896Z","update_date":"2004-03-24T00:00:00"}
{"create_date":"2023-01-23T00:00:00Z","diff_date":652060800000,"format_price":"$3.00","id":3.0,"price":"$3","time":"2018-09-07T14:57:51.898Z","update_date":"2002-05-26T00:00:00"}
Original file line number Diff line number Diff line change
@@ -0,0 +1,2 @@
{"Url":"http://example.com:80/docs/books/tutorial/index.html?name=networking#DOWNLOADING","fixedlength":"21 10 ABCXYZ","fixedlength_1":"21","fixedlength_3":" ABC","fixedlength_4":"XYZ","fixedlength_encode_base32":"GIYSAIBRGAQCAQKCINMFSWQ=","fixedlength_encode_base32_decode_base32":"21 10 ABCXYZ","id":" 10","url_authority":"example.com:80","url_filename":"/docs/books/tutorial/index.html?name=networking","url_host":"example.com","url_path":"/docs/books/tutorial/index.html","url_port":80,"url_protocol":"http","url_query":"name=networking","url_query_1":"name","url_query_2":"networking"}
{"Url":"http://geeks.com:80/docs/chair/tutorial/index.html?name=networking#DOWNLOADING","fixedlength":"19 13 ABCXYZ","fixedlength_1":"19","fixedlength_3":" ABC","fixedlength_4":"XYZ","fixedlength_encode_base32":"GE4SAIBRGMQCAQKCINMFSWQ=","fixedlength_encode_base32_decode_base32":"19 13 ABCXYZ","id":" 13","url_authority":"geeks.com:80","url_filename":"/docs/chair/tutorial/index.html?name=networking","url_host":"geeks.com","url_path":"/docs/chair/tutorial/index.html","url_port":80,"url_protocol":"http","url_query":"name=networking","url_query_1":"name","url_query_2":"networking"}
Original file line number Diff line number Diff line change
@@ -0,0 +1,2 @@
{"Body":"s��\u0011y�X��\u0006�H���","Body_hl7_MSH_9_1":"ALM","address":"test","id":"3"}
{"Body":"F<��\u001c����#J��^�:","Body_hl7_MSH_9_1":"BLM","address":"address2","id":"4"}
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
create table `DATASET.TABLE_NAME` (id STRING, create_date STRING, timestamp STRING)
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
create table `DATASET.TABLE_NAME` (url STRING, fixedlength STRING)
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
create table `DATASET.TABLE_NAME` (create_date STRING, update_date STRING, time BIGINT, price STRING)
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
create table `DATASET.TABLE_NAME` (address STRING, Body STRING)
Original file line number Diff line number Diff line change
@@ -0,0 +1,5 @@
INSERT INTO DATASET.TABLE_NAME (id,create_date,timestamp)
VALUES
('1','2021-01-21','2006-02-18T05:03:42Z[UTC]'),
('2','2022-02-22','2007-01-18T04:03:22Z[UTC]'),
('3','2023-03-23','2008-07-19T08:04:22Z[UTC]');
Original file line number Diff line number Diff line change
@@ -0,0 +1,5 @@
INSERT INTO DATASET.TABLE_NAME (url,fixedlength)
VALUES
('http://example.com:80/docs/books/tutorial/index.html?name=networking#DOWNLOADING','21 10 ABCXYZ'),
('http://geeks.com:80/docs/chair/tutorial/index.html?name=networking#DOWNLOADING','19 13 ABCXYZ'),
('http://amazing.com:80/docs/tables/tutorial/index.html?name=networking#DOWNLOADING','18 14 CDEFGH');
Original file line number Diff line number Diff line change
@@ -0,0 +1,5 @@
INSERT INTO DATASET.TABLE_NAME (address,Body)
VALUES
('address1','MSH|^~?2||.|||199908180016||ADT^A04|ADT.1.1698593|P|3'),
('address2','MSH|^~?2||.|||199908180016||BSC^A04|ADT.1.1698593|P|4'),
('','MSH|^~?2||.|||199908180016||JKL^A04|ADT.1.1698593|P|5');
Original file line number Diff line number Diff line change
@@ -0,0 +1,5 @@
INSERT INTO DATASET.TABLE_NAME (create_date,update_date,time,price)
VALUES
('2021-01-21','2002-03-23',1536332271892,'$1'),
('2022-01-22','2004-03-24',1536332271896,'$2'),
('2023-01-23','2002-05-26',1536332271898,'$3');
Original file line number Diff line number Diff line change
@@ -1,10 +1,23 @@
#json file path
Directive_parse_Fixed_Length=testData/Wrangler/parse_fixedlength_wrangle-cdap-data-pipeline.json
Directive_parse_hl7=testData/Wrangler/parse_HL7_Wrangler-cdap-data-pipeline (1).json
Directive_parse_Timestamp=testData/Wrangler/parse_timestamp_wrangle-cdap-data-pipeline.json
Directive_parse_Datetime=testData/Wrangler/parse_datetime_wrangle-cdap-data-pipeline.json
Directive_parse_csv=testData/Wrangler\
/parse_csv_wrangle-cdap-data-pipeline.json
bqSourceTable=dummy
sourcePath=example/hello.csv
gcsSourceBucket=dummy
#bq queries file path

CreateBQDataQueryFileFxdLen=BQtesdata/BigQuery/BigQueryCreateTableQueryFxdlen.txt
InsertBQDataQueryFileFxdLen=BQtesdata/BigQuery/BigQueryInsertDataQueryFxdlen.txt
CreateBQDataQueryFileHl7=BQtesdata/BigQuery/BigQueryCreateTableQueryhl7.txt
InsertBQDataQueryFileHl7=BQtesdata/BigQuery/BigQueryInsertDataQueryHl7.txt
CreateBQDataQueryFileTimestamp=BQtesdata/BigQuery/BigQueryCreateTableQueryTimestamp.txt
InsertBQDataQueryFileTimestamp=BQtesdata/BigQuery/BigQueryInsertDataQueryTimestamp.txt
CreateBQDataQueryFileDatetime=BQtesdata/BigQuery/BigQueryCreateTableQueryDatetime.txt
InsertBQDataQueryFileDatetime=BQtesdata/BigQuery/BigQueryInsertDataQueryDatetime.txt
CreateBQTableQueryFileCsv=BQtesdata/BigQuery/BigQueryCreateTableQueryCsv.txt
InsertBQDataQueryFileCsv=BQtesdata/BigQuery/BigQueryInsertDataQueryCsv.txt

Expand All @@ -13,4 +26,9 @@ projectId=cdf-athena
dataset=test_automation
dataset2=Wrangler
#expectedBQFiles

ExpectedDirective_parse_FixedLength=BQValidationExpectedFiles/Directive_parse_fixedlength
ExpectedDirective_parse_hl7=BQValidationExpectedFiles/Directive_parse_hl7
ExpectedDirective_parse_Datetime=BQValidationExpectedFiles/Directive_parse_DateTime
ExpectedDirective_parse_Timestamp=BQValidationExpectedFiles/Directive_parse_Timestamp
ExpectedDirective_parse_csv=BQValidationExpectedFiles/Directive_parse_csv
Loading
Loading