[ETL-616] Implement Great Expectations to run on parquet data #139

Merged · 26 commits · Sep 13, 2024

Changes from all commits

Commits (26)
ac378fd
initial commit for testing
rxu17 Sep 4, 2024
35c8422
update sample expectations
rxu17 Sep 5, 2024
f42717e
add two data types
rxu17 Sep 5, 2024
5483162
correct to fitbitdailydata
rxu17 Sep 5, 2024
4610016
fix expectation
rxu17 Sep 5, 2024
c4b85f4
add complete script
rxu17 Sep 5, 2024
9d1a0d5
initial cf config and template
rxu17 Sep 5, 2024
6ae0e24
correct formatting, refactor triggers
rxu17 Sep 5, 2024
26e31e9
fix job name
rxu17 Sep 6, 2024
8e45790
refactor gx code, add tests, adjust gx version
rxu17 Sep 6, 2024
2ad2f61
refactor gx code, add tests, adjust gx version
rxu17 Sep 6, 2024
3cd0422
make consistent naming
rxu17 Sep 6, 2024
2dae40e
remove hardcoded args
rxu17 Sep 6, 2024
6a07092
add integration tests, remove null rows code, add dep for urllib3<2
rxu17 Sep 6, 2024
84f5985
change to lowercase data type
rxu17 Sep 6, 2024
ee6a812
add prod cf configs, add perm for glue role for shareable artifacts b…
rxu17 Sep 6, 2024
15d7992
rename, include prod ver
rxu17 Sep 6, 2024
7720c8e
add test to catch exception
rxu17 Sep 6, 2024
db2fa8f
add conditional creation of triggers due to what is available in expe…
rxu17 Sep 9, 2024
19f0a8b
update README for tests, add in testing for our scripts
rxu17 Sep 10, 2024
19d1df5
chain cmd together
rxu17 Sep 10, 2024
a21c0d6
update prod
rxu17 Sep 10, 2024
6fbff2b
gather tests, correct key_prefix to key, add missing params to prod g…
rxu17 Sep 11, 2024
198ae92
remove slash
rxu17 Sep 11, 2024
0ec67b1
add gx glue version as var in config
rxu17 Sep 12, 2024
4a2c9f9
merge conflicts
rxu17 Sep 12, 2024
35 changes: 29 additions & 6 deletions .github/workflows/README.md
@@ -1,5 +1,7 @@
# recover github workflows

## Overview

Recover ETL has four github workflows:

- workflows/upload-and-deploy.yaml
@@ -14,35 +16,56 @@ Recover ETL has four github workflows:
| codeql-analysis | on-push from feature branch, feature branch merged into main |
| cleanup | feature branch deleted |


## upload-files

Copies the pilot data sets from the ingestion bucket to the input data bucket for use in the integration test. Note that this assumes files already exist in the ingestion bucket; a future improvement would be to make this more robust and throw an error if the ingestion bucket path is empty.

## upload-and-deploy

Here are some more detailed descriptions and troubleshooting tips for some jobs within each workflow:

### upload-files

Copies the pilot data sets from the ingestion bucket to the input data bucket for use in the integration test. Note that this assumes files already exist in the ingestion bucket; a future improvement would be to make this more robust and throw an error if the ingestion bucket path is empty.
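
One possible shape for that robustness improvement is sketched below. This is a minimal illustration only: the bucket names and prefix are hypothetical placeholders rather than the workflow's actual configuration, and the real workflow may perform the copy differently (e.g. with the AWS CLI).

```python
# Hedged sketch: copy pilot data and fail loudly if the source prefix is empty.
# Bucket names and prefix are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")


def copy_pilot_data(
    ingestion_bucket: str,  # e.g. the recover-dev ingestion bucket
    input_bucket: str,      # e.g. recover-dev-input-data
    prefix: str,            # hypothetical pilot-data prefix
) -> None:
    paginator = s3.get_paginator("list_objects_v2")
    keys = [
        obj["Key"]
        for page in paginator.paginate(Bucket=ingestion_bucket, Prefix=prefix)
        for obj in page.get("Contents", [])
    ]
    if not keys:
        # The suggested robustness check: error out instead of silently copying nothing
        raise ValueError(f"No objects found under s3://{ingestion_bucket}/{prefix}")
    for key in keys:
        s3.copy({"Bucket": ingestion_bucket, "Key": key}, input_bucket, key)
```
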
### Current Testing Related Jobs

### nonglue-unit-tests
#### nonglue-unit-tests

See [testing](/tests/README.md) for more background on these tests. Here, the Synapse folders for both the `recover-dev-input-data` and `recover-dev-processed-data` buckets are tested for STS access every time something is pushed to the feature branch and whenever the feature branch is merged into main.

This behaves like an integration test: because it depends on a connection to Synapse, the connection can occasionally stall or break. The test usually takes a minute or less, and simply re-running the job often resolves transient failures.
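
For orientation, an STS access check along these lines can be written with `synapseclient`. This is a rough sketch under assumptions (hypothetical Synapse folder IDs, credentials available in the environment), not the repository's actual test code.

```python
# Rough sketch of an STS read-access check; folder IDs are hypothetical placeholders.
import synapseclient


def test_sts_read_access_to_dev_folders():
    syn = synapseclient.Synapse()
    syn.login()  # assumes Synapse credentials are available in the environment
    for folder_id in ("syn11111111", "syn22222222"):  # hypothetical folder IDs
        # Requesting a read-only STS token succeeds only if STS is enabled on the
        # folder's storage location and the caller has access to it.
        creds = syn.get_sts_storage_token(folder_id, permission="read_only")
        assert creds.get("accessKeyId") and creds.get("secretAccessKey")
```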

### pytest-docker
#### pytest-docker

This sets up and uploads the two Docker images to the ECR repository.
**Note: An ECR repository called `pytest` must exist in the AWS account we push Docker images to before running this GitHub Action.**

Some behavioral aspects to note: there are limitations with the matrix strategy in GitHub Actions jobs, so we had to unmask the account ID in order to pass it as an output for `glue-unit-tests` to use. The matrix strategy currently does not support dynamic job outputs ([see issue thread](https://github.com/orgs/community/discussions/17245)), and the workaround seemed more complex to implement, so we could not pass the path of the uploaded Docker image directly and had to use a static output instead. This is why we use `steps.login-ecr.outputs.registry`, which contains the account ID, so that the output can be passed along and the Docker image can be located and used.

### glue-unit-tests
#### glue-unit-tests

See [testing](/tests/README.md) for more info on the background behind these tests.

For the JSON to Parquet tests, there may occasionally be a scenario where a GitHub workflow is stopped early due to an issue or is canceled.

When this happens, some of the resources that `test_json_to_parquet.py` creates for the given branch (the Glue table, the Glue crawler role, and others) may already exist because the test did not run all the way through and never cleaned them up. The next time the GitHub workflow is triggered, it errors out with an `AlreadyExistsException`. This is currently resolved manually by deleting the resource(s) that were created in the AWS account and re-running the GitHub jobs that failed.
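
If you would rather script that cleanup than click through the console, something like the sketch below works. The naming convention shown is hypothetical, so match the names to whatever the failed run actually created; the crawler's IAM role, if left behind, would need to be deleted separately.

```python
# Hedged sketch of the manual cleanup; resource names are hypothetical placeholders
# and should be replaced with the ones the failed run actually created.
import boto3


def delete_leftover_glue_resources(namespace: str) -> None:
    glue = boto3.client("glue")
    database_name = f"{namespace}-glue-database"   # hypothetical naming convention
    crawler_name = f"{namespace}-parquet-crawler"  # hypothetical naming convention
    try:
        glue.delete_database(Name=database_name)  # dropping the database drops its tables
    except glue.exceptions.EntityNotFoundException:
        pass
    try:
        glue.delete_crawler(Name=crawler_name)
    except glue.exceptions.EntityNotFoundException:
        pass
```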

### Adding Test Commands to GitHub Workflow Jobs

After developing and running tests locally, you need to make sure the tests also run in the CI pipeline. To add your tests under the `upload-and-deploy` job:

Add your test commands under the appropriate job (see the summaries of the testing-related jobs above), for example:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Other steps...
      - name: Run tests
        run: |
          pytest tests/
```

### sceptre-deploy-develop

### integration-test-develop-cleanup
28 changes: 19 additions & 9 deletions .github/workflows/upload-and-deploy.yaml
@@ -134,11 +134,14 @@ jobs:
pipenv install ecs_logging~=2.0
pipenv install pytest-datadir

- name: Test lambda scripts with pytest
- name: Test scripts with pytest (lambda, etc.)
run: |
pipenv run python -m pytest tests/test_s3_event_config_lambda.py -v
pipenv run python -m pytest tests/test_s3_to_glue_lambda.py -v
pipenv run python -m pytest -v tests/test_lambda_raw.py
pipenv run python -m pytest \
tests/test_s3_event_config_lambda.py \
tests/test_s3_to_glue_lambda.py \
tests/test_lambda_dispatch.py \
tests/test_consume_logs.py \
tests/test_lambda_raw.py -v

- name: Test dev synapse folders for STS access with pytest
run: >
@@ -249,18 +252,25 @@ jobs:
if: github.ref_name != 'main'
run: echo "NAMESPACE=$GITHUB_REF_NAME" >> $GITHUB_ENV

- name: Run Pytest unit tests under AWS 3.0
- name: Run Pytest unit tests under AWS Glue 3.0
if: matrix.tag_name == 'aws_glue_3'
run: |
su - glue_user --command "cd $GITHUB_WORKSPACE && python3 -m pytest tests/test_s3_to_json.py -v"
su - glue_user --command "cd $GITHUB_WORKSPACE && python3 -m pytest tests/test_compare_parquet_datasets.py -v"
su - glue_user --command "cd $GITHUB_WORKSPACE && python3 -m pytest \
tests/test_s3_to_json.py \
tests/test_compare_parquet_datasets.py -v"

- name: Run Pytest unit tests under AWS 4.0
- name: Run unit tests for JSON to Parquet under AWS Glue 4.0
if: matrix.tag_name == 'aws_glue_4'
run: >
su - glue_user --command "cd $GITHUB_WORKSPACE &&
python3 -m pytest tests/test_json_to_parquet.py --namespace $NAMESPACE -v"

- name: Run unit tests for Great Expectations on Parquet under AWS Glue 4.0
if: matrix.tag_name == 'aws_glue_4'
run: >
su - glue_user --command "cd $GITHUB_WORKSPACE &&
python3 -m pytest tests/test_run_great_expectations_on_parquet.py -v"

sceptre-deploy-develop:
name: Deploys branch using sceptre
runs-on: ubuntu-latest
@@ -287,7 +297,7 @@ jobs:
run: echo "NAMESPACE=$GITHUB_REF_NAME" >> $GITHUB_ENV

- name: "Deploy sceptre stacks to dev"
run: pipenv run sceptre --var "namespace=${{ env.NAMESPACE }}" launch develop --yes
run: pipenv run sceptre --debug --var "namespace=${{ env.NAMESPACE }}" launch develop --yes

- name: Delete preexisting S3 event notification for this namespace
uses: gagoar/invoke-aws-lambda@v3
1 change: 1 addition & 0 deletions config/config.yaml
@@ -7,6 +7,7 @@ template_key_prefix: "{{ var.namespace | default('main') }}/templates"
glue_python_shell_python_version: "3.9"
glue_python_shell_glue_version: "3.0"
json_to_parquet_glue_version: "4.0"
great_expectations_job_glue_version: "4.0"
default_stack_tags:
Department: DNT
Project: recover
1 change: 1 addition & 0 deletions config/develop/glue-job-role.yaml
@@ -6,5 +6,6 @@ parameters:
S3IntermediateBucketName: {{ stack_group_config.intermediate_bucket_name }}
S3ParquetBucketName: {{ stack_group_config.processed_data_bucket_name }}
S3ArtifactBucketName: {{ stack_group_config.template_bucket_name }}
S3ShareableArtifactBucketName: {{ stack_group_config.shareable_artifacts_vpn_bucket_name }}
stack_tags:
{{ stack_group_config.default_stack_tags }}
18 changes: 18 additions & 0 deletions config/develop/namespaced/glue-job-run-great-expectations-on-parquet.yaml
@@ -0,0 +1,18 @@
template:
  path: glue-job-run-great-expectations-on-parquet.j2
dependencies:
  - develop/glue-job-role.yaml
stack_name: "{{ stack_group_config.namespace }}-glue-job-RunGreatExpectationsParquet"
parameters:
  Namespace: {{ stack_group_config.namespace }}
  JobDescription: Runs great expectations on a set of data
  JobRole: !stack_output_external glue-job-role::RoleArn
  TempS3Bucket: {{ stack_group_config.processed_data_bucket_name }}
  S3ScriptBucket: {{ stack_group_config.template_bucket_name }}
  S3ScriptKey: '{{ stack_group_config.namespace }}/src/glue/jobs/run_great_expectations_on_parquet.py'
  GlueVersion: "{{ stack_group_config.great_expectations_job_glue_version }}"
  AdditionalPythonModules: "great_expectations~=0.18,urllib3<2"
stack_tags:
  {{ stack_group_config.default_stack_tags }}
sceptre_user_data:
  dataset_schemas: !file src/glue/resources/table_columns.yaml
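
The stack above provisions the Glue job that runs `src/glue/jobs/run_great_expectations_on_parquet.py`, which is not shown in this excerpt. For orientation only, a Great Expectations ~0.18 job that validates a Parquet dataset read with Spark might look roughly like the sketch below; the job arguments, dataset name, suite name, and column names are hypothetical, and the actual script's structure may differ.

```python
# Hedged sketch of a Great Expectations (~0.18) check over Parquet data in a Glue job.
# Argument names, dataset/suite names, and columns are hypothetical placeholders.
import sys

import great_expectations as gx
from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["parquet_bucket", "data_type"])
spark = GlueContext(SparkContext.getOrCreate()).spark_session

# Read the Parquet dataset produced upstream by the JSON-to-Parquet job
df = spark.read.parquet(
    f"s3://{args['parquet_bucket']}/parquet/dataset_{args['data_type']}/"
)

# Register the Spark dataframe with an ephemeral Great Expectations context
context = gx.get_context()
datasource = context.sources.add_spark(name="parquet")
asset = datasource.add_dataframe_asset(name=args["data_type"])
batch_request = asset.build_batch_request(dataframe=df)

# In the real job the expectations come from data_values_expectations.json;
# two inline expectations stand in for the suite here.
suite_name = f"{args['data_type']}_expectations"
context.add_or_update_expectation_suite(expectation_suite_name=suite_name)
validator = context.get_validator(
    batch_request=batch_request, expectation_suite_name=suite_name
)
validator.expect_column_values_to_not_be_null("ParticipantIdentifier")  # hypothetical column
validator.expect_table_row_count_to_be_between(min_value=1)
results = validator.validate()

if not results.success:
    raise RuntimeError("Great Expectations validation failed")
```
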
4 changes: 4 additions & 0 deletions config/develop/namespaced/glue-workflow.yaml
@@ -6,6 +6,7 @@ dependencies:
- develop/namespaced/glue-job-S3ToJsonS3.yaml
- develop/namespaced/glue-job-JSONToParquet.yaml
- develop/namespaced/glue-job-compare-parquet.yaml
- develop/namespaced/glue-job-run-great-expectations-on-parquet.yaml
- develop/glue-job-role.yaml
- develop/s3-cloudformation-bucket.yaml
parameters:
@@ -19,7 +20,10 @@ parameters:
CompareParquetMainNamespace: "main"
S3SourceBucketName: {{ stack_group_config.input_bucket_name }}
CloudformationBucketName: {{ stack_group_config.template_bucket_name }}
ShareableArtifactsBucketName: {{ stack_group_config.shareable_artifacts_vpn_bucket_name }}
ExpectationSuiteKey: "{{ stack_group_config.namespace }}/src/glue/resources/data_values_expectations.json"
stack_tags:
{{ stack_group_config.default_stack_tags }}
sceptre_user_data:
dataset_schemas: !file src/glue/resources/table_columns.yaml
data_values_expectations: !file src/glue/resources/data_values_expectations.json
1 change: 1 addition & 0 deletions config/prod/glue-job-role.yaml
@@ -6,5 +6,6 @@ parameters:
S3IntermediateBucketName: {{ stack_group_config.intermediate_bucket_name }}
S3ParquetBucketName: {{ stack_group_config.processed_data_bucket_name }}
S3ArtifactBucketName: {{ stack_group_config.template_bucket_name }}
S3ShareableArtifactBucketName: {{ stack_group_config.shareable_artifacts_vpn_bucket_name }}
stack_tags:
{{ stack_group_config.default_stack_tags }}
18 changes: 18 additions & 0 deletions config/prod/namespaced/glue-job-run-great-expectations-on-parquet.yaml
@@ -0,0 +1,18 @@
template:
  path: glue-job-run-great-expectations-on-parquet.j2
dependencies:
  - prod/glue-job-role.yaml
stack_name: "{{ stack_group_config.namespace }}-glue-job-RunGreatExpectationsParquet"
parameters:
  Namespace: {{ stack_group_config.namespace }}
  JobDescription: Runs great expectations on a set of data
  JobRole: !stack_output_external glue-job-role::RoleArn
  TempS3Bucket: {{ stack_group_config.processed_data_bucket_name }}
  S3ScriptBucket: {{ stack_group_config.template_bucket_name }}
  S3ScriptKey: '{{ stack_group_config.namespace }}/src/glue/jobs/run_great_expectations_on_parquet.py'
  GlueVersion: "{{ stack_group_config.great_expectations_job_glue_version }}"
  AdditionalPythonModules: "great_expectations~=0.18,urllib3<2"
stack_tags:
  {{ stack_group_config.default_stack_tags }}
sceptre_user_data:
  dataset_schemas: !file src/glue/resources/table_columns.yaml
4 changes: 4 additions & 0 deletions config/prod/namespaced/glue-workflow.yaml
@@ -6,6 +6,7 @@ dependencies:
- prod/namespaced/glue-job-S3ToJsonS3.yaml
- prod/namespaced/glue-job-JSONToParquet.yaml
- prod/namespaced/glue-job-compare-parquet.yaml
- prod/namespaced/glue-job-run-great-expectations-on-parquet.yaml
- prod/glue-job-role.yaml
- prod/s3-cloudformation-bucket.yaml
parameters:
@@ -19,7 +20,10 @@ parameters:
CompareParquetMainNamespace: "main"
S3SourceBucketName: {{ stack_group_config.input_bucket_name }}
CloudformationBucketName: {{ stack_group_config.template_bucket_name }}
ShareableArtifactsBucketName: {{ stack_group_config.shareable_artifacts_vpn_bucket_name }}
ExpectationSuiteKey: "{{ stack_group_config.namespace }}/src/glue/resources/data_values_expectations.json"
stack_tags:
{{ stack_group_config.default_stack_tags }}
sceptre_user_data:
dataset_schemas: !file src/glue/resources/table_columns.yaml
data_values_expectations: !file src/glue/resources/data_values_expectations.json