Merge pull request #36 from aws-solutions/develop
Update to version v2.2.1
fhoueto-amz committed May 30, 2024
2 parents bf7a0be + d48df78 commit 7f60526
Showing 27 changed files with 285 additions and 228 deletions.
4 changes: 2 additions & 2 deletions .github/ISSUE_TEMPLATE/bug_report.md
Original file line number Diff line number Diff line change
@@ -24,11 +24,11 @@ To get the version of the solution, you can look at the description of the creat
- [ ] Region: [e.g. us-east-1]
- [ ] Was the solution modified from the version published on this repository?
- [ ] If the answer to the previous question was yes, are the changes available on GitHub?
- [ ] Have you checked your [service quotas](https://docs.aws.amazon.com/general/latest/gr/aws_service_limits.html) for the sevices this solution uses?
- [ ] Have you checked your [service quotas](https://docs.aws.amazon.com/general/latest/gr/aws_service_limits.html) for the services this solution uses?
- [ ] Were there any errors in the CloudWatch Logs?

**Screenshots**
If applicable, add screenshots to help explain your problem (please **DO NOT include sensitive information**).

**Additional context**
Add any other context about the problem here.
Add any other context about the problem here.
11 changes: 11 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,17 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [2.2.1] - 2024-05-27

### Updated

- boto3, botocore updated to 1.34.98
- sagemaker-python-sdk updated to 2.218.0 for CVE-2024-34073
- moto testing framework updated to 5.0.6 to remove the dependency on python-jose due to a CVE
- Lambda memory sizes increased to 512 MB
- requests package updated to 2.32.0 due to CVE-2024-35195
- PutBucketTagging permission added to the orchestrator Lambda IAM policy


## [2.2.0] - 2023-08-03

22 changes: 17 additions & 5 deletions NOTICE.txt
@@ -14,7 +14,9 @@ This software includes third party software subject to the following copyrights:
Jinja2 BSD License
MarkupSafe BSD License
PyYAML MIT License
Werkzeug BSD License
Werkzeug BSD License
annotated-types MIT License
antlr4-python3-runtime BSD License
attrs MIT License
aws-cdk-lib Apache-2.0
aws-cdk.asset-awscli-v1 Apache-2.0
@@ -47,22 +49,27 @@ google-pasta Apache Software License
graphql-core MIT License
idna BSD License
importlib-metadata Apache Software License
importlib-resources Apache Software License
iniconfig MIT License
importlib-resources Apache Software License
importlib_resources Apache Software License
iniconfig MIT License
jmespath MIT License
joserfc BSD License (BSD-3-Clause)
jschema-to-python MIT License
jsii Apache Software License
jsondiff MIT License
jsonpatch BSD License
jsonpickle BSD License
jsonpointer BSD License
jsonpointer BSD License
jsonpath-ng Apache Software License
jsonschema MIT License
jsonschema-path Apache Software License
jsonschema-spec Apache Software License
jsonschema-specifications MIT License
junit-xml Freely Distributable; MIT License
lazy-object-proxy BSD License
moto Apache Software License
mpmath BSD License
multipart Apache Software License
multiprocess BSD License
networkx BSD License
numpy BSD License
@@ -75,14 +82,18 @@ pathos BSD License
pbr Apache Software License
platformdirs MIT License
pluggy MIT License
ply BSD License
pox BSD License
ppft BSD License
protobuf BSD-3-Clause
protobuf3-to-dict Public Domain
psutil BSD-3-Clause
publication MIT License
py-partiql-parser MIT License
pyasn1 BSD License
pycparser BSD License
pydantic MIT License
pydantic MIT License
pydantic_core MIT License
pyparsing MIT License
pytest MIT License
pytest-cov MIT License
@@ -106,6 +117,7 @@ sshpubkeys BSD License
sympy BSD License
tblib BSD License
tomli MIT License
tqdm MIT License
typeguard MIT License
types-PyYAML Apache Software License
typing_extensions Python Software Foundation License
31 changes: 17 additions & 14 deletions README.md
@@ -60,7 +60,7 @@ Upon successfully cloning the repository into your local development environment
├── README.md
├── deployment [folder containing build/test scripts]
│   ├── build-s3-dist.sh
│ ├── run-all-tests.sh
│ ├── run-unit-tests.sh
│   ├── cdk-solution-helper
└── source
├── infrastructure [folder containing CDK code and lambdas for ML pipelines]
@@ -100,7 +100,7 @@ Upon successfully cloning the repository into your local development environment

Clone this git repository.

`git clone https://github.com/awslabs/<repository_name>`
`git clone https://github.com/aws-solutions/mlops-workload-orchestrator.git`

---

@@ -109,12 +109,17 @@ Clone this git repository.
- To run the unit tests

```
cd <rootDir>/source
chmod +x ./run-all-tests.sh
./run-all-tests.sh
cd <rootDir>/deployment
chmod +x ./run-unit-tests.sh
./run-unit-tests.sh
```

- Configure the bucket name of your target Amazon S3 distribution bucket
- Determine the name of a target Amazon S3 distribution bucket to be created in your account. The name should be suffixed with the region the deployment will be made in, e.g. my-solution-name-<region>, such as my-solution-name-us-east-1
- Create the bucket in the target account
- Set environment variables in your shell as follows. Note that *DIST_OUTPUT_BUCKET* should not include the region suffix of the bucket name; that suffix is appended automatically later
  - SOLUTION_NAME - The name of this solution (example: mlops-workload-orchestrator)
- VERSION - The version number of the change


```
export DIST_OUTPUT_BUCKET=my-bucket-name
@@ -133,19 +138,17 @@ chmod +x ./build-s3-dist.sh
- Upload the distributable assets to your Amazon S3 bucket in your account. Note: ensure that you own the Amazon S3 bucket before uploading the assets. To upload the assets to the S3 bucket, you can use the AWS Console or the AWS CLI as shown below.

```
aws s3 cp ./global-s3-assets/ s3://my-bucket-name-<aws_region>/mlops-workload-orchestrator/<my-version>/ --recursive --acl bucket-owner-full-control --profile aws-cred-profile-name
aws s3 cp ./regional-s3-assets/ s3://my-bucket-name-<aws_region>/mlops-workload-orchestrator/<my-version>/ --recursive --acl bucket-owner-full-control --profile aws-cred-profile-name
aws s3 cp ./global-s3-assets/ s3://$DIST_OUTPUT_BUCKET-<aws_region>/$SOLUTION_NAME/$VERSION/ --recursive --acl bucket-owner-full-control --profile aws-cred-profile-name
aws s3 cp ./regional-s3-assets/ s3://$DIST_OUTPUT_BUCKET-<aws_region>/$SOLUTION_NAME/$VERSION/ --recursive --acl bucket-owner-full-control --profile aws-cred-profile-name
```

- In the destination bucket under the solution name and version folders, there should be templates for single and multi-region deployments. These will end with a *.template* suffix.
- Copy the *Object URL* link for the preferred deployment architecture.
- Create the deployment using CloudFormation with the template link.

---

- Parameter details

```
$DIST_OUTPUT_BUCKET - The global name of the distribution. The AWS Region is appended to this global name (example: 'my-bucket-name-us-east-1') to form the regional bucket name. The Lambda artifacts should be uploaded to the regional buckets for the CloudFormation template to pick them up for deployment.
$SOLUTION_NAME - The name of this solution (example: mlops-workload-orchestrator)
$VERSION - The version number of the change
```

## Uninstall the solution

15 changes: 8 additions & 7 deletions deployment/run-all-tests.sh → deployment/run-unit-tests.sh
@@ -72,7 +72,7 @@ run_python_test() {
# Use -vv for debugging
python3 -m pytest --cov --cov-fail-under=80 --cov-report=term-missing --cov-report "xml:$coverage_report_path"
if [ "$?" = "1" ]; then
echo "(source/run-all-tests.sh) ERROR: there is likely output above." 1>&2
echo "(deployment/run-unit-tests.sh) ERROR: there is likely output above." 1>&2
exit 1
fi
sed -i -e "s,<source>$source_dir,<source>source,g" $coverage_report_path
@@ -89,7 +89,7 @@ run_javascript_lambda_test() {
npm ci
npm test
if [ "$?" = "1" ]; then
echo "(source/run-all-tests.sh) ERROR: there is likely output above." 1>&2
echo "(deployment/run-unit-tests.sh) ERROR: there is likely output above." 1>&2
exit 1
fi
[ "${CLEAN:-true}" = "true" ] && rm -fr coverage
@@ -154,8 +154,9 @@ run_blueprint_lambda_test() {
}

# Save the current working directory and set source directory
starting_dir=$PWD
cd ../source
source_dir=$PWD
cd $source_dir

# setup coverage report directory
coverage_dir=$source_dir/test/coverage-reports
@@ -164,9 +165,9 @@ mkdir -p $coverage_dir
# Clean the test environment before running tests and after finished running tests
# The variable is optional with a default of 'true'. It can be overwritten by the caller
# setting the CLEAN environment variable. For example
# $ CLEAN=true ./run-all-tests.sh
# $ CLEAN=true ./run-unit-tests.sh
# or
# $ CLEAN=false ./run-all-tests.sh
# $ CLEAN=false ./run-unit-tests.sh
#
CLEAN="${CLEAN:-true}"

@@ -180,5 +181,5 @@ run_cdk_project_test
deactivate


# Return to the source/ level where we started
cd $source_dir
# Return to the folder where we started
cd $starting_dir
@@ -13,7 +13,7 @@
import os
from unittest.mock import MagicMock, patch
import pytest
from moto import mock_sts
from moto import mock_aws
import botocore.session
from botocore.stub import Stubber, ANY
from main import handler
@@ -72,7 +72,7 @@ def event():
}


@mock_sts
@mock_aws
def test_handler_success(sm_expected_params, sm_response_200, event):
sm_client = get_client("sagemaker")
sm_stubber = Stubber(sm_client)
@@ -85,7 +85,6 @@ def test_handler_success(sm_expected_params, sm_response_200, event):
reset_client()


@mock_sts
def test_handler_fail(sm_expected_params, sm_response_500, event):
sm_client = get_client("sagemaker")
sm_stubber = Stubber(sm_client)
@@ -34,7 +34,7 @@
mocked_invalid_user_parms,
mocked_describe_response
)
from moto import mock_cloudformation, mock_s3
from moto import mock_aws
from unittest.mock import patch
from stackset_helpers import (
find_artifact,
@@ -58,7 +58,7 @@
client_to_patch = "boto3.client"


@mock_cloudformation
@mock_aws
def test_create_stackset_and_instances(
stackset_name,
mocked_template,
@@ -149,7 +149,7 @@ def test_get_stackset_instance_status(



@mock_cloudformation
@mock_aws
def test_update_stackset(
stackset_name,
mocked_template,
@@ -203,7 +203,7 @@ def test_update_stackset_error(
)


@mock_cloudformation
@mock_aws
def test_stackset_exists(stackset_name, mocked_template, mocked_template_parameters, mocked_regions):
cf_client = boto3.client("cloudformation", region_name=mocked_regions[0])
# assert the stackset does not exist
@@ -504,14 +504,14 @@ def test_lambda_handler(
mocked_put_job_failure.assert_called()


@mock_s3
@mock_aws
def test_setup_s3_client(mocked_codepipeline_event):
job_data = mocked_codepipeline_event[cp_job]["data"]
s3_client = setup_s3_client(job_data)
assert s3_client is not None


@mock_s3
@mock_aws
@patch("zipfile.ZipFile")
def test_get_template(mocked_zipfile, mocked_codepipeline_event, mocked_regions):
job_data = mocked_codepipeline_event[cp_job]["data"]
@@ -13,7 +13,7 @@
import boto3
import pytest
from unittest.mock import patch
from moto import mock_lambda
from moto import mock_aws
from index import invoke_lambda, no_op, handler


@@ -52,7 +52,7 @@ def test_invoke_lambda(mocked_client, invoke_event, invoke_bad_event):
)


@mock_lambda
@mock_aws
def test_invoke_lambda_error(invoke_event):
mocked_client = boto3.client("lambda")
with pytest.raises(Exception):
@@ -1,3 +1,3 @@
botocore==1.29.155
boto3==1.26.155
sagemaker==2.165.0
botocore==1.34.98
boto3==1.34.98
sagemaker==2.218.0
@@ -114,12 +114,12 @@ def __init__(self, scope: Construct, id: str, **kwargs) -> None:
# add ArtifactBucket cfn suppression (not needing a logging bucket)
image_builder_pipeline.node.find_child(
"ArtifactsBucket"
).node.default_child.cfn_options.metadata = suppress_pipeline_bucket()
).node.default_child.cfn_options.metadata = { "cfn_nag": suppress_pipeline_bucket() }

# add suppression for complex policy
image_builder_pipeline.node.find_child("Role").node.find_child(
"DefaultPolicy"
).node.default_child.cfn_options.metadata = suppress_iam_complex()
).node.default_child.cfn_options.metadata = { "cfn_nag": suppress_iam_complex() }

# attaching iam permissions to the pipelines
pipeline_permissions(image_builder_pipeline, assets_bucket)
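The two edits above wrap the suppression dicts under a `"cfn_nag"` key, which is the metadata shape the cfn_nag scanner reads from a resource. A sketch of the resulting structure — the rule id and reason are illustrative examples, not the values returned by this solution's `suppress_pipeline_bucket` helper:

```python
# Hypothetical return value of a suppression helper; the rule id and
# reason below are examples, not this repo's actual values.
suppress_pipeline_bucket = {
    "rules_to_suppress": [
        {"id": "W35", "reason": "Artifacts bucket does not need access logging"}
    ]
}

# Before the fix, the dict above was assigned to cfn_options.metadata
# directly; cfn_nag only honors suppressions nested under a "cfn_nag" key,
# which is what the new code provides:
metadata = {"cfn_nag": suppress_pipeline_bucket}
```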
@@ -291,6 +291,6 @@ def _create_stack_outputs(self):
CfnOutput(
self,
id="ValidationDataLocation",
value=f"https://s3.console.aws.amazon.com/s3/buckets/{self.assets_bucket_name.value_as_string}/{self.training_data.value_as_string}",
description="Training data used by the training job",
value=f"https://s3.console.aws.amazon.com/s3/buckets/{self.assets_bucket_name.value_as_string}/{self.validation_data.value_as_string}",
description="Validation data used by the training job",
).node.condition = self.validation_data_provided
