From 7007984080af22d718f7168619c029cf28045a94 Mon Sep 17 00:00:00 2001 From: Kimonas Sotirchos Date: Sat, 31 Aug 2024 12:05:57 +0300 Subject: [PATCH] Address review comments --- .github/workflows/scan-images.yaml | 25 +- scripts/README.md | 26 ++ scripts/airgapped/README.md | 8 +- scripts/airgapped/get-all-images.py | 177 -------------- scripts/airgapped/requirements.txt | 4 +- scripts/get-all-images.py | 268 +++++++++++++++++++++ scripts/requirements.txt | 5 + tests/airgapped/README.md | 18 ++ tests/airgapped/airgap.sh | 4 +- tests/airgapped/ckf-1.8-testing-images.txt | 4 + tests/airgapped/ckf.sh | 6 +- tests/airgapped/katib/README.md | 2 +- tests/airgapped/knative/README.md | 2 +- tests/airgapped/pipelines/README.md | 2 +- tests/airgapped/setup/setup.sh | 2 + tests/airgapped/training/README.md | 2 +- 16 files changed, 343 insertions(+), 212 deletions(-) create mode 100644 scripts/README.md delete mode 100644 scripts/airgapped/get-all-images.py create mode 100755 scripts/get-all-images.py create mode 100644 scripts/requirements.txt create mode 100644 tests/airgapped/ckf-1.8-testing-images.txt diff --git a/.github/workflows/scan-images.yaml b/.github/workflows/scan-images.yaml index 76632a6a..d64c5fcf 100644 --- a/.github/workflows/scan-images.yaml +++ b/.github/workflows/scan-images.yaml @@ -15,22 +15,8 @@ jobs: - releases/1.8/stable/kubeflow - releases/1.9/stable - releases/latest/edge - runs-on: ubuntu-24.04 + runs-on: [self-hosted, linux, X64, jammy, large] steps: - # Ideally we'd use self-hosted runners, but this effort is still not stable - # This action will remove unused software (dotnet, haskell, android libs, codeql, - # and docker images) from the GH runner, which will liberate around 60 GB of storage - # distributed in 40GB for root and around 20 for a mnt point. 
- - name: Maximise GH runner space - uses: easimon/maximize-build-space@v7 - with: - root-reserve-mb: 29696 - remove-dotnet: 'true' - remove-haskell: 'true' - remove-android: 'true' - remove-codeql: 'true' - remove-docker-images: 'true' - - name: Checkout uses: actions/checkout@v3 with: @@ -42,11 +28,6 @@ jobs: sudo snap install yq echo "date=$(date '+%Y-%m-%d-%H-%M-%S')" >> $GITHUB_OUTPUT - - name: Set up Python ${{ matrix.python-version }} - uses: actions/setup-python@v5 - with: - python-version: 3.12 - - name: Checkout kubeflow-ci uses: actions/checkout@v3 with: @@ -63,8 +44,8 @@ jobs: RELEASE=${BUNDLE_SPLIT[1]} RISK=${BUNDLE_SPLIT[2]} - pip3 install -r scripts/airgapped/requirements.txt - python3 scripts/airgapped/get-all-images.py ${{ matrix.bundle }}/bundle.yaml > image_list.txt + pip3 install -r scripts/requirements.txt + python3 scripts/get-all-images.py ${{ matrix.bundle }}/bundle.yaml > image_list.txt echo "Image list:" cat ./image_list.txt echo "release_risk=${RELEASE}-${RISK}" >> $GITHUB_OUTPUT diff --git a/scripts/README.md b/scripts/README.md new file mode 100644 index 00000000..c44953dd --- /dev/null +++ b/scripts/README.md @@ -0,0 +1,26 @@ +# Utility Scripts + +This directory contains helper scripts for Charmed Kubeflow, used both during CI and beyond. + +## Gather images used by a bundle + +You can get a list of all the OCI images used by the bundle by running the following command: +```bash +pip3 install -r scripts/requirements.txt + +python3 scripts/get-all-images.py \ + --append-images tests/airgapped/ckf-1.8-testing-images.txt \ + releases/1.8/stable/kubeflow/bundle.yaml \ + > images-all.txt +``` + +The script will gather the images in the following way: +1. For each `application` in the provided `bundle.yaml` file: +2. detect if it's owned by us or another team (by looking at the `_github_dependency_repo_name` and similar metadata) +3. clone its repo, by looking at `_github_repo_name` and similar metadata +4. 
If owned by another team: only parse its `metadata.yaml` and look for `oci-resources` +5. If owned by us: run the `tools/get-images.sh` script the repo **must** have +6. If a repo does not have `tools/get-images.sh` (e.g. kubeflow-roles) then the script should skip the repo +7. If the `get-images.sh` script either fails (non-zero return code) or has error logs then the script should **fail** +8. Aggregate the outputs of all `get-images.sh` scripts into one output +9. If the user passed an `--append-images` argument then the script will append the list of images we need for airgap testing diff --git a/scripts/airgapped/README.md b/scripts/airgapped/README.md index a3fec430..82bdf7c1 100644 --- a/scripts/airgapped/README.md +++ b/scripts/airgapped/README.md @@ -7,12 +13 @@ to create airgap artifacts or via our testing scripts. We'll document some use-case scenarios here for the different scripts. ## Prerequisites +NOTE: All the commands are expected to be run from the root directory of the repo. To use the scripts in this directory you'll need to install a couple of Python and Ubuntu packages on the host machine, driving the test (not the LXC machine that will contain the airgapped environment). 
``` -pip3 install -r requirements.txt +pip3 install -r scripts/airgapped/requirements.txt sudo apt install pigz sudo snap install docker sudo snap install yq @@ -32,9 +33,10 @@ This script makes the following assumptions: the images for that repo ```bash -python3 scripts/airgapped/get-all-images.py \ +python3 scripts/get-all-images.py \ + --append-images=tests/airgapped/ckf-1.8-testing-images.txt \ releases/1.8/stable/kubeflow/bundle.yaml \ - --airgap-testing > images.txt + > images.txt ``` ## Pull images to docker cache diff --git a/scripts/airgapped/get-all-images.py b/scripts/airgapped/get-all-images.py deleted file mode 100644 index 0948b3de..00000000 --- a/scripts/airgapped/get-all-images.py +++ /dev/null @@ -1,177 +0,0 @@ -import argparse -import logging -import shutil -import subprocess - -import git -import yaml - -# logging -LOG_FORMAT = "%(levelname)s \t| %(message)s" -logging.basicConfig(format=LOG_FORMAT, level=logging.INFO) -log = logging.getLogger(__name__) - -# consts -EXCLUDE_REPOS = ["kubeflow-roles"] - -GH_REPO_KEY = "_github_repo_name" -GH_BRANCH_KEY = "_github_repo_branch" -GH_DEPENDENCY_REPO_KEY = "_github_dependency_repo_name" -GH_DEPENDENCY_BRANCH_KEY = "_github_dependency_repo_branch" -GET_IMAGES_SH = "tools/get-images.sh" - -AIRGAP_TESTING_IMAGES = [ - "charmedkubeflow/pipelines-runner:ckf-1.8", - "docker.io/kubeflowkatib/simple-pbt:v0.16.0", - "ghcr.io/knative/helloworld-go:latest", - "gcr.io/kubeflow-ci/tf-mnist-with-summaries:1.0", -] - - -def is_dependency_app(app: dict) -> bool: - """Detect if app in bundle is not owned by Analytics team.""" - if GH_DEPENDENCY_REPO_KEY in app and GH_DEPENDENCY_BRANCH_KEY in app: - return True - - return False - - -def bundle_app_contains_gh_metadata(app: dict) -> bool: - """ - Given an application in a bundle check if it contains github metadata keys - to be able to be parsed properly. 
- """ - if is_dependency_app(app): - return True - - if GH_REPO_KEY in app and GH_BRANCH_KEY in app: - return True - - return False - - -def clone_repo(repo_name: str, branch: str) -> str: - """Clones locally a repo and returns the folder created.""" - repo_url = f"https://github.com/canonical/{repo_name}.git" - - logging.info(f"Cloning repo {repo_url}") - repo = git.Repo.clone_from(repo_url, str(repo_name)) - - logging.info(f"Checking out to branch {branch}") - repo.git.checkout(branch) - - return repo_name - - -def get_analytics_app_images(app: dict) -> list[str]: - """ - This function gets the images used by a charm developed by us, by: - 1. Cloning the repo of the charm - 2. Running the repo's tools/get-images.sh - 3. Delete the repo - - If the tools/get-images.sh of a repo fails for any reason then this - script will also fail. - """ - images = [] - repo_dir = clone_repo(app[GH_REPO_KEY], app[GH_BRANCH_KEY]) - - logging.info(f"Executing repo's {GET_IMAGES_SH} script") - process = subprocess.Popen(["bash", "tools/get-images.sh"], cwd=repo_dir, - stdout=subprocess.PIPE, stderr=subprocess.PIPE) - stdout, stderr = process.communicate() - if stderr: - raise ValueError("Script '%s' for charm '%s' had error logs: \n%s" % - (GET_IMAGES_SH, app["charm"], stderr.decode("utf-8")) - ) - images = stdout.decode("utf-8").split("\n") - - # cleanup - shutil.rmtree(repo_dir) - return images - - -def get_dependency_app_images(app: dict) -> list[str]: - """ - This function gets the images used by a dependency charm by: - 1. Cloning the repo of the charm - 2. Looking at its metadata.yaml for "upstream-source" keys - 3. 
Delete the repo - """ - images = [] - repo_dir = clone_repo(app[GH_DEPENDENCY_REPO_KEY], - app[GH_DEPENDENCY_BRANCH_KEY]) - with open("%s/metadata.yaml" % repo_dir, 'r') as metadata_file: - metadata_dict = yaml.safe_load(metadata_file) - - for _, rsrc in metadata_dict["resources"].items(): - if "type" in rsrc and rsrc["type"] == "oci-image": - img = rsrc["upstream-source"] - logging.info("Found image %s" % img) - images.append(img) - - # cleanup - shutil.rmtree(repo_dir) - return images - - -def get_bundle_images(bundle_dict: dict) -> list[str]: - """Return a list of images used by a bundle""" - images = [] - - for app_name, app in bundle_dict["applications"].items(): - logging.info(f"Handling app {app_name}") - - # exclude repos we know don't have images - if app_name in EXCLUDE_REPOS: - logging.info("Ignoring charm %s", app["charm"]) - continue - - # Follow default image-gather process for dependency apps - if is_dependency_app(app): - logging.info("Dependency app '%s' with charm '%s'", app_name, - app["charm"]) - images.extend(get_dependency_app_images(app)) - continue - - # Ensure analytics app has necessary github repo/branch metadata - if not bundle_app_contains_gh_metadata(app): - raise KeyError( - "Application '%s' doesn't include expected gh metadata keys" % - app_name - ) - - images.extend(get_analytics_app_images(app)) - - return images - - -if __name__ == "__main__": - parser = argparse.ArgumentParser( - description="Gather all images from a bundle" - ) - parser.add_argument("bundle") - parser.add_argument("--airgap-testing", action="store_true") - - args = parser.parse_args() - - bundle_dict = {} - with open(args.bundle, 'r') as file: - bundle_dict = yaml.safe_load(file) - - # keep unique images and sort them - images_set = set(get_bundle_images(bundle_dict)) - if '' in images_set: - images_set.remove('') - - images = list(images_set) - images.sort() - - # append the airgap images - if args.airgap_testing: - images.extend(AIRGAP_TESTING_IMAGES) - - 
logging.info(f"Found {len(images)} different images") - - for img in images: - print(img) diff --git a/scripts/airgapped/requirements.txt b/scripts/airgapped/requirements.txt index 96aeb7aa..751a4b38 100644 --- a/scripts/airgapped/requirements.txt +++ b/scripts/airgapped/requirements.txt @@ -1,5 +1,3 @@ docker -#FIXME: remove requests pin when https://github.com/docker/docker-py/issues/3256 is solved -requests<2.32.0 +requests PyYAML -gitpython diff --git a/scripts/get-all-images.py b/scripts/get-all-images.py new file mode 100755 index 00000000..08d4e983 --- /dev/null +++ b/scripts/get-all-images.py @@ -0,0 +1,268 @@ +#!/usr/bin/env python3 + +import argparse +import logging +import subprocess +import os +import sys +import contextlib +import tempfile + +import git +import yaml + +from typing import Iterator +from pathlib import Path + +# logging +LOG_FORMAT = "%(levelname)s \t| %(message)s" +logging.basicConfig(format=LOG_FORMAT, level=logging.INFO) +log = logging.getLogger(__name__) + +# consts +EXCLUDE_CHARMS = ["kubeflow-roles"] + +GH_REPO_KEY = "_github_repo_name" +GH_BRANCH_KEY = "_github_repo_branch" +GH_DEPENDENCY_REPO_KEY = "_github_dependency_repo_name" +GH_DEPENDENCY_BRANCH_KEY = "_github_dependency_repo_branch" +GET_IMAGES_SH = "tools/get-images.sh" + + +def is_dependency_app(app: dict) -> bool: + """ + Return True if app in bundle is not owned by Analytics team, False + otherwise. + + Args: + app(dict): app metadata from a bundle.yaml in dictionary form + + Returns: + True if app has GH dependency metadata, else False + """ + if GH_DEPENDENCY_REPO_KEY in app and GH_DEPENDENCY_BRANCH_KEY in app: + return True + + return False + + +def bundle_app_contains_gh_metadata(app: dict) -> bool: + """ + Given an application in a bundle check if it contains github metadata keys + to be able to be parsed properly. 
+ + Args: + app(dict): app metadata from a bundle.yaml in dictionary form + + Returns: + True if app has GH metadata, else False + """ + if is_dependency_app(app): + return True + + if GH_REPO_KEY in app and GH_BRANCH_KEY in app: + return True + + return False + + +def validate_bundle(bundle: dict): + """ + Given a bundle, parse all the applications and ensure they contain the + correct metadata. + + Args: + bundle: Dictionary of the loaded bundle + """ + for app_name, app in bundle["applications"].items(): + if bundle_app_contains_gh_metadata(app): + continue + + logging.error("Application '%s' doesn't include expected gh metadata keys.", + app_name) + sys.exit(1) + + +@contextlib.contextmanager +def clone_git_repo(repo_name: str, branch: str) -> Iterator[git.PathLike]: + """ + Clone a repo locally and return the path of the created folder. + + Args: + repo_name(str): name of the repo to clone + branch(str): branch to check out once the repo is cloned + """ + repo_url = f"https://github.com/canonical/{repo_name}.git" + + # we can't use the default /tmp/ dir because of + # https://github.com/mikefarah/yq/issues/1808 + with tempfile.TemporaryDirectory(dir=os.getcwd()) as tmp: + logging.info(f"Cloning repo {repo_url}") + repo = git.Repo.clone_from(repo_url, tmp) + + logging.info(f"Checking out to branch {branch}") + repo.git.checkout(branch) + + yield repo.working_dir + + +def get_analytics_app_images(app: dict) -> list[str]: + """ + This function gets the images used by a charm developed by us, by: + 1. Cloning the repo of the charm + 2. Running the repo's tools/get-images.sh + 3. Deleting the repo + + If the tools/get-images.sh of a repo fails for any reason then this + script will also fail. 
+ """ + images = [] + repo_name = app[GH_REPO_KEY] + repo_branch = app[GH_BRANCH_KEY] + + with clone_git_repo(repo_name, repo_branch) as repo_dir: + logging.info(f"Executing repo's {GET_IMAGES_SH} script") + try: + process = subprocess.run(["bash", "tools/get-images.sh"], + cwd=repo_dir, capture_output=True, + text=True, check=True) + except subprocess.CalledProcessError as exc: + logging.error("Script '%s' for charm '%s' failed: %s", + GET_IMAGES_SH, app["charm"], exc.stderr) + raise exc + + images = process.stdout.strip().split("\n") + + logging.info("Found the following images:") + for image in images: + logging.info("* " + image) + + return images + + +def get_dependency_app_images(app: dict) -> list[str]: + """ + This function gets the images used by a dependency charm by: + 1. Cloning the repo of the charm + 2. Looking at its metadata.yaml for "upstream-source" keys + 3. Delete the repo + """ + images = [] + repo_name = app[GH_DEPENDENCY_REPO_KEY] + repo_branch = app[GH_DEPENDENCY_BRANCH_KEY] + with clone_git_repo(repo_name, repo_branch) as repo_dir: + metatada_file = f"{repo_dir}/metadata.yaml" + metadata_dict = yaml.safe_load(Path(metatada_file).read_text()) + + for _, rsrc in metadata_dict["resources"].items(): + if rsrc.get("type", "") != "oci-image": + continue + + images.append(rsrc["upstream-source"]) + + logging.info("Found the following images:") + for image in images: + logging.info("* " + image) + + return images + + +def cleanup_images(images: list[str]) -> list[str]: + """ + Given a list of OCI registry images ensure + 1. there are no duplicates + 2. there are no images with empty name + 3. the list is sorted + + Args: + images: List of images to be processed + + Returns: + A list with unique and sorted values. 
+ """ + images_set = set(images) + if "" in images_set: + images_set.remove('') + + unique_images = list(images_set) + unique_images.sort() + + return unique_images + + +def get_bundle_images(bundle_path: str) -> list[str]: + """Return a list of images used by a bundle""" + bundle_dict = yaml.safe_load(Path(bundle_path).read_text()) + validate_bundle(bundle_dict) + + images = [] + + for app_name, app in bundle_dict["applications"].items(): + logging.info(f"Handling app {app_name}") + + # exclude repos we know don't have images + # if we find we keep extending this const, we should introduce an + # argument in the script for dynamically exluding repos/charms + if app_name in EXCLUDE_CHARMS: + logging.info("Ignoring charm %s", app["charm"]) + continue + + # Follow default image-gather process for dependency apps + if is_dependency_app(app): + logging.info("Dependency app '%s' with charm '%s'", app_name, + app["charm"]) + images.extend(get_dependency_app_images(app)) + continue + + # image from analytics team + images.extend(get_analytics_app_images(app)) + + return cleanup_images(images) + + +def get_static_images_from_file(images_file_path: str) -> list[str]: + """ + Return a list of images stored in a text file and separated by \n. 
+ + Args: + images_file_path: Path of the file containing images + + Returns: + List of strings containing the images in the file + """ + with open(images_file_path, "r") as file: + images = file.readlines() + + cleaned_images = [image.strip() for image in images] + for image in cleaned_images: + logging.info(image) + + return cleaned_images + + +def main(): + parser = argparse.ArgumentParser( + description="Gather all images from a bundle" + ) + parser.add_argument("bundle") + parser.add_argument("--append-images", + help="Append the list of images from the input file.") + + args = parser.parse_args() + + images = get_bundle_images(args.bundle) + + # append the airgap images + if args.append_images: + logging.info("Appending images found in file '%s'", args.append_images) + extra_images = get_static_images_from_file(args.append_images) + images.extend(extra_images) + + logging.info(f"Found {len(images)} different images") + + for img in images: + print(img) + + +if __name__ == "__main__": + main() diff --git a/scripts/requirements.txt b/scripts/requirements.txt new file mode 100644 index 00000000..8fe79fa6 --- /dev/null +++ b/scripts/requirements.txt @@ -0,0 +1,5 @@ +gitpython +tenacity +boto3 +click +pyyaml diff --git a/tests/airgapped/README.md b/tests/airgapped/README.md index fa36c7a7..721ebf3f 100644 --- a/tests/airgapped/README.md +++ b/tests/airgapped/README.md @@ -7,6 +7,20 @@ The scripts for testing an airgapped installation have the following requirement Additionally, the host machine will use Docker the pull all the images needed, and LXC to create the airgapped container. +## Update testing images + +The scripts that set up and test the airgapped environment will also need to load +a list of predefined images for running the tests. + +Every release has its own set of images. If you are creating a new release, you'll +need to create a new directory and the corresponding images file. 
+ +You can find such files under `tests/airgapped//testing-images.txt` + +To understand how those images are used by the tests, please take a look +at the +[Test Charmed Kubeflow components in airgapped](#test-charmed-kubeflow-components-in-airgapped) section. + ## Setup the environment This repository contains a script for setting up the environment: @@ -25,6 +39,7 @@ You can run the script that will spin up an airgapped microk8s cluster with: --node-name airgapped-microk8s \ --microk8s-channel 1.29-strict/stable \ --bundle-path releases/1.9/stable/bundle.yaml \ + --testing-images-path tests/airgapped/ckf-1.9-testing-images.txt \ --juju-channel 3.4/stable ``` @@ -128,3 +143,6 @@ To test Charmed Kubeflow components in airgapped, follow the instructions in the * [KNative](./knative/README.md) * [Pipelines](./pipelines/README.md) * [Training Operator](./training/README.md) + +Make sure to follow the first part of this guide on updating the OCI images that need to be present +in the airgapped cluster in order to execute the tests. 
diff --git a/tests/airgapped/airgap.sh b/tests/airgapped/airgap.sh index 661293d2..62fbe598 100755 --- a/tests/airgapped/airgap.sh +++ b/tests/airgapped/airgap.sh @@ -125,6 +125,7 @@ DISTRO="${DISTRO:-"ubuntu:22.04"}" MICROK8S_CHANNEL="${MICROK8S_CHANNEL:-}" JUJU_CHANNEL="${JUJU_CHANNEL:-"2.9/stable"}" BUNDLE_PATH="${BUNDLE_PATH:-"releases/latest/edge/bundle.yaml"}" +TESTING_IMAGES_PATH="${TESTING_IMAGES_PATH:-"tests/airgapped/ckf-1.8-testing-images.txt"}" LIBRARY_MODE=false @@ -136,6 +137,7 @@ while true; do --microk8s-channel ) MICROK8S_CHANNEL="$2"; shift 2 ;; --juju-channel) JUJU_CHANNEL="$2"; shift 2 ;; --bundle-path) BUNDLE_PATH="$2"; shift 2 ;; + --testing-images-path) TESTING_IMAGES_PATH="$2"; shift 2 ;; -h | --help ) prog=$(basename -s.wrapper "$0") echo "Usage: $prog [options...]" @@ -158,7 +160,7 @@ done if [ "$LIBRARY_MODE" == "false" ]; then echo "1/X -- (us) Create images tar.gz" - create_images_tar "$BUNDLE_PATH" + create_images_tar "$BUNDLE_PATH" "$TESTING_IMAGES_PATH" echo "2/X -- (us) Create charms tar.gz" create_charms_tar "$BUNDLE_PATH" echo "3/X -- (client) Setup K8s cluster (MicroK8s)" diff --git a/tests/airgapped/ckf-1.8-testing-images.txt b/tests/airgapped/ckf-1.8-testing-images.txt new file mode 100644 index 00000000..6a944ee0 --- /dev/null +++ b/tests/airgapped/ckf-1.8-testing-images.txt @@ -0,0 +1,4 @@ + charmedkubeflow/pipelines-runner:ckf-1.8 + docker.io/kubeflowkatib/simple-pbt:v0.16.0 + ghcr.io/knative/helloworld-go:latest + gcr.io/kubeflow-ci/tf-mnist-with-summaries:1.0 diff --git a/tests/airgapped/ckf.sh b/tests/airgapped/ckf.sh index 80d6e3b9..23357a91 100644 --- a/tests/airgapped/ckf.sh +++ b/tests/airgapped/ckf.sh @@ -7,6 +7,7 @@ function create_images_tar() { local BUNDLE_PATH=$1 + local TESTING_IMAGES_PATH=$2 if [ -f "images.tar.gz" ]; then echo "images.tar.gz exists. Will not recreate it." 
@@ -16,9 +17,10 @@ function create_images_tar() { pip3 install -r scripts/airgapped/requirements.txt echo "Generating list of images of Charmed Kubeflow" - python3 scripts/airgapped/get-all-images.py \ + python3 scripts/get-all-images.py \ + --append-images "$TESTING_IMAGES_PATH" \ "$BUNDLE_PATH" \ - --airgap-testing > images.txt + > images.txt echo "Using produced list to load it into our machine's docker cache" python3 scripts/airgapped/save-images-to-cache.py images.txt diff --git a/tests/airgapped/katib/README.md b/tests/airgapped/katib/README.md index 2b9782c9..eceb91cc 100644 --- a/tests/airgapped/katib/README.md +++ b/tests/airgapped/katib/README.md @@ -6,7 +6,7 @@ This directory is dedicated to testing Katib in an airgapped environment. Prepare the airgapped environment and deploy CKF by following the steps in [Airgapped test scripts](https://github.com/canonical/bundle-kubeflow/tree/main/tests/airgapped#testing-airgapped-installation). -Once you run the test scripts, the `kubeflowkatib/simple-pbt:v0.16.0` image used in the `simple-pbt` experiment will be included in your airgapped environment. It's specifically added in the [`get-all-images.py` script](../../../scripts/airgapped/get-all-images.py). +Once you run the test scripts, the `kubeflowkatib/simple-pbt:v0.16.0` image used in the `simple-pbt` experiment will be included in your airgapped environment. It's specifically added in the [`get-all-images.py` script](../../../scripts/get-all-images.py). ## How to test Katib in an Airgapped environment 1. Connect to the dashboard by visiting the IP of your airgapped VM. To get the IP run: diff --git a/tests/airgapped/knative/README.md b/tests/airgapped/knative/README.md index 665cc48d..6f7bab2f 100644 --- a/tests/airgapped/knative/README.md +++ b/tests/airgapped/knative/README.md @@ -6,7 +6,7 @@ This directory is dedicated to testing Knative in an airgapped environment. 
Prepare the airgapped environment and deploy CKF by following the steps in [Airgapped test scripts](https://github.com/canonical/bundle-kubeflow/tree/main/tests/airgapped#testing-airgapped-installation). -Once you run the test scripts, the `knative/helloworld-go` image used in the `helloworld` example will be included in your airgapped environment. It's specifically added in the [`get-all-images.py` script](../../../scripts/airgapped/get-all-images.py). +Once you run the test scripts, the `knative/helloworld-go` image used in the `helloworld` example will be included in your airgapped environment. It's specifically added in the [`get-all-images.py` script](../../../scripts/get-all-images.py). ## How to test Knative in an Airgapped environment 1. Connect to the dashboard by visiting the IP of your airgapped VM. To get the IP run: diff --git a/tests/airgapped/pipelines/README.md b/tests/airgapped/pipelines/README.md index f6b1112d..a4e45c46 100644 --- a/tests/airgapped/pipelines/README.md +++ b/tests/airgapped/pipelines/README.md @@ -3,7 +3,7 @@ ## The `kfp-airgapped-ipynb` Notebook To test Pipelines in Airgapped, we are using the Notebook in this directory. It contains the Data passing pipeline example, with the configuration of the Pipeline components to use the `pipelines-runner` [image](./pipelines-runner/README.md). -The `pipelines-runner` image will be included in your airgapped environment given that you used the [Airgapped test scripts](../README.md). It's specifically added in the [`get-all-images.py` script](../../../scripts/airgapped/get-all-images.py). +The `pipelines-runner` image will be included in your airgapped environment given that you used the [Airgapped test scripts](../README.md). It's specifically added in the [`get-all-images.py` script](../../../scripts/get-all-images.py). ## How to test Pipelines in an Airgapped environment 1. Prepare the airgapped environment and Deploy CKF by following the steps in [Airgapped test scripts](../README.md). 
diff --git a/tests/airgapped/setup/setup.sh b/tests/airgapped/setup/setup.sh index a8ae032e..0f14238a 100755 --- a/tests/airgapped/setup/setup.sh +++ b/tests/airgapped/setup/setup.sh @@ -7,4 +7,6 @@ cat tests/airgapped/lxd.profile | lxd init --preseed ./scripts/airgapped/prerequisites.sh ./tests/airgapped/setup/lxd-docker-networking.sh +pip3 install -r ./scripts/requirements.txt + echo "Setup completed. Reboot your machine before running the tests for the docker commands to run without sudo." diff --git a/tests/airgapped/training/README.md b/tests/airgapped/training/README.md index a95c3daa..752a78b9 100644 --- a/tests/airgapped/training/README.md +++ b/tests/airgapped/training/README.md @@ -6,7 +6,7 @@ This directory is dedicated to testing training operator in an airgapped environ Prepare the airgapped environment and deploy CKF by following the steps in [Airgapped test scripts](../README.md#testing-airgapped-installation). -Once you run the test scripts, the `kubeflow-ci/tf-mnist-with-summaries:1.0` image used in the `tfjob-simple` training job will be included in your airgapped environment. It's specifically added in the [`get-all-images.py` script](../../../scripts/airgapped/get-all-images.py). +Once you run the test scripts, the `kubeflow-ci/tf-mnist-with-summaries:1.0` image used in the `tfjob-simple` training job will be included in your airgapped environment. It's specifically added in the [`get-all-images.py` script](../../../scripts/get-all-images.py). ## How to test training operator in an Airgapped environment 1. Connect to the dashboard by visiting the IP of your airgapped VM. To get the IP run: