ci: add workflow to check for vulnerabilities in images #381

Merged · 1 commit · Feb 6, 2024
148 changes: 148 additions & 0 deletions .github/workflows/sec-scan.yml
@@ -0,0 +1,148 @@
---
# The aim of this GitHub workflow is to update `ci/security-scan/security_scan_results.md` with the latest security scan results.
name: Update notebook image security reports
on:
workflow_dispatch:
inputs:
branch:
required: true
description: "Provide the name of the branch you want to update (e.g. main, vYYYYx)"
schedule:
- cron: "0 0 */21 * 5" # at 00:00 on the 1st and 22nd of the month and on every Friday (cron ORs the two day fields); see the editor's note after this file
env:
SEC_SCAN_BRANCH: sec-scan-${{ github.run_id }}
BRANCH_NAME: main
RELEASE_VERSION_N: 2023b
RELEASE_VERSION_N_1: 2023a
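  # Editor's note: "N" is the current release branch (2023b) and "N - 1" the
  # previous release (2023a); main is scanned as well, via the
  # LATEST_MAIN_COMMIT hash computed in the check-vulnerabilities job below.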
jobs:
initialize:
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- name: Install Skopeo CLI
shell: bash
run: |
sudo apt-get -y update
sudo apt-get -y install skopeo
# Checkout the branch
- name: Checkout branch
uses: actions/checkout@v3
with:
ref: ${{ env.BRANCH_NAME }}

# Create a new branch
- name: Create a new branch
run: |
echo ${{ env.SEC_SCAN_BRANCH }}
git checkout -b ${{ env.SEC_SCAN_BRANCH }}
git push --set-upstream origin ${{ env.SEC_SCAN_BRANCH }}
check-vulnerabilities:
needs: [initialize]
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- name: Configure Git
run: |
git config --global user.email "github-actions[bot]@users.noreply.github.com"
git config --global user.name "GitHub Actions"
# Get the latest weekly build commit hash: https://github.com/opendatahub-io/notebooks/commits/2023b
- name: Checkout upstream notebooks repo
uses: actions/checkout@v3
with:
repository: opendatahub-io/notebooks.git
ref: ${{ env.RELEASE_VERSION_N }}

- name: Retrieve latest weekly commit hash from the "N" branch
id: hash-n
shell: bash
run: |
echo "HASH_N=$(git rev-parse --short HEAD)" >> ${GITHUB_OUTPUT}
- name: Checkout "N - 1" branch
uses: actions/checkout@v3
with:
repository: opendatahub-io/notebooks.git
ref: ${{ env.RELEASE_VERSION_N_1 }}

- name: Retrieve latest weekly commit hash from the "N - 1" branch
id: hash-n-1
shell: bash
run: |
echo "HASH_N_1=$(git rev-parse --short HEAD)" >> ${GITHUB_OUTPUT}
- name: Checkout "main" branch
uses: actions/checkout@v3
with:
repository: opendatahub-io/notebooks.git
ref: main

- name: Retrieve latest weekly commit hash from the "main" branch
id: hash-main
shell: bash
run: |
echo "LATEST_MAIN_COMMIT=$(git rev-parse --short HEAD)" >> ${GITHUB_OUTPUT}
# Checkout the release branch to apply the updates
- name: Checkout release branch
uses: actions/checkout@v3
with:
ref: ${{ env.SEC_SCAN_BRANCH }}

- name: setup python
uses: actions/setup-python@v4
with:
python-version: '3.10' # install the python version needed

- name: install python packages
run: |
python -m pip install --upgrade pip
pip install requests
- name: execute py script # runs ci/security-scan/quay_security_analysis.py
env:
HASH_N: ${{ steps.hash-n.outputs.HASH_N }}
RELEASE_VERSION_N: ${{ env.RELEASE_VERSION_N }}

HASH_N_1: ${{ steps.hash-n-1.outputs.HASH_N_1 }}
RELEASE_VERSION_N_1: ${{ env.RELEASE_VERSION_N_1 }}

LATEST_MAIN_COMMIT: ${{ steps.hash-main.outputs.LATEST_MAIN_COMMIT }}
run: make scan-image-vulnerabilities

- name: Push the files
run: |
git fetch origin ${{ env.SEC_SCAN_BRANCH }} && git pull origin ${{ env.SEC_SCAN_BRANCH }} && git add . && git commit -m "Update security scans" && git push origin ${{ env.SEC_SCAN_BRANCH }}
# Creates the Pull Request
open-pull-request:
needs: [check-vulnerabilities]
runs-on: ubuntu-latest
permissions:
pull-requests: write
steps:
- name: Checkout repo
uses: actions/checkout@v3

- name: pull-request
uses: repo-sync/pull-request@v2
with:
source_branch: ${{ env.SEC_SCAN_BRANCH }}
destination_branch: ${{ env.BRANCH_NAME }}
github_token: ${{ secrets.GITHUB_TOKEN }}
pr_label: "automated pr"
pr_title: "[Security Scanner Action] Weekly update of security vulnerabilities reported by Quay"
pr_body: |
:rocket: This is an automated Pull Request.
This PR updates:
* `ci/security-scan/security_scan_results.md` file with the latest security vulnerabilities reported by Quay.
* `ci/security-scan/weekly_commit_ids.env` with the latest SHA digests of the notebooks (N & N-1)
Created by `/.github/workflows/sec-scan.yml`
:exclamation: **IMPORTANT NOTE**: Remember to delete the `${{ env.SEC_SCAN_BRANCH }}` branch after merging the changes
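Editor's note on the workflow's `schedule:` trigger: standard cron ORs the day-of-month and day-of-week fields, so `"0 0 */21 * 5"` fires at 00:00 on the 1st and 22nd of each month and on every Friday, not every third Friday. A quick check, as a sketch using the third-party croniter package (not part of this PR):

```python
# Enumerate the next firings of the workflow's cron expression.
# Requires: pip install croniter
from datetime import datetime

from croniter import croniter

it = croniter("0 0 */21 * 5", datetime(2024, 2, 1))
for _ in range(6):
    # Prints dates falling on the 1st, the 22nd, or a Friday.
    print(it.get_next(datetime))
```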
jiridanek (Member) commented on Jan 31, 2024:

Consider enabling https://docs.github.com/en/repositories/configuring-branches-and-merges-in-your-repository/configuring-pull-request-merges/managing-the-automatic-deletion-of-branches in the repository settings. Auto-deleted branches can be restored with the push of a button if still needed later.

The author (Contributor) replied:

Thanks for the suggestion. We can combine it with automating branch deletion for this workflow as well.
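As a follow-up to the discussion above, a minimal sketch (editor's illustration, not part of this PR) of deleting the scan branch via the GitHub REST API once the PR is merged; the repository coordinates and token handling here are assumptions:

```python
# Sketch: delete a merged security-scan branch via the GitHub REST API.
import os

import requests

def delete_branch(owner: str, repo: str, branch: str, token: str) -> None:
    # DELETE /repos/{owner}/{repo}/git/refs/heads/{branch}
    url = f"https://api.github.com/repos/{owner}/{repo}/git/refs/heads/{branch}"
    resp = requests.delete(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()

if __name__ == "__main__":
    # The branch name follows the workflow's sec-scan-<run_id> pattern.
    delete_branch("opendatahub-io", "notebooks", "sec-scan-1234567890",
                  os.environ["GITHUB_TOKEN"])
```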

7 changes: 6 additions & 1 deletion Makefile
@@ -468,4 +468,9 @@ refresh-pipfilelock-files:
cd runtimes/tensorflow/ubi8-python-3.8 && pipenv lock
cd runtimes/tensorflow/ubi9-python-3.9 && pipenv lock
cd base/c9s-python-3.9 && pipenv lock


# This is only for the workflow action
# For running manually, set the required environment variables (see the editor's sketch after this file)
.PHONY: scan-image-vulnerabilities
scan-image-vulnerabilities:
python ci/security-scan/quay_security_analysis.py
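To run this target outside CI, export the environment variables the script reads. A minimal sketch (editor's illustration; the hash values are placeholders):

```python
# Run the scan target locally with the environment the script expects.
import os
import subprocess

env = dict(
    os.environ,
    LATEST_MAIN_COMMIT="abc1234",  # short hash of the latest main commit
    RELEASE_VERSION_N="2023b",
    HASH_N="def5678",              # short hash of the latest 2023b commit
    RELEASE_VERSION_N_1="2023a",
    HASH_N_1="0a1b2c3",            # short hash of the latest 2023a commit
)
subprocess.run(["make", "scan-image-vulnerabilities"], env=env, check=True)
```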
182 changes: 182 additions & 0 deletions ci/security-scan/quay_security_analysis.py
@@ -0,0 +1,182 @@
import os
import subprocess
import re
from datetime import date
import requests
from collections import Counter
import fileinput

branch_dictionary = {}

commit_id_path = "ci/security-scan/weekly_commit_ids.env"

IMAGES_MAIN = [
"odh-minimal-notebook-image-main",
"odh-runtime-minimal-notebook-image-main",
"odh-runtime-data-science-notebook-image-main",
"odh-minimal-gpu-notebook-image-main",
"odh-pytorch-gpu-notebook-image-main",
"odh-generic-data-science-notebook-image-main",
"odh-tensorflow-gpu-notebook-image-main",
"odh-trustyai-notebook-image-main",
"odh-habana-notebook-image-main",
"odh-codeserver-notebook-main",
"odh-rstudio-notebook-main",
"odh-rstudio-gpu-notebook-main"
]

IMAGES = [
"odh-minimal-notebook-image-n",
"odh-runtime-minimal-notebook-image-n",
"odh-runtime-data-science-notebook-image-n",
"odh-minimal-gpu-notebook-image-n",
"odh-pytorch-gpu-notebook-image-n",
"odh-runtime-pytorch-notebook-image-n",
"odh-generic-data-science-notebook-image-n",
"odh-tensorflow-gpu-notebook-image-n",
"odh-runtime-tensorflow-notebook-image-n",
"odh-trustyai-notebook-image-n",
"odh-habana-notebook-image-n",
"odh-codeserver-notebook-n",
"odh-rstudio-notebook-n",
"odh-rstudio-gpu-notebook-n"
]

IMAGES_N_1 = [
"odh-minimal-notebook-image-n-1",
"odh-runtime-minimal-notebook-image-n-1",
"odh-minimal-gpu-notebook-image-n-1",
"odh-pytorch-gpu-notebook-image-n-1",
"odh-runtime-pytorch-notebook-image-n-1",
"odh-runtime-data-science-notebook-image-n-1",
"odh-generic-data-science-notebook-image-n-1",
"odh-tensorflow-gpu-notebook-image-n-1",
"odh-runtime-tensorflow-notebook-image-n-1",
"odh-trustyai-notebook-image-n-1",
"odh-codeserver-notebook-n-1",
"odh-rstudio-notebook-n-1",
"odh-rstudio-gpu-notebook-n-1"
]

def generate_markdown_table(branch_dictionary):
markdown_data = ""
for key, value in branch_dictionary.items():
markdown_data += f"| [{key}](https://quay.io/repository/opendatahub/workbench-images/manifest/{value['sha']}?tab=vulnerabilities) |"
for severity in ['Medium', 'Low', 'Unknown', 'High', 'Critical']:
count = value.get(severity, 0) # Get count for the severity, default to 0 if not present
markdown_data += f" {count} |"
markdown_data += "\n"
return markdown_data
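# Illustration (editor's note; values invented): an entry such as
#   {"tag-2023b-20240126-abc1234": {"sha": "sha256:deadbeef...", "High": 2, "Low": 5}}
# renders as the markdown row
#   | [tag-2023b-20240126-abc1234](https://quay.io/...?tab=vulnerabilities) | 0 | 5 | 0 | 2 | 0 |
# with columns ordered Medium, Low, Unknown, High, Critical.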

def process_image(image, commit_id_path, RELEASE_VERSION_N, HASH_N):
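    # (Editor's summary, not part of the PR) Resolve the newest matching Quay
    # tag and manifest digest for `image`, fetch its vulnerability report from
    # the Quay security API, tally severity counts into branch_dictionary, and
    # rewrite the image's pinned digest in weekly_commit_ids.env.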
with open(commit_id_path, 'r') as params_file:
img_line = next(line for line in params_file if re.search(f"{image}=", line))
img = img_line.split('=')[1].strip()

registry = img.split('@')[0]
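
    # (Editor's note) The skopeo/jq pipelines below recover the source build
    # tag from the image's OPENSHIFT_BUILD_NAME env var (dropping its "-amd64"
    # suffix), then select the newest repo tag matching
    # "<src_tag>-<release>-<build>-<hash>" ("<src_tag>-[<build>-]<hash>" on main).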

src_tag_cmd = f'skopeo inspect docker://{img} | jq \'.Env[] | select(startswith("OPENSHIFT_BUILD_NAME=")) | split("=")[1]\''
src_tag = subprocess.check_output(src_tag_cmd, shell=True, text=True).strip().strip('"').replace('-amd64', '')

regex = ""

if RELEASE_VERSION_N == "":
regex = f"{src_tag}-(\\d+-)?{HASH_N}"
else:
regex = f"{src_tag}-{RELEASE_VERSION_N}-\\d+-{HASH_N}"

latest_tag_cmd = f'skopeo inspect docker://{img} | jq -r --arg regex "{regex}" \'.RepoTags | map(select(. | test($regex))) | .[0]\''
latest_tag = subprocess.check_output(latest_tag_cmd, shell=True, text=True).strip()

digest_cmd = f'skopeo inspect docker://{registry}:{latest_tag} | jq .Digest | tr -d \'"\''
digest = subprocess.check_output(digest_cmd, shell=True, text=True).strip()

if digest is None or digest == "":
return

output = f"{registry}@{digest}"

sha_ = output.split(":")[1]

url = f"https://quay.io/api/v1/repository/opendatahub/workbench-images/manifest/sha256:{sha_}/security"

response = requests.get(url)
data = response.json()

vulnerabilities = []

for feature in data['data']['Layer']['Features']:
        if len(feature['Vulnerabilities']) > 0:
for vulnerability in feature['Vulnerabilities']:
vulnerabilities.append(vulnerability)

severity_levels = [entry.get("Severity", "Unknown") for entry in vulnerabilities]
severity_counts = Counter(severity_levels)

branch_dictionary[latest_tag] = {}
    branch_dictionary[latest_tag]['sha'] = digest

for severity, count in severity_counts.items():
branch_dictionary[latest_tag][severity] = count

for line in fileinput.input(commit_id_path, inplace=True):
if line.startswith(f"{image}="):
line = f"{image}={output}\n"
print(line, end="")

today = date.today()
d2 = today.strftime("%B %d, %Y")

LATEST_MAIN_COMMIT = os.environ['LATEST_MAIN_COMMIT']

for image in IMAGES_MAIN:
    process_image(image, commit_id_path, "", LATEST_MAIN_COMMIT)

branch_main_data = generate_markdown_table(branch_dictionary)
branch_dictionary = {}

RELEASE_VERSION_N = os.environ['RELEASE_VERSION_N']
HASH_N = os.environ['HASH_N']

for image in IMAGES:
    process_image(image, commit_id_path, RELEASE_VERSION_N, HASH_N)

branch_n_data = generate_markdown_table(branch_dictionary)
branch_dictionary = {}

RELEASE_VERSION_N_1 = os.environ['RELEASE_VERSION_N_1']
HASH_N_1 = os.environ['HASH_N_1']

for image in IMAGES_N_1:
    process_image(image, commit_id_path, RELEASE_VERSION_N_1, HASH_N_1)

branch_n_1_data = generate_markdown_table(branch_dictionary)

markdown_content = """# Security Scan Results
Date: {todays_date}
# Branch main
| Image Name | Medium | Low | Unknown | High | Critical |
|------------|-------|-----|---------|------|------|
{branch_main}
# Branch N
| Image Name | Medium | Low | Unknown | High | Critical |
|------------|-------|-----|---------|------|------|
{branch_n}
# Branch N - 1
| Image Name | Medium | Low | Unknown | High | Critical |
|------------|-------|-----|---------|------|------|
{branch_n_1}
"""

final_markdown = markdown_content.format(branch_n=branch_n_data, todays_date=d2, branch_n_1=branch_n_1_data, branch_main=branch_main_data)

# Writing to the markdown file
with open("ci/security-scan/security_scan_results.md", "w") as markdown_file:
markdown_file.write(final_markdown)
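For reference, a minimal sketch (editor's illustration; field values invented) of the response shape the script consumes from the Quay security endpoint:

```python
# Shape of the Quay security API response as read by the script above.
sample_response = {
    "data": {
        "Layer": {
            "Features": [
                {
                    "Name": "openssl",  # one entry per detected package
                    "Vulnerabilities": [
                        {"Name": "CVE-2024-0000", "Severity": "High"},
                    ],
                },
            ]
        }
    }
}
```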
43 changes: 43 additions & 0 deletions ci/security-scan/weekly_commit_ids.env
@@ -0,0 +1,43 @@
odh-minimal-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:e9d6a6ee0e1ce6878d3a00dd9ffe85ffb536298ce1fe4d9c577ba0159c69a7f3
odh-minimal-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:cde20ac445d25c70d95042a546334c398ed3fca73e85530f0ffef3cbdb6ec746
odh-minimal-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:9323e689ec6ab1abb3cbdfd6258811bd57376c6b3e48f71838408cbb0b8b24a3
odh-minimal-gpu-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:07770e7eba2145309eed705261e3e295c53a05912a822bf8a64b4d284cfb79ca
odh-minimal-gpu-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:0f2f49da81f12f900579f5ccf0f1990e2ea94a2c1a2b8848dce6f9e9d2dd6d6f
odh-minimal-gpu-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:96d9273857b1ba7bb3428fc78d283e32196a0476b5fce25ed6ebf89e965b09f7
odh-pytorch-gpu-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:26238198f397dca96b72015dc25bd7fe4969bb00eb4d4cff718a32c3d8fda3fc
odh-pytorch-gpu-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:3881889e511bde525d560b7dbbd655ea7586d7bed89502d1a4ce55ac24866ab1
odh-pytorch-gpu-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:cf24bd469c283aeeeffa4ff3771ee10219f4446c4afef5f9d4c6c84c54bd81ce
odh-generic-data-science-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:702694adc61071f93c8705de61badcbecc9e248af0041f8d59fca748b6a10d8d
odh-generic-data-science-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:57d8e32ac014dc39d1912577e2decff1b10bb2f06f4293c963e687687a580b05
odh-generic-data-science-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:306084cb5de139bc01f1b72e7fd23ff3db89318094980309af6ca4103b84888f
odh-tensorflow-gpu-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:c12649d2405504afaef2c338600ac5d38a3ae104a790a9e119f61e80dfae0fad
odh-tensorflow-gpu-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:d87c30a4c41d189f24273953c60536d9a710d407289733ccc809a4f5e1549bd0
odh-tensorflow-gpu-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:cd6d8830a2f49dff70ece1686a6f17508681a850bacde4c757d497cbc59827ef
odh-trustyai-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:276f3b67b62555d746de208976d596759ccac8bd26660900c2e7185380fe043d
odh-trustyai-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:bf2087d3a1859f3bb9cd3d4636ad1507bc4b1c44f0e12aa2f95e9d50e6f8d6eb
odh-trustyai-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:5b5bae7a11f2e34b67726a86d24b8f2c35c701a48d80abbdbc91030033d2fc1f
odh-habana-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:f5237ad45f84a9adfc5e30d6fab809dcd7fd10dc9048b3c82f8dfe71d2d7eb2c
odh-habana-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:b0821ae2abe45387a371108ac08e7474b64255e5c4519de5da594b4617fd79fe
odh-codeserver-notebook-main=quay.io/opendatahub/workbench-images@sha256:2797380eaf0f05d6002e9fbb41a6a8b5368b658230ba46b07341c9c96797f591
odh-codeserver-notebook-n=quay.io/opendatahub/workbench-images@sha256:1c5bcbfc222dfb59849fee67e050719c688c93d3608f7b46edbe5666263641f3
odh-codeserver-notebook-n-1=quay.io/opendatahub/workbench-images@sha256:fd5b9f65c0f46d4c093e2f58fce305eeb125bf19ee1d88f67b9fafe56142e92d
odh-rstudio-notebook-main=quay.io/opendatahub/workbench-images@sha256:cffcf81ca0dba140d3dfc5ab452eebd6db92e55da5bdfbe3f931661489a8a596
odh-rstudio-notebook-n=quay.io/opendatahub/workbench-images@sha256:8e99e4e3800db121d02b50adec5eba27746bf89d32dba3e2b17e8d750ac53608
odh-rstudio-notebook-n-1=quay.io/opendatahub/workbench-images@sha256:75d6764e1155c1d18dc4472ff319f9291d0d9703b19ee1374e902b6ab7f55cfb
odh-rstudio-gpu-notebook-main=quay.io/opendatahub/workbench-images@sha256:41d07177990519db629796f743b6dcb663bc8090e4c8248348f746b2fa4f7dbb
odh-rstudio-gpu-notebook-n=quay.io/opendatahub/workbench-images@sha256:3ad0bb5f3b8c2ca1d29a423913b6d8f32353d9787c5f38e4b56a9a922f6e3cdd
odh-rstudio-gpu-notebook-n-1=quay.io/opendatahub/workbench-images@sha256:aef5fd12264651abf286e9a4efbe25ca002cc257fbc6f1a5daf39fd55c7d6206
odh-runtime-minimal-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:b02d8970449a48362a9f54ea563692b8d4c0e9f1f689ea1cf6bd2da18538a421
odh-runtime-minimal-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:91224cde193645c231e454bdcc25ab1aa40dd7c7bc466c87baffa8c03f5e3128
odh-runtime-minimal-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:41dd881199fd93ffccc4f00c16a69ad16f27f1e4877373ad96ff7a94b9564972
odh-runtime-data-science-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:26c4433b2869c27e59e2c9b3c693b548e6103251fb1f698d25ddf963ba8cafdf
odh-runtime-data-science-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:80be5d437517207860e454c82ba6a6d7a4555f27ccc393219c6999cb468a96ad
odh-runtime-data-science-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:a5bfdd5a783cecd9cb74b11f62259f683ecd2b9df2f681b5d84db5a5b20d8589
odh-runtime-pytorch-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:ac50e25a6fc3feaa1dccf16fb5042c5cae0972b0fa7b6eae0e7bf2afbf0f60d8
odh-runtime-pytorch-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:351be872d943f950fd1b11c0b45f6d60d60c138e40c5c49ccad14542d80f950d
odh-runtime-pytorch-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:083b81bf7a8f80bf8b1eebbb8d0ad63137c39cd91a2b9c29d76c240ce02013d9
odh-runtime-tensorflow-notebook-image-main=quay.io/opendatahub/workbench-images@sha256:de2d2d466c4de06f6edac851005749a7132b1de334506824d58a5d39b5d6d3c0
odh-runtime-tensorflow-notebook-image-n=quay.io/opendatahub/workbench-images@sha256:562a5b50afa0b3c19a8f84e66576ff1c746ac6369a168547bcc5d089ebd4ef91
odh-runtime-tensorflow-notebook-image-n-1=quay.io/opendatahub/workbench-images@sha256:162d64c8af9a3c16146c743df4db3351294c85022351388978c9649fbd12ff27

