workflow started failing this week #1016

Merged: 17 commits, Sep 18, 2024

11 changes: 8 additions & 3 deletions .github/scripts/define_versions.sh
@@ -25,11 +25,16 @@ versions+=" 1.6.16"
versions+=" 1.6.17"
versions+=" 1.6.18"
versions+=" 1.6.19"
versions+=" 1.6.20"

# future versions (release tags that are expected)
versions+=" 1.6.20"
versions+=" 1.6.21"
versions+=" 1.6.22"
versions+=" 1.7.0"
versions+=" 1.7.1"
versions+=" 1.7.2"
versions+=" 1.7.3"
versions+=" 1.7.4"
versions+=" 1.7.5"
versions+=" 1.7.6"

export versions
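
For reference, versions is exported as a single space-separated string, so a downstream step that sources this script can iterate it directly. The loop below is only a sketch of such a consumer; the tag-existence filter is a hypothetical illustration and not part of this change.

# Sketch of a consumer of define_versions.sh (hypothetical; not defined in this PR).
source ./.github/scripts/define_versions.sh
for v in ${versions}; do
    # Build docs only for tags that already exist; the "future versions" are skipped.
    if git rev-parse -q --verify "refs/tags/${v}" >/dev/null; then
        echo "building docs for ${v}"
    fi
done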

118 changes: 27 additions & 91 deletions .github/workflows/code.yml
@@ -37,78 +37,24 @@ jobs:
python_version: "3.11"
# see .ruff.toml file for options

install-catalogs:
name: Install & cache databroker catalogs
runs-on: ubuntu-latest
needs: lint
strategy:
matrix:
python-version:
- "3.11"
max-parallel: 5

steps:
- uses: actions/checkout@v4

- uses: actions/setup-python@v5
with:
python-version: "3.11"

- name: Create Python ${{ matrix.python-version }} environment
uses: mamba-org/setup-micromamba@v1
with:
cache-environment: true
cache-environment-key: env-key-${{ matrix.python-version }}
condarc: |
channel-priority: flexible
environment-file: environment.yml
environment-name: anaconda-test-env-py-${{ matrix.python-version }}

- name: Unpack
run: |
set -vxeuo pipefail
which databroker-pack
which databroker-unpack
cd resources
bash ./unpack.sh
cd ..

- name: Directory Listings
run: |
set -vxeuo pipefail
ls -lAFghR ~/.local/
ls -lAFghR /tmp/*_test/

- name: Prepare archival content
run: |
set -vxeuo pipefail
mkdir -p ~/databroker_catalogs/
mv ~/.local ~/databroker_catalogs/
mv /tmp/*_test ~/databroker_catalogs/

- name: Archive catalog artifacts
uses: actions/upload-artifact@v4
with:
name: databroker_catalogs
path: ~/databroker_catalogs

test-matrix:
name: Python ${{ matrix.python-version }}
runs-on: ubuntu-latest
needs: install-catalogs
needs: lint
strategy:
matrix:
python-version:
- "3.8"
- "3.9"
- "3.10"
- "3.11"
# - "3.12"
max-parallel: 5

steps:
- uses: actions/checkout@v4

- name: Create Python ${{ matrix.python-version }} environment
# needed for Unpack step
uses: mamba-org/setup-micromamba@v1
with:
cache-environment: true
@@ -124,18 +70,25 @@ jobs:
python=${{ matrix.python-version }}
setuptools-scm

- name: Initial diagnostics
- name: Unpack
run: |
set -vxeuo pipefail
micromamba info
micromamba list
conda config --show-sources
conda config --show
micromamba env list
printenv | sort
pip install databroker-pack
which databroker-pack
which databroker-unpack
cd resources
bash ./unpack.sh
cmd="import databroker;"
cmd+=" print(list(databroker.catalog));"
cmd+=" print(databroker.catalog_search_path());"
python -c "${cmd}"
cd ..

- name: Directories before Docker
run: ls -lAFghrt ~/
# - name: Directory Listings
# run: |
# set -vxeuo pipefail
# ls -lAFghR ~/.local/share/intake
# ls -lAFghR /tmp/*_test/

- name: Start EPICS IOCs in Docker
run: |
@@ -146,9 +99,6 @@
ls -lAFgh /tmp/docker_ioc/iocad/
ls -lAFgh /tmp/docker_ioc/iocgp/

- name: Directories after Docker
run: ls -lAFghrt ~/

- name: Confirm EPICS IOC is available via caget
shell: bash -l {0}
run: |
@@ -165,6 +115,9 @@
shell: bash -l {0}
run: |
python -c "import epics; print(epics.caget('gp:UPTIME'))"
CMD="import epics"
CMD+="; print(epics.caget('gp:UPTIME'))"
python -c "${CMD}"

- name: Confirm EPICS IOC is available via ophyd
shell: bash -l {0}
@@ -175,30 +128,13 @@
CMD+="; up.wait_for_connection()"
CMD+="; print(up.get(), pv.get())"
python -c "${CMD}"

- name: Download catalog artifacts
uses: actions/download-artifact@v4
with:
name: databroker_catalogs
path: ~/databroker_catalogs

- name: Restore archival content
run: |
set -vxeuo pipefail
mkdir -p ~/.local/share/intake
mv ~/databroker_catalogs/.local/share/intake/* ~/.local/share/intake
mv ~/databroker_catalogs/*_test /tmp/

- name: Diagnostics
shell: bash -l {0}
run: |
set -vxeuo pipefail
df -HT
micromamba list


- name: Test catalog length, expect 53
shell: bash -l {0}
run: python -c "import databroker; print(len(databroker.catalog['apstools_test']))"
run: |
CMD="import databroker"
CMD+="; print(len(databroker.catalog['apstools_test']))"
python -c "${CMD}"

- name: Run tests with pytest & coverage
shell: bash -l {0}
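
With the separate install-catalogs job and its artifact hand-off removed, the catalog setup now happens inside the test job and can be reproduced outside CI. The commands below are a sketch assembled from the workflow steps above; they assume an activated environment with databroker available and the repository's resources/unpack.sh.

# Sketch: reproduce the workflow's in-job catalog setup and length check locally.
set -vxeuo pipefail
pip install databroker-pack
cd resources
bash ./unpack.sh
cd ..
CMD="import databroker"
CMD+="; print(list(databroker.catalog))"
CMD+="; print(len(databroker.catalog['apstools_test']))"  # the workflow expects 53
python -c "${CMD}"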
1 change: 1 addition & 0 deletions CHANGES.rst
@@ -36,6 +36,7 @@ describe future plans.
-----------

* Added two PVs to PTC10 support.
* Unit tests now support Python versions 3.9, 3.10, & 3.11.

1.6.20
******
7 changes: 6 additions & 1 deletion docs/source/_static/switcher.json
@@ -5,7 +5,12 @@
"url": "https://bcda-aps.github.io/apstools/dev/"
},
{
"name": "1.6.20 (latest)",
"name": "1.7.0 (latest)",
"version": "1.7.0",
"url": "https://bcda-aps.github.io/apstools/1.7.0/"
},
{
"name": "1.6.20",
"version": "1.6.20",
"url": "https://bcda-aps.github.io/apstools/1.6.20/"
},
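
Each switcher entry carries a name, a version, and a url. A quick check that every published entry still resolves is sketched below; it is a hypothetical helper rather than part of this change set, and it assumes network access to the GitHub Pages site and that it runs from the repository root.

# Sketch: confirm each docs-switcher URL is reachable (hypothetical helper, not part of this PR).
python - <<'PY'
import json
import urllib.request

with open("docs/source/_static/switcher.json") as f:
    entries = json.load(f)

for entry in entries:
    with urllib.request.urlopen(entry["url"]) as response:
        print(entry["name"], response.status)
PY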
2 changes: 1 addition & 1 deletion environment.yml
@@ -6,7 +6,7 @@ channels:
- nodefaults

dependencies:
- python >=3.8, <=3.11
- python >=3.9
- area-detector-handlers
- bluesky >=1.6.7, !=1.11.0
- bluesky-live
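
With the upper bound on python removed, the interpreter version now comes from the CI matrix rather than from environment.yml. A local equivalent of the workflow's setup-micromamba step might look like the sketch below; micromamba is assumed to be installed, and the environment name and extra specs mirror the workflow above.

# Sketch: build the test environment locally the way the workflow does (assumptions noted above).
PY_VERSION=3.11
micromamba create --yes \
    --name "anaconda-test-env-py-${PY_VERSION}" \
    --file environment.yml \
    "python=${PY_VERSION}" setuptools-scm
# Activation needs the micromamba shell hook, e.g.: eval "$(micromamba shell hook --shell bash)"
micromamba activate "anaconda-test-env-py-${PY_VERSION}"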
11 changes: 6 additions & 5 deletions pyproject.toml
@@ -22,7 +22,7 @@ maintainers = [
{ name="Eric Codrea" },
]
readme = "README.md"
requires-python = ">=3.8"
requires-python = ">=3.9"
keywords = ["EPICS", "data acquisition", "diffraction", "NeXus", "HDF5", "SPEC", "MatPlotLib"]
license = {file = "LICENSE.txt"}
classifiers = [
Expand All @@ -33,10 +33,11 @@ classifiers = [
"License :: Public Domain",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
# "Programming Language :: Python :: 3.9",
# "Programming Language :: Python :: 3.10",
# "Programming Language :: Python :: 3.11",
# "Programming Language :: Python :: 3.12",
# "Programming Language :: Python :: 3.13",
"Topic :: Scientific/Engineering",
"Topic :: Scientific/Engineering :: Astronomy",
"Topic :: Scientific/Engineering :: Bio-Informatics",