Merge branch 'main' into add_subcoordy_api
droumis authored Aug 5, 2024
2 parents 4fb716f + bca5ec2 commit e551793
Showing 9 changed files with 29 additions and 5 deletions.
4 changes: 0 additions & 4 deletions .github/workflows/docs.yaml
@@ -50,8 +50,6 @@ jobs:
         run: |
           echo "Deploying from ref ${GITHUB_REF#refs/*/}"
           echo "tag=${GITHUB_REF#refs/*/}" >> $GITHUB_OUTPUT
-      - name: bokeh sampledata
-        run: bokeh sampledata
       - name: install dev nbsite
         run: pip install --pre -U nbsite
       - name: conda info
@@ -115,8 +113,6 @@ jobs:
         run: |
           echo "Deploying from ref ${GITHUB_REF#refs/*/}"
           echo "tag=${GITHUB_REF#refs/*/}" >> $GITHUB_OUTPUT
-      - name: bokeh sampledata
-        run: bokeh sampledata
       - name: build docs
         run: sphinx-build -b html doc builtdocs
       - name: report failure
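The `${GITHUB_REF#refs/*/}` expansion in both deploy steps strips the shortest leading `refs/*/` match from the ref name, turning a full ref such as `refs/heads/main` into just `main`. A minimal Python sketch of the same transformation (the tag value below is a hypothetical example, not one from this commit):

```python
import re

def strip_ref_prefix(github_ref: str) -> str:
    """Mimic bash's ${GITHUB_REF#refs/*/}: remove the shortest
    leading match of 'refs/*/' from the ref name."""
    # A non-greedy regex reproduces the "shortest match" semantics
    # of the shell's '#' parameter expansion.
    return re.sub(r"^refs/.*?/", "", github_ref, count=1)

print(strip_ref_prefix("refs/heads/main"))    # branch build -> "main"
print(strip_ref_prefix("refs/tags/v0.10.0"))  # tag build -> "v0.10.0"
```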
2 changes: 2 additions & 0 deletions .github/workflows/test.yaml
@@ -132,6 +132,7 @@ jobs:
       - name: conda list
         run: conda list
       - name: bokeh sampledata
+        if: ${{ matrix.python-version == '3.9'}}
         run: bokeh sampledata
       - name: unit tests
         run: pytest -v hvplot --cov=hvplot --cov-append
@@ -162,6+163,7 @@ jobs:
       - name: pip list
         run: pip list
       - name: bokeh sampledata
+        if: ${{ matrix.python-version == '3.9'}}
         run: bokeh sampledata
       - name: unit tests
         run: pytest -v hvplot --cov=hvplot --cov-append
22 changes: 21 additions & 1 deletion doc/conftest.py
@@ -1,7 +1,11 @@
-import dask
+from importlib.util import find_spec
+
+import dask
+
+from packaging.version import Version
 from bokeh.io.webdriver import webdriver_control
 
 
 collect_ignore_glob = [
     'user_guide/Streaming.ipynb',
 ]
@@ -45,3 +49,19 @@
 # From Dask 2024.3.0 they now use `dask_expr` by default
 # https://github.com/dask/dask/issues/10995
 dask.config.set({'dataframe.query-planning': False})
+
+
+# https://github.com/pydata/xarray/pull/9182
+try:
+    import xarray as xr
+except ImportError:
+    pass
+else:
+    import numpy as np
+
+    if Version(np.__version__) >= Version('2.0.0') and Version(xr.__version__) <= Version(
+        '2024.6.0'
+    ):
+        collect_ignore_glob += [
+            'user_guide/Gridded_Data.ipynb',
+        ]
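The notebook-skipping gate above relies on `packaging.version.Version`, which compares release segments numerically rather than as plain strings. A small illustration of why that matters for checks like the NumPy 2.0 guard:

```python
from packaging.version import Version

# Segment-wise numeric comparison: 2.0.0 is newer than 1.26.4,
# even though "2.0.0" < "1.26.4" as a plain string comparison.
assert Version("2.0.0") > Version("1.26.4")

# Trailing zero segments are insignificant: 2024.6 == 2024.6.0,
# so the '<= 2024.6.0' xarray bound matches either spelling.
assert Version("2024.6") == Version("2024.6.0")
```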
File renamed without changes.
1 change: 1 addition & 0 deletions envs/py3.10-tests.yaml
@@ -15,6 +15,7 @@ channels:
 dependencies:
 - python=3.10
 - bokeh>=3.1
+- bokeh_sampledata
 - cartopy
 - colorcet>=2
 - dask
1 change: 1 addition & 0 deletions envs/py3.11-docs.yaml
@@ -15,6 +15,7 @@ channels:
 dependencies:
 - python=3.11
 - bokeh>=3.1
+- bokeh_sampledata
 - cartopy
 - colorcet>=2
 - dask>=2021.3.0
1 change: 1 addition & 0 deletions envs/py3.11-tests.yaml
@@ -15,6 +15,7 @@ channels:
 dependencies:
 - python=3.11
 - bokeh>=3.1
+- bokeh_sampledata
 - cartopy
 - colorcet>=2
 - dask
1 change: 1 addition & 0 deletions envs/py3.12-tests.yaml
@@ -15,6 +15,7 @@ channels:
 dependencies:
 - python=3.12
 - bokeh>=3.1
+- bokeh_sampledata
 - cartopy
 - colorcet>=2
 - dask
2 changes: 2 additions & 0 deletions pyproject.toml
@@ -65,6 +65,7 @@ tests-core = [
     "ruff",
     "scipy",
     "xarray",
+    "bokeh_sampledata; python_version >= '3.10'",
 ]
 # Optional tests dependencies, i.e. one should be able
 # to run and pass the test suite without installing any
@@ -126,6 +127,7 @@ examples = [
     "xarray >=0.18.2",
     "xyzservices >=2022.9.0",
     "geodatasets >=2023.12.0",
+    "bokeh_sampledata; python_version >= '3.10'",
 ]
 tests-nb = [
     "pytest-xdist",
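The `python_version >= '3.10'` suffix on both new entries is a PEP 508 environment marker, so pip only installs `bokeh_sampledata` on Python 3.10 and later. The `packaging` library can evaluate such markers directly; a quick check against explicit (hypothetical) environments rather than the running interpreter:

```python
from packaging.markers import Marker

marker = Marker("python_version >= '3.10'")

# evaluate() uses the running interpreter's environment by default;
# passing a dict overrides individual marker variables.
assert marker.evaluate({"python_version": "3.12"})
assert not marker.evaluate({"python_version": "3.9"})
```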
