Restore ability to run pytest on individual files or directories #1362

Merged: 7 commits (Mar 14, 2024)
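
This PR moves the hard-coded --pyargs idaes target (and the -W ignore filter) out of the addopts line in pytest.ini and into the explicit CI invocations, so a positional file or directory argument on the command line no longer has the entire installed idaes package collected alongside it. A rough usage sketch of the intended behavior after this change; the test paths below are illustrative placeholders, not paths taken from this PR:

    # CI still exercises the installed package explicitly:
    pytest --pyargs idaes -m "not integration"

    # Developers can once again target a single file or directory:
    pytest idaes/core/tests/test_example.py            # hypothetical path
    pytest idaes/models/unit_models/tests -m unit      # hypothetical path
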
.github/workflows/core.yml (2 changes: 1 addition & 1 deletion)
@@ -119,7 +119,7 @@ jobs:
           echo PYTEST_ADDOPTS="$PYTEST_ADDOPTS --cov --cov-report=xml" >> "$GITHUB_ENV"
       - name: Run pytest (not integration)
         run: |
-          pytest -m "not integration"
+          pytest --pyargs idaes -m "not integration"
       - name: Upload coverage report as GHA workflow artifact
         if: matrix.cov-report
         uses: actions/upload-artifact@v4

.github/workflows/integration.yml (2 changes: 1 addition & 1 deletion)
@@ -154,7 +154,7 @@
         with:
           install-target: -r requirements-dev.txt
       - name: Run pytest (integration)
-        run: pytest -m integration
+        run: pytest --pyargs idaes -m integration

   examples:
     name: Run examples (py${{ matrix.python-version }}/${{ matrix.os }})

idaes/conftest.py (65 changes: 41 additions & 24 deletions)
@@ -41,6 +41,47 @@
####


+def pytest_addoption(parser):
+    parser.addoption(
+        "--performance",
+        action="store_true",
+        dest="performance",
+        default=False,
+        help="enable performance decorated tests",
+    )
+
+
+MARKERS = {
+    "build": "test of model build methods",
+    "cubic_root": "test requires the compiled cubic root finder",
+    "iapws": "test requires the compiled IAPWS95 property package",
+    "initialization": "test of initialization methods. These generally require a solver as well",
+    "solver": "test requires a solver",
+    "ui": "tests of an aspect of the ui",
+    "unit": "quick tests that do not require a solver, must run in <2s",
+    "component": "quick tests that may require a solver",
+    "integration": "long duration tests",
+    "performance": "tests for the IDAES performance testing suite",
+}
+
+
+def pytest_configure(config: pytest.Config):
+    for name, description in MARKERS.items():
+        config.addinivalue_line("markers", f"{name}: {description}")
+
+    if not config.option.performance:
+        if len(config.option.markexpr) > 0:
+            setattr(
+                config.option,
+                "markexpr",
+                f"{config.option.markexpr} and not performance",
+            )
+        else:
+            setattr(config.option, "markexpr", "not performance")
+    else:
+        setattr(config.option, "markexpr", "performance")



REQUIRED_MARKERS = {"unit", "component", "integration", "performance"}
ALL_PLATFORMS = {"darwin", "linux", "win32"}

@@ -115,30 +156,6 @@
     pytest.fail(msg)


-def pytest_addoption(parser):
-    parser.addoption(
-        "--performance",
-        action="store_true",
-        dest="performance",
-        default=False,
-        help="enable performance decorated tests",
-    )
-
-
-def pytest_configure(config):
-    if not config.option.performance:
-        if len(config.option.markexpr) > 0:
-            setattr(
-                config.option,
-                "markexpr",
-                f"{config.option.markexpr} and not performance",
-            )
-        else:
-            setattr(config.option, "markexpr", "not performance")
-    else:
-        setattr(config.option, "markexpr", "performance")
-

ModuleName = str


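The relocated pytest_configure hook now does two things: it registers the IDAES markers programmatically via config.addinivalue_line, and it rewrites the marker expression so that performance tests only run when --performance is passed. A sketch of the resulting command-line behavior, inferred from the diff above rather than stated in the PR:

    pytest --markers          # the markers (unit, component, integration, ...) are still listed
    pytest                    # markexpr becomes "not performance"
    pytest -m unit            # markexpr becomes "unit and not performance"
    pytest --performance      # markexpr becomes "performance", so only performance tests run
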
pytest.ini (18 changes: 3 additions & 15 deletions)
@@ -1,19 +1,7 @@
 [pytest]
-addopts = --pyargs idaes
-    --durations=100
-    -W ignore
+addopts = --durations=100
+    --durations-min=2
 log_file = pytest.log
 log_file_date_format = %Y-%m-%dT%H:%M:%S
 log_file_format = %(asctime)s %(levelname)-7s <%(filename)s:%(lineno)d> %(message)s
-log_file_level = INFO
-markers =
-    build: test of model build methods
-    cubic_root : test requires the compiled cubic root finder
-    iapws: test requires the compiled IAPWS95 property package
-    initialization: test of initialization methods. These generally require a solver as well
-    solver: test requires a solver
-    ui: tests of an aspect of the ui
-    unit: quick tests that do not require a solver, must run in <2s
-    component: quick tests that may require a solver
-    integration: long duration tests
-    performance: tests for the IDAES performance testing suite
+log_file_level = INFO
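
With addopts trimmed down, pytest.ini no longer injects --pyargs idaes or -W ignore into every run, and the marker list now lives only in idaes/conftest.py. Warning filters and test targets are therefore chosen per invocation; for instance (an illustrative command, not part of this PR), a developer who wants deprecation warnings surfaced while running one directory could use:

    pytest idaes/core/tests -W error::DeprecationWarning -m unit    # hypothetical path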