
BSM2-P Effluent Metrics w/ Flowsheet Constraints #1503

Draft · wants to merge 60 commits into main
Conversation

MarcusHolly
Contributor

Summary/Motivation:

An extension of #1492 that adds the effluent violation constraints to the flowsheet.

Changes proposed in this PR:

  • Adds effluent violation constraints into BSM2-P

Legal Acknowledgement

By contributing to this software project, I agree to the following terms and conditions for my contribution:

  1. I agree my contributions are submitted under the license terms described in the LICENSE.txt file at the top level of this directory.
  2. I represent I am authorized to make the contributions and grant the license. If my employer has rights to intellectual property that includes these contributions, I represent that I have received permission to make contributions and grant the required license on behalf of that employer.

@MarcusHolly MarcusHolly self-assigned this Oct 2, 2024
@MarcusHolly MarcusHolly changed the title BSM2-P Effluent Metric Constraints BSM2-P Effluent Metrics w/ Flowsheet Constraints Oct 2, 2024
@MarcusHolly MarcusHolly added the Priority:Normal Normal Priority Issue or PR label Oct 3, 2024
@lbianchi-lbl
Contributor

lbianchi-lbl commented Oct 24, 2024

@MarcusHolly To make debugging faster by avoiding running the entire test suite, you can search the .github/workflows/checks.yml file for `pytest --pyargs watertap` (it should appear in 2 places) and replace it with `pytest --pyargs watertap -k test_BSM2_P_extension` (i.e. `-k` followed by a space and the name of the test file you want to run by itself, without the .py suffix). More info: https://docs.pytest.org/en/stable/how-to/usage.html
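A minimal sketch of the suggested edit, assuming the command and test-module name from the comment above. The substitution is demonstrated on a stand-in line rather than the real workflow file, since the exact YAML context in checks.yml is not shown here:

```shell
# pytest's -k flag filters collected tests by a name expression, so adding
# "-k test_BSM2_P_extension" restricts the run to that test module only.
# sed performs the swap that would be applied (in 2 places) in checks.yml.
line='run: pytest --pyargs watertap'
echo "$line" | sed 's/pytest --pyargs watertap/pytest --pyargs watertap -k test_BSM2_P_extension/'
# prints: run: pytest --pyargs watertap -k test_BSM2_P_extension
```

Remember to revert this change before marking the PR ready for review, since it disables the rest of the suite in CI.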

@adam-a-a
Contributor

adam-a-a commented Oct 29, 2024

@lbianchi-lbl I should also note that when I run the tests locally on this PR, the same tests that fail here pass on my machine. If the workflows were truly aligned with local runs, I would expect those tests to fail locally as well. Is it possible that something is out of sync? I wonder whether it has anything to do with the ipopt-watertap solver and whether that is incorporated in the workflows (to start). Although we know this is a troublesome flowsheet, we should still expect some agreement between tests run locally and in the workflows, so I think this warrants further investigation.
