
pymatgen>=2024.9.10 updated after bugfix and conflict resolve #188

Merged (8 commits) into main on Sep 18, 2024

Conversation

@abhardwaj73 (Contributor) commented Sep 11, 2024

Summary

Major changes:

  • fix 1: bump pymatgen version to 2024.9.10 after bugfix
  • fix 2: drop support for python 3.9 following numpy policy

Issue #158

@abhardwaj73 (Contributor Author) commented Sep 11, 2024

Looks like it can't find the newest pymatgen version 2024.9.10

ERROR: Could not find a version that satisfies the requirement pymatgen>=2024.9.10 (from versions: 1.0.4, 1.0.5, 1.1.0, 1.1.1, 1.1.2, 1.2.0, 1.2.1, 1.2.2, 1.2.3, 1.2.4, 1.2.8, 1.2.9, 1.5.0, 1.6.0, 1.7.0, 1.7.2, 1.8.0, 1.8.2, 1.8.3, 1.9.0, 2.0.0, 2.1.0, 2.1.2, 2.2.0, 2.2.1, 2.2.2, 2.2.3, 2.2.4, 2.2.6, 2.3.0, 2.3.1, 2.3.2, 2.4.0, 2.4.1, 2.4.2, 2.4.3, 2.5.0, 2.5.1, 2.5.2, 2.5.3, 2.5.4, 2.5.5, 2.6.1, 2.6.2, 2.6.3, 2.6.4, 2.6.5, 2.6.6, 2.7.0, 2.7.1, 2.7.2b0, 2.7.3, 2.7.4, 2.7.5, 2.7.6, 2.7.7, 2.7.8, 2.7.9, 2.8.0, 2.8.1, 2.8.2, 2.8.3, 2.8.5, 2.8.6, 2.8.7, 2.8.8, 2.8.9, 2.8.10, 2.9.0, 2.9.1, 2.9.2, 2.9.3, 2.9.4, 2.9.5, 2.9.6, 2.9.7, 2.9.8, 2.9.9, 2.9.10, 2.9.11, 2.9.12, 2.9.13, 2.9.14, 2.10.0, 2.10.1, 2.10.2, 2.10.3, 2.10.4, 2.10.5, 2.10.6, 3.0.0, 3.0.1, 3.0.2, 3.0.3, 3.0.4, 3.0.5, 3.0.6, 3.0.7, 3.0.8, 3.0.9, 3.0.10, 3.0.11, 3.0.13, 3.1.0, 3.1.1, 3.1.2, 3.1.3, 3.1.4, 3.1.5, 3.1.6, 3.1.7, 3.1.8, 3.1.9, 3.2.0, 3.2.1, 3.2.2, 3.2.3, 3.2.4, 3.2.5, 3.2.6, 3.2.7, 3.2.8, 3.2.9, 3.2.10, 3.3.0, 3.3.1, 3.3.2, 3.3.3, 3.3.4, 3.3.5, 3.3.6, 3.4.0, 3.5.0, 3.5.1, 3.5.2, 3.5.3, 3.6.0, 3.6.1, 3.7.0, 3.7.1, 4.0.0, 4.0.1, 4.0.2, 4.1.0, 4.1.1, 4.2.0, 4.2.1, 4.2.2, 4.2.3, 4.2.4, 4.2.5, 4.3.0, 4.3.1, 4.3.2, 4.4.0, 4.4.1, 4.4.2, 4.4.3, 4.4.4, 4.4.5, 4.4.6, 4.4.7, 4.4.8, 4.4.9, 4.4.10, 4.4.11, 4.4.12, 4.5.0, 4.5.1, 4.5.2, 4.5.3, 4.5.4, 4.5.5, 4.5.6, 4.5.7, 4.6.0, 4.6.1, 4.6.2, 4.7.0, 4.7.1, 4.7.2, 4.7.3, 4.7.4, 4.7.5, 4.7.6, 4.7.7, 2017.6.8, 2017.6.22, 2017.6.24, 2017.7.4, 2017.7.21, 2017.8.4, 2017.8.14, 2017.8.16, 2017.8.20, 2017.8.21, 2017.9.1, 2017.9.3, 2017.9.23, 2017.10.16, 2017.11.6, 2017.11.9, 2017.11.27, 2017.11.30, 2017.12.6, 2017.12.8, 2017.12.15, 2017.12.16, 2017.12.30, 2018.1.19, 2018.1.29, 2018.2.13, 2018.3.2, 2018.3.13, 2018.3.14, 2018.3.23, 2018.4.6, 2018.4.20, 2018.5.3, 2018.5.14, 2018.5.21, 2018.5.22, 2018.6.11, 2018.6.27, 2018.7.15, 2018.7.23, 2018.8.7, 2018.8.10, 2018.9.1, 2018.9.12, 2018.9.19, 2018.9.30, 2018.10.18, 2018.11.6, 2018.11.30, 2018.12.12, 2019.1.13, 2019.1.24, 
2019.2.4, 2019.2.24, 2019.2.28, 2019.3.13, 2019.3.27, 2019.4.11, 2019.5.1, 2019.5.8, 2019.5.28, 2019.6.5, 2019.6.20, 2019.7.2, 2019.7.21, 2019.7.30, 2019.8.14, 2019.8.23, 2019.9.7, 2019.9.8, 2019.9.12, 2019.9.16, 2019.10.2, 2019.10.3, 2019.10.4, 2019.10.16, 2019.11.11, 2019.12.3, 2019.12.22, 2020.1.10, 2020.1.28, 2020.3.2, 2020.3.13, 2020.4.2, 2020.4.29, 2020.6.8, 2020.7.3, 2020.7.10, 2020.7.14, 2020.7.16, 2020.7.18, 2020.8.3, 2020.8.13, 2020.9.14, 2020.10.9, 2020.10.9.1, 2020.10.20, 2020.11.11, 2020.12.3, 2020.12.18, 2020.12.31, 2021.2.8, 2021.2.8.1, 2021.2.13, 2021.2.14, 2021.2.16, 2021.3.3, 2021.3.5, 2021.3.9, 2022.0.3, 2022.0.4, 2022.0.5, 2022.0.6, 2022.0.7, 2022.0.8, 2022.0.9, 2022.0.10, 2022.0.11, 2022.0.12, 2022.0.13, 2022.0.14, 2022.0.15, 2022.0.16, 2022.0.17, 2022.1.8, 2022.1.9, 2022.1.20, 2022.1.24, 2022.2.1, 2022.2.7, 2022.2.10, 2022.3.7, 2022.3.22, 2022.3.24, 2022.3.29, 2022.4.19, 2022.4.26, 2022.5.17, 2022.5.18, 2022.5.18.1, 2022.5.19, 2022.5.26, 2022.7.8, 2022.7.19, 2022.7.24.1, 2022.7.25, 2022.8.23, 2022.9.8, 2022.9.21, 2022.10.22, 2022.11.1, 2022.11.7, 2023.1.9, 2023.1.20, 2023.1.30, 2023.2.22, 2023.2.28, 2023.3.10, 2023.3.23, 2023.5.8, 2023.5.10, 2023.5.31, 2023.6.23, 2023.7.11, 2023.7.14, 2023.7.17, 2023.7.20, 2023.8.10, 2023.9.2, 2023.9.10, 2023.9.25, 2023.10.3, 2023.10.4, 2023.10.11, 2023.11.10, 2023.11.12, 2023.12.18, 2024.1.26, 2024.1.27, 2024.2.8, 2024.2.20, 2024.2.23, 2024.3.1, 2024.4.12, 2024.4.13, 2024.5.1, 2024.6.4, 2024.6.10, 2024.7.18, 2024.8.9)

@rkingsbury (Member)

Ah, I guess that's because pymatgen just dropped support for python 3.9 (I had missed that in the CHANGELOG). So the python 3.9 unit test can't install that version.

Per the README file, we try to follow the numpy policy in terms of python version support (and so does pymatgen), so since they've dropped 3.9, we can too.

I know this expands the scope of the PR somewhat, but if you're up for it, feel free to take a stab at dropping python 3.9. You would need to update the requirement in pyproject.toml, a few places in the documentation (just search for "3.9"), and the GitHub Actions workflows.

The other option is to add a conditional requirement in pyproject.toml, where we keep the existing pymatgen version for python==3.9, and bump it to the latest version for python>3.9.
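For reference, both routes can be expressed in pyproject.toml; the conditional route uses PEP 508 environment markers. This is a sketch of the two options, not the exact diff that was merged (the version numbers are the ones discussed above):

```toml
[project]
# Option 1: drop python 3.9 outright and bump pymatgen unconditionally
requires-python = ">=3.10"
dependencies = [
    "pymatgen>=2024.9.10",
]

# Option 2 (alternative): keep 3.9 support via conditional requirements
# dependencies = [
#     "pymatgen==2024.5.1; python_version < '3.10'",
#     "pymatgen>=2024.9.10; python_version >= '3.10'",
# ]
```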


codecov bot commented Sep 11, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 81.96%. Comparing base (c2d5726) to head (f322273).
Report is 11 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #188      +/-   ##
==========================================
+ Coverage   75.84%   81.96%   +6.12%     
==========================================
  Files           9        9              
  Lines        1486     1486              
  Branches      322      322              
==========================================
+ Hits         1127     1218      +91     
+ Misses        315      224      -91     
  Partials       44       44              


@rkingsbury (Member) left a comment

Nice work @abhardwaj73 , thanks for this. In addition to the one comment above, it looks like the ubuntu tests started to fail because of a floating point precision error:

FAILED tests/test_phreeqc.py::test_equilibrate - AssertionError: assert False
 +  where False = <function isclose at 0x7f623e12a070>(0.0010000227561117677, 0.001)
 +    where <function isclose at 0x7f623e12a070> = np.isclose
 +    and   0.0010000227561117677 = <Quantity(0.0010000227561117677, 'mole')>.magnitude
 +      where <Quantity(0.0010000227561117677, 'mole')> = get_total_amount('C(4)', 'mol')
 +        where get_total_amount = <pyEQL.solution.Solution object at 0x7f6217847b50>.get_total_amount
================== 1 failed, 130 passed, 3 xfailed in 55.68s ===================

As you can see, the test uses np.isclose to compare the actual and expected output, and this one happens to use the default tolerance of 1e-08. The actual and expected outputs differ by just barely more than that, so it fails.

I've manually re-run that test to make sure this wasn't just a quirk of a single worker (computer), but I'm also fine with relaxing this tolerance to atol=1e-7 or atol=1e-6, because that is still much smaller than the concentration we're testing.
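To illustrate the tolerance arithmetic (a standalone sketch, not code from the PR): np.isclose passes when |actual - expected| <= atol + rtol*|expected|, so with the defaults (rtol=1e-5, atol=1e-8) the effective bound at 0.001 mol is 2e-8, just under the observed ~2.28e-8 discrepancy:

```python
import numpy as np

# Values taken from the failing test log above
actual, expected = 0.0010000227561117677, 0.001

# Defaults (atol=1e-8, rtol=1e-5) give a bound of 1e-8 + 1e-5 * 0.001 = 2e-8;
# the difference (~2.28e-8) just exceeds it, so the comparison fails.
print(np.isclose(actual, expected))

# Relaxing atol to 1e-7 (still far below the 1e-3 mol being tested) passes.
print(np.isclose(actual, expected, atol=1e-7))
```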

Review comment on .github/workflows/testing.yaml (outdated; resolved)
@rkingsbury (Member) commented:
It's not clear to me exactly why this started now, but it must be related to the fact that we're now using python 3.10 instead of python 3.9 for the ubuntu tests.

Review comment on tests/test_phreeqc.py (outdated; resolved)
@abhardwaj73 (Contributor Author) commented Sep 17, 2024

Adding atol=1e-7 to the affected C(4) assertion got the ubuntu test passing. All other checks passed except the codecov/project check (Project coverage is 81.96% (-1.22% from the last commit)).

Currently, pyEQL pins pymatgen==2024.5.1 in pyproject.toml, so anyone installing the latest pymatgen alongside pyEQL hits a dependency conflict. This pull request should resolve that, since I've relaxed the requirement to pymatgen>=2024.9.10.

@rkingsbury rkingsbury added dependencies Pull requests that update a dependency file github_actions Pull requests that update GitHub Actions code labels Sep 18, 2024
@rkingsbury rkingsbury merged commit 95dcffd into KingsburyLab:main Sep 18, 2024
12 of 13 checks passed
@rkingsbury (Member)

Thanks @abhardwaj73 !

Closes #158
