Upgrade torch to 2.4.0 in pyproject.toml #382
Conversation
- Signed-off-by: Tsz Wai Ko <[email protected]>
- …taining and using element offsets
- …ction including both
- …the update of sympy
- Signed-off-by: Tsz Wai Ko <[email protected]>
- …s and Simulations using the M3GNet Universal Potential.ipynb
- Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com> Signed-off-by: Tsz Wai Ko <[email protected]>
- Signed-off-by: Tsz Wai Ko <[email protected]>
Walkthrough

The changes in this pull request primarily involve updates to the …
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (1)
src/matgl/layers/_core.py (1)
Line range hint 85-89: Typographical error in the docstring

In the docstring for the `__init__` method of `MLP_norm`, the parameter description for `activation` is duplicated. Please remove the redundant word. Apply this diff to fix the typo:

```diff
 Args:
     dims: Dimensions of each layer of MLP.
-    activation: activation: Activation function.
+    activation: Activation function.
     activate_last: Whether to apply activation to last layer.
     use_bias: Whether to use bias.
```
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
📒 Files selected for processing (1)
- src/matgl/layers/_core.py (2 hunks)
🧰 Additional context used
🔇 Additional comments (2)
src/matgl/layers/_core.py (2)
5-5: Importing `cast` is appropriate for type safety

The inclusion of `cast` from the `typing` module is necessary for the explicit type casting used later in the code.
133-133: Ensuring type safety with an explicit `cast`

Using `cast(dict, norm_kwargs)` ensures that `norm_kwargs` is recognized as a dictionary by static type checkers. This is appropriate and helps maintain type safety.
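For context, the pattern being reviewed looks roughly like the sketch below. This is not the actual `MLP_norm` code; the function name, signature, and use of `nn.LayerNorm` are illustrative assumptions, and only `cast(dict, norm_kwargs)` is taken from the review itself.

```python
from __future__ import annotations

from typing import cast

import torch.nn as nn


def build_norm_layer(dim: int, norm_kwargs: dict | None = None) -> nn.Module:
    """Illustrative helper (not from matgl) showing the cast pattern."""
    # norm_kwargs is typed as "dict | None"; after handling the None case,
    # cast(dict, ...) tells static type checkers that it is a plain dict
    # before it is unpacked into the layer constructor.
    kwargs = cast(dict, norm_kwargs) if norm_kwargs is not None else {}
    return nn.LayerNorm(dim, **kwargs)
```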
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (1)
pyproject.toml (1)
53-56: Summary of dependency changes and suggested next steps

This PR makes two key changes to dependencies:

- Downgrades the maximum allowed version of `dgl` from 2.4.0 to 2.2.1
- Upgrades the maximum allowed version of `torch` from 2.2.1 to 2.4.0

While the `torch` upgrade aligns with the PR objectives, the `dgl` downgrade is unexpected. To ensure these changes are appropriate and don't introduce issues:

- Clarify the rationale behind the `dgl` version constraint change.
- Verify compatibility between the new `torch` version and the constrained `dgl` version.
- Run the full test suite to catch any regressions or compatibility issues.
- Update the PR description to explain both changes, not just the `torch` upgrade.
- Consider updating any documentation that might reference these dependency versions.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
📒 Files selected for processing (2)
- pyproject.toml (1 hunks)
- requirements.txt (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- requirements.txt
🧰 Additional context used
🔇 Additional comments (2)
pyproject.toml (2)
56-56: LGTM! Verify compatibility with the existing codebase

The upgrade of the `torch` dependency to version `<=2.4.0` aligns with the PR objectives. This change could potentially introduce new features or performance improvements.

To ensure a smooth transition:
- Verify that all existing torch-dependent code in the project is compatible with torch 2.4.0.
- Update any torch-specific code that might leverage new features or changed APIs in version 2.4.0.
- Run the test suite to catch any potential regressions.
To help identify potential areas that might need attention, you can run:
```bash
#!/bin/bash
# Search for torch imports and usages in the codebase
rg --type py "import torch|from torch" -C 5

# Look for any torch version-specific code or comments
rg --type py "torch.*2\.[0-9]" -C 5
```
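In addition to searching the codebase, a minimal runtime smoke test (a sketch, not part of matgl's test suite) can confirm that the installed torch imports cleanly and runs a basic tensor operation:

```python
# Quick sanity check of the installed torch build; deeper coverage still comes
# from the project's full test suite.
import torch

print(f"torch version: {torch.__version__}")
x = torch.randn(4, 4)
assert torch.allclose(x @ torch.eye(4), x)
print("basic tensor op OK")
```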
53-53: Verify the intention behind downgrading the dgl version constraint

The `dgl` dependency version constraint has been tightened from `<=2.4.0` to `<=2.2.1`. This change restricts the project to an older version of `dgl`, which seems to contradict the PR's objective of upgrading dependencies.

Could you please clarify:

- The reason for downgrading the `dgl` version constraint?
- Whether this change is related to addressing the issues with the unit test for numpy version 2.0.1 mentioned in the PR description?
- Whether this change has been tested for compatibility with other dependencies, especially the upgraded `torch` version?
version?To check for potential compatibility issues, you can run:
Summary

Upgrade torch to 2.4.0 in pyproject.toml to see if the unit test for numpy 2.0.1 can be fixed.

Checklist

- Google format doc strings added. Check with `ruff`.
- Type annotations included. Check with `mypy`.
- If applicable, new classes/functions/modules have `duecredit` `@due.dcite` decorators to reference relevant papers by DOI (example).

Tip: Install `pre-commit` hooks to auto-check types and linting before every commit.