Fix dataset logic #771
Conversation
…unami enumeration (#764)
* adding new notebook for using fairchem models with NEBs
* adding MD tutorials
* blocking code cells that aren't needed or take too long
This makes sense to me. Thanks for catching this!
A lot of this would be cleaner if we had the configs sanitized and/or forced to follow a convention. We should probably do that at some point; a rough sketch of what that could look like is below.
Blocked by #770.
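As a hedged illustration only: the helper below sketches one possible convention-enforcement pass over the config. The function name, the nested-dict layout, and the dataset keys are hypothetical, not an existing fairchem utility.

```python
# Hypothetical config-sanitization sketch, assuming a nested-dict config with
# "dataset", "val_dataset", and "test_dataset" entries; not fairchem's actual API.

DATASET_KEYS = ("dataset", "val_dataset", "test_dataset")


def sanitize_dataset_configs(config: dict) -> dict:
    """Enforce one convention: a dataset entry, if present, must be a dict
    that defines a non-empty "src"."""
    sanitized = dict(config)  # shallow copy; leave the caller's config alone
    for key in DATASET_KEYS:
        if key not in sanitized:
            continue
        entry = sanitized[key]
        if not isinstance(entry, dict):
            raise TypeError(f"'{key}' must be a mapping, got {type(entry).__name__}")
        if not entry.get("src"):
            # Drop empty placeholders so "key exists" always implies "usable".
            del sanitized[key]
    return sanitized
```

With a pass like this run up front, downstream code could treat "the key is present" and "the split is usable" as the same condition.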
The merge-base changed after approval.
Still LGTM =)
These two lines, edited in #742:
fairchem/src/fairchem/core/trainers/base_trainer.py, line 310 (commit 5743a59)
fairchem/src/fairchem/core/trainers/base_trainer.py, line 332 (commit 5743a59)
would set the val and test datasets to have the same "src" as the training set in cases where they are not defined at all in the config. An explicit check that each split defines its own "src" should fix the issue.
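For concreteness, here is a minimal sketch of the fixed loading logic. The function name, config layout, and error handling are assumptions for illustration, not the actual base_trainer.py code; the point is the explicit per-split "src" check.

```python
# Minimal sketch of the fix described above, assuming a nested-dict config;
# names are illustrative, not the actual base_trainer.py implementation.


def load_datasets(config: dict) -> dict:
    datasets = {}

    # The training split must define its own "src".
    train_cfg = config.get("dataset", {})
    if "src" not in train_cfg:
        raise ValueError("Training dataset config must define 'src'.")
    datasets["train"] = train_cfg

    # Before the fix, a "{split}_dataset" entry without its own "src" could
    # silently inherit the training "src". The explicit check below only
    # loads a split when it defines "src" itself.
    for split in ("val", "test"):
        split_cfg = config.get(f"{split}_dataset")
        if split_cfg and split_cfg.get("src"):
            datasets[split] = split_cfg

    return datasets


# Example: an empty val_dataset entry is skipped rather than aliased to train.
config = {
    "dataset": {"src": "data/train.lmdb"},
    "val_dataset": {},
}
assert load_datasets(config) == {"train": {"src": "data/train.lmdb"}}
```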