FM-v4 branch into main #752

Merged
merged 101 commits into main from fm-v4 on Aug 21, 2024
Conversation

@misko misko (Collaborator) commented Jul 9, 2024

Merging a long-running branch called fm-v4, created to support quick changes to main for FM work.

@misko misko requested review from rayg1234 and wood-b August 20, 2024 17:06
@misko misko added the enhancement (New feature or request) and minor (Minor version release) labels Aug 20, 2024

codecov bot commented Aug 20, 2024

Codecov Report

Attention: Patch coverage is 86.20690% with 8 lines in your changes missing coverage. Please review.

Files with missing lines                      Patch %    Lines missing
src/fairchem/core/common/utils.py             85.71%     5 Missing ⚠️
src/fairchem/core/datasets/_utils.py          88.88%     1 Missing ⚠️
src/fairchem/core/modules/loss.py              0.00%     1 Missing ⚠️
src/fairchem/core/trainers/base_trainer.py    90.00%     1 Missing ⚠️

Files                                         Coverage Δ
src/fairchem/core/modules/transforms.py       59.37% <100.00%> (+4.20%) ⬆️
src/fairchem/core/datasets/_utils.py          88.88% <88.88%> (-2.78%) ⬇️
src/fairchem/core/modules/loss.py             66.07% <0.00%> (ø)
src/fairchem/core/trainers/base_trainer.py    89.22% <90.00%> (-0.11%) ⬇️
src/fairchem/core/common/utils.py             68.79% <85.71%> (+0.98%) ⬆️

@@ -393,7 +402,11 @@ def create_dict_from_args(args: list, sep: str = "."):
return return_dict


-def load_config(path: str, previous_includes: list | None = None):
+def load_config(
+    path: str, previous_includes: list | None = None, include_paths: list | None = None
Collaborator:

maybe comment on the diff between paths, previous_includes and include_paths?

Collaborator:

also optional - add a test on a yml with include paths if one doesn't exist; can also make a task for BE :)
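For readers outside the thread, a minimal sketch of how the new `include_paths` argument could be distinguished from `path` and `previous_includes`, assuming a YAML "includes" mechanism like the existing `load_config`; the names and behavior below are illustrative, not the merged implementation:

```python
from __future__ import annotations

from pathlib import Path

import yaml


def load_config_sketch(
    path: str,
    previous_includes: list | None = None,  # configs already loaded; used to detect include cycles
    include_paths: list | None = None,      # extra directories searched when resolving relative includes
) -> dict:
    previous_includes = previous_includes or []
    include_paths = include_paths or []
    if path in previous_includes:
        raise ValueError(f"Cyclic include detected: {path}")

    with open(path) as f:
        config = yaml.safe_load(f) or {}

    for include in config.pop("includes", []):
        # try the include name as given first, then under each extra include path
        candidates = [Path(include)] + [Path(d) / include for d in include_paths]
        resolved = next((c for c in candidates if c.exists()), None)
        if resolved is None:
            raise FileNotFoundError(f"Include not found: {include}")
        included = load_config_sketch(str(resolved), [*previous_includes, path], include_paths)
        # included files provide defaults; the including file wins on conflicts
        config = {**included, **config}
    return config
```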

@@ -999,34 +1024,70 @@ class _TrainingContext:
task_name = "s2ef"
elif trainer_name in ["energy", "equiformerv2_energy"]:
task_name = "is2re"
elif "multitask" in trainer_name.lower():
@rayg1234 rayg1234 (Collaborator) commented Aug 20, 2024:

can we get rid of this .lower() and make it exact? It makes it ambiguous how to specify the trainer name.
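A hedged sketch of the reviewer's suggestion: replace the case-insensitive substring check with an exact lookup, so the accepted trainer names are unambiguous. The mapping entries are illustrative (only the "energy"/"equiformerv2_energy" and s2ef/is2re names appear in the diff above) and may not match the repository's full set of trainer names:

```python
# Illustrative mapping from trainer name to task name; exact keys only,
# so unexpected casing or misspellings fail loudly instead of matching silently.
TRAINER_TO_TASK = {
    "forces": "s2ef",
    "equiformerv2_forces": "s2ef",
    "energy": "is2re",
    "equiformerv2_energy": "is2re",
    "multitask": "multitask",
}


def task_name_for(trainer_name: str) -> str:
    try:
        return TRAINER_TO_TASK[trainer_name]
    except KeyError:
        raise ValueError(f"Unknown trainer name: {trainer_name!r}") from None
```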

raise RuntimeError(
f"Required key missing from config: {missing_keys!s}"
)
trainer = trainer_cls(
Collaborator:

can we move these into a dict and pass them in via trainer_cls(**dict), adding the arguments that differ manually for multitask? Then we can specify the common arguments between the trainer and the multitask trainer once, without copying.
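A minimal sketch of the pattern being suggested, with hypothetical argument names and config keys (the real trainer signatures may differ): build the shared keyword arguments once, add the multitask-only ones conditionally, and expand the dict into the constructor.

```python
def build_trainer(trainer_cls, config: dict, is_multitask: bool):
    # arguments common to every trainer, specified exactly once
    kwargs = {
        "task": config["task"],
        "model": config["model"],
        "optimizer": config["optim"],
        "identifier": config["cmd"]["identifier"],
        "seed": config["cmd"].get("seed", 0),
    }
    if is_multitask:
        # arguments only the multitask trainer understands (hypothetical key)
        kwargs["tasks"] = config["tasks"]
    return trainer_cls(**kwargs)
```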

@@ -13,7 +13,9 @@
from torch_geometric.data import Data


-def rename_data_object_keys(data_object: Data, key_mapping: dict[str, str]) -> Data:
+def rename_data_object_keys(
Collaborator:

comment on what this does
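A hedged sketch of the docstring being asked for; the behavior shown (renaming attributes on a torch_geometric Data object in place) is inferred from the function name and signature, not copied from the merged code, and the example mapping is hypothetical:

```python
from torch_geometric.data import Data


def rename_data_object_keys(data_object: Data, key_mapping: dict[str, str]) -> Data:
    """Rename attributes on a Data object according to key_mapping.

    Args:
        data_object: the torch_geometric Data instance to update in place.
        key_mapping: maps existing attribute names to the names they should
            have after the transform (e.g. {"y_relaxed": "energy"}).

    Returns:
        The same Data object with the listed attributes renamed.
    """
    for old_key, new_key in key_mapping.items():
        if old_key in data_object:
            data_object[new_key] = data_object[old_key]
            del data_object[old_key]
    return data_object
```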

@@ -571,6 +575,9 @@ def load_checkpoint(
self.step = checkpoint.get("step", 0)
self.best_val_metric = checkpoint.get("best_val_metric", None)
self.primary_metric = checkpoint.get("primary_metric", None)
self.config["cmd"]["parent"] = checkpoint["config"]["cmd"].get(
Collaborator:

where is this parent actually used?

Collaborator (PR author):

I don't think it's read anywhere; I believe it was intended to link fine-tuning runs to their parent runs. @mshuaibii is this right?

Collaborator:

let's remove it if it's not used

Collaborator:

Yeah, it was introduced to organize things in wandb by this entry. But you guys probably have a different solution at this point, so omitting it is fine.

from contextlib import suppress

with suppress(ImportError):
from fairchem.experimental.foundation_models.multi_task_dataloader.transforms.data_object import * # noqa
Collaborator:

comment
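One possible version of the comment the reviewer is asking for; the import line comes from the diff above, while the explanation is an assumption about why it is guarded:

```python
from contextlib import suppress

# The foundation-model transforms live in an optional, experimental package.
# If fairchem.experimental is not installed, skip registering these
# data-object transforms instead of failing at import time.
with suppress(ImportError):
    from fairchem.experimental.foundation_models.multi_task_dataloader.transforms.data_object import *  # noqa
```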

rayg1234 previously approved these changes Aug 20, 2024

@rayg1234 rayg1234 left a comment:

[image]

@@ -232,6 +234,8 @@ def load(self) -> None:
self.load_loss()
self.load_optimizer()
self.load_extras()
if self.config["optim"].get("load_datasets_and_model_then_exit", False):
Collaborator:

why is this needed if this is the last line of the init anyways?
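For context, a hedged sketch of what the flag appears to enable based on the diff: a dry run that builds the datasets and model and then stops before training. The surrounding class is illustrative, not the real base trainer.

```python
import sys


class TrainerSketch:
    def __init__(self, config: dict):
        self.config = config
        self.load()

    def load(self) -> None:
        # ... load datasets, task, model, loss, optimizer, extras ...
        if self.config["optim"].get("load_datasets_and_model_then_exit", False):
            # Lets a user validate a config (datasets readable, model constructible),
            # e.g. in CI or on a login node, without launching a training run.
            sys.exit(0)
```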

@misko misko enabled auto-merge August 21, 2024 00:59
@misko misko added this pull request to the merge queue Aug 21, 2024
Merged via the queue into main with commit 3899aac Aug 21, 2024
9 checks passed
@misko misko deleted the fm-v4 branch August 21, 2024 03:59