pretrained.py calls the following:
g, lg = Graph.atom_dgl_multigraph(
atoms,
cutoff=float(cutoff),
max_neighbors=max_neighbors,
)
This uses the default value of use_canonize = False, which is not necessarily the value that was used during training. From my testing, changing this value at inference significantly affects the results (tested on jv_formation_energy_peratom_alignn with POSCAR files from the sample_data folder).
For example, running the following: python pretrained.py --model_name jv_formation_energy_peratom_alignn --file_format poscar --file_path .\examples\sample_data\POSCAR-JVASP-107772.vasp
gives 0.003147430717945099 if use_canonize = False (default behaviour).
gives -5.2578747272491455e-05 if use_canonize = True
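For reference, the same comparison can be reproduced directly in Python rather than through the CLI. This is only a minimal sketch: it assumes the model is fetched with alignn.pretrained.get_figshare_model, that cutoff=8.0 / max_neighbors=12 match the defaults used by pretrained.py, and that the model accepts the (g, lg) pair produced by Graph.atom_dgl_multigraph; details may differ between ALIGNN versions.

import torch
from jarvis.core.atoms import Atoms
from jarvis.core.graphs import Graph
from alignn.pretrained import get_figshare_model

model = get_figshare_model("jv_formation_energy_peratom_alignn")
model.eval()

atoms = Atoms.from_poscar("examples/sample_data/POSCAR-JVASP-107772.vasp")

for use_canonize in (False, True):
    # Build the atom graph and line graph with the canonize flag set
    # explicitly instead of relying on the default.
    g, lg = Graph.atom_dgl_multigraph(
        atoms,
        cutoff=8.0,
        max_neighbors=12,
        use_canonize=use_canonize,
    )
    with torch.no_grad():
        out = model([g, lg])
    print(use_canonize, out.detach().cpu().numpy().flatten().tolist())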
This difference is not huge, but it seems to grow considerably when batching multiple files for inference (in my own implementation).
From these results, my guess is that one would want this variable to be stored with the model parameters, rather than passed separately at training/inference time.
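One possible way to do this (purely a hypothetical sketch, not how ALIGNN currently packages its checkpoints) would be to save the graph-construction settings alongside the weights and read them back at inference, so that pretrained.py never has to guess the value:

import torch

# Hypothetical helpers: bundle the graph-construction settings used at
# training time with the weights, so inference can reuse them verbatim.
def save_checkpoint(model, path, *, use_canonize, cutoff, max_neighbors):
    torch.save(
        {
            "state_dict": model.state_dict(),
            "graph_config": {
                "use_canonize": use_canonize,
                "cutoff": cutoff,
                "max_neighbors": max_neighbors,
            },
        },
        path,
    )

def load_checkpoint(model, path):
    ckpt = torch.load(path, map_location="cpu")
    model.load_state_dict(ckpt["state_dict"])
    # Returned so the caller can pass these straight to Graph.atom_dgl_multigraph.
    return model, ckpt["graph_config"]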
Are these results expected? What is the physical/mathematical meaning of this variable?