# RT_Transformer_Smiles

A deep neural network that couples a graph attention network (GAT) with a 1D Transformer to predict chromatographic retention time from SMILES.

The SMRT dataset from the METLIN library was used for pretraining.

Reference: (1) Xue J, Wang B, Li W. RT-Transformer: Retention Time Prediction for Metabolite Annotation to Assist in Metabolite Identification. ChemRxiv. Cambridge: Cambridge Open Engage; 2023. This content is a preprint and has not been peer-reviewed.