We release an evaluation report on the capabilities of o1 in scientific domains.
We propose reframing single-step retrosynthesis prediction as a molecular string editing task, iteratively refining target molecule strings to generate precursor compounds.
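The string-editing framing above can be illustrated with a toy sketch. The edit representation, helper name, and the ester example below are our own illustrative assumptions, not the paper's actual model or data format.

```python
# Toy sketch: single-step retrosynthesis as string editing on a SMILES string.
# An "edit" here is assumed to be a (start, end, replacement) span rewrite;
# the real model would predict such edits, which are then applied iteratively.

def apply_edits(smiles: str, edits):
    """Apply (start, end, replacement) edits to a SMILES string.

    Edits are applied right-to-left so earlier character offsets
    remain valid after each replacement.
    """
    for start, end, repl in sorted(edits, reverse=True):
        smiles = smiles[:start] + repl + smiles[end:]
    return smiles

# Hypothetical example: edit methyl acetate ("CC(=O)OC") toward a precursor
# by rewriting the ester linkage, yielding acetic acid ("CC(=O)O").
target = "CC(=O)OC"
precursor = apply_edits(target, [(6, 8, "O")])
print(precursor)  # -> CC(=O)O
```

In this framing, generating precursors reduces to predicting a small set of span edits on the target string rather than decoding an entire precursor string from scratch.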
We release a survey paper on scientific large language models, covering textual, molecular, protein, and genomic languages, as well as their combinations.
We propose a method for knowledge graph-enhanced molecular contrastive learning with functional prompts (KANO).
We propose a new molecular representation learning framework that exhibits invariance and robustness against distribution shifts.
Biology
Chemistry
Physics
Materials
Medicine