Using a Chemical Language Transformer Model for Molecular Property Prediction Regression Tasks and Visualizing Attention Weights: Part 1

Abhik Seal
7 min readApr 10, 2023

I came across ChemBERTa-77M-MTR on Hugging Face; it appears to be pre-trained on 77M molecules. ChemBERTa is a large-scale pre-trained molecular transformer model based on the BERT architecture, designed specifically for tasks in chemistry, drug discovery, or…
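As a minimal sketch of how such a model might be loaded from the Hugging Face hub, the snippet below pulls the checkpoint with the `transformers` library and requests attention weights, which is the starting point for the visualization this series is about. The model id `DeepChem/ChemBERTa-77M-MTR` and the use of `AutoTokenizer`/`AutoModel` are assumptions based on the hub's standard conventions, not code from the article.

```python
# Hedged sketch: load ChemBERTa-77M-MTR and inspect its outputs.
# Assumes the checkpoint "DeepChem/ChemBERTa-77M-MTR" on the Hugging Face hub.
from transformers import AutoModel, AutoTokenizer

model_name = "DeepChem/ChemBERTa-77M-MTR"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# output_attentions=True makes the forward pass return per-layer attention maps
model = AutoModel.from_pretrained(model_name, output_attentions=True)

smiles = "CC(=O)Oc1ccccc1C(=O)O"  # aspirin, as a SMILES string
inputs = tokenizer(smiles, return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state: (batch, sequence_length, hidden_size) token embeddings
print(outputs.last_hidden_state.shape)
# attentions: one (batch, heads, seq, seq) tensor per transformer layer
print(len(outputs.attentions))
```

For a regression task, these token embeddings would typically be pooled and fed to a small prediction head; the attention tensors are what later parts can visualize per SMILES token.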


Data Science / Cheminformatician, ex-AbbVie. I try to make complicated things look easier and more understandable. www.linkedin.com/in/abseal/