Tutorials and Technical Briefings at ISEC 2025
Atul Kumar
ISEC 2025
Recent advances in large foundation models have revealed impressive capabilities in learning complex chemical language representations. These models undergo a task-agnostic learning phase: pre-training on extensive unlabeled corpora followed by fine-tuning on specific downstream tasks. This methodology reduces reliance on labeled data, easing data acquisition and broadening the scope of chemical language representation. However, real-world scenarios often pose challenges due to domain shift, where the data distribution of a downstream task differs from that of the pre-training corpus, potentially degrading model performance. To address this, we present a novel causal framework for feature selection and domain adaptation that enhances the performance of chemical foundation models on downstream tasks. Our approach employs a multi-stage feature selection method that identifies physico-chemical features based on their direct causal effect on specific downstream properties. By employing Mordred descriptors and Markov blanket causal graphs, our approach provides insight into the causal relationships between features and target properties in prediction tasks. We evaluate our approach across several foundation model architectures and datasets, demonstrating performance improvements that showcase the robustness and model-agnostic nature of our approach.
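The abstract does not specify the exact algorithm, but the general idea of Markov blanket feature selection it relies on can be illustrated with a minimal sketch. The code below is not the authors' method: it implements a small IAMB-style grow-and-shrink procedure over toy binary data, using empirical conditional mutual information as the dependence test. All names (`iamb`, `cond_mutual_info`), the threshold `eps`, and the synthetic variables are hypothetical choices for illustration only.

```python
import math
import random
from collections import Counter

def cond_mutual_info(data, x, y, blanket):
    """Empirical conditional mutual information I(x; y | blanket), in nats."""
    n = len(data)
    joint = Counter((r[x], r[y], tuple(r[b] for b in blanket)) for r in data)
    xz = Counter((r[x], tuple(r[b] for b in blanket)) for r in data)
    yz = Counter((r[y], tuple(r[b] for b in blanket)) for r in data)
    z = Counter(tuple(r[b] for b in blanket) for r in data)
    cmi = 0.0
    for (xv, yv, zv), c in joint.items():
        p_xyz = c / n
        cmi += p_xyz * math.log(
            p_xyz * (z[zv] / n) / ((xz[(xv, zv)] / n) * (yz[(yv, zv)] / n))
        )
    return cmi

def iamb(data, target, features, eps=0.01):
    """IAMB-style Markov blanket discovery: grow greedily, then shrink."""
    blanket = []
    # Growing phase: repeatedly add the feature most informative about the
    # target given the current blanket, while the gain exceeds eps.
    while True:
        rest = [f for f in features if f not in blanket]
        if not rest:
            break
        best = max(rest, key=lambda f: cond_mutual_info(data, f, target, blanket))
        if cond_mutual_info(data, best, target, blanket) <= eps:
            break
        blanket.append(best)
    # Shrinking phase: drop features that become conditionally independent
    # of the target given the remainder of the blanket.
    for f in list(blanket):
        rest = [b for b in blanket if b != f]
        if cond_mutual_info(data, f, target, rest) <= eps:
            blanket.remove(f)
    return blanket

# Toy data (hypothetical): x1 directly causes y; x2 correlates with y only
# through x1; x3 is independent noise. Only x1 should survive selection.
random.seed(0)
rows = []
for _ in range(3000):
    x1 = random.random() < 0.5
    y = x1 ^ (random.random() < 0.1)   # y = x1 with 10% label noise
    x2 = x1 ^ (random.random() < 0.2)  # spurious correlate of y via x1
    x3 = random.random() < 0.5         # pure noise
    rows.append({"x1": int(x1), "x2": int(x2), "x3": int(x3), "y": int(y)})

print(iamb(rows, "y", ["x1", "x2", "x3"]))
```

In the talk's setting, the feature columns would instead be Mordred-computed molecular descriptors and the target a downstream chemical property; the selection principle, keeping only features with a direct (conditionally non-redundant) effect on the target, is the same.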
Alain Vaucher, Philippe Schwaller, et al.
AMLD EPFL 2022
Conrad Albrecht, Jannik Schneider, et al.
CVPR 2025
Yi Zhou, Parikshit Ram, et al.
ICLR 2023