Katja-Sophia Csizi, Emanuel Lörtscher
Frontiers in Neuroscience
Combining chemical sensor arrays with machine learning enables the design of intelligent systems that perform complex sensing tasks and unveil properties not directly accessible through conventional analytical chemistry. However, personalized and portable sensor systems are typically unsuitable for generating extensive data sets, which limits the ability to train large models in the chemical sensing realm. Foundation models have demonstrated unprecedented zero-shot learning capabilities across data structures and modalities, in particular for language and vision. Transfer learning from such models is explored by providing a framework to create effective data representations for chemical sensors, ultimately describing a novel, generalizable approach to AI-assisted chemical sensing. The translation of signals produced by remarkably simple and portable multi-sensor systems into visual fingerprints of liquid samples under test is demonstrated, and it is illustrated how a pipeline incorporating pretrained vision models yields (Formula presented.) average classification accuracy in four unrelated chemical sensing tasks with limited domain-specific training measurements. This approach matches or outperforms expert-curated sensor signal features, thereby generalizing data processing for ultimate ease-of-use and broad applicability, enabling interpretation of multi-signal outputs in generic sensing applications.
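The pipeline described in the abstract — multi-sensor signals rendered as visual fingerprints, then embedded and classified with few labeled samples — can be sketched minimally. This is an illustrative sketch only, not the paper's implementation: the fingerprint layout, the block-mean feature extractor (a stand-in for a frozen pretrained vision backbone), and the nearest-centroid classifier are all assumptions introduced here.

```python
import numpy as np

def sensor_to_fingerprint(signals, size=32):
    # signals: (n_sensors, n_samples) raw sensor readings.
    # Normalize each channel to [0, 1], resample to a fixed width,
    # and tile the channels vertically into a square "image".
    sig = np.asarray(signals, dtype=float)
    lo = sig.min(axis=1, keepdims=True)
    rng = np.ptp(sig, axis=1, keepdims=True)
    norm = (sig - lo) / np.where(rng == 0, 1.0, rng)
    xs = np.linspace(0, sig.shape[1] - 1, size)
    rows = np.array([np.interp(xs, np.arange(sig.shape[1]), ch) for ch in norm])
    reps = size // rows.shape[0] + 1
    return np.repeat(rows, reps, axis=0)[:size]  # shape (size, size)

def extract_features(img):
    # Placeholder embedding: coarse 4x4 block means of the fingerprint.
    # In the paper's pipeline a pretrained vision model would sit here.
    h, w = img.shape
    blocks = img.reshape(4, h // 4, 4, w // 4)
    return blocks.mean(axis=(1, 3)).ravel()

def classify(train_feats, train_labels, feat):
    # Nearest-centroid classification in embedding space, suitable
    # for the few-shot regime the abstract describes.
    labels = sorted(set(train_labels))
    centroids = {
        lab: np.mean([f for f, y in zip(train_feats, train_labels) if y == lab], axis=0)
        for lab in labels
    }
    return min(labels, key=lambda lab: np.linalg.norm(feat - centroids[lab]))
```

Swapping the block-mean embedding for features from an actual frozen vision backbone is the transfer-learning step the abstract refers to; the fingerprint and classifier stages would be unchanged.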
Stephen Obonyo, Isaiah Onando Mulang’, et al.
NeurIPS 2023
Romeo Kienzler, Johannes Schmude, et al.
Big Data 2023
Geisa Lima, Matheus Esteves Ferreira, et al.
Embraer 2024