An AI Agent for cell-type-specific brain-computer interfaces
Decoding how specific neuronal subtypes contribute to brain function requires linking extracellular electrophysiological features to underlying molecular identities, yet reliable in vivo electrophysiological signal classification remains a major challenge for neuroscience and clinical brain-computer interfaces (BCIs). Here, we show that pretrained, general-purpose vision-language models (VLMs) can be repurposed as few-shot learners to classify neuronal cell types directly from electrophysiological features, without task-specific fine-tuning. Validated against optogenetically tagged datasets, this approach enables robust and generalizable subtype inference with minimal supervision. Building on this capability, we developed the BCI AI Agent (BCI-Agent), an autonomous AI framework that integrates vision-based cell-type inference, stable neuron tracking, and automated molecular atlas validation with real-time literature synthesis. BCI-Agent addresses three critical challenges for in vivo electrophysiology: (1) accurate, training-free cell-type classification; (2) automated cross-validation of predictions against molecular atlas references and peer-reviewed literature; and (3) embedding of molecular identities within stable, low-dimensional neural manifolds for dynamic decoding. In rodent motor-learning tasks, BCI-Agent revealed stable, cell-type-specific neural trajectories across time, uncovering previously inaccessible dimensions of neural computation. Additionally, when applied to human Neuropixels recordings, where direct ground-truth labeling is inherently unavailable, BCI-Agent inferred neuronal subtypes and validated them through integration with human single-cell atlases and the literature. By enabling scalable, cell-type-specific inference from in vivo electrophysiology, BCI-Agent provides a new approach for dissecting the contributions of distinct neuronal populations to brain function and dysfunction.
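The abstract does not give implementation details, so the sketch below only illustrates the general idea of few-shot cell-type classification with a pretrained VLM: render mean spike waveforms as images, interleave a handful of labeled support examples with an unlabeled query, and ask the model for a label. The function names, label set, prompt structure, and the placeholder VLM call are assumptions for illustration, not the authors' pipeline.

```python
"""
Illustrative sketch (not the authors' implementation): few-shot cell-type
classification of extracellular waveforms with a generic vision-language model.
The VLM call is left as a placeholder; wire it to whichever multimodal API
you have access to. The candidate labels here are assumptions.
"""
import base64
import io

import matplotlib.pyplot as plt
import numpy as np


def waveform_to_png_bytes(waveform: np.ndarray) -> bytes:
    """Render a mean spike waveform (1-D array of samples) as a small PNG."""
    fig, ax = plt.subplots(figsize=(2, 2), dpi=100)
    ax.plot(waveform, color="black")
    ax.axis("off")
    buf = io.BytesIO()
    fig.savefig(buf, format="png", bbox_inches="tight")
    plt.close(fig)
    return buf.getvalue()


def build_few_shot_prompt(examples: list[tuple[np.ndarray, str]],
                          query: np.ndarray) -> list[dict]:
    """Assemble an interleaved image/text prompt: labeled support set + query."""
    content = [{"type": "text",
                "text": "Classify the final waveform as one of: "
                        "excitatory pyramidal, PV interneuron, SST interneuron."}]
    for wf, label in examples:
        content.append({"type": "image_png_base64",
                        "data": base64.b64encode(waveform_to_png_bytes(wf)).decode()})
        content.append({"type": "text", "text": f"Label: {label}"})
    content.append({"type": "image_png_base64",
                    "data": base64.b64encode(waveform_to_png_bytes(query)).decode()})
    content.append({"type": "text", "text": "Label:"})
    return content


def query_vlm(prompt: list[dict]) -> str:
    """Placeholder: send the interleaved prompt to a multimodal model and
    return its text response (e.g. 'PV interneuron')."""
    raise NotImplementedError("Connect this to a vision-language model API.")
```

In this framing the VLM itself is never fine-tuned; the support examples (for instance, a few optogenetically tagged units per class) are supplied entirely through the prompt, which is what makes the approach few-shot and training-free.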