[2604.04033] Topological Sensitivity in Connectome-Constrained Neural Networks


Quantitative Biology > Neurons and Cognition
arXiv:2604.04033 (q-bio)
[Submitted on 5 Apr 2026]

Title: Topological Sensitivity in Connectome-Constrained Neural Networks
Authors: Nalin Dhiman

Abstract: Connectome-constrained neural networks are often evaluated against sparse random controls and then interpreted as evidence that biological graph topology improves learning efficiency. We revisit that claim in a controlled flyvis-based study using a Drosophila connectome, a naive self-loop-matched random graph, and a degree-preserving rewired null. Under weak controls, in which both models were recovered from a connectome-trained checkpoint and the null matched only global graph counts, the connectome appeared substantially better in early loss, mean activity, and runtime. That picture changed under stricter controls. Training both graphs from a shared random initialization removed the early loss advantage, and replacing the naive null with a degree-preserving null removed the apparent activity advantage. A five-sample degree-preserving ensemble and a pre-training activity-scale diagnostic further strengthened this revised interpretation. We also report a descriptive mechanism analysis of the earlier weak-control comparison, but we treat it as behavioral characterization rather than proof of causal superiority. We show that previously report...

Originally published on April 07, 2026. Curated by AI News.

Related Articles

[2603.12365] Optimal Experimental Design for Reliable Learning of History-Dependent Constitutive Laws (Machine Learning)
[2603.17573] HeiSD: Hybrid Speculative Decoding for Embodied Vision-Language-Action Models with Kinematic Awareness (Machine Learning)
[2512.20562] Shallow Neural Networks Learn Low-Degree Spherical Polynomials with Feature Learning by Learnable Channel Attention (Machine Learning)
[2603.07475] A Comparative Analysis of Layer-wise Representational Capacity in AR and Diffusion LLMs (Machine Learning)
