Exploring neural manifolds across a wide range of intrinsic dimensions


by Jacopo Fadanni, Rosalba Pacelli, Alberto Zucchetta, Pietro Rotondo, Michele Allegra

The rapid surge in the number of simultaneously recorded neurons demands reliable tools to explore the latent geometry of high-dimensional neural spaces. Within such spaces, neuronal activity typically lies on a subspace or manifold characterized by an intrinsic dimension (ID) that is much lower than the total number of recorded units. The ID can provide immediate information about the neural code, such as the minimum number of encoded variables and the relation between collective and individual neural activity. Existing studies rely on disparate and potentially unreliable ID estimators, which can contribute to conflicting reports of high-dimensional vs. low-dimensional manifolds. Here, we propose a robust and versatile pipeline for ID estimation, exploiting a local version of the full correlation integral estimator (lFCI). By simultaneously coping with high dimensionality and non-linearity, lFCI overcomes some major limitations of common ID estimation methods. We demonstrate the robustness and accuracy of lFCI by applying it to synthetic benchmark data from Altan et al., 2019, where other methods typically underestimate the ID. We apply lFCI to study neural manifolds arising in recurrent neural networks trained on the 20 tasks of the well-known ‘cog-Task’ battery. Across tasks and training repetitions, lFCI uncovers a consistently low ID, which we show to be fundamentally related to the task structure. Finally, we apply lFCI to a reference experimental dataset from Stringer et al., 2019, comprising visual responses to a large set of natural images, strongly supporting previous reports that responses are organized in a high-dimensional manifold. lFCI has the potential to shed light on the current debate about the geometry of neural codes, and its dependence on structural constraints and computational goals in biological and artificial neural networks.
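To make the correlation-integral idea behind lFCI concrete, the sketch below implements the classical Grassberger–Procaccia correlation-dimension estimate, which the full correlation integral approach builds on. This is an illustrative toy, not the authors' lFCI pipeline: the function name, the choice of radii, and the toy dataset (a 2-D plane embedded in a 10-D ambient space) are all assumptions made for the example.

```python
import numpy as np

def correlation_dimension(points, radii):
    """Grassberger-Procaccia correlation-dimension estimate (illustrative only).

    C(r) is the fraction of point pairs closer than r; on a d-dimensional
    manifold C(r) ~ r^d at small r, so the slope of log C(r) vs. log r
    estimates the intrinsic dimension d.
    """
    sq = (points ** 2).sum(axis=1)
    # pairwise squared distances via the Gram matrix (clipped at 0 for safety)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * points @ points.T, 0.0)
    dists = np.sqrt(d2[np.triu_indices(len(points), k=1)])
    C = np.array([(dists < r).mean() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# Toy check: a 2-D plane isometrically embedded in 10 ambient dimensions.
rng = np.random.default_rng(0)
latent = rng.uniform(size=(1500, 2))                # true intrinsic dimension: 2
basis, _ = np.linalg.qr(rng.normal(size=(10, 2)))   # orthonormal 10x2 embedding
X = latent @ basis.T                                # shape (1500, 10)

radii = np.logspace(np.log10(0.03), np.log10(0.12), 8)
est_id = correlation_dimension(X, radii)
print(f"estimated ID: {est_id:.2f}")  # near 2 despite the 10-D ambient space
```

The fit window for the radii matters: it must lie in the scaling regime where C(r) follows a power law, and boundary effects on a finite sample bias the slope slightly downward. Local estimators such as lFCI refine this global picture by restricting the analysis to neighborhoods, which is what lets them cope with curved (non-linear) manifolds.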