Talk:The BRAIN Initiative and its links to MSM Theme

Computational modeling for the US BRAIN Initiative: focusing the mathematical microscope

Mark Kramer, Bill Lytton, Raj Vadigepalli, IMAG BRAIN theme group, IMAG Computational Neuroscience group

How will we understand the brain? To start, we must observe the brain's dynamic activity: rich patterns propagating across spatial and temporal scales within the context of the brain's complex anatomy. A fundamental component of the BRAIN Initiative is the development of new technologies. Here we present suggestions for new technologies in simulation and mathematical modeling, spanning spatial scales from molecules to large neuronal populations and leveraging the strengths of the Multiscale Modeling (MSM) community.

Although statistical approaches and data mining are critical, the representations they produce lack a direct mechanistic, biological interpretation of brain function and make it difficult to connect phenomenology across scales. Understanding the brain will require combined approaches that link physical and conceptual reasoning (computational experiments) to statistical analysis of observed brain data. MSM leverages existing knowledge about neuronal function to gain insight into hidden biological mechanisms underlying brain processes that are not directly observed. This becomes critical in: 1. connecting scales where information gathered with different techniques is initially incommensurable; 2. identifying gaps in datasets where critical parameters have not yet been gathered; 3. making explicit, biological predictions that test not only the adequacy of the model but also the adequacy of the underlying dataset in explaining an emergent brain phenomenon of interest (e.g., dynamical, informational, behavioral, or cognitive).

The following new technologies are needed in simulation and mathematical analysis:

1. Directed high-dimensional nonlinear dynamical analysis methods. Detailed biological models necessarily reside in high-dimensional state spaces. However, methods for model analysis generally require reduction to 2 or perhaps 3 dimensions, where important dynamical-system features (e.g., manifolds of fixed points and limit cycles, and their stability) can be visualized. Computer simulations allow us to "see" all state variables simultaneously in high-dimensional space (e.g., 10^3 to 10^10 or more state variables). Rather than projecting these images down to our eyes (a data-mining dimensional reduction), we need to further develop "mathematical microscopes" that can drill down, permitting both a wide-angle, global perspective of the full system space (a mathematical telescope setting) and high-resolution, focused perspectives on individual system components. These techniques bring our virtual eyes, our manipulations, and our evaluation process into that space to assess the meaning of orbits.
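
As an illustration of the kind of tool envisioned here, the following Python sketch simulates a hypothetical network of coupled excitable units and then views the resulting high-dimensional trajectory at two "zoom levels": a global principal-component projection (the telescope setting) and a single unit's orbit (the microscope setting). The model, coupling matrix, and all parameter values are placeholders, not a proposed brain model.

```python
# Sketch: a "telescope then microscope" view of a high-dimensional simulation.
# Hypothetical example: a network of N coupled FitzHugh-Nagumo-like units is
# simulated, the full 2N-dimensional trajectory is projected onto its leading
# principal components (wide-angle view), and one unit's state variables are
# then inspected directly (focused view). All parameter values are illustrative.

import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
N = 200                                  # number of units -> 2N state variables
W = rng.normal(0, 0.05, (N, N))          # random coupling matrix (assumption)

def rhs(t, y):
    v, w = y[:N], y[N:]
    dv = v - v**3 / 3 - w + W @ v + 0.5  # voltage-like variables with coupling
    dw = 0.08 * (v + 0.7 - 0.8 * w)      # slow recovery variables
    return np.concatenate([dv, dw])

y0 = rng.normal(0, 0.5, 2 * N)
sol = solve_ivp(rhs, (0, 200), y0, max_step=0.1)
X = sol.y.T                              # shape (time, 2N): the full state space

# "Telescope": project the whole trajectory onto its top 3 principal components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
global_view = Xc @ Vt[:3].T              # low-dimensional global picture
print("variance captured by 3 PCs:", (S[:3]**2).sum() / (S**2).sum())

# "Microscope": drill down to a single unit's (v, w) orbit at full resolution.
unit = 17
local_view = X[:, [unit, N + unit]]
print("unit 17 orbit, first samples:\n", local_view[:5])
```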

2. Development of new measures of system stability, manifold structure and dimensionality, robustness, and orbit detection in large nonlinear systems. Comparable methods must be developed for the dynamics of mixed systems: ODEs coupled with delay equations, PDEs, and event-driven delay systems.
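
A minimal sketch of one such measure, under simplifying assumptions: local stability of a numerically located fixed point of a placeholder ODE system, assessed from the eigenvalue spectrum of a finite-difference Jacobian. Extending measures like this to the mixed ODE/delay/PDE/event-driven systems described above is exactly the technology gap.

```python
# Sketch: one candidate stability measure for a large ODE system.
# A fixed point is located numerically, a finite-difference Jacobian is built,
# and its eigenvalue spectrum summarizes local stability (all real parts < 0
# => locally stable). The vector field f below is a placeholder.

import numpy as np
from scipy.optimize import fsolve

def f(x, A):
    """Illustrative smooth vector field: linear coupling of saturating units."""
    return A @ np.tanh(x) - x

def numerical_jacobian(fun, x, eps=1e-6):
    """Forward-difference Jacobian of fun at x."""
    n = x.size
    J = np.zeros((n, n))
    f0 = fun(x)
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        J[:, j] = (fun(x + dx) - f0) / eps
    return J

rng = np.random.default_rng(1)
n = 50
A = rng.normal(0, 1 / np.sqrt(n), (n, n))     # random connectivity (assumption)

x_star = fsolve(lambda x: f(x, A), rng.normal(0, 0.1, n))   # locate a fixed point
J = numerical_jacobian(lambda x: f(x, A), x_star)
eigs = np.linalg.eigvals(J)

print("max real part of spectrum:", eigs.real.max())
print("locally stable fixed point?", bool(eigs.real.max() < 0))
```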

3. New simulation methods for multiscale models. Simulation at different spatial scales requires different techniques that must be made to work together. At the subcellular scale, this requires coupling stochastic simulators with 1D and 3D diffusion, as well as with reaction schemes defined deterministically or stochastically. At the neuronal network level, this requires hybrid simulations combining compartmental cells of varying complexity (including automated reduction of compartmental cell complexity) with extensions of basic integrate-and-fire cells. At still higher spatial scales, this requires methods for connecting neuronal networks to simulate brain areas, connecting brain areas to simulate systems, and finally connecting systems to simulate full-brain models.
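
To make the coupling concrete, here is a minimal hybrid sketch: an exact stochastic (Gillespie) simulation of a pool of two-state channels at the subcellular scale drives a leaky integrate-and-fire cell at the cellular scale, with the two advanced in lock step on a macro time step. All rate constants, conductances, and thresholds are illustrative, not taken from any particular preparation.

```python
# Sketch: a minimal two-scale hybrid, assuming fixed kinetic rates.
# A stochastic (Gillespie SSA) pool of two-state channels is advanced within
# each macro time step, and the resulting open-channel count drives a leaky
# integrate-and-fire (LIF) membrane. All numerical values are placeholders.

import numpy as np

rng = np.random.default_rng(2)

# --- subcellular scale: stochastic channel gating (exact SSA) ----------------
def ssa_step(n_open, n_total, k_on, k_off, dt):
    """Advance channel states by dt of simulated time using Gillespie's algorithm."""
    t = 0.0
    while True:
        a_open = k_on * (n_total - n_open)    # propensity: closed -> open
        a_close = k_off * n_open              # propensity: open -> closed
        a_sum = a_open + a_close
        if a_sum == 0.0:
            break
        tau = rng.exponential(1.0 / a_sum)    # waiting time to next reaction
        if t + tau > dt:
            break
        t += tau
        if rng.random() < a_open / a_sum:
            n_open += 1
        else:
            n_open -= 1
    return n_open

# --- cellular scale: leaky integrate-and-fire driven by the open fraction ----
dt, T = 0.1, 200.0                 # ms
n_total, n_open = 100, 0
k_on, k_off = 0.05, 0.1            # per ms, per channel (assumed values)
v, v_rest, v_thresh, v_reset = -65.0, -65.0, -50.0, -65.0
tau_m, g_max, e_syn = 20.0, 1.5, 0.0
spikes = []

for step in range(int(T / dt)):
    n_open = ssa_step(n_open, n_total, k_on, k_off, dt)
    g = g_max * n_open / n_total                       # cross-scale coupling
    dv = (-(v - v_rest) - g * (v - e_syn)) / tau_m
    v += dt * dv
    if v >= v_thresh:
        spikes.append(step * dt)
        v = v_reset

print(f"{len(spikes)} spikes; final open fraction {n_open / n_total:.2f}")
```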

4. Simulator interoperability. Multiple simulators have been moving toward one another by virtue of all adopting Python as a lingua franca. However, many difficulties remain in getting simulators to interoperate reliably. Standard interfacing must be developed via application programming interfaces (APIs) that define what information needs to be sent from one simulator to another, both at the same scale and across adjoining scales. This is made more difficult by the necessity of using high-performance computing (HPC) systems, which place different segments of a simulation on different processors.
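
The following sketch illustrates what such an API might look like; it is a hypothetical interface, not the interface of any existing simulator. Each coupled simulator exposes advance/publish/accept methods, and a coordinator exchanges named quantities on an agreed macro time step; on HPC systems the in-memory exchange would instead go over interprocess communication such as MPI.

```python
# Sketch of a hypothetical coupling API (invented for illustration).
# Each participating simulator, at the same scale or an adjoining scale,
# implements a minimal contract: advance to a time, publish named outputs,
# accept named inputs. A coordinator exchanges data each macro time step.

from abc import ABC, abstractmethod
from typing import Dict

class CoupledSimulator(ABC):
    """Minimal contract a simulator would implement to participate in coupling."""

    @abstractmethod
    def advance(self, t_stop: float) -> None:
        """Run internal dynamics from the current time up to t_stop."""

    @abstractmethod
    def get_outputs(self) -> Dict[str, float]:
        """Publish named quantities other simulators may consume."""

    @abstractmethod
    def set_inputs(self, inputs: Dict[str, float]) -> None:
        """Accept named quantities produced by other simulators."""

def co_simulate(sims: Dict[str, CoupledSimulator], wiring, t_end: float,
                dt_exchange: float) -> None:
    """Lock-step coordinator: advance all simulators, then exchange data.

    wiring maps (source_name, output_key) -> (target_name, input_key).
    """
    t = 0.0
    while t < t_end:
        t += dt_exchange
        for sim in sims.values():
            sim.advance(t)
        outputs = {name: sim.get_outputs() for name, sim in sims.items()}
        for (src, out_key), (dst, in_key) in wiring.items():
            sims[dst].set_inputs({in_key: outputs[src][out_key]})
```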

5. Development of databases for system sharing. Again, this needs to be done at multiple levels. Databases should be defined not only with the needs of experimentalists in mind but also to work readily with current and future simulators. A good example of the difficulty can be seen in existing anatomical databases, which have largely been developed for visualization by neuroanatomists but are often unfriendly to direct automated access by a simulator trying to extract needed information.
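
As a sketch of what simulator-friendly access could look like, the example below builds and queries a small connectivity table using an invented schema (table and column names are placeholders). The point is that a simulator should be able to assemble a connection matrix directly from a query, rather than scraping files designed for human visualization.

```python
# Sketch of simulator-facing database access, assuming a hypothetical schema.
# A simulator pulls exactly the quantities it needs (here, region-to-region
# projection weights) with a plain query and builds a connectivity matrix.

import sqlite3
import numpy as np

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE projections (
                    source_region TEXT, target_region TEXT, weight REAL)""")
conn.executemany(
    "INSERT INTO projections VALUES (?, ?, ?)",
    [("V1", "V2", 0.8), ("V2", "V1", 0.3), ("V1", "MT", 0.5), ("V2", "MT", 0.6)],
)

# Enumerate regions, then extract a dense connectivity matrix from the database.
regions = [r for (r,) in conn.execute(
    "SELECT source_region FROM projections "
    "UNION SELECT target_region FROM projections")]
index = {r: i for i, r in enumerate(sorted(regions))}
W = np.zeros((len(index), len(index)))
for src, dst, w in conn.execute(
        "SELECT source_region, target_region, weight FROM projections"):
    W[index[dst], index[src]] = w

print(sorted(index), "\n", W)
```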

6. Continuous co-system data mining. Increased automation will mean that data arrives continuously from two computational processes: ongoing (computer-directed, automated) experiments will be accompanied by ongoing simulations, with both probed through continuous data mining. This process can operate at at least four levels of complexity. Level 1: the data miner detects faults or errors in the experiment or in a simulation. Level 2: the data miner identifies activity of interest in either dataset and compares simulation with experiment, pulling out similarities. Level 3: co-adaptation. 3a: simulations are changed based on parameters just obtained from experiment; 3b: the automated experimental apparatus is redirected to obtain new parameters, different locations, or different aspects of a parameter being measured. Level 4: simulations automatically adapt to more closely match experiment via newly developed inductive or selective algorithms. Ideally these algorithms will closely mirror actual biological processes, so that the system models development and learning as well as dynamics.
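
A toy sketch of Levels 1 through 3a: a stand-in "experiment" streams firing-rate measurements, the data miner screens them for faults, compares them with a one-parameter stand-in simulation, and nudges that parameter toward agreement. Every component here is a placeholder for the real automated loop.

```python
# Sketch of a continuous co-system loop, covering Levels 1-3a above.
# The experiment stream, the one-parameter "simulation", and the update rule
# are all stand-ins; the structure of the loop is the point.

import numpy as np

rng = np.random.default_rng(3)

def experiment_stream(n, true_rate=12.0, noise=1.0):
    """Stand-in for an automated experiment streaming firing rates (Hz)."""
    for _ in range(n):
        yield true_rate + rng.normal(0, noise)

def simulate_rate(drive):
    """Stand-in simulation: firing rate as a simple function of an input drive."""
    return max(0.0, 4.0 * drive)

drive = 1.0                    # simulation parameter being co-adapted
eta = 0.02                     # adaptation step size (assumed)

for measured in experiment_stream(500):
    # Level 1: fault detection on the incoming data.
    if not np.isfinite(measured) or measured < 0 or measured > 200:
        continue                                    # discard implausible samples
    # Level 2: compare simulation with experiment.
    simulated = simulate_rate(drive)
    mismatch = measured - simulated
    # Level 3a: adapt the simulation parameter toward the experiment.
    drive += eta * mismatch

print(f"adapted drive {drive:.2f}, simulated rate {simulate_rate(drive):.1f} Hz")
```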
