Cell splitting in neural networks extends strong scaling (Hines et al. 2008)


The tree topology equations of a neuron can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; the communication cost is limited to each subtree sending and receiving two double-precision values per time step. Applying the cell-splitting method to two published network models yields good runtime scaling on twice as many processors as could be used effectively with whole-cell load balancing.
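The sketch below is a minimal single-process toy (not the ModelDB/NEURON code itself) illustrating the numerical idea under simplifying assumptions: an unbranched cable yields a tridiagonal system for one implicit time step, the cable is cut at a shared split node, each half triangularizes toward that node, the halves exchange only their partial diagonal and right-hand side at the split node (the two doubles mentioned above), and each half then back-substitutes independently. All variable names and the 50/50 division of the split node's diagonal are illustrative choices.

```python
# Toy illustration of the split-cell solve on a tridiagonal (unbranched cable) system.
# Hypothetical example code, not taken from the model entry or NEURON sources.
import numpy as np

N = 9          # number of compartments (arbitrary toy size)
s = 4          # index of the split node shared by both halves

rng = np.random.default_rng(0)
a = -rng.random(N)          # coupling to node i-1 (a[0] unused)
c = -rng.random(N)          # coupling to node i+1 (c[N-1] unused)
d = 4.0 + rng.random(N)     # diagonal (diagonally dominant, as for an implicit step)
b = rng.random(N)           # right-hand side

# Set up the two halves; the split node's diagonal and rhs are divided 50/50
# (an illustrative choice; any division that sums to the full values works).
dL, bL = d[: s + 1].copy(), b[: s + 1].copy()
dR, bR = d[s:].copy(), b[s:].copy()
dL[s], bL[s] = 0.5 * d[s], 0.5 * b[s]
dR[0], bR[0] = 0.5 * d[s], 0.5 * b[s]

# Left half: eliminate from its leaf (node 0) toward the split node.
for i in range(1, s + 1):
    f = a[i] / dL[i - 1]
    dL[i] -= f * c[i - 1]
    bL[i] -= f * bL[i - 1]

# Right half: eliminate from its leaf (node N-1) toward the split node.
for j in range(N - 2, s - 1, -1):       # global index j, local index j - s
    f = c[j] / dR[j - s + 1]
    dR[j - s] -= f * a[j + 1]
    bR[j - s] -= f * bR[j - s + 1]

# The only inter-half communication: each half sends and receives two doubles
# (its partial diagonal and partial rhs at the split node).
d_split = dL[s] + dR[0]
b_split = bL[s] + bR[0]

# Solve for the split-node voltage, then back-substitute independently in each half.
v = np.empty(N)
v[s] = b_split / d_split
for i in range(s - 1, -1, -1):
    v[i] = (bL[i] - c[i] * v[i + 1]) / dL[i]
for j in range(s + 1, N):
    v[j] = (bR[j - s] - a[j] * v[j - 1]) / dR[j - s]

# Check against a direct solve of the full system.
A = np.diag(d) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(v, np.linalg.solve(A, b)))   # True
```

In the actual parallel setting the two halves live on different processors and the two-double exchange is a genuine message per time step, which is why the split adds essentially no communication overhead.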

Model Type: Realistic Network

Region(s) or Organism(s): Generic

Model Concept(s): Methods

Simulation Environment: NEURON

Implementer(s): Hines, Michael [Michael.Hines at Yale.edu]

References:

Hines ML, Eichner H, Schürmann F (2008). Neuron splitting in compute-bound parallel network simulations enables runtime scaling with twice as many processors. Journal of Computational Neuroscience 25. [PubMed]
