## Variable Steps and Integration methods

Moderator: wwlytton

Bill Connelly
Posts: 86
Joined: Thu May 22, 2008 11:54 pm
Location: Australian National University

### Variable Steps and Integration methods

So my gap junction and inhibitory synapse coupled network (200 cells) runs pretty slowly. I tried various variable step methods, but they all ran considerably slower, and the local step method doesn't run at all. And it's not a very noisy network; most of the time the cells have only a subtly fluctuating membrane potential.

What's going on here?

ted
Posts: 5687
Joined: Wed May 18, 2005 4:50 pm
Location: Yale University School of Medicine
Contact:

### Re: Variable Steps and Integration methods

Models of individual cells that are singly connected (may be branched, but have no closed loops), or networks of such cells that are connected by events delivered by NetCons, are represented by a family of equations whose system matrix is tridiagonal. That is, all elements of the matrix are 0 except for the main diagonal and the diagonal immediately above and below it. The computational cost of solving such a system scales with the number of equations (number of compartments in the model).
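To make the O(n) claim concrete, here is a sketch of the standard Thomas algorithm for a tridiagonal system (this is an illustration of the general technique, not NEURON's actual solver code): one forward sweep and one back-substitution, so the work grows linearly with the number of compartments.

```python
def solve_tridiagonal(a, b, c, d):
    """Solve a tridiagonal system Ax = d with the Thomas algorithm.

    a: sub-diagonal   (length n, a[0] unused)
    b: main diagonal  (length n)
    c: super-diagonal (length n, c[-1] unused)
    d: right-hand side (length n)

    Cost is O(n): one forward elimination sweep, one back-substitution.
    """
    n = len(b)
    cp = [0.0] * n  # modified super-diagonal
    dp = [0.0] * n  # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward sweep
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back-substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

For example, the 3×3 system with diagonals (1, 2, 1) and right-hand side [3, 4, 3] has the solution [1, 1, 1].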

Closed loops, whether they result from peculiar cellular architectures or gap junctions, introduce off-diagonal nonzero terms that change the system matrix from tridiagonal to sparse. Sparse matrix solvers impose a computational burden that scales with the square of the number of compartments. A model with 100 compartments that involves gap junctions might take 100 times longer to simulate than a model with 100 compartments but no gap junctions.
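A toy sketch of why this happens (a hypothetical helper for illustration, not NEURON code): in an unbranched cable every compartment couples only to its neighbors, so all nonzero matrix entries sit within one position of the main diagonal. A single gap junction between two distant compartments adds a symmetric pair of entries far outside that band, and the tridiagonal solver no longer applies.

```python
def coupling_pattern(n, gap_junctions=()):
    """Return the set of (row, col) positions that are nonzero in the
    system matrix of an n-compartment unbranched cable, plus entries
    contributed by gap junctions (pairs of compartment indices)."""
    nz = set()
    for i in range(n):
        nz.add((i, i))              # main diagonal
        if i > 0:
            nz.add((i, i - 1))      # sub-diagonal
            nz.add((i - 1, i))      # super-diagonal
    for i, j in gap_junctions:
        nz.add((i, j))              # off-band entries: the matrix is
        nz.add((j, i))              # no longer tridiagonal
    return nz

cable = coupling_pattern(5)                          # purely tridiagonal
coupled = coupling_pattern(5, gap_junctions=[(0, 4)])
# (0, 4) and (4, 0) lie outside the tridiagonal band
```

Every entry in `cable` satisfies |row − col| ≤ 1, while `coupled` contains (0, 4), which forces a general sparse solver.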

Bill Connelly
Posts: 86
Joined: Thu May 22, 2008 11:54 pm
Location: Australian National University

### Re: Variable Steps and Integration methods

Well, I suppose that makes sense (as much as anything containing the phrase "the system matrix is tridiagonal" can make sense to a humble pharmacologist). But it doesn't seem to make sense that variable timestep integrators should run SLOWER.

ted