length scales and SparseEfficiencyWarning

Extending NEURON to handle reaction-diffusion problems.

Moderators: hines, wwlytton, ramcdougal

bschneiders
Posts: 31
Joined: Thu Feb 02, 2017 11:30 am

length scales and SparseEfficiencyWarning

Post by bschneiders » Fri Nov 08, 2019 7:02 pm

Hi. I have two likely related questions. The first is that I sometimes get the following warning:

Code: Select all

/Applications/NEURON-7.6/nrn/lib/python/neuron/rxd/section1d.py:155: SparseEfficiencyWarning: Changing the sparsity structure of a csr_matrix is expensive. lil_matrix is more efficient.
  g[io, io] += rate_r + rate_l
I get this warning a bunch of times, up through line 174 of section1d.py (the first, line 155, is in the setup for the diffusion matrix), and again in species.py and rxd.py. Sometimes the run seg faults; when it does, I rerun the exact same code and it runs fine, which is why it has taken me so long to get to the bottom of this. Any idea what exactly is going on here? Can I change a tolerance somewhere? I know "rate_r" and "rate_l" are related to section.L and section.nseg, which is why I think this is related to the question below.
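In case it helps, the warning itself comes from SciPy rather than NEURON: assigning into a position of a csr_matrix that was previously structurally zero triggers it. Here is a standalone snippet (my own minimal reproduction, not NEURON's actual assembly code) that shows the warning and the lil_matrix pattern SciPy suggests:

```python
# Standalone reproduction of the SciPy warning (no NEURON needed).
import warnings
from scipy.sparse import csr_matrix, lil_matrix, SparseEfficiencyWarning

n = 5
g = csr_matrix((n, n))
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always", SparseEfficiencyWarning)
    g[2, 2] += 1.0  # inserting a new nonzero into CSR changes its structure

# Building in LIL format first and converting afterwards avoids the warning:
g2 = lil_matrix((n, n))
g2[2, 2] += 1.0
g2 = g2.tocsr()
```

The warning is purely about performance (CSR insertion is expensive), so by itself it shouldn't change results.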

The second, related question concerns length scales (the diffusion matrix becomes an issue with large nseg / small compartments, which seems to make sense). I have been setting my segment lengths according to the d_lambda rule for most sections, but that only takes electrical properties into account. Voltage spreads far faster than calcium diffuses (I am using D_Ca = 0.5 um^2/ms), so with respect to calcium diffusion my segments should be much smaller. Is there any way to separate these length scales?
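For context, this is the kind of chemical analogue of the d_lambda rule I have in mind (my own back-of-the-envelope sketch, not a NEURON feature; the first-order removal rate k and the segments-per-lambda factor are placeholder assumptions):

```python
import math

def chemical_nseg(L, D, k, segs_per_lambda=10):
    """Pick nseg so each segment is a small fraction of the chemical
    space constant lambda_chem = sqrt(D / k).

    L : section length (um)
    D : diffusion coefficient (um^2/ms), e.g. 0.5 for calcium
    k : assumed first-order removal/buffering rate (1/ms)
    """
    lam = math.sqrt(D / k)                  # chemical space constant (um)
    n = int(math.ceil(segs_per_lambda * L / lam))
    return n if n % 2 == 1 else n + 1       # NEURON convention: odd nseg
```

With D = 0.5 um^2/ms and, say, k = 0.1 /ms, lambda_chem is only ~2.2 um, so this gives a far larger nseg than d_lambda does for the same section.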

Note: as you can see, I am still using NEURON 7.6 - apologies if this was addressed in 7.7! I haven't made the switch yet.

ramcdougal
Posts: 160
Joined: Fri Nov 28, 2008 3:38 pm
Location: Yale School of Public Health

Re: length scales and SparseEfficiencyWarning

Post by ramcdougal » Tue Nov 12, 2019 11:17 am

The first issue should go away when you upgrade to 7.7.

As far as tolerance goes: if you're using the variable-step solver, you can specify an atolscale when you declare the Species... I don't know that that's related, but I'm mentioning it just in case.
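That is, something along these lines (a sketch that requires NEURON with rxd enabled; the section setup and the 1e-6 value are just illustrative):

```python
from neuron import h, rxd

soma = h.Section(name="soma")
cyt = rxd.Region([soma], nrn_region="i")

# atolscale rescales the variable-step absolute tolerance for this
# species, useful when concentrations are orders of magnitude smaller
# than the membrane potential.
ca = rxd.Species(cyt, d=0.5, name="ca", charge=2, atolscale=1e-6)
```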

As of this time, we have no automated way of discovering the appropriate discretization. We have explored subsegment discretization, but that is currently not supported; for now, 1d simulations must use the same discretization for both chemical and electrical kinetics. An empirical test of the discretization is to triple nseg and see whether that qualitatively changes the results; if it does, you need a larger nseg.
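The pattern is just a convergence check. Schematically (with a toy stand-in metric here so the snippet runs on its own; in practice metric_at(n) would set nseg = n, call h.run(), and return e.g. peak calcium concentration):

```python
import math

def discretization_ok(metric_at, nseg, rtol=0.02):
    """Empirical discretization check: compute a summary metric at nseg
    and at 3 * nseg; if the two agree to within rtol, the spatial grid
    is probably fine enough."""
    a, b = metric_at(nseg), metric_at(3 * nseg)
    return abs(a - b) <= rtol * max(abs(a), abs(b))

# Toy stand-in for "run the simulation and summarize the result":
# the midpoint-rule average of sin(x) on [0, pi], which converges
# toward 2/pi as the grid is refined.
def metric_at(n):
    return sum(math.sin(math.pi * (i + 0.5) / n) for i in range(n)) / n
```

Here a coarse grid fails the check while a finer one passes, which is exactly the "triple nseg and compare" test above.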

That's not great, I know, but for what it's worth 7.7 should simulate the reactions faster.

bschneiders
Posts: 31
Joined: Thu Feb 02, 2017 11:30 am

Re: length scales and SparseEfficiencyWarning

Post by bschneiders » Tue Nov 12, 2019 12:34 pm

That's good to know about 7.7, thanks!

As for the length scales and tolerance, I figured that was the case, but it was worth a shot. I believe I have set atolscale values appropriately (the Atol Scale Tool caught some that I had missed - very handy), but I don't think that addresses the discretization issue I'm describing. I'll try the test you mention and see if that helps. Thanks!
