vector.play memory efficiency

pascal
Posts: 89
Joined: Thu Apr 26, 2012 11:51 am

vector.play memory efficiency

Post by pascal »

I have a simulation where I need to play hundreds of vectors into synaptic conductances. These vectors are all just scaled versions of one another. Here is an outline of the code:

Code:

signal = np.loadtxt(...)  # some txt file that defines the trace for synaptic gmax
syn_gmax_vec = h.Vector(signal)
syn_veclist = []
for cell in cellList:
    # Divide gmax by the number of connections projecting to this cell,
    # so that total synaptic weight is the same for all cells
    # irrespective of the number of incoming connections.
    norm_vec = syn_gmax_vec.c().div(cell.inDeg)  # copy, then scale in place
    syn_veclist.append(norm_vec)
    syn_veclist[-1].play(cell.synlist[0]._ref_gmax, h.dt)
It is my understanding that in the code above, I must keep copies of all the vectors to be played (in this case, I store them in syn_veclist). The problem is that for long simulations with large numbers of neurons, this consumes *a lot* of RAM. Alternatively, I could use a callback that reads in the signal values and adjusts the synaptic conductances timestep-by-timestep, but this would slow down the simulation tremendously.
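For a sense of how quickly the per-cell copies add up, here is a rough back-of-the-envelope estimate (the cell count, run length, and time step below are hypothetical, purely for illustration):

```python
# Rough RAM estimate for keeping one played Vector per cell.
# All values here are hypothetical, for illustration only.
n_cells = 1000       # neurons, each with its own scaled copy of the signal
tstop_ms = 10_000.0  # a 10 s simulation
dt_ms = 0.025        # NEURON's default fixed time step

n_samples = round(tstop_ms / dt_ms)  # samples per played vector
bytes_per_vector = n_samples * 8     # hoc Vectors store doubles

per_cell_copies_gb = n_cells * bytes_per_vector / 1e9
single_shared_mb = bytes_per_vector / 1e6

print(per_cell_copies_gb)  # 3.2 GB for 1000 per-cell copies
print(single_shared_mb)    # 3.2 MB for one shared vector
```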
So here’s my question: is there any way to implement vector.play in such a way that I can just store one vector in memory, and apply scaled versions of it to numerous variables? Thanks for the help.

ted
Site Admin
Posts: 5769
Joined: Wed May 18, 2005 4:50 pm
Location: Yale University School of Medicine
Contact:

Re: vector.play memory efficiency

Post by ted »

The correct answer to your question depends on exactly what you are trying to do. Do you have a sampled time course of synaptic conductance (either from a real experiment, or precomputed) that you want each synaptic instance to follow (give or take an instance-specific scale factor)? Or do you have event-driven synapses and you want to drive their gmax parameters (again, subject to an instance-specific scale factor)?

pascal
Posts: 89
Joined: Thu Apr 26, 2012 11:51 am

Re: vector.play memory efficiency

Post by pascal »

The first option: a sampled time course of synaptic conductance.

ted
Site Admin
Posts: 5769
Joined: Wed May 18, 2005 4:50 pm
Location: Yale University School of Medicine
Contact:

Re: vector.play memory efficiency

Post by ted »

Good. Presumably you are using a point process of some kind as the synaptic mechanism. I will need to see its NMODL source code. Is that available online, or would you prefer to email it to me?
ted dot carnevale at yale dot edu

pascal
Posts: 89
Joined: Thu Apr 26, 2012 11:51 am

Re: vector.play memory efficiency

Post by pascal »

Here is the mod file Ted recommended for implementing vector.play in a memory-efficient way:

Code:

COMMENT
VecSyn, a "synaptic mechanism" whose conductance
is driven by Vector.play.

Actual synaptic conductance is gs.
gs is the product of a scale factor k and gnorm,
where gnorm is in microsiemens and its values are driven
by a pair of Vectors that define
recorded or precalculated values.

Default parameter values are
gnorm 0 microsiemens
k     1
erev  0 millivolt
so default value of gs and i will be 0.

This implementation silently guards against gs < 0,
but it might be better to issue an error message
and halt the simulation if gs < 0 is encountered.
ENDCOMMENT

NEURON {
   POINT_PROCESS VecSyn
   GLOBAL gnorm : as a reminder to the user
     : (gnorm is a PARAMETER, and PARAMETERs are global by default)
   RANGE k, erev, gs
   NONSPECIFIC_CURRENT i
}

PARAMETER {
   gnorm = 0 (microsiemens)
   k = 1 (1)
   erev = 0 (millivolt)
}

ASSIGNED {
   gs (microsiemens)
   i (nanoamp)
   v (millivolt)
}

BREAKPOINT {
   gs = k*gnorm
   if (gs < 0) {
     i = 0
   } else {
     i = gs*(v - erev)
   }
}
In NEURON, then, you play a single Vector into gnorm. Because gnorm is GLOBAL, one Vector.play call drives every instance of this mechanism, so only one copy of the signal is ever held in memory. You then set each instance's k (a RANGE variable) to whatever scale factor you wish, e.g. 1/inDeg.
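The per-instance arithmetic in the BREAKPOINT block above is just gs = k*gnorm plus a guard against negative conductances. As a sanity check outside NEURON, that arithmetic can be sketched in plain Python (the trace values and in-degrees here are made up for illustration; in the real model the single shared Vector would be played into the mechanism's GLOBAL gnorm):

```python
# Plain-Python sketch of what VecSyn computes each time step:
# gs = k * gnorm, with i forced to 0 whenever gs would be negative.
# The shared trace and in-degrees below are made-up illustration values.

gnorm_trace = [0.0, 0.001, 0.002, 0.001, -0.0005]  # one shared trace (uS)
v = -65.0  # a fixed membrane potential (mV), just for this sketch

def vecsyn_current(k, gnorm, v, erev=0.0):
    """Mirror of VecSyn's BREAKPOINT block."""
    gs = k * gnorm
    if gs < 0:
        return 0.0            # silent guard against gs < 0
    return gs * (v - erev)    # uS * mV = nA

# Per-instance scale factors: k = 1/inDeg normalizes total synaptic weight.
in_degrees = [2, 4]
for in_deg in in_degrees:
    k = 1.0 / in_deg
    currents = [vecsyn_current(k, g, v) for g in gnorm_trace]
```

Only gnorm_trace is stored once; each synapse carries nothing but its scalar k, which is where the memory saving comes from.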

Thanks, Ted!
