Hi everyone,
I have noticed a quite annoying phenomenon while recording spike times of a single cell: time dilates during the simulation.
To explain: the first 10 seconds take 12 s to simulate,
seconds 20 to 30 take 20 s,
and seconds 40 to 50 take 40 s.
The spike times are stored in a vector, which is processed into a histogram that is then fitted with a custom function. This works pretty well for short simulations (i.e. one or two minutes), but the recording time grows so much that recording 10 minutes takes more than 2 hours (not counting the processing, which is still running)!
Do you have any ideas for improvement?
I assumed that the output vector processing starts only after the recording ends; how can I be sure of that?
Thanks
Simulation duration
Displaying the cell's membrane potential
It seems this was the source of the problem. The recording is really fast without the display.
The previous recording completed within 2 minutes!
- Site Admin
- Posts: 6384
- Joined: Wed May 18, 2005 4:50 pm
- Location: Yale University School of Medicine
Your post contains only a few clues, so I will have to make some guesses.
1. Are you using the standard run system to execute a simulation, or have
you invented your own "for" or "while" loop to iterate time from 0 to tstop?
If the latter, does your code try to build a new histogram at each
fadvance()? Don't do that. Use the standard run system, and make a
very simple proc that automates the process of launching a simulation,
then analyzing results after the simulation is complete.
Example:
Code:
load_file("nrngui.hoc")
load_file("modelspec.hoc") // specifies biological properties that are represented
load_file("instrumentation.hoc") // specifies clamps, other signal sources
// that perturb the model
load_file("gui.ses") // RunControl, graphs, etc.
load_file( . . . code that records time course of continuous variables to Vectors . . . )
load_file( . . . code that records event times to Vectors . . . )
load_file( . . . procs and funcs that, when called, will analyze recorded results
e.g. proc postprocess() . . . )
proc myrun() {
  run()           // standard run system's procedure that launches a simulation run
  postprocess()   // your own procedure that does whatever analysis you like
                  // e.g. generate a histogram from event times captured to a Vector
}

myrun() // executes a simulation, then analyzes results
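To see why rebuilding the histogram inside the simulation loop is so much slower than one postprocess() pass at the end, here is a toy illustration in plain Python (not NEURON hoc; the function names are made up for this sketch) that just counts how many recorded items each approach scans:

```python
# Toy illustration (plain Python, not NEURON hoc) of why rebuilding a
# histogram at every fadvance() is costly. After k steps, a from-scratch
# rebuild has scanned 1 + 2 + ... + k = k*(k+1)/2 recorded items in total
# (O(k^2)), while a single pass after run() scans each item once (O(k)).

def work_rebuild_every_step(n_steps):
    total_ops = 0
    for k in range(1, n_steps + 1):
        total_ops += k  # the rebuild scans all k items recorded so far
    return total_ops

def work_single_pass_at_end(n_steps):
    return n_steps  # one scan over the recorded data after the run

print(work_rebuild_every_step(1000))   # 500500 scans
print(work_single_pass_at_end(1000))   # 1000 scans
```

This is exactly the "time dilates" symptom: each simulated second costs more wall-clock time than the one before, because the per-step work keeps growing.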
2. If you are already doing something like the solution proposed in (1),
the problem may be in the algorithm you're using to generate the
histogram. Until recently, NEURON's own hoc library used a histogram
algorithm whose execution times scaled with the square of the number of
data points. See the comments by Hines in this thread
https://www.neuron.yale.edu/phpBB2/viewtopic.php?t=991
Try the latest alpha version of NEURON.
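For comparison, a histogram can be built in time linear in the number of spikes by indexing each spike directly into its bin. A minimal sketch in plain Python (the function name and parameters are illustrative, not part of NEURON's library):

```python
def spike_histogram(spike_times, t_stop, bin_width):
    """Bin spike times in O(n): each spike is placed directly into its
    bin by index, instead of scanning every bin for every spike."""
    n_bins = int(t_stop // bin_width) + 1
    counts = [0] * n_bins
    for t in spike_times:
        counts[int(t // bin_width)] += 1
    return counts

# Four spikes, 1 s bins over a 10 s recording:
print(spike_histogram([1.0, 2.5, 2.7, 9.9], t_stop=10.0, bin_width=1.0))
# [0, 1, 2, 0, 0, 0, 0, 0, 0, 1, 0]
```

With a linear-time binning like this, doubling the recording length roughly doubles the postprocessing time instead of quadrupling it.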