Simulation duration
Posted: Fri Oct 12, 2007 4:53 am
Hi everyone,
I have observed a quite annoying behavior while recording the spike times of a single cell: time dilates during the simulation.
To explain: the first 10 seconds of simulated time take 12 s of wall-clock time,
seconds 20 to 30 take 20 s,
and seconds 40 to 50 take 40 s.
The spike times are put into a vector, which is processed into a histogram, which is then fitted by a custom function. This works pretty well for short simulations (i.e. one or two minutes), but the recording time increases so much that recording 10 minutes takes more than 2 hours (not counting the processing, which is still running)!
Do you have any ideas of improvement?
I assumed that the output vector processing starts only after the end of the recording; how can I be sure of that?
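(As a general sketch of the structure I mean, not NEURON-specific and not your actual code: the spike-time array, rate, and bin width below are all made-up placeholders. The idea is to keep the recording loop free of any histogramming or fitting, and to do all processing in one separate step that runs strictly after recording ends, so the per-second simulation cost stays flat.)

```python
import numpy as np

def record_spikes(duration_s, rate_hz, rng):
    # Stand-in for the simulation: only collects spike times.
    # No histogramming or fitting happens here, so nothing
    # accumulates extra work as the recording gets longer.
    n = rng.poisson(rate_hz * duration_s)
    return np.sort(rng.uniform(0.0, duration_s, size=n))

def postprocess(spike_times, bin_width_s):
    # Runs exactly once, after the recording has finished.
    edges = np.arange(0.0, spike_times.max() + bin_width_s, bin_width_s)
    counts, edges = np.histogram(spike_times, bins=edges)
    return counts, edges

rng = np.random.default_rng(0)
spikes = record_spikes(600.0, 10.0, rng)  # 10 min at ~10 Hz (placeholder values)
counts, edges = postprocess(spikes, 1.0)  # 1 s bins, built only after the run
```

If the histogram (or the fit) is instead updated inside the recording loop, every simulated second pays for reprocessing all spikes collected so far, which would produce exactly the kind of progressive slowdown described above.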
Thanks