Since the new version (6.2.3) came out, I have been using the to_python function to copy NEURON Vectors into Python lists. The to_python function seems to allocate new memory that it never frees.
I wrote a simple example to illustrate the problem.
Example 1:
16 GB of memory fills within 30 seconds:
import neuron
h = neuron.h
h("""
objref vec
vec = new Vector(1000)
""")
for i in range(100000):
    a = range(1000)
    h.vec.to_python(a)
######################
Example 2:
Everything OK:
import neuron
h = neuron.h
h("""
objref vec
vec = new Vector(1000)
""")
a = range(1000)
for i in range(100000):
    h.vec.to_python(a)
################
Example 3:
Everything OK:
import neuron
h = neuron.h
h("""
objref vec
vec = new Vector(1000)
""")
for i in range(100000):
    a = range(1000)
h.vec.to_python(a)
I need to run a loop, and I do not know the size of the recorded vector before the loop starts, so I need the pattern of Example 1 to work.
Any ideas, any help?
Armin
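Until a fixed version is available, one workaround in the spirit of Example 2 above is to keep a single Python list alive across iterations and resize it in place only when the recorded size actually changes, so each iteration reuses the same object instead of allocating a fresh one. A minimal sketch (pure Python; the helper name is made up for illustration):

```python
def resized(buf, n):
    """Grow or shrink buf in place to length n, reusing the same list object."""
    if len(buf) != n:
        buf[:] = [0.0] * n
    return buf

a = []                      # created once, before the loop
first = resized(a, 1000)    # sized on first use
second = resized(a, 1000)   # later iterations reuse the same object
assert first is a and second is a and len(a) == 1000
```

The resized list can then be passed to to_python() inside the loop, exactly as Example 2 does with its fixed-size list.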
to_python() causes big memory problems
Re: to_python() causes big memory problems
That leak has been fixed in the alpha version at
http://www.neuron.yale.edu/ftp/neuron/versions/alpha/
The to_python and from_python functions still exist, but they are more or less obsolete, since better idioms are now available:
v = h.Vector(numpy_array)
v = h.Vector(python_list)
python_list = list(v)
numpy_array = numpy.array(v)
This approach also gives full speed between numpy and hoc without requiring numpy to be available at NEURON build time; i.e., the --enable-numpy option has been removed, since it is no longer needed.
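A side note on the copy semantics of these idioms: list(v) and numpy.array(v) each build a new, independent copy of the data. A minimal sketch, using a plain Python list as a stand-in for the hoc Vector (any sequence behaves the same way under these two idioms):

```python
import numpy as np

# Stand-in for a hoc Vector: a plain Python sequence.
v = [0.0, 1.0, 2.0, 3.0]

arr = np.array(v)   # copies the data into a fresh numpy array
arr[0] = 99.0
assert v[0] == 0.0  # the original sequence is untouched

lst = list(v)       # likewise, list(v) builds a new Python list
assert lst == v and lst is not v
```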
Re: to_python() causes big memory problems
Does this new idiom offer a way to access, from Python, a Vector that was created in a .hoc script? That is one useful thing about to_python(), since parts of my code are in .hoc scripts and parts are in .py scripts.
Re: to_python() causes big memory problems
If to_python() is deprecated, which in-place operation can I use to move data from a hoc Vector into a numpy array?
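One general in-place pattern is to allocate the numpy array once and copy into it with slice assignment, rather than building a fresh array on every iteration. A sketch, using a plain Python list as a hypothetical stand-in for the hoc Vector (the slice-assignment semantics are the same for any sequence); with NEURON itself, passing a preallocated list to to_python(), as in Example 2 above, reuses storage in a similar way:

```python
import numpy as np

# Hypothetical stand-in: a plain list plays the role of the hoc Vector.
hoc_like = [0.25 * i for i in range(401)]  # mimics Vec.indgen(0, 100, 0.25)

buf = np.empty(len(hoc_like))  # allocated once, e.g. before a recording loop
buf[:] = hoc_like              # in-place copy into the existing buffer
assert buf[4] == 1.0 and buf[-1] == 100.0
```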
Re: to_python() causes big memory problems
mattions wrote: If to_python() is deprecated
Who said it was deprecated?
Re: to_python() causes big memory problems
hines wrote: ...
The to_python and from_python still exist but are more or less obsolete since better idioms are now
v = h.Vector(numpy_array)
v = h.Vector(python_list)
python_list = list(v)
numpy_array = numpy.array(v)
...
So it's OK to use it?
Re: to_python() causes big memory problems
to_python and from_python are neither obsolete nor deprecated.
Re: to_python() causes big memory problems
I think the leak has returned.
NEURON -- VERSION 7.3 ansi (1078:2b0c984183df) 2014-04-04
Running under ipython, memory use reaches 2.5 GB in 5 seconds with this code:
Code: Select all
from neuron import h
Vec = h.Vector()
Vec.indgen(0, 100, 0.25)
while 1:
    Vec.to_python()
Also, when I run the original code that arb posted, memory use reaches 3.1 GB:
Code: Select all
import neuron
h = neuron.h
h("""
objref vec
vec = new Vector(1000)
""")
for i in range(100000):
    a = range(1000)
    h.vec.to_python(a)
Re: to_python() causes big memory problems
Thanks for pointing this out. The fix is
http://www.neuron.yale.edu/hg/neuron/nr ... d2c45fb1b4