Issue Installing Neuron in Ubuntu 16.04 LTS
Posted: Wed May 11, 2016 9:42 am
Hi, I've looked around and so far failed to find anyone else with this problem; if this is a duplicate post, please do let me know and I'll head over there.
I'm trying to install NEURON for use with PyNN on a fresh install of Ubuntu 16.04 LTS, and so far I have had no success.
I downloaded the nrn.tar.gz and iv.tar.gz files and tried running:
Code: Select all
./configure --prefix=`pwd` --with-nrnpython --with-paranrn
However, this fails with the following error:
Code: Select all
...
checking for X... no
configure: error: cannot find X11
If I try to call X11, it tells me that it needs installing, but when I run the following to install it, the system reports that 'x11-common is already the newest version (1:7.7+13ubuntu3)' and therefore installs nothing:
Code: Select all
sudo apt-get install x11-common
I have also tried building the alpha version, but running './build.sh' yielded the following:
Code: Select all
libtoolize -c -f -i
libtoolize: putting auxiliary files in '.'.
libtoolize: copying file './config.guess'
libtoolize: copying file './config.sub'
libtoolize: copying file './install-sh'
libtoolize: copying file './ltmain.sh'
libtoolize: Consider adding 'AC_CONFIG_MACRO_DIRS([m4])' to configure.in,
libtoolize: and rerunning libtoolize and aclocal.
libtoolize: Consider adding '-I m4' to ACLOCAL_AMFLAGS in Makefile.am.
aclocal: warning: autoconf input should be named 'configure.ac', not 'configure.in'
The .deb file also seems to fail... although there is no output at all when using that.
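Coming back to the X11 error: my current guess is that configure is looking for the X11 *development* headers and libraries, which x11-common doesn't provide. If that's right, something like this might get configure past the check (the exact package names are my assumption):

```shell
# x11-common only ships shared X infrastructure, not the headers that
# configure's "checking for X..." test compiles against. The development
# packages (assumed names for Ubuntu 16.04) should provide them:
sudo apt-get install libx11-dev libxext-dev

# then re-run configure from the nrn source directory
./configure --prefix=`pwd` --with-nrnpython --with-paranrn
```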
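And once a build does succeed, this is the quick smoke test I plan to use to see whether the Python side works at all (just an import check; the `|| echo` keeps it from aborting if the module isn't there yet):

```shell
# quick smoke test: can Python import the neuron module?
python -c 'from neuron import h; print("NEURON Python bindings OK")' \
  || echo "neuron module not importable yet"
```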
Has anyone managed to get NEURON working with Python on Ubuntu 16.04 LTS, or seen this X11 issue before?
Many thanks,
Jon.