GPU-computing and NEURON?
Moderator: hines
Recently, I've been reading about the growing trend of using GPUs for general computational programming. For programs that are highly parallel, GPUs offer substantial gains in computational power over CPUs, and I thought that some of the programs I've written for NEURON could stand to run faster. Is it possible to use a GPU with NEURON? Is there any work being done in this area?
- Site Admin
- Posts: 6300
- Joined: Wed May 18, 2005 4:50 pm
- Location: Yale University School of Medicine
Re: GPU-computing and NEURON?
GPUs sound wonderful indeed. But that bright star is a long way in the distance. The chief shortcomings of GPUs are:
- lack of support for double-precision floating-point math
- lack of open-source development tools
If the excitement about the potential of GPUs becomes persistent and pervasive enough, maybe these limitations will disappear sooner rather than later.
In the meanwhile, there is plenty of low-hanging fruit to be plucked. Parallelization of NEURON with MPI has already been in wide use for the past couple of years, and big new performance improvements continue to appear--for the latest, see the articles by Hines, Eichner, and Schuermann, and by Hines, Markram, and Schuermann at http://www.neuron.yale.edu/neuron/bib/nrnpubs.html. MPI parallelization is suitable for use on supercomputers, workstation clusters, and standalone PCs and Macs with multicore processors.
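To make the MPI picture a little more concrete, parallel NEURON models typically distribute cells across ranks round-robin by global id (gid). Here is a minimal pure-Python sketch of that idiom; the function name assign_gids and the cell counts are invented for illustration, not part of NEURON's API:

```python
# Hypothetical sketch of the round-robin cell distribution used in
# MPI-parallel NEURON models: each rank owns every nhost-th gid.
def assign_gids(ncell, rank, nhost):
    """Return the global cell ids owned by this rank (round-robin)."""
    return list(range(rank, ncell, nhost))

# With 10 cells on 4 ranks, rank 0 owns gids 0, 4, 8:
rank0 = assign_gids(10, 0, 4)  # [0, 4, 8]

# Every cell is owned by exactly one rank:
all_gids = sorted(g for r in range(4) for g in assign_gids(10, r, 4))
# all_gids == [0, 1, 2, ..., 9]
```

In an actual model the rank and host count would come from NEURON's ParallelContext rather than being passed in by hand.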
Development of multithreaded NEURON is moving along nicely. It has two chief advantages: speed improvements on multicore machines without having to revise source code, and the ability to use the GUI (which can't be done with parallel execution under MPI). This will require a substantial amount of beta testing, because it involves lots of changes to NEURON's internals, especially stuff related to adaptive integration and the event delivery system.
I should also mention a third development area in which there has been significant progress: the use of Python as an alternative interpreter. This has the potential to benefit all NEURON users, partly by speeding up the development of new NEURON-specific tools, but also by making available the enormous body of scientific and mathematical software that already exists as Python libraries. And that saves programmer time, which is even more important than computer time.
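As a toy illustration of that second point, a few lines of numpy suffice for analyses that would take longer to hand-code in hoc. The spike times below are made up for the example, not output from any real model:

```python
import numpy as np

# Made-up spike times (ms) of the sort one might record from a simulation.
spike_times = np.array([12.5, 30.1, 47.8, 65.2, 83.0])
tstop = 100.0  # simulation duration in ms

# Mean firing rate in Hz and interspike intervals in ms:
rate_hz = len(spike_times) * 1000.0 / tstop  # 50.0
isi = np.diff(spike_times)                   # 4 intervals
```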
Re: GPU-computing and NEURON?
Hi,
I wonder if anything new has happened with NEURON and GPU usage since this post? The world of GPU computing has evolved significantly since 2008.
Thanks,
Stephen
Re: GPU-computing and NEURON?
Given a significant expansion of resources (read "additional support"), this would become an active part of the NEURON project. As things stand, there is plenty of other stuff to be done that does have support. In the meantime, NEURON is an open source project, so in principle anyone who wants to start and actively participate in a GPU development branch can certainly do so.