General validation of NEURON?

Post by dkolodru »

Does anyone know of a resource with general validation data for NEURON?

I am very new to this field, and am trying to figure out how NEURON has been validated with experimental data. I see individual papers with specific cases where a NEURON model is compared to experimental data, but I cannot seem to find an overview of all potential validation data. Does such a thing exist?

Thank you in advance!

Post by ted »

An interesting question, not least because it seems skew to NEURON's primary design goal: to serve as a tool for implementing and performing experiments on computational models of biological neurons and networks. Every computational model is only an expression, in computational form, of a conceptual model or hypothesis. "Expression in computational form" means "expression in a way that is equivalent to a set of equations". NEURON's role in computational modeling is to provide a convenient means for creating such expressions, and a computational engine for generating numerical solutions with sufficient speed and accuracy.
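To make "expression in computational form" concrete, here is roughly what that looks like through NEURON's Python interface. This is only a minimal sketch, not drawn from any published model, and the parameter values are arbitrary:

    from neuron import h
    h.load_file("stdrun.hoc")               # standard run system

    # conceptual model: an isopotential patch of squid-like membrane,
    # i.e. one compartment with Hodgkin-Huxley channels, driven by a current step
    soma = h.Section(name="soma")
    soma.L = soma.diam = 20                 # um
    soma.insert("hh")                       # built-in Hodgkin-Huxley mechanism

    stim = h.IClamp(soma(0.5))
    stim.delay, stim.dur, stim.amp = 5, 50, 0.1   # ms, ms, nA

    t = h.Vector().record(h._ref_t)         # record time
    v = h.Vector().record(soma(0.5)._ref_v) # and membrane potential

    h.finitialize(-65)                      # mV
    h.continuerun(60)                       # integrate the model equations, ms
    print("peak Vm = %.1f mV" % v.max())

The first half of that is the hypothesis ("an isopotential patch of squid-like membrane driven by a current step"); the rest just asks NEURON to generate a numerical solution of the equations that the hypothesis implies.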

Given this background, a proper notion of "validation" of NEURON would mean one of three things:
1. validation of the ease with which users can implement their hypotheses in computational form
2. validation by the modeler that there is a close match between the conceptual model and the expression of that model in computational form
3. validation of the numerical methods that NEURON uses to generate simulations.

We could discuss how each of these can be addressed, but from your question it seems that your actual concern is not about NEURON itself, but about the validity of the models that NEURON users create. That's up to the users themselves, not NEURON or NEURON's developers.
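That said, the third item is the one that concerns NEURON itself, and it can be checked directly by comparing simulations against cases that have closed-form solutions. A rough sketch, using a single passive compartment charging under a current step (parameter values are arbitrary; they were chosen only so that the membrane time constant comes out to 10 ms):

    from math import pi, exp
    from neuron import h
    h.load_file("stdrun.hoc")

    soma = h.Section(name="soma")
    soma.L = soma.diam = 20                 # um
    soma.insert("pas")                      # passive membrane only
    soma.g_pas = 1e-4                       # S/cm2 -> tau_m = cm/g_pas = 10 ms
    soma.e_pas = -65                        # mV

    stim = h.IClamp(soma(0.5))
    stim.delay, stim.dur, stim.amp = 0, 1e9, 0.01   # 10 pA step starting at t = 0

    t = h.Vector().record(h._ref_t)
    v = h.Vector().record(soma(0.5)._ref_v)
    h.finitialize(-65)
    h.continuerun(50)                       # ms

    # analytical solution: V(t) = e_pas + Rin*I*(1 - exp(-t/tau))
    area = pi * soma(0.5).diam * soma.L * 1e-8      # lateral area of the cylinder, cm2
    rin = 1.0 / (soma(0.5).g_pas * area)            # input resistance, ohm
    tau = 1e-3 * soma(0.5).cm / soma(0.5).g_pas     # membrane time constant, ms
    dv = stim.amp * 1e-9 * rin * 1e3                # steady-state deflection, mV
    e_pas = soma(0.5).e_pas

    err = max(abs(v[i] - (e_pas + dv * (1 - exp(-t[i] / tau))))
              for i in range(len(t)))
    print("max |NEURON - analytical| = %g mV" % err)

With the default integrator and time step the discrepancy should be a small fraction of a millivolt, and it shrinks as dt is reduced; that is the sense in which the third item can be addressed.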

It should also be remembered that the notion of "validation" is easily misapplied, and in some cases may not be valid. Consider this very common scenario in neuroscience research:

A scientist is interested in phenomenon A which has been observed in some biological system (a cell or network of cells). The scientist knows some of the properties P of the biological system, and imagines that a particular subset S might account for phenomenon A. However, the elements in S have their own complexities such that neither unaided intuition nor nonlinear dynamical analysis is sufficient to decide that S can account for A. The scientist uses NEURON (or MATLAB or C or whatever) to build a computational model that describes the behavior and interactions of the elements in S, and uses the model to perform some computational experiments designed to see if A happens. If A does happen, the scientist has a result that is roughly equivalent to an "existence proof." Where in this process is there a role for "validation" in the sense of your use of the term?
"I cannot seem to find an overview of all potential validation data. Does such a thing exist?"
"Model validation" being a primary responsibiliity of modelers themselves (and only in those cases where one has decided that validation is relevant), the best place to look for evidence of validation is in the work of modelers who have published a series of papers on one or more related models. Or perhaps work done in response to RFAs that explicitly require model validation (maybe some of the SPARC-funded stuff?).

Post by ted »

"modelers who have published a series of papers on one or more related models"
Potential examples include the hippocampus models from I. Soltesz's lab, and the neocortical rhythmogenesis modeling by Stephanie Jones at Brown.

Post by dkolodru »

Thank you,

This is a helpful clarification. I was certainly conflating NEURON itself with the range of models that have been implemented using NEURON.

Thank you as well for the examples; I think they should be helpful.

Post by ted »

Two other places to check: both the Allen Institute and the European Human Brain Project have used NEURON extensively in their research. It seems likely that at least some of that work would have involved validation of the many models they have been building. And of course anyone who happens to read this thread and has participated in either of those projects, or in any other research with NEURON that included model validation, is welcome to comment.

Post by watersun »

Just want to throw in that there is a validation suite called HippoUnit, published this year, for CA1 pyramidal cell (PC) models.

https://journals.plos.org/ploscompbiol/ ... bi.1008114

It uses rat experimental data from different sources.
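To give a rough idea of the kind of check such a suite automates (this is not HippoUnit's actual API, just a generic feature-vs-data comparison, and the "experimental" numbers below are made up):

    from neuron import h
    h.load_file("stdrun.hoc")

    soma = h.Section(name="soma")
    soma.L = soma.diam = 20
    soma.insert("hh")                        # stand-in for a real CA1 PC model

    v = h.Vector().record(soma(0.5)._ref_v)
    h.finitialize(-65)
    h.continuerun(200)                       # let the model settle

    resting_vm = v[len(v) - 1]               # model "feature": resting potential
    exp_mean, exp_sd = -65.0, 3.0            # hypothetical experimental mean / SD
    z = (resting_vm - exp_mean) / exp_sd     # score the feature against the data
    print("resting Vm = %.1f mV, Z = %.2f" % (resting_vm, z))

The suite itself does this kind of thing systematically, for many features at once; see the paper for details.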

Post by ted »

"Just want to throw in that there is a validation suite called HippoUnit, published this year, for CA1 pyramidal cell (PC) models."
IMO "HippoUnit" looks like an example of an attempt to use technology to sove a training problem. The training problem is the inadequate preparation of so many who are new to mechanistic modeling of biological systems.

Inadequately prepared in what way? First, many modelers come from a background in mathematics, physics, or engineering, which is both a strength and a weakness. It's a strength because this means they are strong in quantitative methods, maybe even know some nonlinear dynamics, and are familiar with expressing hypotheses in mathematical form. But it's a weakness because so few know anything about biology, and at the outset most lack the judgement and experience necessary to decide what biological complexities should be included in a conceptual model of a biological system or phenomenon.

Second, regardless of background, too few beginning modelers understand that deciding what to leave out of a conceptual model is as important as deciding what to include. Or that model design depends strongly on the purpose for which the model is being constructed. Thus there can be dozens of different models of a synapse, a cell, or a circuit, each one including certain aspects of biology but omitting others, NONE of which replicates all that is known about any particular synapse, cell or circuit, yet each one being useful (therefore "valid") for addressing the particular question that motivated its construction.

And this brings up the difficulty of discovering exactly what aspects of biology are represented in a particular model, and how they are represented. This is a particular problem with models that are expressed in procedural code rather than some form of declarative model specification, e.g. NeuroML or CellML. Of all the tools used to create a model specification, whether procedural or declarative, NEURON is the only one that provides a tool (Model View) that presents a concise, interactive, and browsable summary of what has actually been created. In principle, any simulator specific to the domain of computational neuroscience could offer its own Model View tool, but AFAIK none of the others does.
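Model View is launched from the GUI (NEURON Main Menu > Tools > Model View). For anyone who prefers working at the interpreter, the snippet below is a much cruder analogue of part of what it reports; it simply dumps what each section of a model actually contains:

    from neuron import h

    # build (or load) some model -- here just a toy section for illustration
    soma = h.Section(name="soma")
    soma.insert("hh")
    stim = h.IClamp(soma(0.5))

    # print what is actually in the model: geometry, cm, Ra, inserted density
    # mechanisms, point processes, and their parameter values
    # (in recent NEURON versions, sec.psection() returns similar information
    # as a Python dict, as far as I know)
    for sec in h.allsec():
        h.psection(sec=sec)

Model View presents the same sort of information, but organized into a browsable outline, which is far more useful for anything bigger than a toy model.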

Finally, a modeling paper's "Introduction" or "Background" section and its "Methods" section are potentially the most useful guides to discovering why any particular biological feature was included in a model. This is why journals are doing their readers no favor when they relegate the Methods section to the tail end of the paper, or worse, to a "Supplemental materials and methods" section that is "optionally available from the journal's web site."