White noise and Fox-Lu

The basics of how to develop, test, and use models.
okanerkaymaz

White noise and Fox-Lu

Post by okanerkaymaz »

How can I implement the Fox-Lu algorithm?
Also, how is white noise added to the alpha, beta, and n parameters?
ted
Site Admin
Posts: 6384
Joined: Wed May 18, 2005 4:50 pm
Location: Yale University School of Medicine
Contact:

Post by ted »

How can I implement the Fox-Lu algorithm?
I wouldn't know a Fox-Lu algorithm if it bit me. What is it?
Also, how is white noise added to the alpha, beta, and n parameters?
I'm not sure what you mean.
okanerkaymaz

Post by okanerkaymaz »

Hi, friends,
How do I add the Fox-Lu algorithm to my code? In particular, gn, gh, and gm are noise terms; after I calculate them at every t, how do I add them to m(t), h(t), and n(t)?

regards
Raj
Posts: 220
Joined: Thu Jun 09, 2005 1:09 pm
Location: Groningen, The Netherlands
Contact:

Post by Raj »

Original reference:
Fox RF, Lu Y, Phys. Rev. E 49:3421 (1994).

Available from the APS website:
http://prola.aps.org/pdf/PRE/v49/i4/p3421_1

I see no fundamental problem with implementing this algorithm in a mod file using NEURON's random number generator. Just don't try to use it with variable-timestep methods.

Before you start, it is important to find out which distributions are available to you in mod files. NetStim.mod uses the negative exponential distribution (exprand), and I think the normal distribution (normrand) is also available in mod.
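For intuition before diving into NMODL, here is a minimal sketch (in Python, purely illustrative, not NEURON code) of how the Fox-Lu Langevin equation for the potassium n gate is typically discretized with Euler-Maruyama. The function names and the channel count are my own choices; check the variance and scaling against the paper:

```python
import math
import random

def alpha_n(v):
    """HH potassium opening rate (1/ms), singularity at v = -55 mV removed."""
    x = (v + 55.0) / 10.0
    if abs(x) > 1e-6:
        return 0.1 * x / (1.0 - math.exp(-x))
    return 0.1 / (1.0 - 0.5 * x)

def beta_n(v):
    """HH potassium closing rate (1/ms)."""
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def fox_lu_step(n, v, dt, n_channels, rng=random):
    """One Euler-Maruyama step of the Fox-Lu Langevin equation for n:

        dn = [alpha*(1-n) - beta*n]*dt + sqrt(var_n * dt) * xi,  xi ~ N(0,1),
        var_n = 2*alpha*beta / (N*(alpha + beta))

    Note the sqrt(dt) on the noise term: that is what makes it white noise
    under a fixed-timestep integration.
    """
    a, b = alpha_n(v), beta_n(v)
    drift = a * (1.0 - n) - b * n
    var_n = 2.0 * a * b / (n_channels * (a + b))
    n_new = n + drift * dt + math.sqrt(var_n * dt) * rng.gauss(0.0, 1.0)
    return min(1.0, max(0.0, n_new))  # keep the gating variable in [0, 1]
```

For large channel counts the noise term becomes small and the trajectory hugs the deterministic HH solution, which is the regime where Fox-Lu is a good approximation to Markov-chain channel simulation.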
Raj
Posts: 220
Joined: Thu Jun 09, 2005 1:09 pm
Location: Groningen, The Netherlands
Contact:

Post by Raj »

Below you will find a mod file implementing Fox-Lu (potassium channel only) as I understand it. I have no access to the original paper here, so you will need to verify what I did. It is also untested, except for compilation and units checking.

There is one issue to which I have no quick answer, and I hope Michael or Ted can comment on it: which integration schemes can be used when the derivative has a stochastic component? By analogy with the original HH-style mechanism you will find cnexp here, but I think a simpler integration scheme might be needed. A full answer requires detailed knowledge of how the integration schemes are implemented.

Code: Select all

: Fox-Lu algorithm for HH-style K-channel.
: Author: Ronald van Elburg

NEURON {
    SUFFIX HH_FL
    USEION k READ ek WRITE ik
    RANGE gk_single, KSingle 
}

UNITS {
    (S)     = (siemens)
    (mV)    = (millivolt)
    (mA)    = (milliamp)
}

PARAMETER { 
    gk_single = 0.00036 (S/cm2)         : Single channel conductance (here a 
                                        : fantasy value at present) reasonable
                                        : value is needed here.
                                        
    KSingle   = 100   (1)               : The number of potassium channels
}

ASSIGNED {
    v (mV)
    ek (mV)                
    ik (mA/cm2)
}

STATE {
    n
}

BREAKPOINT {
    SOLVE states METHOD cnexp
    ik = gk_single * KSingle * n^4 * (v - ek)
}

INITIAL {
    : Assume v has been constant for a long time
    n = alpha(v)/(alpha(v) + beta(v))
}

DERIVATIVE states {
    : Computes state variable n at present v & t
    n' = (1-n)*alpha(v) - n*beta(v) + gn()
}

FUNCTION alpha(Vm (mV)) (/ms) {
    LOCAL x
    UNITSOFF
    x = (Vm+55)/10
    if (fabs(x) > 1e-6) {
        alpha = 0.1*x/(1 - exp(-x))
    }else{
        alpha = 0.1/(1 - 0.5*x)
    }
    UNITSON
}

FUNCTION beta(Vm (mV)) (/ms) {
    UNITSOFF
    beta = 0.125*exp(-(Vm+65)/80)
    UNITSON
}

FUNCTION varn() {
    UNITSOFF
    varn=(2/KSingle)*(alpha(v)*beta(v))/(alpha(v)+beta(v))
    UNITSON
}

FUNCTION gn() (/ms) {
    UNITSOFF
    : NOTE: this draws a fresh sample each call with standard deviation
    : sqrt(varn()); for white noise under a fixed-timestep scheme the
    : standard deviation should arguably be sqrt(varn()/dt) -- verify
    : against the paper.
    gn=normrand(0,sqrt(varn()))
    UNITSON
}
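Since the mod file is untested, here is one sanity check, sketched in Python with the same rate functions (an illustration, not NEURON code; the channel counts and step counts are arbitrary choices of mine): Fox-Lu predicts that the stationary fluctuations of n shrink like 1/sqrt(KSingle), so a 100-fold increase in channel count should reduce the standard deviation roughly 10-fold.

```python
import math
import random
import statistics

def alpha_n(v):
    """HH potassium opening rate (1/ms), singularity removed."""
    x = (v + 55.0) / 10.0
    return 0.1 * x / (1.0 - math.exp(-x)) if abs(x) > 1e-6 else 0.1 / (1.0 - 0.5 * x)

def beta_n(v):
    """HH potassium closing rate (1/ms)."""
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate_n(v, dt, n_channels, n_steps, rng):
    """Euler-Maruyama trace of the Fox-Lu n gate at fixed voltage."""
    a, b = alpha_n(v), beta_n(v)
    var_n = 2.0 * a * b / (n_channels * (a + b))
    n = a / (a + b)  # start at steady state so there is no transient
    trace = []
    for _ in range(n_steps):
        n += (a * (1.0 - n) - b * n) * dt + math.sqrt(var_n * dt) * rng.gauss(0.0, 1.0)
        n = min(1.0, max(0.0, n))
        trace.append(n)
    return trace

rng = random.Random(1)
sd_small = statistics.pstdev(simulate_n(-65.0, 0.025, 100, 200000, rng))
sd_big = statistics.pstdev(simulate_n(-65.0, 0.025, 10000, 200000, rng))
ratio = sd_small / sd_big  # theory: close to sqrt(10000/100) = 10
```

Note also that this discretization multiplies the noise sample by sqrt(dt); sampling normrand(0, sqrt(varn())) once per timestep, as the mod file above does, makes the effective noise amplitude depend on dt, which is one of the things worth verifying against the paper.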
