Time scale calculus is a unification of the theory of difference equations and standard calculus. Invented in 1988 by the German mathematician Stefan Hilger, it has applications in any field that requires simultaneous modelling of discrete and continuous data.

Basic Theory

Define a time scale, or measure chain, T, to be a nonempty closed subset of the real line, R.


sigma(t) = inf{s in T : s > t}   (forward jump operator)
rho(t) = sup{s in T : s < t}     (backward jump operator)

Let t be an element of T.

t is left dense if rho(t) = t,

    right dense if sigma(t) = t,
    left scattered if rho(t) < t,
    right scattered if sigma(t) > t,
    dense if both left dense and right dense.

Define the graininess mu of a measure chain T by mu(t) = sigma(t) - t.
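The jump operators and graininess above can be sketched for a finite time scale. This is only an illustration under the assumption that T is given as a finite sorted list of points (the names sigma, rho, and mu mirror the definitions, not any standard library):

```python
def sigma(T, t):
    """Forward jump operator: the smallest point of T strictly greater
    than t. By convention sigma(t) = t when t is the maximum of T."""
    bigger = [s for s in T if s > t]
    return min(bigger) if bigger else t

def rho(T, t):
    """Backward jump operator: the largest point of T strictly less
    than t. By convention rho(t) = t when t is the minimum of T."""
    smaller = [s for s in T if s < t]
    return max(smaller) if smaller else t

def mu(T, t):
    """Graininess: mu(t) = sigma(t) - t."""
    return sigma(T, t) - t

# Example: a time scale mixing consecutive integers and a gap.
T = [0, 1, 2, 5, 6]
print(sigma(T, 2))  # -> 5, so 2 is right scattered
print(rho(T, 5))    # -> 2, so 5 is left scattered
print(mu(T, 2))     # -> 3
```

On a continuous interval every interior point is dense and mu is 0 there; on the example above mu(t) records the size of each forward jump.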

Take a function f : T -> R. (More generally, the codomain can be any Banach space; we take the real line for simplicity.)

Definition: the generalised derivative or delta derivative f^delta(t) is the number (when it exists) with the following property:

for every epsilon > 0 there exists a neighbourhood U of t such that

|f(sigma(t)) - f(s) - f^delta(t)(sigma(t) - s)| <= epsilon|sigma(t) - s|   for all s in U.

Take T = R. Then sigma(t) = t, mu(t) = 0, and f^delta = f' is the derivative used in standard calculus. If T = Z (the integers), then sigma(t) = t + 1, mu(t) = 1, and f^delta = Delta f is the forward difference operator used in difference equations.
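The T = Z case can be checked directly: there the delta derivative reduces to the forward difference f^delta(t) = f(t + 1) - f(t). A minimal sketch (the function name is chosen here for illustration):

```python
def delta_derivative_Z(f, t):
    """Delta derivative on the time scale T = Z: since sigma(t) = t + 1
    and mu(t) = 1, it is exactly the forward difference operator."""
    return f(t + 1) - f(t)

# For f(t) = t^2 the forward difference is (t+1)^2 - t^2 = 2t + 1.
f = lambda t: t ** 2
print(delta_derivative_Z(f, 3))  # -> 7
```

Compare with T = R, where the same limit definition yields f'(3) = 6 for f(t) = t^2; the extra 1 is the graininess term mu(t) * 1 that vanishes as mu -> 0.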