A dynamical system is a deterministic process in which a variable's value changes over time according to a well-defined rule that involves only the variable's current value.
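This definition can be made concrete with a minimal sketch of a discrete dynamical system: the state is updated by repeatedly applying a fixed rule that depends only on the current value. The particular rule f(x) = x/2 below is a hypothetical choice for illustration, not one from the text.

```python
def iterate(f, x0, steps):
    """Return the orbit [x0, f(x0), f(f(x0)), ...] of length steps + 1."""
    orbit = [x0]
    for _ in range(steps):
        # the next value depends only on the current value orbit[-1]
        orbit.append(f(orbit[-1]))
    return orbit

halving = lambda x: x / 2
print(iterate(halving, 8.0, 4))  # [8.0, 4.0, 2.0, 1.0, 0.5]
```

The sequence of values produced this way is called the orbit of the initial state.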
Dynamical systems and chaos theory
This branch of mathematics deals with the long-term qualitative behavior of dynamical systems. Here, the focus is not on finding precise solutions to the equations defining the dynamical system (which is often hopeless), but rather on answering questions like "Will the system settle down to a steady state in the long term, and if so, what are the possible steady states?" or "Does the long-term behavior of the system depend on its initial condition?"
An important goal is to describe the fixed points, or steady states, of a given dynamical system; these are values of the variable that do not change over time. Some fixed points are attractive, meaning that if the system starts out in a nearby state, it will converge towards the fixed point.
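As a hedged numerical sketch of an attractive fixed point, consider the logistic map f(x) = r·x·(1−x) with r = 2.5 — an assumed standard example, not one named in the text. Its nonzero fixed point is x* = 1 − 1/r = 0.6, and nearby orbits converge to it:

```python
def logistic(x, r=2.5):
    # one step of the logistic map f(x) = r * x * (1 - x)
    return r * x * (1 - x)

x = 0.2            # start near, but not at, the fixed point x* = 0.6
for _ in range(50):
    x = logistic(x)

print(abs(x - 0.6) < 1e-9)  # True: the orbit has converged to the fixed point
```

The fixed point is attractive here because the map's derivative at x* has absolute value less than 1, so each step shrinks the distance to x*.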
Similarly, one is interested in periodic points: states of the system that repeat themselves after several time steps. Periodic points can also be attractive. Sarkovskii's theorem is an interesting statement about the number of periodic points of a one-dimensional discrete dynamical system.
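An attractive periodic orbit can be sketched with the logistic map at r = 3.2, an assumed parameter value at which the fixed point has lost stability and orbits are drawn to a cycle of period 2:

```python
def logistic(x, r=3.2):
    # one step of the logistic map f(x) = r * x * (1 - x)
    return r * x * (1 - x)

x = 0.3
for _ in range(1000):          # let transients die out
    x = logistic(x)

once = logistic(x)             # one step leads to a different state
twice = logistic(once)        # two steps return (numerically) to x

print(abs(once - x) > 1e-3)    # True: x is not a fixed point
print(abs(twice - x) < 1e-9)   # True: x is periodic with period 2
```

The state x here is a periodic point of period 2: it is not fixed under one application of the map, but it is (numerically) fixed under two.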
Even simple nonlinear dynamical systems often exhibit seemingly random, practically unpredictable behavior that has been called chaos. The branch of dynamical systems that deals with the precise definition and investigation of chaos is called chaos theory.
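A hallmark of chaos is sensitive dependence on initial conditions. The sketch below uses the logistic map at r = 4 — an assumed, standard example of a chaotic one-dimensional map — and tracks two orbits that start a mere 1e-10 apart:

```python
def logistic(x, r=4.0):
    # one step of the fully chaotic logistic map f(x) = 4 * x * (1 - x)
    return r * x * (1 - x)

a, b = 0.2, 0.2 + 1e-10   # two nearly identical initial conditions
gap = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    gap = max(gap, abs(a - b))  # largest separation seen so far

# the microscopic initial difference is amplified to macroscopic size,
# making long-term prediction from imprecise data hopeless
print(gap > 1e-3)  # True
```

Because small measurement errors grow roughly exponentially under such a map, the long-term behavior of a chaotic system cannot be predicted in practice even though the rule itself is deterministic.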
We distinguish between linear dynamical systems and nonlinear dynamical systems. In linear systems, the right-hand side of the equation is an expression that depends linearly on x, as in