In computability theory, the theory of real computation deals with hypothetical computing machines using infinite-precision real numbers. Within this theory, it is possible to prove interesting statements such as "the complement of the Mandelbrot set is only partially decidable". For other such powerful machines, see super-Turing computation and hypercomputation.
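To make the "partially decidable" claim concrete, the standard escape-time argument gives a procedure that halts exactly on points outside the Mandelbrot set, assuming exact real arithmetic. The sketch below can only approximate this on a digital machine: the floating-point numbers and the iteration cap are assumptions of the sketch, not part of the theoretical procedure.

```python
# Sketch of the escape-time semi-decision procedure for the complement of the
# Mandelbrot set.  On an idealised real computer the loop below, run with exact
# real arithmetic and no iteration cap, halts if and only if c lies outside the
# Mandelbrot set (|z| eventually exceeds 2).  max_iter and Python floats are
# concessions to a digital machine, assumed only for this illustration.
def escapes(c: complex, max_iter: int = 10_000) -> bool:
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:      # escape radius 2: once outside, the orbit diverges
            return True     # c is certainly in the complement
    return False            # undetermined within the cap; the ideal procedure would keep running

print(escapes(1 + 0j))   # True: 1 lies outside the Mandelbrot set
print(escapes(0 + 0j))   # False within the cap: 0 lies inside, so the ideal loop never halts
```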

These hypothetical computing machines can be viewed as idealised analog computers which operate on real numbers and are differential, in the sense that their behaviour is governed by differential equations, whereas digital computers are limited to computable numbers and are algebraic, proceeding by discrete algebraic steps.
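As a hedged illustration of the differential/algebraic distinction, the sketch below contrasts an idealised analog integrator, whose output is defined by a differential equation, with the discrete algebraic recurrence a digital computer actually executes when it approximates the same quantity. The equation dy/dt = y, the step size, and the test point are illustrative assumptions.

```python
import math

# An idealised analog integrator solving dy/dt = y, y(0) = 1 would output exactly
# y(t) = e^t.  A digital computer can only approximate it with a finite algebraic
# recurrence on rounded numbers (forward Euler here); dt is an arbitrary choice
# made for this sketch.
def euler_exp(t_end: float, dt: float = 1e-4) -> float:
    y = 1.0
    t = 0.0
    while t < t_end:
        y += dt * y          # algebraic update standing in for continuous integration
        t += dt
    return y

print(euler_exp(1.0))        # ~2.7181..., a finite-precision approximation
print(math.e)                # 2.71828..., the value the ideal differential machine would yield
```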

This means that idealised analog computers have a larger information dimension rate (see information theory), or potential computing domain, than do digital computers. In theory, this enables analog computers to solve problems that are intractable on digital computers.
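One standard way to formalise the underlying notion is Rényi's information dimension, of which the "information dimension rate" of a signal is the per-time extension; the display below is a sketch of that definition, added here for illustration rather than taken from the text above.

```latex
% Rényi information dimension of a random variable X, where \lfloor mX \rfloor / m
% denotes X quantised to a grid of spacing 1/m:
d(X) \;=\; \lim_{m \to \infty} \frac{H\!\bigl(\lfloor mX \rfloor / m\bigr)}{\log_2 m}
% A finite-alphabet (digital) signal has d(X) = 0, while a real-valued signal with an
% absolutely continuous distribution has d(X) = 1; this is the sense in which an
% idealised analog machine has a larger information dimension rate than a digital one.
```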

Computer theorists often refer to these idealised analog computers as real computers (so called because they operate on the set of real numbers), in order to avoid confusion with real-world analog computers.

Could a real "real computer" ever be built?

Real-world analog computers are far from attaining this ideal, with noise and other errors completely swamping any hypothetical computation-theoretic advantages.

Even assuming perfect Newtonian physics, the atomic nature of matter makes all mechanical linkages highly irregular at small scales. Unless the apparatus is at absolute zero, it will be perturbed by its own thermal vibrations, which are impossible to remove. Similarly, the discrete nature of classical electric charge makes perfectly accurate analog electronic computers impossible, even in the absence of the usual sources of noise such as shot noise and 1/f noise.
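To put a rough number on how thermal noise limits analog precision, the short calculation below estimates the Johnson-Nyquist noise voltage of a resistor and the corresponding number of distinguishable levels in a 1 V signal. The resistance, temperature, bandwidth, and full-scale voltage are illustrative assumptions, not values from the text.

```python
import math

# Johnson-Nyquist thermal noise: V_rms = sqrt(4 * k_B * T * R * bandwidth).
# The component values below are assumptions chosen only to illustrate the scale.
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # room temperature, K
R = 1e3                 # 1 kOhm resistor
bandwidth = 1e6         # 1 MHz analog bandwidth
full_scale = 1.0        # 1 V signal range

v_noise = math.sqrt(4 * k_B * T * R * bandwidth)
effective_bits = math.log2(full_scale / v_noise)

print(f"thermal noise ~ {v_noise * 1e6:.1f} microvolts rms")     # ~ 4.1 µV
print(f"~ {effective_bits:.0f} bits of usable precision")        # ~ 18 bits, far from infinite precision
```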

Moving beyond classical physics, quantum effects such as the Heisenberg uncertainty principle and zero-point energy present even more problems for building a real computer, making it difficult or impossible to set up real-valued inputs and read out real-valued results.
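As a rough, hedged illustration of the quantum obstacle, the calculation below applies the uncertainty relation Δx·Δp ≥ ħ/2 to a hypothetical attempt to read a position-encoded real value to 1 picometre precision; the target precision and the choice of an electron as the carrier are assumptions made for this sketch.

```python
# Reading a real value encoded in a particle's position to precision dx forces a
# minimum momentum uncertainty dp >= hbar / (2 * dx).  The 1 pm target and the
# electron mass are illustrative assumptions.
hbar = 1.054571817e-34          # reduced Planck constant, J*s
m_e = 9.1093837015e-31          # electron mass, kg

dx = 1e-12                      # attempt to read the position to 1 picometre
dp_min = hbar / (2 * dx)        # minimum momentum uncertainty, kg*m/s
dv_min = dp_min / m_e           # corresponding velocity uncertainty, m/s

print(f"dp >= {dp_min:.2e} kg*m/s")   # ~ 5.3e-23 kg*m/s
print(f"dv >= {dv_min:.2e} m/s")      # ~ 5.8e7 m/s, an enormous disturbance from a single fine readout
```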

Some physical theories such as loop quantum gravity hypothesise that space and time themselves are quantised: this would also present major problems for building a "real computer".