Six Sigma is a quality management program whose goal is "six sigma" levels of quality, that is, extremely low defect rates. It was pioneered by Motorola in the mid-1980s and has since spread to many other manufacturing companies, and increasingly to service companies as well. In 2000, Fort Wayne, Indiana became the first city to implement the program in a city government.

Six Sigma aims to have failures in quality, or in customer satisfaction, occur only beyond the sixth sigma of likelihood in a normal distribution. Here sigma stands for one standard deviation; designing processes with tolerances of at least six standard deviations will, on reasonable assumptions, yield fewer than 3.4 defects per million. (See below for those assumptions.)

Achievement of six-sigma quality is defined by Motorola in terms of the number of Defects Per Million Opportunities (DPMO).

That is, fewer than four in one million customers will have a legitimate issue with the company's products and services.
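
As an illustration of the DPMO arithmetic (the counts below are made up for the example, and the split into units and opportunities per unit is the usual convention rather than part of Motorola's wording above), here is a minimal sketch in Python:

 # Hypothetical inspection results.
 defects = 17                 # defects observed
 units = 50_000               # units inspected
 opps_per_unit = 10           # opportunities for a defect in each unit
 
 # DPMO scales the observed defect rate to one million opportunities.
 dpmo = defects / (units * opps_per_unit) * 1_000_000
 print(dpmo)                  # 34.0 -- ten times the six-sigma target of 3.4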

Many people believed that six-sigma quality was impossible and settled for three to four sigma. However, market leaders have measurably reached six sigma in numerous processes.


Why Six?

Anyone looking at a table of probabilities for the normal (Gaussian) distribution will wonder what six sigma has to do with 3.4 defects per million opportunities. Only about one billionth of the normal curve lies beyond six standard deviations, or about two billionths if you count both too-high and too-low values. Conversely, a mere three sigma corresponds to about 2.7 defects per thousand, which would seem a good result in many businesses.
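
These tail figures can be checked with nothing more than the standard normal distribution; the following sketch uses only Python's standard library (upper_tail is our own helper name):

 import math
 
 def upper_tail(z):
     # P(Z > z) for a standard normal Z, via the complementary error function.
     return 0.5 * math.erfc(z / math.sqrt(2))
 
 print(upper_tail(6.0))       # ~9.9e-10: about one billionth beyond six sigma
 print(2 * upper_tail(3.0))   # ~0.0027:  about 2.7 per thousand outside +/- 3 sigma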

The answer has to do with practical considerations for manufacturing processes. (The following discussion is based loosely on the treatment by Robert V. Binder in a discussion of whether six-sigma practices can apply to software [1].) Suppose that the tolerance for some manufacturing step (perhaps the placement of a hole into which a pin must fit) is 300 microns, and the standard deviation for the process of drilling the hole is 100 microns. Then only about 1 part in 370 will be out of spec. But in a manufacturing process, the average value of a measurement is likely to drift over time, and the drift can be as much as 1.5 standard deviations in either direction. About 6.7% of output always lies more than 1.5 sigma beyond the mean on any one side. Thus, when the process mean has drifted 150 microns toward one limit, that 6.7% of the product is off by 150 + 150, or more than 300 microns, and therefore out of spec. This is a high defect rate.
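
The same kind of tail calculation reproduces both figures in the example, assuming 300-micron limits, a 100-micron standard deviation, and a worst-case mean shift of 1.5 sigma toward one limit:

 import math
 
 def upper_tail(z):
     return 0.5 * math.erfc(z / math.sqrt(2))
 
 tolerance = 300.0            # spec limit, microns
 sigma = 100.0                # process standard deviation, microns
 shift = 1.5                  # assumed worst-case drift of the mean, in sigmas
 
 # Centered process: both tails beyond +/- 3 sigma are out of spec.
 print(2 * upper_tail(tolerance / sigma))       # ~0.0027, about 1 part in 370
 
 # Drifted process: the nearer limit is only 1.5 sigma from the mean
 # (the far tail, beyond 4.5 sigma, is negligible).
 print(upper_tail(tolerance / sigma - shift))   # ~0.067, about 6.7% out of spec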

If you set the tolerance to six sigma, then a drift of 1.5 sigma in the manufacturing process will still produce a defect only for parts that are more than 4.5 sigma from the drifted average in the direction of the nearer limit. By the mathematics of the normal curve, this is about 3.4 defects per million.
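
Under the same 1.5-sigma-drift assumption, the famous figure falls out directly:

 import math
 
 # Six-sigma limits minus a 1.5-sigma drift leave 4.5 sigma to the nearer limit.
 defect_rate = 0.5 * math.erfc(4.5 / math.sqrt(2))
 print(defect_rate * 1_000_000)   # ~3.4 defects per million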

There is another reason for six sigma: a manufactured item probably has more than one part, and some of the parts will have to fit together, which means that the total error in two or more parts must be within tolerance. If each step is done only to three-sigma precision, an item with 100 parts will hardly ever be defect-free. With six sigma, even an object with 10,000 parts can be made defect-free about 96% of the time.
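
Assuming each part's defects occur independently, the yields quoted above follow from the per-part rates (again with the worst-case 1.5-sigma drift):

 import math
 
 def upper_tail(z):
     return 0.5 * math.erfc(z / math.sqrt(2))
 
 # Per-part defect rate toward the nearer limit after a 1.5-sigma mean shift.
 p3 = upper_tail(3.0 - 1.5)   # three-sigma tolerance: ~6.7% defective
 p6 = upper_tail(6.0 - 1.5)   # six-sigma tolerance:   ~3.4 per million
 
 # Yield: the probability that every part is good.
 print((1 - p3) ** 100)       # ~0.001: a 100-part item is almost never defect-free
 print((1 - p6) ** 10_000)    # ~0.966: a 10,000-part item is good ~96% of the time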

Clearly, many things on which people rely (services, software products, etc.) are not manufactured by machine tools to particular measurements. In these cases, "six sigma" has nothing to do with normal distributions, but refers to a goal of very few defects per million, by analogy to a manufacturing process. The usefulness of the analogy is controversial among those concerned with quality in non-manufacturing processes.
