Any system which depends on human reliability is unreliable.
The only difference between the fool and the criminal who attacks a system is that the fool attacks unpredictably and on a broader front.
Undetectable errors are infinite in variety. Detectable errors do not exist, unless the deadline is less than three hours away.
Investment in reliability will increase until it exceeds the probable cost of errors, or until someone insists on getting some real work done.
At the source of every error which is blamed on the computer you will find at least two human errors, including the error of blaming it on the computer.
A system tends to grow in complexity rather than simplify, until the resulting unreliability becomes intolerable.
Self-checking systems tend to have complexity in proportion to the inherent unreliability of the systems in which they are used.
The error-detection and correction capabilities of any system serve as the key to understanding the types of errors it cannot handle.
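The law above can be made concrete with a minimal sketch (an illustrative example, not part of the original maxims): a single even-parity bit detects any odd number of bit flips, but an even number of flips is invisible to it. The check's capability defines exactly the class of errors it cannot handle.

```python
def parity(bits):
    """Return the even-parity bit for a sequence of 0/1 values."""
    return sum(bits) % 2

def with_parity(bits):
    """Append the parity bit so the whole word has even parity."""
    return bits + [parity(bits)]

def check(word):
    """True if the word (data + parity bit) passes the parity check."""
    return sum(word) % 2 == 0

data = [1, 0, 1, 1, 0, 0, 1]
word = with_parity(data)
assert check(word)          # intact word passes

one_flip = word[:]
one_flip[2] ^= 1
assert not check(one_flip)  # single-bit error: detected

two_flips = word[:]
two_flips[2] ^= 1
two_flips[5] ^= 1
assert check(two_flips)     # double-bit error: passes unnoticed
```

The assertions show the boundary directly: one flip is caught, two flips slip through, so the mechanism's blind spot is readable off its design.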
All real programs contain errors until proved otherwise — which is impossible.