anal-y-sis: an examination of a complex, its elements, and their relations
Webster's Dictionary

An optimizing compiler transforms programs to improve their efficiency without changing their output. There are many transformations that improve efficiency:
Register allocation: Keep two nonoverlapping temporaries in the same register.
Common-subexpression elimination: If an expression is computed more than once, eliminate one of the computations.
Dead-code elimination: Delete a computation whose result will never be used.
Constant folding: If the operands of an expression are constants, do the computation at compile time.
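As a concrete illustration of the last transformation, here is a minimal constant-folding pass, sketched in Python over an invented tuple representation of expressions; the representation and the name `fold` are assumptions of this sketch, not structures from the text.

```python
# Minimal constant-folding sketch.  An expression is either a leaf
# (an int constant or a variable name) or a tuple (op, left, right).
# This tuple representation is an invention of the example.

def fold(expr):
    """Recursively replace constant subexpressions by their values."""
    if not isinstance(expr, tuple):        # leaf: constant or variable name
        return expr
    op, left, right = expr
    left, right = fold(left), fold(right)  # fold the children first
    if isinstance(left, int) and isinstance(right, int):
        if op == "+":
            return left + right            # do the arithmetic at "compile time"
        if op == "*":
            return left * right
    return (op, left, right)               # operands not constant: keep the node

print(fold(("+", ("*", 2, 3), "x")))       # -> ('+', 6, 'x')
```

The subexpression `2 * 3` is computed at compile time, while the addition survives because the variable `x` is unknown until run time.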
This is not a complete list of optimizations. In fact, there can never be a complete list.
NO MAGIC BULLET
Computability theory shows that it will always be possible to invent new optimizing transformations.
Let us say that a fully optimizing compiler is one that transforms each program P to a program Opt(P) that is the smallest program with the same input/output behavior as P. We could also imagine optimizing for speed instead of program size, but let us choose size to simplify the discussion.
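To make the coming contradiction concrete, here is a hedged Python sketch of how such a compiler, if it existed, would decide the halting problem. The function `opt` is a hypothetical parameter standing for the fully optimizing compiler; it cannot actually be implemented, which is exactly the point of the argument that follows.

```python
# Hypothetical reduction: IF a fully optimizing compiler opt() existed,
# the function below would decide whether a program halts on some input.
# The names opt, ONE_LINE_LOOP, and halts_on_some_input are inventions
# of this sketch, not part of any real API.

ONE_LINE_LOOP = "L1: goto L1"  # smallest program that silently loops forever

def halts_on_some_input(program_text, opt):
    """Return True iff program_text halts on some input, given opt()."""
    # A program that never halts (and produces no output) must optimize
    # to the one-line infinite loop; any other result means it halts.
    return opt(program_text) != ONE_LINE_LOOP
```

Since no algorithm can decide halting, no `opt` with the assumed behavior can be supplied.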
For any program Q that produces no output and never halts, Opt(Q) is short and easily recognizable:

L1: goto L1

Therefore, if we had a fully optimizing compiler, we could use it to solve the halting problem; to see whether there exists an input on which P halts, just see if Opt(P) is the one-line infinite loop. But we know that no computable algorithm can always tell whether programs halt, so a fully optimizing compiler cannot be written either.

INTERMEDIATE REPRESENTATION FOR FLOW ANALYSIS