13.5 Proof Length and Complexity
Quantitative study of proofs, including proof size, efficiency, and connections to computational complexity.
31 notes
Understanding how the cost of an algorithm grows with input size.
Overview of algorithmic thinking, computational methods, complexity, approximation, and verification in mathematics.
Prove that any comparison-based sorting algorithm requires Ω(n log n) comparisons in the worst case using a decision tree argument.
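The decision-tree argument this note asks for can be compressed into one chain of inequalities: a comparison sort must distinguish all n! input orderings, so its decision tree needs at least n! leaves, and a binary tree of height h has at most 2^h leaves.

```latex
2^{h} \ge n!
\quad\Longrightarrow\quad
h \;\ge\; \log_2 n! \;=\; \sum_{k=1}^{n} \log_2 k
\;\ge\; \frac{n}{2}\log_2\frac{n}{2} \;=\; \Omega(n \log n)
```

The last inequality keeps only the top half of the terms in the sum, each of which is at least log2(n/2).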
Linked list algorithms are dominated by pointer traversal and constant-time link updates.
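As a minimal sketch of those two cost patterns (names `Node`, `insert_after`, and `traverse` are illustrative, not from the note): splicing after a known node is two link updates in O(1), while reaching any position requires an O(n) pointer walk.

```python
from dataclasses import dataclass
from typing import Optional, List


@dataclass
class Node:
    value: int
    next: Optional["Node"] = None


def insert_after(node: Node, value: int) -> Node:
    """Splice a new node after `node`: two link updates, O(1)."""
    new = Node(value, node.next)
    node.next = new
    return new


def traverse(head: Optional[Node]) -> List[int]:
    """Walk the chain pointer by pointer: O(n)."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out
```
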
Even when the algorithmic idea is correct, implementations fail in predictable ways.
Benchmarking measures how an implementation behaves on real inputs and real hardware.
Stability and determinism describe how predictably an algorithm behaves when there are ties, repeated values, or multiple valid answers.
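A small demonstration of stability using Python's `sorted`, whose stability is guaranteed by the language: records that tie on the sort key keep their original relative order (the sample records are invented for illustration).

```python
# Records: (name, score). We sort by score only; Python's sort is stable,
# so records with equal scores keep their original relative order.
records = [("ana", 2), ("bo", 1), ("cy", 2), ("di", 1)]
by_score = sorted(records, key=lambda r: r[1])
# Among the ties, "bo" still precedes "di" and "ana" still precedes "cy".
```
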
Algorithms are usually described with mathematical integers and real numbers.
Data representation is the choice of concrete form used to store the objects in a problem.
Implementation discipline means translating an algorithm into code without changing its meaning accidentally.
Pseudocode is a bridge between the problem statement and an implementation.
A reduction transforms one problem into another problem.
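One hedged example of the idea (the problem choice and the name `has_duplicate` are mine, not the note's): duplicate detection reduces to sorting, because after sorting any equal elements are adjacent.

```python
def has_duplicate(xs):
    """Reduce duplicate-detection to sorting: sort once, then a single
    linear scan over adjacent pairs answers the original question."""
    ys = sorted(xs)
    return any(a == b for a, b in zip(ys, ys[1:]))
```
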
Amortized analysis studies the average cost of operations over a sequence, even when individual operations are sometimes expensive.
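The classic example is a doubling dynamic array. As a sketch (the counter function `append_cost` is illustrative), tallying every element copy over n appends shows the total stays below 3n, so each append costs O(1) amortized even though an individual resize copies the whole array.

```python
def append_cost(n: int) -> int:
    """Total element copies performed by n appends to a doubling array.
    Each append writes one element; a full array is reallocated at twice
    the capacity, copying every existing element over."""
    copies, size, cap = 0, 0, 1
    for _ in range(n):
        if size == cap:      # full: grow and copy the existing elements
            copies += size
            cap *= 2
        size += 1
        copies += 1          # write the newly appended element
    return copies
```

The resizes copy 1 + 2 + 4 + ... < n elements in total, and the writes add n more, so the grand total is under 3n.
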
Randomized algorithms use random choices during execution.
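A standard illustration is randomized quickselect, sketched below: random pivot choices make the expected running time O(n) for any input order, without any assumption about how the input is distributed.

```python
import random


def quickselect(xs, k):
    """Return the k-th smallest element (0-indexed) of xs.
    Random pivots give expected O(n) time on every input."""
    xs = list(xs)
    while True:
        pivot = random.choice(xs)
        lo = [x for x in xs if x < pivot]
        eq = [x for x in xs if x == pivot]
        if k < len(lo):
            xs = lo                              # answer is left of pivot
        elif k < len(lo) + len(eq):
            return pivot                         # pivot is the answer
        else:
            k -= len(lo) + len(eq)               # answer is right of pivot
            xs = [x for x in xs if x > pivot]
```
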
Dynamic programming solves problems by storing answers to subproblems and reusing them.
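As one concrete sketch (the coin-change problem and the name `min_coins` are my choice of example): each table entry is the answer to one subproblem, computed once and reused by every larger subproblem that needs it.

```python
def min_coins(coins, amount):
    """Fewest coins summing to `amount`, or -1 if impossible.
    table[a] stores the answer for the subproblem `a`."""
    INF = float("inf")
    table = [0] + [INF] * amount
    for a in range(1, amount + 1):
        table[a] = min((table[a - c] + 1 for c in coins if c <= a),
                       default=INF)
    return table[amount] if table[amount] < INF else -1
```
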
Divide and conquer solves a problem by splitting it into smaller subproblems, solving those subproblems, and combining their answers.
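Merge sort is the textbook instance of that pattern; a compact sketch:

```python
def merge_sort(xs):
    """Split in half, sort each half recursively, merge: O(n log n)."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):     # merge the sorted halves
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]           # append the leftover tail
```
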
A greedy algorithm builds a solution by making one locally best choice at a time.
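Interval scheduling is a case where the greedy choice is provably optimal; a sketch (the function name and interval format are illustrative):

```python
def max_nonoverlapping(intervals):
    """Select a maximum set of pairwise non-overlapping intervals by
    repeatedly taking the one that finishes earliest: each local choice
    leaves the most room for the remaining intervals."""
    chosen, last_end = [], float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_end:
            chosen.append((start, end))
            last_end = end
    return chosen
```
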
A brute force baseline is the simplest correct algorithm you can write from the problem statement.
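For instance (the two-sum problem is my choice of illustration), trying every pair is slow at O(n^2) but transparently correct, which makes it a useful test oracle for a faster hash-based version.

```python
from itertools import combinations


def two_sum_brute(xs, target):
    """Return indices (i, j) with xs[i] + xs[j] == target, else None.
    Tries every pair straight from the problem statement: O(n^2)."""
    for i, j in combinations(range(len(xs)), 2):
        if xs[i] + xs[j] == target:
            return (i, j)
    return None
```
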
Testing does not prove an algorithm correct, but it exposes mistakes in specifications, invariants, edge cases, and implementation details.
Edge cases are valid inputs that sit at the boundary of the specification.
A lower bound states that every algorithm for a problem must perform at least a certain amount of work in some model of computation.
Big O notation provides a formal way to describe how a function grows.
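The formal definition can be stated in one line: f(n) = O(g(n)) means f is eventually bounded above by a constant multiple of g. For example, 3n^2 + 10n = O(n^2) with witnesses c = 4 and n_0 = 10, since 10n <= n^2 once n >= 10.

```latex
f(n) = O(g(n)) \iff
\exists\, c > 0,\ n_0 \in \mathbb{N}:\quad
0 \le f(n) \le c \cdot g(n) \ \text{ for all } n \ge n_0
```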
Space complexity measures how much memory an algorithm uses as a function of input size.
Time complexity describes how the running time of an algorithm grows as the input size grows.
Recursive algorithms replace loop structure with self-reference.
Loop invariants are the primary tool for reasoning about iterative algorithms.
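Insertion sort is the standard worked example of that technique; a sketch with the invariant written where it holds:

```python
def insertion_sort(xs):
    """Invariant: at the top of iteration i, xs[:i] contains the original
    first i elements in sorted order. It holds initially (i = 1, a one-
    element prefix is sorted), each iteration preserves it, and at
    termination (i = len(xs)) it says the whole list is sorted."""
    xs = list(xs)
    for i in range(1, len(xs)):
        key, j = xs[i], i - 1
        while j >= 0 and xs[j] > key:
            xs[j + 1] = xs[j]      # shift larger elements one slot right
            j -= 1
        xs[j + 1] = key            # invariant restored for xs[:i+1]
    return xs
```
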
A correctness argument explains why an algorithm returns an acceptable output for every valid input.
An algorithm does not operate on an abstract idea of data; it operates on a concrete representation of it.
An algorithm begins with a precise statement of the problem.
This chapter defines how you think about algorithms before you write any code.