Counting & Choice
Counting is deceptively hard. The question “how many ways can you do X?” sounds elementary, but answering it precisely requires a small but powerful toolkit. This post builds that toolkit from first principles, with an eye toward both combinatorics and algorithm analysis.
The Two Fundamental Principles
Multiplication Principle. If a procedure consists of $k$ sequential steps, where step $i$ can be performed in $n_i$ ways regardless of how earlier steps were performed, then the total number of ways to complete the procedure is $n_1 \cdot n_2 \cdots n_k$.
Example. A password of length 3 using lowercase letters: $26 \cdot 26 \cdot 26 = 26^3 = 17{,}576$ possibilities.
Addition Principle. If a procedure can be performed in mutually exclusive ways $A_1, A_2, \ldots, A_k$ (with $|A_i|$ choices for way $i$), then the total number of ways is $|A_1| + |A_2| + \cdots + |A_k|$.
The addition principle is really just set union for disjoint sets: $|A_1 \cup A_2 \cup \cdots \cup A_k| = \sum_{i=1}^k |A_i|$ when $A_i \cap A_j = \emptyset$ for $i \neq j$.
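Both principles can be checked by brute enumeration. Here is a short Python sketch (standard library only) using the password example above; the variable names are illustrative:

```python
# Multiplication principle: count length-3 lowercase passwords two ways.
from itertools import product
from string import ascii_lowercase

by_formula = 26 ** 3  # n1 * n2 * n3 with each n_i = 26
by_enumeration = sum(1 for _ in product(ascii_lowercase, repeat=3))
assert by_formula == by_enumeration == 17_576

# Addition principle: passwords of length exactly 1 OR exactly 2.
# The two cases are disjoint, so the counts simply add.
total = 26 ** 1 + 26 ** 2
assert total == 26 + 676 == 702
```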
Permutations: Ordered Selection Without Replacement
A permutation of $k$ items chosen from $n$ distinct items is an ordered arrangement. The count is:
$$P(n, k) = \frac{n!}{(n-k)!} = n(n-1)(n-2)\cdots(n-k+1)$$
Proof. The first position can be filled in $n$ ways, the second in $n-1$ ways (one item is already used), and in general the $i$-th in $n-i+1$ ways, so the $k$-th position has $n-k+1$ choices. The multiplication principle gives the product.
Special case: $P(n, n) = n!$, the number of ways to arrange all $n$ items.
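The falling-factorial form of $P(n, k)$ translates directly into a loop; the helper name `P` below is illustrative. As a sanity check, it is compared against Python's built-in `math.perm` and against direct enumeration:

```python
from itertools import permutations
from math import factorial, perm

def P(n, k):
    """Falling factorial n(n-1)...(n-k+1), i.e. n! / (n-k)!."""
    result = 1
    for i in range(k):
        result *= n - i
    return result

# Cross-check against math.perm and direct enumeration for n=5, k=3.
assert P(5, 3) == perm(5, 3) == 60
assert P(5, 3) == len(list(permutations("ABCDE", 3)))

# Special case P(n, n) = n!.
assert P(5, 5) == factorial(5)
```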
Combinations: Unordered Selection Without Replacement
A combination of $k$ items from $n$ is an unordered subset. Each subset of size $k$ gives rise to exactly $k!$ ordered permutations (one for each ordering of that subset), so:
$$\binom{n}{k} = \frac{P(n,k)}{k!} = \frac{n!}{k!(n-k)!}$$
Theorem. $\binom{n}{k} = \binom{n}{n-k}$.
Proof. Choosing $k$ items to include is equivalent to choosing $n-k$ items to exclude: $$\frac{n!}{k!(n-k)!} = \frac{n!}{(n-k)!k!}.$$
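Both the definition and the symmetry identity are easy to verify numerically with `math.comb` and `itertools.combinations`; a minimal sketch:

```python
from itertools import combinations
from math import comb

n, k = 10, 3
# Definition: each k-subset yields k! orderings, so C(n,k) = P(n,k)/k!.
assert comb(n, k) == len(list(combinations(range(n), k))) == 120

# Symmetry: choosing k to include == choosing n-k to exclude.
for j in range(n + 1):
    assert comb(n, j) == comb(n, n - j)
```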
The Four Cases of Selection
The two binary choices - ordered vs. unordered, with vs. without replacement - give four cases:
+------------------+----------------------------+----------------------------+
|                  | WITHOUT REPLACEMENT        | WITH REPLACEMENT           |
+------------------+----------------------------+----------------------------+
| ORDERED          | P(n,k) = n!/(n-k)!         | n^k                        |
| (permutation)    | arrange 3 from {A,B,C,D}   | 3-char string over {A..Z}  |
+------------------+----------------------------+----------------------------+
| UNORDERED        | C(n,k) = n!/k!(n-k)!       | C(n+k-1, k)                |
| (combination)    | choose 3 from {A,B,C,D}    | buy 3 items from 4 types   |
+------------------+----------------------------+----------------------------+
The bottom-right case (unordered, with replacement) is the most subtle. It counts multisets.
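Conveniently, Python's `itertools` provides one iterator per cell of this table, so all four formulas can be verified at once. A sketch for $n = 4$, $k = 3$:

```python
from itertools import (combinations, combinations_with_replacement,
                       permutations, product)
from math import comb, perm

items, k = "ABCD", 3
n = len(items)

# Ordered, without replacement: P(n, k).
assert len(list(permutations(items, k))) == perm(n, k) == 24
# Ordered, with replacement: n^k.
assert len(list(product(items, repeat=k))) == n ** k == 64
# Unordered, without replacement: C(n, k).
assert len(list(combinations(items, k))) == comb(n, k) == 4
# Unordered, with replacement: C(n+k-1, k) -- the multiset case.
assert len(list(combinations_with_replacement(items, k))) == comb(n + k - 1, k) == 20
```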
Stars and Bars: The Multiset Coefficient
Theorem. The number of ways to choose $k$ items from $n$ types with repetition allowed (order does not matter) is:
$$\binom{n+k-1}{k}$$
Proof. Model a selection as a string of $k$ stars (items) and $n-1$ bars (dividers between types). The stars and bars together form a string of length $n+k-1$, and we choose which $k$ positions are stars:
Types are separated by bars: [ A | B | C | D ]
Example with n = 4 types and k = 3 items (3 stars, 3 bars):
    A,A,C  ->  **||*|
    B,C,D  ->  |*|*|*
Formally, a sequence $(x_1, x_2, \ldots, x_n)$ with each $x_i \geq 0$ and $\sum x_i = k$ (where $x_i$ is the count of type $i$) bijects with placing $n-1$ bars among $k$ stars, giving $\binom{n-1+k}{k}$ arrangements. $\square$
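The bijection can be checked by brute force: count the nonnegative integer solutions of $x_1 + \cdots + x_n = k$ directly and compare with the formula. The helper name `count_solutions` is illustrative:

```python
from itertools import product
from math import comb

def count_solutions(n, k):
    """Count tuples (x_1, ..., x_n) with x_i >= 0 and sum x_i = k
    by exhaustive enumeration (feasible only for small n and k)."""
    return sum(1 for xs in product(range(k + 1), repeat=n) if sum(xs) == k)

# Brute force agrees with the stars-and-bars coefficient C(n+k-1, k).
for n in range(1, 5):
    for k in range(6):
        assert count_solutions(n, k) == comb(n + k - 1, k)
```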
The Pigeonhole Principle
Theorem (Pigeonhole). If $n$ items are placed into $m$ containers with $n > m$, then at least one container holds at least $\lceil n/m \rceil$ items.
Proof. Suppose every container held at most $\lceil n/m \rceil - 1$ items. Since $\lceil n/m \rceil - 1 < n/m$, the total would be at most $m \left( \lceil n/m \rceil - 1 \right) < m \cdot \frac{n}{m} = n$, contradicting that all $n$ items are placed.
The pigeonhole principle is surprisingly powerful. Among any 13 people, two share a birth month. Among any $n+1$ integers from $\{1, \ldots, 2n\}$, two must be consecutive.
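Both consequences are easy to check mechanically. Here is an illustrative sketch (the helper name `max_load` is mine) that computes the guaranteed load and brute-forces the consecutive-integers claim for small $n$:

```python
from itertools import combinations
from math import ceil

def max_load(n, m):
    """Guaranteed minimum size of the fullest container: ceil(n / m)."""
    return ceil(n / m)

# 13 people, 12 months: some month is shared by at least 2 people.
assert max_load(13, 12) == 2

# Brute force: every (n+1)-subset of {1, ..., 2n} contains two
# consecutive integers (combinations yields sorted tuples).
for n in range(1, 6):
    for subset in combinations(range(1, 2 * n + 1), n + 1):
        assert any(b - a == 1 for a, b in zip(subset, subset[1:]))
```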
Inclusion-Exclusion: A Preview
When events are not disjoint, the addition principle overcounts. The fix is inclusion-exclusion:
$$|A \cup B| = |A| + |B| - |A \cap B|$$
More generally for three sets:
$$|A \cup B \cup C| = |A| + |B| + |C| - |A \cap B| - |A \cap C| - |B \cap C| + |A \cap B \cap C|$$
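The three-set identity can be verified on concrete sets; a small sketch using the multiples of 2, 3, and 5 up to 30 (an arbitrary choice for illustration):

```python
# Inclusion-exclusion check: multiples of 2, 3, 5 in {1, ..., 30}.
A = {x for x in range(1, 31) if x % 2 == 0}
B = {x for x in range(1, 31) if x % 3 == 0}
C = {x for x in range(1, 31) if x % 5 == 0}

lhs = len(A | B | C)
rhs = (len(A) + len(B) + len(C)
       - len(A & B) - len(A & C) - len(B & C)
       + len(A & B & C))
assert lhs == rhs == 22
```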
The general formula will be developed fully in the Inclusion-Exclusion post.
CS Perspective: Counting Complexity and the Birthday Paradox
Algorithm complexity. Counting is the foundation of complexity analysis. An algorithm that tries all $k$-subsets of an $n$-element set examines $\binom{n}{k}$ candidates - exponential for $k = \Theta(n)$, polynomial for constant $k$.
Hash collisions and the birthday paradox. Suppose we insert $m$ keys into a hash table of size $n$. The probability of no collision after $m$ insertions is:
$$P(\text{no collision}) = 1 \cdot \frac{n-1}{n} \cdot \frac{n-2}{n} \cdots \frac{n-m+1}{n} = \frac{n!}{n^m(n-m)!}$$
Using the approximation $\prod_{i=0}^{m-1}(1 - i/n) \approx e^{-m(m-1)/(2n)}$, the probability of collision exceeds $1/2$ when $m \approx \sqrt{2n \ln 2} \approx 1.177\sqrt{n}$.
With a 32-bit hash ($n = 2^{32} \approx 4 \times 10^9$), collisions become likely after only about $1.177\sqrt{2^{32}} \approx 77{,}000$ insertions. This is the birthday paradox - collisions arrive far sooner than intuition suggests, with consequences for hash function security (collision attacks in cryptographic contexts) and data structure design.
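The no-collision probability is cheap to compute exactly, so the threshold can be checked numerically. A sketch (the helper name `p_no_collision` is illustrative):

```python
from math import sqrt

def p_no_collision(m, n):
    """Exact probability that m uniformly random keys land in
    distinct slots of a size-n hash table."""
    p = 1.0
    for i in range(m):
        p *= (n - i) / n
    return p

n = 2 ** 32
threshold = 1.177 * sqrt(n)  # ~ 77,000 for a 32-bit hash

# Collision is unlikely well below the threshold, likely well above it.
assert p_no_collision(50_000, n) > 0.5
assert p_no_collision(100_000, n) < 0.5

# Classic check: 23 people, 365 birthdays -> collision odds just over 1/2.
assert p_no_collision(23, 365) < 0.5
```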
Combinatorial explosion. The number of possible inputs of length $n$ over a binary alphabet is $2^n$. Exhaustive search is feasible only for very small $n$; combinatorics quantifies exactly how quickly the search space grows, motivating smarter algorithms.
The next post develops permutations and combinations further: circular arrangements, derangements, Pascal’s identity, and the binomial theorem.