# 1.7 Space Complexity

Space complexity measures how much memory an algorithm uses as a function of input size. It complements time complexity and often determines whether an algorithm is feasible in practice. Memory includes both the input representation and any additional storage used during execution.

The focus in algorithm design is usually on **auxiliary space**, which is the extra memory beyond the input itself.

## Problem

You have an algorithm and need to determine how much memory it requires, especially whether it fits within constraints or can be improved.

## Solution

Analyze memory usage using this checklist:

```text
1. Identify input storage.
2. Identify auxiliary variables.
3. Identify data structures allocated during execution.
4. Account for recursion stack if present.
5. Express total as a function of input size.
```

Then simplify to the dominant term.

## Example: Constant Space

```text
max_value(A):
  best = A[0]
  for i = 1 to n - 1:
    if A[i] > best:
      best = A[i]
  return best
```

The algorithm uses:

* Input array of size n
* A constant number of variables (`best`, `i`)

Auxiliary space:

```text
O(1)
```
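The pseudocode above can be sketched in runnable Python (a sketch; the function name and list input are assumptions, not part of the original):

```python
def max_value(a):
    """Return the largest element of a non-empty list.

    Only two scalar variables (best and the loop index) are used,
    regardless of len(a), so auxiliary space is O(1).
    """
    best = a[0]
    for i in range(1, len(a)):  # index loop avoids slicing, which would copy
        if a[i] > best:
            best = a[i]
    return best
```

Note the index loop: slicing (`a[1:]`) would silently allocate a copy of the input and break the O(1) bound.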

## Example: Linear Space

```text
copy_array(A):
  B = new array of size n
  for i = 0 to n - 1:
    B[i] = A[i]
  return B
```

This allocates a new array of size n.

Auxiliary space:

```text
O(n)
```
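In Python this might look as follows (a minimal sketch; in practice `list(a)` does the same thing, and also allocates O(n) space):

```python
def copy_array(a):
    """Return a new list with the same elements.

    The fresh list b is proportional to the input,
    so auxiliary space is O(n).
    """
    b = [None] * len(a)  # allocation of size n
    for i in range(len(a)):
        b[i] = a[i]
    return b
```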

## Example: Recursion Stack

```text
factorial(n):
  if n == 0:
    return 1
  else:
    return n * factorial(n - 1)
```

Each recursive call adds a frame to the call stack. The recursion depth is n, so:

Auxiliary space:

```text
O(n)
```

Even though no explicit data structure is created, the recursion stack consumes memory.
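When the O(n) stack is a problem, the same computation can often be rewritten as a loop. A sketch in Python (the function name is an assumption):

```python
def factorial_iterative(n):
    """Compute n! with a loop instead of recursion.

    No call stack grows with n, so auxiliary space drops
    from O(n) in the recursive version to O(1).
    """
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result
```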

## Example: Divide and Conquer

Merge sort uses additional space during merging:

```text
merge_sort(A):
  split A into two halves
  sort each half
  merge into a new array
```

The merge step requires temporary storage proportional to the input size.

Auxiliary space:

```text
O(n)
```
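The merge step can be sketched in Python (a sketch of the merge helper only, not a full merge sort; the name is an assumption):

```python
def merge(left, right):
    """Merge two sorted lists into a new sorted list.

    The output holds len(left) + len(right) elements, which is
    exactly the O(n) temporary storage merge sort needs.
    """
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # at most one of these two
    merged.extend(right[j:])  # extends is non-empty
    return merged
```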

## In-Place Algorithms

An algorithm is called **in-place** if it uses constant auxiliary space (some definitions also permit O(log n), for example to allow a recursion stack).

Examples:

* Swapping elements within an array
* Partitioning for quicksort
* Reversing an array

These algorithms modify the input directly and avoid extra memory allocation.
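Array reversal is the simplest of these. A sketch in Python (the function name is an assumption):

```python
def reverse_in_place(a):
    """Reverse a list by swapping from both ends inward.

    Only two indices are used, so auxiliary space is O(1);
    the input list itself is modified.
    """
    lo, hi = 0, len(a) - 1
    while lo < hi:
        a[lo], a[hi] = a[hi], a[lo]  # swap without an extra array
        lo += 1
        hi -= 1
    return a
```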

## Space vs Time Trade-offs

Many algorithms trade space for time.

Example: prefix sums

```text
build_prefix(A):
  P[0] = A[0]
  for i = 1 to n - 1:
    P[i] = P[i - 1] + A[i]
```

This uses O(n) extra space, but allows range sum queries in constant time.

Without extra space, each query would take linear time.
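A runnable sketch in Python (the `range_sum` helper is an assumption, added to show the constant-time query the text describes):

```python
def build_prefix(a):
    """Build prefix sums: p[i] = a[0] + ... + a[i].

    Costs O(n) auxiliary space once, up front.
    """
    p = [0] * len(a)
    running = 0
    for i, x in enumerate(a):
        running += x
        p[i] = running
    return p

def range_sum(p, lo, hi):
    """Sum of a[lo..hi] inclusive, in O(1) time per query."""
    return p[hi] - (p[lo - 1] if lo > 0 else 0)
```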

## Memory Models

Different models affect how space is measured:

* **RAM model**: each variable and array cell counts as one unit
* **Recursive model**: stack frames must be counted
* **Streaming model**: input may not be stored entirely in memory
* **External memory model**: disk access dominates

Always match the model to the problem constraints.

## Common Pitfalls

Ignoring recursion stack space leads to underestimating memory usage.

Counting input space as auxiliary space leads to misleading results. Always separate the two.

Hidden allocations in libraries or data structures may increase memory usage beyond expectations.

Using large intermediate structures when only partial data is needed leads to unnecessary overhead.
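Hidden allocations are easy to demonstrate in Python, where two expressions that look equally cheap differ in memory cost (slicing semantics are standard Python behavior):

```python
# Both lines below reverse a list, but a[::-1] allocates a full copy
# (a hidden O(n) allocation), while list.reverse() works in place, O(1).
a = list(range(5))
b = a[::-1]   # new list of size n
a.reverse()   # no new list; the original is mutated
```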

## Summary Table

| Pattern                 | Space Complexity |
| ----------------------- | ---------------- |
| Single pass, few vars   | O(1)             |
| Copy or auxiliary array | O(n)             |
| Recursion depth n       | O(n)             |
| Balanced recursion      | O(log n)         |
| Matrix or grid          | O(n^2)           |
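The balanced-recursion row covers algorithms whose call depth halves the problem each time, such as recursive binary search. A sketch in Python (the function name and default arguments are assumptions):

```python
def binary_search(a, target, lo=0, hi=None):
    """Recursive binary search over a sorted list.

    Each call halves the remaining range, so the recursion depth,
    and hence the auxiliary stack space, is O(log n).
    """
    if hi is None:
        hi = len(a) - 1
    if lo > hi:
        return -1  # not found
    mid = (lo + hi) // 2
    if a[mid] == target:
        return mid
    if a[mid] < target:
        return binary_search(a, target, mid + 1, hi)
    return binary_search(a, target, lo, mid - 1)
```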

## Takeaway

Space complexity tracks how memory usage grows with input size. Focus on auxiliary space, include recursion stacks, and consider trade-offs between time and memory. Efficient algorithms often balance both dimensions rather than optimizing only one.

