Slow Sort

A deliberately inefficient recursive sorting algorithm based on divide and conquer followed by maximum placement.

Slow sort is a deliberately inefficient recursive sorting algorithm. It is based on the multiply and surrender paradigm, a parody of divide and conquer. The algorithm recursively sorts both halves, compares their maximum elements, places the larger one at the end, and then recursively sorts the remaining prefix.

It is mostly useful as a teaching example for bad recursion design.

Problem

Given a sequence A of length n, reorder it such that

A[0] \le A[1] \le \cdots \le A[n-1]

Algorithm

For a subarray A[l..r] with midpoint m = (l + r) / 2:

  1. Sort the left half recursively
  2. Sort the right half recursively
  3. If A[m] > A[r], swap them, so the maximum of the whole range lands at position r
  4. Recursively sort the prefix A[l..r-1]
slow_sort(A, l, r):
    if l >= r:
        return

    m = (l + r) // 2

    slow_sort(A, l, m)
    slow_sort(A, m + 1, r)

    if A[m] > A[r]:
        swap(A[m], A[r])

    slow_sort(A, l, r - 1)

Example

Let

A = [3, 1, 2]

The algorithm first sorts the left half [3, 1], giving [1, 3, 2], then the right half [2], which is already trivial. It compares the maximum of the left half, A[1] = 3, with the last element, A[2] = 2, and swaps them to place the larger value at the end, giving [1, 2, 3]. Finally it recursively sorts the prefix [1, 2], which is already in order.

Final result:

[1, 2, 3]
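The swaps in this run can be traced with a minimal instrumented sketch (the `trace` parameter is purely illustrative and not part of the algorithm):

```python
def slow_sort(a, l=0, r=None, trace=None):
    # Instrumented slow sort: records the array after every swap.
    if r is None:
        r = len(a) - 1
    if l >= r:
        return a
    m = (l + r) // 2
    slow_sort(a, l, m, trace=trace)
    slow_sort(a, m + 1, r, trace=trace)
    if a[m] > a[r]:
        a[m], a[r] = a[r], a[m]
        if trace is not None:
            trace.append(list(a))
    slow_sort(a, l, r - 1, trace=trace)
    return a

steps = []
slow_sort([3, 1, 2], trace=steps)
print(steps)  # [[1, 3, 2], [1, 2, 3]]
```

The two recorded states match the narrative above: one swap while sorting the left half, one swap placing 3 at the end.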

Correctness

The recursive calls sort the left and right halves. After those calls, the largest value in the left half is at position m, and the largest value in the right half is at position r. The comparison between A[m] and A[r] places the larger of those two values at r, so the largest element of A[l..r] is fixed at the end. The final recursive call sorts the remaining prefix A[l..r-1]. By induction on the subarray length, the whole subarray is sorted.
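This argument can be spot-checked by comparing slow sort's output against Python's built-in `sorted` on small random inputs (a self-contained sketch):

```python
import random

def slow_sort(a, l=0, r=None):
    # Plain slow sort, as in the pseudocode above.
    if r is None:
        r = len(a) - 1
    if l >= r:
        return a
    m = (l + r) // 2
    slow_sort(a, l, m)
    slow_sort(a, m + 1, r)
    if a[m] > a[r]:
        a[m], a[r] = a[r], a[m]
    slow_sort(a, l, r - 1)
    return a

# Keep the inputs tiny: slow sort's running time explodes quickly.
for _ in range(100):
    data = [random.randint(0, 50) for _ in range(random.randint(0, 12))]
    assert slow_sort(list(data)) == sorted(data)
print("all checks passed")
```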

Complexity

Slow sort has extremely poor running time. Its recurrence is larger than ordinary divide-and-conquer sorting because after sorting both halves, it recursively sorts almost the entire range again.

A common recurrence form is:

T(n) = 2T(n/2) + T(n-1) + O(1)

This grows super-polynomially and is far worse than O(n^2).
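One way to see the growth concretely is to count the total number of recursive calls the algorithm makes on an n-element array (a sketch; the counting wrapper is illustrative):

```python
def count_calls(n):
    # Total invocations of slow_sort on an n-element range, including
    # base cases. Only the call structure matters, not the data.
    calls = 0

    def rec(l, r):
        nonlocal calls
        calls += 1
        if l >= r:
            return
        m = (l + r) // 2
        rec(l, m)        # sort left half
        rec(m + 1, r)    # sort right half
        rec(l, r - 1)    # re-sort the prefix
    rec(0, n - 1)
    return calls

for n in (4, 8, 16):
    print(n, count_calls(n), n * n)
# Already at n = 16 the call count (1219) dwarfs n^2 = 256.
```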

Space complexity:

O(n)

because the prefix recursion on A[l..r-1] nests to a depth proportional to n.

Properties

  • in-place
  • not stable
  • recursive
  • intentionally inefficient
  • useful as a counterexample

When to Use

Slow sort should not be used in production code. It is useful for teaching recursion, recurrence analysis, and the difference between a correct algorithm and an efficient algorithm.

Implementation

def slow_sort(a, l=0, r=None):
    """Sort a[l..r] in place with slow sort; returns a for convenience."""
    if r is None:
        r = len(a) - 1

    # Base case: ranges of length 0 or 1 are already sorted.
    if l >= r:
        return a

    m = (l + r) // 2

    # Recursively sort both halves.
    slow_sort(a, l, m)
    slow_sort(a, m + 1, r)

    # The maxima of the halves now sit at m and r; move the larger to r.
    if a[m] > a[r]:
        a[m], a[r] = a[r], a[m]

    # Sort everything except the fixed maximum.
    slow_sort(a, l, r - 1)

    return a

// Equivalent Go version. constraints.Ordered comes from
// golang.org/x/exp/constraints (cmp.Ordered works in Go 1.21+).
func SlowSort[T constraints.Ordered](a []T, l, r int) {
    if l >= r {
        return
    }

    m := (l + r) / 2

    // Recursively sort both halves.
    SlowSort(a, l, m)
    SlowSort(a, m+1, r)

    // Move the larger of the two half-maxima to the end.
    if a[m] > a[r] {
        a[m], a[r] = a[r], a[m]
    }

    // Sort the remaining prefix.
    SlowSort(a, l, r-1)
}