Generator object internals, YIELD_VALUE and RESUME opcodes, and frame suspension and resumption mechanics.
Generators are resumable functions. A normal function starts, runs, and finishes with one return value. A generator can start, produce a value, suspend its frame, later resume from the same instruction position, produce another value, and repeat until it finishes.
A generator function is any function whose body contains a yield expression.
def numbers():
    yield 1
    yield 2
    yield 3

Calling this function does not run the body immediately.

g = numbers()

The call creates a generator object. The body starts when the generator is resumed:
print(next(g))
print(next(g))
print(next(g))

Output:
1
2
3

After the last value, the next resume raises StopIteration.
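The exhaustion behavior can be observed directly by catching the exception:

```python
def numbers():
    yield 1
    yield 2
    yield 3

g = numbers()
for _ in range(3):
    next(g)          # consumes 1, 2, 3

try:
    next(g)          # the body has finished; nothing left to yield
except StopIteration:
    print("exhausted")
```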
35.1 Generator Function vs Generator Object
A generator function is the callable defined with def.
A generator object is the resumable iterator returned when the generator function is called.
def gen():
    yield 1

print(gen)
print(gen())

Conceptually:
gen
    function object
gen()
    generator object
        suspended execution state
        code object
        frame or frame-like state

This is different from an ordinary function:
def f():
    return 1

f()

Calling f runs the body immediately and returns 1.
Calling gen returns an object that can later run the body.
35.2 yield
The yield expression produces a value to the generator’s caller and suspends execution.
def gen():
    x = 10
    yield x
    x = 20
    yield x

Execution sequence:
create generator object
resume generator
x = 10
yield 10
suspend
resume generator
x = 20
yield 20
suspend
resume generator
finish function
raise StopIteration

The local variable x survives across suspension because the generator keeps its execution state.
35.3 Generators Are Iterators
A generator object implements the iterator protocol.
g = numbers()
print(iter(g) is g)
print(next(g))

A generator has:
__iter__
__next__
send
throw
close

The for loop works because generators are iterators:
for x in numbers():
    print(x)

Conceptually:
it = iter(numbers())
while True:
    try:
        x = next(it)
    except StopIteration:
        break
    print(x)

35.4 Generator Frames
A generator must preserve execution state between resumes.
That state includes:
code object
instruction position
local variables
value stack
exception state
closure cells
running state
finished state

Example:
def gen():
    a = 1
    b = 2
    yield a
    yield b

After the first yield, the generator must remember:
a = 1
b = 2
next instruction is after first yield

This is why generators are tightly connected to frames.
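The preserved state is observable through the generator's frame object. Note that gi_frame and f_locals are CPython introspection details, not portable API:

```python
def gen():
    a = 1
    b = 2
    yield a
    yield b

g = gen()
next(g)  # run the body up to the first yield

# The suspended frame still holds the locals and the resume position.
print(dict(g.gi_frame.f_locals))   # {'a': 1, 'b': 2}
print(g.gi_frame.f_lasti)          # bytecode offset of the suspended instruction
```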
35.5 Normal Function Frame vs Generator Frame
A normal function frame usually disappears after return.
def f():
    x = 1
    return x

After f() returns, the frame can be cleared.
A generator frame persists while suspended.
def gen():
    x = 1
    yield x

After next(gen()) reaches yield, the frame cannot be cleared because it may resume later.
| Feature | Normal function | Generator |
|---|---|---|
| Call runs body immediately | Yes | No |
| Can suspend | No | Yes |
| Keeps locals after yielding | No | Yes |
| Returns one final result | Yes | Final result becomes StopIteration.value |
| Implements iterator protocol | No | Yes |
| Frame lifetime | Usually call duration | Until completion or close |
35.6 next()
next(g) resumes a generator.
def gen():
    yield "a"
    yield "b"

g = gen()
print(next(g))
print(next(g))

Execution:
next(g)
resume at start
yield "a"
return "a" to caller
next(g)
resume after first yield
yield "b"
return "b" to caller

The generator object records the instruction position between calls.
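The recorded position can be seen changing between calls through the frame's f_lasti attribute (a CPython implementation detail):

```python
def gen():
    yield "a"
    yield "b"

g = gen()
next(g)
pos1 = g.gi_frame.f_lasti   # offset of the first suspended yield
next(g)
pos2 = g.gi_frame.f_lasti   # offset of the second suspended yield
print(pos1 != pos2)         # True: the saved position advanced
```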
35.7 Completion and StopIteration
When a generator finishes, it raises StopIteration.
def gen():
    yield 1

g = gen()
print(next(g))
print(next(g))

The second next(g) raises StopIteration.
A generator can finish by:
falling off the end
executing return
raising an exception
being closed

Falling off the end is equivalent to returning None.
def gen():
    yield 1

After yield 1, the function reaches the end. The next resume raises StopIteration.
35.8 return in a Generator
A generator can use return value.
def gen():
    yield 1
    return 99

The return value becomes the value attribute of StopIteration.
g = gen()
print(next(g))
try:
    next(g)
except StopIteration as exc:
    print(exc.value)

Output:
1
99

A for loop ignores the final StopIteration.value.
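This is easy to demonstrate: iterating the same generator in a for loop shows only the yielded values.

```python
def gen():
    yield 1
    return 99

# The loop sees only the yielded values; the return value 99
# is carried by StopIteration and silently discarded.
for value in gen():
    print(value)   # prints 1 only
```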
35.9 yield Is an Expression
yield can receive a value through send.
def gen():
    x = yield "ready"
    yield x

Use:
g = gen()
print(next(g))
print(g.send(42))

Execution:
next(g)
runs until yield "ready"
returns "ready"
g.send(42)
resumes generator
yield expression evaluates to 42
x = 42
yield x

So yield both sends a value out and can receive a value back in.
35.10 Starting a Generator With send
A newly created generator has not reached its first yield.
Therefore, the first resume must use next(g) or g.send(None).
g = gen()
g.send(None)

Sending a non-None value to a just-started generator is an error because there is no suspended yield expression to receive it.
g = gen()
g.send(42)

This raises TypeError.
35.11 throw
throw resumes a generator by raising an exception at the suspended yield.
def gen():
    try:
        yield "ready"
    except ValueError:
        yield "handled"

g = gen()
print(next(g))
print(g.throw(ValueError))

Execution:
next(g)
yield "ready"
g.throw(ValueError)
resume at yield by raising ValueError
except ValueError catches it
yield "handled"

throw lets the caller inject an exception into the generator.
35.12 close
close asks a generator to terminate.
def gen():
    try:
        yield 1
    finally:
        print("cleanup")

g = gen()
next(g)
g.close()

Closing injects GeneratorExit into the generator. The finally block runs.
A generator should not yield a normal value while closing. If it does, CPython raises RuntimeError.
def bad():
    try:
        yield 1
    finally:
        yield 2

Calling close() after the first yield raises RuntimeError because the generator yielded during close.
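A minimal demonstration of the close-time error:

```python
def bad():
    try:
        yield 1
    finally:
        yield 2   # yielding while being closed is an error

g = bad()
next(g)            # suspend at the first yield
try:
    g.close()
except RuntimeError as exc:
    # CPython reports that the generator ignored GeneratorExit
    print(type(exc).__name__)
```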
35.13 Generator State
A generator can be in several states:
created
running
suspended
closed

Using inspect:
import inspect

def gen():
    yield 1

g = gen()
print(inspect.getgeneratorstate(g))
next(g)
print(inspect.getgeneratorstate(g))
try:
    next(g)
except StopIteration:
    pass
print(inspect.getgeneratorstate(g))

Typical states:
GEN_CREATED
GEN_SUSPENDED
GEN_CLOSED

A generator cannot be resumed while already running.
35.14 Reentrancy Protection
Generators cannot be reentered.
def gen():
    yield next(g)

g = gen()
next(g)

This attempts to resume g while g is already running. CPython raises ValueError("generator already executing").
The generator has a running flag to prevent corrupting its frame state.
Conceptually:
if generator is already executing:
    raise ValueError or RuntimeError depending on context

This protects the suspended frame and stack.
35.15 Generator Bytecode
A generator function compiles to a code object marked as a generator.
def gen():
    yield 1

Calling the function creates a generator object instead of executing the frame to completion.
A conceptual instruction sequence:
LOAD_CONST 1
YIELD_VALUE
RESUME
LOAD_CONST None
RETURN_VALUE

The exact bytecode varies by Python version.
The key instruction is YIELD_VALUE, which sends a value to the caller and suspends execution.
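This can be checked against the running interpreter with dis. The surrounding opcodes differ across versions, but YIELD_VALUE has been present throughout the 3.x series:

```python
import dis

def gen():
    yield 1

# Collect the opcode names actually emitted for this generator function.
ops = {instruction.opname for instruction in dis.get_instructions(gen)}
print("YIELD_VALUE" in ops)   # True
```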
35.16 yield from
yield from delegates to another iterator or generator.
def outer():
    yield from inner()

It is roughly equivalent to:
for value in inner():
    yield value

But it also forwards:
send
throw
close
StopIteration.value

This makes yield from more powerful than a simple loop.
35.17 Delegation With yield from
Example:
def inner():
    yield 1
    yield 2
    return 99

def outer():
    result = yield from inner()
    yield result

print(list(outer()))

Output:
[1, 2, 99]

The return value of inner becomes the result of the yield from expression in outer.
Conceptually:
outer delegates to inner
inner yields 1
outer yields 1 to caller
inner yields 2
outer yields 2 to caller
inner returns 99 via StopIteration.value
yield from expression evaluates to 99
outer yields 99

35.18 yield from and send
yield from forwards values sent by the caller.
def inner():
    x = yield "inner ready"
    yield x

def outer():
    yield from inner()

g = outer()
print(next(g))
print(g.send(42))

Output:
inner ready
42

The send(42) reaches the suspended yield inside inner.
35.19 yield from and Exceptions
yield from forwards exceptions too.
def inner():
    try:
        yield "ready"
    except ValueError:
        yield "handled"

def outer():
    yield from inner()

g = outer()
print(next(g))
print(g.throw(ValueError))

The exception is thrown into inner at its suspended yield; it reaches outer only if inner does not handle it or delegation has already ended.
35.20 Generator Expressions
A generator expression creates a generator object without a named function.

squares = (x * x for x in range(10))

It is lazy. Values are computed as requested.
print(next(squares))
print(next(squares))

A generator expression has its own implicit function-like scope.
x = 100
g = (x for x in range(3))
print(x)

The outer x remains 100.
35.21 List Comprehension vs Generator Expression
A list comprehension builds the whole list immediately.
xs = [x * x for x in range(10)]

A generator expression produces values lazily.
g = (x * x for x in range(10))

| Feature | List comprehension | Generator expression |
|---|---|---|
| Evaluation | Eager | Lazy |
| Result | List | Generator-like iterator |
| Memory | Stores all results | Stores execution state |
| Reusable | Yes, list can be iterated many times | No, generator is consumed |
| Scope | Own comprehension scope | Own generator scope |
35.22 One-Shot Iteration
Generators are one-shot iterators.
g = (x for x in range(3))
print(list(g))
print(list(g))

Output:
[0, 1, 2]
[]

Once exhausted, a generator stays exhausted.
This is different from a container such as a list:
xs = [0, 1, 2]
print(list(xs))
print(list(xs))

A list creates a new iterator each time. A generator is its own iterator.
35.23 Lazy Evaluation
Generators compute values on demand.
def read_lines(path):
    with open(path) as f:
        for line in f:
            yield line.rstrip("\n")

This does not read the whole file into memory. It reads and yields one line at a time.
Lazy execution is useful for:
large files
streams
pipelines
infinite sequences
expensive computations
early stopping

Example:
def count():
    n = 0
    while True:
        yield n
        n += 1

This generator represents an infinite sequence.
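An infinite generator is safe to use as long as the consumer bounds the iteration, for example with itertools.islice:

```python
from itertools import islice

def count():
    n = 0
    while True:
        yield n
        n += 1

# islice pulls exactly five values; count() never runs ahead of demand.
print(list(islice(count(), 5)))   # [0, 1, 2, 3, 4]
```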
35.24 Pipeline Style
Generators compose naturally.
def numbers(path):
    with open(path) as f:
        for line in f:
            yield int(line)

def positive(xs):
    for x in xs:
        if x > 0:
            yield x

def squared(xs):
    for x in xs:
        yield x * x

Use:
pipeline = squared(positive(numbers("data.txt")))
for x in pipeline:
    print(x)

Each stage pulls from the previous one. No full intermediate list is required.
35.25 Generator Cleanup
Generators that manage resources should use try/finally or context managers.
def lines(path):
    f = open(path)
    try:
        for line in f:
            yield line
    finally:
        f.close()

If the generator is closed before exhaustion, the finally block runs.
A better form:
def lines(path):
    with open(path) as f:
        for line in f:
            yield line

The with statement is compiled into cleanup logic that works with generator closing.
35.26 Generators and Resource Leaks
A suspended generator may keep resources alive.
def gen():
    f = open("data.txt")
    yield f.readline()
    f.close()

If the caller stops after the first value and never closes the generator, the file may remain open until the generator is collected.
Use with or close explicitly:
g = gen()
next(g)
g.close()

Resource ownership should be explicit in generator code.
35.27 Generators and Exceptions in finally
If cleanup code raises, that exception propagates during generator close or finalization.
def gen():
    try:
        yield 1
    finally:
        raise RuntimeError("cleanup failed")

Calling g.close() after starting the generator raises RuntimeError.
Finalization-time exceptions may be reported as unraisable if there is no normal caller context.
35.28 Generator Memory Retention
A suspended generator keeps its local variables alive.
def gen():
    data = bytearray(100_000_000)
    yield 1
    return len(data)

g = gen()
next(g)

After the first yield, data remains alive because the generator may resume and use it.
Retention chain:
generator object
suspended frame
local data

To release memory, exhaust or close the generator, or avoid keeping large locals across yields.
35.29 Clearing Large Locals
If a large object is not needed after a yield, clear it before yielding or before long suspension.
def gen():
    data = bytearray(100_000_000)
    result = process(data)
    data = None
    yield result

This allows the large object to be released before suspension.
The generator frame still lives, but it no longer references data.
35.30 Generators and for Loops
A generator often appears inside a for loop:
for value in gen():
    use(value)

The loop repeatedly calls next() until StopIteration.
If the loop exits early with break, the generator object may become unreachable and later close. But relying on immediate finalization is implementation-specific. Use context managers when cleanup timing matters.
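When cleanup timing matters, contextlib.closing guarantees close() runs at the end of the with-block even if the loop breaks early:

```python
from contextlib import closing

def gen():
    try:
        yield 1
        yield 2
    finally:
        print("cleanup")

# closing() calls g.close() when the with-block exits, so the
# finally block runs deterministically despite the early break.
with closing(gen()) as g:
    for value in g:
        break
```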
35.31 Generator-Based Context Managers
The contextlib.contextmanager decorator turns a generator into a context manager.
from contextlib import contextmanager

@contextmanager
def managed():
    print("enter")
    try:
        yield "value"
    finally:
        print("exit")

with managed() as value:
    print(value)

The generator yields exactly once.
Conceptually:
__enter__
    run generator until yield
    return yielded value
__exit__
    resume generator to run cleanup

If the with-body raises, the exception is thrown into the generator at the yield.
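Those conceptual steps can be sketched as a hypothetical class. This is a simplified illustration, not the real contextlib implementation, which additionally handles exception replacement and StopIteration edge cases:

```python
class ManagedFromGen:
    """Simplified sketch of what @contextmanager builds."""

    def __init__(self, genfunc):
        self.gen = genfunc()

    def __enter__(self):
        # Run the generator to its yield and hand back the yielded value.
        return next(self.gen)

    def __exit__(self, exc_type, exc, tb):
        if exc is None:
            try:
                next(self.gen)        # resume; cleanup runs, then StopIteration
            except StopIteration:
                return False
            raise RuntimeError("generator didn't stop")
        try:
            self.gen.throw(exc)       # inject the body's exception at the yield
        except StopIteration:
            return True               # generator handled it and finished
        raise RuntimeError("generator didn't stop after throw")

def managed():
    print("enter")
    try:
        yield "value"
    finally:
        print("exit")

with ManagedFromGen(managed) as value:
    print(value)
```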
35.32 Generator Protocol Methods
Generator objects support these important methods:
| Method | Meaning |
|---|---|
| __next__() | Resume and send None |
| send(value) | Resume and send value into the current yield |
| throw(exc) | Resume by raising an exception at the current yield |
| close() | Inject GeneratorExit and close |
| __iter__() | Return self |
next(g) calls g.__next__().
g.__next__() is equivalent to g.send(None) for a suspended generator.
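A quick check of that equivalence:

```python
def gen():
    x = yield "first"
    yield x

g = gen()
print(g.__next__())    # same as next(g): starts the generator, yields "first"
print(g.send(None))    # same as next(g): the yield expression evaluates to None
```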
35.33 Generator Attributes
Generator objects expose useful attributes.
def gen():
    yield 1

g = gen()
print(g.gi_code)
print(g.gi_frame)
print(g.gi_running)

Common attributes include:
| Attribute | Meaning |
|---|---|
| gi_code | Code object |
| gi_frame | Frame, or None when closed |
| gi_running | Whether currently executing |
| gi_yieldfrom | Current delegated iterator for yield from, if any |
These are CPython-level introspection hooks and may expose implementation details.
35.34 Generators and Tracebacks
If a generator raises an exception, the traceback includes the generator frame.
def gen():
    yield 1
    1 / 0

g = gen()
next(g)
next(g)

The second next(g) resumes inside the generator and raises ZeroDivisionError.
The traceback points to the failing line inside gen.
The generator frame is part of the traceback like any other Python frame.
35.35 Generators and StopIteration Transformation
Inside a generator, accidental StopIteration is dangerous.
def gen():
    next(iter([]))
    yield 1

The internal next(iter([])) raises StopIteration.
Modern Python transforms this into RuntimeError when it escapes the generator body. This prevents accidental termination from looking like normal generator completion.
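The transformation (PEP 479, mandatory since Python 3.7) can be observed directly:

```python
def gen():
    next(iter([]))   # raises StopIteration inside the generator body
    yield 1

try:
    next(gen())
except RuntimeError as exc:
    # The escaping StopIteration was replaced with RuntimeError,
    # so the caller cannot mistake it for normal completion.
    print(type(exc).__name__)   # RuntimeError
```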
Correct code should catch it explicitly if expected:
def gen():
    try:
        value = next(iter([]))
    except StopIteration:
        return
    yield value

35.36 Generators and Async
Generators are related to, but distinct from, coroutines and async generators.
| Construct | Keyword | Produces |
|---|---|---|
| Generator | def with yield | Generator object |
| Coroutine | async def | Coroutine object |
| Async generator | async def with yield | Async generator object |
A normal generator uses next, send, throw, and close.
A coroutine uses await and event loop scheduling.
An async generator uses async for and anext.
They share the idea of resumable execution but differ in protocol.
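A minimal async generator, driven by async for under asyncio, shows the protocol difference in miniature:

```python
import asyncio

async def agen():
    yield 1
    yield 2

async def main():
    values = []
    async for value in agen():   # drives __anext__ and awaits each step
        values.append(value)
    print(values)

asyncio.run(main())   # [1, 2]
```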
35.37 CPython Execution Model
At CPython level, a generator is a heap object that owns suspended execution state.
Conceptually:
PyGenObject
    code object
    frame or interpreter frame state
    name and qualname
    exception state
    running flag
    weakrefs
    yield-from target

The exact structure changes across versions, but the conceptual fields remain.
When resumed:
check generator is not closed
check generator is not already running
mark running
enter evaluation loop with saved frame
run until yield, return, or exception
save frame state if yielded
clear frame state if completed
mark not running
return yielded value or propagate exception

35.38 YIELD_VALUE
The YIELD_VALUE instruction is the key bytecode operation.
Conceptually:
value = pop stack
save current frame position
return value to generator caller
mark generator suspended

When resumed, execution continues after the yield instruction.
The yielded value is not the final return value of the function. It is an intermediate result delivered by the iterator protocol.
35.39 SEND and Delegation
Modern bytecode has specific support for sending values into generators, coroutines, and delegation paths.
Conceptually, a send operation:
resume suspended iterator/coroutine
send value or None
receive yielded value, return value, or exception

yield from and await both depend on sending into another resumable object.
This is how nested resumable computations are connected without manually writing full loops.
35.40 Common Misunderstandings
| Misunderstanding | Correct model |
|---|---|
| Calling a generator function runs it | It creates a generator object |
| yield is the same as return | yield suspends; return completes |
| A generator can be reused after exhaustion | It is one-shot |
| A generator stores all values | It stores execution state and computes lazily |
| yield from is only syntax for a loop | It also forwards send, throw, close, and return value |
| Generators release locals after each yield | Locals remain alive while suspended |
| A for loop sees StopIteration.value | It ignores the value |
| close() is just deletion | It injects GeneratorExit and runs cleanup |
35.41 Reading Strategy
Start with a small generator:
def gen():
    x = 1
    yield x
    x = 2
    yield x

Inspect:
import dis
import inspect

g = gen()
print(inspect.getgeneratorstate(g))
dis.dis(gen)
print(next(g))
print(inspect.getgeneratorstate(g))
print(g.gi_frame.f_locals)
print(next(g))
print(inspect.getgeneratorstate(g))
print(g.gi_frame.f_locals)

Then study:
return value
send
throw
close
yield from
try/finally
generator expressions
contextlib.contextmanager

For each case, track:
when the body starts
where execution suspends
which locals remain alive
what resumes execution
what exception or value crosses the boundary
when the frame is cleared

35.42 Chapter Summary
Generators are resumable functions implemented as iterator objects with saved execution state. A generator function call creates a generator object. The body runs only when the generator is resumed by next, send, throw, or close.
The core model is:
generator function call
↓
create generator object with suspended frame
↓
next/send resumes frame
↓
yield returns value and suspends frame
↓
resume later from same point
↓
return or end raises StopIteration

Generators connect bytecode execution, frames, exception handling, iteration, lazy evaluation, memory lifetime, and cleanup semantics.
They are one of the clearest examples of CPython treating execution state as an object.