Chapter 10 — Decorators & Metaprogramming

Python decorators are higher-order functions that wrap other functions or classes, adding behavior without modifying the original code. Metaclasses go further — they control how classes themselves are created. Both are enormously powerful and enormously dangerous from a verification perspective: a decorator can silently change a function's signature, return type, or behavioral guarantees, and a metaclass can alter the very structure of the type hierarchy.

In this chapter, we show how Deppy verifies that decorators preserve properties, how metaclasses form fibrations over the class hierarchy, and how Python's descriptor protocol can be given a formal semantics through dependent products.

Decorator Verification Basics

A decorator @d applied to function f produces a new function d(f). The fundamental verification question is:

If f satisfies specification S, does d(f) also satisfy S?

Deppy formalizes this as a decorator preservation theorem:

from __future__ import annotations
from deppy import guarantee, preserves_spec
from functools import wraps
import time

@preserves_spec  # Deppy: verify this decorator preserves wrapped fn specs
def timing(func):
    """Decorator that logs execution time without altering behavior."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)  # call original — no modification
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result  # return unmodified result
    return wrapper

@timing
@guarantee("returns a sorted copy of the input list")
def sorted_copy(lst: list[int]) -> list[int]:
    return sorted(lst)

$ python -m deppy check timing_example.py
Analyzing decorator @timing...

✓ DecoratorPreserves[@timing]
  ├── Signature preservation:
  │   @wraps(func) applied  ✓
  │   wrapper accepts (*args, **kwargs) — subsumes any signature  ✓
  │   wrapper returns result of func(*args, **kwargs) unmodified  ✓
  ├── Behavioral preservation:
  │   No modification to args before call  ✓
  │   No modification to result after call  ✓
  │   Side effect: print() — pure observation, does not affect result  ✓
  └── Status: SPEC_PRESERVED ✅

✓ sorted_copy: guarantee "returns a sorted copy" — Z3_VERIFIED
  └── @timing does not invalidate this guarantee

2/2 obligations verified (0.08s)

How Deppy Analyzes Decorators

Deppy performs a data-flow analysis on the decorator's wrapper function to determine what it does to the arguments and return value:

Decorator Analysis Pipeline

1. AST Extraction — Parse the decorator and locate the inner wrapper function.
2. Call-through detection — Identify where func(*args, **kwargs) is called and where its result is stored.
3. Argument mutation check — Verify that args and kwargs are not modified before the call.
4. Result mutation check — Verify that the result is returned unmodified (or, if modified, characterize the modification).
5. Side-effect classification — Classify side effects as observational (logging, timing) or effectful (modifying global state, I/O that affects the result).
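
Step 2 of the pipeline can be sketched with the standard ast module. This is only an illustrative approximation, not Deppy's actual implementation; the helper name find_call_through is invented for this sketch:

```python
import ast
import textwrap

def find_call_through(decorator_src: str) -> bool:
    """Rough check: does the wrapper contain a call of the
    form func(*args, ...) whose result it can pass through?"""
    tree = ast.parse(textwrap.dedent(decorator_src))
    for node in ast.walk(tree):
        # Look for a call to a plain name with a *args splat
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and any(isinstance(a, ast.Starred) for a in node.args)):
            return True
    return False

src = """
def timing(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        return result
    return wrapper
"""
src2 = """
def stub(func):
    def wrapper(*args, **kwargs):
        return None
    return wrapper
"""
print(find_call_through(src))   # True  — wrapper calls through to func
print(find_call_through(src2))  # False — func is never called
```

A real analysis would also track where the call's result flows, which this sketch ignores.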

@wraps and Signature Preservation

Python's functools.wraps copies the wrapped function's __name__, __qualname__, __doc__, __module__, and __annotations__, and, critically, sets __wrapped__ to point back to the original function. inspect.signature follows __wrapped__, so the decorated function reports the original's signature. Deppy uses __wrapped__ to trace through decorator stacks:

import inspect
from deppy.decorators import unwrap_decorator_stack

# Given a decorated function, find the original
def trace_decorators(fn):
    stack = unwrap_decorator_stack(fn)
    # stack = [timing(sorted_copy), sorted_copy]
    # stack[0] is the outermost wrapper
    # stack[-1] is the original function

    for layer in stack:
        sig = inspect.signature(layer)
        print(f"  {layer.__qualname__}: {sig}")

# Deppy verifies that each layer preserves the signature:
# timing.wrapper: (*args, **kwargs) → Any
#   ↓ @wraps sets __wrapped__, so inspect.signature resolves to sorted_copy
# sorted_copy: (lst: list[int]) → list[int]
# The visible signature matches the original ✓

If a decorator does not use @wraps, Deppy warns that the signature may be lost. This is a common source of bugs — tools like help(), inspect.signature(), and IDE autocomplete will show the wrong signature.
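
The failure mode is easy to demonstrate in plain Python, without Deppy:

```python
import inspect
from functools import wraps

def bad_decorator(func):
    def wrapper(*args, **kwargs):      # no @wraps — metadata is lost
        return func(*args, **kwargs)
    return wrapper

def good_decorator(func):
    @wraps(func)                       # sets __wrapped__, copies __name__ etc.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@bad_decorator
def f(x: int) -> int: return x

@good_decorator
def g(x: int) -> int: return x

print(inspect.signature(f))  # (*args, **kwargs) — original signature lost
print(inspect.signature(g))  # (x: int) -> int — recovered via __wrapped__
```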

Decorators That Transform

Not all decorators preserve behavior. Some intentionally transform it. Deppy can verify the transformation:

from deppy import transforms_spec, guarantee

@transforms_spec(
    # Declare what the decorator changes:
    adds_guarantee="result is non-negative",
    modifies_return="abs(original_result)",
)
def make_positive(func):
    """Decorator that takes the absolute value of the result."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        return abs(result)
    return wrapper

@make_positive
@guarantee("returns the difference between two numbers")
def difference(a: int, b: int) -> int:
    return a - b

# After decoration, the effective specification is:
# difference(a, b) = abs(a - b)
# Guarantees:
#   ✓ "returns the difference between two numbers" — TRANSFORMED
#     (now returns absolute difference)
#   ✓ "result is non-negative" — ADDED by decorator

$ python -m deppy check difference.py
✓ DecoratorTransforms[@make_positive]
  ├── Original guarantee "returns the difference": TRANSFORMED
  │   └── Now: "returns the absolute difference"
  ├── Added guarantee "result is non-negative":
  │   Z3: ∀ x : Int. abs(x) ≥ 0  — Z3_VERIFIED ✅
  └── Status: TRANSFORM_VERIFIED ✅
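
Stripped of the Deppy annotations, the transformation is easy to check by hand:

```python
from functools import wraps

def make_positive(func):
    """Wrap func so its result is passed through abs()."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        return abs(func(*args, **kwargs))
    return wrapper

@make_positive
def difference(a: int, b: int) -> int:
    return a - b

print(difference(3, 5))   # 2 — abs(-2), the added guarantee holds
print(difference(5, 3))   # 2 — positive results pass through unchanged
```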

Class Decorators and __init_subclass__

Class decorators operate on the class object itself. A common pattern is registering classes in a registry or adding methods:

from __future__ import annotations
from deppy import verify, guarantee

# A registry decorator — adds the class to a global dict
_REGISTRY: dict[str, type] = {}

def register(cls: type) -> type:
    """Register a class by its name."""
    _REGISTRY[cls.__name__] = cls
    return cls  # returns the class unchanged

@register
@verify
class JsonParser:
    @guarantee("returns a valid Python object from JSON string")
    def parse(self, data: str) -> object:
        import json
        return json.loads(data)

# Deppy verifies:
# 1. @register returns cls unchanged → all specs preserved ✓
# 2. _REGISTRY side effect: adds entry — pure bookkeeping ✓
# 3. JsonParser.parse guarantee: verified via json.loads spec ✓

Python 3.6+ added __init_subclass__, which lets a parent class hook into subclass creation. Deppy treats this as a fibered construction:

@verify
class Plugin:
    """Base class for plugins. Subclasses are auto-registered."""
    _plugins: dict[str, type[Plugin]] = {}

    def __init_subclass__(cls, /, plugin_name: str | None = None, **kwargs):
        super().__init_subclass__(**kwargs)
        name = plugin_name or cls.__name__.lower()
        if name in Plugin._plugins:
            raise ValueError(f"Plugin '{name}' already registered")
        Plugin._plugins[name] = cls

    @guarantee("returns a non-empty string describing the plugin")
    def describe(self) -> str:
        return self.__class__.__name__

# Deppy models __init_subclass__ as a fibration section:
# For each subclass C, __init_subclass__ produces a fiber:
#   fiber(C) = (name, C) added to _plugins
# Proof obligation: name is unique (no collisions)
#
# This is verified per-subclass at class-creation time.
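
Stripped of the Deppy decorators, the uniqueness obligation can be exercised directly; the subclass names here are invented for illustration:

```python
class Plugin:
    """Base class for plugins. Subclasses are auto-registered."""
    _plugins: dict[str, type] = {}

    def __init_subclass__(cls, /, plugin_name=None, **kwargs):
        super().__init_subclass__(**kwargs)
        name = plugin_name or cls.__name__.lower()
        if name in Plugin._plugins:
            raise ValueError(f"Plugin '{name}' already registered")
        Plugin._plugins[name] = cls

class CsvPlugin(Plugin): ...          # registered as "csvplugin"

try:
    # Collides with the fiber already registered above
    class Duplicate(Plugin, plugin_name="csvplugin"): ...
except ValueError as e:
    print(e)   # Plugin 'csvplugin' already registered
```

The check runs at class-creation time: the ValueError is raised while the class statement executes, not when an instance is made.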

Metaclasses as Fibrations

A metaclass M is a class whose instances are themselves classes. In HoTT, this is naturally modeled as a fibration: the metaclass defines a fiber bundle over the space of class definitions.

# The fibration picture:
#
#   E  (total space: all classes created by metaclass M)
#   |
#   | π  (projection: each class maps to its definition)
#   |
#   B  (base space: class definitions = (name, bases, namespace))
#
# The fiber over a point b ∈ B is:
#   π⁻¹(b) = M(name, bases, namespace)
#   = the class object produced by the metaclass
#
# A fibration must satisfy the homotopy lifting property:
# if we continuously deform the class definition (base path),
# the resulting class deforms continuously (total path).
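
The base-space triple is exactly what Python's three-argument type() call consumes, so the projection can be made concrete with the default metaclass:

```python
# A point b in the base space: (name, bases, namespace)
b = ("Point", (object,), {"x": 0, "y": 0})

# The fiber over b: the class object the metaclass produces
Point = type(*b)

print(Point.__name__)    # Point
print(Point.x, Point.y)  # 0 0
print(type(Point))       # <class 'type'> (the metaclass)
```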

Here's a concrete metaclass and its fibration interpretation:

from __future__ import annotations
import annotationlib
from deppy import verify, guarantee
from deppy.hott import Fibration, FiberCheck

class ValidatedMeta(type):
    """Metaclass that enforces type annotations on all attributes."""

    def __new__(mcs, name: str, bases: tuple, namespace: dict) -> type:
        # Create the class first, then inspect its annotations (PEP 749, Python 3.14+)
        cls = super().__new__(mcs, name, bases, namespace)
        annotations = annotationlib.get_annotations(cls)
        for attr_name, value in namespace.items():
            if attr_name.startswith("_"):
                continue
            if attr_name not in annotations and not callable(value):
                raise TypeError(
                    f"Attribute '{attr_name}' in class '{name}' "
                    f"must have a type annotation"
                )
        return cls

# Deppy's fibration analysis:
fib = Fibration.from_metaclass(ValidatedMeta)
# Fibration(ValidatedMeta):
#   Base space: (name: str, bases: tuple[type], namespace: dict)
#   Fiber invariant: ∀ attr ∈ namespace.
#     ¬attr.startswith("_") ∧ ¬callable(attr_value)
#     → attr ∈ annotations
#   Total space: { cls : type | cls.__class__ == ValidatedMeta }

Verifying Classes Under a Metaclass

class Config(metaclass=ValidatedMeta):
    host: str = "localhost"
    port: int = 8080
    debug: bool = False
    # All public attributes are annotated ✓

try:
    class BadConfig(metaclass=ValidatedMeta):
        host = "localhost"  # ERROR: no type annotation!
        port: int = 8080
except TypeError as e:
    print(e)  # → Attribute 'host' in class 'BadConfig' must have a type annotation

$ python -m deppy check config.py
✓ FiberCheck[Config, ValidatedMeta]
  ├── host: str = "localhost"   — annotated ✓
  ├── port: int = 8080         — annotated ✓
  ├── debug: bool = False      — annotated ✓
  └── Metaclass invariant satisfied ✅

✗ FiberCheck[BadConfig, ValidatedMeta]
  ├── host = "localhost"        — NOT annotated ✗
  │   TypeError: Attribute 'host' must have a type annotation
  ├── port: int = 8080         — annotated ✓
  └── Metaclass invariant VIOLATED ❌

__new__ vs __init__ Verification

Python classes have two creation hooks: __new__ (creates the instance) and __init__ (initializes it). Deppy verifies both and their interaction:

@verify
class Singleton:
    """A class that ensures only one instance exists."""
    _instance: Singleton | None = None

    def __new__(cls) -> Singleton:
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __init__(self) -> None:
        if not hasattr(self, "_initialized"):
            self.value = 0
            self._initialized = True

# Deppy verifies:
# 1. __new__ always returns an instance of cls (or subclass) ✓
# 2. __new__ is idempotent: Singleton() is Singleton() ✓
# 3. __init__ guards against re-initialization ✓
# 4. After __new__ + __init__, self.value == 0 ∧ self._initialized ✓
#
# Proof obligation chain:
#   __new__(cls) → instance : cls
#   __init__(instance) → instance.value = 0
#   Singleton() = __init__(__new__(Singleton))
#   Singleton() is Singleton()  [singleton property]
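
Dropping @verify, the singleton and re-initialization properties can be checked directly:

```python
class Singleton:
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __init__(self):
        if not hasattr(self, "_initialized"):
            self.value = 0
            self._initialized = True

a = Singleton()
a.value = 42
b = Singleton()   # __init__ runs again, but the guard preserves state
print(a is b)     # True — the singleton property
print(b.value)    # 42 — not reset to 0 by the second __init__
```

Note that without the hasattr guard, the second call would re-run the body of __init__ and silently reset value to 0.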

The Descriptor Protocol

Python's descriptor protocol (__get__, __set__, __delete__) controls attribute access. Deppy models descriptors as dependent products — a descriptor's type depends on the object it's accessed through:

from __future__ import annotations
from deppy import verify, guarantee

@verify
class Validated:
    """Descriptor that validates values on set."""

    def __init__(self, min_val: float, max_val: float) -> None:
        self.min_val = min_val
        self.max_val = max_val
        self.attr_name = ""

    def __set_name__(self, owner: type, name: str) -> None:
        self.attr_name = name

    @guarantee("returns a value in [min_val, max_val]")
    def __get__(self, obj: object | None, objtype: type | None = None) -> Validated | float:
        if obj is None:
            return self  # class-level access returns the descriptor itself
        return getattr(obj, f"_{self.attr_name}", self.min_val)

    def __set__(self, obj: object, value: float) -> None:
        if not (self.min_val <= value <= self.max_val):
            raise ValueError(
                f"{self.attr_name} must be between "
                f"{self.min_val} and {self.max_val}"
            )
        setattr(obj, f"_{self.attr_name}", value)

    def __delete__(self, obj: object) -> None:
        delattr(obj, f"_{self.attr_name}")

@verify
class Temperature:
    kelvin = Validated(min_val=0.0, max_val=1e10)
    celsius = Validated(min_val=-273.15, max_val=1e10)

    def __init__(self, kelvin: float) -> None:
        self.kelvin = kelvin
        self.celsius = kelvin - 273.15

$ python -m deppy check descriptors.py --verbose
Analyzing descriptor Validated...

✓ DescriptorProtocol[Validated]
  ├── __get__: (obj, objtype?) → float
  │   └── Guarantee "returns value in [min, max]":
  │       Precondition: obj is not None
  │       Z3: min_val ≤ result ≤ max_val
  │       Status: Z3_VERIFIED ✅
  │       (default value min_val is in range trivially)
  ├── __set__: (obj, value: float) → None
  │   └── Post: getattr(obj, f"_{name}") == value
  │       Only if min_val ≤ value ≤ max_val
  │       Otherwise raises ValueError  ✓
  ├── __delete__: (obj) → None
  │   └── Post: _{name} removed from obj  ✓
  └── Coherence: __set__ then __get__ returns same value  ✓

✓ Temperature.kelvin: Validated(0.0, 1e10)
  ├── __init__ sets kelvin to parameter value
  │   Z3: kelvin ≥ 0 → 0.0 ≤ kelvin ≤ 1e10  ✓ (if kelvin ≤ 1e10)
  └── Precondition generated: 0.0 ≤ kelvin ≤ 1e10

✓ Temperature.celsius: Validated(-273.15, 1e10)
  ├── __init__ sets celsius to kelvin - 273.15
  │   Z3: (0 ≤ kelvin ≤ 1e10) → (-273.15 ≤ kelvin-273.15 ≤ 1e10)
  │   Status: Z3_VERIFIED ✅
  └── Descriptor bounds satisfied
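
The coherence and range checks in the transcript can be observed at runtime with a Deppy-free version of the descriptor:

```python
class Validated:
    """Descriptor that validates values on set (sketch without @verify)."""

    def __init__(self, min_val: float, max_val: float) -> None:
        self.min_val, self.max_val = min_val, max_val
        self.attr_name = ""

    def __set_name__(self, owner: type, name: str) -> None:
        self.attr_name = name

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return getattr(obj, f"_{self.attr_name}", self.min_val)

    def __set__(self, obj, value: float) -> None:
        if not (self.min_val <= value <= self.max_val):
            raise ValueError(f"{self.attr_name} out of range")
        setattr(obj, f"_{self.attr_name}", value)

class Temperature:
    kelvin = Validated(0.0, 1e10)

t = Temperature()
t.kelvin = 300.0
print(t.kelvin)       # 300.0 — __set__ then __get__ round-trips
try:
    t.kelvin = -5.0   # below min_val
except ValueError as e:
    print(e)          # kelvin out of range
```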

@property Verification

Python's @property is a descriptor that turns method calls into attribute access. Deppy verifies properties as if they were methods:

import math

@verify
class Circle:
    def __init__(self, radius: float) -> None:
        assert radius > 0
        self._radius = radius

    @property
    @guarantee("returns a positive value")
    def radius(self) -> float:
        return self._radius

    @radius.setter
    def radius(self, value: float) -> None:
        if value <= 0:
            raise ValueError("radius must be positive")
        self._radius = value

    @property
    @guarantee("returns π × radius²")
    def area(self) -> float:
        return math.pi * self._radius ** 2

    @property
    @guarantee("returns 2 × π × radius")
    def circumference(self) -> float:
        return 2 * math.pi * self._radius

# Deppy verifies the invariant:
#   area == π × radius²  ∧  circumference == 2π × radius
# And these are consistent after setter:
#   c.radius = r  →  c.area == π×r²  ∧  c.circumference == 2π×r

Case Study: Verified Caching Decorator

functools.lru_cache is one of Python's most-used decorators. Let's verify key properties of a simplified version:

from __future__ import annotations
from deppy import verify, guarantee, preserves_spec
from functools import wraps

@preserves_spec
@verify
def memoize(maxsize: int = 128):
    """
    Caching decorator with bounded cache size.

    Properties we verify:
    1. Cache hit returns same result as uncached call (correctness)
    2. Cache size never exceeds maxsize (boundedness)
    3. Decorator preserves function spec (transparency)
    """
    assert maxsize > 0

    def decorator(func):
        cache: dict[tuple, object] = {}
        order: list[tuple] = []  # LRU order

        @wraps(func)
        def wrapper(*args):
            key = args
            if key in cache:
                # Move to end (most recently used)
                order.remove(key)
                order.append(key)
                return cache[key]

            result = func(*args)
            cache[key] = result
            order.append(key)

            # Evict oldest if over capacity
            while len(cache) > maxsize:
                oldest = order.pop(0)
                del cache[oldest]

            return result

        wrapper.cache_info = lambda: {
            "size": len(cache), "maxsize": maxsize
        }
        wrapper.cache_clear = lambda: (cache.clear(), order.clear())
        return wrapper
    return decorator

$ python -m deppy check memoize.py --verbose
Analyzing @memoize...

✓ Property 1: Cache correctness
  ├── cache[key] is set to func(*args)
  ├── On cache hit, cache[key] is returned unchanged
  ├── func is pure (assumption — verified per call site)
  │   ∴ cache hit returns same as fresh call
  └── Z3_VERIFIED ✅

✓ Property 2: Cache boundedness
  ├── Loop invariant: len(cache) ≤ maxsize + 1 at entry
  ├── While body: len(cache) decreases by 1
  ├── After loop: len(cache) ≤ maxsize
  ├── Termination: len(cache) is natural number, decreases
  └── Z3_VERIFIED ✅

✓ Property 3: Spec preservation
  ├── @wraps(func) applied  ✓
  ├── wrapper returns func(*args) or cache[key]
  │   Both equal func(*args) by Property 1
  ├── No argument modification  ✓
  └── SPEC_PRESERVED ✅

3/3 obligations verified (0.21s)
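
The boundedness property can be observed at runtime with the same memoize stripped of its Deppy decorators (square is an example function invented for this sketch):

```python
from functools import wraps

def memoize(maxsize=128):
    def decorator(func):
        cache, order = {}, []   # cache + LRU order, as above

        @wraps(func)
        def wrapper(*args):
            if args in cache:
                order.remove(args)
                order.append(args)      # mark as most recently used
                return cache[args]
            result = func(*args)
            cache[args] = result
            order.append(args)
            while len(cache) > maxsize:
                del cache[order.pop(0)]  # evict least recently used
            return result

        wrapper.cache_info = lambda: {"size": len(cache), "maxsize": maxsize}
        return wrapper
    return decorator

@memoize(maxsize=3)
def square(n):
    return n * n

for n in [1, 2, 3, 4, 5, 1]:
    square(n)
print(square.cache_info())   # {'size': 3, 'maxsize': 3} — never exceeds bound
```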

Case Study: Verified Retry Decorator

from __future__ import annotations
import time
from functools import wraps
from deppy import verify, guarantee

@verify
def retry(
    max_attempts: int = 3,
    backoff_factor: float = 1.0,
    exceptions: tuple[type[Exception], ...] = (Exception,),
):
    """
    Retry decorator with exponential backoff.

    Guarantees:
    - At most max_attempts calls to the wrapped function
    - Backoff delay doubles each retry
    - Only catches specified exception types
    - If all retries fail, re-raises the last exception
    """
    assert max_attempts >= 1
    assert backoff_factor >= 0

    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last_exception: Exception | None = None
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except exceptions as e:
                    last_exception = e
                    if attempt < max_attempts - 1:
                        delay = backoff_factor * (2 ** attempt)
                        time.sleep(delay)
            raise last_exception  # type: ignore
        return wrapper
    return decorator

$ python -m deppy check retry.py --verbose
✓ Bounded retries:
  ├── Loop: for attempt in range(max_attempts)
  │   range(max_attempts) has exactly max_attempts iterations
  │   func called at most once per iteration
  │   ∴ func called at most max_attempts times
  └── Z3_VERIFIED ✅

✓ Exponential backoff:
  ├── delay = backoff_factor × 2^attempt
  │   attempt ∈ {0, 1, ..., max_attempts-2}
  │   delays: [bf, 2bf, 4bf, ...]
  │   Each delay = 2 × previous delay
  └── Z3_VERIFIED ✅

✓ Exception specificity:
  ├── except clause catches only 'exceptions' tuple
  │   Other exceptions propagate uncaught ✓
  └── AST_VERIFIED ✅

✓ Last exception re-raised:
  ├── If loop completes without return:
  │   last_exception is not None (set on each except)
  │   raise last_exception executes
  └── Z3_VERIFIED ✅

4/4 obligations verified (0.15s)
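
The bounded-retries property can be exercised without Deppy; setting backoff_factor=0.0 keeps the demo from sleeping, and flaky is an example function invented for this sketch:

```python
import time
from functools import wraps

def retry(max_attempts=3, backoff_factor=1.0, exceptions=(Exception,)):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last = None
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except exceptions as e:
                    last = e
                    if attempt < max_attempts - 1:
                        time.sleep(backoff_factor * (2 ** attempt))
            raise last
        return wrapper
    return decorator

calls = []

@retry(max_attempts=3, backoff_factor=0.0, exceptions=(ValueError,))
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise ValueError("transient")
    return "ok"

print(flaky())      # ok — succeeds on the third attempt
print(len(calls))   # 3 — exactly max_attempts calls, as verified above
```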

ABCMeta and Abstract Method Enforcement

abc.ABCMeta is a metaclass that enforces abstract method implementation. Deppy models this as a fibration with a completeness check:

from abc import ABCMeta, abstractmethod
from deppy import verify, guarantee

class Shape(metaclass=ABCMeta):
    @abstractmethod
    @guarantee("returns a non-negative float")
    def area(self) -> float: ...

    @abstractmethod
    @guarantee("returns a positive float")
    def perimeter(self) -> float: ...

@verify
class Rectangle(Shape):
    def __init__(self, width: float, height: float) -> None:
        assert width > 0 and height > 0
        self.width = width
        self.height = height

    @guarantee("returns width × height")
    def area(self) -> float:
        return self.width * self.height

    @guarantee("returns 2 × (width + height)")
    def perimeter(self) -> float:
        return 2 * (self.width + self.height)

$ python -m deppy check shapes.py
✓ ABCMeta completeness: Rectangle implements all abstract methods
  ├── area: implemented ✓
  │   └── "returns non-negative": w > 0 ∧ h > 0 → w×h > 0 ≥ 0  Z3 ✅
  ├── perimeter: implemented ✓
  │   └── "returns positive": w > 0 ∧ h > 0 → 2(w+h) > 0  Z3 ✅
  └── All guarantees from Shape are satisfied ✅

2/2 obligations verified (0.09s)

Deppy treats ABCMeta as a special case of fibration where the fiber invariant is: "all @abstractmethods must be overridden." This is a decidable check, so Deppy verifies it at class-creation time without needing Z3.
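
The completeness check is visible in plain Python: a subclass that omits an abstract method can be defined, but instantiating it raises TypeError.

```python
from abc import ABCMeta, abstractmethod

class Shape(metaclass=ABCMeta):
    @abstractmethod
    def area(self) -> float: ...

    @abstractmethod
    def perimeter(self) -> float: ...

class Incomplete(Shape):
    def area(self) -> float:    # perimeter is still missing
        return 0.0

try:
    Incomplete()                # ABCMeta blocks instantiation
except TypeError as e:
    print(e)
```

Unlike Deppy's class-creation-time check, ABCMeta only complains at instantiation time; the remaining obligations are recorded in Incomplete.__abstractmethods__.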

Exercises

Exercise 10.1. Write a @validate_args decorator that checks all arguments against their type annotations at runtime. Verify that this decorator preserves function specs while adding a runtime type check. What guarantee does the decorator add?
Exercise 10.2. Create a metaclass ImmutableMeta that makes all instances of a class immutable (freezes __setattr__ after __init__). Write the Deppy fiber invariant: what property must all classes created with this metaclass satisfy?
Exercise 10.3. Write a @rate_limit(calls_per_second=10) decorator. Verify: (a) it never calls the wrapped function more than calls_per_second times per second, (b) it preserves the function's spec, (c) it eventually calls through (liveness).
Exercise 10.4 (Challenge). Consider a decorator stack: @retry(3) @memoize(128) @timing. Write out the full verification pipeline: which properties are preserved at each layer? Does the order of decorators matter for correctness? (Hint: consider what happens when retry catches an exception — does memoize cache it?)