Generic Types

Universe-polymorphic types through Python's Generic[T] system.

Overview

In F*, polymorphism is expressed with implicit type parameters: val id : #a:Type → a → a. Python's typing module offers TypeVar and Generic — Deppy treats these as universe-polymorphic type parameters that participate in verification.

This chapter shows how to write generic data structures whose operations are verified for all possible type instantiations, not just the ones you happen to test.

TypeVar — The Basic Building Block

A TypeVar declares a universally quantified type variable. Deppy interprets it as ranging over a universe of types.

from __future__ import annotations
from typing import TypeVar, Generic

T = TypeVar('T')

def identity(x: T) -> T:
    """The identity function — verified for all T."""
    return x
# Deppy checks: ∀ T : Type, ∀ x : T, identity(x) : T  ✅

Bounded TypeVars

Adding a bound restricts the universe. Deppy treats the bound as a refinement — only types that are subtypes of the bound are valid instantiations.

from typing import TypeVar
from deppy import guarantee, assume

Comparable = TypeVar('Comparable', bound=object)  # bound=object is the loosest possible bound — a Protocol gives a tighter one

@guarantee("returns the smaller of the two arguments")
def min_of(a: Comparable, b: Comparable) -> Comparable:
    assume("a and b support __lt__")
    return a if a < b else b

In F*, bounded polymorphism is expressed with type-class constraints. In Deppy, we use bound= on TypeVar or Protocol classes to achieve the same effect.
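Here is what the Protocol route looks like — a minimal sketch using only the standard library's typing.Protocol. The SupportsLt name is introduced here for illustration; with the capability encoded in the bound, the assume(...) from the previous example is no longer needed:

```python
from __future__ import annotations
from typing import Protocol, TypeVar

class SupportsLt(Protocol):
    """Structural type: anything comparable with <."""
    def __lt__(self, other: "SupportsLt") -> bool: ...

C = TypeVar('C', bound=SupportsLt)

def min_of(a: C, b: C) -> C:
    # The bound makes the __lt__ capability part of the type,
    # so no assume("a and b support __lt__") is required.
    return a if a < b else b
```

Because the bound is structural, any type that defines `__lt__` — int, str, your own ordered classes — is a valid instantiation of C.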

Generic Classes — Parameterised Containers

The real power comes when you parameterise a class over a type variable. Deppy verifies every method for all valid instantiations of T.

Example: Generic Stack

from __future__ import annotations
from dataclasses import dataclass, field
from typing import TypeVar, Generic, Optional
from deppy import guarantee, check

T = TypeVar('T')

@dataclass
class Stack(Generic[T]):
    """A verified LIFO stack."""
    _items: list[T] = field(default_factory=list)

    @guarantee("len increases by 1")
    def push(self, val: T) -> None:
        old_len = len(self._items)
        self._items.append(val)
        check(len(self._items) == old_len + 1)

    @guarantee("returns most recently pushed item")
    def pop(self) -> T:
        check(len(self._items) > 0, "stack must be non-empty")
        return self._items.pop()

    @guarantee("peek does not modify the stack")
    def peek(self) -> T:
        check(len(self._items) > 0)
        return self._items[-1]

    def is_empty(self) -> bool:
        return len(self._items) == 0

Z3_VERIFIED — all methods verified for all T
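To exercise the stack without Deppy installed, here is a plain-Python sketch of the same class with check replaced by runtime assert — a simplification, since Deppy's check is a static proof obligation, not a runtime test:

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Stack:
    """Plain-Python sketch of the verified stack; invariants as asserts."""
    _items: list = field(default_factory=list)

    def push(self, val) -> None:
        old_len = len(self._items)
        self._items.append(val)
        assert len(self._items) == old_len + 1  # "len increases by 1"

    def pop(self):
        assert len(self._items) > 0, "stack must be non-empty"
        return self._items.pop()

    def peek(self):
        assert len(self._items) > 0
        return self._items[-1]

    def is_empty(self) -> bool:
        return len(self._items) == 0

s = Stack()
s.push(1); s.push(2)
assert s.peek() == 2               # most recently pushed
assert s.pop() == 2 and s.pop() == 1  # LIFO order
assert s.is_empty()
```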

Example: Pair and Either

A = TypeVar('A')
B = TypeVar('B')

@dataclass(frozen=True)
class Pair(Generic[A, B]):
    """Product type: A × B"""
    fst: A
    snd: B

    @guarantee("swap is an involution: swap(swap(p)) == p")
    def swap(self) -> Pair[B, A]:
        return Pair(self.snd, self.fst)
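The involution claim can be spot-checked in plain Python — a runnable sketch of Pair without the @guarantee decorator (frozen dataclasses get structural equality for free, which is what makes the comparison meaningful):

```python
from __future__ import annotations
from dataclasses import dataclass
from typing import Generic, TypeVar

A = TypeVar('A')
B = TypeVar('B')

@dataclass(frozen=True)
class Pair(Generic[A, B]):
    fst: A
    snd: B

    def swap(self) -> "Pair[B, A]":
        return Pair(self.snd, self.fst)

p = Pair(1, "x")
assert p.swap() == Pair("x", 1)
assert p.swap().swap() == p  # swap ∘ swap = id
```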

from typing import Callable, Union

@dataclass(frozen=True)
class Left(Generic[A]):
    value: A

@dataclass(frozen=True)
class Right(Generic[B]):
    value: B

# Sum type: A + B
Either = Union[Left[A], Right[B]]

@guarantee("applies exactly one of the two functions")
def either(
    e: Either[A, B],
    on_left: Callable[[A], T],
    on_right: Callable[[B], T],
) -> T:
    match e:
        case Left(a):
            return on_left(a)
        case Right(b):
            return on_right(b)

Covariance and Contravariance

Variance describes how subtyping of a container relates to subtyping of its elements. Deppy can prove variance properties automatically.

Variance        Meaning                                          Python declaration
Covariant       Cat <: Animal ⟹ Box[Cat] <: Box[Animal]          TypeVar('T', covariant=True)
Contravariant   Cat <: Animal ⟹ Sink[Animal] <: Sink[Cat]        TypeVar('T', contravariant=True)
Invariant       No subtyping relationship                        TypeVar('T') (default)

Proving Covariance

T_co = TypeVar('T_co', covariant=True)

@dataclass(frozen=True)
class FrozenBox(Generic[T_co]):
    """Read-only container — safely covariant."""
    _value: T_co

    def get(self) -> T_co:
        return self._value

Deppy's variance analyser walks the class via inspect and tags every appearance of T_co with its position. Run it via the PSDL primitive verify_variance(...) or directly in Python:

from deppy.proofs.psdl_variance import (
    VariancePos, analyze_class_variance,
)

report = analyze_class_variance(
    FrozenBox,
    type_var_name="T_co",
    declared=VariancePos.COVARIANT,
)

print(report.overall.value)              # 'covariant'
print(report.consistent_with_declaration)  # True ✓
for occ in report.occurrences:
    print(occ.position.value, "@", occ.location)
# covariant      @ field _value (frozen)
# covariant      @ method get: return type

Over in PSDL the same check is one line:

verify frozen_box_covariance:
  function: examples.containers.FrozenBox
  psdl_proof: |
    verify_variance(FrozenBox, type_var="T_co", declared="covariant")
    qed()

This emits a kernel-level VarianceCheck ProofTerm; soundness is the covariance_soundness theorem in metatheory/Deppy.lean §32 (proved with no sorry). Concretely: when the analyser reports overall = covariant, the kernel admits the variance declaration; when it reports invariant (e.g. for a mutable set_value(self, v: T_co) method that uses T_co as a parameter) the kernel rejects with a PSDLError citing the offending location.

The PEP-484 convention that __init__ parameters don't propagate variance is honoured: the analyser skips constructors when totalling occurrences, so a frozen dataclass's auto-generated __init__(self, _value: T_co) doesn't introduce a spurious contravariant occurrence.

A mutable container with covariant=True is unsound. Deppy will reject it: if you can write T into the container, covariance allows inserting an Animal where a Cat is expected.
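The unsoundness is easy to demonstrate with plain lists — which is exactly why mypy keeps list invariant. In this sketch the aliasing step is the one a covariant mutable container would permit:

```python
class Animal: ...

class Cat(Animal):
    def meow(self) -> str:
        return "meow"

class Dog(Animal): ...

cats: list[Cat] = [Cat()]
# If list[T] were covariant, this alias would be allowed:
animals: list[Animal] = cats  # type: ignore[assignment]  (mypy rejects: list is invariant)
animals.append(Dog())         # write a Dog through the Animal view...
# ...and the "list of Cats" now holds a Dog that cannot meow:
assert not hasattr(cats[1], "meow")
```

Read access alone cannot cause this: if you can only get values out, every value you receive really is a T, which is why FrozenBox is safely covariant.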

Contravariance Example

T_contra = TypeVar('T_contra', contravariant=True)

class Handler(Generic[T_contra]):
    """Consumes values — contravariant in T."""
    def handle(self, item: T_contra) -> None: ...

# Handler[Animal] <: Handler[Cat]
# A handler that accepts any Animal can certainly handle a Cat.
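The substitution can be seen end to end in a runnable sketch — the Animal/Cat hierarchy and the recording handler below are illustrative additions, not part of Deppy's API:

```python
from __future__ import annotations
from typing import Generic, TypeVar

T_contra = TypeVar('T_contra', contravariant=True)

class Animal: ...
class Cat(Animal): ...

class Handler(Generic[T_contra]):
    """Consumes values — contravariant in T."""
    def __init__(self) -> None:
        self.seen: list[str] = []

    def handle(self, item: T_contra) -> None:
        self.seen.append(type(item).__name__)

def notify_cat(h: Handler[Cat], c: Cat) -> None:
    h.handle(c)

# A Handler[Animal] is accepted where a Handler[Cat] is expected:
any_animal_handler: Handler[Animal] = Handler()
notify_cat(any_animal_handler, Cat())
assert any_animal_handler.seen == ["Cat"]
```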

Universe Polymorphism

In dependent type theory, types themselves have types — the universe hierarchy Type₀ : Type₁ : Type₂ : … prevents Girard's paradox. Deppy tracks universe levels implicitly:

# TypeVar ranges over Type₀ (concrete types) by default
T = TypeVar('T')  # T : Type₀

# Higher-kinded generics are at level 1. Note that standard Python type
# checkers cannot subscript a plain TypeVar — F[A] below is Deppy notation.
F = TypeVar('F')  # When F is used as a generic alias: F : Type₁

def apply_functor(
    f: Callable[[A], B],
    container: F[A]
) -> F[B]:
    """Map f over a functor F — universe level 1."""
    ...

You do not need to annotate universe levels explicitly. Deppy infers them from usage, similarly to how Lean and Agda handle universe polymorphism. Most code lives at level 0.

Comparison with F*

F*

val swap : #a:Type -> #b:Type
  -> (a * b) -> (b * a)
let swap (x, y) = (y, x)

Deppy

def swap(p: Pair[A, B]) -> Pair[B, A]:
    return Pair(p.snd, p.fst)

F* uses #a:Type for implicit type parameters. Python's TypeVar + Generic is the direct equivalent — type inference fills them in at call sites automatically.

Exercises

Exercise 2.1: Implement a generic Option[T] type (like Rust's Option or Haskell's Maybe) using @dataclass with Some and NoneVal variants. Add a verified map method that preserves NoneVal and transforms Some.
Exercise 2.2: Write a generic Result[T, E] type with Ok and Err variants. Prove that map composed with map equals map of the composed function (the functor law).
Exercise 2.3: Given FrozenBox[T_co] above, try adding a set method that takes a T_co parameter. Observe Deppy's rejection. Explain why mutable covariant containers are unsound.