Chapter 11 — Dynamic Typing Verification
Python is dynamically typed: variables have no fixed type, objects can change
their behavior at runtime, and the type of a value is determined by its
__class__ attribute — which can itself be reassigned. This chapter
shows how Deppy models Python's dynamic type system as a
universe in HoTT, with isinstance as a
decidable predicate, gradual typing as a trust boundary, and monkey patching
as transport along paths.
The Python Type Universe
In HoTT, a universe U is a type whose elements are
themselves types. Python's runtime type system works exactly this way:
type is a universe, and every class is an element of this universe.
# Python's type universe in action:
assert isinstance(42, int) # 42 : int, and int : type
assert isinstance(int, type) # int : type (int is in the universe)
assert isinstance(type, type) # type : type (the universe contains itself!)
# In HoTT terms:
# int : U₀ (int is a type in the first universe)
# type : U₁ (type is a type in the second universe)
# But Python has type : type, which in an unrestricted type theory
# yields Girard's paradox (the type-theoretic analog of Russell's paradox)!
# Deppy resolves this with a stratified universe hierarchy.
Deppy models Python's type universe with explicit levels to avoid paradoxes:
from deppy.hott import Universe, Level
# Deppy's internal universe hierarchy:
# U₀ : values (42, "hello", [1,2,3])
# U₁ : types (int, str, list[int])
# U₂ : metatypes (type, ABCMeta)
# U₃ : meta-metatypes (rarely needed)
#
# The stratification ensures consistency:
# type(42) = int : U₀ → U₁
# type(int) = type : U₁ → U₂
# type(type) = type : U₂ → U₂ (fixed point, explicitly handled)
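Deppy aside, the fixed point at the top of the hierarchy is observable in plain Python by iterating type(...); a small helper makes the chain explicit:

```python
def universe_chain(obj, limit=5):
    """Follow type(...) from a value until it reaches the fixed point `type`."""
    chain = [obj]
    while len(chain) < limit:
        t = type(chain[-1])
        if t is chain[-1]:   # type(type) is type: the fixed point
            break
        chain.append(t)
    return chain

assert universe_chain(42) == [42, int, type]      # U0 -> U1 -> U2
assert universe_chain("hi") == ["hi", str, type]
assert universe_chain(int) == [int, type]         # already at U1
```

Every chain terminates at `type` after at most a few steps, which is exactly the fixed point the stratification must handle explicitly.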
isinstance as Type Predicate
isinstance(x, T) is a decision procedure for the proposition
"the runtime type of x is a subtype of T."
Deppy models this as a decidable predicate that produces
evidence when it holds:
from __future__ import annotations
from deppy import verify, guarantee
from deppy.hott import Dec, Narrowing
@verify
def safe_divide(a: object, b: object) -> float | str:
"""Divide a by b, with type checking and error handling."""
if not isinstance(a, (int, float)):
return f"Error: {a!r} is not a number"
if not isinstance(b, (int, float)):
return f"Error: {b!r} is not a number"
if b == 0:
return "Error: division by zero"
# At this point, Deppy knows:
# a : int | float (narrowed by isinstance check)
# b : int | float (narrowed by isinstance check)
# b ≠ 0 (narrowed by conditional)
return a / b # safe: types verified, no ZeroDivisionError
$ python -m deppy check safe_divide.py --verbose
Analyzing safe_divide...
Type narrowing trace:
Line 4: a : object
Line 5: isinstance(a, (int, float)) → False branch:
→ a : object \ (int | float) — returns str ✓
Line 5: isinstance(a, (int, float)) → True branch:
→ a : int | float
Line 7: isinstance(b, (int, float)) → False branch:
→ b : object \ (int | float) — returns str ✓
Line 7: isinstance(b, (int, float)) → True branch:
→ b : int | float
Line 9: b == 0 → True branch:
→ b : Literal[0] — returns str ✓
Line 9: b == 0 → False branch:
→ b : (int | float) \ {0}
Line 13: a / b
→ a : int | float, b : (int | float) \ {0}
→ result : float ✓
✓ All branches type-safe
✓ No ZeroDivisionError possible
✓ Return type: float | str — matches annotation
Type Narrowing with match/case
Python 3.10's structural pattern matching provides exhaustive type narrowing.
Deppy models each case branch as a fiber:
from __future__ import annotations
from dataclasses import dataclass
from deppy import verify, guarantee
@dataclass
class Circle:
radius: float
@dataclass
class Rectangle:
width: float
height: float
@dataclass
class Triangle:
base: float
height: float
Shape = Circle | Rectangle | Triangle
@verify
@guarantee("returns a non-negative float")
def area(shape: Shape) -> float:
import math
match shape:
case Circle(radius=r):
# narrowed: shape : Circle, r : float = shape.radius
return math.pi * r ** 2
case Rectangle(width=w, height=h):
# narrowed: shape : Rectangle, w = shape.width, h = shape.height
return w * h
case Triangle(base=b, height=h):
# narrowed: shape : Triangle, b = shape.base, h = shape.height
return 0.5 * b * h
$ python -m deppy check area.py --verbose
Analyzing area(shape: Shape)...
Pattern matching exhaustiveness:
Shape = Circle | Rectangle | Triangle
case Circle: covers Circle ✓
case Rectangle: covers Rectangle ✓
case Triangle: covers Triangle ✓
Coverage: 3/3 variants — EXHAUSTIVE ✅
Guarantee "returns non-negative float":
case Circle: π × r² ≥ 0 for all real r (since r² ≥ 0) Z3 ✅
case Rectangle: w × h ≥ 0 iff w ≥ 0 ∧ h ≥ 0
⚠ Proof obligation: width ≥ 0 ∧ height ≥ 0
(add precondition or assert in __init__)
case Triangle: 0.5 × b × h ≥ 0 iff b ≥ 0 ∧ h ≥ 0
⚠ Proof obligation: base ≥ 0 ∧ height ≥ 0
2 proof obligations generated (need preconditions on Rectangle, Triangle)
To discharge these obligations, add assert statements in each class's __init__ (for dataclasses, a __post_init__) or add a precondition to area using @assume.
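One way to discharge the Rectangle obligation is to make non-negativity hold by construction; a plain-Python sketch, redefining Rectangle with a __post_init__ check:

```python
from dataclasses import dataclass

@dataclass
class Rectangle:
    width: float
    height: float

    def __post_init__(self) -> None:
        # Establishes w >= 0 and h >= 0 once, at construction time.
        assert self.width >= 0 and self.height >= 0, "dimensions must be non-negative"

def rect_area(r: Rectangle) -> float:
    return r.width * r.height   # non-negative by the constructor invariant

assert rect_area(Rectangle(3.0, 2.0)) == 6.0
try:
    Rectangle(-1.0, 2.0)        # rejected at construction
except AssertionError:
    pass
```

With the invariant established in the constructor, every Rectangle that reaches area already satisfies the precondition the obligation asks for.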
Gradual Typing: Mixing Typed and Untyped Code
Real Python codebases are gradually typed — some code has annotations, some doesn't. Deppy handles this by introducing trust boundaries at the interface between typed and untyped code:
from deppy import verify, guarantee, trust_boundary
# Untyped legacy code — no annotations
def legacy_compute(data, factor):
"""Some old code without type annotations."""
result = []
for item in data:
result.append(item * factor)
return result
# Typed code that calls untyped code
@verify
@guarantee("returns a list of floats with same length as input")
def scale_values(values: list[float], scale: float) -> list[float]:
# Calling into untyped code — trust boundary
result = trust_boundary(
legacy_compute,
args=(values, scale),
# Declare what we expect from the untyped code:
expected_return=list[float],
expected_properties=[
"len(result) == len(values)",
"all(isinstance(x, float) for x in result)",
],
)
return result
$ python -m deppy check gradual.py
Analyzing scale_values...
Trust boundary at legacy_compute:
├── legacy_compute has no type annotations
├── Inferred types from usage:
│ data : list[float] (from values : list[float])
│ factor: float (from scale : float)
├── Expected return: list[float]
│ └── Runtime check will be inserted ✓
├── Expected property: len(result) == len(values)
│ └── AST analysis: for loop iterates over data, appends once each
│ → len(result) == len(data) == len(values)
│ Status: LLM_CHECKED (confidence: 0.94) 🟠
├── Expected property: all floats
│ └── item : float (from data : list[float]), factor : float
│ → item * factor : float
│ Status: Z3_VERIFIED ✅
└── Trust level: MIXED (some Z3, some LLM)
✓ scale_values: guarantee verified at MIXED trust level
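The mechanics of a trust boundary can be sketched without Deppy: call the untyped code, then check the declared expectations at runtime before handing the result back to typed code. This is a hypothetical helper, not Deppy's actual trust_boundary signature:

```python
def trust_boundary(fn, args, check):
    """Call untyped code, then validate the declared expectations at runtime."""
    result = fn(*args)
    if not check(result, *args):
        raise TypeError(f"trust boundary violated calling {fn.__name__}")
    return result

def legacy_compute(data, factor):      # untyped legacy code
    return [item * factor for item in data]

out = trust_boundary(
    legacy_compute,
    args=([1.0, 2.0], 3.0),
    check=lambda result, data, factor: (
        len(result) == len(data)
        and all(isinstance(x, float) for x in result)
    ),
)
assert out == [3.0, 6.0]
```

The check closure plays the role of the expected_properties list: whatever cannot be proved statically about the untyped side is enforced dynamically at the boundary.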
The Any Type
Any is compatible with every type, in both directions: it behaves as both the top and the bottom of Python's type lattice. In Deppy's model, paths to and from Any are granted trivially, with no proof obligations:
from typing import Any
from deppy import verify
@verify
def identity(x: Any) -> Any:
return x # trivially correct — Any → Any
@verify
def dangerous_add(a: Any, b: Any) -> Any:
return a + b # ⚠ might fail at runtime!
# Deppy's analysis:
# identity: Any → Any — trivially safe ✓
# dangerous_add: Any → Any
# ⚠ Warning: a + b requires __add__ on type(a)
# Not all types have __add__
# This is type-safe (Any accepts anything) but not VALUE-safe
# Recommendation: narrow types with isinstance or use Protocol
Any is a verification escape hatch. Deppy treats Any
as compatible with all types but issues warnings when operations on
Any-typed values could fail at runtime. Use Any
sparingly — prefer Protocol or Union types.
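The gap between type-safety and value-safety is easy to witness at runtime:

```python
def dangerous_add(a, b):
    return a + b        # dispatches to type(a).__add__ at runtime

assert dangerous_add(1, 2) == 3
assert dangerous_add("a", "b") == "ab"   # also fine under Any
try:
    dangerous_add(1, "x")                # int.__add__ rejects str
except TypeError:
    pass
```

The checker accepts every call site (Any matches anything), yet the third call fails; that is precisely the class of failure the warning flags.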
Union Types and Type Narrowing Proofs
Python 3.10 introduced X | Y syntax for union types.
Deppy generates narrowing proofs — evidence that
within a particular branch, the union has been refined to a specific variant:
from __future__ import annotations
from deppy import verify, guarantee
from deppy.hott import NarrowingProof
@verify
@guarantee("returns a non-empty string representation")
def stringify(value: int | float | str | list | None) -> str:
if value is None:
return "null"
if isinstance(value, str):
return f'"{value}"' if value else '""'
if isinstance(value, (int, float)):
return str(value)
# At this point: value : list (only remaining option)
items = [stringify(item) for item in value]
return "[" + ", ".join(items) + "]"
$ python -m deppy check stringify.py --verbose
Type narrowing for stringify(value: int|float|str|list|None):
Step 1: value is None → True
narrow: value : None
return "null" — len("null") = 4 > 0 ✓
remaining: int | float | str | list
Step 2: isinstance(value, str) → True
narrow: value : str
if value (non-empty): f'"{value}"' — len ≥ 3 > 0 ✓
else (empty): '""' — len = 2 > 0 ✓
remaining: int | float | list
Step 3: isinstance(value, (int, float)) → True
narrow: value : int | float
str(value) — always non-empty for numeric types ✓
remaining: list
Step 4: fallthrough
narrow: value : list (exhaustive — only option left)
"[" + ... + "]" — len ≥ 2 > 0 ✓
✓ Exhaustive narrowing — all union variants covered
✓ Guarantee "non-empty string" — Z3_VERIFIED in all branches ✅
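The narrowing chain above is executable as ordinary Python; a few spot checks, one per branch:

```python
def stringify(value):
    if value is None:
        return "null"
    if isinstance(value, str):
        return f'"{value}"' if value else '""'
    if isinstance(value, (int, float)):
        return str(value)
    # fallthrough: value : list (only remaining variant)
    items = [stringify(item) for item in value]
    return "[" + ", ".join(items) + "]"

assert stringify(None) == "null"
assert stringify("") == '""'
assert stringify(3.5) == "3.5"
assert stringify([1, "a", [2.5]]) == '[1, "a", [2.5]]'
```

Each assertion returns a non-empty string, matching the guarantee verified in every branch.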
TypeGuard and TypeIs (PEPs 647 and 742)
Python 3.10 added TypeGuard (PEP 647) and Python 3.13 added
TypeIs (PEP 742) for user-defined type-narrowing functions.
Deppy verifies that these functions actually produce correct evidence:
from __future__ import annotations
from typing import TypeGuard, TypeIs
from deppy import verify, guarantee
@verify
def is_string_list(val: list[object]) -> TypeGuard[list[str]]:
"""Check if all elements in the list are strings."""
return all(isinstance(item, str) for item in val)
# Deppy verifies the TypeGuard:
# If is_string_list(val) returns True, then:
# ∀ item ∈ val. isinstance(item, str)
# → val : list[str]
# This is sound because:
# all(isinstance(item, str) for item in val) == True
# ↔ ∀ item ∈ val. isinstance(item, str) == True
# ↔ ∀ item ∈ val. type(item) ≤ str
# ↔ val : list[str] ✓
@verify
def is_int(val: int | float) -> TypeIs[int]:
    """Check whether the value is an int rather than a float."""
    return isinstance(val, int)
# TypeIs is stricter than TypeGuard:
# TypeIs[int] narrows BOTH the True AND False branches.
# PEP 742 therefore requires the function to return True exactly when
# val is an int; adding a conjunct like `val > 0` would be unsound,
# because the False branch would wrongly narrow non-positive ints to float.
# True branch: val : int (narrowed from int | float)
# False branch: val : float (complement in the union)
# Usage with narrowing:
@verify
def process(data: list[object]) -> list[str]:
    if is_string_list(data):
        # data is narrowed to list[str] by TypeGuard
        return [s.upper() for s in data] # .upper() valid on str ✓
    return [str(item) for item in data]
$ python -m deppy check typeguard.py
✓ TypeGuard[is_string_list]: sound narrowing
├── True branch: val : list[str]
│ └── all(isinstance(item, str)) → correct narrowing ✓
└── Verified ✅
✓ TypeIs[is_int]: sound narrowing
├── True branch: val : int ✓
├── False branch: val : float ✓
└── Verified ✅
✓ process: all branches type-safe ✅
Example: JSON Parsing with Runtime Validation
A common Python pattern: parse JSON (which returns Any), then
validate the structure at runtime. Deppy converts runtime validation into
proof obligations:
from __future__ import annotations
import json
from dataclasses import dataclass
from typing import Any
from deppy import verify, guarantee
@dataclass
class User:
name: str
age: int
email: str
@verify
@guarantee("returns a valid User object or raises ValueError")
def parse_user(raw_json: str) -> User:
"""Parse and validate a User from JSON."""
data: Any = json.loads(raw_json)
# Validate structure
if not isinstance(data, dict):
raise ValueError(f"Expected object, got {type(data).__name__}")
# Validate fields
required = {"name": str, "age": int, "email": str}
for field, expected_type in required.items():
if field not in data:
raise ValueError(f"Missing field: {field}")
if not isinstance(data[field], expected_type):
raise ValueError(
f"Field {field}: expected {expected_type.__name__}, "
f"got {type(data[field]).__name__}"
)
# Additional validation
if data["age"] < 0 or data["age"] > 150:
raise ValueError(f"Invalid age: {data['age']}")
if "@" not in data["email"]:
raise ValueError(f"Invalid email: {data['email']}")
# At this point, Deppy has narrowed:
# data["name"] : str
# data["age"] : int ∧ 0 ≤ age ≤ 150
# data["email"] : str ∧ "@" in email
return User(
name=data["name"],
age=data["age"],
email=data["email"],
)
$ python -m deppy check json_parse.py --verbose
Analyzing parse_user...
Trust boundary: json.loads(raw_json) → Any
├── json.loads is untyped in Deppy's model
├── Returns: Any (could be dict, list, str, int, float, bool, None)
└── All subsequent operations must validate ✓
Narrowing chain:
data : Any
→ isinstance(data, dict) → data : dict[str, Any]
→ "name" in data → data has key "name"
→ isinstance(data["name"], str) → data["name"] : str
→ "age" in data ∧ isinstance(data["age"], int)
→ data["age"] : int
→ 0 ≤ data["age"] ≤ 150
→ data["age"] : int{v | 0 ≤ v ≤ 150}
→ "email" in data ∧ isinstance(data["email"], str)
→ data["email"] : str
→ "@" in data["email"]
→ data["email"] : str{v | "@" ∈ v}
✓ User construction is safe with narrowed types
✓ All error paths raise ValueError ✓
✓ Guarantee: "valid User or ValueError" — Z3_VERIFIED ✅
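A condensed, decorator-free sketch of the same narrowing chain, exercised on both good and bad input (validate_user is a hypothetical shortened variant of parse_user):

```python
import json

REQUIRED = {"name": str, "age": int, "email": str}

def validate_user(raw: str) -> dict:
    data = json.loads(raw)                       # data : Any
    if not isinstance(data, dict):               # narrow: data : dict
        raise ValueError(f"expected object, got {type(data).__name__}")
    for field, expected in REQUIRED.items():
        if not isinstance(data.get(field), expected):
            raise ValueError(f"bad or missing field: {field}")
    if not 0 <= data["age"] <= 150:              # refine: 0 <= age <= 150
        raise ValueError(f"invalid age: {data['age']}")
    return data

user = validate_user('{"name": "Ada", "age": 36, "email": "ada@example.com"}')
assert user["name"] == "Ada"

for bad in ('[1, 2]', '{"name": "Ada"}',
            '{"name": "Ada", "age": 200, "email": "a@b"}'):
    try:
        validate_user(bad)
        raise AssertionError("should have raised")
    except ValueError:
        pass
```

Every successful return has passed through the full chain, so callers may rely on the narrowed field types without re-checking.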
Example: Plugin System with Dynamic Loading
from __future__ import annotations
import importlib
from typing import Protocol
from deppy import verify, guarantee
from deppy.hott import DuckPath
class PluginInterface(Protocol):
name: str
def initialize(self) -> None: ...
def execute(self, data: dict) -> dict: ...
def cleanup(self) -> None: ...
@verify
def load_plugin(module_name: str, class_name: str) -> PluginInterface:
"""Dynamically load a plugin class and verify it conforms."""
module = importlib.import_module(module_name)
cls = getattr(module, class_name)
# Runtime DuckPath check
duck_proof = DuckPath.runtime_check(cls, PluginInterface)
if not duck_proof.is_valid:
raise TypeError(
f"{class_name} does not satisfy PluginInterface:\n"
f"{duck_proof.error_report()}"
)
instance = cls()
# At this point, instance is verified to satisfy PluginInterface
return instance
# Deppy's analysis:
# load_plugin cannot be fully verified statically
# (the loaded class is only known at runtime)
# But DuckPath.runtime_check provides runtime evidence
# that is equivalent to a static DuckPath proof
#
# Verification level: RUNTIME_CHECKED 🟡
# (stronger than untyped, weaker than static proof)
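DuckPath.runtime_check has a weaker real-world analog in the standard library: a @runtime_checkable Protocol supports isinstance, though it checks only method presence, not signatures. A sketch with methods only (data members like name complicate runtime checks):

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class PluginInterface(Protocol):
    def initialize(self) -> None: ...
    def execute(self, data: dict) -> dict: ...
    def cleanup(self) -> None: ...

class GoodPlugin:
    def initialize(self) -> None: pass
    def execute(self, data: dict) -> dict: return data
    def cleanup(self) -> None: pass

class BadPlugin:
    pass

# isinstance checks that the required methods exist, nothing more:
assert isinstance(GoodPlugin(), PluginInterface)
assert not isinstance(BadPlugin(), PluginInterface)
```

A DuckPath-style check would additionally compare signatures, which is why the standard-library check alone earns a weaker verification level.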
Monkey Patching as Transport
Monkey patching — modifying classes or objects at runtime — is one of
Python's most controversial features. Deppy models it using HoTT's
transport: if you have a proof P(A) and a
path p : A ≃ A' (the patch), then
transport(p, proof) : P(A').
The key insight: monkey patching is safe if and only if the patch preserves the relevant paths. If a class satisfies a Protocol before patching, it must still satisfy the Protocol after patching.
from __future__ import annotations
from deppy import verify, guarantee
from deppy.hott import transport, DuckPath, PathPreserving
# Original class
class Logger:
def log(self, message: str) -> None:
print(f"[LOG] {message}")
def error(self, message: str) -> None:
print(f"[ERROR] {message}")
# A function verified against Logger
@verify
@guarantee("logs both start and end of operation")
def process_with_logging(logger: Logger, data: list[int]) -> int:
logger.log("Processing started")
result = sum(data)
logger.log("Processing finished")
return result
# Now someone monkey-patches Logger:
def patched_log(self, message: str) -> None:
import datetime
ts = datetime.datetime.now().isoformat()
print(f"[{ts}] {message}")
# UNSAFE monkey patch — just overwrite:
Logger.log = patched_log # ⚠ breaks verification!
The @safe_patch Decorator
from deppy.patching import safe_patch
# SAFE monkey patch — Deppy verifies it preserves paths
@safe_patch(Logger, "log")
def timestamped_log(self, message: str) -> None:
import datetime
ts = datetime.datetime.now().isoformat()
print(f"[{ts}] {message}")
$ python -m deppy check monkey_patch.py --verbose
Analyzing @safe_patch(Logger, "log")...
Transport check: Logger.log → timestamped_log
├── Original: log(self: Logger, message: str) → None
├── Patch: timestamped_log(self: Logger, message: str) → None
├── Signature compatibility:
│ Parameters: (self, message: str) ≡ (self, message: str) ✓
│ Return: None ≡ None ✓
├── Path preservation:
│ DuckPath[Logger, *] — check all protocols:
│ No protocols reference Logger.log explicitly
│ → No paths to break ✓
├── Behavioral analysis:
│ Original: prints "[LOG] {message}"
│ Patch: prints "[{timestamp}] {message}"
│ Both: write to stdout with message content preserved
│ → Observationally equivalent for logging purposes ✓
└── Transport: SAFE ✅
Affected verifications:
├── process_with_logging: guarantee "logs start and end"
│ transport(patch, proof):
│ logger.log("Processing started") still called ✓
│ logger.log("Processing finished") still called ✓
│ → Guarantee preserved via transport ✅
└── 1/1 proofs successfully transported
When Monkey Patching Breaks
# This patch CHANGES the return type — transport fails!
@safe_patch(Logger, "log")
def broken_log(self, message: str) -> bool: # return type changed!
print(f"[LOG] {message}")
return True
$ python -m deppy check broken_patch.py
✗ @safe_patch(Logger, "log"): TRANSPORT FAILED
├── Signature mismatch:
│ Original return: None
│ Patch return: bool
│ ✗ bool is not a subtype of None
│
├── Path broken:
│ If any code depends on Logger.log returning None,
│ the patch changes that behavior.
│
└── 1 error: patch rejected
Transport Formally
The formal transport rule for monkey patching:
Transport Rule for Monkey Patching. Let C be a class, m a method of C, and m' a proposed patch. Let Φ be a property verified against C. The patch is safe with respect to Φ if:
(1) m' has a signature compatible with m (contravariant parameters, covariant return), and
(2) for every guarantee G of m used in the proof of Φ, m' also satisfies G, and
(3) no other method of C depends on the specific implementation of m (only on its spec).
If (1)–(3) hold, then transport(patch, Φ_proof) : Φ(C[m ↦ m']).
# Deppy's internal transport computation (sketch; check_signature_path,
# extract_guarantees, and verify_guarantee are internal helpers):
import inspect
from collections.abc import Callable
from deppy.hott import transport, PathEquiv
from deppy.proofs import Proof, TransportResult  # assumed module path
def verify_monkey_patch(
cls: type,
method_name: str,
new_method: Callable,
affected_proofs: list[Proof],
) -> TransportResult:
old_method = getattr(cls, method_name)
old_sig = inspect.signature(old_method)
new_sig = inspect.signature(new_method)
# Check signature compatibility
sig_path = check_signature_path(old_sig, new_sig)
if not sig_path.is_valid:
return TransportResult(safe=False, reason=sig_path.error)
# Check guarantee preservation
old_guarantees = extract_guarantees(old_method)
for g in old_guarantees:
if not verify_guarantee(new_method, g):
return TransportResult(
safe=False,
reason=f"Guarantee '{g}' not preserved by patch"
)
# Transport each affected proof
transported = []
for proof in affected_proofs:
new_proof = transport(
path=PathEquiv(old_method, new_method, sig_path),
proof=proof,
)
transported.append(new_proof)
return TransportResult(
safe=True,
transported_proofs=transported,
)
Comparison with F*: Static vs. Dynamic Verification
F* — Everything Static
// F*: types are fixed at compile time
// No isinstance, no monkey patching
// No dynamic loading
val process :
x:nat → Tot (y:nat{y >= x})
// Types cannot change at runtime
// No need for narrowing or transport
// But also no flexibility
Deppy — Static + Dynamic
# Deppy: static where possible,
# dynamic where necessary
@verify
def process(x: object) -> int:
if isinstance(x, int):
return x + 1 # narrowed
return 0
# Types flow through isinstance
# Monkey patches are transported
# Maximum flexibility + safety
| Feature | F* | Deppy |
|---|---|---|
| Type narrowing | N/A (types don't narrow) | isinstance / match / TypeGuard |
| Dynamic dispatch | Typeclasses (static) | DuckPath (static + runtime) |
| Monkey patching | Not possible | Transport with @safe_patch |
| Gradual typing | N/A (fully typed) | Trust boundaries |
| Runtime loading | Not possible | Runtime DuckPath checks |
| Any type | N/A | Top of universe + warnings |
Advanced: Verification Levels for Dynamic Code
Deppy assigns verification levels to dynamic code based on how much evidence is available:
| Level | Badge | Meaning | Example |
|---|---|---|---|
| Z3_VERIFIED | 🟢 | Fully proved by Z3 | isinstance narrowing, arithmetic checks |
| RUNTIME_CHECKED | 🟡 | Verified at runtime | DuckPath.runtime_check, TypeGuard |
| LLM_CHECKED | 🟠 | LLM analysis (high confidence) | Untyped code analysis, behavioral equivalence |
| UNTRUSTED | 🔴 | No verification evidence | Fully dynamic, no annotations or checks |
# The overall verification level of a function is the minimum
# across all its proof obligations:
@verify
def mixed_trust(data: Any) -> int:
if isinstance(data, int):
return data + 1 # 🟢 Z3_VERIFIED
if isinstance(data, str):
return len(data) # 🟢 Z3_VERIFIED
return legacy_compute(data) # 🟠 LLM_CHECKED
# Overall level: 🟠 LLM_CHECKED (weakest link)
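The "weakest link" rule is just a minimum over an ordered set of levels; a sketch with an assumed IntEnum (not Deppy's actual API):

```python
from enum import IntEnum

class Trust(IntEnum):
    UNTRUSTED = 0        # red
    LLM_CHECKED = 1      # orange
    RUNTIME_CHECKED = 2  # yellow
    Z3_VERIFIED = 3      # green

def overall(levels: list[Trust]) -> Trust:
    # The function's overall level is the minimum across its obligations.
    return min(levels)

assert overall([Trust.Z3_VERIFIED, Trust.Z3_VERIFIED,
                Trust.LLM_CHECKED]) == Trust.LLM_CHECKED
assert overall([Trust.RUNTIME_CHECKED, Trust.UNTRUSTED]) == Trust.UNTRUSTED
```

Ordering the levels as integers makes the lattice explicit: adding one weakly verified branch lowers the whole function to that branch's level.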
Exercises
1. Write a verified function that accepts a value of type int | str | list[int] and returns the sum of all integers it finds (converting strings to ints where possible, recursing into lists). Use isinstance for narrowing. Show Deppy's narrowing trace.
2. Write a TypeGuard function is_matrix(val) → TypeGuard[list[list[float]]] that checks a value is a non-empty rectangular matrix. What proof obligations does Deppy generate when this TypeGuard is used?
3. Write a @safe_patch that replaces dict.update with a version that logs all key changes. Verify that the patch preserves the behavioral contract of update: after d.update(other), every key in other is in d with the same value.
4. Suppose a verified function f(x: Renderable) → str, and someone monkey-patches a class Widget that satisfies Renderable by replacing its render method. Write the full transport proof showing under what conditions f(Widget()) remains correct after the patch. When does transport fail?
5. For a diamond-inheritance hierarchy D(B1, B2) where both B1 and B2 inherit from A, show that Python's C3 linearization produces a unique path and that method calls are deterministic. How does Deppy verify that the MRO-selected method satisfies the spec?
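As a starting point for the last exercise, the C3 linearization is directly observable on the diamond hierarchy:

```python
class A:
    def f(self): return "A"

class B1(A):
    def f(self): return "B1"

class B2(A):
    pass

class D(B1, B2):
    pass

# C3 produces a unique, deterministic linearization:
assert D.__mro__ == (D, B1, B2, A, object)
assert D().f() == "B1"   # dispatch deterministically selects B1.f
```

Because __mro__ is a total order, method lookup has exactly one answer; the exercise asks you to connect that uniqueness to a spec check on the selected method.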