PEP 484 appeared in 2014 and was accepted quietly the next year, for Python 3.5. It proposed a standard way to annotate function signatures with type information: not enforced by the interpreter, just hints. Optional. Easy to ignore. The Python community largely did.

Then something strange happened. Gradually, and then suddenly, type annotations became ubiquitous in serious Python codebases. Today, major libraries ship with type stubs. Editors use them for autocompletion. CI pipelines run mypy on pull requests. The optional hint became, in practice, a standard.

What changed?

The short answer is tooling. But the longer answer is more interesting: we collectively rediscovered that types aren't about the machine — they're about communication.

Consider this function signature:

def get_user(user_id, include_deleted=False):
    ...

What is user_id? An integer? A UUID string? A custom UserId type? What does the function return? You need to read the implementation — or hope the docstring is up-to-date — to know.

Now with type annotations:

from uuid import UUID
from models import User

def get_user(user_id: UUID, include_deleted: bool = False) -> User | None:
    ...

The signature is a contract. It documents intent. It tells you what to pass in and what to expect back — and your editor will enforce it before you run a single line.

Type annotations are documentation that can be verified by a machine. That makes them worth more than any docstring.

The anatomy of Python's type system

Python's runtime is duck-typed, but its static type system is not purely structural — the distinction between nominal and structural typing matters here. Most annotations are checked nominally: a value satisfies User by actually being an instance of User. Structural checking, where a value qualifies simply by having the right methods, is opt-in through Protocol. That opt-in is duck typing, formalized.

The typing module provides the building blocks: generic types, union types, optional values, callable types, and more. Since Python 3.10, much of the syntax has been simplified — you write str | None instead of Optional[str].

from typing import TypeVar, Generic

T = TypeVar('T')

class Stack(Generic[T]):
    def __init__(self) -> None:
        self._items: list[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

Stack here is a generic class. When you write Stack[int], the type checker understands that pop() returns an int. No casts. No guessing.
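A quick usage sketch (restating the Stack class so the snippet runs on its own):

```python
from typing import TypeVar, Generic

T = TypeVar('T')

class Stack(Generic[T]):  # same Stack as above, restated to be runnable
    def __init__(self) -> None:
        self._items: list[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

s: Stack[int] = Stack()
s.push(1)
s.push(2)
top = s.pop()       # the checker infers top: int — no cast needed
# s.push("three")   # would be rejected: "str" is not "int"
```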

Protocols: duck typing with a safety net

The most underrated part of the type system is Protocol. It lets you define structural interfaces without inheritance:

from typing import Protocol

class Drawable(Protocol):
    def draw(self) -> None: ...

def render(shape: Drawable) -> None:
    shape.draw()

# Works with any class that has a draw() method,
# no explicit subclassing required.

This is Python's superpower: the flexibility of duck typing, combined with the verifiability of static analysis. You don't need to inherit from Drawable. You just need to have a draw() method.
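Here's that idea end to end. `Circle` is a hypothetical shape; the `@runtime_checkable` decorator is a real `typing` feature that additionally lets `isinstance()` test the protocol at runtime:

```python
from typing import Protocol, runtime_checkable

@runtime_checkable  # also enables isinstance() checks against the protocol
class Drawable(Protocol):
    def draw(self) -> None: ...

class Circle:  # hypothetical shape — note: no inheritance from Drawable
    def draw(self) -> None:
        print("drawing a circle")

def render(shape: Drawable) -> None:
    shape.draw()

render(Circle())  # accepted: Circle matches the protocol structurally
```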

The limits of the system

Python's type system is deliberately gradual. You can annotate as much or as little as you want. Unannotated code is implicitly typed as Any, which is compatible with everything. This is pragmatic but also dangerous: it means type errors can hide behind untyped third-party code.
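A sketch of how Any swallows errors — `parse` is an invented function, not from any library:

```python
from typing import Any

def parse(raw: Any) -> int:
    # raw is Any, so every operation on it type-checks — even this one,
    # which plainly does not return an int. The checker stays silent.
    return raw.upper()

value = parse("abc")  # "passes" the checker, yet returns a str at runtime
```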

The system also can't verify runtime values. A function annotated to return User | None can still return an int at runtime — annotations are not enforced by the interpreter. For that, you need runtime validation tools like pydantic.
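A minimal demonstration of that gap, with an invented function for illustration — the annotation promises one thing, the interpreter happily delivers another:

```python
def get_label(user_id: int) -> str:
    # The annotation promises a str, but the interpreter never checks it.
    return 42  # type: ignore[return-value]  (a static checker flags this line)

result = get_label(1)
print(type(result))  # the "wrong" value flows through silently at runtime
```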

Should you use type annotations?

Yes, for any code that will be read again. The cost is low — a few extra characters per function signature. The benefit is a codebase that documents itself and a feedback loop that catches errors before runtime.

Start with function signatures. Add mypy in its most lenient mode. Expand coverage over time. The type system rewards patience.
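One plausible starting configuration — the option names are real mypy settings, but the schedule of tightening is just a suggestion, and the module pattern is a made-up example:

```ini
# mypy.ini — a lenient starting point
[mypy]
ignore_missing_imports = True   # don't fail on untyped third-party packages
check_untyped_defs = False      # skip the bodies of unannotated functions for now

# Later, tighten module by module (hypothetical package name):
# [mypy-myproject.core.*]
# disallow_untyped_defs = True
```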

