
The first time most developers encounter a decorator, it's already doing work for them. A @login_required in a Django view. An @app.route in Flask. A @pytest.mark.parametrize in a test file. These symbols sit at the top of functions like incantations; they work, and so nobody investigates.

That's a shame. Because decorators are not magic — they're one of the most elegant applications of a simple idea: in Python, functions are values, and you can wrap them like any other value. Once that clicks, a whole tier of expressive, DRY code opens up.

The foundation: functions are first-class objects

Everything rests on this. In Python, a function is an object like any other. You can assign it to a variable, pass it to another function, return it from a function. If you've never consciously used this, let it sink in:

def greet(name: str) -> str:
    return f"Hello, {name}"

# greet is an object — assign it like any value.
say_hello = greet
print(say_hello("Ada"))        # Hello, Ada

# Pass it to another function.
def apply(func, value):
    return func(value)

print(apply(greet, "Turing"))   # Hello, Turing

This isn't a party trick. A decorator is just a function that takes a function as input and returns a new function as output.
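
To make that concrete before any @ syntax appears, here is a minimal sketch of wrapping a function by hand. The names shout and greet are purely illustrative:

```python
def shout(func):
    """Takes a function, returns a new function that upper-cases its result."""
    def wrapper(name):
        return func(name).upper()
    return wrapper

def greet(name: str) -> str:
    return f"Hello, {name}"

greet = shout(greet)   # replace greet with the wrapped version
print(greet("Ada"))    # HELLO, ADA
```

No special syntax involved: shout is an ordinary function, and the reassignment is an ordinary assignment.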

Closures: the invisible glue

Before writing a decorator, you need to understand closures. A closure is a function that remembers variables from the scope where it was defined, even after that scope has exited.

def make_multiplier(factor: int):
    def multiply(n: int) -> int:
        return n * factor  # 'factor' is captured from outer scope
    return multiply

double = make_multiplier(2)
triple = make_multiplier(3)

print(double(10))  # 20
print(triple(10))  # 30

double and triple are closures. Each carries its own captured factor. make_multiplier has long since returned — but factor lives on, locked inside the inner function. This is exactly how decorators work internally.
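
You can even inspect the captured variable directly: Python stores it in cell objects on the function's __closure__ attribute, with names listed in __code__.co_freevars. A quick sketch, reusing make_multiplier from above:

```python
def make_multiplier(factor: int):
    def multiply(n: int) -> int:
        return n * factor  # 'factor' is captured from the outer scope
    return multiply

double = make_multiplier(2)

# The closure machinery is visible at runtime.
print(double.__code__.co_freevars)          # ('factor',)
print(double.__closure__[0].cell_contents)  # 2
```

This is occasionally useful when debugging a decorator that seems to be holding on to the "wrong" value.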


Building your first decorator

Let's write a decorator that times any function it wraps. No external libraries — pure Python.

import time
from functools import wraps

def timer(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start   = time.perf_counter()
        result  = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"[timer] {func.__name__} ran in {elapsed:.4f}s")
        return result
    return wrapper

@timer
def slow_sum(n: int) -> int:
    return sum(range(n))

slow_sum(10_000_000)
# [timer] slow_sum ran in 0.3142s

The @timer syntax is pure shorthand. Writing @timer above slow_sum is identical to slow_sum = timer(slow_sum) right after the definition. The @ sign is syntactic sugar — nothing more.

@decorator applied to a function f is equivalent to f = decorator(f). The one subtlety: with the @ form, the name f is never bound to the undecorated function at all.

Why functools.wraps is non-negotiable

Without it, your wrapped function loses its identity:

# Without @wraps:
print(slow_sum.__name__)  # 'wrapper'  — wrong
print(slow_sum.__doc__)   # None       — docstring gone

# With @wraps(func):
print(slow_sum.__name__)  # 'slow_sum' — correct
print(slow_sum.__doc__)   # original docstring preserved

@wraps copies __name__, __doc__, __qualname__, __annotations__, and __module__ from the original onto the wrapper. It costs one line and saves real confusion in debugging, logging, and documentation generation. Always use it.
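
@wraps (via functools.update_wrapper) also stores the original function on the wrapper as __wrapped__, which lets you bypass the decoration entirely — handy in tests. A sketch, rebuilding the timer from above:

```python
import functools
import time

def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"[timer] {func.__name__} ran in {time.perf_counter() - start:.4f}s")
        return result
    return wrapper

@timer
def slow_sum(n: int) -> int:
    """Sum the first n integers."""
    return sum(range(n))

print(slow_sum.__name__)         # slow_sum — identity preserved
print(slow_sum.__wrapped__(10))  # 45 — calls the original, no timing printout
```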

Decorators with arguments

A plain decorator takes one argument: the function. To configure it, you add a third layer of nesting — a function that returns a decorator:

import functools, time

def retry(max_attempts: int = 3, delay: float = 1.0):
    """Decorator factory: retries the function on exception."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:
                    if attempt == max_attempts:
                        raise
                    print(f"Attempt {attempt} failed: {exc}. Retrying…")
                    time.sleep(delay)
        return wrapper
    return decorator

@retry(max_attempts=5, delay=0.5)
def fetch_data(url: str) -> dict:
    ...

The pattern: outermost function accepts config, returns a decorator, which accepts a function, returns a wrapper. Three levels. Once you've seen it, you'll recognise it everywhere.
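
To see the factory in action, here is a sketch with a hypothetical flaky function that fails twice before succeeding — a stand-in for an unreliable network call (the sleep is dropped for brevity):

```python
import functools

def retry(max_attempts: int = 3):
    """Same three-level pattern as above, delay omitted."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts:
                        raise
        return wrapper
    return decorator

calls = {"count": 0}

@retry(max_attempts=5)
def flaky() -> str:
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("temporary outage")
    return "ok"

print(flaky())          # ok
print(calls["count"])   # 3 — two failures, then success
```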

Stacking decorators

You can apply multiple decorators to a single function. They are applied from bottom to top — the decorator closest to the function definition wraps it first:

@timer
@retry(max_attempts=3)
def fetch_data(url: str) -> dict:
    ...

# Equivalent to:
# fetch_data = timer(retry(max_attempts=3)(fetch_data))

The order matters. Here retry wraps the original function first, then timer wraps the retry logic — so the timer measures total elapsed time, including any retry delays. Swap them and the timer only measures a single attempt.
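
The execution order is easy to verify with a tiny decorator factory that records when each wrapper runs. The announce name here is illustrative, not from any library:

```python
import functools

order = []

def announce(label):
    """Records when its wrapper starts and finishes."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            order.append(f"{label} before")
            result = func(*args, **kwargs)
            order.append(f"{label} after")
            return result
        return wrapper
    return decorator

@announce("outer")
@announce("inner")
def core():
    order.append("core")

core()
print(order)
# ['outer before', 'inner before', 'core', 'inner after', 'outer after']
```

Applied bottom to top, but at call time the outermost wrapper runs first — like peeling an onion from the outside in.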


The standard library is full of them

@property, @staticmethod, @classmethod, and @dataclass are all decorators. One of the most immediately useful is @lru_cache — it memoises a function's return value, making repeated calls with identical arguments effectively free:

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(100))         # Instant. Without cache, this hangs.
print(fib.cache_info())  # CacheInfo(hits=98, misses=101, ...)

Without the cache, fib(100) requires exponential recursive calls. With it, each unique input is computed exactly once. A single line of decoration turns an unusable algorithm into a fast one.
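
maxsize=None grows without bound; with a finite maxsize, the cache evicts the least-recently-used entry, and cache_clear() resets it. A sketch with a hypothetical square function:

```python
from functools import lru_cache

@lru_cache(maxsize=2)   # bounded: least-recently-used entries are evicted
def square(n: int) -> int:
    return n * n

square(1); square(2); square(1)   # 1 is now the most recently used
square(3)                         # evicts 2, the least recently used
info = square.cache_info()
print(info)                       # hits=1, misses=3, maxsize=2, currsize=2

square.cache_clear()              # reset the cache entirely
print(square.cache_info().currsize)  # 0
```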

When decorators get in the way

Decorators can obscure the call stack. When something fails inside a heavily decorated function, the traceback passes through every wrapper layer — which is confusing when third-party libraries have forgotten @wraps. They also make static analysis harder when a decorator fundamentally changes a function's signature, though ParamSpec in Python 3.10+ handles most of that.

The mental model

A decorator is a function transformer. It takes a function, adds behaviour around it, and returns the result. The @ syntax is just a clean way to apply that transformation at the point of definition — nothing more.

Write a timer. Write a retry. Write a cache guard. After a few of your own, the pattern becomes instinctive. You start reaching for decorators not because they're clever, but because they're the right tool: they separate cross-cutting concerns from business logic without inheritance, subclassing, or framework machinery.

