Step 16 of 20
Decorators and Generators
Decorators and generators are two powerful advanced features that distinguish Python from many other languages. Decorators allow you to modify or extend the behavior of functions and classes without changing their source code — they are used extensively in web frameworks like Flask and Django. Generators provide a memory-efficient way to work with large or infinite sequences of data by producing values lazily, one at a time, instead of computing everything upfront. Mastering these features will significantly elevate your Python programming skills.
Understanding Decorators
# A decorator is a function that takes a function and returns a modified version
import time
import functools

def timer(func):
    """Measure execution time of a function."""
    @functools.wraps(func)  # Preserves original function metadata
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        elapsed = time.time() - start
        print(f"{func.__name__} took {elapsed:.4f} seconds")
        return result
    return wrapper

@timer
def slow_function():
    time.sleep(1)
    return "Done"

result = slow_function()  # "slow_function took 1.0012 seconds"

# Without decorator syntax, the @timer line is equivalent to:
# slow_function = timer(slow_function)
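Decorators can also be stacked, and they apply bottom-up: the decorator closest to the function wraps it first. A minimal sketch with two hypothetical decorators (`shout` and `exclaim`, not part of the lesson above) illustrates the order:

```python
import functools

def shout(func):
    """Upper-case the wrapped function's string result."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

def exclaim(func):
    """Append an exclamation mark to the result."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs) + "!"
    return wrapper

@shout
@exclaim
def greet(name):
    return f"hello, {name}"

# exclaim wraps greet first, then shout wraps the result:
# greet = shout(exclaim(greet))
print(greet("ada"))  # HELLO, ADA!
```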
Practical Decorators
# Retry decorator — takes arguments, so it needs an extra level of nesting
def retry(max_attempts=3, delay=1):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    if attempt == max_attempts:
                        raise
                    print(f"Attempt {attempt} failed: {e}. Retrying...")
                    time.sleep(delay)
        return wrapper
    return decorator

@retry(max_attempts=3, delay=0.5)
def fetch_data(url):
    import random
    if random.random() < 0.7:
        raise ConnectionError("Network error")
    return {"data": "success"}
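The three nested functions in a parametrized decorator can be confusing at first. The key insight is that `@retry(max_attempts=3, delay=0.5)` first *calls* `retry(...)`, and the returned `decorator` is what actually wraps the function. A stripped-down sketch with a hypothetical `repeat` factory (not part of the lesson above) makes the two-step application explicit:

```python
import functools

def repeat(times):
    """Decorator factory: calling repeat(3) returns the actual decorator."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Call the wrapped function 'times' times and collect the results
            return [func(*args, **kwargs) for _ in range(times)]
        return wrapper
    return decorator

@repeat(times=3)
def beep():
    return "beep"

print(beep())  # ['beep', 'beep', 'beep']

# The @repeat(times=3) line is equivalent to:
# beep = repeat(times=3)(beep)
```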
# Cache decorator (memoization)
def memoize(func):
    cache = {}
    @functools.wraps(func)
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    return wrapper

@memoize
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(100))  # Instant! Without memoization, this would take forever

# The standard library provides this: @functools.lru_cache (Python 3.2+)
# or @functools.cache (Python 3.9+)
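In practice you would reach for the built-in version rather than a hand-rolled `memoize`. A short sketch of the same Fibonacci example using `functools.lru_cache`, which also exposes cache statistics:

```python
import functools

@functools.lru_cache(maxsize=None)  # maxsize=None means an unbounded cache
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(100))         # 354224848179261915075
print(fibonacci.cache_info()) # hits/misses counters added by lru_cache
```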
Generators
# A generator function uses 'yield' instead of 'return'
def countdown(n):
    while n > 0:
        yield n
        n -= 1

# Calling the function returns a generator object
gen = countdown(5)
print(next(gen))  # 5
print(next(gen))  # 4
print(next(gen))  # 3

# Iterate over the remaining values
for num in gen:
    print(num)  # 2, 1

# Generator for reading large files line by line
def read_large_file(file_path):
    with open(file_path, "r") as f:
        for line in f:
            yield line.strip()

# Memory-efficient processing:
# for line in read_large_file("huge_file.txt"):
#     process(line)  # Only one line in memory at a time

# Infinite generator
def fibonacci_gen():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

# Take the first 10 Fibonacci numbers
fib = fibonacci_gen()
first_10 = [next(fib) for _ in range(10)]
print(first_10)  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
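For simple cases you don't even need a generator function: a generator *expression* gives you the same lazy evaluation with comprehension syntax. A quick sketch:

```python
# Generator expression: like a list comprehension, but with parentheses
# and lazy evaluation — values are produced only when requested
lazy_squares = (n * n for n in range(1_000_000))
print(next(lazy_squares))  # 0
print(next(lazy_squares))  # 1

# Feeding one straight into a reducer never builds the full list in memory
total = sum(n * n for n in range(10))
print(total)  # 285
```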
Generator Pipelines
# Chain generators for data processing pipelines
def integers():
    n = 1
    while True:
        yield n
        n += 1

def squares(nums):
    for n in nums:
        yield n ** 2

def under_100(nums):
    for n in nums:
        if n >= 100:
            return
        yield n

# Pipeline: integers -> square -> filter under 100
pipeline = under_100(squares(integers()))
result = list(pipeline)
print(result)  # [1, 4, 9, 16, 25, 36, 49, 64, 81]

# yield from — delegates to a sub-generator
def flatten(nested):
    for item in nested:
        if isinstance(item, list):
            yield from flatten(item)  # Recursive delegation
        else:
            yield item

data = [1, [2, 3], [4, [5, 6]], 7]
print(list(flatten(data)))  # [1, 2, 3, 4, 5, 6, 7]
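The standard library's `itertools` module provides lazy building blocks for exactly this kind of pipeline, so you often don't need to write the generator functions yourself. A sketch of the same integers-squares-under-100 pipeline using `itertools.count` and `itertools.takewhile`:

```python
import itertools

# itertools.count(1) yields 1, 2, 3, ... forever (lazily)
sq = (n * n for n in itertools.count(1))

# takewhile stops at the first square that is >= 100
bounded = list(itertools.takewhile(lambda s: s < 100, sq))
print(bounded)  # [1, 4, 9, 16, 25, 36, 49, 64, 81]
```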
Pro tip: Use functools.wraps in every decorator to preserve the original function's name, docstring, and other metadata. Without it, debugging and introspection become difficult because the decorated function appears to be the wrapper function instead.
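A quick sketch showing exactly what goes wrong without `functools.wraps` — the wrapped function's name and docstring are replaced by the wrapper's:

```python
import functools

def without_wraps(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def with_wraps(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@without_wraps
def original_a():
    """Docstring A."""

@with_wraps
def original_b():
    """Docstring B."""

print(original_a.__name__)  # wrapper  <- identity lost
print(original_b.__name__)  # original_b
print(original_b.__doc__)   # Docstring B.
```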
Key Takeaways
- Decorators modify function behavior without changing source code; they are functions that take and return functions.
- Use @functools.wraps in decorators to preserve the original function's metadata.
- Generators use yield to produce values lazily, making them memory-efficient for large or infinite sequences.
- Generator pipelines chain multiple generators for clean, memory-efficient data processing.
- Python's built-in @functools.lru_cache provides production-ready memoization without writing your own decorator.