7 Python Decorator Tips to Write Cleaner Code
Image by Editor
Introduction
Often shrouded in mystery at first glance, Python decorators are, at their core, functions wrapped around other functions to provide extra functionality without altering the core logic of the function being “decorated”. Their main added value is keeping the code clean, readable, and concise, while also helping make it more reusable.
This article lists seven decorator tricks that will help you write cleaner code. Several of the examples shown are a great fit for data science and data analysis workflows.
1. Clear Timing with @timer
Ever felt you’re cluttering your code by inserting time() calls here and there to measure how long some heavy processes take, like training a machine learning model or running large data aggregations? The @timer decorator is a cleaner alternative, as shown in the example below. You can replace the commented line of code inside the decorated simulated_training function with the instructions needed to train a machine learning model of your choice, and see how the decorator accurately measures the time taken to execute the function:
import time
from functools import wraps

def timer(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.time() - start:.3f}s")
        return result
    return wrapper

@timer
def simulated_training():
    time.sleep(2)  # pretend to train a machine learning model here
    return "model trained"

simulated_training()
The key behind this trick is, of course, the definition of the wrapper() function inside timer(func).
Most of the examples that follow use this same pattern: first, we define the function that will later be used as a decorator for another function.
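As a minimal sketch of that pattern (the do_nothing decorator name below is hypothetical, used only for illustration), the @ syntax is simply shorthand for passing one function through another by hand:

from functools import wraps

def do_nothing(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        # any "before" logic would go here
        result = func(*args, **kwargs)
        # any "after" logic would go here
        return result
    return wrapper

@do_nothing
def greet():
    return "hello"

# The @do_nothing line above is equivalent to writing:
# greet = do_nothing(greet)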
2. Simpler Debugging with @log_calls
This is a very useful decorator for debugging purposes. It makes identifying the causes of errors or inconsistencies easier by tracking which functions are called throughout your workflow and which arguments are being passed. A great way to save yourself a bunch of print() statements everywhere!
from functools import wraps
import pandas as pd

def log_calls(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__} with {args}, {kwargs}")
        return func(*args, **kwargs)
    return wrapper

@log_calls
def preprocess_data(df, scale=False):
    if not isinstance(df, pd.DataFrame):
        raise TypeError("Input must be a pandas DataFrame")
    return df.copy()

# Simple dataset (pandas DataFrame object) to demonstrate the function
data = {'col1': [1, 2], 'col2': [3, 4]}
sample_df = pd.DataFrame(data)

preprocess_data(sample_df, scale=True)
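A quick aside on why @wraps appears in every recipe here: it preserves the decorated function's metadata, which keeps logs and tracebacks readable. A small check, assuming the preprocess_data definition above:

# Thanks to functools.wraps, the decorated function keeps its own name
print(preprocess_data.__name__)  # prints "preprocess_data", not "wrapper"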
3. Caching with @lru_cache
This is a pre-defined Python decorator we can use directly by importing it from the functools module. It is suitable for wrapping computationally expensive functions, from a recursive Fibonacci computation for a large number to fetching a large dataset, to avoid redundant computations. It is useful when we have several computationally heavy functions and want to avoid manually implementing caching logic inside each of them one by one. LRU stands for “Least Recently Used”, a common caching strategy in Python. See also the functools docs.
from functools import lru_cache

@lru_cache(maxsize=None)
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(35))  # Caching this function call makes its execution much faster
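lru_cache also exposes a couple of helpers that are handy for confirming the cache is actually being used; a quick check with the fibonacci function above:

print(fibonacci.cache_info())  # shows hits, misses, and current cache size
fibonacci.cache_clear()        # empties the cache if memory becomes a concern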
4. Data Type Validations
This decorator saves you from writing repetitive checks for clean data inputs or inputs of the right type. For instance, below we define a custom decorator called @validate_numeric that customizes the error thrown if the checked input is not of a numeric data type. As a result, validations are kept consistent across different functions and parts of the code, and they are elegantly isolated from the core logic, math, and computations:
from functools import wraps

def validate_numeric(func):
    @wraps(func)
    def wrapper(x):
        # Accept ints and floats but reject bools (which are a subclass of int).
        if isinstance(x, bool) or not isinstance(x, (int, float)):
            raise ValueError("Input must be numeric")
        return func(x)
    return wrapper

@validate_numeric
def square_root(x):
    return x ** 0.5

print(square_root(16))
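To see the validation in action, passing a non-numeric input raises the custom error before square_root ever runs; a small check using the definitions above:

try:
    square_root("16")  # a string is not a numeric type
except ValueError as e:
    print(e)  # Input must be numeric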
5. Retry on Failure with @retry
Sometimes your code needs to interact with external components or establish connections to APIs, databases, and so on. These connections can fail for several reasons beyond your control, often seemingly at random. In such cases, retrying the operation a few times is the way to navigate the issue, and the following decorator applies this “retry on failure” strategy a specified number of times, again without mixing it with the core logic of your functions.
import time, random
from functools import wraps

def retry(times=3, delay=1):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last_exc = None
            for attempt in range(1, times + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    last_exc = e
                    print(f"Attempt {attempt} failed: {e}")
                    time.sleep(delay)
            # After exhausting retries, raise the last encountered exception
            raise last_exc
        return wrapper
    return decorator

@retry(times=3)
def fetch_data():
    if random.random() < 0.7:  # fail about 70% of the time
        raise ConnectionError("Network issue")
    return "data fetched"

print(fetch_data())
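A common variation, sketched here as an assumption rather than as part of the recipe above, is to grow the delay between attempts (exponential backoff) and to retry only on specific exception types; the retry_backoff name is hypothetical:

import time
from functools import wraps

def retry_backoff(times=3, delay=1, exceptions=(ConnectionError, TimeoutError)):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last_exc = None
            for attempt in range(1, times + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions as e:
                    last_exc = e
                    # Wait delay, 2*delay, 4*delay, ... between attempts
                    time.sleep(delay * 2 ** (attempt - 1))
            raise last_exc
        return wrapper
    return decorator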
6. Sort Checking with Annotations
Useful for data science workflows, this decorator is designed to ensure that function arguments match their type annotations, and it can be applied automatically to annotated functions to avoid manual double checking. It acts as a kind of “contract enforcement” for those functions, and is very useful for collaborative projects and production-bound data science projects where stricter data typing is important to prevent future issues and bugs.
import inspect
from functools import wraps
from typing import get_type_hints

def enforce_types(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        hints = get_type_hints(func)
        bound = inspect.signature(func).bind_partial(*args, **kwargs)
        # Validate arguments
        for name, value in bound.arguments.items():
            if name in hints and not isinstance(value, hints[name]):
                expected = getattr(hints[name], "__name__", str(hints[name]))
                got = type(value).__name__
                raise TypeError(f"Argument '{name}' expected {expected}, got {got}")
        result = func(*args, **kwargs)
        # Optionally validate the return type
        if "return" in hints and not isinstance(result, hints["return"]):
            expected = getattr(hints["return"], "__name__", str(hints["return"]))
            got = type(result).__name__
            raise TypeError(f"Return value expected {expected}, got {got}")
        return result
    return wrapper

@enforce_types
def add_numbers(a: int, b: int) -> int:
    return a + b

print(add_numbers(3, 4))  # TRY INSTEAD: add_numbers("3", 4)
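As the comment above suggests, calling the function with a wrongly typed argument raises a descriptive TypeError; a quick check assuming the definitions above:

try:
    add_numbers("3", 4)
except TypeError as e:
    print(e)  # Argument 'a' expected int, got str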
7. Monitoring DataFrame Size with @log_shape
In data cleaning and preprocessing workflows, it is common for the dataset shape (number of rows and columns) to change as a result of certain operations. The following decorator is a great way to track how a pandas DataFrame's shape changes after each operation, without constantly printing the shape in different parts of the workflow. In the example below, it is applied to track how dropping rows with missing values affects the dataset's size and shape:
from functools import wraps
import pandas as pd

def log_shape(func):
    @wraps(func)
    def wrapper(df, *args, **kwargs):
        result = func(df, *args, **kwargs)
        print(f"{func.__name__}: {df.shape} → {result.shape}")
        return result
    return wrapper

@log_shape
def drop_missing(df):
    return df.dropna()

df = pd.DataFrame({"a": [1, 2, None], "b": [4, None, 6]})
df = drop_missing(df)
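Because each decorated step logs its own before and after shapes, several preprocessing steps chain naturally; the fill_zeros function below is a hypothetical extra step added only to illustrate the idea:

@log_shape
def fill_zeros(df):
    return df.fillna(0)

df2 = pd.DataFrame({"a": [1, None, 3], "b": [None, 5, 6]})
df2 = drop_missing(df2)  # logs the rows removed by dropna
df2 = fill_zeros(df2)    # logs a shape-preserving step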
Wrapping Up
This article presented seven insightful ways to use and apply Python decorators, highlighting the utility of each one and hinting at how they can add value to data science and related project workflows.