It is often the case that a function's performance can be drastically improved if it remembers previous computations. This kind of optimization is broadly called "memoization", and it is widely applicable to many different algorithms, especially dynamic programming ones1.
However, it is also often the case that reaching into a function and altering its logic is time-consuming and frustrating, especially with a poorly commented function written many hundreds of years ago by a now-defunct software-and-beer conglomerate. Despite such difficulties, we can resort to Python's powerful decorator construct to retrofit previously forgetful functions with memoization.
There are various ways to build a memoizing decorator. Here I opt to use a sweet and classy version2.
class cached:
    def __init__(self, f):
        self.f = f
        self.cache = {}

    def __call__(self, *args):
        if args not in self.cache:
            self.cache[args] = self.f(*args)
        return self.cache[args]
We can now apply this decorator to any function and instantly imbue it with memoization. For instance, computing the nth Fibonacci number recursively is only a modestly bad idea:
@cached
def fib(n):
    return 1 if n < 2 else fib(n-1) + fib(n-2)

print(fib(200))

The 200th Fibonacci number is computed instantly, whereas the naive approach would take far longer than the current age of the universe.
The program above works roughly like this: the function defined under @cached is passed to cached's __init__, and the resulting instance then replaces the function under its original name (observe the __call__ method implementation). The end result is that all recursive calls are transparently intercepted by the memoizing decorator.
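To make the mechanism concrete, the @cached line is just shorthand for rebinding the name by hand. The sketch below repeats the cached class so it runs on its own:

```python
class cached:
    # the memoizing decorator from above, repeated so this snippet is self-contained
    def __init__(self, f):
        self.f = f
        self.cache = {}

    def __call__(self, *args):
        if args not in self.cache:
            self.cache[args] = self.f(*args)
        return self.cache[args]

def fib(n):
    return 1 if n < 2 else fib(n-1) + fib(n-2)

# Equivalent to decorating fib with @cached: the global name "fib" now
# refers to a cached instance, so the recursive calls inside the original
# function body also go through the cache.
fib = cached(fib)

print(type(fib).__name__)  # cached
print(fib(200))
```

Because the recursion looks up fib by name at call time, redirecting the name is enough to capture every recursive call.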
Unfortunately this approach has some shortcomings. Though it can cope with a variable number of positional arguments, cached cannot handle keyword arguments; and more disconcertingly, the reliance on a dictionary cache precludes non-hashable input! Furthermore, if left unfettered, the cache can grow in size indefinitely, presenting a serious problem for long-running programs. Still, the ease of implementation and the ease of use of this caching decorator have kept me content.
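The keyword-argument gap, at least, can be patched without much ceremony: fold the keyword arguments into the cache key. The variant below (cached_kw and power are illustrative names, not part of the original) still requires everything to be hashable:

```python
class cached_kw:
    # A hypothetical variant of cached that also keys on keyword arguments.
    def __init__(self, f):
        self.f = f
        self.cache = {}

    def __call__(self, *args, **kwargs):
        # frozenset of items makes the keyword portion of the key
        # hashable and insensitive to keyword order
        key = (args, frozenset(kwargs.items()))
        if key not in self.cache:
            self.cache[key] = self.f(*args, **kwargs)
        return self.cache[key]

@cached_kw
def power(base, exp=2):
    return base ** exp

print(power(3))         # 9
print(power(3, exp=3))  # 27
```

Note that this sketch does not unify power(3) and power(base=3): they produce distinct keys even though they invoke the function with identical arguments, so the cache may hold duplicates.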
1 Implementing dynamic programming algorithms using this caching decorator is very easy and convenient, but for many algorithms this approach will have quadratic memory demands where linear space would suffice.
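To illustrate the footnote's point with an example not from the post: a memoized recursion for edit distance retains every (i, j) subproblem in its cache, O(nm) entries in total, while a bottom-up version only ever needs the previous row:

```python
def edit_distance(a, b):
    # Bottom-up Levenshtein distance keeping two rows: O(len(b)) space,
    # versus the O(len(a) * len(b)) cache a memoized recursion would retain.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,        # deletion
                            curr[j - 1] + 1,    # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

print(edit_distance("kitten", "sitting"))  # 3
```

The decorator buys convenience; reclaiming the extra space generally means restructuring the algorithm by hand, exactly the kind of surgery the decorator was meant to avoid.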
2 The method presented here makes significant use of syntactic sugar. There are yet other ways to write a caching decorator in Python, but those are the subject of another post.