This module contains a decorator cached() that can be used to cache the results of any Python function to disk.

This is useful when you have functions that take a long time to compute their value, and you want to cache the results of those functions between runs.

Python’s pickle is used to serialize data. All cache files go into the cache/ directory inside your working directory.

@cached uses a cache key function to find out if it has the value for some given function arguments cached on disk. By default, the cache key is simply the string representation of all arguments passed into the function. Thus, the default cache key function looks like this:

def default_cache_key(*args, **kwargs):
    return str(args) + str(sorted(kwargs.items()))
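For example, under this default key function two calls with the same arguments map to the same string, regardless of the order in which keyword arguments are written (the argument values here are arbitrary, chosen only for illustration):

```python
def default_cache_key(*args, **kwargs):
    return str(args) + str(sorted(kwargs.items()))

# hypothetical arguments, chosen only for illustration
key = default_cache_key(3, n_jobs=2, verbose=True)
print(key)  # (3,)[('n_jobs', 2), ('verbose', True)]

# keyword order does not matter, because the items are sorted
assert key == default_cache_key(3, verbose=True, n_jobs=2)
```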

Here is an example use of the cached() decorator:

from nolearn.cache import cached
import math

@cached()
def fac(x):
    print('called!')
    return math.factorial(x)
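To illustrate the mechanics behind the decorator, here is a minimal, self-contained sketch of a pickle-based disk cache. This is not nolearn's implementation, and disk_cached is a hypothetical name; it only demonstrates the idea of computing a cache key, hashing it into a filename, and short-circuiting the call on a cache hit:

```python
import functools
import hashlib
import os
import pickle

def default_cache_key(*args, **kwargs):
    return str(args) + str(sorted(kwargs.items()))

def disk_cached(cache_key=default_cache_key, cache_path='cache'):
    # Hypothetical sketch of a pickle-based disk cache; not nolearn's code.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            os.makedirs(cache_path, exist_ok=True)
            # hash the key so it is safe to use as a filename
            digest = hashlib.md5(
                cache_key(*args, **kwargs).encode('utf-8')).hexdigest()
            filename = os.path.join(
                cache_path, '%s.%s.pkl' % (func.__name__, digest))
            if os.path.exists(filename):
                with open(filename, 'rb') as f:
                    return pickle.load(f)  # cache hit: skip the computation
            result = func(*args, **kwargs)
            with open(filename, 'wb') as f:
                pickle.dump(result, f)  # cache miss: store for next time
            return result
        return wrapper
    return decorator
```

With a decorator like this, the second call with the same arguments is served from disk, so the function body runs only once.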


Often you will want to use a more intelligent cache key, one that takes more things into account. Here’s an example cache key function for a cache decorator used with a transform method of a scikit-learn BaseEstimator:

>>> def transform_cache_key(self, X):
...     return ','.join([
...         str(X[:20]),
...         str(X[-20:]),
...         str(X.shape),
...         str(sorted(self.get_params().items())),
...         ])

This function puts the first and the last twenty rows of the matrix X into the cache key. On top of that, it adds the shape of the matrix (X.shape) and the items of self.get_params(), which for a scikit-learn BaseEstimator is the dictionary of model parameters. This ensures that even when the input matrix is the same, the value is recomputed whenever self.get_params() differs.
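As a quick sanity check, the key function above can be exercised with a small, hypothetical stand-in for a real estimator (TinyEstimator below is invented for this example; it only mimics BaseEstimator's get_params method):

```python
import numpy as np

class TinyEstimator:
    # hypothetical stand-in for a scikit-learn BaseEstimator
    def __init__(self, scale=1.0):
        self.scale = scale

    def get_params(self):
        return {'scale': self.scale}

def transform_cache_key(self, X):
    return ','.join([
        str(X[:20]),
        str(X[-20:]),
        str(X.shape),
        str(sorted(self.get_params().items())),
        ])

est = TinyEstimator(scale=2.0)
X = np.arange(6).reshape(3, 2)
key = transform_cache_key(est, X)

# same X, different parameters -> different key, so the value is recomputed
est.scale = 3.0
assert key != transform_cache_key(est, X)
```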

Your estimator class can then use the decorator like so:

class MyEstimator(BaseEstimator):
    @cached(transform_cache_key)
    def transform(self, X):
        # ...
nolearn.cache.cached(cache_key=default_cache_key, cache_path=None)
