Install the extension with one of the following commands:
$ easy_install Flask-Cache
or alternatively if you have pip installed:
$ pip install Flask-Cache
Cache is managed through a Cache instance:
from flask import Flask
from flask.ext.cache import Cache
app = Flask(__name__)
# Check Configuring Flask-Cache section for more details
cache = Cache(app,config={'CACHE_TYPE': 'simple'})
You may also set up your Cache instance later at configuration time using the init_app() method:
cache = Cache(config={'CACHE_TYPE': 'simple'})
app = Flask(__name__)
cache.init_app(app)
You may also provide an alternate configuration dictionary, useful if there will be multiple Cache instances, each with a different backend:
#: Method A: During instantiation of class
cache = Cache(config={'CACHE_TYPE': 'simple'})
#: Method B: During init_app call
cache.init_app(app, config={'CACHE_TYPE': 'simple'})
New in version 0.7.
To cache view functions you will use the cached() decorator. This decorator will use request.path by default for the cache_key:
@cache.cached(timeout=50)
def index():
    return render_template('index.html')
The cached decorator has another optional argument called unless. This argument accepts a callable that returns True or False. If unless returns True then it will bypass the caching mechanism entirely.
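For example, a minimal sketch (the nocache query parameter is hypothetical) that skips caching whenever the request explicitly asks for fresh data:
from flask import request

def bypass_cache():
    # Returning True makes cached() skip the cache for this request.
    return request.args.get('nocache') == '1'

@cache.cached(timeout=50, unless=bypass_cache)
def index():
    return render_template('index.html')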
Using the same @cached decorator, you are able to cache the result of other non-view related functions. The only stipulation is that you replace the key_prefix, otherwise it will use the request.path cache_key:
@cache.cached(timeout=50, key_prefix='all_comments')
def get_all_comments():
    comments = do_serious_dbio()
    return [x.author for x in comments]

cached_comments = get_all_comments()
See memoize()
In memoization, the function's arguments are also included in the cache_key.
Memoize is also designed for methods, since it will take into account the identity of the ‘self’ or ‘cls’ argument as part of the cache key.
The theory behind memoization is that if you have a function you need to call several times in one request, it will only be calculated the first time that function is called with those arguments. For example, a SQLAlchemy object method that determines whether a user has a role. You might need to call this function many times during a single request. To keep from hitting the database every time this information is needed, you might do something like the following:
class Person(db.Model):

    @cache.memoize(50)
    def has_membership(self, role_id):
        return Group.query.filter_by(user=self, role_id=role_id).count() >= 1
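A short usage sketch (the Person instance and role ids are hypothetical): repeated calls with the same arguments within the timeout are served from the cache instead of hitting the database again.
person = Person.query.get(1)
person.has_membership(2)   # runs the query and caches the result
person.has_membership(2)   # returned from the cache
person.has_membership(3)   # different arguments, so a separate cache entry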
Warning
Using mutable objects (classes, etc.) as part of the cache key can become tricky. It is suggested not to pass an object instance into a memoized function. However, memoize does perform a repr() on the passed-in arguments, so if the object has a __repr__ method that returns a uniquely identifying string for that object, it will be used as part of the cache key.
For example, a SQLAlchemy Person object that returns the database id as part of the unique identifier:
class Person(db.Model):

    def __repr__(self):
        return "%s(%s)" % (self.__class__.__name__, self.id)
New in version 0.2.
You might need to delete the cache on a per-function basis. Using the above example, let's say you change a user's permissions and assign them to a role, but now you need to re-calculate whether they have certain memberships or not. You can do this with the delete_memoized() function:
cache.delete_memoized('user_has_membership')
Note
If only the function name is given as a parameter, all the memoized versions of it will be invalidated. However, you can delete a specific cache entry by providing the same parameter values as when caching. In the following example only the user/role cache is deleted:
user_has_membership('demo', 'admin')
user_has_membership('demo', 'user')
cache.delete_memoized('user_has_membership', 'demo', 'user')
Usage:
{% cache [timeout [,[key1, [key2, ...]]]] %}
...
{% endcache %}
By default, the value of “path to template file” + “block start line” is used as the cache key. The key name can also be set manually. Keys are concatenated together into a single string, which can be used to avoid the same block being evaluated in different templates.
Set timeout to “del” to delete cached value:
{% cache 'del' %}...
If keys are provided, you may easily generate the template fragment key and delete it from outside of the template context:
from flask.ext.cache import make_template_fragment_key
key = make_template_fragment_key("key1", vary_on=["key2", "key3"])
cache.delete(key)
Example:
Consider that we have render_form_field and render_submit macros.
{% cache 60*5 %}
<div>
    <form>
        {{ render_form_field(form.username) }}
        {{ render_submit() }}
    </form>
</div>
{% endcache %}
See clear().
Here’s an example script to empty your application’s cache:
from flask.ext.cache import Cache
from yourapp import app, your_cache_config
cache = Cache()
def main():
    cache.init_app(app, config=your_cache_config)

    with app.app_context():
        cache.clear()

if __name__ == '__main__':
    main()
Warning
Some backend implementations do not support completely clearing the cache. Also, if you’re not using a key prefix, some implementations (e.g. Redis) will flush the whole database. Make sure you’re not storing any other data in your caching database.
The following configuration values exist for Flask-Cache:
CACHE_TYPE | Specifies which type of caching object to use. This is an import string that will be imported and instantiated. It is assumed that the imported object is a function that will return a cache object that adheres to the werkzeug cache API. For werkzeug.contrib.cache objects, you do not need to specify the entire import string, just one of the built-in names described in the backend sections below. |
CACHE_NO_NULL_WARNING | Silences the warning message when using cache type of ‘null’. |
CACHE_ARGS | Optional list to unpack and pass during the cache class instantiation. |
CACHE_OPTIONS | Optional dictionary to pass during the cache class instantiation. |
CACHE_DEFAULT_TIMEOUT | The default timeout that is used if no timeout is specified. Unit of time is seconds. |
CACHE_THRESHOLD | The maximum number of items the cache will store before it starts deleting some. Used only for SimpleCache and FileSystemCache. |
CACHE_KEY_PREFIX | A prefix that is added before all keys. This makes it possible to use the same memcached server for different apps. Used only for RedisCache, MemcachedCache and GAEMemcachedCache. |
CACHE_MEMCACHED_SERVERS | A list or a tuple of server addresses. Used only for MemcachedCache. |
CACHE_MEMCACHED_USERNAME | Username for SASL authentication with memcached. Used only for SASLMemcachedCache |
CACHE_MEMCACHED_PASSWORD | Password for SASL authentication with memcached. Used only for SASLMemcachedCache |
CACHE_REDIS_HOST | A Redis server host. Used only for RedisCache. |
CACHE_REDIS_PORT | A Redis server port. Default is 6379. Used only for RedisCache. |
CACHE_REDIS_PASSWORD | A Redis server password. Used only for RedisCache. |
CACHE_REDIS_DB | A Redis db (zero-based number index). Default is 0. Used only for RedisCache. |
CACHE_DIR | Directory to store cache. Used only for FileSystemCache. |
CACHE_REDIS_URL | URL to connect to the Redis server. Example: redis://user:password@localhost:6379/2. Used only for RedisCache. |
In addition, the standard Flask TESTING configuration option is used. If this is True, Flask-Cache will use NullCache only.
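As an illustration, a minimal sketch of configuring a Redis backend through app.config (host, port, prefix and timeout values are placeholders; 'redis' is assumed to be the built-in backend name):
from flask import Flask
from flask.ext.cache import Cache

app = Flask(__name__)
app.config.update(
    CACHE_TYPE='redis',
    CACHE_REDIS_HOST='localhost',
    CACHE_REDIS_PORT=6379,
    CACHE_REDIS_DB=0,
    CACHE_KEY_PREFIX='myapp_',
    CACHE_DEFAULT_TIMEOUT=300,
)
# Cache() reads the CACHE_* values from app.config.
cache = Cache(app)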
Uses a local Python dictionary for caching. This is not really thread safe.
Relevant configuration values: CACHE_DEFAULT_TIMEOUT, CACHE_THRESHOLD
Uses the filesystem to store cached values.
Uses a memcached server as a backend. Supports either the pylibmc or memcache library, or the Google App Engine memcache library.
Relevant configuration values: CACHE_DEFAULT_TIMEOUT, CACHE_KEY_PREFIX, CACHE_MEMCACHED_SERVERS
This is MemcachedCache under a different name.
Uses a memcached server as a backend. Intended to be used with a SASL enabled connection to the memcached server. pylibmc is required, and SASL must be supported by libmemcached.
Relevant configuration values: CACHE_DEFAULT_TIMEOUT, CACHE_MEMCACHED_SERVERS, CACHE_MEMCACHED_USERNAME, CACHE_MEMCACHED_PASSWORD
New in version 0.10.
Same as SASLMemcachedCache, however it has the ability to spread values across multiple keys if they are bigger than the memcached threshold, which by default is 1M. Uses pickle.
New in version 0.11.
You are able to easily add your own custom cache backends by exposing a function that can instantiate and return a cache object. CACHE_TYPE will be the import string to your custom function. It should expect to receive four arguments: app, config, args, and kwargs.
Your custom cache object must also subclass the werkzeug.contrib.cache.BaseCache class. Flask-Cache will make sure that threshold is already included in the kwargs options dictionary since it is common to all BaseCache classes.
An example Redis cache implementation:
#: the_app/custom.py
from werkzeug.contrib.cache import BaseCache

class RedisCache(BaseCache):
    def __init__(self, servers, default_timeout=500):
        pass

def redis(app, config, args, kwargs):
    args.append(app.config['REDIS_SERVERS'])
    return RedisCache(*args, **kwargs)
With this example, your CACHE_TYPE might be the_app.custom.redis
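A hypothetical wiring of that backend (REDIS_SERVERS is the app-specific key read by the factory above):
app.config['CACHE_TYPE'] = 'the_app.custom.redis'
app.config['REDIS_SERVERS'] = ['localhost:6379']
cache = Cache(app)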
An example PylibMC cache implementation to change binary setting and provide username/password if SASL is enabled on the library:
#: the_app/custom.py
import pylibmc

def pylibmccache(app, config, args, kwargs):
    return pylibmc.Client(servers=config['CACHE_MEMCACHED_SERVERS'],
                          username=config['CACHE_MEMCACHED_USERNAME'],
                          password=config['CACHE_MEMCACHED_PASSWORD'],
                          binary=True)
With this example, your CACHE_TYPE might be the_app.custom.pylibmccache
This class is used to control the cache objects.
This is used to initialize cache with your app object.
Proxy functions (get, set, add, delete, delete_many, get_many, set_many, clear) for the internal cache object.
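For example, a minimal sketch of calling the proxied methods directly (the key name and value are arbitrary):
cache.set('site_stats', {'users': 42}, timeout=300)
stats = cache.get('site_stats')   # -> {'users': 42} until the timeout expires
cache.delete('site_stats')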
Decorator. Use this to cache a function. By default the cache key is view/request.path. You are able to use this decorator with any function by changing the key_prefix. If the token %s is located within the key_prefix, it will be replaced with request.path.
Example:
# An example view function
@cache.cached(timeout=50)
def big_foo():
    return big_bar_calc()

# An example misc function to cache.
@cache.cached(key_prefix='MyCachedList')
def get_list():
    return [random.randrange(0, 1) for i in range(50000)]

my_list = get_list()
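A small sketch of the %s token (the view name is hypothetical): the token is replaced with request.path, so each path gets its own cache entry while keeping a recognizable prefix.
@cache.cached(timeout=50, key_prefix='view/%s')
def per_path_view():
    return render_template('page.html')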
Note
You MUST have a request context to actually call any functions that are cached.
New in version 0.4: The returned decorated function now has three function attributes assigned to it. These attributes are readable/writable.
- uncached
- The original undecorated function
- cache_timeout
- The cache timeout value for this function. For a custom value to take effect, this must be set before the function is called.
- make_cache_key
- A function used in generating the cache_key used.
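As a sketch, using the big_foo view from the example above:
fresh_value = big_foo.uncached()   # call the original, undecorated function
big_foo.cache_timeout = 300        # new timeout used for subsequent cached calls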
Use this to cache the result of a function, taking its arguments into account in the cache key.
Information on Memoization.
Example:
@cache.memoize(timeout=50)
def big_foo(a, b):
    return a + b + random.randrange(0, 1000)
>>> big_foo(5, 2)
753
>>> big_foo(5, 3)
234
>>> big_foo(5, 2)
753
New in version 0.4: The returned decorated function now has three function attributes assigned to it.
- uncached
- The original undecorated function. (readable only)
- cache_timeout
- The cache timeout value for this function. For a custom value to take effect, this must be set before the function is called. (readable and writable)
- make_cache_key
- A function used in generating the cache_key used. (readable and writable)
New in version 0.5: params make_name, unless
Deletes the specified function's caches, based on the given parameters. If parameters are given, only the cache entries that were memoized with them will be erased. Otherwise, all versions of the cache will be forgotten.
Example:
@cache.memoize(50)
def random_func():
    return random.randrange(1, 50)

@cache.memoize()
def param_func(a, b):
    return a + b + random.randrange(1, 50)
>>> random_func()
43
>>> random_func()
43
>>> cache.delete_memoized('random_func')
>>> random_func()
16
>>> param_func(1, 2)
32
>>> param_func(1, 2)
32
>>> param_func(2, 2)
47
>>> cache.delete_memoized('param_func', 1, 2)
>>> param_func(1, 2)
13
>>> param_func(2, 2)
47
delete_memoized is also smart about instance methods vs. class methods.
When passing an instance method, it will only clear the cache related to that instance of that object (object uniqueness can be overridden by defining the __repr__ method, such as a user id).
When passing a class method, it will clear all caches related across all instances of that class.
Example:
class Adder(object):

    @cache.memoize()
    def add(self, b):
        return b + random.random()
>>> adder1 = Adder()
>>> adder2 = Adder()
>>> adder1.add(3)
3.23214234
>>> adder2.add(3)
3.60898509
>>> cache.delete_memoized(adder1.add)
>>> adder1.add(3)
3.01348673
>>> adder2.add(3)
3.60898509
>>> cache.delete_memoized(Adder.add)
>>> adder1.add(3)
3.53235667
>>> adder2.add(3)
3.72341788
Note
Flask-Cache uses inspect to order kwargs into positional args when the function is memoized. If you pass a function reference into fname instead of the function name, Flask-Cache will be able to place the args/kwargs in the proper order, and delete the positional cache.
However, if delete_memoized is just called with the name of the function, be sure to pass in potential arguments in the same order as defined in your function as args only, otherwise Flask-Cache will not be able to compute the same cache key.
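Using param_func from the example above, both forms below target the same cache entry; the function reference form is the safer one, because Flask-Cache can then reorder keyword arguments itself:
cache.delete_memoized(param_func, 1, 2)      # function reference: kwargs may be reordered
cache.delete_memoized('param_func', 1, 2)    # name only: pass args positionally, in definition order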
Note
Flask-Cache maintains an internal random version hash for the function. Using delete_memoized will only swap out the version hash, causing the memoize function to recompute results and put them into another key.
This leaves any computed caches for this memoized function within the caching backend.
It is recommended to use a very high timeout with memoize if using this function, so that when the version hash is swapped, the old cached results will eventually be reclaimed by the caching backend.
Delete the version hash associated with the function.
Warning
Performing this operation could leave keys behind that have been created with this version hash. It is up to the application to make sure that all keys that may have been created with this version hash at least have timeouts, so they will not sit orphaned in the cache backend.
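A minimal sketch, assuming the method is exposed on the Cache instance as delete_memoized_verhash and reusing random_func from the example above:
cache.delete_memoized_verhash(random_func)   # old entries remain until their timeouts expire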
Uses base64 for memoize caching. This fixes rare issues where the cache_key was either a tuple or larger than the caching backend would be able to support.
Adds support for deleting memoized caches optionally based on function parameters.
Python 2.5 compatibility, plus bugfix with string.format.
Added the ability to retrieve memoized function names or cache keys.
Bugfix release. Fixes a bug that would cause an exception if no CACHE_TYPE was supplied.
PyPI egg fix.