client module.
CacheClient serves as a mediator between a single entry point that implements the Cache contract and one or many namespaces targeted at concrete cache implementations.
CacheClient lets you partition the application cache by namespace, effectively hiding implementation details from client code.
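The namespace routing idea can be sketched as a minimal mediator over plain dict-backed caches. All class and method names below are illustrative only, not the library's actual API:

```python
# Illustrative sketch only; names here are hypothetical and do not
# reflect the library's actual API.

class DictCache(object):
    """A trivial dict-backed cache used for illustration."""

    def __init__(self):
        self.items = {}

    def set(self, key, value):
        self.items[key] = value
        return True

    def get(self, key):
        return self.items.get(key)


class NamespaceClient(object):
    """Routes each operation to the cache registered for a namespace."""

    def __init__(self, namespaces, default_namespace):
        self.namespaces = namespaces
        self.default_namespace = default_namespace

    def set(self, key, value, namespace=None):
        cache = self.namespaces[namespace or self.default_namespace]
        return cache.set(key, value)

    def get(self, key, namespace=None):
        cache = self.namespaces[namespace or self.default_namespace]
        return cache.get(key)


client = NamespaceClient(
    {'sessions': DictCache(), 'pages': DictCache()},
    default_namespace='pages')
client.set('k', 'v', namespace='sessions')
```

Client code only names a namespace; which concrete backend serves it stays an implementation detail of the mediator.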
Sets a key’s value, if and only if the item is not already in the cache.
Adds multiple values at once, with no effect for keys already in cache.
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
Replaces a key’s value, failing if the item isn’t already in the cache.
Replaces multiple values at once, with no effect for keys not in cache.
dependency module.
CacheDependency introduces a link between cache items so they can be invalidated in a single operation, thus simplifying the code necessary to manage dependencies in cache.
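The dependency idea can be sketched over a plain dict: a master key accumulates the keys linked to it, so one delete call invalidates the whole group. The Dependency class below is a hypothetical illustration, not the library's API:

```python
# Hypothetical illustration of the dependency idea; not the
# library's actual API.

class Dependency(object):
    def __init__(self, cache, master_key):
        self.cache = cache
        self.master_key = master_key

    def add(self, key):
        # remember *key* under the master key
        self.cache.setdefault(self.master_key, []).append(key)

    def delete(self):
        # invalidate every linked key in a single operation
        for key in self.cache.pop(self.master_key, []):
            self.cache.pop(key, None)
        return True


cache = {'a': 1, 'b': 2, 'c': 3}
d = Dependency(cache, 'master')
d.add('a')
d.add('b')
d.delete()
```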
encoding module.
Encodes key with base64 encoding.
>>> result = base64_encode(string_type('my key'))
>>> result == 'bXkga2V5'.encode('latin1')
True
Encodes all keys in mapping with the key_encode callable. Returns a tuple of key mapping (encoded key => key) and value mapping (encoded key => value).
>>> mapping = {'k1': 1, 'k2': 2}
>>> keys, mapping = encode_keys(mapping,
... lambda k: str(base64_encode(k).decode('latin1')))
>>> sorted(keys.items())
[('azE=', 'k1'), ('azI=', 'k2')]
>>> sorted(mapping.items())
[('azE=', 1), ('azI=', 2)]
Encodes a key with the given hash function.
See the list of available hashes in the hashlib module of the Python Standard Library.
Additional algorithms may also be available depending on the OpenSSL library that Python uses on your platform.
>>> try:
... from hashlib import sha1
... key_encode = hash_encode(sha1)
... r = base64_encode(key_encode(string_type('my key')))
... assert r == 'RigVwkWdSuGyFu7au08PzUMloU8='.encode('latin1')
... except ImportError: # Python2.4
... pass
lockout module.
A container of various attributes used by lockout.
Used to define lockout terms.
A lockout is used to enforce terms of use policy.
A decorator that forbids access (by a call to forbid_action) to func once the counter threshold is reached (the lock is set).
You can override the default forbid action via the action argument.
See test_lockout.py for an example.
A guard decorator is applied to a func that returns a boolean indicating success or failure. Each failure increments the counter. Counters that support reset (and the related locks) are deleted on success.
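The guard idea can be sketched in plain Python. The decorator below is a simplified, hypothetical version that keeps counters in a local dict (the library keeps them in cache); all names are illustrative:

```python
# Hypothetical sketch of the guard idea: each falsy return bumps a
# per-key counter; once the threshold is reached, further calls are
# forbidden; a truthy return resets the counter. Not the library's API.

def guard(threshold, forbid_action):
    def decorate(func):
        counters = {}

        def wrapper(key, *args, **kwargs):
            if counters.get(key, 0) >= threshold:
                return forbid_action(key)   # locked out
            result = func(key, *args, **kwargs)
            if result:
                counters.pop(key, None)     # success resets the counter
            else:
                counters[key] = counters.get(key, 0) + 1
            return result
        return wrapper
    return decorate


@guard(threshold=2, forbid_action=lambda key: 'forbidden')
def signin(key, password):
    return password == 'secret'
```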
logging module.
memcache module.
A wrapper around the python-memcache Client that adapts it to the cache contract.
Sets a key’s value, if and only if the item is not already in the cache.
Adds multiple values at once, with no effect for keys already in cache.
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
Replaces a key’s value, failing if the item isn’t already in the cache.
Replaces multiple values at once, with no effect for keys not in cache.
memory module.
A single cache item stored in cache.
Effectively implements an in-memory cache.
Sets a key’s value, if and only if the item is not already in the cache.
>>> c = MemoryCache()
>>> c.add('k', 'v', 100)
True
>>> c.add('k', 'v', 100)
False
Adds multiple values at once, with no effect for keys already in cache.
>>> c = MemoryCache()
>>> c.add_multi({'k': 'v'}, 100)
[]
>>> c.add_multi({'k': 'v'}, 100)
['k']
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = MemoryCache()
>>> c.decr('k')
>>> c.decr('k', initial_value=10)
9
>>> c.decr('k')
8
Deletes a key from cache.
If the key is not found, returns False
>>> c = MemoryCache()
>>> c.delete('k')
False
>>> c.store('k', 'v', 100)
True
>>> c.delete('k')
True
There is an item in the cache that has expired
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.delete('k')
False
Deletes multiple keys at once.
>>> c = MemoryCache()
>>> c.delete_multi(('k1', 'k2', 'k3'))
True
>>> c.store_multi({'k1':1, 'k2': 2}, 100)
[]
>>> c.delete_multi(('k1', 'k2'))
True
There is an item in the cache that has expired
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.get_multi(('k', ))
{}
Deletes everything in cache.
>>> c = MemoryCache()
>>> c.set_multi({'k1': 1, 'k2': 2}, 100)
[]
>>> c.flush_all()
True
Looks up a single key.
If the key is not found, returns None
>>> c = MemoryCache()
>>> c.get('k')
Otherwise returns the value
>>> c.set('k', 'v', 100)
True
>>> c.get('k')
'v'
There is an item in the cache that has expired
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.get('k')
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
>>> c = MemoryCache()
>>> c.get_multi(('k1', 'k2', 'k3'))
{}
>>> c.store('k1', 'v1', 100)
True
>>> c.store('k2', 'v2', 100)
True
>>> sorted(c.get_multi(('k1', 'k2')).items())
[('k1', 'v1'), ('k2', 'v2')]
There is an item in the cache that has expired
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.get_multi(('k', ))
{}
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = MemoryCache()
>>> c.incr('k')
>>> c.incr('k', initial_value=0)
1
>>> c.incr('k')
2
There is an item in the cache that has expired
>>> c.items['k'] = CacheItem('k', 1, 1)
>>> c.incr('k')
Replaces a key’s value, failing if the item isn’t already in the cache.
>>> c = MemoryCache()
>>> c.replace('k', 'v', 100)
False
>>> c.add('k', 'v', 100)
True
>>> c.replace('k', 'v', 100)
True
Replaces multiple values at once, with no effect for keys not in cache.
>>> c = MemoryCache()
>>> c.replace_multi({'k': 'v'}, 100)
['k']
>>> c.add_multi({'k': 'v'}, 100)
[]
>>> c.replace_multi({'k': 'v'}, 100)
[]
Sets a key’s value, regardless of previous contents in cache.
>>> c = MemoryCache()
>>> c.set('k', 'v', 100)
True
Sets multiple keys’ values at once.
>>> c = MemoryCache()
>>> c.set_multi({'k1': 1, 'k2': 2}, 100)
[]
There is an item in the cache that has expired
>>> c = MemoryCache()
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.store('k', 'v', 100)
True
There is an item in expire_buckets that has expired
>>> c = MemoryCache()
>>> i = int((int(unixtime()) % c.period)
... / c.interval) - 1
>>> c.expire_buckets[i] = (allocate_lock(), [('x', 10)])
>>> c.store('k', 'v', 100)
True
There is an item in the cache that has expired
>>> c = MemoryCache()
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.store_multi({'k': 'v'}, 100)
[]
There is an item in expire_buckets that has expired
>>> c = MemoryCache()
>>> i = int((int(unixtime()) % c.period)
... / c.interval) - 1
>>> c.expire_buckets[i] = (allocate_lock(), [('x', 10)])
>>> c.store_multi({'k': 'v'}, 100)
[]
time is below one month
>>> expires(10, 1)
11
more than a month
>>> expires(10, 3000000)
3000000
otherwise
>>> expires(0, 0)
2147483647
>>> expires(0, -1)
2147483647
If there are no expired items in the bucket, returns an empty list
>>> bucket_items = [('k1', 1), ('k2', 2), ('k3', 3)]
>>> find_expired(bucket_items, 0)
[]
>>> bucket_items
[('k1', 1), ('k2', 2), ('k3', 3)]
Expired items are returned in the list and deleted from the bucket
>>> find_expired(bucket_items, 2)
['k1']
>>> bucket_items
[('k2', 2), ('k3', 3)]
interface module.
NullCache is a cache implementation that doesn’t actually do anything; it silently performs cache operations that result in no change to state.
Sets a key’s value, if and only if the item is not already in the cache.
>>> c = NullCache()
>>> c.add('k', 'v')
True
Adds multiple values at once, with no effect for keys already in cache.
>>> c = NullCache()
>>> c.add_multi({})
[]
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = NullCache()
>>> c.decr('k')
Deletes a key from cache.
>>> c = NullCache()
>>> c.delete('k')
True
Deletes multiple keys at once.
>>> c = NullCache()
>>> c.delete_multi([])
True
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
>>> c = NullCache()
>>> c.get_multi([])
{}
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = NullCache()
>>> c.incr('k')
Replaces a key’s value, failing if the item isn’t already in the cache.
>>> c = NullCache()
>>> c.replace('k', 'v')
True
Replaces multiple values at once, with no effect for keys not in cache.
>>> c = NullCache()
>>> c.replace_multi({})
[]
patterns module.
Specializes access to cache by using a number of common settings for various cache operations and patterns.
Sets a key’s value, if and only if the item is not already in the cache.
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
Cache Pattern: get an item by key from cache; if it is not available, use create_factory to acquire one. If the result is not None, use the cache add operation to store it, and if the operation succeeds, use dependency_key_factory to get an instance of dependency_key to link with key.
Cache Pattern: get an item by key from cache; if it is not available, see one_pass_create.
Cache Pattern: get an item by key from cache; if it is not available, use create_factory to acquire one. If the result is not None, use the cache set operation to store it and use dependency_key_factory to get an instance of dependency_key to link with key.
Cache Pattern: get_multi items (by make_key over args) from cache; if any are missing, use create_factory to acquire them. If results are available, use the cache set_multi operation to store them; return the cached items, if any.
Cache Pattern: try to enter one pass: (1) if entered, use create_factory to get a value; if the result is not None, use the cache set operation to store it and use dependency_key_factory to get an instance of dependency_key to link with key; (2) if not entered, wait until the one pass is available and, if it has not timed out, get the item by key from cache.
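The core of the get-or-create pattern can be sketched over a plain dict; time and dependency_key handling from the descriptions above are omitted, and the function name is illustrative:

```python
# A minimal sketch of the get-or-create pattern over a plain dict;
# expiration time and dependency_key handling are intentionally omitted.

def get_or_create(cache, key, create_factory):
    result = cache.get(key)
    if result is None:
        # miss: acquire the value from the factory
        result = create_factory()
        if result is not None:
            # store it via the cache set operation
            cache[key] = result
    return result


cache = {}
value = get_or_create(cache, 'k', lambda: 42)
```

On a subsequent call with the same key the factory is not invoked; the cached value is returned.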
Replaces multiple values at once, with no effect for keys not in cache.
Sets a key’s value, regardless of previous contents in cache.
Returns a specialized decorator for the get_or_add cache pattern.
Example:
kb = key_builder('repo')
cached = Cached(cache, kb, time=60)

@cached.wraps_get_or_add
def list_items(self, locale):
    pass
Returns a specialized decorator for the get_or_create cache pattern.
Example:
kb = key_builder('repo')
cached = Cached(cache, kb, time=60)

@cached.wraps_get_or_create
def list_items(self, locale):
    pass
A solution to the Thundering Herd problem.
see http://en.wikipedia.org/wiki/Thundering_herd_problem
Typical use:
with OnePass(cache, 'op:' + key) as one_pass:
    if one_pass.acquired:
        # update *key* in cache
    elif one_pass.wait():
        # obtain *key* from cache
    else:
        # timeout
Returns a key builder that allows building a make cache key function at runtime.
>>> def list_items(self, locale='en', sort_order=1):
... pass
>>> repo_key_builder = key_builder('repo')
>>> make_key = repo_key_builder(list_items)
>>> make_key('self')
"repo-list_items:'en':1"
>>> make_key('self', 'uk')
"repo-list_items:'uk':1"
>>> make_key('self', sort_order=0)
"repo-list_items:'en':0"
Here is an example of a make key function:
def key_list_items(self, locale='en', sort_order=1):
    return "repo-list_items:%r:%r" % (locale, sort_order)
pylibmc module.
A wrapper around the pylibmc Client that adapts it to the cache contract.
Sets a key’s value, if and only if the item is not already in the cache.
Adds multiple values at once, with no effect for keys already in cache.
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
Replaces a key’s value, failing if the item isn’t already in the cache.
Replaces multiple values at once, with no effect for keys not in cache.
utils module.
Returns the total number of seconds for the given delta.
delta can be a datetime.timedelta:
>>> total_seconds(timedelta(hours=2))
7200
or int:
>>> total_seconds(100)
100
otherwise raises TypeError.
>>> total_seconds('100')
Traceback (most recent call last):
...
TypeError: ...