kv-caches

Caches can speed up access to stores considerably when used correctly. Usually this means combining two KeyValueStore instances of the same or a different kind. A typical example is a store that uses a MemcacheStore in front of a FilesystemStore:

from simplekv.memory.memcachestore import MemcacheStore
from simplekv.fs import FilesystemStore
from simplekv.cache import CacheDecorator

import memcache

# initialize memcache instance
mc = memcache.Client(['localhost:11211'])

store = CacheDecorator(
  cache=MemcacheStore(mc),
  store=FilesystemStore('.')
)

# will store the value in the FilesystemStore
store.put('some_value', b'123')

# fetches from the FilesystemStore, but caches the result
print(store.get('some_value'))

# any further calls to store.get('some_value') will be served from the
# MemcacheStore now
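
The same pattern works with any pair of stores. The following is a minimal, self-contained sketch that needs no running memcached: it uses simplekv's in-memory DictStore as the cache and a throwaway temporary directory as the backing store, and shows that a read populates the cache:

import tempfile

from simplekv.cache import CacheDecorator
from simplekv.fs import FilesystemStore
from simplekv.memory import DictStore

cache = DictStore()                              # in-memory cache
backing = FilesystemStore(tempfile.mkdtemp())    # throwaway backing directory

store = CacheDecorator(cache=cache, store=backing)

store.put('greeting', b'hello')     # written to the backing store
print(store.get('greeting'))        # cache miss: read from the backing store, then cached
print('greeting' in cache)          # True -- the read populated the cache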
class simplekv.cache.CacheDecorator(cache, store)

Write-through cache decorator.

Combines two KeyValueStore instances into a single caching KeyValueStore. On a read request, the cache is consulted first; on a cache miss or error, the data is read from the backing store. Any retrieved value is stored in the cache before being forwarded to the client.

Write, key-iteration and delete requests are passed straight to the backing store; after they complete, the cache is updated accordingly.

No cache maintenance beyond this is done by the decorator. The caching store itself decides how large the cache may grow and which data to keep or discard. A simplified sketch of this delegation follows the parameter list below.

Parameters:
  • cache – The caching backend.
  • store – The backing store. This is the “authoritative” backend.
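
The delegation can be pictured roughly like the following simplified sketch. It illustrates the write-through pattern only and is not the actual implementation; error handling and streaming behave as described for the individual methods below.

class WriteThroughSketch:
    """Rough illustration of the write-through pattern used by CacheDecorator."""

    def __init__(self, cache, store):
        self.cache = cache
        self.store = store

    def get(self, key):
        try:
            return self.cache.get(key)         # consult the cache first
        except KeyError:
            value = self.store.get(key)        # cache miss: authoritative read
            self.cache.put(key, value)         # populate the cache for next time
            return value
        except IOError:
            return self.store.get(key)         # cache trouble: ignore the cache entirely

    def put(self, key, data):
        key = self.store.put(key, data)        # writes go straight to the backing store
        self.cache.delete(key)                 # invalidate any stale cached copy
        return key

    def delete(self, key):
        self.store.delete(key)                 # delete from the backing store...
        self.cache.delete(key)                 # ...and drop the cached copy as well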
delete(key)

Implementation of delete().

If an exception occurs in either the cache or the backing store, it is passed on.
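
Continuing the DictStore/FilesystemStore sketch from the top of this section, a delete removes the key from both stores:

store.delete('greeting')        # removed from the backing store
print('greeting' in backing)    # False
print('greeting' in cache)      # False -- the cached copy is gone as well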

get(key)

Implementation of get().

If a cache miss occurs, the value is retrieved, stored in the cache and returned.

If the cache raises an IOError, the cache is ignored, and the backing store is consulted directly.

A caching error may also occur while storing the retrieved value in the cache; such an error is not handled either.
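
The fallback can be demonstrated with a hypothetical BrokenCache (defined here only for illustration) whose reads always raise IOError; get() then silently serves from the backing store:

from simplekv.cache import CacheDecorator
from simplekv.memory import DictStore


class BrokenCache(DictStore):
    """Hypothetical cache whose reads always fail."""

    def get(self, key):
        raise IOError('cache unavailable')


backend = DictStore()
backend.put('k', b'v')

degraded = CacheDecorator(cache=BrokenCache(), store=backend)
print(degraded.get('k'))    # b'v' -- the IOError is swallowed, the backing store answers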

get_file(key, file)

Implementation of get_file().

If a cache miss occurs, the value is retrieved, stored in the cache and returned.

If the cache raises an IOError, the retrieval cannot proceed: if file was an open file, data may already have been written to it. The IOError bubbles up.

A caching error may also occur while storing the retrieved value in the cache; such an error is not handled either.
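
A typical call passes an open, writable file object; in this sketch (continuing the setup from the top of the section) an in-memory BytesIO buffer receives the data:

from io import BytesIO

store.put('notes', b'remember the milk')
buf = BytesIO()
store.get_file('notes', buf)    # 'notes' is fetched, cached, and written into buf
print(buf.getvalue())           # b'remember the milk'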

open(key)

Implementation of open().

If a cache miss occurs, the value is retrieved, stored in the cache, and then another open is issued on the cache.

If the cache raises an IOError, the cache is ignored, and the backing store is consulted directly.

A caching error may also occur while storing the retrieved value in the cache; such an error is not handled either.
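
For example, continuing the same setup:

store.put('blob', b'binary data')
handle = store.open('blob')    # cache miss: fetched and cached, then opened from the cache
print(handle.read())           # b'binary data'
handle.close()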

put(key, data)

Implementation of put().

Stores the value in the backing store. Whether the store attempt succeeds or fails, the cache is then invalidated by deleting the key from it.
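
The invalidation can be observed directly in the sketch from the top of this section:

store.put('counter', b'1')     # written to the backing store
print('counter' in cache)      # False -- any cached copy was invalidated
print(store.get('counter'))    # b'1', read from the backing store and cached again
print('counter' in cache)      # True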

put_file(key, file)

Implementation of put_file().

Stores the value in the backing store. Whether the store attempt succeeds or fails, the cache is then invalidated by deleting the key from it.
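
put_file() behaves the same way; a short sketch, again continuing the setup from the top of the section and streaming from an in-memory file object:

from io import BytesIO

store.put_file('report', BytesIO(b'a larger payload'))    # streamed to the backing store
print('report' in cache)                                  # False until the next read
print(store.get('report'))                                # b'a larger payload', now cached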