large_image.cache_util package¶
Submodules¶
large_image.cache_util.base module¶
- class large_image.cache_util.base.BaseCache(*args, getsizeof=None, **kwargs)[source]¶
Bases: Cache
Base interface to cachetools.Cache for use with large-image.
- property curritems¶
- property currsize¶
The current size of the cache.
- logError(err, func, msg)[source]¶
Log errors, but throttle them so as not to spam the logs (a brief usage sketch follows this class).
- Parameters:
err – error to log.
func – function to use for logging. This is something like logprint.exception or logger.error.
msg – the message to log.
- property maxsize¶
The maximum size of the cache.
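A minimal sketch of calling logError from a hypothetical BaseCache subclass; the subclass, the logger, and the failure handled below are illustrative assumptions, not part of the library.

```python
import logging

from large_image.cache_util.base import BaseCache

logger = logging.getLogger(__name__)


class FaultTolerantCache(BaseCache):
    """Hypothetical subclass that logs, but tolerates, storage failures."""

    def __setitem__(self, key, value):
        try:
            super().__setitem__(key, value)
        except ValueError as exc:
            # logError throttles repeated reports, so a hot code path
            # cannot flood the log output with identical messages.
            self.logError(exc, logger.error, 'Failed to cache item')
```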
large_image.cache_util.cache module¶
- class large_image.cache_util.cache.LruCacheMetaclass(name, bases, namespace, **kwargs)[source]¶
Bases: type
- classCaches = {}¶
- namedCaches = {}¶
- large_image.cache_util.cache.getTileCache()[source]¶
Get the preferred tile cache and lock.
- Returns:
tileCache and tileLock.
- large_image.cache_util.cache.isTileCacheSetup()[source]¶
Return True if the tile cache has been created.
- Returns:
True if _tileCache is not None.
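A short sketch of obtaining and using the shared tile cache. It assumes getTileCache creates the cache on first use and that the returned lock is a standard threading lock usable as a context manager.

```python
from large_image.cache_util.cache import getTileCache, isTileCacheSetup

# isTileCacheSetup only reports whether the shared cache already exists;
# getTileCache returns the cache and its lock (creating them on first
# use, by assumption).
if not isTileCacheSetup():
    print('tile cache has not been created yet')
tileCache, tileLock = getTileCache()

# Guard access with the returned lock.
with tileLock:
    tileCache['example-key'] = b'example tile bytes'
    data = tileCache.get('example-key')
```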
- large_image.cache_util.cache.methodcache(key=None)[source]¶
Decorator to wrap a method with a memoizing callable that saves results in self.cache. This is largely taken from cachetools, but uses the cache from self.cache rather than a passed value. If self.cache_lock is present and not None, that lock is used.
- Parameters:
key – if a function, use it to compute the cache key; otherwise use self.wrapKey.
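A sketch of decorating a method with methodcache. The cache, cache_lock, and wrapKey attributes given to the class, and the exact signature expected of wrapKey, are assumptions about what the decorator requires of its host object.

```python
import threading

import cachetools

from large_image.cache_util.cache import methodcache


class TileWorker:
    def __init__(self):
        # methodcache stores results in self.cache and, when present,
        # serializes access through self.cache_lock.
        self.cache = cachetools.LRUCache(maxsize=64)
        self.cache_lock = threading.Lock()

    def wrapKey(self, *args, **kwargs):
        # Fallback key function used when methodcache is given no key
        # argument (the signature here is assumed for this sketch).
        return str((args, sorted(kwargs.items())))

    @methodcache()
    def expensiveTile(self, x, y, z):
        # Imagine an expensive decode or network fetch here; repeated
        # calls with the same arguments are served from self.cache.
        return (x, y, z)


worker = TileWorker()
worker.expensiveTile(0, 0, 0)  # computed
worker.expensiveTile(0, 0, 0)  # returned from worker.cache
```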
large_image.cache_util.cachefactory module¶
- large_image.cache_util.cachefactory.loadCaches(entryPointName='large_image.cache', sourceDict={})[source]¶
Load all caches from entry points and add them to the availableCaches dictionary.
- Parameters:
entryPointName – the name of the entry points to load.
sourceDict – a dictionary to populate with the loaded caches.
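A brief sketch of collecting the installed cache backends into a dictionary. Exactly which names end up in the dictionary (for example a pure-Python cache or a memcached backend) depends on which entry points are installed in the environment.

```python
from large_image.cache_util.cachefactory import loadCaches

availableCaches = {}
loadCaches(sourceDict=availableCaches)
# availableCaches is now populated from the 'large_image.cache' entry
# point group, keyed by the registered cache names.
print(sorted(availableCaches))
```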
- large_image.cache_util.cachefactory.pickAvailableCache(sizeEach, portion=8, maxItems=None, cacheName=None)[source]¶
Given an estimated size of an item, return how many of those items would fit in a fixed portion of the available virtual memory.
- Parameters:
sizeEach – the expected size of an item that could be cached.
portion – the inverse fraction of the available memory that can be used (for example, 8 means up to one eighth).
maxItems – if specified, the number of items is never more than this value.
cacheName – if specified, the portion can be affected by the configuration.
- Returns:
the number of items that should be cached. Always at least two, unless maxItems is less.
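For example, to size a cache of 256 x 256 RGBA tiles (roughly 256 KB each) while also applying an explicit cap; the tile size is purely illustrative.

```python
from large_image.cache_util.cachefactory import pickAvailableCache

# A 256 x 256 RGBA tile occupies about 256 * 256 * 4 bytes.
tileBytes = 256 * 256 * 4
itemLimit = pickAvailableCache(tileBytes, portion=8, maxItems=10000)
# itemLimit is how many such tiles fit in one eighth of the available
# virtual memory, never more than 10000 and never less than two.
```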
large_image.cache_util.memcache module¶
- class large_image.cache_util.memcache.MemCache(url='127.0.0.1', username=None, password=None, getsizeof=None, mustBeAvailable=False)[source]¶
Bases: BaseCache
Use memcached as the backing cache.
- property curritems¶
- property currsize¶
The current size of the cache.
- property maxsize¶
The maximum size of the cache.
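A hedged sketch of using the memcached backend. It assumes a memcached server reachable at the given address and that, when mustBeAvailable is True, construction fails if the server cannot be contacted.

```python
from large_image.cache_util.memcache import MemCache

# Connect to a local memcached server; with mustBeAvailable=False,
# connection problems are assumed to surface on use rather than here.
cache = MemCache(url='127.0.0.1', mustBeAvailable=False)

cache['tile-0-0-0'] = b'tile bytes'
value = cache.get('tile-0-0-0')
```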
Module contents¶
- class large_image.cache_util.LruCacheMetaclass(name, bases, namespace, **kwargs)[source]¶
Bases: type
- classCaches = {}¶
- namedCaches = {}¶
- class large_image.cache_util.MemCache(url='127.0.0.1', username=None, password=None, getsizeof=None, mustBeAvailable=False)[source]¶
Bases: BaseCache
Use memcached as the backing cache.
- property curritems¶
- property currsize¶
The current size of the cache.
- property maxsize¶
The maximum size of the cache.
- large_image.cache_util.getTileCache()[source]¶
Get the preferred tile cache and lock.
- Returns:
tileCache and tileLock.
- large_image.cache_util.isTileCacheSetup()[source]¶
Return True if the tile cache has been created.
- Returns:
True if _tileCache is not None.
- large_image.cache_util.methodcache(key=None)[source]¶
Decorator to wrap a method with a memoizing callable that saves results in self.cache. This is largely taken from cachetools, but uses the cache from self.cache rather than a passed value. If self.cache_lock is present and not None, that lock is used.
- Parameters:
key – if a function, use it to compute the cache key; otherwise use self.wrapKey.
- large_image.cache_util.pickAvailableCache(sizeEach, portion=8, maxItems=None, cacheName=None)[source]¶
Given an estimated size of an item, return how many of those items would fit in a fixed portion of the available virtual memory.
- Parameters:
sizeEach – the expected size of an item that could be cached.
portion – the inverse fraction of the available memory that can be used (for example, 8 means up to one eighth).
maxItems – if specified, the number of items is never more than this value.
cacheName – if specified, the portion can be affected by the configuration.
- Returns:
the number of items that should be cached. Always at least two, unless maxItems is less.
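The helpers documented above are re-exported at the package level, so they can be imported directly from large_image.cache_util rather than from the individual submodules:

```python
from large_image.cache_util import (
    LruCacheMetaclass,
    MemCache,
    getTileCache,
    isTileCacheSetup,
    methodcache,
    pickAvailableCache,
)
```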