LFU Cache#
The lfu module provides the LFUCache (Least Frequently Used) class.
- class cacheout.lfu.LFUCache(*, maxsize: int = 256, ttl: int | float = 0, timer: ~typing.Callable[[], int | float] = <built-in function time>, default: ~typing.Any = None, enable_stats: bool = False, on_get: ~typing.Callable[[~typing.Hashable, ~typing.Any, bool], None] | None = None, on_set: ~typing.Callable[[~typing.Hashable, ~typing.Any, ~typing.Any], None] | None = None, on_delete: ~typing.Callable[[~typing.Hashable, ~typing.Any, ~cacheout.cache.RemovalCause], None] | None = None)[source]#
Bases: Cache
The Least Frequently Used (LFU) cache is like Cache but uses a least-frequently-used eviction policy.
The primary difference from Cache is that access to cache entries (i.e. calls to get() and set()) is tracked: each call to get() increments the cache key’s access count, while each call to set() resets the counter. During cache eviction, the entry with the lowest access count is removed first.
- add(key: Hashable, value: Any, ttl: int | float | None = None) None [source]#
Add cache key/value if it doesn’t already exist.
This method ignores keys that already exist, which leaves the original TTL intact.
- Parameters:
key – Cache key to add.
value – Cache value.
ttl – TTL value. Defaults to None, which uses ttl. Time units are determined by timer.
- add_many(items: Mapping, ttl: int | float | None = None) None #
Add multiple cache keys at once.
- Parameters:
items – Mapping of cache key/values to set.
ttl – TTL value. Defaults to None, which uses ttl. Time units are determined by timer.
- clear() None #
Clear all cache entries.
- configure(*, maxsize: int | None = None, ttl: int | float | None = None, timer: ~typing.Callable[[], int | float] | None = None, default: ~typing.Any = <object object>, enable_stats: ~typing.Any = <object object>) None #
Configure cache settings.
This method is meant to support runtime-level configuration of global cache objects.
- copy() OrderedDict #
Return a copy of the cache.
- delete(key: Hashable) int #
Delete cache key and return number of entries deleted (1 or 0).
- Parameters:
key – Cache key to delete.
- Returns:
1 if key was deleted, 0 if key didn’t exist.
- Return type:
int
- delete_expired() int #
Delete expired cache keys and return number of entries deleted.
- Returns:
Number of entries deleted.
- Return type:
int
- delete_many(iteratee: str | List[Hashable] | Pattern | Callable) int #
Delete multiple cache keys at once filtered by an iteratee.
The iteratee can be one of:
- list - List of cache keys.
- str - Search string that supports Unix shell-style wildcards.
- re.compile() - Compiled regular expression.
- function - Function that returns whether a key matches. Invoked with iteratee(key).
- Parameters:
iteratee – Iteratee to filter by.
- Returns:
Number of cache keys deleted.
- Return type:
int
- evict() int #
Perform cache eviction per the cache replacement policy:
First, remove all expired entries.
Then, remove non-TTL entries using the cache replacement policy.
When removing non-TTL entries, this method will only remove the minimum number of entries needed to reduce the total number of entries below maxsize. If maxsize is 0, then only expired entries will be removed.
- Returns:
Number of cache entries evicted.
- expire_times() Dict[Hashable, int | float] #
Return cache expirations for TTL keys.
- Returns:
dict
- expired(key: Hashable, expires_on: int | float | None = None) bool #
Return whether cache key is expired or not.
- Parameters:
key – Cache key.
expires_on – Timestamp of when the key is considered expired. Defaults to None, which uses the current value returned from timer().
- full() bool #
Return whether the cache is full or not.
- get(key: Hashable, default: Any = None) Any [source]#
Return the cache value for key, or default or missing(key) if it doesn’t exist or has expired.
- Parameters:
key – Cache key.
default – Value to return if key doesn’t exist. Any value other than None takes precedence over missing and is used as the return value. If default is callable, it functions like missing and its return value is set for the cache key. Defaults to None.
- Returns:
The cached value.
- get_many(iteratee: str | List[Hashable] | Pattern | Callable) dict #
Return many cache values as a dict of key/value pairs filtered by an iteratee.
The iteratee can be one of:
- list - List of cache keys.
- str - Search string that supports Unix shell-style wildcards.
- re.compile() - Compiled regular expression.
- callable - Function that returns whether a key matches. Invoked with iteratee(key).
- Parameters:
iteratee – Iteratee to filter by.
- get_ttl(key: Hashable) int | float | None #
Return the remaining time to live of a key that has a TTL.
- Parameters:
key – Cache key.
- Returns:
The remaining time to live of key, or None if the key doesn’t exist or has expired.
- has(key: Hashable) bool #
Return whether cache key exists and hasn’t expired.
- items() ItemsView #
Return a dict_items view of cache items.
Warning
Returned data is copied from the cache object, but any modifications to mutable values will modify this cache object’s data.
- keys() KeysView #
Return dict_keys view of all cache keys.
Note
Cache is copied from the underlying cache storage before returning.
- memoize(*, ttl: int | float | None = None, typed: bool = False) Callable[[F], F] #
Decorator that wraps a function with a memoizing callable and works on both synchronous and asynchronous functions.
Each return value from the function will be cached using the function arguments as the cache key. The cache object can be accessed at <function>.cache. The uncached version (i.e. the original function) can be accessed at <function>.uncached. The cache key for a given set of function arguments, for use in other operations on the <function>.cache object, can be calculated with the function accessible at <function>.cache_key.
- Keyword Arguments:
ttl – TTL value. Defaults to None, which uses ttl. Time units are determined by timer.
typed – Whether to cache arguments of a different type separately. For example, <function>(1) and <function>(1.0) would be treated differently. Defaults to False.
- popitem() Tuple[Hashable, Any] #
Delete and return the next cache item, (key, value), based on the cache replacement policy while ignoring expiration times (i.e. the selection of the item to pop is based solely on the cache key ordering).
- Returns:
Two-element tuple of the deleted cache (key, value).
- set(key: Hashable, value: Any, ttl: int | float | None = None) None [source]#
Set cache key/value and replace any previously set cache key.
If the cache key previously existed, setting it will move it to the end of the cache stack, which means it would be evicted last.
- Parameters:
key – Cache key to set.
value – Cache value.
ttl – TTL value. Defaults to None, which uses ttl. Time units are determined by timer.
- set_many(items: Mapping, ttl: int | float | None = None) None #
Set multiple cache keys at once.
- Parameters:
items – Mapping of cache key/values to set.
ttl – TTL value. Defaults to None, which uses ttl. Time units are determined by timer.
- size() int #
Return number of cache entries.
- values() ValuesView #
Return dict_values view of all cache values.
Note
Cache is copied from the underlying cache storage before returning.