Update: IMemoryCache has been updated to support concurrent async operations, so this library is no longer necessary.
A simple wrapper over MemoryCache that prevents concurrent cache misses.
The Microsoft MemoryCache API provides local in-memory caching; however, at the time this was written it did not guarantee atomicity when multiple threads call GetOrCreate() for the same entry, which can lead to multiple cache misses and duplicated data fetching.
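For illustration, here is a minimal repro sketch of that race using plain IMemoryCache (the key and value are hypothetical): both concurrent calls can miss the cache and invoke the factory, so the data is fetched twice.

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());
var factoryCalls = 0;

Task<string> FetchAsync() => cache.GetOrCreateAsync("user:42", async entry =>
{
    Interlocked.Increment(ref factoryCalls);
    await Task.Delay(100); // simulate a slow data fetch
    return "data";
});

await Task.WhenAll(FetchAsync(), FetchAsync());
Console.WriteLine(factoryCalls); // may print 2: two concurrent cache misses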
It is implemented using the same idea as ConcurrentDictionary: the hash map is divided into a fixed number of segments (16 by default), each protected by its own segment lock. Write operations that fall into the same segment are serialized, while write operations that fall into different segments proceed concurrently. All read operations are lock-free.
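For illustration, a minimal sketch of that striping idea (not the library's actual source; the StripedCache name and its shape are assumptions):

using System;
using System.Linq;
using Microsoft.Extensions.Caching.Memory;

public class StripedCache
{
    private const int SegmentCount = 16;
    private readonly object[] _locks =
        Enumerable.Range(0, SegmentCount).Select(_ => new object()).ToArray();
    private readonly IMemoryCache _cache;

    public StripedCache(IMemoryCache cache) => _cache = cache;

    public TItem GetOrCreate<TItem>(object key, Func<ICacheEntry, TItem> factory)
    {
        // Lock-free fast path: cache hits are served without taking any lock.
        if (_cache.TryGetValue(key, out TItem item))
            return item;

        // Slow path: pick a segment lock from the key's hash, so concurrent
        // misses on the same key serialize while misses on keys in other
        // segments proceed in parallel.
        var segment = (key.GetHashCode() & int.MaxValue) % SegmentCount;
        lock (_locks[segment])
        {
            // Re-check inside the lock: another thread may have populated
            // the entry while we were waiting.
            if (_cache.TryGetValue(key, out item))
                return item;
            return _cache.GetOrCreate(key, factory);
        }
    }
}

The double check inside the lock is what prevents the duplicated fetch: the first thread to win the lock runs the factory and populates the entry, and waiters then hit the re-check instead of calling the factory again.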
-
Install package:
dotnet add package ConcurrentCaching
-
Inject caching service:
services.AddMemoryCache(options =>
{
    // Configure the underlying cache here, e.g. options.SizeLimit for LRU eviction
});
services.AddSingleton<IConcurrentMemoryCache, ConcurrentMemoryCache>();
-
Fetch data from cache:
var item = cache.GetOrCreate<TItem>("<key>", entry =>
{
    // Fetch data from elsewhere and return it ...
});

var item = await cache.GetOrCreateAsync<TItem>("<key>", async entry =>
{
    // Fetch data from elsewhere and return it ...
});
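Assuming the entry parameter mirrors Microsoft's ICacheEntry as IMemoryCache does (an assumption; the User type and FetchUserAsync helper below are hypothetical), the entry can also be configured before returning:

var user = await cache.GetOrCreateAsync<User>("user:42", async entry =>
{
    entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5); // expire after 5 minutes
    entry.Size = 1; // counts against a SizeLimit configured in AddMemoryCache
    return await FetchUserAsync(42); // hypothetical data-access call
});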