GetOrCreateExclusive() and GetOrCreateExclusiveAsync(): Exclusive versions of GetOrCreate() and GetOrCreateAsync() #36500

Closed
rmja opened this issue Oct 19, 2016 · 15 comments
Labels: api-suggestion, area-Extensions-Caching

rmja commented Oct 19, 2016

It would be nice to have equivalents of GetOrCreate() and GetOrCreateAsync() that ensure the factory is invoked only once for each cache miss.

This basically resolves aspnet/Caching#218, but with alternate extension methods.

An implementation proposal:

public static TItem GetOrCreateExclusive<TItem>(this IMemoryCache cache, object key, Func<ICacheEntry, TItem> factory)
{
    object result;

    if (!cache.TryGetValue(key, out result))
    {
        SemaphoreSlim exclusiveLock;

        // The semaphore is stored under a single cache-wide key, so factory
        // invocations for all keys are serialized by the same lock; the outer
        // lock only guards the semaphore's lazy creation.
        lock (cache)
        {
            exclusiveLock = cache.GetOrCreate("__exclusive__", entry => {
                entry.Priority = CacheItemPriority.NeverRemove;
                return new SemaphoreSlim(1);
            });
        }

        exclusiveLock.Wait();

        try
        {
            if (cache.TryGetValue(key, out result))
            {
                return (TItem)result;
            }

            var entry = cache.CreateEntry(key);
            result = factory(entry);
            entry.SetValue(result);
            entry.Dispose();
        }
        finally
        {
            exclusiveLock.Release();
        }
    }

    return (TItem)result;
}

public static async Task<TItem> GetOrCreateExclusiveAsync<TItem>(this IMemoryCache cache, object key, Func<ICacheEntry, Task<TItem>> factory)
{
    object result;

    if (!cache.TryGetValue(key, out result))
    {
        SemaphoreSlim exclusiveLock;

        lock (cache)
        {
            exclusiveLock = cache.GetOrCreate("__exclusive__", entry => {
                entry.Priority = CacheItemPriority.NeverRemove;
                return new SemaphoreSlim(1);
            });
        }

        exclusiveLock.Wait();

        try
        {
            if (cache.TryGetValue(key, out result))
            {
                return (TItem)result;
            }

            var entry = cache.CreateEntry(key);
            result = await factory(entry); //.ConfigureAwait(false);
            entry.SetValue(result);
            entry.Dispose();
        }
        finally
        {
            exclusiveLock.Release();
        }
    }

    return (TItem)result;
}
rmja (Author) commented Oct 19, 2016

In the async version it should, of course, be await exclusiveLock.WaitAsync(); and not the synchronous equivalent.
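
That is, the wait in GetOrCreateExclusiveAsync becomes (a one-line correction to the proposal above):

// Wait without blocking the calling thread.
await exclusiveLock.WaitAsync();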

ttrider commented Nov 11, 2016

Looking at the MemoryCache.cs implementation, I wonder how bad it would be to use a ConcurrentDictionary there instead of a plain Dictionary, and to expose the GetOrAdd method directly.

I know the interface doesn't have it, and I would really hate an IMemoryCache2, but at this point I'm seriously considering forking the MemoryCache implementation and making the change myself...

JunTaoLuo (Contributor) commented:

@ttrider We are looking into updating the implementation of MemoryCache to use ConcurrentDictionary in dotnet/extensions#242. Note that the GetOrAdd method does not, by itself, guarantee that the factory is invoked only once, so the implementation will involve Lazy<T>.
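
For reference, a minimal sketch of the Lazy<T> idea mentioned above, shown against a plain ConcurrentDictionary purely for illustration (the class and method names here are made up; this is not the actual MemoryCache change):

using System;
using System.Collections.Concurrent;

static class LazyGetOrAddSketch
{
    // GetOrAdd may run its value factory more than once under contention, but
    // only one Lazy<T> instance wins and gets stored; accessing .Value on the
    // stored instance runs the expensive factory at most once.
    static readonly ConcurrentDictionary<object, Lazy<object>> _store = new();

    public static T GetOrAdd<T>(object key, Func<T> valueFactory)
    {
        Lazy<object> lazy = _store.GetOrAdd(key, _ => new Lazy<object>(() => valueFactory()!));
        return (T)lazy.Value;
    }
}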

ttrider commented Nov 15, 2016

Hello John,

This is great news! While we wait on your implementation, we have created the following poor man's wrapper in the meantime.

https://gist.github.com/ttrider/a5ae8fc86ccfe6a4243f4481a5858a80

I think I’ll rename my methods to match your signatures – it will make dropping and replacing them easier.

Thank You,

Vladimir

glennc commented Feb 9, 2017

I don't think this can fit into 2.0 right now. It needs a fairly large amount of investigation and testing to make it work well.

ohadschn commented Jul 8, 2017

It would be great if, in the async version, the expiry could be set as a function of the retrieved value (see https://github.com/aspnet/Caching/issues/338).
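
Since the proposed factories receive the ICacheEntry, this could in principle be done by setting the expiration inside the factory once the value is known. A rough sketch, assuming the GetOrCreateExclusiveAsync extension from the top post exists (AuthToken and FetchTokenAsync are hypothetical names used only for illustration):

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

// Hypothetical value type, used only for illustration.
record AuthToken(string Value, DateTimeOffset ExpiresAt);

static class ExpiryFromValueExample
{
    // Hypothetical fetch; stands in for whatever produces the cached value.
    static Task<AuthToken> FetchTokenAsync() =>
        Task.FromResult(new AuthToken("abc", DateTimeOffset.UtcNow.AddMinutes(30)));

    public static Task<AuthToken> GetTokenAsync(IMemoryCache cache) =>
        // Assumes the GetOrCreateExclusiveAsync extension proposed in the top post.
        cache.GetOrCreateExclusiveAsync("auth-token", async entry =>
        {
            AuthToken fetched = await FetchTokenAsync();
            entry.AbsoluteExpiration = fetched.ExpiresAt; // expiry derived from the fetched value
            return fetched;
        });
}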

amit-ezra commented:

@JunTaoLuo
It looks like the ConcurrentDictionary change is already implemented, BUT the GetOrAdd method of the memory cache still isn't thread safe, even when using Lazy. It still doesn't use the underlying ConcurrentDictionary.GetOrAdd...
I think it's very misleading to have a non-atomic (non-thread-safe) GetOrAdd.
Is there a separate issue just for GetOrAdd?
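
For context, the existing GetOrCreate extension is roughly a non-atomic check-then-create over the cache (a simplified sketch, not the exact library source), which is why switching the underlying dictionary alone does not make it thread safe:

using System;
using Microsoft.Extensions.Caching.Memory;

public static class GetOrCreateSketchExtensions
{
    // Simplified shape of GetOrCreate: two threads can both miss on
    // TryGetValue and both invoke the factory for the same key.
    public static TItem GetOrCreateSketch<TItem>(this IMemoryCache cache, object key, Func<ICacheEntry, TItem> factory)
    {
        if (!cache.TryGetValue(key, out object? result))
        {
            using ICacheEntry entry = cache.CreateEntry(key); // a second thread can reach here too
            result = factory(entry);                          // so the factory may run more than once
            entry.Value = result;
        }

        return (TItem)result!;
    }
}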

jelical commented May 6, 2018

Two proposed solutions:
Based on Lazy - https://gist.github.com/jelical/17a3e1ef03af2901f4eb6866291239c6
Based on interlocks - https://gist.github.com/jelical/c424003efe008435b49fd54a3d1bd5b0
Both solutions have been tested under heavy load, perform roughly the same, support multiple IMemoryCache instances, and guarantee an exclusive fetch call.

Update: added consistent exception propagation.

dengere commented Jun 23, 2018

@jelical Well done, thanks. Could you please share your test code?

aspnet-hello transferred this issue from aspnet/Caching, Dec 13, 2018
BladeWise (Contributor) commented:

@jelical Considering that ConcurrentDictionary<K, V>.GetOrAdd does not guarantee that the factory method is called only once, as per its documentation, wouldn't these methods have the same issues as using Lazy<Task<T>> directly as cached values?

As far as I can see, using Lazy<Task<T>> with GetOrCreate is far better than using GetOrCreateAsync directly: the latter delays the addition of the cache item by the time required to construct the cached value (making the contention window wider), while the former only creates the lazy object, deferring value creation until the item is first used (so the contention window is as small as possible). So even though Lazy<Task<T>> mitigates the issue, race conditions can still occur. Am I wrong?
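
To make the comparison concrete, the Lazy<Task<T>> variant described above looks roughly like this (an illustrative helper, not a library API; note that a faulted task stays cached until the entry is evicted):

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public static class LazyTaskCacheExtensions
{
    // GetOrCreate only has to construct and store a cheap Lazy wrapper, so the
    // non-atomic window is tiny; awaiting .Value starts the async factory at
    // most once per stored wrapper.
    public static Task<TItem> GetOrCreateLazyAsync<TItem>(this IMemoryCache cache, object key, Func<Task<TItem>> factory)
    {
        Lazy<Task<TItem>>? lazy = cache.GetOrCreate(key, _ => new Lazy<Task<TItem>>(factory));
        return lazy!.Value;
    }
}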

slavanap commented May 11, 2019

Greetings,

Here's my solution for this issue:
https://gist.github.com/slavanap/3b9c436c2defae8b1d305ad30fdfe889

It's based on a single ConcurrentDictionary that stores a TaskCompletionSource for every new cache value being created.

It uses the more abstract IDistributedCache (instead of IMemoryCache) as storage, and I can rewrite it for MemoryCache if needed.
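
For readers who don't want to follow the link, the in-flight de-duplication part of that approach has roughly this shape (a sketch under assumptions: the type and member names are invented, and the actual read/write against IDistributedCache is omitted):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public sealed class InFlightDeduplicator<TKey, TValue> where TKey : notnull
{
    private readonly ConcurrentDictionary<TKey, TaskCompletionSource<TValue>> _inFlight = new();

    // The first caller for a key installs a TaskCompletionSource and runs the
    // factory; concurrent callers for the same key await the same task.
    public async Task<TValue> GetAsync(TKey key, Func<TKey, Task<TValue>> factory)
    {
        var tcs = new TaskCompletionSource<TValue>(TaskCreationOptions.RunContinuationsAsynchronously);
        TaskCompletionSource<TValue> winner = _inFlight.GetOrAdd(key, tcs);

        if (!ReferenceEquals(winner, tcs))
        {
            return await winner.Task; // someone else is already producing the value
        }

        try
        {
            TValue value = await factory(key);
            tcs.SetResult(value);
            return value;
        }
        catch (Exception ex)
        {
            tcs.SetException(ex);
            throw;
        }
        finally
        {
            _inFlight.TryRemove(key, out _); // later misses trigger a fresh fetch
        }
    }
}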

analogrelay transferred this issue from dotnet/extensions, May 15, 2020
analogrelay added this to the Future milestone, May 15, 2020
theodorzoulias (Contributor) commented:

Related StackOverflow question: Stop Reentrancy on MemoryCache Calls

deeprobin (Contributor) commented:

For completeness: if I understand correctly, this is what is being proposed here:

namespace Microsoft.Extensions.Caching.Memory {
    public static partial class CacheExtensions {
        public static TItem GetOrCreateExclusive<TItem>(this IMemoryCache cache, object key, Func<ICacheEntry, TItem> factory);
        public static Task<TItem> GetOrCreateExclusiveAsync<TItem>(this IMemoryCache cache, object key, Func<ICacheEntry, Task<TItem>> factory);
    }
}

An implementation example has already been given by @rmja in the top post.

I myself have not worked much with Microsoft.Extensions.Caching.Memory, so I cannot say much about it.
@maryamariyan Is this a relevant API? If so, could you take a look at it? (This issue is already a bit more than 5 years old.)

maryamariyan (Member) commented Apr 6, 2022

I myself have not worked much with Microsoft.Extensions.Caching.Memory, so I cannot say much about it.
@maryamariyan Is this a relevant API? If so, could you take a look at it? (This issue is already a bit more than 5 years old.)

GetOrCreate is not thread safe, and for that to become possible we would need to add lock statements inside MemoryCache; due to perf regression concerns, I don't think we will take that change to fix this thread-safety issue.

I don't think we would want these extension methods built into the library. It doesn't seem like a bigger value add than letting whoever needs this functionality add it directly to their own code. It makes sense to keep the onus on the app side to keep their transactions thread safe, which also means ensuring they don't make calls to GetOrCreate; otherwise, as explained in this issue, the factory func in GetOrCreate could get called multiple times for a given cache miss in a multithreaded scenario.

Closing as won't fix.
