Cache may crash if a dependency changes before the LazyLock object is filled #352

Closed
mnivet opened this issue Sep 2, 2014 · 7 comments

@mnivet

mnivet commented Sep 2, 2014

In AspNetCacheProvider and RuntimeCacheProvider, when an item is removed, the Get method of the LazyLock class is called with a null parameter.

This can crash if Get has never been called before on that LazyLock instance.

While it may seem reasonable to assume the instance has been read at least once before it is deleted, in practice that is not always true.

In particular, it fails if one of the dependencies changes before the first call to Get. That first call seems to be performed by the MicroCache class, but nothing guarantees there is no dependency change between the add to the cache and the first call. (In our system, with complex cache dependencies, I've experienced the problem multiple times...)
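
To illustrate, here is a stripped-down repro of the crash (not the actual provider code; it only assumes the current LazyLock from MvcSiteMapProvider.Caching):

using System;
using MvcSiteMapProvider.Caching;

class LazyLockCrashRepro
{
    static void Main()
    {
        // The wrapper is created and handed to the cache provider, but a
        // dependency change evicts it before Get was ever called with the
        // real load function.
        var lazy = new LazyLock();

        // The provider's removal callback then calls Get with a null activator.
        // Because the value was never filled, the current implementation
        // invokes the null delegate and throws a NullReferenceException.
        var removed = lazy.Get<object>(null);
        Console.WriteLine(removed);
    }
}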

One solution that I tried, and that works, is to call lazy.Get(loadFunction); in the MicroCache class before adding the LazyLock instance to the cache provider.
Once you do that, using a LazyLock wrapper around the stored object seems pointless... However, it is also difficult to remove this object from the MvcSiteMapProvider code entirely without making a breaking change to the API as it is exposed by the ICacheProvider interface.
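
Roughly, the change looks like this (the surrounding MicroCache method is paraphrased from memory, not the exact project source):

// Paraphrased sketch of MicroCache<T>.GetOrAdd: fill the LazyLock before
// handing it to the cache provider, so a later Get(null) from a removal
// callback can never hit an empty wrapper.
public T GetOrAdd(string key, Func<T> loadFunction, ICacheDetails cacheDetails)
{
    LazyLock lazy;
    if (!this.cacheProvider.TryGetValue(key, out lazy))
    {
        lazy = new LazyLock();
        lazy.Get(loadFunction); // force the value to be created up front
        this.cacheProvider.Add(key, lazy, cacheDetails);
    }
    return lazy.Get(loadFunction);
}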

Another solution could be to modify the LazyLock implementation so it does not crash when the activator parameter is null (by returning default(TValue)?), but this does not seem to work completely, as I end up with a null when calling SiteMaps.Current later in my application...

Or, if you are willing to break your public API once again, abandon your LazyLock class and replace it with System.Lazy, which supports thread-safe locking and takes the activator method in its constructor; that avoids the situation where the value cannot be retrieved because the load function is unknown when the removed-from-cache events occur... But I'm not sure that calling the load function provided to the MicroCache class during the cache provider's item-removed event is really useful. In fact, as I said for the first solution, I'm not sure that using a lazy wrapper is really useful at all...
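
For illustration only, a System.Lazy-based entry could look something like this (just a sketch of the idea; LazyCacheEntry is a hypothetical name, and it is not a drop-in replacement for the LazyLock exposed by ICacheProvider):

using System;
using System.Threading;

// Hypothetical sketch: the activator is captured in the constructor, so code
// handling the cache-removed event can read Value without knowing the load
// function, and IsValueCreated tells whether the entry was ever materialized.
public sealed class LazyCacheEntry<TValue>
{
    private readonly Lazy<TValue> lazy;

    public LazyCacheEntry(Func<TValue> activator)
    {
        // ExecutionAndPublication gives the same "only one thread runs the
        // loader" guarantee that LazyLock provides today.
        this.lazy = new Lazy<TValue>(activator, LazyThreadSafetyMode.ExecutionAndPublication);
    }

    public bool IsValueCreated
    {
        get { return this.lazy.IsValueCreated; }
    }

    public TValue Value
    {
        get { return this.lazy.Value; }
    }
}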

Whichever solution you choose, I'm quite interested, so that I can remove our dirty workaround code...

@NightOwl888
Collaborator

The caching strategy was inspired by this post: http://www.superstarcoders.com/blogs/posts/micro-caching-in-asp-net.aspx, but was refactored to allow for DI support and a cache provider pattern.

Basically, the LazyLock is there to ensure that only 1 request makes it through to fetch the data when the cache expires, while minimizing the amount of time the lock is in place. It was put there with the assumption that the ISiteMapNodeProvider implementations would likely contain database access code (or perhaps something even slower like a remote API call) and the client requests could potentially come in faster than the data can be retrieved, causing multiple data requests to occur. Read the article to get the full picture.

I agree that this is a bug. For the fix, it seems like it would be best to return default(TValue) when the value has not yet been fetched (the first !got check) and the activator is null. That would prevent the exception from happening in this case.

public sealed class LazyLock
{
    private volatile bool got;
    private object value;

    public TValue Get<TValue>(Func<TValue> activator)
    {
        if (!got)
        {
            if (activator == null)
            {
                return default(TValue);
            }

            lock (this)
            {
                if (!got)
                {
                    value = activator();

                    got = true;
                }
            }
        }

        return (TValue)value;
    }
}

Technically, it isn't good design practice to allow null objects to be cached, because a null return is what lets you check whether the cache contains an item in a reasonable way. So returning default(T) makes sense to indicate an uninitialized state.

And then cut the ItemRemoved event short in the MicroCache, because nothing was actually removed except for the LazyLock, which doesn't really count as a cached item.

private void cacheProvider_ItemRemoved(object sender, MicroCacheItemRemovedEventArgs<T> e)
{
    // Skip the event if the item is null, empty, or otherwise a default value,
    // since nothing was actually put in the cache, so nothing was removed
    if (!EqualityComparer<T>.Default.Equals(e.Item, default(T)))
    {
        // Cascade the event
        OnCacheItemRemoved(e);
    }
}

Cutting the event short could be done in the ICacheProvider implementations, but that forces more shared behavior into the providers and makes them more difficult for 3rd parties to implement, not to mention more code to maintain within the project.

The ItemRemoved event is no longer used by MvcSiteMapProvider. The original intent was to provide a way to call Clear() on the SiteMap instance to remove circular references, but the theory that the CLR couldn't automatically clean them up turned out to be false; not to mention, calling Clear() before every request was finished with the SiteMap made it throw exceptions.

This fix doesn't break the interfaces, seems to cover all of the bases, and could be rolled into a patch release.

Thoughts?

@mnivet
Author

mnivet commented Sep 3, 2014

OK, I've tried your fix and it looks like a good start toward resolving the cache issues I've experienced.

But it seems some problems still remain in other parts of the code.
I've started to investigate them, and the sequence seems to be something like this:

  • the sitemap is loaded a first time during a request
  • a dependency change removes it from the cache during that same request (bad luck)
  • the sitemap is reloaded later in the same request (to draw a second level of menu)
  • the data stored in the request cache by RequestCacheableSiteMapNode is not reset (in particular AreRouteParametersPreserved), which causes the rebuild of the sitemap to raise an exception because the preserved routes are already present in the node's RouteValues collection, and that is treated as an error by the ThrowIfRouteValueIsPreservedRouteParameter method

For instance, I've worked around that problem by using an ICacheProvider implementation that stores the sitemap in the request cache before storing it in a more global cache provider like RuntimeCacheProvider. This pattern ensures a consistent sitemap instance for the whole request.

Here is the code for that request cache provider:

using System;

using MvcSiteMapProvider.Caching;
using MvcSiteMapProvider.Web.Mvc;

namespace MvcSiteMapProvider.Caching
{
    public class RequestCacheProvider<T> : ICacheProvider<T>
    {
        public RequestCacheProvider(IMvcContextFactory mvcContextFactory, ICacheProvider<T> subCacheProvider)
        {
            if (mvcContextFactory == null) throw new ArgumentNullException("mvcContextFactory");
            this.requestCache = mvcContextFactory.GetRequestCache();
            this.subCacheProvider = subCacheProvider;
            if (subCacheProvider != null)
            {
                subCacheProvider.ItemRemoved += SubCacheProvider_ItemRemoved;
            }
        }

        private readonly IRequestCache requestCache;

        private readonly ICacheProvider<T> subCacheProvider;

        public bool Contains(string key)
        {
            var value = this.requestCache.GetValue<LazyLock>(key);
            return value != null || (this.subCacheProvider != null && this.subCacheProvider.Contains(key));
        }

        public LazyLock Get(string key)
        {
            var value = this.requestCache.GetValue<LazyLock>(key);
            if (value == null && this.subCacheProvider != null)
            {
                value = this.subCacheProvider.Get(key);
                this.requestCache.SetValue(key, value);
            }
            return value;
        }

        public bool TryGetValue(string key, out LazyLock value)
        {
            value = this.Get(key);
            if (value != null)
            {
                return true;
            }
            return false;
        }

        public void Add(string key, LazyLock item, ICacheDetails cacheDetails)
        {
            if (this.subCacheProvider != null)
            {
                this.subCacheProvider.Add(key, item, cacheDetails);
            }
            this.requestCache.SetValue(key, item);
        }

        public void Remove(string key)
        {
            var value = this.Get(key);
            this.requestCache.SetValue<LazyLock>(key, null);
            if (this.subCacheProvider != null)
            {
                this.subCacheProvider.Remove(key);
            }
            else if (value != null)
            {
                this.OnCacheItemRemoved(new MicroCacheItemRemovedEventArgs<T>(key, value.Get<T>(null)));
            }
        }

        private void SubCacheProvider_ItemRemoved(object sender, MicroCacheItemRemovedEventArgs<T> e)
        {
            this.OnCacheItemRemoved(e);
        }

        protected virtual void OnCacheItemRemoved(MicroCacheItemRemovedEventArgs<T> e)
        {
            if (this.ItemRemoved != null)
            {
                ItemRemoved(this, e);
            }
        }

        public event EventHandler<MicroCacheItemRemovedEventArgs<T>> ItemRemoved;
    }
}
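
And this is roughly how I compose it with the existing global provider (the RuntimeCacheProvider constructor taking an ObjectCache is how I remember it; adapt this to however your DI module actually builds the cache provider):

using System.Runtime.Caching;
using MvcSiteMapProvider;
using MvcSiteMapProvider.Caching;
using MvcSiteMapProvider.Web.Mvc;

public static class SiteMapCacheComposition
{
    // mvcContextFactory comes from the DI container; the inner provider is
    // whatever global provider you already use (RuntimeCacheProvider here,
    // assuming its ObjectCache constructor - adjust if yours differs).
    public static ICacheProvider<ISiteMap> Build(IMvcContextFactory mvcContextFactory)
    {
        var globalProvider = new RuntimeCacheProvider<ISiteMap>(MemoryCache.Default);
        return new RequestCacheProvider<ISiteMap>(mvcContextFactory, globalProvider);
    }
}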

@NightOwl888
Collaborator

The issue where the SiteMap could potentially be loaded multiple times per request was reported as #310 and I have already made the fix. However, it didn't meet the requirements of a patch, so it is waiting in the v4.7 branch in commit 88b1127fcf6993147af2ada4c89711e39815d59a. The SiteMapSpooler request-caches the SiteMap instance, so it will remain until all requests are finished with it.

However, I don't think that addresses the entire problem because you can still have 2 SiteMap objects with the same key loaded at the same time.

For that, we could potentially go back to using an instance ID (random Guid) that is added as part of the request cache key of SiteMap. I don't recall exactly when the instance ID was removed from SiteMapNode, but it was replaced with the SiteMapCacheKey instead of a random Guid so the equality check would pass. That instance ID scheme was removed to fix the first part of #310. It definitely doesn't make sense to change it back until the SiteMapSpooler is put into place. But perhaps when it is, the fact that the SiteMapCacheKey is used as a fallback for the equality check would be moot, and we could use an instance ID on each SiteMap object that all of the request caching keys would automatically use.

I think some more thought needs to go into exactly how to handle this.

@mnivet
Author

mnivet commented Sep 3, 2014

OK, I'll let you decide how to handle all of this.

If you can at least release your patch_352 branch, that will already allow me to remove some workaround code from my project.
As for the multiple instantiations of the sitemap, my RequestCacheProvider implementation seems valid for my use cases, so I can live with it for the moment.

@NightOwl888
Collaborator

FYI - This is now in release v4.6.15, which is up on NuGet.org.

@rlerch

rlerch commented May 28, 2019

Was this error fixed in any version of this library?

@justinelhiggins

This still seems to occur for dynamic pages in 4.6.26.

Is there a way to just disable the cache?
