RIA Services Output Caching
It has come to my attention that we haven’t published any good resources on an important feature of RIA Services – our integration with ASP.NET output caching. This is a shame, since output caching can greatly improve the performance of your application. This post will explain the support we offer and hopefully draw more attention to the feature.
In RIA Services, declaring that the results of a query operation can be cached is as simple as applying our OutputCacheAttribute (System.ServiceModel.DomainServices.Server) to your query method:
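Here is a minimal sketch of what such a domain service might look like (the class, entity, and helper names are illustrative, not taken from the attached sample):

```csharp
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Linq;
using System.ServiceModel.DomainServices.Hosting;
using System.ServiceModel.DomainServices.Server;

[EnableClientAccess]
public class CatalogDomainService : DomainService
{
    // Cache the returned product list on the server with an absolute expiration of
    // one hour (3600 seconds). Until the entry expires, repeat GetProducts requests
    // are served from the cache and this method is not invoked.
    [OutputCache(OutputCacheLocation.Server, duration: 3600)]
    public IQueryable<Product> GetProducts()
    {
        // Stand-in for the real (expensive) data access, e.g. an Entity Framework query.
        return LoadProductsFromDatabase().AsQueryable();
    }

    private static IEnumerable<Product> LoadProductsFromDatabase()
    {
        // Simulate an expensive call to the data tier.
        System.Threading.Thread.Sleep(3000);
        return new List<Product>
        {
            new Product { ProductID = 1, Name = "Widget", CategoryID = 1 },
            new Product { ProductID = 2, Name = "Gadget", CategoryID = 2 }
        };
    }
}

public class Product
{
    [Key]
    public int ProductID { get; set; }
    public string Name { get; set; }
    public int CategoryID { get; set; }
}
```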
For the above query method, an absolute expiration of one hour is set for the returned product list. Once your application has retrieved the results, subsequent calls to GetProducts from the client will result in cache hits – your GetProducts query method will not be invoked until the cached result expires (in one hour). If you take a look at OutputCacheAttribute you’ll see that it supports additional caching strategies: you can specify different cache locations, set up sliding expirations, use cache profiles, cache dependencies, etc.
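As an illustration of a couple of those variations (continuing the sketch above, and assuming the UseSlidingExpiration property referenced in the comments below):

```csharp
// Two more query methods that could be added to the CatalogDomainService sketch above.

// Sliding expiration: the five-minute window restarts each time the cached result is served.
[OutputCache(OutputCacheLocation.Server, duration: 300, UseSlidingExpiration = true)]
public IQueryable<Product> GetFeaturedProducts()
{
    return LoadProductsFromDatabase().AsQueryable();
}

// Client location: the response is cached by the browser, so repeat requests within the
// 60-second window never reach the server at all.
[OutputCache(OutputCacheLocation.Client, duration: 60)]
public IQueryable<Product> GetReferenceProducts()
{
    return LoadProductsFromDatabase().AsQueryable();
}
```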
RIA Services isn’t performing any magic here – we’ve simply given you an easy declarative way to specify your cache requirements. The cache directive you specify on your query method will be translated into the corresponding HttpCachePolicy in our service layer. This takes place in our custom query operation invoker (WCF IOperationInvoker). Before invoking your query method, we inspect the cache directives you’ve applied, and configure HttpContext.Response.Cache appropriately.
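As a rough illustration only (this is not the actual RIA Services invoker code), a server-location directive maps to HttpCachePolicy calls along these lines:

```csharp
using System;
using System.Web;

// Simplified illustration of the kind of HttpCachePolicy configuration a server-location
// cache directive translates into; the real work happens inside the RIA Services
// query operation invoker, not in user code.
public static class CachePolicyIllustration
{
    public static void ApplyServerCaching(HttpContext context, int durationInSeconds)
    {
        HttpCachePolicy cache = context.Response.Cache;

        // Cache the response in server memory only; clients and proxies are told not to cache.
        cache.SetCacheability(HttpCacheability.Server);

        // Absolute expiration: the entry is valid for the configured duration.
        cache.SetExpires(DateTime.UtcNow.AddSeconds(durationInSeconds));
        cache.SetMaxAge(TimeSpan.FromSeconds(durationInSeconds));
        cache.SetValidUntilExpires(true);
    }
}
```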
You can use the sample application I’ve attached to this post to experiment with the feature. Each time the “Load Products” button is hit, the client issues a query to load some product data. You’ll see that the elapsed time for the entire request is displayed:
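For context, the client side of such a sample can be as simple as the following sketch; CatalogDomainContext, GetProductsQuery and elapsedTextBlock are assumed names rather than ones taken from the attached sample:

```csharp
using System.Diagnostics;
using System.Linq;
using System.Windows;
using System.Windows.Controls;

// Illustrative Silverlight code-behind for the sample page.
public partial class MainPage : UserControl
{
    public MainPage()
    {
        InitializeComponent();
    }

    // "Load Products" click handler: issue the query and display the elapsed time for the request.
    private void LoadProducts_Click(object sender, RoutedEventArgs e)
    {
        Stopwatch stopwatch = Stopwatch.StartNew();

        // CatalogDomainContext is the generated client-side proxy for the domain service;
        // GetProductsQuery() is the generated counterpart of the GetProducts query method.
        CatalogDomainContext context = new CatalogDomainContext();
        context.Load(context.GetProductsQuery(), loadOperation =>
        {
            stopwatch.Stop();
            elapsedTextBlock.Text = string.Format("Loaded {0} products in {1} ms",
                loadOperation.Entities.Count(), stopwatch.ElapsedMilliseconds);
        }, null);
    }
}
```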
As you can see, without caching each request takes about 3 seconds. If you apply the attribute [OutputCache(OutputCacheLocation.Server, duration: 10)] to the query method and rerun the app, you’ll see that after the first request the average response time drops to around 50 ms, since cached results are being returned:
You’ll also see that a breakpoint in the query method will not be hit until the cached result expires. After the absolute expiration of 10 seconds, the next time the client issues a request your query method will again be called to retrieve fresh results, which will again be cached. Finally, to demonstrate client-side caching, change the attribute to [OutputCache(OutputCacheLocation.Client, duration: 10)] and inspect the network traffic with Fiddler: you’ll see that after the initial request, subsequent loads don’t even hit the server (the results are cached in the browser). As before, once the 10-second cache expiration is reached, the browser cache entry will expire.
A few additional notes:
- Caching is only enabled for query methods, and only if they use GET (i.e. QueryAttribute.HasSideEffects = false, which is the default).
- Caching is disabled for requests that specify an additional LINQ query. This is because output caching should generally only be used for a controlled set of inputs, to ensure that a small number of responses are cached. Otherwise it would be easy for someone to fill up server memory by sending a bunch of query variations.
- For query methods taking parameters (e.g. GetProductsByCategory(int categoryID)), we automatically set up the HttpCachePolicy to vary by those parameters (HttpCachePolicy.VaryByParams); see the sketch below.
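For example (again continuing the hypothetical CatalogDomainService sketch), a parameterized query method needs nothing beyond the attribute itself:

```csharp
// Cached per category: requests with different categoryID values get separate cache entries,
// because the runtime configures HttpCachePolicy.VaryByParams automatically.
[OutputCache(OutputCacheLocation.Server, duration: 600)]
public IQueryable<Product> GetProductsByCategory(int categoryID)
{
    return LoadProductsFromDatabase()
        .Where(p => p.CategoryID == categoryID)
        .AsQueryable();
}
```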
In summary, intelligent use of caching can greatly improve the performance of your application. For example, if you have query methods that return reference data that doesn’t change very often, it likely makes sense to cache that data. RIA Services builds on the ASP.NET output cache infrastructure, so the guidelines and best practices established for ASP.NET applications generally apply.
Comments
Anonymous
January 05, 2011
Mathew, good article. We need more of these. Glad to see this also works for parameterised queries using VaryByParams. Is there any way of manually invalidating (expiring) the cache, should an update / insert occur? Perhaps drilling down into the output cache context to figure out which naming convention is used for keys? Andries
Anonymous
January 06, 2011
Andries, I wouldn't advise you to attempt to manually manage cache entries that you haven't added yourself. OutputCacheAttribute.SqlCacheDependencies is close to what you're asking for and is designed for such purposes, but that's a lot of machinery.
Anonymous
January 18, 2011
Can you show what a VaryByParams would look like in your scenario in the summary of your article? Would it be like this? [OutputCache(OutputCacheLocation.Server, 180, UseSlidingExpiration = true, VaryByParams = "categoryID")] public IQueryable<Product> GetProductsByCategory(int categoryID)
Anonymous
January 18, 2011
Matt - The runtime will handle VaryByParams automatically - you don't have to configure that.
Anonymous
January 22, 2011
Nice article that has opened up my eyes to how to provide simple caching in RIA. However, I cannot get the client side caching to work in IE; it works fine in Firefox. Am I missing something in IE?
Anonymous
January 26, 2011
Sorry! It works for your solution, but not for mine. I am running my server on IIS instead of Cassini.
Anonymous
February 22, 2011
Does a parameter set in XAML (as opposed to defined in modified method) get treated as a parameterised query using VaryByParams?
Anonymous
March 27, 2011
Great article. I can't get the client caching working. Setting the declaration to OutputCacheLocation.Client has no effect. I checked with HTTPWatch and the test application generates a WCF request every time. Is there any special configuration I need to set to get it working?
Anonymous
May 23, 2011
Is this behaviour consistent when using Cassini, IIS Express or IIS?
Anonymous
September 20, 2011
This is very helpful. Thank you. Since this is an ASP.NET function, I think RIA Services cannot update or invalidate the cache when the cached data changes. Right? What about swapping out? When memory is full, will something (probably the least recently used item) be swapped out?