Concurrency Models (Windows Server AppFabric Caching)

Windows Server AppFabric architecture allows any cache client to openly access any cached data if those clients have the appropriate network access and configuration settings. This presents a challenge to both security and concurrency.

To mitigate security risks, all cache clients, cache servers, and the primary data source server should be members of the same domain, and should be deployed within the perimeter of a firewall. We also highly recommend that you secure your application configuration files on the cache clients.

To help your application deal with concurrency issues, AppFabric supports optimistic and pessimistic concurrency models. For information about the methods available to align to these models, see Concurrency Methods (Windows Server AppFabric Caching).

Optimistic Concurrency Model

In the optimistic concurrency model, updates to cached objects do not take locks. Instead, when the cache client gets an object from the cache, it also obtains and stores the current version of that object. When an update is required, the cache client sends the new value for the object together with the version it stored. The system updates the object only if the version sent matches the current version of the object in the cache. Every update to an object changes its version number, which prevents an update from overwriting changes made by another client.

The example in this topic illustrates how optimistic concurrency maintains data consistency.

Example

In this example, two cache clients (cacheClientA and cacheClientB) on two separate application servers try to update the same cached object, which is named RadioInventory.

Time Zero: Both Clients Retrieve the Same Object

At time zero (T0), both cache clients retrieve a DataCacheItem object that captures the cached object they intend to update, together with additional information associated with that cached object, such as version and tag information. This is illustrated in the following diagram and code example.

"Velocity" concurrency model at T0

'cacheClientA pulls the FM radio inventory from cache
Dim clientACacheFactory As DataCacheFactory = New DataCacheFactory()
Dim cacheClientA As DataCache = _
        clientACacheFactory.GetCache("catalog")
Dim radioInventoryA As DataCacheItem = _
        cacheClientA.GetCacheItem("RadioInventory", "electronics")

'cacheClientB pulls the same FM radio inventory from cache
Dim clientBCacheFactory As DataCacheFactory = New DataCacheFactory()
Dim cacheClientB As DataCache = _
       clientBCacheFactory.GetCache("catalog")
Dim radioInventoryB As DataCacheItem = _
        cacheClientB.GetCacheItem("RadioInventory", "electronics")

//cacheClientA pulls the FM radio inventory from cache
DataCacheFactory clientACacheFactory = new DataCacheFactory();
DataCache cacheClientA = clientACacheFactory.GetCache("catalog");
DataCacheItem radioInventoryA = 
    cacheClientA.GetCacheItem("RadioInventory","electronics");

//cacheClientB pulls the same FM radio inventory from cache
DataCacheFactory clientBCacheFactory = new DataCacheFactory();
DataCache cacheClientB = clientBCacheFactory.GetCache("catalog");
DataCacheItem radioInventoryB =
    cacheClientB.GetCacheItem("RadioInventory", "electronics");

Note

Although this example obtains the version information by using the GetCacheItem method to retrieve the DataCacheItem object, it is also possible to use the Get method to obtain the DataCacheItemVersion object associated with the retrieved cache item.
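As a rough sketch of that alternative (reusing the cache client, cache, and region names from this topic's example), the Get overload that takes an out DataCacheItemVersion parameter returns the cached value and its version in a single call; the stored version can then be supplied to Put for an optimistic update.

//Sketch only: obtain the value and its version with Get instead of GetCacheItem.
//Reuses the "catalog" cache and "electronics" region from the example above.
DataCacheItemVersion radioVersionA;
object radioValueA =
    cacheClientA.Get("RadioInventory", out radioVersionA, "electronics");

//The stored version can later be supplied to Put for an optimistic update.
cacheClientA.Put("RadioInventory", 155, radioVersionA, "electronics");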

Time One: The First Update Succeeds

At time one (T1), cacheClientA updates the cached object RadioInventory with a new value. When cacheClientA executes the Put method, the version associated with the RadioInventory cache item increments. At this time, cacheClientB has an out-of-date cache item. This is illustrated in the following diagram and code example.

"Velocity" concurrency model at T1

'at time T1, cacheClientA updates the FM radio inventory
Dim newRadioInventoryA As Integer = 155

cacheClientA.Put("RadioInventory", newRadioInventoryA, _
                 radioInventoryA.Version, "electronics")

//at time T1, cacheClientA updates the FM radio inventory
int newRadioInventoryA = 155;

cacheClientA.Put("RadioInventory", newRadioInventoryA, 
    radioInventoryA.Version,"electronics");

Time Two: The Second Update Fails

At time two (T2), cacheClientB tries to update the RadioInventory cached object by using what is now an out-of-date version. To prevent cacheClientA's changes from being overwritten, the Put call from cacheClientB fails. The cache client throws a DataCacheException object with the ErrorCode property set to CacheItemVersionMismatch. This is illustrated in the following diagram and code example.

"Velocity" concurrency model at T2

'later, at time T2, cacheClientB tries to 
'update the FM radio inventory, throws DataCacheException with
'an error code equal to DataCacheErrorCode.CacheItemVersionMismatch.
Dim newRadioInventoryB As Integer = 130

cacheClientB.Put("RadioInventory", newRadioInventoryB, _
                 radioInventoryB.Version, "electronics")

//later, at time T2, cacheClientB tries to
//update the FM radio inventory, throws DataCacheException with
//an error code equal to DataCacheErrorCode.CacheItemVersionMismatch.
int newRadioInventoryB = 130;

cacheClientB.Put("RadioInventory", newRadioInventoryB,
    radioInventoryB.Version,"electronics");
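In practice, a cache client would catch this exception and check its error code, typically re-reading the item and retrying the update with the current version. The following sketch (not part of the original sample) shows one way to do this with the names used above.

//Sketch only: handle the version mismatch by re-reading the item and retrying.
try
{
    cacheClientB.Put("RadioInventory", newRadioInventoryB,
        radioInventoryB.Version, "electronics");
}
catch (DataCacheException ex)
{
    if (ex.ErrorCode == DataCacheErrorCode.CacheItemVersionMismatch)
    {
        //Get the current value and version, then retry the update once.
        radioInventoryB = cacheClientB.GetCacheItem("RadioInventory", "electronics");
        cacheClientB.Put("RadioInventory", newRadioInventoryB,
            radioInventoryB.Version, "electronics");
    }
    else
    {
        throw;
    }
}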

Pessimistic Concurrency Model

In the pessimistic concurrency model, the client explicitly locks objects to perform operations. Other operations that request locks are rejected (the system does not block requests) until the locks are released. When an object is locked, a lock handle is returned as an output parameter; that lock handle is required to unlock the object. If the client application ends before it releases a locked object, lock time-outs are provided to release the locks. Locked objects never expire, but they can expire immediately after they are unlocked if their expiration time has already passed. For more information about the methods used with the pessimistic concurrency model, see Concurrency Methods (Windows Server AppFabric Caching).
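As a rough sketch of this pattern (reusing the cache client and region names from the optimistic example, with an arbitrary lock time-out), the lock is taken with GetAndLock and released either by PutAndUnlock, which also updates the value, or by Unlock, which leaves the value unchanged.

//Sketch only: lock, update, and unlock a cached item pessimistically.
DataCacheLockHandle lockHandle;

//Lock the item for up to 30 seconds; other GetAndLock calls for this key
//are rejected (not blocked) until the lock is released or times out.
object lockedInventory = cacheClientA.GetAndLock("RadioInventory",
    TimeSpan.FromSeconds(30), out lockHandle, "electronics");

//Update the item and release the lock in a single call.
cacheClientA.PutAndUnlock("RadioInventory", 155, lockHandle, "electronics");

//Or release the lock without changing the value:
//cacheClientA.Unlock("RadioInventory", lockHandle, "electronics");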

Note

Transactions that span multiple operations are not supported. The application that uses the cache is responsible for determining the order in which locks are taken and for detecting deadlocks, if any.
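One common way to avoid deadlocks is to take locks in a consistent key order in every client. The following sketch assumes a second, hypothetical cached item named TvInventory in the same region; neither that key nor the inventory values come from the original topic.

//Sketch only: lock items in a fixed key order in every client to avoid deadlocks.
//"TvInventory" and the inventory values are hypothetical.
DataCacheLockHandle radioLock;
DataCacheLockHandle tvLock;

//Always lock "RadioInventory" before "TvInventory".
cacheClientA.GetAndLock("RadioInventory", TimeSpan.FromSeconds(30),
    out radioLock, "electronics");
cacheClientA.GetAndLock("TvInventory", TimeSpan.FromSeconds(30),
    out tvLock, "electronics");

//Update both items, releasing the locks in reverse order.
cacheClientA.PutAndUnlock("TvInventory", 40, tvLock, "electronics");
cacheClientA.PutAndUnlock("RadioInventory", 155, radioLock, "electronics");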

Warning

Locked objects in the cache can still be replaced by any cache client with the Put method. Cache-enabled applications are responsible for consistently using PutAndUnlock for items that use the pessimistic concurrency model.
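As a rough illustration of this hazard (reusing the client names and region from the examples above), a plain Put from another client succeeds even while the item is locked.

//Sketch only: a lock does not protect the item from an ordinary Put.
DataCacheLockHandle inventoryLock;

//cacheClientA holds a lock on the item...
cacheClientA.GetAndLock("RadioInventory", TimeSpan.FromSeconds(30),
    out inventoryLock, "electronics");

//...but a plain Put from another client still replaces the locked value.
//Lock-aware updates should use PutAndUnlock with the lock handle instead.
cacheClientB.Put("RadioInventory", 99, "electronics");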

See Also

Concepts

Concurrency Methods (Windows Server AppFabric Caching)
Windows Server AppFabric Caching Physical Architecture Diagram
Windows Server AppFabric Caching Logical Architecture Diagram
Developing a Cache Client (Windows Server AppFabric Caching)