Need a proposal for an ASP.NET in-memory queue

I have a requirement to create an HttpHandler that will serve up an image file (a simple static file) and also insert a record into a SQL Server table (e.g. http://site/some.img, where some.img is handled by the HttpHandler). I need an in-memory object (like a generic List) that I can add an item to on every request (and I have to account for a few hundred or even thousands of requests per second), and I should be able to dump this in-memory object into the SQL table using SqlBulkCopy.

List -> DataTable -> SqlBulkCopy
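That pipeline might look roughly like the sketch below. The ImageHit type, the column names, and the dbo.ImageHits table are illustrative assumptions, not part of the question; only the List -> DataTable conversion is shown executing, with the SqlBulkCopy step indicated in comments since it needs a live database:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
// using System.Data.SqlClient; // needed for the SqlBulkCopy step

// Hypothetical record type for one image request.
public class ImageHit
{
    public string Path { get; set; }
    public DateTime RequestedAt { get; set; }
}

public static class HitBuffer
{
    // Convert the buffered list into a DataTable whose columns
    // match the target SQL table, ready for SqlBulkCopy.
    public static DataTable ToDataTable(List<ImageHit> hits)
    {
        var table = new DataTable();
        table.Columns.Add("Path", typeof(string));
        table.Columns.Add("RequestedAt", typeof(DateTime));
        foreach (var hit in hits)
        {
            table.Rows.Add(hit.Path, hit.RequestedAt);
        }
        return table;
    }

    // The bulk-copy step itself (requires a real connection string and table):
    // using (var bulk = new SqlBulkCopy(connectionString))
    // {
    //     bulk.DestinationTableName = "dbo.ImageHits"; // hypothetical table name
    //     bulk.WriteToServer(table);
    // }
}
```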

I was thinking about using the Cache object: create a generic List, store it in HttpContext.Cache, and re-insert it every time a new item is added. This will NOT work, because the CacheItemRemovedCallback fires immediately when the HttpHandler tries to add a new item. So I cannot use the Cache object as an in-memory queue.

Can anyone suggest something? Can I scale in the future if the load is higher?





5 answers


Why does CacheItemRemovedCallback fire when you add something to the queue? That doesn't make sense to me... Even if it does fire, there is nothing you need to do there. Perhaps I am misunderstanding your requirements?

I have used the Cache object this way quite successfully. That is what it is for, and it scales very well. I kept a Hashtable that was accessed on every page request in the application and refreshed/cleared as needed.



Option two... do you really need a queue? SQL Server will scale very well if you just write directly to the DB. Use a shared connection object and/or connection pooling.





How about just using a shared list to store the requests and a separate thread to run the SqlBulkCopy?

That way, recording the request in the list won't block the response for long, and the background thread can push the data to SQL Server on its own schedule, say every 5 minutes.
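A minimal sketch of that idea, with a thread-safe queue standing in for the shared list (the names here are illustrative; ConcurrentQueue requires .NET 4+, and on older runtimes a List guarded by a lock works the same way):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

public static class RequestLog
{
    // Thread-safe queue shared by all request threads.
    private static readonly ConcurrentQueue<string> Pending = new ConcurrentQueue<string>();

    // Called from the HttpHandler: cheap, non-blocking enqueue per request.
    public static void Record(string path)
    {
        Pending.Enqueue(path);
    }

    // Called from the background thread: drain everything currently queued.
    // The caller would then hand the batch to SqlBulkCopy.
    public static List<string> DrainBatch()
    {
        var batch = new List<string>();
        string item;
        while (Pending.TryDequeue(out item))
        {
            batch.Add(item);
        }
        return batch;
    }

    // A background timer could trigger the flush every 5 minutes, e.g.:
    // var timer = new Timer(_ => Flush(DrainBatch()), null,
    //                       TimeSpan.FromMinutes(5), TimeSpan.FromMinutes(5));
}
```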



You could even fake a background timer with the Cache itself by doing the work in a CacheItemRemovedCallback: insert a placeholder object with a 5-minute expiration, and re-insert it at the end of each processing run.





Thanks to Alex and Bryan for your suggestions.

Bryan: When I try to replace the List object in the Cache on the second request (the counter should now be 2), the CacheItemRemovedCallback fires because I am replacing the current cache entry with a new one. Initially I also thought this was strange behavior, so I have to look deeper into it. As for your second suggestion, I'll try inserting a record directly (with a cached SqlConnection object) and see what kind of performance I get under a stress test. I doubt I will get fantastic numbers, since it is an I/O operation.

Meanwhile, I will keep digging on my side for an optimal solution, with your suggestions in mind.





You can add a condition in the callback to make sure you are acting on a cache entry that was removed due to expiration, rather than one that was removed or replaced explicitly (in VB, as I originally wrote it):

Private Shared Sub CacheRemovalCallbackFunction(ByVal cacheKey As String, ByVal cacheObject As Object, ByVal removalReason As Web.Caching.CacheItemRemovedReason)
    Select Case removalReason
        Case Web.Caching.CacheItemRemovedReason.Expired, Web.Caching.CacheItemRemovedReason.DependencyChanged, Web.Caching.CacheItemRemovedReason.Underused
        ' By leaving off Web.Caching.CacheItemRemovedReason.Removed, this will exclude items that are replaced or removed explicitly (Cache.Remove) '
    End Select
End Sub


Edit: Here it is in C#, if you need it:

private static void CacheRemovalCallbackFunction(string cacheKey, object cacheObject, System.Web.Caching.CacheItemRemovedReason removalReason)
{
    switch(removalReason)
    {
        case System.Web.Caching.CacheItemRemovedReason.DependencyChanged:
        case System.Web.Caching.CacheItemRemovedReason.Expired:
        case System.Web.Caching.CacheItemRemovedReason.Underused:
            // This excludes the option System.Web.Caching.CacheItemRemovedReason.Removed, which is triggered when you overwrite a cache item or remove it explicitly (e.g., HttpRuntime.Cache.Remove(key))
            break;
    }
}






To expand on my previous comment... I misread how you are thinking about the cache. If you have an object stored in the cache, say a Hashtable, any update to that Hashtable persists without explicitly re-storing it in the cache. You only need to add the Hashtable to the cache once, either at application startup or on the first request.

If you are worried about contention between incoming page requests and the bulk copy, then I suggest you simply keep two cached lists: one that is updated as page requests come in, and one for the bulk copy operation. When a bulk copy completes, swap the lists and repeat. This is similar to double buffering video memory in video games or video applications.
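The double-buffering idea can be sketched like this (names are illustrative; the lock keeps writers off the list that is about to be handed to the bulk copy, and whether the lists live in the Cache or a static field is an implementation choice):

```csharp
using System.Collections.Generic;

public static class DoubleBufferedLog
{
    // The "front" list receives new entries from request threads.
    private static List<string> _front = new List<string>();
    private static readonly object _writeLock = new object();

    // Called from the HttpHandler on each request.
    public static void Add(string entry)
    {
        lock (_writeLock)
        {
            _front.Add(entry);
        }
    }

    // Called by the bulk-copy thread: take the filled list and leave a
    // fresh empty one behind, so writers never touch the list being copied.
    public static List<string> Swap()
    {
        lock (_writeLock)
        {
            var filled = _front;
            _front = new List<string>();
            return filled;
        }
    }
}
```

The bulk-copy thread would call Swap, convert the returned list to a DataTable, and run SqlBulkCopy on it while new requests keep filling the fresh list.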









