A last word on Azure Queues (Performance)

First off, my most sincere apologies for taking so long to get back to this blog. Between work and my personal life, my time has been more limited than I would have liked. It also doesn't help that this last project took substantially more effort than I had hoped. But at long last, here I sit on a lovely Sunday morning putting this post together to share my results. I'd like to preface by saying that while I'm happy with some aspects of what I've done, there's still much left to do. Hopefully I'll be able to come back to it at some point.

Some time ago, someone came by the MSDN Windows Azure forums and asked a question regarding the performance of Azure Queues. They didn't just want to know something simple like call performance; they wanted to know about throughput, from the initial request until the final response was received. So over the last month I put together what I think is a fairly solid test sample. The solution involves a web role for initializing the test and monitoring the results, and a multi-threaded worker role that actually performs the test. Multiple worker roles could also have been used, but I wanted to create a sample that anyone in the CTP or using the local development fabric could easily execute.

My test app features several improvements over my previous queue examples. I added a "clear" method to my AzureQueue class, which helps make sure that queues get emptied when I need them to be. I also made sure my queue messages were built as objects. I created a base class, QueueMsg, which handles the work of serializing and de-serializing messages for queuing, then created two derived classes, StartMsg and StatusMsg. This approach lets me create a message object, set its properties, and send it to the queue easily. Like so…

            // build the message
            StartMsg tmpStartMsg = new StartMsg(int.Parse(ddlMsgSize.SelectedValue),
                int.Parse(ddlIterations.SelectedValue));

            // serialize the message and put it into the queue
            string tmpMsg = tmpStartMsg.Serialize();
            _RequestQueue.Put(Server.UrlEncode(tmpMsg));
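I'm not posting the full message classes inline, but the pattern is simple enough to sketch. Something along these lines is all the base class needs to do (this is an illustrative sketch, not the exact code from the project — property names beyond what you see in the snippet above are my own):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Base class that handles XML serialization for queue messages.
// (Sketch only -- the real project's details may differ.)
public abstract class QueueMsg
{
    // Serialize the concrete message type to an XML string.
    public string Serialize()
    {
        XmlSerializer serializer = new XmlSerializer(this.GetType());
        using (StringWriter writer = new StringWriter())
        {
            serializer.Serialize(writer, this);
            return writer.ToString();
        }
    }

    // Rehydrate a message of type T from its XML string.
    public static T Deserialize<T>(string xml) where T : QueueMsg
    {
        XmlSerializer serializer = new XmlSerializer(typeof(T));
        using (StringReader reader = new StringReader(xml))
        {
            return (T)serializer.Deserialize(reader);
        }
    }
}

// One of the two concrete message types used by the test.
public class StartMsg : QueueMsg
{
    public int MsgSize { get; set; }     // payload size in bytes
    public int Iterations { get; set; }  // number of round trips to run

    public StartMsg() { }                // required by XmlSerializer
    public StartMsg(int msgSize, int iterations)
    {
        MsgSize = msgSize;
        Iterations = iterations;
    }
}
```

StatusMsg follows the same pattern, just with different properties, which is the whole point of pushing the serialization work into the base class.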

As usual, I've kept my implementation pretty basic. I didn't put XML attribute tags on my classes, so the defaults are used. I also didn't put in any checks to make sure my messages don't exceed the 8k size limit of queues. You'll want to strongly consider both of these issues in any real-world implementation.
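If you do add that size check, it's only a few lines. Note that it's the final, encoded message body that has to fit under the limit, so check after URL-encoding. Here's one way it might look (MaxMessageSize and EnsureFits are my own names for illustration, not anything from the SDK):

```csharp
using System;
using System.Text;

public static class QueueLimits
{
    // Queue messages are capped at 8 KB in the CTP.
    public const int MaxMessageSize = 8 * 1024;

    // Throws if the encoded message body would exceed the queue limit.
    public static void EnsureFits(string encodedMsg)
    {
        int size = Encoding.UTF8.GetByteCount(encodedMsg);
        if (size > MaxMessageSize)
            throw new ArgumentException(string.Format(
                "Message is {0} bytes; queue limit is {1} bytes.",
                size, MaxMessageSize));
    }
}
```

You'd call this just before the Put, on the same string you're about to enqueue.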

Next up was figuring out how to actually simulate the load. I decided to go with a worker role that contains three threads. The root thread is the controller, responsible for initializing and monitoring the processing as well as reporting progress back to the web role. There is an initiator sub-thread that starts the loop and watches for the close of the processing loop. Finally, there is the relay sub-thread, which does nothing more than look for a message from the initiator and relay it back. As with my queue message classes, both of these sub-threads are based on classes that inherit from a single base class.
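To make the shape of that concrete, here's a stripped-down sketch of the same three-thread layout. In-memory ConcurrentQueue objects stand in for the two Azure queues, all the timing and reporting code is omitted, and the class and member names are my own for illustration:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

public class RelayTest
{
    // In a real run these would be two Azure queues; here they're in-memory stand-ins.
    private readonly ConcurrentQueue<string> _outbound = new ConcurrentQueue<string>();
    private readonly ConcurrentQueue<string> _inbound = new ConcurrentQueue<string>();

    public int CompletedRoundTrips { get; private set; }

    // Controller: start both sub-threads and wait for the test to finish.
    public void Run(int iterations)
    {
        Thread relay = new Thread(() => RelayLoop(iterations));
        Thread initiator = new Thread(() => InitiatorLoop(iterations));
        relay.Start();
        initiator.Start();
        initiator.Join();
        relay.Join();
    }

    // Initiator: send a message, wait for the relay to bounce it back, repeat.
    private void InitiatorLoop(int iterations)
    {
        for (int i = 0; i < iterations; i++)
        {
            _outbound.Enqueue("ping " + i);
            string reply;
            while (!_inbound.TryDequeue(out reply))
                Thread.Sleep(1);   // poll, much as you would poll a queue Get()
            CompletedRoundTrips++;
        }
    }

    // Relay: do nothing but look for a message and send it back.
    private void RelayLoop(int iterations)
    {
        for (int i = 0; i < iterations; i++)
        {
            string msg;
            while (!_outbound.TryDequeue(out msg))
                Thread.Sleep(1);
            _inbound.Enqueue(msg);
        }
    }
}
```

The real sample wraps each loop's read, write, and delete calls in timers, which is where the per-operation numbers below come from.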

The final piece was the parameters of the test. We wanted to vary both the size of the message (aka payload) being sent, as well as how many iterations to execute.

I’m not going to post all the code here in the blog, but I have uploaded it so you can pull the project down and test it yourself.

Of course, the end result is to get some data on queue performance:

  Iterations: 1000
  Total Execution Time: 00:05:45.9060000

  Avg. Read: 60 ms
  Avg. Write: 36 ms
  Avg. Delete: 42 ms
  Avg. Round Trip: 345 ms

  Fastest Read: 31 ms    Slowest Read: 203 ms
  Fastest Write: 31 ms    Slowest Write: 234 ms
  Fastest Delete: 0 ms    Slowest Delete: 593 ms

All in all, I'm fairly pleased with these results. IMHO, queues are not meant to be a bulk processor of data but simply a reliable method of delivering messages between processes. These tests show that Azure Queues definitely serve this purpose, especially given that the service exists in the cloud. Latencies will be higher if you're running things locally while accessing hosted storage, but they're still respectable enough. I can honestly see many future uses for queues and expect that they will play a key role in many Windows Azure applications.

As I mentioned, this solution isn't as mature as I'd like. There are a couple of issues with the web role that still need to be worked out. It has problems with response messages being displayed out of order, as well as messages not always being deleted properly after they're read from the queue (which is likely causing the out-of-order issue). This is annoying, but fortunately it does not impact the final results if you let a test run through to completion.

The final thing that still confounds me is the delay before we start seeing results come back. It's almost as if there is a substantial delay between when the web role sends the request via a queue to the worker role and when the worker starts processing. I've pondered it a bit, and all I can think of is that since Azure Storage runs as a scalable, cloud-based service, there is a caching-related delay between when two processes (which are presumably being routed to different instances of the Azure Storage service) send messages. It's also possible that the delay is between when a queue message is written to Azure Storage's backend and when it's then visible to be read. If anyone from the Azure team knows the answer to this, please let me know. 🙂

Sorry again for the delay in getting this posted, as well as the rough state the code is in. But at least it's finished, and I can now move on to my next big topic… .NET Services. 🙂
