Introduction to Azure Storage Analytics (YOA Week 6)

Since my update on Storage Analytics last week was so short, I really wanted to dive back into it this week. And fortunately, it’s new enough that there was some new ground to tread here. Which is great, because I hate just putting up another blog post that doesn’t really add anything new.

Steve Marx posted his sample app last week and gave us a couple of nice methods for updating the storage analytics settings. The Azure Storage team also did two solid posts on working with Metrics and Logging. However, neither of them dove deep into working with the API itself, and I wanted more meat on how to do exactly that. By digging through Steve’s code and the MSDN documentation on the API, I can hopefully shed some additional light on this.

Storage Service Properties (aka enabling logging and metrics)

So the first step is turning this on. Well, actually it’s understanding what we’re turning on and why, but we’ll get to that in a few. Steve posted on his blog a sample ‘Save’ method. This is an implementation of the Azure Storage Analytics API’s “Set Storage Service Properties” call. However, the key to that method is an XML document that contains the analytics settings. It looks something like this:

<?xml version="1.0" encoding="utf-8"?>
<StorageServiceProperties>
  <Logging>
    <Version>version-number</Version>
    <Delete>true|false</Delete>
    <Read>true|false</Read>
    <Write>true|false</Write>
    <RetentionPolicy>
      <Enabled>true|false</Enabled>
      <Days>number-of-days</Days>
    </RetentionPolicy>
  </Logging>
  <Metrics>
    <Version>version-number</Version>
    <Enabled>true|false</Enabled>
    <IncludeAPIs>true|false</IncludeAPIs>
    <RetentionPolicy>
      <Enabled>true|false</Enabled>
      <Days>number-of-days</Days>
    </RetentionPolicy>
  </Metrics>
</StorageServiceProperties>

Cool stuff, but what does it mean? Well, fortunately it’s all explained in the API documentation. Also fortunately, I won’t make you click a link to look at it. I’m nice that way.

Version – the service version / interface number, there to help with service versioning later on. Just use “1.0” for now.

Logging->Read/Write/Delete – these nodes determine whether we’re going to log reads, writes, or deletes, so you can get just the granularity of logging you want.

Metrics->Enabled – turns metrics capture on/off.

Metrics->IncludeAPIs – set to true if you want to include capture of statistics for your API operations (like saving/updating the analytics settings). At least I think that’s what it does; I’m still playing with/researching this one.

RetentionPolicy – use this to enable/disable a retention policy and set the number of days to retain information. Without a policy, data will be retained FOREVER, or at least until your 20TB limit is reached. So I recommend you set a policy and leave it on at all times. The maximum value you can set is 365 days. To learn more about retention policies, check out the MSDN article on them.
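
To make those settings concrete, here’s a minimal sketch of building that payload in code. This is strictly my own illustration (the BuildAnalyticsSettings helper and its parameters are mine, not part of the API or Steve’s sample), using System.Xml.Linq:

using System;
using System.Xml.Linq;

// My own illustrative helper, not part of the StorageClient library or Steve's sample.
static string BuildAnalyticsSettings(bool read, bool write, bool delete,
    bool metricsEnabled, bool includeApis, int retentionDays)
{
    Func<bool, string> b = v => v ? "true" : "false"; // the service expects lowercase booleans

    var props = new XElement("StorageServiceProperties",
        new XElement("Logging",
            new XElement("Version", "1.0"),
            new XElement("Delete", b(delete)),
            new XElement("Read", b(read)),
            new XElement("Write", b(write)),
            new XElement("RetentionPolicy",
                new XElement("Enabled", "true"),
                new XElement("Days", retentionDays))),
        new XElement("Metrics",
            new XElement("Version", "1.0"),
            new XElement("Enabled", b(metricsEnabled)),
            new XElement("IncludeAPIs", b(includeApis)),
            new XElement("RetentionPolicy",
                new XElement("Enabled", "true"),
                new XElement("Days", retentionDays))));

    return props.ToString();
}

The string this returns is the sort of thing we’ll be sending as the request body in the code below.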

Setting Service Properties

Now Steve wrote a slick little piece of code, but given that I’m not what I’d call “MVC fluent” (I’ve been spending too much time doing middle/back-end services, I guess), it took a bit of deciphering, at least for me, to figure out what was happening. And I’ve done low-level Azure Storage REST operations before. So I figured I’d take a few minutes to explain what’s happening in his “Save” method.

First off, Steve sets up the HTTP request we’re going to send to Azure Storage:

// grab the storage account credentials the sample stashed in cookies when you entered them
var creds = new StorageCredentialsAccountAndKey(Request.Cookies["AccountName"].Value, Request.Cookies["AccountKey"].Value);
// 'service' is "blob", "queue", or "table", posted up by the page's javascript
var req = (HttpWebRequest)WebRequest.Create(string.Format("http://{0}.{1}.core.windows.net/?restype=service&comp=properties", creds.AccountName, service));
req.Method = "PUT";
req.Headers["x-ms-version"] = "2009-09-19"; // analytics requires this API version header

So this code snags the Azure Storage account credentials from the cookies (where they were stored when you entered them). They are then used to generate an HttpWebRequest object using the account name and the service (blob/table/queue) we want to update the settings for. Lastly, we set the method and the x-ms-version header on the request. Note: the service name was posted to this method by the javascript on Steve’s MVC-based page.

Next up, we need to digitally sign our request using the account credentials and the length of our XML analytics configuration document.

req.ContentLength = Request.InputStream.Length; // the body is the XML settings document
if (service == "table")
    creds.SignRequestLite(req); // table service signs with Shared Key Lite
else
    creds.SignRequest(req);     // blob and queue use full Shared Key

Now what’s happening here is that our XML document came to this code-behind method via the javascript/AJAX post, and is available through Request.InputStream. We sign the request using the StorageCredentialsAccountAndKey object we created earlier, doing a SignRequestLite (the Shared Key Lite authentication scheme) for a call to the table service, or a SignRequest (full Shared Key) for the blob or queue service.

Next up, we need to copy our XML configuration settings to our request object…

using (var stream = req.GetRequestStream())
{
    // stream the XML payload from the inbound AJAX request into the outbound storage request
    Request.InputStream.CopyTo(stream);
    stream.Close();
}

This chunk of code uses GetRequestStream to get the stream we’ll copy our payload into, copies it over, then closes the stream so we’re ready to send the request.

try
{
    // GetResponse is what actually sends the request to the storage service
    req.GetResponse();
    return new EmptyResult();
}
catch (WebException e)
{
    // hand the storage service's error body back to the AJAX caller
    Response.StatusCode = 500;
    Response.TrySkipIisCustomErrors = true;
    return Content(new StreamReader(e.Response.GetResponseStream()).ReadToEnd());
}

It’s that first line that we care about: req.GetResponse() will send our request to the Azure Storage service. The rest of this snippet is really just about exception handling and returning results back to the AJAX code.
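
If, like me, the MVC plumbing gets in the way, here’s the same flow pulled out into a standalone method. Treat it as a sketch: the method name and parameters are mine, and you’d pass in a settings document like the one we built earlier.

using System;
using System.Net;
using System.Text;
using Microsoft.WindowsAzure;

// A standalone sketch of the same PUT, outside of MVC (method name/parameters are mine).
static void SetServiceProperties(string accountName, string accountKey, string service, string settingsXml)
{
    var creds = new StorageCredentialsAccountAndKey(accountName, accountKey);
    var req = (HttpWebRequest)WebRequest.Create(
        string.Format("http://{0}.{1}.core.windows.net/?restype=service&comp=properties",
            accountName, service));
    req.Method = "PUT";
    req.Headers["x-ms-version"] = "2009-09-19";

    byte[] payload = Encoding.UTF8.GetBytes(settingsXml);
    req.ContentLength = payload.Length;

    if (service == "table")
        creds.SignRequestLite(req); // table: Shared Key Lite
    else
        creds.SignRequest(req);     // blob/queue: full Shared Key

    using (var stream = req.GetRequestStream())
        stream.Write(payload, 0, payload.Length);

    using (req.GetResponse()) { } // a successful set returns 202 (Accepted)
}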

Where to Next

I had hoped to have time this week to create a nice little wrapper around the XML payload, so you could just have an analytics configuration object that you could hand a connection to and set properties on, but I ran out of time (again). I hope to get to it and actually put something out before we get the official update to the StorageClient library. Meanwhile, I think you can see how easy it is to generate your own REST requests to get (sketched just below) and set (which we walked through) the Azure Storage Analytics settings.
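
And since we didn’t walk through the “get” side, here’s a rough sketch of the matching “Get Storage Service Properties” call. Same caveats as above: the method name and parameters are mine.

using System.IO;
using System.Net;
using Microsoft.WindowsAzure;

// A rough sketch of the matching GET; the response body is the same XML schema
// shown at the top of this post.
static string GetServiceProperties(string accountName, string accountKey, string service)
{
    var creds = new StorageCredentialsAccountAndKey(accountName, accountKey);
    var req = (HttpWebRequest)WebRequest.Create(
        string.Format("http://{0}.{1}.core.windows.net/?restype=service&comp=properties",
            accountName, service));
    req.Method = "GET";
    req.Headers["x-ms-version"] = "2009-09-19";

    if (service == "table")
        creds.SignRequestLite(req);
    else
        creds.SignRequest(req);

    using (var resp = req.GetResponse())
    using (var reader = new StreamReader(resp.GetResponseStream()))
        return reader.ReadToEnd();
}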

For more information, be sure to check out Steve Marx’s sample project and the MSDN Storage Analytics API documentation.

3 Responses to Introduction to Azure Storage Analytics (YOA Week 6)

  1. Hi Brent,

    You may want to check this out: http://www.cerebrata.com/Blog/post/Cerebrata-Windows-Azure-Storage-Analytics-Configuration-Utility-A-Free-Utility-to-Configure-Windows-Azure-Storage-Analytics.aspx.

This is a free utility for configuring storage analytics that we released a few hours ago.

    Thanks

  2. Pingback: Windows Azure and Cloud Computing Posts for 8/11/2011+ - Windows Azure Blog
