In my previous post I talked about using blobs for storage in Windows Azure. In this article we will talk about tables, which are similar in many respects to blobs, though, as you might expect, with a higher degree of organization.
Let's first try to understand how tables are organized. Many of you are thinking of SQL tables, and you are partially correct, but a table is really more of a bucket for storing entities with a similar structure. Your class definition defines the table structure; you are not explicitly creating a table with columns. That said, Azure does place one requirement on these classes: they must inherit from TableServiceEntity. Here is the code for our PictureEntry object:
public class PictureEntry : TableServiceEntity
{
    public string Name { get; set; }
    public byte[] Picture { get; set; }
    public string Type { get; set; }
    public int Length { get; set; }

    public PictureEntry()
    {
        // PartitionKey and RowKey are inherited from TableServiceEntity
        PartitionKey = "picture";
        RowKey = Name + Guid.NewGuid();
    }
}
The key thing to notice in this snippet is the constructor, which references the two properties given to you via inheritance. Remember, Azure tables ARE NOT SQL tables. The PartitionKey is used for partitioning your entities: since you are not creating separate tables for each kind of data, the PartitionKey separates the entities for different applications/uses within the same bucket, which makes it roughly analogous to defining a table. The RowKey exists to uniquely identify a row within a partition; think of an auto-incrementing identity field.
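To make the partitioning idea concrete, here is a minimal sketch (not from the original code) showing how a second, hypothetical entity type could live in the same "Pictures" table under a different PartitionKey, and how a query scoped to one partition might look; the ThumbnailEntry class and the context variable are assumptions for illustration only:

public class ThumbnailEntry : TableServiceEntity
{
    public string Name { get; set; }

    public ThumbnailEntry()
    {
        // A different PartitionKey keeps thumbnails logically separate
        // from full pictures, even though both sit in the same bucket.
        PartitionKey = "thumbnail";
        RowKey = Guid.NewGuid().ToString();
    }
}

// Querying only the "picture" partition from a TableServiceContext
// instance (here called context):
var pictures = context.CreateQuery<PictureEntry>("Pictures")
                      .Where(e => e.PartitionKey == "picture")
                      .ToList();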
The following picture is taken from Cloudy in Seattle:
So, to understand how this fits together, let's first talk about the setup. At the end of my previous post I noted that you have to modify the WebRole.OnStart method to get things to work, because the configuration setting publisher must be established. Here is the code for OnStart:
public override bool OnStart()
{
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        // Provide the configSetter with the initial value
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));

        RoleEnvironment.Changed += (sender, arg) =>
        {
            if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>().Any(
                change => change.ConfigurationSettingName == configName))
            {
                // If the new value cannot be applied, recycle the role
                if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                {
                    RoleEnvironment.RequestRecycle();
                }
            }
        };
    });

    return base.OnStart();
}
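If you skip this step, CloudStorageAccount.FromConfigurationSetting will throw when it is first called. For quick local experiments there are also alternatives that bypass the configuration publisher entirely; this is a sketch of my own, not something the post itself uses:

// Target the local storage emulator directly...
var devAccount = CloudStorageAccount.DevelopmentStorageAccount;

// ...or parse an explicit connection string (here the emulator shortcut).
var parsedAccount = CloudStorageAccount.Parse("UseDevelopmentStorage=true");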
Our next step is to initialize the storage location. Here we will provide a name for our bucket. Below is the code snippet:
var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
account.CreateCloudTableClient().DeleteTableIfExist("Pictures");
account.CreateCloudTableClient().CreateTable("Pictures");
This is sort of backwards compared to the approach Azure recommends; however, for this experimental application I wanted to simply remove the existing table and recreate it whenever this method is invoked. The table is, quite literally, a bucket for entities, as the diagram above shows. It is important to remember, however, that these tables ARE NOT relational, and many of the aspects of T-SQL programming we take advantage of are not supported here; that is what SQL Azure handles.
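For a less destructive setup, closer to what the Azure guidance suggests, you would typically only create the table when it is missing. A minimal sketch of that variant, assuming the same account variable from the snippet above:

// Create the table only if it does not already exist,
// instead of dropping and recreating it on every call.
var tableClient = account.CreateCloudTableClient();
tableClient.CreateTableIfNotExist("Pictures");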
To actually add and remove PictureEntry instances from our table, we are going to employ a repository pattern using Azure’s managed library classes. Here is our context layer:
public class PictureEntryDataContext : TableServiceContext
{
    private CloudStorageAccount _account;

    public List<PictureEntry> Pictures
    {
        get
        {
            var result = PictureQueryList;
            if (result == null)
                return new List<PictureEntry>();

            return result.ToList();
        }
    }

    private IQueryable<PictureEntry> PictureQueryList
    {
        get
        {
            if (!DoesSchemaExist)
                return null;

            return CreateQuery<PictureEntry>("Pictures");
        }
    }

    public PictureEntryDataContext(CloudStorageAccount account)
        : base(account.TableEndpoint.ToString(), account.Credentials)
    {
        _account = account;
    }

    public void AddPictureEntry(PictureEntry entry)
    {
        AddObject("Pictures", entry);
    }

    public PictureEntry GetPictureByRowKey(string rowKey)
    {
        return Pictures.FirstOrDefault(p => p.RowKey == rowKey);
    }

    #region Schema Methods
    public bool DoesSchemaExist
    {
        get
        {
            return _account.CreateCloudTableClient().DoesTableExist("Pictures");
        }
    }
    #endregion
}
In this case, we are defining a context class that inherits from TableServiceContext. This class gives you many of the methods seen in the Entity Framework, such as SaveChanges, but it is specifically for working with Azure and cloud-based tables. For the most part this code is fairly straightforward. What links this class to the cloud is the constructor, which takes a CloudStorageAccount as a parameter; we have seen this class frequently between this entry and the last.
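As a quick illustration of using the context (not shown in the original code), an MVC Index action could hand the Pictures list straight to the view, something along these lines, assuming the same "DataConnectionString" setting:

public ActionResult Index()
{
    // Resolve the storage account, build the context, and hand the
    // list of PictureEntry objects to the view as its model.
    var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    var ctx = new PictureEntryDataContext(account);

    return View(ctx.Pictures);
}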
So let's talk about how we would add a picture to our table. To be fair, I used WebForms in the last entry; this time I will use ASP.NET MVC. Here is my action receiving the form with the file upload:
public ActionResult AddPicture(string name, HttpPostedFileWrapper uploadedFile)
{
    var entry = new PictureEntry
    {
        Name = name,
        Length = uploadedFile.ContentLength,
        Picture = new BinaryReader(uploadedFile.InputStream)
            .ReadBytes(uploadedFile.ContentLength),
        Type = uploadedFile.ContentType
    };

    var account =
        CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    var ctx = new PictureEntryDataContext(account);
    ctx.AddPictureEntry(entry);
    ctx.SaveChanges();

    return Index();
}
I honestly didn't know you could do this; I just decided to try it based on my understanding of the theory. Who knew you could take the uploaded file as an action parameter instead of pulling it from Request.Files? Our only goal here is to get the binary stream into byte-array form so we can assign it to the PictureEntry instance we are creating. After that it's just basic interaction with our context instance.
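For comparison, here is a hedged sketch of the Request.Files approach mentioned above; the action name, the "uploadedFile" form field name, and the redirect are my assumptions, not code from the post (parameter binding in the action above also relies on the form field being named uploadedFile):

public ActionResult AddPictureViaRequest(string name)
{
    // Pull the file out of the request instead of binding it as a parameter.
    var uploadedFile = Request.Files["uploadedFile"];
    if (uploadedFile == null || uploadedFile.ContentLength == 0)
        return RedirectToAction("Index");

    var entry = new PictureEntry
    {
        Name = name,
        Length = uploadedFile.ContentLength,
        Picture = new BinaryReader(uploadedFile.InputStream)
            .ReadBytes(uploadedFile.ContentLength),
        Type = uploadedFile.ContentType
    };

    // ...then persist through PictureEntryDataContext exactly as above.
    return RedirectToAction("Index");
}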
Like the Blob example, we would like to be able to show the user what we have stored in the table, however, in ASP .NET MVC we dont normally think of using ASHX, its kind of against the convention. The better method would be to call an action which returns binary data. After some looking I was surprised that this did not come built-in meaning I would have to write something myself. As it turns out people have run into this issue and a solution exists using BinaryResult (code is courtesy of Jim Guerts). Note: I am told that a version of this code exists in MvcContrib, but I couldnt find BinaryResult in the most recent release.
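The post doesn't reproduce BinaryResult itself, but a minimal sketch of such a custom ActionResult (my own approximation, not Jim Guerts' exact code) looks roughly like this:

// A bare-bones ActionResult that writes a byte array to the response
// with the supplied content type.
public class BinaryResult : ActionResult
{
    public byte[] Data { get; set; }
    public string ContentType { get; set; }

    public override void ExecuteResult(ControllerContext context)
    {
        var response = context.HttpContext.Response;
        response.ContentType = ContentType;
        response.BinaryWrite(Data);
    }
}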
With this and my context in place, I only needed to pass the RowKey, using it as an ID to extract the image's binary data:
public ActionResult GetImage(string id)
{
    var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    var ctx = new PictureEntryDataContext(account);

    var pictureEntry = ctx.GetPictureByRowKey(id);
    if (pictureEntry == null)
        return ViewContents();

    return new BinaryResult()
    {
        Data = pictureEntry.Picture,
        ContentType = pictureEntry.Type
    };
}
If the image does not exist, I simply return an empty view which would result in a broken image being displayed. Here is the code for the view that uses GetImage:
<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
    <h2>View</h2>
    <table>
        <tr>
            <th>Name</th>
            <th>Type</th>
            <th>Picture</th>
            <th></th>
        </tr>
        <% foreach (var entry in Model) { %>
            <tr>
                <td><%= entry.Name %></td>
                <td><%= entry.Type %></td>
                <td><img src="/Home/GetImage/<%= entry.RowKey %>" /></td>
                <td>[ <%= Html.ActionLink("Delete", "RemovePicture", "Home",
                        new { id = entry.RowKey }, new { @class = "removeLink" }) %> ]</td>
            </tr>
        <% } %>
    </table>
</asp:Content>
I always hate how this code looks; soon I hope to make Spark my default view engine for all things, but for now I am still using the WebForms view engine and getting ugly code like this, which reminds me too much of ASP 3.0.
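On a related note, the Delete link in that view points at a RemovePicture action the post doesn't show. A hedged sketch of what such an action could look like, using DeleteObject from the underlying DataServiceContext; the action name and the redirect are assumptions on my part:

public ActionResult RemovePicture(string id)
{
    var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    var ctx = new PictureEntryDataContext(account);

    // Look the entity up by RowKey and remove it from the table.
    var pictureEntry = ctx.GetPictureByRowKey(id);
    if (pictureEntry != null)
    {
        ctx.DeleteObject(pictureEntry);
        ctx.SaveChanges();
    }

    return RedirectToAction("Index");
}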
To conclude, working with tables in this simple example is not that much different than working with blobs. Tables do provide a bit more structure, and they replace metadata with the kind of properties that most developers who have worked with .NET are intimately familiar with. Because of that structure, I prefer tables to blobs.
Download the Code here (Visual Studio 2010)
http://cid-630ed6f198ebc3a4.skydrive.live.com/embedicon.aspx/Public/PictureStoreTables.zip