
With Windows Azure we can use the Windows Azure SQL Database service (formerly known as SQL Azure) when we need a relational database. Microsoft offers a 99.9% monthly SLA for this service. But Microsoft doesn’t provide a service to restore your data to a moment back in time when you (accidentally) delete or corrupt data.

With an on-premises Microsoft SQL Server installation you would solve this by configuring and scheduling (transaction log) backups. But this isn’t possible for the Windows Azure service.

You could replicate a copy of your database using SQL Data Sync with a delay. But if you fail to discover the issue before the next scheduled sync, your copy won’t make a difference.

To have a backup in time, I wrote some code that allows a Worker Role to back up my Windows Azure SQL Database every hour. Most solutions you find online rely on the REST endpoints, but the address of the endpoint differs depending on which data center hosts your database. I found a different solution, using the DacServices class, where you only need the connection string to your database.

Prerequisites

To make a backup I use the Microsoft SQL Server 2012 Data-Tier Application Framework. This framework provides the DacServices class.

You need to configure Local Storage (named “TempFiles” in the code below) to store the backup temporarily, and you need a Storage Account to store the backup file permanently.
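
For CloudStorageAccount.FromConfigurationSetting (used in the code below) to be able to read the “StorageAccount” setting, the classic Windows Azure SDK requires a configuration setting publisher to be registered first, typically in the Worker Role’s OnStart. A minimal sketch:

// Register the configuration setting publisher (classic Azure SDK pattern),
// otherwise CloudStorageAccount.FromConfigurationSetting will throw.
public override bool OnStart()
{
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)));

    return base.OnStart();
}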

The solution
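
The code below assumes the classic (1.x) storage client library, which matches the CreateIfNotExist and UploadFile calls, and the following using directives:

using System;
using System.Configuration;                  // ConfigurationManager
using System.IO;                             // File
using Microsoft.SqlServer.Dac;               // DacServices
using Microsoft.WindowsAzure;                // CloudStorageAccount
using Microsoft.WindowsAzure.ServiceRuntime; // RoleEnvironment, LocalResource
using Microsoft.WindowsAzure.StorageClient;  // CloudBlobClient, blob types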

// Get the Storage Account
var backupStorageAccount = 
                CloudStorageAccount.FromConfigurationSetting("StorageAccount");

// The container to store backups
var backupBlobClient = backupStorageAccount.CreateCloudBlobClient();
var backupContainer = backupBlobClient.GetContainerReference("backups");
backupContainer.CreateIfNotExist();

// The backup file on blob storage
var storageName = 
  string.Format("Backup_{0}.bacpac", DateTime.Now.ToString("yyyyMMdd-HHmmss"));
var backupFile = backupContainer.GetBlobReference(storageName);

// Get a reference to the temporary files
var localResource = RoleEnvironment.GetLocalResource("TempFiles");
var file = string.Format("{0}{1}", localResource.RootPath, backupFile.Name);

// Connect to the DacServices
var services = new DacServices(ConfigurationManager
                          .ConnectionStrings["DatabaseName"].ConnectionString);
services.Message += (sender, e) =>
    {
        // If you use a lock file, 
        // this would be a good location to extend the lease
    };

// Export the database to the local disk
services.ExportBacpac(file, "DatabaseName");

// Upload the file to Blob Storage
backupFile.Properties.ContentType = "application/octet-stream";
backupFile.UploadFile(file);

// Remove the temporary file
File.Delete(file);

Explanation

The code does the following:

  1. Connect to the Storage Account
  2. Get a reference to the container and file where you want to store the backup
  3. Get a reference to the local storage location
  4. Connect to the DacServices using the database connection string
  5. Export the database to a “.bacpac” file
  6. Upload the file to blob storage
  7. Delete the local file
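
As mentioned earlier, the backup should run every hour. A minimal sketch of how the Worker Role’s Run method could drive this, assuming the code above is wrapped in a hypothetical BackupDatabase method (Trace and Thread come from System.Diagnostics and System.Threading):

public override void Run()
{
    while (true)
    {
        try
        {
            // Hypothetical method wrapping the backup code shown above
            BackupDatabase();
        }
        catch (Exception exception)
        {
            Trace.TraceError("Backup failed: {0}", exception);
        }

        // Wait an hour before creating the next backup
        Thread.Sleep(TimeSpan.FromHours(1));
    }
}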

A Catch

Before you deploy your Worker Role you’ll have to make sure each of the listed assemblies is referenced and the property “Copy Local” is set to “True”, otherwise you will run into trouble.

  • Microsoft.Data.Tools.Schema.Sql
  • Microsoft.Data.Tools.Utilities
  • Microsoft.SqlServer.Dac
  • Microsoft.SqlServer.TransactSql
  • Microsoft.SqlServer.TransactSql.ScriptDom
  • Microsoft.SqlServer.Types

Techdays 2013 Netherlands

I’m invited to speak at Techdays 2013 in the Netherlands. The event is held in March at the World Forum in Den Haag.

My session, in Dutch, is called “Serious Request met Windows Azure” and I will be talking about my experiences using Windows Azure to build a highly available and scalable platform for the various applications used during “3FM Serious Request“, the annual charity event organized by the Dutch radio station 3FM for the Dutch Red Cross.

If the schedule doesn’t change, I will be speaking in the Africa Room on Friday.

Update

I’ve uploaded my slides for this session (PowerPoint PPTX, 6.83 MB).
You can also view my presentation on SlideShare (but the fonts get messed up):

Techdays 2013 NL – Serious Request met Windows Azure from Michaël Hompus


DotNed is the Dutch .NET user group and organizes several meetings each year for the .NET community. They also run a regular podcast covering several .NET development-related topics like Visual Studio, Mono, ASP.NET MVC, Windows Phone and Windows Azure.

Triggered by my scheduled TechDays 2013 presentation, I was interviewed a few weeks ago by Maurice de Beijer about my experiences using Windows Azure to build a highly available and scalable platform for the various applications used during “3FM Serious Request“.

You can listen to this 64-minute podcast on the dotNed website. I’m sorry for the international crowd, it was recorded in Dutch.


When writing .NET code to access a SQL database we often rely on the Entity Framework (EF). The EF makes it very easy to retrieve data from the database by generating a SQL query for us. But we shouldn’t trust it blindly, as the EF can also generate a bad query. It will return the correct data, yes, but at what performance cost?

I have a table with lots of data, and to keep the table as small as possible on disk and in memory I want to optimize the columns. This can be done by using varchar instead of nvarchar, date instead of datetime, and tinyint instead of int. Of course, you can only do this if your data allows it.

The problem

Changing a column in my table from int to tinyint gave me a storage and memory space win, but it also gave me a performance loss!

In my case the SQL column “HasSong” is a tinyint. EF translates this to a Byte structure in the .NET model.
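
In the generated model this looks something like the following simplified sketch (the actual EF-generated class contains more plumbing; Donation and HasSong match the table used in the queries below):

public partial class Donation
{
    public int DonationId { get; set; }

    // SQL tinyint is mapped to System.Byte
    public byte HasSong { get; set; }
}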

The following code:

var query = from d in dataEntities.Donations
            where d.HasSong == 1
            select d.DonationId;

produces this SQL Query:

SELECT
    [Extent1].[DonationId] AS [DonationId]
FROM
    [dbo].[Donations] AS [Extent1]
WHERE
    1 = CAST([Extent1].[HasSong] AS int)

When SQL Server encounters this CAST it will skip any indexes on the HasSong column, resulting in the query engine using non-optimized indexes or, even worse, full table scans.

So this explains my performance loss, but how do we convince EF not to cast my byte to an int?

The familiar “Contains”

Browsing the internet gave me my first idea: using the familiar Contains method I encountered in an earlier post about the EF: Joining an IQueryable with an IEnumerable.
I just have to add an array with a single value.

So trying this:

var tinyintComparison = new byte[] { 1 };

var query = from d in dataEntities.Donations
            where tinyintComparison.Contains(d.HasSong)
            select d.DonationId;

was not such a good idea, as it throws an ArgumentException:

DbExpressionBinding requires an input expression with a collection ResultType

It did work for other people using a SMALLINT, so I guess an array of bytes is a special case, reserved for working with binary data in your database.

The solution

Luckily we are not far from a solution; changing the array to a List<T>:

var tinyintComparison = new List<byte> { 1 };

var query = from d in dataEntities.Donations
            where tinyintComparison.Contains(d.HasSong)
            select d.DonationId;

results in the following query:

SELECT
    [Extent1].[DonationId] AS [DonationId]
FROM
    [dbo].[Donations] AS [Extent1]
WHERE
    1 = [Extent1].[HasSong]

There we are! No more casting. The SQL query engine can use the correct indexes again, and we still get the disk and memory space savings.

Conclusion

If you don’t want a CAST in your SQL query when using a tinyint, use the IList<T>.Contains method.


Recently I was challenged with the task to set the layout and content of a wiki page when a new page is added to a team site. As I’m used to working with SharePoint publishing, the task sounded easy, but I was wrong.

Text Layout

Image showing the Text Layout option in the SharePoint ribbon

My first path was to figure out where SharePoint puts the wiki “text layouts”. I discovered this isn’t how it works: the layouts available for wikis are not configurable anywhere.

But using some PowerShell it was easy to get and set the layout, as it’s just the HTML content of the “Wiki Content” column (internal name “WikiField”) of the list item.

$web = Get-SPWeb http://server/teamsite
$list = $web.Lists["Site Pages"]
$listItem = $list.Items[2] # My test page
$listItem["Wiki Content"] # Returns the HTML
$listItem["Wiki Content"] = $html # Assign new HTML, e.g. the markup shown below
$listItem.Update()

The HTML content consists of two parts: the layout table and the layout data.

<table id="layoutsTable" style="width: 100%">
    <tbody>
        <tr style="vertical-align: top">
            <td style="width: 100%">
                <div class="ms-rte-layoutszone-outer" style="width: 100%">
                    <div class="ms-rte-layoutszone-inner">
                    </div>
                </div>
            </td>
        </tr>
    </tbody>
</table>
<span id="layoutsData" style="display: none">false,false,1</span>

The layout data describes whether the header is visible, whether the footer is visible, and the number of columns. So false,false,1 means: no header, no footer, and a single column.

Event receiver

To set the content, the first thing on my mind was to add an event receiver on the ItemAdding event, associated with ListTemplateId 119 (WebPageLibrary).

I deployed the solution, added a page, and tadah: no content!

After using the debugger to verify my event receiver was triggered, I went to the next option: adding an ItemAdded event receiver. This time I got an exception that the page was already modified by another user. Refreshing the page gave me the default content. So this told me two things:

  1. It’s possible to set default content
  2. I forgot to set the Synchronization property (see the registration sketch below)
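
For reference, a sketch of how a receiver can be registered synchronously from code; WikiPageContentReceiver is a hypothetical receiver class, and the same can be configured declaratively in Elements.xml with the <Synchronization>Synchronous</Synchronization> element:

// Hypothetical registration, e.g. from a feature receiver ("web" is the team site's SPWeb)
SPList sitePages = web.Lists["Site Pages"];
SPEventReceiverDefinition receiver = sitePages.EventReceivers.Add();
receiver.Type = SPEventReceiverType.ItemUpdated;
receiver.Assembly = typeof(WikiPageContentReceiver).Assembly.FullName;
receiver.Class = typeof(WikiPageContentReceiver).FullName;
receiver.Synchronization = SPEventReceiverSynchronization.Synchronous;
receiver.Update();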

So I fixed this, deployed once again and got: no content!

As I used the SPListItem.Update method in my receiver, the version history showed the content was set, but the final version still ended up empty.

When faced with utter desperation, working with SharePoint always has one escape: launch Reflector.

There I found this gem of code in the “SubmitBtn_Click” method of the CreateWebPage class:

SPFile file = SPUtility.CreateNewWikiPage(wikiList, serverRelativeUrl);
SPListItem item = file.Item;
item["WikiField"] = "";
item.UpdateOverwriteVersion();

So no matter what I do in ItemAdding or ItemAdded, the content always ends up empty!

After this discovery the fix was to remove the code from the ItemAdding and ItemAdded events, move it to the ItemUpdated event (synchronous), and add a check whether the WikiField content is an empty string.

public override void ItemUpdated(SPItemEventProperties properties)
{
    base.ItemUpdated(properties);

    var listItem = properties.ListItem;

    if (!string.IsNullOrEmpty(listItem["WikiField"] as string))
    {
        return;
    }

    this.EventFiringEnabled = false;

    listItem["WikiField"] = html;
    listItem.UpdateOverwriteVersion();

    this.EventFiringEnabled = true;
}

Now every wiki page I add to the team site contains the correct text layout and the default HTML content.