
This week I’ll be attending the Microsoft Build 2016 conference in San Francisco.
Lots of news to be expected for developers covering the many technologies Microsoft is putting on the market.

Keynotes

Traditionally there are two keynotes at the Build conference. The first focuses on Microsoft Windows, the second on Microsoft Azure. I expect the same pattern this year.
Based on the news and rumors of the last couple of weeks, combined with the scheduled sessions, I expect the following topics to be covered during the keynotes.

  • Windows 10
    Obviously. Redstone is coming and Edge is getting extensions, so those are two obvious topics. I expect an overview of all the new features that are already part of the current fast ring previews.
  • Xamarin
    Microsoft recently acquired the company, a move I already expected two years ago. And looking around San Francisco, they’re running quite the marketing campaign. So naturally this needs to be in the keynote. I hope they’ll answer the question of how the licensing will be affected.
  • Xbox
    We have been expecting universal apps on the Xbox for several years, but this year should be it. A tweet by Scott Hanselman indicates that Phil Spencer will be part of this year’s conference.
  • Surface Hub
    I don’t expect a new device, only some demos. But as the devices are now finally available for purchase, they probably want to do some marketing around these costly beasts. A couple of sessions in the program also mention developing apps for this device.
  • Hololens
    The shipping of the first wave of devices coincides with the conference; that can’t be accidental. Lots of sessions cover different aspects of the HoloLens, so a new demo during the keynote can’t be far away. Hopefully the device can now measure the distance between the eyes automatically; last year this was a manual task.
  • Visual Studio vNext
    A bit of news has gotten around about the next version of Visual Studio and the improved installer experience. This would be the time to share that news officially.
  • .NET Core
    It’s about time .NET Core is officially released, as it’s been in preview for a long time. So I expect the 1.0 RTM version to be pushed to the world today.
  • Azure
    Azure is a big platform, and I expect a couple of new features will be released to the public. Maybe even some Azure Stack integration will be demoed.
  • Office Graph
    Already announced in preview last year, I expect a full release this year, and maybe new API features in preview.
My journey through the week

On my Twitter feed I’ll be posting all the sessions I’m attending. My focus will probably be on UWP and Azure related sessions.

Want to meet me, or do you have a question about the sessions I’ve attended? Just send me a message.

Watching sessions

Not in San Francisco and still want to be part of the action?
Channel9 will be covering a lot of content live, including the keynotes and interviews. All the sessions will also be available later.

Go to the Channel9 Build 2016 website.

/

MCSD Universal Windows Platform

Recently I got certified by Microsoft as a Solutions Developer for the Universal Windows Platform by taking two exams that are currently in beta. Because the exams are in beta, there is not much guidance to be found online. I noticed during the exams that I was being tested on skills not mentioned on the Microsoft Learning website.
In this post I’ll cover these differences and how I prepared for the exams so it’ll be easier for you to get certified.

Disclaimer

Microsoft is constantly changing the exams, so my experience can differ from yours. As both UWP exams were in beta, the exams I took might not represent the exams in the future.
Also, I won’t go into detail about the actual questions in the exam. This is prohibited by the non-disclosure agreement we all sign at the start of an exam.

MCSD: Universal Windows Platform

According to Microsoft, with this certification you:

Demonstrate your expertise at planning, designing, and implementing Universal Windows Platform apps that offer a compelling user experience, leverage other services and devices, and use best coding practices to enhance maintainability.

The certification consists of three exams: one that has been around for a couple of years, and the two beta exams I mentioned earlier.

Exam 70-483: Programming in C#

Microsoft Specialist Programming in C-Sharp

Passing this exam will give you the Microsoft Specialist certification.

This certification counts towards other MCSA and MCSD certifications:

  • Microsoft Certified Solutions Associate: SQL Server 2012
  • Microsoft Certified Solutions Developer: SharePoint Applications
  • Microsoft Certified Solutions Developer: Web Applications
  • Microsoft Certified Solutions Developer: Windows Store Apps Using C#

As I passed this exam back in December 2013, I can’t offer you any actual insights into additional measured skills. So I’ll only give you the link to the official exam page.

Exam 70-354: Universal Windows Platform – App Architecture and UX/UI

This exam validates a candidate’s knowledge and skills for planning the development of Universal Windows Platform apps and designing and implementing a compelling user experience.

This exam is quite broad, as it covers everything from designing the app to the application lifecycle management of your app.
The skills measured are listed on the exam website.

Additional skills that can be tested:

  • Choose between version control systems, for example Team Foundation Server, Visual Studio Team Services, and GitHub
  • Implement optimistic concurrency in your data layer
  • Enable beta testing of your app
  • Publish the app to the store

Exam 70-355: Universal Windows Platform – App Data, Services, and Coding Patterns

This exam validates a candidate’s knowledge and skills for implementing apps that leverage other services and devices and that use best coding practices to enhance maintainability.

This exam is more focused on the developer role. It covers everything related to writing code, though it is not limited to app development alone.
The skills measured are listed on the exam website.

Additional skills that can be tested:

  • Execute code reviews

Preparation

To prepare for the exams I used the free online training provided at the Microsoft Virtual Academy.
The courses I followed were:

Conclusion

What I found surprising was that a lot of questions were not about UWP app development itself but focusing on the surrounding challenges and technologies. Like:

  • Working in a team
  • Sharing and reviewing code
  • Using back-end services like Azure
  • Using and connecting with technologies not owned by Microsoft like GitHub, SQLite and MongoDB

So passing the exams will show that you are not only able to write an app, but also that you can do so working with a team, following an appropriate lifecycle, and utilizing external data sources.

Beta bonus: Charter member

If you do the exams while they are beta you will find a bonus notation in the certification title on your transcript:
MCSD Charter notation
This is explained at the end of the transcript.

*Charter- Certification that was achieved within six months following the retail release date of the certification. Charter Members are recognized by being given the Charter version of the certificate acknowledging their early adoption of the technology solution.

/

This is the second part in a series of posts about reducing the amount of data transferred between ASP.NET Web API or Azure Mobile App Service and the (mobile) client.
We will continue where we left off in the first post.

In the first post we managed a reduction of 41%.
Of course the reduction depends heavily on how often default values are part of your transferred data. But it’s an easy diet on transferred data that the other side can reconstruct on itself.
In this post we will squeeze a little bit more from our DTOs (Data Transfer Objects).

Part 2: Skip empty collections

First we look at the response we got from the second controller:

[
    {
        "SomeUri": "http://reducejsontraffic.azurewebsites.net/api/",
        "TheDate": "2015-11-05T17:30:02.3206122+00:00",
        "AFixedDate": "2015-07-02T13:14:00+00:00",
        "SomeEmptyObjects": [],
        "SomeObjects": [
            { "ADouble": 0 },
            {},
            { "ADouble": 1.23456789 }
        ]
    },
    ...
]

(formatted for readability)

We see that the property SomeEmptyObjects is present in the message, despite being an empty collection. So this is the next target for elimination.

Removing the empty collection

The reason the collection is present in the message is that a value of null is, of course, different from an empty array. But this post is about eliminating data on transport, so we start with removing empty collections from our messages.
I was not the first one with this question; on Stack Overflow I found an answer containing a custom contract resolver that deals with this situation.

using System;
using System.Collections;
using System.Reflection;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

public class SkipEmptyCollectionsContractResolver : DefaultContractResolver
{
  protected override JsonProperty CreateProperty(MemberInfo member,
                                            MemberSerialization memberSerialization)
  {
    var property = base.CreateProperty(member, memberSerialization);

    // Only handle properties that ignore default values and are
    // collections (strings also implement IEnumerable, so skip them).
    var isDefaultValueIgnored = ((property.DefaultValueHandling ??
                   DefaultValueHandling.Ignore) & DefaultValueHandling.Ignore) != 0;
    if (!isDefaultValueIgnored
                   || typeof(string).IsAssignableFrom(property.PropertyType)
                   || !typeof(IEnumerable).IsAssignableFrom(property.PropertyType))
    {
      return property;
    }

    // Serialize the property only when the collection actually has items.
    Predicate<object> newShouldSerialize = obj =>
    {
      var collection = property.ValueProvider.GetValue(obj) as ICollection;
      return collection == null || collection.Count != 0;
    };

    // Preserve any existing ShouldSerialize predicate.
    var oldShouldSerialize = property.ShouldSerialize;
    property.ShouldSerialize = oldShouldSerialize != null
                               ? o => oldShouldSerialize(o) && newShouldSerialize(o)
                               : newShouldSerialize;

    return property;
  }
}
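
To put the resolver to work it has to be registered with the Web API JSON formatter. A minimal sketch, assuming the standard WebApiConfig class from the ASP.NET Web API project template:

```csharp
using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Swap in the custom resolver so every JSON response
        // skips empty collections.
        config.Formatters.JsonFormatter.SerializerSettings.ContractResolver =
            new SkipEmptyCollectionsContractResolver();
    }
}
```

This applies the behavior globally; you could also set the resolver on a per-controller basis if only some responses should be trimmed.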

I created a third controller with the custom SkipEmptyCollectionsContractResolver. This controller demonstrates the new behavior.
A GET request to http://reducejsontraffic.azurewebsites.net/api/skipemptycollection returns:

[
    {
        "SomeUri": "http://reducejsontraffic.azurewebsites.net/api/",
        "TheDate": "2015-11-10T16:35:32.3507203+00:00",
        "AFixedDate": "2015-07-02T13:14:00+00:00",
        "SomeObjects": [
            {
                "ADouble": 0
            },
            {},
            {
                "ADouble": 1.23456789
            }
        ]
    },
    ...
]

(formatted for readability)
Removing the empty collection removes another 6% from our transferred data. Bringing the total reduction of this example to 47%.

As I wrote before: there is a difference between a value of null and an empty array. The trade-off is that by removing the information from the DTO the client doesn’t know if there was an empty collection or nothing at all. If this is fine for your code you’re done for now.

Reviving empty collections on the receiving side

In the previous blog post I used the DefaultAttribute to declare these defaults on simple types:

[DefaultValue(14)]
public int Fourteen { get; set; }

When the client uses Populate (or IgnoreAndPopulate) as the DefaultValueHandling, the property Fourteen will get the value 14 when it’s not present in the data.

This also works for the AStringArray property, which is an array of strings.

[DefaultValue(new string[] {})]
public string[] AStringArray { get; set; }

However, for the arrays that contain objects you can’t declare this as a default. When you try:

[DefaultValue(new SomeObject[] { })]
public SomeObject[] SomeEmptyObjects { get; set; }

This will give the following error: CS0182: An attribute argument must be a constant expression, typeof expression or array creation expression of an attribute parameter type.
We’ll have to find another way to declare the default value.

Initialize collections in the constructor

My own preferred way to initialize collections on a POCO is by setting the collection in the constructor.

public Message()
{
    SomeObjects = new SomeObject[0];
    SomeEmptyObjects = new SomeObject[0];
}

This makes sure we always have an empty collection, and if there is data in the message the serializer will create a filled collection.
There is a caveat though. When there is no data in the message and we have set the DefaultValueHandling to Populate, the empty collection is overwritten with a value of null. Therefore we need to override the DefaultValueHandling for collection properties.

[JsonProperty(DefaultValueHandling = DefaultValueHandling.Include)]
public SomeObject[] SomeEmptyObjects { get; set; }

To test this we launch the demo client app again.
When we look at the watch in the debugger we see again that all the properties that aren’t present in the transferred data are populated with either null or their correct default value.
Visual Studio Watch showing a deserialized object with default value handling

Conclusion

In this example we managed to get a total reduction of 47% and still have all the data available in the client.
Of course the reduction still depends heavily on how often default values and empty collections are part of your transferred data. But the diet continues.

Source code

You can download the updated Reduce Json Traffic sample project on GitHub.
You can also go to the specific commit to see the exact changes.

Filed under C#
/

Not long ago I wrote a blog post about Responsive Pivot Headers in Universal Windows Platform apps. Paul responded to this post asking how to change the background of the selected item, just like the example I posted on top of the post.
It’s a great question and I’m sorry I didn’t cover this part so the pivot looks more like the example image.
An omission I want to correct with this blog post.

The example

To recap the first post, here is the example again:
Example design from pivot guidelines by Microsoft

The solution

There are only three small changes needed to make the Pivot look more like the example.

  1. Add a dark background to the pivot header
  2. Change the color of the pivot header content to light so we can read it
  3. Add a lighter background to the selected pivot header

Adding the background

There is not really an element that sets the background of the whole pivot header, so we have to add one.
In the Pivot template we created in the first post, we add a Border to the Grid of the PivotPanel.
It gets the darker color and has to span the three columns.

<PivotPanel x:Name="Panel" VerticalAlignment="Stretch">
<Grid x:Name="PivotLayoutElement">
    <Grid.ColumnDefinitions>
        <ColumnDefinition Width="Auto"/>
        <ColumnDefinition Width="*"/>
        <ColumnDefinition Width="Auto"/>
    </Grid.ColumnDefinitions>
    <Grid.RowDefinitions>
       ...
    </Grid.RowDefinitions>
    <Grid.RenderTransform>
        ...
    </Grid.RenderTransform>
    <Border Background="#FF34323F" Grid.Row="0" Grid.Column="0" Grid.ColumnSpan="3" HorizontalAlignment="Stretch" VerticalAlignment="Stretch"/>
    <ContentPresenter x:Name="LeftHeaderPresenter" ... />

Lighten up the foreground

Then we go to the PivotHeaderItem Style we also created before.
Here we add a Setter to change the RequestedTheme to Dark.

<Pivot>
    <Pivot.Resources>
        <Style TargetType="PivotHeaderItem">
            ...
            <Setter Property="RequestedTheme" Value="Dark" />
            <Setter Property="Template">

Adding the highlighted background

Now we go to the Visual State named "Selected".
Here we change the Background color to the lighter color.

<VisualState x:Name="Selected">
    <Storyboard>
        <ObjectAnimationUsingKeyFrames Storyboard.TargetName="ContentPresenter" Storyboard.TargetProperty="Foreground" >
            <DiscreteObjectKeyFrame KeyTime="0" Value="{ThemeResource SystemControlHighlightAltBaseHighBrush}" />
        </ObjectAnimationUsingKeyFrames>
        <ObjectAnimationUsingKeyFrames Storyboard.TargetName="Grid" Storyboard.TargetProperty="Background" >
            <DiscreteObjectKeyFrame KeyTime="0" Value="#FF42424C" />
        </ObjectAnimationUsingKeyFrames>
    </Storyboard>
</VisualState>

And that's it!
It's not an exact replica, but you can change all the colors and sizes to accommodate your needs.

Styled Pivot Header with selected item

Source code

The source code of the sample project on GitHub has been updated to reflect these changes.
You can also go to the specific commit to see the exact changes.

/

These days JSON is used a lot. For storing data, for storing settings, for describing other JSON files and often for transporting information between server and client using DTO's (Data Transfer Objects).

When using an Azure Mobile App Service or ASP.NET Web API you will see that JSON is the default format to transport data. When running apps on PCs with a fixed internet connection, data size might not be a hot topic. But for apps on mobile devices, possibly using slow, limited, or expensive connections, you want to save on the amount of data that is transferred.

Recently I was monitoring the data transferred from one of my own Web API controllers to a mobile app. I discovered the amount of data transferred was way more than expected. This inspired me to try to reduce the size of the transferred data. In this and following blog posts I will describe the different options you can use and combine.

You can download the source code at the end of my article.

Part 1: Default Value Handling

I created an ASP.NET Web API controller to demonstrate the default behavior. The default controller returns two objects with data, some properties are empty.
A GET request to http://reducejsontraffic.azurewebsites.net/api/default returns:

[
    {
        "AString": null,
        "AnInt": 0,
        "Fourteen": 14,
        "ANullableByte": null,
        "AStringArray": null,
        "NoUri": null,
        "SomeUri": "http://reducejsontraffic.azurewebsites.net/api/",
        "TheDate": "2015-11-05T17:11:29.0809876+00:00",
        "AnEmptyDate": "0001-01-01T00:00:00",
        "AFixedDate": "2015-07-02T13:14:00+00:00",
        "SingleObject": null,
        "SomeEmptyObjects": [],
        "SomeObjects": [
            { "ADouble": 0 },
            { "ADouble": 3.14 },
            { "ADouble": 1.23456789 }
        ]
    },
    ...
]

(formatted for readability)

As you can see, every property is present, even when it doesn't have a real value.

In ASP.NET Web API Microsoft has chosen to use the JSON.net serializer. The serializer has a setting called DefaultValueHandling which is set to Include by default. To quote the documentation:

Include members where the member value is the same as the member's default value when serializing objects. Included members are written to JSON. Has no effect when deserializing.

And we can confirm this is the case when we look at the result from the first example.

If a property already gets the default value when deserializing, why would we want to transport that value anyway?

Changing the Default Value Handling

Another option for DefaultValueHandling is Ignore (and for the serializing part IgnoreAndPopulate acts the same). The documentation states:

Ignore members where the member value is the same as the member's default value when serializing objects so that it is not written to JSON. This option will ignore all default values (e.g. null for objects and nullable types; 0 for integers, decimals and floating point numbers; and false for booleans). The default value ignored can be changed by placing the DefaultValueAttribute on the property.

So when we set this option, properties with default values will be removed from the transferred data.
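
In ASP.NET Web API this is a one-line change on the JSON formatter. A minimal sketch, assuming the standard WebApiConfig class from the project template:

```csharp
using System.Web.Http;
using Newtonsoft.Json;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Don't write members that still hold their default value.
        config.Formatters.JsonFormatter.SerializerSettings.DefaultValueHandling =
            DefaultValueHandling.Ignore;
    }
}
```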

The documentation also mentions the DefaultValueAttribute, which lets us declare a different default for a property.

[DefaultValue(14)]
public int Fourteen { get; set; }

Now the property Fourteen will only get serialized when its value is not 14.

I created a second controller with the modified setting. This controller demonstrates this new behavior.
A GET request to http://reducejsontraffic.azurewebsites.net/api/defaultvaluehandling returns:

[
    {
        "SomeUri": "http://reducejsontraffic.azurewebsites.net/api/",
        "TheDate": "2015-11-05T17:30:02.3206122+00:00",
        "AFixedDate": "2015-07-02T13:14:00+00:00",
        "SomeEmptyObjects": [],
        "SomeObjects": [
            { "ADouble": 0 },
            {},
            { "ADouble": 1.23456789 }
        ]
    },
    ...
]

(formatted for readability)
That's quite a reduction! But are all values recovered after deserialization?
Yes, if we didn't use the DefaultValueAttribute anywhere in our DTO, it will work right away. Otherwise we need to tell the serializer explicitly that we want to populate the default values on deserialization, using the same DefaultValueHandling setting we used for serialization.
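
On the client side this comes down to passing the same setting to JSON.NET. A minimal sketch, where Message stands for the sample DTO type from this project (adjust to your own types):

```csharp
using Newtonsoft.Json;

public static class Client
{
    public static Message[] ReadMessages(string json)
    {
        var settings = new JsonSerializerSettings
        {
            // Populate writes the declared defaults back into
            // properties that are missing from the incoming JSON.
            DefaultValueHandling = DefaultValueHandling.Populate
        };

        return JsonConvert.DeserializeObject<Message[]>(json, settings);
    }
}
```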

I wrote a small console app as a client to show you everything is restored correctly.
When we look at the watch in the debugger we see all properties not present in the transferred data are populated with either null or their correct default value.
Visual Studio Watch showing a deserialized object with default value handling

Conclusion

In this example we managed a reduction of 41%.
Of course the reduction depends heavily on how often default values are part of your transferred data. But it's an easy diet on transferred data that the other side can reconstruct on itself.

Source code

You can download my Reduce Json Traffic sample project on GitHub.

Filed under C#