It's on! At The Movies is back!

For those who have attended an At The Movies (ATM) event in the past, I'm happy to announce that ATM is back on. You know it'll be awesome, so make sure you register before we run out of space in the Scotiabank movie theater.

For those who have not attended an At The Movies (ATM) event in the past, well, you have missed out on a lot of fun (and expert knowledge). So don't let that happen to you again, and register for the event. You will not be disappointed. Did I mention that this is a free event?

To remind you how awesome ATM events are, here are a few links to videos from past ATM events:


What can I say? Visual Studio, TFS, Azure, SharePoint, Windows 8, etc. Pure geeky fun. See you there on May 8th, 2014! :)

At the Movies is Back

May 8th is the next At the Movies event from ObjectSharp. Click the poster below to register. You don’t want to miss this. 


TFS 2013 Upgrade - SharePoint 2010 to SharePoint 2013 Upgrade Errors

OK, so I'm upgrading a TFS 2008 with SharePoint Foundation 2007 installation to TFS 2013 with SharePoint Foundation 2013. Naturally, you do the upgrade using an intermediary server with TFS 2012 and SharePoint 2010 installed, because you cannot upgrade TFS 2008 to TFS 2013 directly (you have to upgrade to TFS 2012 first), just as you cannot upgrade SharePoint 2007 to SharePoint 2013 (you have to upgrade to SharePoint 2010 first). All goes well except for the SharePoint portion of the upgrade.

Running Test-SPContentDatabase against the SharePoint content database produces a few weird errors, like:

Category        : MissingFeature
Error           : True
UpgradeBlocking : False
Message         : Database [WSS_Content] has reference(s) to a missing feature: Id = [00bfea71-c796-4402-9f2f-0eb9a6e71b18], Name = [Wiki Page Library], Description = [An interconnected set of easily editable web pages, which can contain text, images and web parts.], Install Location = [WebPageLibrary].
Remedy          : The feature with Id 00bfea71-c796-4402-9f2f-0eb9a6e71b18 is referenced in the database [WSS_Content], but is not installed on the current farm. The missing feature may cause upgrade to fail. Please install any solution which contains the feature and restart upgrade if necessary.

You would think that the Wiki Page Library feature should exist in the new SharePoint, and it does, but the error still shows up. Makes no sense, right? Well, if you go ahead and proceed with the upgrade anyway, you will see even stranger things. The upgrade completes successfully, but none of the SharePoint pages come up properly. None of the "default" SharePoint web parts render properly. To make things worse, you cannot get to any of the system pages in SharePoint. When you try to access a system page, like Site Settings, you get Access Denied errors (the strange part is that you are denied access to the built-in v4.master master page; how is that even possible?). Very strange…

After a bit of digging on the web, I found a solution. Apparently, the problem was caused by the fact that SharePoint 2013 has two modes (hives): 2010 (v14) and 2013 (v15). By default, a new SharePoint 2013 installation mostly installs only v15 features. Using the SharePoint Feature Administration tool, we can tell that the v14 features we needed were not installed. Now that we know that, we can either install the missing v14 features individually using the SharePoint 2013 Management Shell, or we can simply install all existing features in both the v14 and v15 hives by running the following cmdlet:

Install-SPFeature -AllExistingFeatures

The second approach was easier, so I ran with it. Running that cmdlet fixed all of my SharePoint problems, and that's a good thing.
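Put together, the diagnose-and-fix sequence looks something like this when run from the SharePoint 2013 Management Shell against a live farm (the web application URL is a placeholder; WSS_Content is the database name from the error above):

```powershell
# List only the MissingFeature errors reported for the content database
Test-SPContentDatabase -Name "WSS_Content" -WebApplication "http://sharepoint" |
    Where-Object { $_.Category -eq "MissingFeature" }

# Install every feature found on disk, across both the v14 and v15 hives
Install-SPFeature -AllExistingFeatures

# Re-run the test to confirm the MissingFeature errors are gone
Test-SPContentDatabase -Name "WSS_Content" -WebApplication "http://sharepoint"
```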

How to connect Release Management to Team Foundation Server

Release Management for Visual Studio 2013 (formerly known as the InRelease client) is tightly integrated with Team Foundation Server (TFS 2010, TFS 2012 and TFS 2013 are supported; Visual Studio Online is not supported yet). To connect the Release Management server to Team Foundation Server, you use the Release Management Client for Visual Studio 2013:

  • Launch the Release Management Client for Visual Studio 2013. If you are launching it for the first time, you will be prompted with the Configure Services dialog. Just enter the Release Management server name and port number, and click OK.
  • Click the Administration tab to connect the Release Management server to TFS. Then, click the Manage TFS section.
  • Click the New button to add a TFS connection. You can add connections to many project collections hosted on different TFS servers, or to many separate project collections hosted on the same TFS server.
  • Provide the following connection settings:
    • name or URL of the TFS server
    • name of the project collection
    • service account credentials used to connect to TFS
    • HTTP/HTTPS protocol used to connect to TFS
  • Click Verify to validate the settings provided.
  • Click Save and Close to save the connection to TFS.

That's all. Now you should be able to start using the Release Management server with TFS. Oh, almost forgot: to configure the connection to TFS, your account(s) must have the following minimal permissions:

  • Collection Level 
    • 'Make requests on behalf of others' permission (required to set up the TFS connection in the Release Management server)
    • 'View collection-level information' permission (to get list of Build Definitions on behalf of current user) 
    • 'View build resources' permission (to set a Build to Release) 
  • Team Project Level – for all projects used in release management 
    • 'View project-level information' permission (to add a TFS Group) 
  • Build Definition Level – for all build definitions used in release management server
    • 'Retain Indefinitely' permission (when starting a Release)

To keep things simple, you can make the service account used by the Release Management server to connect to TFS a member of the Project Collection Service Accounts group.

Release Management Build Template

If you’re working with the Release Management server (and you should be, because it's awesome) and cannot find the location of the Release Management build process template, try looking under

C:\Program Files (x86)\Microsoft Visual Studio 12.0\ReleaseManagement\bin

In some cases you might only see ReleaseDefaultTemplate.11.1.xaml (the TFS 2012 build template) or ReleaseDefaultTemplate.xaml (the TFS 2010 build template) under C:\Program Files (x86)\Microsoft Visual Studio 12.0\ReleaseManagement\bin. But where is the TFS 2013 build template, you might ask? Without one you cannot properly integrate Release Management with TFS 2013, which would be fairly disappointing. Luckily, you can download the TFS 2013 template here. Inside this .zip file you will find three files:

  • TfvcTemplate.12.xaml (for build definitions that use Team Foundation Version Control)
  • GitTemplate.12.xaml (for build definitions that use Git)
  • ReleaseTemplate.12.Snippets.xaml (for when you would like to add release management functionality to your own custom TFS 2013 build template; the snippets file only contains the relevant sections, with start/end markers to indicate which parts to copy)

Please note that the Release Management build process template is not installed in TFS by default, so it won't appear as an available build process template until you add it. To add it, you will need to check it in to your TFS version control, and then add the build process file when editing (or adding) a build definition. Once the release management template has been added to the list of build templates, you can start using it. More on how to use the release management build template later...
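As a sketch, checking the downloaded template in from a mapped workspace with tf.exe might look like this (the local paths and check-in comment are placeholders):

```
cd C:\workspaces\MyTeamProject\BuildProcessTemplates
copy C:\Downloads\TfvcTemplate.12.xaml .
tf add TfvcTemplate.12.xaml
tf checkin TfvcTemplate.12.xaml /comment:"Add Release Management build process template"
```

After the check-in, the file can be selected as the build process file when you edit the build definition.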

Using Deployment Metadata in TFS Release Management

TFS Release Management allows you to use deployment metadata as the value of configuration variables in release templates. This comes in very handy when you need to refer to the build drop location or the build number. The following is the list of available metadata that can be used with configuration variables:

  • Build Directory: $(PackageLocation)
  • Build Number: $(BuildNumber)
  • Build Definition: $(BuildDefinition)
  • TFS Collection: $(TfsUrlWithCollection)
  • Team Project: $(TeamProject)

Unfortunately, you cannot create custom deployment metadata just yet. Hopefully that will change one day. Another "catch" is that deployment metadata can only be used in components; it cannot be used in actions or tools, because those take place outside of the build process.
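For example, a component's configuration might use the metadata to point at the build drop and tag the target folder with the build number (the argument names and paths below are hypothetical, just to show the substitution):

```
Package path: $(PackageLocation)\WebSite
Arguments:    /source:"$(PackageLocation)\WebSite" /target:"D:\Apps\$(BuildNumber)"
```

At deployment time, Release Management replaces each variable with the value from the build that is being released.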

It’s starting

And there is no escape...


JsonConvert.Serialize Fails Silently

I ran into an interesting issue with JSON.NET over the weekend. Specifically, while I was serializing an object, it would fail silently. No exception was raised (or could even be trapped with a try-catch). However, the call to Serialize did not return and the application terminated.

The specific situation in which the call to Serialize was being made was the following:

List<Customer> _customers;

Task creationTask = new Task(() =>
{
    _customers = new List<Customer>();
    // Do stuff to build the list of customers
});
creationTask.ContinueWith(t =>
{
    // continuation calls serializeCustomer, which does the JSON serialization
});
creationTask.Start();


Now the actual call to JsonConvert.Serialize is found in the serializeCustomer method. Nothing special there, other than it being the method that actually fails. But the reason for the failure is rooted in the code snippet shown above.

This was the code as originally written. It was part of a WPF application that collected the parameters. And it worked just fine. However, the business requirements changed slightly and I had to change the WPF application to a console app where the parameters are taken from the command line. No problem. However, while there was a good reason to run the task in the background in a WPF application (so that the application doesn’t appear to be hung), that is not a requirement for a console app. And to minimize the code change as I moved from WPF to console, I changed a single line of code:


Now the call to JsonConvert.Serialize in the serializeCustomer method would fail. Catastrophically. And silently. Not really much of anything available to help with the debugging.

Based on the change, it appears that the problem is related to threading. Although it might not be immediately obvious, the ContinueWith method results in the creation of a second Task object. The process represented by this object is executed on a separate thread from the UI thread. So any issue related to cross-thread execution has the potential to cause a problem. I’m not sure, but I suspect that was the issue in this case. When I changed the code to be as follows, the problem went away.

List<Customer> _customers;

Task creationTask = new Task(() =>
{
    _customers = new List<Customer>();
    // Do stuff to build the list of customers
});


Now could I have eliminated the need for the Task object completely? Yes. And in retrospect, I probably should have. However if I had, I wouldn’t have had the material necessary to write this blog post. And the knowledge of how JsonConvert.Serialize operates when using Tasks was worthwhile to have, even if it was learned accidentally.
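For what it's worth, this failure mode can be made visible in a small console sketch (the names below are hypothetical, not the original code). Two habits help: wait on the task chain so Main cannot return while work is still in flight, and observe faults explicitly, since an exception thrown inside a Task is not raised on the calling thread unless the task is waited on or its result is observed.

```csharp
using System;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // Simulates background work whose failure would otherwise be silent
        Task work = Task.Run(() =>
        {
            // build the list of customers, then serialize it...
            throw new InvalidOperationException("serialization failed");
        });

        try
        {
            // Wait() keeps the process alive until the work completes and
            // re-throws any fault, wrapped in an AggregateException
            work.Wait();
        }
        catch (AggregateException ex)
        {
            Console.WriteLine(ex.InnerException.Message);
        }
    }
}
```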

Cloud Computing in 2014

As 2013 came to a close, I put the wraps on my latest book (Professional Visual Studio 2013). While I’m not quite *done* done, all that’s left is to review the galleys of the chapter as they come back from the editor. Work, yes. But not nearly as demanding as everything that has gone before.

As well, since I’ve now published four books in the last 25 months, I’m a little burned out on writing books. I’m sure that I’ll get involved in another project at some point in the future, but for at least the next 6 months, I’m saying ‘no’ to any offer that involves writing to a deadline.

Yet, the need to write still burns strongly in me. I really can’t *not* write. So what that means is that my blogging will inevitably increase. Be warned.

To start the new year, I thought I’d get into an area that I’m moderately familiar with: Cloud Computing. And for this particular blog, it being the start of the year and all, a prediction column seemed most appropriate. So here we go with 5 Trends in Cloud Computing for 2014

Using the Cloud to Innovate

One of the unintended consequences of the cloud is that it sits at the intersection of the three big current technology movements: mobile, social and big data.

  • Mobile is the biggest trend so far this century and is becoming as significant as the Internet itself did 20 years ago. The commoditization of the service is well underway and smartphones need to be considered in almost every technology project.
  • Social is not at the leading edge of mind share any more, and definitely not at the same level it was a few years ago. It is quickly becoming a given that social, in some form or another, needs to be a part of every new app.
  • Big Data is the newest of these three trends. Not that it hasn’t been around for a while. But the tools are now available for smaller companies to be able to more easily capture and analyze large volumes of data that previously would have simply been ignored.

What do these three trends have in common? They all use (or can use) the cloud as the delivery mechanism for their services. Most companies wouldn’t think of developing a turnkey big data environment. Instead, they would use a Hadoop instance running in Azure (or AWS, or pick your favorite hosting environment). And why build an infrastructure to support mobile apps until you really need to roll your own? Instead, use the REST-based API available through Windows Azure Mobile Services. It has become very easy to use cloud-available services as the jumping-off point for your innovation across all three of these dimensions. And by allowing innovators to focus more on their creations and less on the underlying infrastructure, the pace and quality of the innovations will only increase.

Hybrid-ization of the Cloud

Much as some might want to (and most don’t), you cannot move every piece of your infrastructure to the cloud. Inevitably, there is some piece of hardware that needs to be running locally in order to deliver value. But more importantly, why would you want to rip out and migrate functionality that already works if such a move provides little or no practical benefit? Instead, the focus of your IT group should be on delivering new value using cloud functionality, transitioning older functions to the cloud only on an as-needed basis.

What this does mean is that most companies are going to need to run a hybrid cloud environment. Some functions will stay on-premise. Others will move to the cloud. It will be up to IT to make this work seamlessly. There are already a number of features available through Azure AD to assist with authentication and authorization. But as you go through the various components of your network, there will be many opportunities to add to the hybrid portion of your infrastructure. And you should take them. The technology has gotten to the point that *most* issues related to creating a hybrid infrastructure have been addressed. Take advantage of this to make the most of the interplay between the two environments.

Transition from Capitalization to Expenses

For most people, the idea of using the cloud in their business environment is driven by the speed with which technology can be deployed. Instead of needing to wade through a budget approval process for a new blade server, followed by weeks of waiting for delivery, you can spin up the equivalent functionality in a matter of minutes.

But while that capability is indeed quite awesome, for business people it’s not really the big win. Instead, it’s the ability to move the cost associated with infrastructure from the balance sheet to the income statement. At the same time as this (generally) beneficial move, the need to over-purchase capacity is removed. Cloud computing allows you to add capacity on an as-needed basis. While it’s not quite like turning on a light switch, it’s definitely less onerous than the multi-week purchase/install/deploy cycle that is standard with physical hardware. One can question whether the cost of ‘renting’ space in the cloud is more or less expensive than the physical counterpart, but the difference in how the costs are accounted for makes more of a difference than you think.

So how does this impact you in 2014? More and more, you will need to be aware of the costing models that are being used by your cloud computing provider. While the costs have not yet become as complicated as, say, the labyrinth of Microsoft software licensing, they are getting close. Keep a close eye on how the various providers are charging you and what you are paying for, so that as you move to a cloud environment, you can make the most appropriate choices.

Network Amplification

In order to be successful, your application needs to leverage connections between a wide variety of participants: users, partners, suppliers, employees. This is the ‘network’ for your organization. And, by extension, the applications that are used within your organization.

If you want to maximize the interconnectedness of this network, as well as allow the participants to take full advantage of your application, you need to provide two fundamental functions: a robust and usable API, and the ability to scale that API as needed.

In most cases, a REST-based API is the way to go. And over the coming 12 months you will see an increased awareness of what makes a REST API ‘good’. This is not nearly as simple as it sounds. Or, possibly, as it should be. While some functionality is easy to design and implement, other functionality is not. And knowing the difference between the two comes either from trial and error, or from finding someone who has already been through the process.

As for scalability, a properly designed API combined with cloud deployment can come close to giving you that for free. But note the critical condition ‘properly designed’. When it comes to API functionality, it is almost entirely about the up-front design. So spend the necessary effort to make sure that it works as you need it to. Or, more importantly, as the clients of your API need it to.

Predictive Technology

For the longest time, real-time was the goal. Wouldn’t it be nice to see what the user is doing on your Web site at the moment they are doing it? Well, that time is now in the past. If you’re trying to stay ahead of the curve, you need to look ahead to the user’s next actions.

This is not the same as Big Data, although Big Data helps. It’s the ability to take the information (not just the data) extracted from Big Data and use it to modify your business processes. That could range from something as simple as changing the data that appears on the screen to modifying the workflow in your production line. But you’ll start to see tools aimed at helping you understand and take advantage of ‘future’ knowledge arrive shortly.

So there you are. Five trends that are going to define cloud computing over the next 12 months, ranging from well on the way to slightly more speculative. But all of them are (or should be) applicable to your company. And the future of how you create and deploy applications.

Toronto ALM User Group

I really need to stop ignoring my blog. I have lots of stuff to post; I just keep forgetting to do it. Life gets so busy. Well, it’s a new year and I am going to try to post something at least every two weeks. I want to say every week, but I can’t see that happening. :)


Since I run the Toronto ALM user group I should at least let people know what is coming up.

In January we had the last Canadian speaking appearance of Colin Bowern, who gave a great presentation sharing his thoughts on this topic: "As with many things in software engineering, there is rarely an answer that is always right, all the time – except locking your workstation when you walk away from your desk; no excuses there. In the ALM space we have heavily integrated stacks like Microsoft TFS, Rational Team Concert, CollabNet TeamForge and Atlassian’s toolset, but we also have standalone tools that are focused on being the best at one thing alone. In this session we’ll walk through a particular stack of tools that can be used in .NET shops that have investments in other platforms, such as PSAKE, TeamCity, xUnit, SpecFlow, Node and PowerShell. But bigger than this toolset, we will compare and contrast the integrated and best-in-class approaches to make sure we understand the tool, the myth and the legend behind each. Bring your experiences and let’s have a rich discussion that will broaden our horizons on what is possible to help teams reduce friction and ship value faster."

Thanks again, Colin; your presentation was informative and very well received by the group.

In February we have Max Yermakhanov showing us the new release management solution that comes with TFS 2013. This is Microsoft's newest acquisition, from Canada’s own InCycle. Are you looking for a way to track your release process and start automating your deployments for repeatable success? Do you want automation that is the same across development, test, and even production environments? If so, come by and learn about the release management tooling in TFS 2013.

Hope to see you at the meeting in February.