Separate TFS Build Server Output into Different Folders

TFS 2013 makes it easier to separate build server output into different folders. In most cases, all you have to do is change the Output Location setting in the build definition. The Output Location setting can be set to the following values:

  • SingleFolder to place all the build output files together in the drop folder.
  • PerProject to group the build outputs into drop folder sub-folders for each solution or code project that you have specified in the Projects box.
  • AsConfigured to leave the binaries in the build agent sources folder, organized into the same sub-folder structure you see when you build your code on your dev machine in Visual Studio. This structure is defined in your code projects. If you use this option, TFBuild will not copy the output to the drop folder. Instead, you can program your scripts to copy the outputs to the location specified by TF_BUILD_BINARIESDIRECTORY so that they get dropped to the staging location. See post-build or post-test scripts.

This is a great feature/setting in TFS 2013 and it works every time. Well, almost every time. The PerProject setting is a bit misleading because it does not always group the build outputs per project. When you choose to build multiple solutions, each consisting of multiple projects, TFS Build server will split the output into a separate folder per solution. Project output will not go into separate folders; instead, the output for all projects in a solution will be stored in a single folder, even though your build definition's Output Location is set to PerProject. Frustrating.

To separate project output into different folders when building multiple solutions, you need to set the GenerateProjectSpecificOutputFolder MSBuild property to True. To be more precise, set the MSBuild arguments setting to /p:GenerateProjectSpecificOutputFolder=True and projects will automatically build into different subfolders. Voila!
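To make that concrete, here is a minimal sketch of what goes into the MSBuild arguments box of the build definition (the solution name and drop path in the comment are made up for illustration; the property itself is the real fix):

```shell
# The only argument that matters for per-project output folders:
MSBUILD_ARGS="/p:GenerateProjectSpecificOutputFolder=True"

# To try it locally before touching the build definition (solution name and
# drop path are hypothetical):
#   msbuild MySolution.sln /p:OutDir=C:\drop\ $MSBUILD_ARGS
echo "$MSBUILD_ARGS"
```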

Visual Studio Test Manager 2013 does not record actions

We have noticed that Visual Studio Test Manager 2013 behaves differently on some machines. More precisely, it does not record actions on some of the machines, while it works perfectly fine on others running the same test plan under the same test settings. All machines were running Windows 7 with the same version of Visual Studio installed. Users tried rebooting the machine, clearing the cache, you know, the usual stuff. Nothing helped. This is where I stepped in.

After a bit of digging, I noticed that Visual Studio Test Manager 2013 was running fine on 32-bit installs of Windows 7. It was also running fine when users were running the 32-bit version of Internet Explorer instead of the 64-bit version. Then I remembered that Visual Studio Test Manager 2013 does not support the 64-bit version of Internet Explorer for recording and playback (see for more info.) So, as long as users use the 32-bit version of Internet Explorer (C:\Program Files (x86)\Internet Explorer\iexplore.exe) instead of the 64-bit version (C:\Program Files\Internet Explorer\iexplore.exe), everything works as it should. Problem solved. Sort of.

Creating new websites using TFS Release Management

When you use TFS Release Management server to deploy your builds to various environments (and you really should, because it's awesome), you might have tried to get it to do all kinds of stuff: automate Windows operations, IIS actions, registry modifications, service manipulations, and so on and so forth. Beautiful, right? Yes, I like it too. Release Management is a great piece of software that will only get better. And, in an effort to make this software better, I wanted to share a small bug I have discovered in it.

When you use the Create Website action to create a new website in IIS, Release Management seems to create the website just fine, but if you look closer you might notice a bug: it misconfigures the physical path assigned to the website. That is, unless you set the IsAutoStart option to true and leave the IsPreloadEnabled option blank (unless, of course, you use IIS 8 or better.) The reason this happens is that when the Release Management action creates a website, it uses the iisconfig.exe command and, for some reason, it messes up the syntax unless you set those two options. So, for now, just set the IsAutoStart option to true and leave the IsPreloadEnabled option blank. And, hopefully, Microsoft will fix that bug in a future release…

Making Template Changes to the Projects Migrated from Visual Studio Online

So, you have migrated TFS projects from Visual Studio Online and now you want to make certain improvements to the newly migrated projects. Well, you're out of luck. When you try to customize work item types of existing TFS projects migrated from the cloud, you will receive an error, even though you have all the necessary permissions. So, what exactly is the problem here? Apparently, something in the migration process causes certain metadata to go missing from the security tables when TFS contents are moved from Visual Studio Online to on-premises servers. As a result, TFS simply does not know that you're allowed to modify the project template.

To solve the problem, you will have to call Microsoft technical support. They will send you the fix. Hopefully, they will integrate that fix into the migration process, so this kind of thing does not happen. Hopefully…

Putting the Developer in DevOps

The term DevOps has been getting a lot of play lately. While it’s possible (albeit unlikely) that DevOps is a passing fad, my personal opinion is that it’s the next logical step in the maturation of the software development process. Which means that, as a developer, it behooves you to become aware of the tasks that make up DevOps and the options which are available to help you accomplish them.

In case it wasn't immediately apparent, DevOps is a portmanteau of the words "Developer" and "Operations". The overarching idea is that consideration is given to the needs of the system administrators during the design and development phases of a project. In some cases, this might mean that the administrators themselves work alongside the developers. But at a minimum, the developer must understand the needs of the system administrator after the application goes live and bake the appropriate hooks and instrumentation into the code before it goes live.

This approach is different from the ‘traditional’ approach. For many (most??), the Dev side of the process involves the creation of the software. Ops, on the other hand, are viewed as simply the people who have to deal with the artifact of the developer’s creative genius once it’s in the hands of real people. These two groups have been treated, by management and users alike, as separate entities. Or is that enmities?

However, in the real world, this division is not optimal. It is well documented that the majority of an application's life will be spent in production. Doesn't it make sense to include functionality in the application that helps to ensure that this life will be productive, helpful, and informative? Of course it does. But being obvious doesn't mean that it has come to pass. Yet.

But by taking the more holistic view that operational functionality is actually just one more aspect of an application's functionality, it makes sense to address delivery and instrumentation right from the start. This results in a more robust product and, ultimately, provides more value to the users.

At ObjectSharp's At the Movies event in May, I had the opportunity to demo a new component that will provide some of the functionality necessary to integrate your application with operations. Application Insights is an SDK that is available across most Microsoft platforms, including Web (MVC and Forms), Windows Phone, and Windows Store. Its primary requirement is a live Internet connection, which is necessary to send the instrumentation data to a central repository. You (that is, the Dev of DevOps) can submit custom, hierarchical categories, along with corresponding metrics, to the repository. And the repository can be visualized using the portal that is available at Visual Studio Online.

As you might expect, there are more features available to help with DevOps, including server performance tracking, availability monitoring for your Web site (either by hitting a page or by running a simple script) and even a live data stream of events while you are debugging your application. And Application Insights is a locus of regular innovation, with new versions of the SDK being released on a regular cadence. Almost too regular, if you catch my meaning.

You can learn more about Application Insights at the Visual Studio Online site and if you would like to see my demo (or any of the other sessions) from At the Movies, you can find it on Channel 9.

More on fixing reports after Visual Studio Online export

In my previous post on fixing reports after exporting a Visual Studio Online project collection to an on-premises server, I forgot to mention that we fixed the reports by running SQL queries created by a local SQL ninja who wished to stay anonymous (like all ninjas do.) He created the SQL scripts before Microsoft provided the update script that I included in my previous post and, frankly, his scripts were more comprehensive (no offense to Microsoft support, as they have been very helpful too.) Anyway, here is the customer-supplied script that helped us resolve the warehouse errors and fix reports after exporting the Visual Studio Online project collection:

-- Latest revisions whose [Changed Order] disagrees with WorkItemsAre
select l.Id, a.[Changed Order], l.[Changed Order]
from WorkItemsLatest l
inner join WorkItemsAre a on l.Id = a.Id and a.[Changed Order] <> l.[Changed Order]

-- ...and bring WorkItemsLatest in line with WorkItemsAre
update l set [Changed Order] = a.[Changed Order]
from WorkItemsLatest l
inner join WorkItemsAre a on l.Id = a.Id and a.[Changed Order] <> l.[Changed Order]

-- Historical rows whose [Changed Order] is later than their latest revision's
select, w.[Changed Order], l.[Changed Order]
from WorkItemsWere w
inner join WorkItemsLatest l on = l.Id and w.[Changed Order] > l.[Changed Order]
inner join (select Id, max(Rev) as MaxRev from WorkItemsWere group by Id) x
  on = and w.Rev = x.MaxRev

-- ...and move them to one tick below the latest revision's [Changed Order]
update w set [Changed Order] = cast(cast(cast(l.[Changed Order] as binary(8)) as bigint) - 1 as binary(8))
from WorkItemsWere w
inner join WorkItemsLatest l on = l.Id and w.[Changed Order] > l.[Changed Order]
inner join (select Id, max(Rev) as MaxRev from WorkItemsWere group by Id) x
  on = and w.Rev = x.MaxRev

-- Historical rows stamped past the current database timestamp
select Id, Rev, [Changed Order] from WorkItemsWere where [Changed Order] > @@dbts

-- ...and move each one to one tick below its successor revision's [Changed Order]
update ww set [Changed Order] = cast(cast(cast(z.[Changed Order] as binary(8)) as bigint) - 1 as binary(8))
from (select, w.[Changed Order], x.Rev
      from WorkItemsWere w
      inner join (select Id, Rev, [Changed Order] from WorkItemsWere where [Changed Order] > @@dbts) x
        on = and x.Rev = w.Rev - 1) z
inner join WorkItemsWere ww on = and z.Rev = ww.Rev

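As an aside, the cast chains in the update statements above are doing something simple: reinterpret the 8-byte [Changed Order] rowversion as a 64-bit integer, subtract one, and convert it back, producing a value one tick below the reference row. A rough shell illustration (the sample value is made up):

```shell
# A made-up 8-byte rowversion value, written as a 64-bit hex integer.
TS=0x00000000000007D3

# Equivalent of cast(cast(cast([Changed Order] as binary(8)) as bigint) - 1 as binary(8)):
NEW=$((TS - 1))

printf '0x%016X\n' "$NEW"   # → 0x00000000000007D2, one tick below the original
```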


SharePoint keeps prompting for credentials

One of our clients had an unusual problem accessing SharePoint. When SharePoint was accessed using Internet Explorer, the user kept getting prompted for credentials. At the same time, the user was able to access SharePoint using Firefox or Chrome without a problem. Weird, right? Your credentials should let you in (or not) no matter which browser you're using. Google/Bing searching did not help…

The solution ended up having nothing to do with SharePoint. In the end, the problem was caused by the simple fact that the Active Directory user had the "User Must Change Password at Next Logon" option turned on, and only Internet Explorer picked up on that. Once we removed "User Must Change Password at Next Logon", everything went back to normal. Another problem solved.

I still wonder, though, why Firefox and Chrome ignored the fact that the user needed to change his/her password while Internet Explorer was the only browser that picked up on it. Any thoughts?

Fixing reports in exported Visual Studio Online project collection

As Visual Studio Online has gone to general availability (i.e. no more free rides), a lot of people have used the TFSCONFIG CLOUDIMPORT command to export their TFS projects from Visual Studio Online to an on-premises TFS server. That tool makes the export a very smooth and actually easy process. But that's not what we're here to talk about. We're here to talk about fixing reports once you have finished the export. The CloudImport command only imports TFS content and does not deal with reports at all, so the reports will have to be fixed manually.

First of all, you need to recreate the reports in the Report Server. By the way, you need to upload the reports from the TFS Scrum template. You can recreate reports using the TFS Administrator's Toolkit or the all-powerful PowerShell. Once the reports are uploaded and the warehouse/cube has been processed, you will notice that some of the reports do not work. If you check the warehouse processing status, you will see that an error is thrown during warehouse processing:

System.Data.SqlClient.SqlException: Cannot create compensating record. Missing historic data. Predecessor of work item(s)…

A very confusing error… It points to some kind of confusion between revisions. How can that be? How can the data get "corrupted" during the export? And, more importantly, what data got "corrupted"? To answer the last question, try running the following SQL statement against your project collection database:

Select *
FROM [dbo].[WorkItemsLatest] L
JOIN [dbo].[WorkItemsAre] A
ON L.[ID] = A.[ID]
    AND L.[Rev] = A.[Rev]
WHERE L.[Changed Order] <> A.[Changed Order]


If the query returns any records, then we have a problem. Most likely all of the returned records have messed-up revisions. Potentially, that could be a lot of records. To fix the problem, Microsoft provided us with the following SQL query:


UPDATE L
SET L.[Changed Order] = A.[Changed Order]
FROM [dbo].[WorkItemsLatest] L
JOIN [dbo].[WorkItemsAre] A
ON L.[ID] = A.[ID]
    AND L.[Rev] = A.[Rev]
WHERE L.[Changed Order] <> A.[Changed Order]



After running the script, manually refresh the warehouse and cube, and you should be fine. Good luck.
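For reference, the way I usually trigger a manual refresh is through the TFS warehouse control web service on the application tier. A hedged sketch (the server name is a placeholder, and the service path is the standard TFS location as I recall it; double-check it on your own application tier):

```shell
# Placeholder application-tier host; substitute your own.
TFS="http://tfsserver:8080/tfs"

# Warehouse control web service (path as I recall it for TFS 2010-2013; verify locally).
SVC="$TFS/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx"

# Open $SVC in a browser on the app tier, invoke ProcessWarehouse and then
# ProcessAnalysisDatabase; GetProcessingStatus reports progress.
echo "$SVC"
```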

UPDATE: See a few more details on how we've fixed the reports after exporting TFS project collection from Visual Studio Online at

Release Management for Visual Studio custom training and more…

Over the next month, I'm planning to write a custom one-day course on Release Management for Visual Studio. Here is a draft course outline:

Module 1: Introduction to Release Management for Visual Studio

  • An overview of the features of Release Management
  • How to reduce the risks and costs associated with releasing software
  • A look at the Build-Deploy-Test-Release process

Module 2: Installing and configuring Release Management for Visual Studio

  • Installing and configuring the Release Management Server and Client
  • Installing Deployment Agents
  • Configuring roles and permissions in Release Management

Module 3: Configuring Release Paths and Release Templates

  • Defining servers and environments
  • Configuring release path stages
  • Understanding release templates
  • Setting up a deployment procedure for each stage
  • Configuring a build definition to use the Release Management process template

Module 4: Release Management Execution

  • Use of Tokens
    • Token descriptions
    • Token replacement
  • Execution of a Full Release Cycle
    • Trigger manual release
    • Approvals
    • The deployment log
    • Audit and history
  • How to Approve a Request
    • Release Management console
    • Notifications
    • Release Management Web client
    • Deferred deployment
  • How to Release from TFS
    • Configuration of a build definition to use the Release Management Default Template
    • Configuration of an Application Version to allow triggering a Release from the Build
  • Debugging a Deployment
    • Internal logic of a deployment
    • Different logs available


The course will be available for public and private training at the end of July 2014. After writing this course, I'm planning to author a more comprehensive three-day DevOps course where we will cover TFS Build server, Release Management, and Application Insights from Microsoft Azure. The three-day course should be available around September 2014.

If you're interested in these courses, please contact ObjectSharp training at 416-649-3690.



Error upgrading TFS 2010 to TFS 2013

I was helping another client upgrade TFS 2010 to TFS 2013, as I have done many times before, and I thought I had seen it all. All possible errors, warnings… all the weird stuff. I was wrong. There is always something. Something that can go wrong. Something new to learn. Anyway, back to the upgrade. As I was saying, I was upgrading TFS 2010 to TFS 2013. It began as a fairly straightforward upgrade. You build a new TFS 2013 environment, detach the TFS 2010 project collection (if you're moving from SQL Enterprise to Standard, back up the database and remove encryption/compression), back up the database, restore the database, and finally attach the TFS 2010 project collection to the new TFS 2013 environment. Easy, right? Well, I forgot to mention fixing reports, SharePoint sites, build definitions, etc. But, still, a very straightforward process. Except, this time.

This time, when I was attaching the TFS 2010 project collection to the newly built and awesome TFS 2013 environment, I got an error: "TF400744: An error occurred while executing the following script: WorkItemTrackingToDev11Beta2Custom.sql. Failed batch starts on the line 1. Statement line: 35. Script line: 35. Error: 4922 ALTER TABLE ALTER COLUMN SeqId failed because one or more objects access this column." I had never seen this error before. There wasn't much information on it online either. So, I tried re-running the project collection attach process again to make sure that we had not missed anything. Got the same error again. Disappointing.

After some digging, a lot of digging actually, I still did not find anything. I knew that the problem must lie within the database. The database must have gotten corrupted somehow. But I couldn't put my finger on it. So, I called Microsoft support. We dug some more. Then, some more… And, finally, we found the problem. Apparently, the project collection database had been manually modified. An index had been created on one of the columns (the SeqID column), and that was preventing the upgrade process from dropping the column. That was it. That was the reason we lost hours trying to solve the mystery of the failing upgrade… So, the lesson is: DO NOT make manual changes to any TFS databases. It might seem like a good and perfectly harmless idea at the time, but it's not. It never is.