TFS 2013 makes it easier to separate build server output into different folders. In most cases, all you have to do is change the Output Location setting in the build definition. The Output Location setting can be set to the following values:
- SingleFolder to place all the build output files together in the drop folder.
- PerProject to group the build outputs into drop folder sub-folders for each solution or code project that you have specified in the Projects box.
- AsConfigured to leave the binaries in the build agent sources folder, organized into the same sub-folder structure you see when you build your code on your dev machine in Visual Studio. This structure is defined in your code projects.
  If you use this option, TFBuild will not copy the output to the drop folder. Instead, you can program your scripts to copy the outputs to the location specified by TF_BUILD_BINARIESDIRECTORY so that they get dropped to the staging location. See post-build or post-test scripts.
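For example, with AsConfigured you could wire up a post-build script that does the copying yourself. Here is a minimal POSIX-shell sketch of the idea (the paths and project layout are illustrative; on a real Windows build agent you would typically do this in PowerShell or batch, but the logic is the same):

```shell
#!/bin/sh
# TFBuild sets TF_BUILD_BINARIESDIRECTORY and TF_BUILD_SOURCESDIRECTORY on the
# agent; fall back to local defaults so the sketch can be tried outside a build.
BINARIES_DIR="${TF_BUILD_BINARIESDIRECTORY:-/tmp/tfbuild-binaries}"
SOURCES_DIR="${TF_BUILD_SOURCESDIRECTORY:-.}"

mkdir -p "$BINARIES_DIR"

# Copy each project's bin output into its own sub-folder under the binaries
# directory, so the drop keeps a per-project layout.
find "$SOURCES_DIR" -type d -name bin | while read -r bindir; do
  project=$(basename "$(dirname "$bindir")")
  mkdir -p "$BINARIES_DIR/$project"
  cp -R "$bindir/." "$BINARIES_DIR/$project/"
done
```

Whatever lands under TF_BUILD_BINARIESDIRECTORY is then picked up by TFBuild and dropped to the staging location.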
This is a great feature/setting in TFS 2013 and it works every time. Well, almost every time. The PerProject setting is a bit misleading because it does not always group the build outputs per project. When you choose to build multiple solutions, each consisting of multiple projects, TFS Build server will split the output into a separate folder per solution, but project outputs will not go into separate folders; instead, the output for all projects in a solution will be stored in a single folder, even though your build definition's Output Location is set to PerProject. Frustrating.
To separate project output into different folders when building multiple solutions, you need to set the GenerateProjectSpecificOutputFolder MSBuild property to True. To be more precise, set the MSBuild arguments setting in your build definition to /p:GenerateProjectSpecificOutputFolder=True, and projects will automatically build into separate sub-folders. Voila!
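In other words, with that argument in place the build essentially invokes MSBuild like this (the solution name here is illustrative):

```shell
msbuild BigSolution.sln /p:GenerateProjectSpecificOutputFolder=True
```

With the property set, the common build targets place each project's output in its own sub-folder under the output directory instead of flattening everything together.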
We have noticed that Visual Studio Test Manager 2013 behaves differently on some machines. More precisely, it does not record actions on some machines, while it works perfectly fine on others running the same test plan under the same test settings. All machines were running Windows 7 with the same version of Visual Studio installed. Users tried rebooting the machine, clearing the cache, you know, the usual stuff. Nothing helped. This is where I stepped in.
After a bit of digging, I noticed that Visual Studio Test Manager 2013 was running fine on 32-bit installs of Windows 7. It was also running fine when users were running the 32-bit version of Internet Explorer instead of the 64-bit version. Then I remembered that Visual Studio Test Manager 2013 does not support the 64-bit version of Internet Explorer for recording and playback (see http://msdn.microsoft.com/en-ca/library/dd380742.aspx for more info.) So, as long as users are using the 32-bit version of Internet Explorer (C:\Program Files (x86)\Internet Explorer\iexplore.exe) instead of the 64-bit version (C:\Program Files\Internet Explorer\iexplore.exe), everything works as it should. Problem solved. Sort of.
When you use TFS Release Management server to deploy your builds to various environments (and you really should, because it's awesome), you might have tried to get it to do all kinds of stuff: automate Windows operations, IIS actions, registry modifications, service manipulations, and so on and so forth. Beautiful, right? Yes, I like it too. Release Management is a great piece of software that will only get better. And, in an effort to make this software better, I wanted to share a small bug I have discovered in it.
When you use the Create Website action to create a new website in IIS, Release Management seems to create the website just fine, but if you look closer you might notice a bug: the physical path assigned to the website gets misconfigured. The workaround is to set the IsAutoStart option to true and leave the IsPreloadEnabled option blank (unless, of course, you use IIS 8 or better.) The reason this happens is that when Release Management creates a website it uses the iisconfig.exe command, and for some reason it messes up the syntax unless you set those two options. So, for now, just set IsAutoStart to true and leave IsPreloadEnabled blank. And, hopefully, Microsoft will fix this bug in a future release…
So, you have migrated TFS projects from Visual Studio Online and now you want to make certain improvements to the newly migrated projects. Well, you're out of luck. When you try to customize the work item types of existing TFS projects migrated from the cloud, you will receive an error, even though you have all the necessary permissions. So, what exactly is the problem here? Apparently, something in the migration process causes certain metadata to go missing from the security tables when TFS contents are moved from Visual Studio Online to the on-premises servers. As a result, TFS simply does not know that you're allowed to modify the project template.
To solve the problem, you will have to call Microsoft technical support. They will send you the fix. Hopefully, they will integrate that fix into the migration process, so this kind of thing does not happen again. Hopefully…
In my previous post on fixing reports after exporting a Visual Studio Online project collection to an on-premises server, I forgot to mention that we fixed the reports by running SQL queries created by a local SQL ninja who wished to stay anonymous (like all ninjas do.) He created the SQL scripts before Microsoft provided the update script that I included in my previous post and, frankly, his scripts were more comprehensive (no offense to Microsoft support, as they have been very helpful too.) Anyways, here is the customer-supplied script that helped us resolve the warehouse errors and fix the reports after exporting a Visual Studio Online project collection:
-- Check for mismatched [Changed Order] values between the latest and "are" tables
select l.Id, A.[Changed Order], l.[Changed Order]
from WorkItemsLatest L
inner join WorkItemsAre A
  on l.Id = A.Id and a.[Changed Order] <> l.[Changed Order]

-- Sync WorkItemsLatest with WorkItemsAre
update WorkItemsLatest
set [Changed Order] = a.[Changed Order]
from WorkItemsLatest L
inner join WorkItemsAre A
  on l.Id = A.Id and a.[Changed Order] <> l.[Changed Order]

-- Find historic revisions whose [Changed Order] is newer than the latest revision's
select w.id, w.[Changed Order], l.[Changed Order]
from workitemswere w
inner join workitemslatest l
  on w.id = l.id and w.[Changed Order] > l.[Changed Order]
inner join (select id, max(rev) as maxrev from workitemswere group by id) x
  on w.id = x.id and w.rev = x.maxrev

-- Fix them by setting [Changed Order] to one less than the latest revision's value
-- (the binary(8) value is cast to bigint, decremented, then cast back)
update workitemswere
set [Changed Order] = cast((cast((cast(l.[Changed Order] as binary(8))) as bigint) - 1) as binary(8))
from workitemswere w
inner join workitemslatest l
  on w.id = l.id and w.[Changed Order] > l.[Changed Order]
inner join (select id, max(rev) as maxrev from workitemswere group by id) x
  on w.id = x.id and w.rev = x.maxrev

-- Find historic revisions whose [Changed Order] is beyond the current database timestamp
select id, rev, [Changed Order] from WorkItemsWere where [Changed Order] > @@dbts

-- Fix them by setting [Changed Order] to one less than the following revision's value
update WorkItemsWere
set [Changed Order] = cast((cast((cast(z.[Changed Order] as binary(8))) as bigint) - 1) as binary(8))
from (select w.id, w.[Changed Order], x.rev
      from WorkItemsWere w
      inner join (select id, rev, [Changed Order] from WorkItemsWere where [Changed Order] > @@dbts) x
        on w.id = x.id and x.rev = w.rev - 1) z
inner join workitemswere ww
  on z.id = ww.id and z.rev = ww.rev
Again, while running the script may have helped us, please RUN THE SCRIPT AT YOUR OWN RISK! AND, PLEASE, DO NOT FORGET TO BACKUP THE PROJECT COLLECTION DATABASE BEFORE YOU RUN THE SCRIPT!!!
One of the clients had an unusual problem accessing SharePoint. When SharePoint was accessed using Internet Explorer, the user kept getting prompted for credentials. At the same time, the user was able to access SharePoint using Firefox or Chrome without a problem. Weird, right? Your credentials should let you in (or not) no matter which browser you're using. Google/Bing searching did not help…
The solution ended up having nothing to do with SharePoint. In the end, the problem was caused by the simple fact that the Active Directory user had the "User Must Change Password at Next Logon" option turned on, and only Internet Explorer picked up on that. Once we removed the "User Must Change Password at Next Logon" option, everything went back to normal. Another problem solved.
I still wonder, though, why Firefox and Chrome ignored the fact that the user needed to change his/her password and Internet Explorer was the only browser that picked up on it. Any thoughts?
As Visual Studio Online has gone to general availability (i.e. no more free rides), a lot of people have used the TFSCONFIG CLOUDIMPORT command to export their TFS projects from Visual Studio Online to an on-premises TFS server. That tool makes the export a very smooth and easy process. But that's not what we're here to talk about. We're here to talk about fixing reports once you have finished the export. The CloudImport command only imports TFS content and does not deal with reports at all, so the reports will have to be fixed manually.
First of all, you need to recreate the reports in the Report Server. By the way, you need to upload the reports from the TFS Scrum template. You can recreate the reports using the TFS Administrator's Toolkit or the all-powerful PowerShell. Once the reports are uploaded and the warehouse/cube has been processed, you will notice that some of the reports do not work. If you check the warehouse processing status, you will notice that an error is being thrown during warehouse processing:
System.Data.SqlClient.SqlException: Cannot create compensating record. Missing historic data. Predecessor of work item(s)…
Very confusing error… It points to some kind of confusion between revisions. How can that be? How can the data get "corrupted" during the export? And, more importantly, what data got "corrupted"? To get an answer to the last question, try running the following SQL statement against your project collection database:
SELECT *
FROM [dbo].[WorkItemsLatest] L
JOIN [dbo].[WorkItemsAre] A
  ON L.[ID] = A.[ID] AND L.[Rev] = A.[Rev]
WHERE L.[Changed Order] <> A.[Changed Order]
If the query returns any records, then we have a problem. Most likely, all of the returned records have messed-up revisions. Potentially, that could be a lot of records. To fix the problem, Microsoft provided us with the following SQL query:
UPDATE L
SET L.[Changed Order] = A.[Changed Order]
FROM [dbo].[WorkItemsLatest] L
JOIN [dbo].[WorkItemsAre] A
  ON L.[ID] = A.[ID] AND L.[Rev] = A.[Rev]
WHERE L.[Changed Order] <> A.[Changed Order]
While running the script may have helped us, please RUN THE SCRIPT AT YOUR OWN RISK! AND, PLEASE, DO NOT FORGET TO BACKUP THE PROJECT COLLECTION DATABASE BEFORE YOU RUN THE SCRIPT!!!
After running the script, manually refresh the warehouse and cube, and you should be fine. Good luck.
UPDATE: See a few more details on how we've fixed the reports after exporting TFS project collection from Visual Studio Online at http://blogs.objectsharp.com/post/2014/06/14/More-on-fixing-reports-after-Visual-Studio-Online-export.aspx
Over the next month, I'm planning to write a custom one day course on Release Management for Visual Studio. Here is a draft course outline:
Module 1: Introduction to Release Management for Visual Studio
- An overview of the features of Release Management
- How to reduce the risks and costs associated with releasing software
- A look at the Build-Deploy-Test-Release process
Module 2: Installing and configuring Release Management for Visual Studio
- Installing and configuring the Release Management Server and Client
- Installing Deployment Agents
- Configuring roles and permissions in Release Management
Module 3: Configuring Release Paths and Release Templates
- Defining servers and environments
- Configuring release path stages
- Understanding release templates
- Setting up a deployment procedure for each stage
- Configuring a build definition to use the Release Management process template
Module 4: Release Management Execution
The course will be available for public and private training at the end of July 2014. After writing this course, I'm planning to author a more comprehensive three-day DevOps course where we will cover TFS Build server, Release Management, and Application Insights from Microsoft Azure. The three-day course should be available around September 2014.
If you're interested in these courses, please contact ObjectSharp training at 416-649-3690.
I was helping another client upgrade TFS 2010 to TFS 2013, as I have done many times before, and I thought I had seen it all: all possible errors, warnings… all the weird stuff. I was wrong. There is always something. Something that can go wrong. Something new to learn. Anyways, back to the upgrade. As I was saying, I was upgrading TFS 2010 to TFS 2013. It began as a fairly straightforward upgrade. You build a new TFS 2013 environment, detach the TFS 2010 project collection, (if you're moving from SQL Enterprise to Standard, back up the database and remove encryption/compression), back up the database, restore the database, and finally attach the TFS 2010 project collection to the new TFS 2013 environment. Easy, right? Well, I forgot to mention fixing reports, SharePoint sites, build definitions, etc. But still, a very straightforward process. Except, this time.
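The database move step in the middle can be sketched with sqlcmd (the server, database, and file names here are illustrative; a restore onto a different server may also need WITH MOVE to relocate the data files):

```shell
# Back up the detached collection database on the old SQL server...
sqlcmd -S OldSqlServer -Q "BACKUP DATABASE [Tfs_DefaultCollection] TO DISK = 'C:\Backup\Tfs_DefaultCollection.bak'"

# ...and restore it on the SQL server behind the new TFS 2013 environment.
sqlcmd -S NewSqlServer -Q "RESTORE DATABASE [Tfs_DefaultCollection] FROM DISK = 'C:\Backup\Tfs_DefaultCollection.bak'"
```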
This time, when I was attaching the TFS 2010 project collection to the newly built and awesome TFS 2013 environment, I got an error: "TF400744: An error occurred while executing the following script: WorkItemTrackingToDev11Beta2Custom.sql. Failed batch starts on the line 1. Statement line: 35. Script line: 35. Error: 4922 ALTER TABLE ALTER COLUMN SeqId failed because one or more objects access this column." I had never seen this error before, and there wasn't much information on it online either. So, I tried re-running the project collection attach process to make sure that we had not missed anything. Got the same error again. Disappointing.
After some digging, a lot of digging actually, I still did not find anything. I knew that the problem must lie within the database; it must have gotten corrupted somehow. But I couldn't put my finger on it. So, I called Microsoft support. We dug some more. Then some more… And, finally, we found the problem. Apparently, the project collection database had been manually modified: an index had been created on one of the columns (the SeqID column), and that was preventing the upgrade process from dropping the column. That was it. That was the reason we lost hours trying to solve the mystery of the failing upgrade… So, the lesson is: DO NOT make manual changes to any TFS databases. It might seem like a good and perfectly harmless idea at the time, but it's not. It never is.
As I'm sure we all know, Visual Studio Online has gone to general availability recently, which means that very soon you will have to pay to use Visual Studio Online. We all knew it was coming… and now you might have to decide whether you should stay in the cloud or move your TFS to your internal network. To be fair, I think that, in some cases, it could very well be worth it to pay for hosting TFS in the cloud instead of hosting one internally. However, in other cases it might be cheaper for customers to host TFS internally. Those users need to move their TFS content from the cloud to an on-premises server. Kudos to Microsoft for providing an easy-to-use Data Export tool (see http://msdn.microsoft.com/en-us/library/7cb80f0d-0119-4277-82e8-719a8db1796e for more info) to migrate your Visual Studio Online contents to an on-premises TFS server. I wish they made that tool available permanently, instead of just for a short time… but I'm not complaining.
Anyways, for the most part this is a fairly easy tool to use, and it worked just fine almost every time I used it. But once, while trying to migrate Visual Studio Online contents to an on-premises TFS server, I got the following error: "TF400711: Error occurred while executing servicing step download bacpac from azure storage for component cloudimport during cloudimport: the remote server returned an error: (403) forbidden". In my case, this error was caused by the simple fact that my export package created in Visual Studio Online had simply expired. I forgot that the package is only valid for 10 days and gets deleted from Microsoft servers once expired. So, simply start the export process again in Visual Studio Online, get a new export package, and run the TFSCONFIG CLOUDIMPORT command again using the new package.