A new version of the TFS PowerTools was released recently.
This release includes changes to the following areas:
- TFS Backup
- TFS Best Practices Analyzer
- TFPT.EXE Version control command line
- Windows Shell Extensions
You can get more details and download from here.
I’m a little slow getting to this — scratch that, no dashes: I’m a little slow getting to this. I've had no time to blog in the past couple of weeks.
Last week's big announcements: two service packs and a Feature Pack, oh my!
1. Unlimited load testing. Now you can load test your application without having to purchase Virtual User Packs. You used to have to purchase these packs to be able to simulate, say, 1000 users. Now this comes free with the Ultimate edition of MSDN.
2. VS/TFS 2010 SP1 and the TFS-Project Server Integration Feature Pack are out in beta. The service packs include a bunch of fixes (see the links below), while the Feature Pack allows you to integrate Project Server with TFS.
I learnt something I didn’t know, so I thought I would share it. I was creating a build for a client that spans several technologies: PowerBuilder, Java, .NET 1.1, SQL Server 2008, Reporting Services, and .NET 3.5.
I altered the DefaultTemplate to perform the other technology builds. The problem was that the BuildSettings argument is required, and in this case I didn’t want it to be; they may use the process to build something that is not a .NET solution, after all.
The trick I found: if you add BuildSettings to the Metadata argument and don’t mark it as required, that overrides the default. This way you can still use BuildSettings just as you would in the DefaultTemplate, but if it isn’t needed for the build definition you are creating, it will not be required.
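For reference, the Metadata argument is a collection of ProcessParameterMetadata items in the template's XAML. A sketch of roughly what the entry looks like follows — attribute names should be verified against your own template, and it is normally safer to edit this through the Metadata collection editor than by hand:

```xml
<!-- Illustrative sketch only: one entry in the Metadata argument of the
     build process template. Leaving Required as False is what overrides
     the default behavior of the BuildSettings parameter. -->
<mtbw:ProcessParameterMetadata ParameterName="BuildSettings"
                               DisplayName="Build Settings"
                               Required="False" />
```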
Deb Forsyth posts lots of great information about TFS and its related tools for the test community. She recently put up a post about the new Feature Pack for Visual Studio.
There is one feature in this Feature Pack that is particularly useful: the Coded UI Test Editor. If you have worked with Coded UI Tests you will know that, although powerful, the tooling in version 1 needs some maturing. Well, that maturing has already begun. When you create a Coded UI Test it generates a couple of files: the UIMap.uitest, which maps the methods that manipulate the UI and the assertions to the application under test, and the Coded UI Test class, which is really just a test class with test methods that call methods on this UIMap object.
Prior to Feature Pack 2, making a change to UIMap.uitest was not easy; you needed a good working knowledge of .NET development. Now, however, we have the Coded UI Test Editor. After you install the new Feature Pack, open the UIMap file in your test project and an editor will open that lets you see all the methods in your UIMap, rename them, and split them out into other methods. It will also let you change an assertion, as seen in the image below.
This is a welcome tool when writing automated tests using Visual Studio.
If you ever get an error like this in a build, think back: did you recently remove a build argument from the build process template?
Apparently, when you modify the build process template so that it is no longer in sync with the build definition, you can get this error.
To solve this problem you need to get them back in sync, even though from the Process tab of the build definition all your arguments seem to be fine.
To fix the problem, open the build definition and, from the Process tab, refresh the build process.
That fixed the problem for me.
Thanks to Jakob Ehn for his post which explains this issue in more detail.
I have been helping customers who are implementing Team Foundation Server (TFS) and would like to put PowerBuilder code into TFS source control. I’m not sure how different this would be in the newest versions of PowerBuilder, which support XAML and where PowerScript is a .NET language. For the older versions, however, where the customer is still using native PB, there are a couple of things you might want to do to make the experience better for all involved.
First, let me explain one problem with PowerBuilder and TFS: if you just add the PBLs in your target to TFS, all the exported objects end up in one folder, and it’s hard to tell where everything came from. This may not be a big deal if you never look at Source Control Explorer in Team Explorer, but why wouldn’t you?
So here are my suggestions:
1. First, set up a folder structure that will become your local working copy of the PB code. I recommend putting each PBL into its own folder. This might seem odd at first, but once TFS gets hold of it and all your objects are in each folder along with the .PBG file, it will be a lot easier to work with. So let's say your folder structure looks something like this:
MyPBApp
|-- Logistics
|-- Order
|-- Shipping
|-- Warehouse
2. Now put your target file in the MyPBApp folder. In .NET speak, the PBL is the project and the target is like the solution. There is one more level above that (the workspace), but I’m going to ignore that in source control and just keep it local. You will have to fix the target in PowerBuilder so that it can find the PBLs in their new locations.
3. Make sure PowerBuilder is connected to TFS via the MSSCCI provider. You can set up the connection by right-clicking the PowerBuilder workspace, selecting Properties, and going to the Source Control tab. I would also install Team Explorer so you have access to the full feature set of TFS.
4. From PowerBuilder, pick your target and select Add To Source Control. This will export all the objects out of their PBLs and create a PBG file, which is a manifest of the objects inside each PBL. Now anyone on the team can get latest and open the target in their local workspace.
5. Make sure to tell everyone to make the local copies of their PBLs writable; otherwise they will have to check them out whenever they check out an object. Trust me, it’s easier to make them writable locally.
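That last step is easy to script. Here's a minimal sketch in Python (the workspace path in the example is an assumption — substitute your own) that clears the read-only flag on every .pbl under a folder so nobody has to do it by hand:

```python
import os
import stat

def make_pbls_writable(root):
    """Clear the read-only attribute on every .pbl file under root.

    Returns the list of paths that were changed. On Windows, adding the
    owner-write bit via os.chmod clears the read-only attribute that a
    'get latest' from TFS sets on local files.
    """
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".pbl"):
                path = os.path.join(dirpath, name)
                os.chmod(path, os.stat(path).st_mode | stat.S_IWRITE)
                changed.append(path)
    return changed

# Hypothetical usage:
# make_pbls_writable(r"C:\src\MyPBApp")
```

Running this once after each get latest keeps the PBLs writable without anyone having to check them out first.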
Would you like to be able to run code metrics for your application via the build process? As of two days ago, you can: on January 26, 2011, Microsoft published the Visual Studio Code Metrics PowerTool 10.0.
With this power tool you can perform code metrics via the command line so it can be called from your build.
Just download and run metrics.exe /? for help.
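A typical invocation points the tool at a built assembly and writes the results to an XML file. The file names below are placeholders; check metrics.exe /? for the full switch list:

```
metrics.exe /f:MyApp.dll /o:MetricsResults.xml
```

The XML output can then be published as a build drop artifact or parsed in a later activity of your build workflow.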
This is a great walkthrough that explains how to deploy a Database from a TFS 2010 Build WorkFlow.
It uses a command line tool called VSDBCMD.EXE, that is explained here.
I recently used this technique and it works great.
I actually created a fairly sophisticated build-and-deploy process, based on the TFS 2010 build process templates, that builds a database project and lets the user decide, via variables, whether to deploy to a database or just create the script.
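As a rough sketch, that deploy-or-script choice maps onto VSDBCMD's /dd switch: /dd:+ deploys directly to the target database, while /dd:- only generates the deployment script. The manifest name and connection string below are placeholders:

```
VSDBCMD.EXE /a:Deploy /manifest:MyDb.deploymanifest /cs:"Data Source=SQL01;Integrated Security=True" /dd:+
```

In the build workflow, a variable can simply switch the /dd value between + and - when composing the arguments for the InvokeProcess activity.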
Something to watch out for, though. Take a look at the section titled “To define the Then Deploy block”, and then look at step 3.d:
Set the FileName property to the path of VSDBCMD.EXE on your build server. For example, you might specify C:\Program Files\Microsoft Visual Studio 10.0\VSTSDB\Deploy\VSDBCMD.EXE if you installed Visual Studio on your build computer or C:\Deploy\VSDBCMD.EXE if you just copied the Deploy folder to your build computer.
Notice the reference to “your build server”. This seems like an innocent enough statement. However, if your build controller is a different machine than your build agent, it makes a difference where you put the InvokeProcess activity that calls VSDBCMD.
If you installed VSDBCMD on the build agent, make sure the InvokeProcess for VSDBCMD is within the Run On Agent block in your build process, or it won’t work. If you follow the walkthrough and put your InvokeProcess after the Check In Gated Changes block, the process will not be able to find VSDBCMD.EXE, because it’s installed on the build agent and at that point you are on the controller. If they are the same machine, it won’t make a difference.
Referring to the diagram below: if your controller and agent are different machines, everything in the Run On Agent block runs on the agent, and everything else runs on the controller.
Team Foundation Build variables: You can use the following variables in a build agent working directory:
- $(BuildAgentId): An automatically generated integer that uniquely identifies a build agent within a team project collection.
- $(BuildAgentName): The Display Name of the build agent.
- $(BuildDefinitionId): An automatically generated integer that uniquely identifies a build definition within a team project collection.
- $(BuildDefinitionPath): The team project name and the build definition name, separated by a backslash.
This Blog post contains material that is not suitable for the branching novice. Viewer discretion is advised.
First, I would like to say that this should be avoided if at all possible. Having a relationship between branches makes merging much easier to deal with, so unless you absolutely have to merge between unrelated branches, try not to.
Now that disclaimers are out of the way, what is a baseless merge?
Take the following branch hierarchy: Dev can be merged with QA, and QA can be merged with Hotfix or Prod. If you try to merge a change from Hotfix directly to Dev, the UI will not let you; there is no relationship between them, so it would be a baseless merge.
If I were to make a change in the Hotfix branch and attempt to merge it with Dev, that option would not be available. As you can see in the merge dialog below, the only option available for merging is the QA branch.
If I were to merge changeset 108 from Hotfix to QA, the visualizer would show something like this.
If I wanted to merge the latest version of Hotfix into Dev, skipping QA, I would have to do a baseless merge from the command line, using the /baseless switch on the tf merge command.
tf merge /recursive /baseless Hotfix Dev
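Before committing to the baseless merge, you can preview which changesets would come across by adding the /candidate switch (branch names here match the example above):

```
tf merge /recursive /baseless /candidate Hotfix Dev
```

This lists the candidate changesets without performing the merge, which is worth doing given how easy it is to pull over more than you intended in a baseless merge.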
If we then take a look at the visualizer, we can see that we did a baseless merge, denoted by the dotted line.