Two times I #HitRefresh

I’ve been an independent consultant since 1993, which means it’s coming up on the 25th anniversary of the first time I #HitRefresh on my career. At that time I was working as an employee at a fairly large data centre in the financial industry, primarily writing teller systems in C for the Credit Unions of Ontario. These systems ran on DOS and OS/2. I #HitRefresh by moving into the consulting world, building applications for Windows written in PowerBuilder.

The second time I #HitRefresh was around 2000, when I latched onto the .NET wave and learned C# and the .NET Framework, along with many other new and exciting tools, the most important to me being Team Foundation Server.

This was a big change from the world I had become accustomed to. My friend and colleague Barry Gervin and I started speaking at user groups, attending the PDC conferences, and going to each other's houses once a week to teach each other different parts of the .NET Framework. We eventually started a company that helped our customers adopt this technology. That was the beginning of ObjectSharp Consulting. We offered consulting and training; some of that training we authored, and some was Microsoft Official Curriculum.

Although there have been many #HitRefresh moments in my career, this was the biggest one. I am sure there will be more. Looking back, it amazes me how this industry has changed since the mid-80s when I began writing code.

This post is part of a series of #HitRefresh moments. Read more at www.HitRefreshbook.com.

Quick and easy custom filtered burn downs

Today I am going to show you how to create quick little burn down charts right off of a work item query.

As you may well know, the burn down chart in TFS is based on the Remaining Work field of the Task work item. It queries all the tasks in the corresponding iteration and shows how the remaining work changes each day during the iteration.

This is great if you are a Scrum team; it's all you need.

I have a lot of customers using TFS who are not using Scrum, or are not even that agile. However, they love the idea of a burn down report, and they want to see it with different filters applied.

Perhaps by individual, by activity, or by Area Path. Remember, not everyone follows the Agile Manifesto 100%. Don't judge. :)

Back to our quick burn downs.

First we need to create a work item query that defines the scope of our burn down. Since it's a query and not the iteration backlog, we can grab any tasks we want, which means we can create a burn down that crosses teams and iterations.

So let's keep it simple for now and create a query that gets all the tasks for the current iteration.

image

Now add a few columns to the query. We'll need Remaining Work, since that is the key to our burn down. Let's also grab Activity, Area Path, and Assigned To for our example.

image
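
To make the scope concrete, here is a rough sketch of the same query expressed in WIQL and run through the work item query REST endpoint from PowerShell. The collection URL, project name and iteration path are placeholders for your own environment; in the query editor you would normally just pick the current iteration.

# Sketch: run the burn down query through the WIQL REST endpoint.
# Replace the collection URL, project and iteration path with your own values.
$collection = "http://yourtfs:8080/tfs/DefaultCollection"
$project    = "YourProject"

$wiql = @"
SELECT [System.Id], [System.Title], [Microsoft.VSTS.Scheduling.RemainingWork],
       [Microsoft.VSTS.Common.Activity], [System.AreaPath], [System.AssignedTo]
FROM WorkItems
WHERE [System.TeamProject] = @project
  AND [System.WorkItemType] = 'Task'
  AND [System.IterationPath] = 'YourProject\Release 1\Sprint 3'
"@

$body = @{ query = $wiql } | ConvertTo-Json

# Returns the ids of the matching tasks; the chart itself is still built in the web UI
$result = Invoke-RestMethod -Uri "$collection/$project/_apis/wit/wiql?api-version=1.0" -Method Post -Body $body -ContentType "application/json" -UseDefaultCredentials
$result.workItems | Select-Object id, url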

Now over to the Charts section of the query. (Make sure you have saved the query so you can make a chart.)

Create a new chart and select a Line Trend report.

Under Values change Count to Sum and select Remaining Work as the field to Sum.

Pick your Rolling period. This is how far back you want to start the burn down.

Now select your Group By field. This is where you can define if you want to see the burn down by Activity or Team Member or Area Path.

image

There is no projection beyond today or an ideal trend line, but I am sure you can visualize that.

With charts you can make a bunch of these, each with their own "group by" field, to augment the burn down for your team's iteration.

Writing to the Build Report in TFS 2015

This is a cool trick for writing something directly onto the build report. 

As an example I will write out the heading “ObjectSharp” and under it I’ll put the website. So it will come out looking like this.

image

The syntax follows this format:

##vso[area.action property1=value;property2=value;...]message

For more examples and syntax, look here. For my example I simply added an inline PowerShell task with the following PowerShell:

# Write the website URL to a temp file, then attach that file to the build summary under the heading "ObjectSharp"
$TempFile = [System.IO.Path]::GetTempFileName()
"http://www.ObjectSharp.com" | Set-Content $TempFile
Write-Host "##vso[task.addattachment type=Distributedtask.Core.Summary;name=ObjectSharp;]$TempFile"

How would I use this? In the past I have used it to write code metric results to the build report. I have also used it to write the URL of the location in Nexus Package Manager where a build artifact was uploaded, ready for deployment.
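
For instance, a variation along these lines drops a small markdown section of metrics onto the report. The metric names and numbers below are invented purely for illustration.

# Sketch: attach a markdown summary section to the build report.
# The metric values below are made up for illustration only.
$SummaryFile = [System.IO.Path]::GetTempFileName()

@"
### Code Metrics
* Maintainability Index: 82
* Cyclomatic Complexity: 310
* Lines of Code: 12,450
"@ | Set-Content $SummaryFile

Write-Host "##vso[task.addattachment type=Distributedtask.Core.Summary;name=CodeMetrics;]$SummaryFile"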

Global Azure BootCamp

On Saturday, April 22, 2017, communities across the world will be holding Azure Bootcamps. There are already 132 planned!

Here is a list of events happening in Canada: Calgary, Montreal, Ottawa, Quebec, Toronto, Vancouver

Sharing files between Host/VM

Since Windows 8.1 I have had Hyper-V installed on my machine. As an MVP I do a lot of demos, so this is wonderful. I used to have a spare machine running Windows 2008 just for this.

One thing I have found difficult to do is share files between my VM and the host machine. All the solutions I was coming across were just not as simple as I would like, until now. I came across this Blog, which outlines 5 ways to achieve this.

Below is, in my opinion, the easiest way to share files between a host and a VM.

1. Start Hyper-V Manager
2. Right click the Hyper-V host and select "Hyper-V Settings"


image

3. Ensure the tick box is ticked under Enhanced Session Mode and click OK.

image

4. Right click the virtual machine -> Settings -> Integration Services, at the bottom left-hand side of the menu.
5. Check "Guest Services" and click OK.

image

6. Start up the virtual machine, copy files from the physical machine, and paste them into the virtual machine!
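
If you prefer to flip these switches from the command line, the equivalent PowerShell looks roughly like this; a sketch assuming an elevated prompt, the Hyper-V module, and a VM named "MyVM".

# Sketch of the same settings from an elevated PowerShell prompt.
# "MyVM" and the file paths are placeholders for your own VM and files.

# Step 3: turn on Enhanced Session Mode on the host
Set-VMHost -EnableEnhancedSessionMode $true

# Steps 4 and 5: the GUI's "Guest Services" checkbox is the "Guest Service Interface" integration service
Enable-VMIntegrationService -VMName "MyVM" -Name "Guest Service Interface"

# With Guest Services enabled you can also push a file host-to-guest without logging in to the VM
Copy-VMFile "MyVM" -SourcePath "C:\Temp\Demo.zip" -DestinationPath "C:\Temp\Demo.zip" -CreateFullPath -FileSource Host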

Build and Release Tasks with DropDown Arguments

I have been creating a lot of Build Tasks for the new TFS 2015 Build and Release.

I have discovered a few tricks that were not simple to find. I thought I would Blog about them for my own record and anyone who happens by.

This Blog post makes the assumption that you have already been creating your own build tasks. If not, and you want some background take a look here first.

In this example I created a Release task to perform a DACPAC deployment. The script calls SqlPackage.exe, which can do several things: deploy the changes directly to a database, or generate a Deployment Report, a Drift Report, or the SQL that needs to be run to update the target, amongst others.

So I figured I would make my build task smart enough to let the end user select which action they would like to perform. To do that I need the user to select an action and have that choice passed into my PowerShell script.

The input value that is defined in my task.json file looks like this.

{
   "name": "action",
   "type": "pickList",
   "label": "Action",
   "defaultValue": "Publish",
   "required": true,
   "helpMarkDown": "Select the Action to perform",
   "options": {
     "Publish": "Publish Changes",
     "Script": "Script Changes",
     "DeployReport": "Generate Deployment Report"
   }
 }

The type needs to be "pickList". I also needed to define options. Each option is a key/value pair: the key (on the left of the colon) is the value that will be returned to the script, and the value (on the right) is the display name shown in the drop down. Notice I set one of the keys as the defaultValue.

Once published to TFS the build task inputs will look something like this.

image

Now the selected value will be passed into my script and I can change my call to SqlPackage.exe accordingly.
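
On the script side, a minimal sketch of how that value might be consumed looks something like this; not my exact task, just the shape of it. The SqlPackage.exe path, server and database names are placeholders, and dacpacFile is a hypothetical second input holding the path to the .dacpac.

# Rough sketch of the script side of the task.
# Assumes the VSTS Task SDK is imported; paths, server and database names are placeholders.
[string]$action = Get-VstsInput -Name action
[string]$dacpac = Get-VstsInput -Name dacpacFile

$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\130\DAC\bin\SqlPackage.exe"

$arguments = @(
    "/Action:$action",
    "/SourceFile:$dacpac",
    "/TargetServerName:MyDbServer",
    "/TargetDatabaseName:MyDatabase"
)

# Script and DeployReport write to a file instead of touching the target database
if ($action -ne "Publish")
{
    $arguments += "/OutputPath:$env:BUILD_STAGINGDIRECTORY\${action}-output.txt"
}

Write-Host "Running SqlPackage.exe with /Action:$action"
& $sqlPackage $arguments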

Making a TFS Build wait

In my career I have created automated builds for many different technologies, and TFS never lets me down. The old XAML build did a good job; however, the new TFS 2015 builds are awesome. First of all, I love PowerShell, which makes my life much easier. I can just call out to a script sitting in source control or, better yet, create my own build task and upload it to the server.

I recently had to create a build for a development tool that generates documents from various system data. This tool has its own editor, version control, release management, everything. It's kind of a closed ecosystem.

So how do we get the rest of the team involved in its builds and deployments? How do I trigger a build and have this system drop a deployable package for me that I can deploy to different environments without having to ask the developers, letting QA be responsible for their own environment?

After talking with an expert in this tool, it turns out it also has an automation component: the developers can write scripts to perform builds and deployments. However, we want this to be part of a larger ecosystem with our other applications.

It was decided that we would create a file share on the network where I could drop a file to let the other system know I wanted a new build dropped. This way TFS still generates a build number that the testers can deploy to the QA environment and reference from their test plan, giving us full end-to-end traceability.

So I wrote a script that drops a file (Build.txt) in a known location on the network. Said product's automation tool watches for my file, triggers a script to perform a build when it sees it, and then drops the built package back for me to turn into a build artifact. I put the build number in this file so the other system can use it as a label in its own version control system.

Then the build waits and watches for a file named BuildComplete.txt to be placed back in that working folder. When my build sees it, I know the built package is there and I can create a build artifact. By the way, BuildComplete.txt contains either the word "successful" or an error log if something went wrong; I pull that out and toss it up into the build console.

My build has two tasks: one to drop the file that triggers the other system, and one to create a build artifact. The second one is already built into TFS ("Copy and Publish Build Artifacts"). The first step is a single PowerShell script; see below, with comments.

The PowerShell script takes two arguments: the WorkingFolder and a TimeOut.
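
For context, those two arguments are declared as inputs in the task's task.json, in roughly this shape (the labels, default value and help text below are illustrative, not the exact definitions):

"inputs": [
   {
      "name": "WorkingFolder",
      "type": "string",
      "label": "Working Folder",
      "defaultValue": "",
      "required": true,
      "helpMarkDown": "UNC path of the share watched by the other system"
   },
   {
      "name": "TimeOut",
      "type": "string",
      "label": "Timeout (seconds)",
      "defaultValue": "600",
      "required": true,
      "helpMarkDown": "How long to wait for BuildComplete.txt before failing the build"
   }
]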

image

The inline comments explain how it works.

param()

#This is how we do parameters in a custom build task
Trace-VstsEnteringInvocation $MyInvocation
try {
    Import-VstsLocStrings "$PSScriptRoot\Task.json"
    [int32]$timeout = Get-VstsInput -Name TimeOut
    [string]$WorkingFolder = Get-VstsInput -Name WorkingFolder
}
finally
{
    Trace-VstsLeavingInvocation $MyInvocation
}

#Start the timer at zero
$timesofar = 0
#This is the file we are looking for to know the package is complete
$BuildComplete = "$WorkingFolder\BuildComplete.txt"
#Clean the working folder
Remove-Item "$WorkingFolder\*"
#Create a file containing the build number to trigger the build
New-Item -path "$WorkingFolder" -name 'Build.txt' -type 'file' -value "$env:BUILD_BUILDNUMBER"

Write-Host "Triggered Build"
#While BuildComplete.txt does not exist and the timer has not reached the specified timeout
while (!(Test-Path $BuildComplete) -and ($timesofar -lt $timeout))
{
    #Wait for 10 seconds
    Start-Sleep 10
    #Add 10 seconds to the time so far
    $timesofar = $timesofar + 10
}

#Either the file has been dropped by the other system or the timeout was reached
if (!(Test-Path $BuildComplete))
{
    #If the file was not dropped it must be the timeout, so fail the build and write to the console
    Write-Error "Build Timed out after $timeout seconds."
}
else
{
    #If we got here the file was dropped
    #Get the content of the file
    $BuildResult = Get-Content $BuildComplete -Raw

    #Check if it says successful
    if ($BuildResult.StartsWith("successful"))
    {
        #On success say so
        Write-Host "Successful Build"
        #Write the contents out in case the calling system wants to tell us something
        Write-Host "$BuildResult"
    }
    else
    {
        #On error fail the build and say so
        Write-Error "Build Error"
        #Write the contents out so we can see why it failed
        Write-Error "$BuildResult"
    }
}

Build Toronto June 10th 2016

  • 400+ attendees in Toronto (Allstream Centre)
  • 1000+ attendees online on Channel 9 with live Q&A
  • Keynote highlighting the most important announcements from Build 2016.
  • Deep Dive sessions on Universal Windows Platform, Xamarin, Web Apps, Project Centennial, and Azure IoT.
  • Live Panel discussion with influential members of the CDN Dev Community
  • Free to attend. Space is limited.

Tech Days Online

Next week it all happens again! For the fifth year, Microsoft United Kingdom is running its TechDays Online event, at no cost to attend. Covering the top topics in IT today, you can watch all the live stream action May 18th and 19th from 9:30am to 5:30pm. Register to attend or catch up on all the great content here.

As in past years, this popular event has attracted noteworthy speakers from across Microsoft's technical community, including many, many MVPs. The focus this year will be Intelligence and DevOps.

On Wednesday, May 18th, the presentations will feature Analytics on Azure and Cortana Intelligence, and explore Microsoft's Intelligent Cloud and the interconnected services called the Cortana Analytics Suite. Technical experts already building these services will explain what these services are and how they fit together to enable organizations to leverage their data, and will share their own experiences.

On Thursday, May 19th, the theme shifts to DevOps with the MVPs and community leaders talking about the latest practices to enable organizations to increase efficiency. Throughout the day the main technologies discussed will enable viewers to implement DevOps practices within their own organization. As with day one, technical experts and MVPs already taking advantage of these practices will share their journey and real-world experiences with DevOps.

 Here's the agenda (subject to change):

Day One;

Day Two;

Clean up the GlobalList.xml

TFS has been using the GlobalList.xml file to store build numbers since 2010. This global list is used in several work items as a suggested list of values. For example, in the Bug work item there are two fields that use it: Found In and Integrated In. These drop down fields present a list of builds that comes from the global list.

Although it's not the great solution and I hear it's changing. It's what we have had for 6 years so we have used it. I have many clients that rely on these work item fields and therefore the list of builds.

However, when you delete a build it cleans up a lot of stuff, including the label, drop folder, symbols and test results, but not the entry in the global list.

Ours was getting out of hand, so I built a PowerShell script I could run to clean up the global list by checking whether each build had been deleted from TFS. If a build has been removed, the script removes it from the list. Because this list is implemented as a suggested list in the work items, removing entries has no effect on old work items referring to those builds; they will still have their value. It just means the cleaned-up builds will no longer appear in the list to be selected.

[Update on Nov 24 2016] I have since cleaned up and fixed this PowerShell script. The script does not upload the updated list until you set $UpdateServer = 1.

[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")  
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Client")  
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Common")  
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.VersionControl.Client")  
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.VersionControl.Client.VersionSpec")  
 
Add-Type -AssemblyName System.Xml
 
cls
#Clean up the build entries in the Global List
 
[string] $GlobalListFileName = "D:\Temp\GlobalList.xml"
[string] $NewGlobalListFileName = "D:\Temp\NewGlobalList.xml"
$RemovedBuilds = new-object System.Collections.ArrayList
[String] $GlobalListName = "Builds - NBFC"
[string] $tfsServer = "http://tfs.nbfc.com/tfs/products"
[string] $witadmin = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\witadmin.exe"
[bool] $UpdateServer = 0
 
function GlobalList( [string] $action, [string] $GlobalList)
{
 
    $arguments = $action + ' /collection:"' + $tfsServer + '" /f:"' + $GlobalList + '"'
    write-host $witadmin $arguments
    #start-process -FilePath $witadmin -Argumentlist $arguments -wait -WindowStyle Hidden
 
    $pinfo = New-Object System.Diagnostics.ProcessStartInfo
    $pinfo.FileName = """$witadmin"""
    $pinfo.RedirectStandardError = $true
    $pinfo.RedirectStandardOutput = $true
    $pinfo.UseShellExecute = $false
    $pinfo.Arguments = $arguments
    $p = New-Object System.Diagnostics.Process
    $p.StartInfo = $pinfo
    $p.Start() | Out-Null
    $p.WaitForExit()
    $stdout = $p.StandardOutput.ReadToEnd()
    $stderr = $p.StandardError.ReadToEnd()
 
    if ($p.ExitCode -eq 0)
    {
        Write-Host "$action Successful"
        Write-Host "stdout: $stdout"
    }
    else
    {
        Write-Host "$action Failed" 
        Write-Error "Error: $stderr" 
        exit $p.ExitCode
    }
    $p.Close()
 
}
 
GlobalList "exportGloballist" $GlobalListFileName
 
#Load the contents of GlobalList.xml
[xml]$doc = Get-Content($GlobalListFileName)      
$root = $doc.DocumentElement
$BuildsList = $root.SelectSingleNode("GLOBALLIST[@name='$GlobalListName']")
 
#sort the list of builds (remove just the child items, so the list keeps its name attribute, then re-append them in order)
$orderedBuildsCollection = @($BuildsList.LISTITEM | Sort-Object value)
$orderedBuildsCollection | foreach { $BuildsList.RemoveChild($_) } | Out-Null
$orderedBuildsCollection | foreach { $BuildsList.AppendChild($_) } | Out-Null
 
#Connect to TFS and get the Build Server
$server = new-object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection(New-Object Uri($tfsServer))
$buildServer = $server.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])
 
#Create a list of builds that are no longer in TFS
foreach ($child in $BuildsList.ChildNodes)
{
    [array]$items = $child.value.Split("/")
    [string]$buildDefinitionName = $items[0]
    [string]$buildNumber = $items[1]
    #Look in the XAML Builds List 
    $buildDetail = $buildServer.QueryBuilds("NBFC", $buildDefinitionName) | ? { $_.BuildNumber -eq $buildNumber }
    if ($buildDetail)
    {
       write-host "$($buildDefinitionName) - $($buildNumber) is Valid"
    }
    else
    {
        $apicall = "http://TFS.NBFC.COM:8080/tfs/Products/NBFC/_apis/build/builds/?api-version=2.0&DefinitionName=$buildDefinitionName&buildNumber=$buildNumber"
        write-host $apicall
        $json = Invoke-RestMethod -Uri $apicall -Method Get -UseDefaultCredentials
        if ($json.count -eq 0)
        {
            $RemovedBuilds +=  $child.value
            write-Warning "$($buildDefinitionName) - $($buildNumber) Will be Removed"
        }
        else
        {
            write-host "$($buildDefinitionName) - $($buildNumber) is Valid"
        }
    }    
}
 
#[xml]$mainlist = get-content $GlobalListFileName
 
foreach ($toRemove in $RemovedBuilds)
{
    $query = "//LISTITEM[@value=""$toRemove""]"
    write-host "Select node $query"
    $BuildNode = $doc.SelectSingleNode($query)
    if ($BuildNode)
    {
        $BuildNode.ParentNode.RemoveChild($BuildNode)
    }
    else
    {
        write-host "Can't find Entry $toRemove"
    }
}
 
#Sort the list 
#$orderedBuildsCollection = $BuildsList.LISTITEM | Sort Value
#$BuildsList.RemoveAll()
#$orderedBuildsCollection | foreach { $BuildsList.AppendChild($_) } | Out-Null
 
$doc.save($NewGlobalListFileName)
 
 
if ($UpdateServer)
{
    GlobalList "importgloballist" $NewGlobalListFileName
}