Getting output from Remote PowerShell on Target Machines

The Remote PowerShell on Target Machines task in TFS/VSTS is awesome. Easy to use and it works. The only thing that bugs me about that task is that I don't see the output of the script in the build console output. Luckily, this is easily fixed. All you have to do is use Write-Verbose -Verbose instead of the Write-Output command inside your PowerShell script.

For example, instead of:

Write-Output "Remote script is executing step 1-2-3"

use the following:

Write-Verbose "Remote script is executing step 1-2-3" -Verbose

You should now see the remote script's output in the build console.

Persisting PowerShell variables between steps

In VSTS and in TFS 2015 Update 1 (and higher), you can persist PowerShell variables between steps by simply writing the following command to standard output (Write-Host) in PowerShell:

Write-Host ("##vso[task.setvariable variable=testvar;]testvalue")

 

This will make the variable available to be consumed by later steps as an input via the $(testvar) syntax, and the replacement will happen as expected. It's that easy.
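For example, here is a minimal two-step sketch (the variable name and value are just for illustration). The first PowerShell step sets the variable; a later step can reference it as $(testvar) in any task input, and it also surfaces as an environment variable inside later scripts:

# Step 1: set the variable for subsequent steps
Write-Host ("##vso[task.setvariable variable=testvar;]testvalue")

# Step 2 (a later PowerShell step): read the same variable from the environment
Write-Host "testvar is now: $env:TESTVAR"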

You can see the full list of commands you can use from your scripts at https://github.com/Microsoft/vsts-tasks/blob/master/docs/authoring/commands.md.

Customize TFS build steps on the fly based on build trigger

I have the following scenario. The client currently has multiple build definitions, which are very similar (not identical, but similar) but have different triggers. For example, nightly builds run more/different tests, while gated or CI builds run fewer tests and perhaps fewer steps. They want to combine/merge those build definitions into one. With TFS 2015 you can create multiple triggers on the same build definition, so we should be able to merge those build definitions into one. The problem is that the build steps must be the same, or we need to figure out a way to update those steps on the fly somehow. Having the same steps is not an option, since we want gated/CI builds to be smaller/faster. So, we need to figure out a way to modify build steps on the fly. Somehow. PowerShell to the rescue.

I wrote a small PowerShell script that checks whether the running build is a gated build and updates a build variable in memory (in my case, a variable called TestFilterCriteria), which is later consumed by the next steps in the build definition.

 

function Is-This-Gated-Build ($buildid)
{
    # Query the TFS REST API for the build details and check the build reason
    $buildDetails = Invoke-RestMethod -Uri "http://TFSSERVER:8080/tfs/TFSCOLLECTION/TEAMPROJECT/_apis/build/builds/$($buildid)?api-version=2.0" -UseDefaultCredentials -Method Get
    $buildReason = $buildDetails.reason

    if ($buildReason -eq 'CheckInShelveset')
    {
        return $true
    }
    else
    {
        return $false
    }
}

# The id of the running build comes from the predefined BUILD_BUILDID variable
$gatedBuild = Is-This-Gated-Build $env:BUILD_BUILDID

if (!$gatedBuild)
{
    Write-Output "Running non-gated build..."

    # DO SOME POWERSHELL MAGIC HERE

    $TestFilterCriteriaToSet = "(Name!=TestProcessQueue)"
    Write-Host ("##vso[task.setvariable variable=TestFilterCriteria;]$TestFilterCriteriaToSet")
}
else
{
    Write-Output "Running gated build..."

    $TestFilterCriteriaToSet = "(TestCategory!=LongRunning & TestCategory!=Commit & TestCategory!=IntermittentFailure & TestCategory!=ExecutableLaunch)"
    Write-Host ("##vso[task.setvariable variable=TestFilterCriteria;]$TestFilterCriteriaToSet")
}

 

Next steps: Save the script. Check it in. Add this script as a step in the build definition. Consume the TestFilterCriteria variable in the next step via the $(TestFilterCriteria) syntax, for example, in the Test Filter criteria input of the Visual Studio Test step. That's it.

Downgrade SQL Server database from Enterprise to Standard Edition

I've seen quite a few clients who have SQL Server Enterprise Edition installed even though they don't use any of the enterprise features of SQL Server. For example, TFS does not require SQL Server Enterprise Edition, unless of course you need encryption or compression or SQL clusters. It works just fine with SQL Server Standard Edition. So, if you're not using enterprise features, having SQL Server Enterprise Edition installed seems like overkill. An expensive overkill. Fortunately, this can be fixed. Here is what you need to do to downgrade a SQL Server database from Enterprise Edition to Standard Edition.

First, let's check what edition of SQL Server you're running. To do that, open SQL Server Management Studio, connect to your SQL Server, and create/run a new SQL query:

SELECT @@version
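The version banner includes the edition. If you want just the edition string by itself, the built-in SERVERPROPERTY function works too:

SELECT SERVERPROPERTY('Edition')

This returns something like "Enterprise Edition (64-bit)".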

Let's assume that you're running SQL Server Enterprise Edition. To downgrade a database from Enterprise Edition to Standard Edition, we need to disable compression on that database. Before we do that, we need to make sure we do the following:

  • BACKUP your database, because you know… it's the smart thing to do.
  • Make sure that your SQL Server has enough disk space available to accommodate the larger database size. Remember that uncompressing a database will increase its size (a quick way to check the current size is shown right after this list).
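For a quick read on the current database size before you rebuild anything, sp_spaceused works (a minimal check; the YourDatabase name is a placeholder for your own):

USE YourDatabase;
-- Reports database size and unallocated space, plus reserved/data/index totals
EXEC sp_spaceused;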

To disable compression, create/run the following script against the database in question:

SELECT DISTINCT 'ALTER TABLE [' + SCHEMA_NAME(o.schema_id) + '].[' + o.name + '] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = NONE);'
FROM sys.partitions p
JOIN sys.objects o ON p.object_id = o.object_id
WHERE o.type = 'U'
AND p.data_compression_desc != 'NONE'
UNION
SELECT 'ALTER INDEX ALL ON [' + SCHEMA_NAME(o.schema_id) + '].[' + o.name + '] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = NONE);'
FROM sys.partitions p
JOIN sys.objects o ON p.object_id = o.object_id
WHERE o.type = 'U'
AND p.data_compression_desc != 'NONE'

If any of the tables in the database have compression enabled, this script will generate ALTER statements as its output. When that happens, copy the generated statements and execute them against the same database to disable compression. Do that for every database that has compression enabled.
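For example, the generated output will contain statements like these (the table name here is hypothetical):

ALTER TABLE [dbo].[Orders] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = NONE);
ALTER INDEX ALL ON [dbo].[Orders] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = NONE);

That's it.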

DevOps Days Hackathon Toronto

A one-day DevOps Hackathon event is coming to Toronto on May 25th. It's a full-day event where you will get a chance at some hands-on learning experience with DevOps concepts and automation using tools from Microsoft & Chef. Here is the agenda for the hackathon event:

Agenda:

9:00 AM – 9:30 AM Registration, Breakfast & Introduction

9:30 AM – 10:00 AM Session I – DevOps and the Cloud: Microsoft Azure

10:00 AM – 10:45 AM Session II – Chef + Microsoft Azure, Managing your cloud consistently, securely and transparently

11:00 AM – 12:00 PM Hackathon Begins

12:00 PM – 1:00 PM Working Lunch

1:00 PM – 3:00 PM Hackathon Continues

3:00 PM – 4:45 PM Hackathon Ends & Group Presentations (5 min/group)

4:45 PM – 5:00 PM Closing Remarks & Winners Announced

 

If you are interested in participating, feel free to register at https://www.eventbrite.com/e/devops-days-toronto-2016-tickets-17038721274?access=DEVOPSTOHACKATHON. This event takes place a day before DevOps Days Toronto kicks off. DevOps Days Toronto is an excellent event, which is unfortunately already fully booked, but the hackathon has only recently opened up and promises to be very informative. So, go ahead and register. It will be fun… See you there.

 

TFS chart size limits

TFS is an excellent ALM/DevOps tool. A very flexible ALM/DevOps tool. You can do a lot with it. A lot. Once in a while though, like with any other tool, you hit a hurdle that prevents you from doing what you and/or the customer are trying to do with TFS. For example, recently I encountered the following error: VS402395: The daily number of work items returned exceeds the trend chart size limit of 1000. The error was caused by the customer trying to build a chart from a query that returns more than 1,000 work items. I questioned why they would need a chart for a query that big, and the customer seemed to have a good reason to chart that many items. I might not agree with that reason, but the customer thought it was important, so I promised to look into it.

And, of course, where there is a will, there is a way. Especially if you're dealing with a tool as flexible and feature-rich as TFS. TFS has a very rich API. And, almost always, if there is something that you cannot do in the UI, you can do it via the API. Just like in this case. Even though there is no way to bypass the chart size limit in the UI, there is a very straightforward way to get it done using the API. OK, enough talk. Here is how you can change the chart size limit using PowerShell and the TFS API:

 

# Load the TFS snapin (from the TFS Power Tools) if it is not already loaded
if ( (Get-PSSnapin -Name Microsoft.TeamFoundation.PowerShell -ErrorAction SilentlyContinue) -eq $null )
{
    Add-PSSnapin Microsoft.TeamFoundation.PowerShell
}
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Common")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.WorkItemTracking.Client")

# Connect to the collection and get its registry service
$tfsCollection = New-Object -TypeName Microsoft.TeamFoundation.Client.TfsTeamProjectCollection -ArgumentList "TFS_COLLECTION_URL"
$hiveCollection = $tfsCollection.GetService([Microsoft.TeamFoundation.Framework.Client.ITeamFoundationRegistry])

Write-Output "The current value of the chart size limit is:"
$hiveCollection.GetValue("/Service/WorkItemTracking/Settings/MaxTrendChartTimeSliceResultSize")

Write-Output "Changing the chart size limit to 1500 (the maximum)..."
$hiveCollection.SetValue("/Service/WorkItemTracking/Settings/MaxTrendChartTimeSliceResultSize", 1500)

Write-Output "The new setting for the chart size limit is:"
$hiveCollection.GetValue("/Service/WorkItemTracking/Settings/MaxTrendChartTimeSliceResultSize")


That's it. Have a good day :)

Excluding weekends in Azure Automation Runbooks

Getting Azure Automation runbooks to shut down your virtual machines (or turn them on) automatically is nothing new. There are a lot of scripts out there that can do it for you. You can also write one yourself. It's not that complicated. I did it :) Just kidding…

There are a couple of ways my PowerShell scripts are different:

  1. First, the scripts that automatically start/stop Azure virtual machines take the weekend into account. The scripts will not turn your machines on or off on the weekend. After all, you probably do not want to automatically turn on your virtual machines in Azure early in the morning on a weekend, just so that you can turn them off at the end of the day. Seems like a waste, right? Anyway, this small change should save you a few bucks.
  2. Second, the scripts adjust the schedule from UTC to the time zone you need. When scripts that are part of Azure Automation runbooks run, they use UTC time. So, if you're in Toronto, a script will think that the weekend starts 5 hours earlier. That's not so bad, I guess. But it also means that the weekend will end 5 hours earlier, and that's just not right and needs to be fixed.

Below is a code snippet that makes the above happen:

# Runbooks run in UTC, so convert to Eastern time before checking the day of the week
$UTCTime = (Get-Date).ToUniversalTime()
$TZ = [System.TimeZoneInfo]::FindSystemTimeZoneById("Eastern Standard Time")
$LocalTime = [System.TimeZoneInfo]::ConvertTimeFromUtc($UTCTime, $TZ)
$day = $LocalTime.DayOfWeek

if ($day -eq 'Saturday' -or $day -eq 'Sunday')
{
    Write-Output ("It is " + $day + ". Cannot use a runbook to start VMs on a weekend.")
    Exit
}
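To show where this check sits in a complete runbook, here is a minimal sketch of a start-VMs runbook built around it. This is an illustration only: the "AzureCredential" asset name is hypothetical, and the classic Azure cmdlets (Add-AzureAccount, Get-AzureVM, Start-AzureVM) are what a runbook from this era would typically use; the complete scripts linked below are the real deal.

# Minimal start-VMs runbook sketch (assumes a credential asset named "AzureCredential")
$cred = Get-AutomationPSCredential -Name "AzureCredential"
Add-AzureAccount -Credential $cred

# Skip weekends, adjusting from UTC to Eastern time first
$UTCTime = (Get-Date).ToUniversalTime()
$TZ = [System.TimeZoneInfo]::FindSystemTimeZoneById("Eastern Standard Time")
$LocalTime = [System.TimeZoneInfo]::ConvertTimeFromUtc($UTCTime, $TZ)
$day = $LocalTime.DayOfWeek
if ($day -eq 'Saturday' -or $day -eq 'Sunday')
{
    Write-Output ("It is " + $day + ". Cannot use a runbook to start VMs on a weekend.")
    Exit
}

# Start every classic VM that is currently deallocated
Get-AzureVM | Where-Object { $_.Status -eq "StoppedDeallocated" } | ForEach-Object {
    Start-AzureVM -ServiceName $_.ServiceName -Name $_.Name
}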

 

The complete scripts that start or stop Azure virtual machines can be downloaded from OneDrive. Enjoy.

Accessing Git from Behind the Proxy

So, you have installed a Git client and are trying to connect to a Git server (on Visual Studio Team Services, GitHub, or whatever), but you're getting a "fatal: unable to connect a socket (Invalid argument)" error. One of the reasons could be that you're behind a proxy. For example, you're at work and your employer requires all internet traffic to go through a proxy. The ~/.gitconfig global config file is the key here. In this case, to get the Git client to work with the proxy, you need to configure the http.proxy key in the Git config using one of the following commands:

git config --global http.proxy http://proxyuser:proxypwd@proxy.server.com:8080

or

git config --global https.proxy https://proxyuser:proxypwd@proxy.server.com:8080

  • change proxyuser to your proxy user
  • change proxypwd to your proxy password
  • change proxy.server.com to the URL of your proxy server
  • change 8080 to the proxy port configured on your proxy server

If you do not need to authenticate to the proxy, just specify the proxy server name and port number and skip the proxy user and password.
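For example, reusing the same placeholder server name and port:

git config --global http.proxy http://proxy.server.com:8080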

 

If you decide at any time to reset the proxy and work without one, use one of the following commands:

git config --global --unset http.proxy

or

git config --global --unset https.proxy

 

Finally, to check the currently set proxy, use one of the following commands:

git config --global --get http.proxy

or

git config --global --get https.proxy

 

By the way, to retrieve the proxy settings you're using on Windows, you can use one of the following commands:

reg query "HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings" | find /i "proxyserver"

or

netsh winhttp show proxy

That's all I've got to say about Git and proxy servers.

tf.exe (Team Explorer) vs. tf.exe (Team Explorer Everywhere)

I recently had an interesting experience writing a post-build PowerShell script for a client. The client wanted to check certain files into source control after the build finished. Sounds easy, right? You can use either the good old tf.exe command-line utility from the Visual Studio command tools, or you can use something more current like PowerShell to write a simple script that checks in pending changes for you. The problem is that the client also wanted to associate work items with the check-in. Not a big deal, right? Well, apparently it is a big deal. You cannot associate a work item with a check-in using the tf.exe command tool. And, what's even stranger, I could not find a way to associate a work item with a check-in using PowerShell. I got stuck trying to figure out how to make the WorkItemCheckinInfo[] parameter of the Workspace.CheckIn method work properly.

This is how I learned that you apparently can associate a work item with a TFS check-in, but you have to use tf.exe from Team Explorer Everywhere. Apparently, even though the names are the same, these are very different command-line utilities. When you use tf.exe from Team Explorer Everywhere, you can associate a work item with the check-in using a simple command:

tf checkin ItemSpec -associate:WorkItemIds
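For example, a hypothetical invocation (the item spec and work item id here are made up for illustration) would look like this:

tf checkin $/MyTeamProject/Main/version.txt -associate:1234

If I recall correctly, multiple work item ids can be passed as a comma-separated list.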

It's that easy. I just wish the -associate option were available in the regular tf.exe from the Visual Studio command tools. I also wish that those two seemingly identical tf.exe commands would actually do the same thing (the same way), or at least that they would have different names to avoid confusion. By the way, there are also other differences between these two same-named commands. You can get them from the links provided in the post. I'm too upset to list them myself :(

Toronto Enterprise DevOps User Group

I have started a new user group focused on Enterprise DevOps. DevOps is getting significant attention in the industry. Many organizations don't understand what DevOps is or how to adopt DevOps practices effectively, and are not aware of what DevOps tools to use. The Toronto Enterprise DevOps User Group is focused on applying DevOps practices in the enterprise environment. This group is for people in the Greater Toronto Area who are interested in continuous deployment/integration, release management, infrastructure as code, change/configuration management, load testing & auto-scale, performance/availability monitoring, capacity management, automated testing, automated environment provisioning/de-provisioning, self-service environments, automated recovery (rollback & roll-forward), and many more of the constantly evolving DevOps practices. Every level of experience is welcome; all we ask is that you come with an open mind and are excited to share your knowledge.

The first meeting is on September 10th, 2015. We'll start with a discussion of "What is DevOps?" DevOps is a term for a group of concepts that, while not all new, have catalyzed into a movement and are rapidly spreading throughout the technical community. Like any new and popular term, people have somewhat confused and sometimes contradictory impressions of what it is. Is it "Quality" or "Agile"? Well, DevOps is a large enough concept that it requires some nuance to fully understand. DevOps is the practice of operations and development engineers participating together in the entire service lifecycle, from design through the development process to production support. We will cover what DevOps is and is not during our first user group meeting.

Visit our website for more info. See you there.