Backup SharePoint solution before deploying its new version using Release Management

Just wanted to share a PowerShell script to back up SharePoint solutions using Release Management. This script can be used as part of a custom tool/action in Release Management that backs up an existing SharePoint solution before deploying a new version of it. This tool/action could come in handy when you would like to set up rollback activities for when a SharePoint solution install goes sideways. I think it's a good idea to back things up. You know, just in case…

Here is the script. It's not the most sophisticated PowerShell script, but it gets the job done. It takes two parameters: the name of the SharePoint solution and the destination folder where you would like to store the backed-up solution file.

param
(
    [string]$WSP_FileName = $null,
    [string]$DestinationName = $null
)

$exitCode = 0

# Load the SharePoint snap-in if it is not already loaded
$snapin = Get-PSSnapin | Where-Object { $_.Name -eq "Microsoft.SharePoint.Powershell" }
if ($snapin -eq $null)
{
    Write-Host "[INIT] Loading SharePoint Powershell Snapin"
    Add-PSSnapin "Microsoft.SharePoint.Powershell"
}

Write-Host "[INFO] ----------------------------------------"
Write-Host "[INFO] Backing up $WSP_FileName"

# Find the solution in the farm and save its .wsp file to the destination folder
$farm = Get-SPFarm
$file = $farm.Solutions.Item("$WSP_FileName").SolutionFile
$BackupPath = Join-Path $DestinationName $WSP_FileName

if ($file -ne $null)
{
    $file.SaveAs("$BackupPath")
    Write-Host -ForegroundColor Green "...Backup complete!"
}
else
{
    throw "$WSP_FileName solution was not found."
}

##################################################################################
# Indicate the resulting exit code to the calling process.
if ($exitCode -gt 0)
{
    "`nERROR: Operation failed with error code $exitCode."
}

"`nDone."
exit $exitCode
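For reference, here is how the script might be invoked, whether from a Release Management tool definition or straight from a PowerShell prompt. The script name, solution name, and destination share below are hypothetical, so substitute your own:

# Hypothetical invocation; adjust the script name, solution name, and share to your environment
.\BackupSPSolution.ps1 -WSP_FileName "MyCompany.Intranet.wsp" -DestinationName "\\BackupServer\SPSolutionBackups"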

 


Replace a string in a file with PowerShell

I have wanted to expand my PowerShell skills for some time; however, I find it a steep learning curve. One thing I am learning is that the result is worth it.

I recently needed to replace a string in a file. I searched and found the working parts I needed to create this script. I just wanted to post it here so I know where to come back and find it.

 

[CmdletBinding()]
param
(
    [Parameter(Mandatory=$True)]
    [string]$OldValue,
    [Parameter(Mandatory=$True)]
    [string]$NewValue,
    [Parameter(Mandatory=$True)]
    [string]$FilePath
)

# Read the file, replace every occurrence of $OldValue with $NewValue, and write it back.
# The parentheses force the file to be read fully before Set-Content writes to it.
(Get-Content $FilePath) |
    ForEach-Object { $_ -replace $OldValue, $NewValue } |
    Set-Content $FilePath
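Saved as a script (the file name Replace-FileString.ps1 below is just a placeholder I made up), it can be called like this:

# Hypothetical invocation; substitute your own values and file path
.\Replace-FileString.ps1 -OldValue "localhost" -NewValue "myserver.contoso.com" -FilePath "C:\Temp\web.config"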

 

A wonderful addition to this is that you can even pop regular expressions into $OldValue, since the -replace operator treats it as a regex pattern. For example:

 

(Get-Content MySSAS.deploymenttargets) |
    ForEach-Object { $_ -replace "<Server>(.*?)</Server>", "<Server>__TargetDataBase__</Server>" } |
    Set-Content MySSAS.deploymenttargets

 

 

 

Release Stuck in Pending State

Recently, I have come across an interesting behavior in TFS Release Management. When you kick off a new release, the release gets stuck in the Pending state on certain custom components. And it stays in that state "forever". Obviously, the first thing that comes to mind is that the deployment agent is not responding (even though that does not really make sense, since if the agent became unresponsive the component deployment task would eventually time out), but you go ahead and try to restart the deployment service running on the target server anyway. It does not help, of course. So, you start taking other "desperate" measures like re-configuring the deployment agent or restarting the target server, but nothing works. I'm calling those measures "desperate" because, deep down in your heart, you know that there is nothing wrong with the target server and that the problem lies somewhere else. You just don't know where, so you resort to the old "Have you tried turning it off and on again?" approach. We all do it...

Anyways, after a bunch of digging around, I finally discovered a pattern for when this problem occurs. Imagine the following scenario:

  1. You create a custom component and add it to your release template
  2. You kick off your release template
  3. You then realize that you need to tweak your custom component, so you go ahead and make the change, then kick off your release template again to pick up your recent changes to the component.

This is a very typical continuous improvement approach. You make a change to your custom component, save it, and the next time a release is triggered your changes to the component take effect. It works every time in Release Management. Well, almost every time. Apparently, if you change one of the configuration variables in your component from the Standard type to the Encrypted type (or vice versa), save your component and trigger a release, then the release will get stuck in the Pending state on that component. Not sure why this is happening. Perhaps the hash of the component changes or something. Anyways, to fix the issue, you need to:

  1. Remove the component in question completely from your release template, including the link to the component. Then, save the release template
  2. Reopen the release template and re-link the component in question
  3. Re-add the component in question, and re-enter the values for your configuration variables
  4. Save the release template
  5. Trigger a new release. The release should now successfully deploy the component in question

This looks like a bug to me and, hopefully, Microsoft will address it in the next update(s) of the Release Management Server. I am sure they will.

P.S.: This is my first blog post as a Microsoft MVP in Visual Studio ALM. Looking forward to writing a lot more… Hurray!!!

Separate TFS Build Server Output into Different Folders

TFS 2013 makes it easier to separate build server output into different folders. In most cases, all you have to do is change the Output Location setting in the build definition. The Output Location setting can be set to the following values:

  • SingleFolder to place all the build output files together in the drop folder.
  • PerProject to group the build outputs into drop folder sub-folders for each solution or code project that you have specified in the Projects box.
  • AsConfigured to leave the binaries in the build agent sources folder, organized into the same sub-folder structure you see when you build your code on your dev machine in Visual Studio. This structure is defined in your code projects. If you use this option, TFBuild will not copy the output to the drop folder. Instead, you can program your scripts to copy the outputs to the location specified by TF_BUILD_BINARIESDIRECTORY so that they get dropped to the staging location. See post-build or post-test scripts.

This is a great feature/setting in TFS 2013 and it works every time. Well, almost every time. The PerProject setting is a bit misleading because it does not always group the build outputs per project. When you choose to build multiple solutions, each consisting of multiple projects, TFS Build server will split the solutions into their own separate folders, but project output will not go into separate folders; instead, the output for all projects in a solution will be stored in a single folder. Even though your build definition's Output Location is set to PerProject. Frustrating. :(

To separate project output into different folders when building multiple solutions, you need to set the GenerateProjectSpecificOutputFolder MSBuild property to True. To be more precise, set the MSBuild arguments setting to /p:GenerateProjectSpecificOutputFolder=True and projects will automatically build into different subfolders. Voila!
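You can try the same property locally before touching the build definition. The solution name and output path here are made up for illustration:

# Hypothetical local repro; each project ends up in its own subfolder under OutDir
msbuild.exe MySolution.sln /p:OutDir="C:\Drops\MyBuild\" /p:GenerateProjectSpecificOutputFolder=True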

Visual Studio Test Manager 2013 does not record actions

We have noticed that Visual Studio Test Manager 2013 behaves differently on some machines. More precisely, it does not record actions on some machines, while it works perfectly fine on others running the same test plan under the same test settings. All machines were running Windows 7 with the same version of Visual Studio installed. Users tried rebooting the machine, clearing the cache, you know, the usual stuff. Nothing helped. This is where I stepped in. :)

After a bit of digging, I noticed that Visual Studio Test Manager 2013 was running fine on 32-bit installs of Windows 7. It was also running fine when users ran the 32-bit version of Internet Explorer instead of the 64-bit version. Then I remembered that Visual Studio Test Manager 2013 does not support the 64-bit version of Internet Explorer for recording and playback (see http://msdn.microsoft.com/en-ca/library/dd380742.aspx for more info). So, as long as users use the 32-bit version of Internet Explorer (C:\Program Files (x86)\Internet Explorer\iexplore.exe) instead of the 64-bit version (C:\Program Files\Internet Explorer\iexplore.exe), everything works as it should. Problem solved. Sort of.
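If you want to make sure testers launch the right browser, a quick sketch (assuming the default install locations, and a placeholder URL) is to start the 32-bit executable explicitly:

# Launch the 32-bit Internet Explorer explicitly; default install path assumed
& "${env:ProgramFiles(x86)}\Internet Explorer\iexplore.exe" "http://myapp.contoso.com"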

Creating new websites using TFS Release Management

When you use TFS Release Management server to deploy your builds to various environments (and you really should because it's awesome), you might have tried to get it to do all kinds of stuff like automating Windows operations, IIS actions, registry modifications, service manipulations and so on and so forth. Beautiful, right? Yes, I like it too. Release Management is a great piece of software that will only get better. And, in an effort to make this software better, I wanted to share a small bug I have discovered in it.

When you use the Create Website action to create a new website in IIS, Release Management seems to create the website just fine, but if you look closer you might notice a bug: the action misconfigures the physical path assigned to the website, unless you set the IsAutoStart option to true and leave the IsPreloadEnabled option blank (unless, of course, you use IIS 8 or better). The reason this happens is that, when the Release Management action creates a website, it uses the iisconfig.exe command under the hood and, for some reason, messes up the syntax unless you set those two options. So, for now, just set the IsAutoStart option to true and leave the IsPreloadEnabled option blank. And, hopefully, Microsoft will fix that bug in a future release…
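Until the bug is fixed, it doesn't hurt to verify (and, if needed, correct) the physical path after the action runs. Here is a minimal sketch using the WebAdministration module, with a made-up site name and path:

Import-Module WebAdministration

# Check what physical path the Create Website action actually assigned
(Get-Website -Name "MyNewSite").physicalPath

# If it is wrong, point the site at the intended folder
Set-ItemProperty "IIS:\Sites\MyNewSite" -Name physicalPath -Value "C:\inetpub\MyNewSite"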

Making Template Changes to the Projects Migrated from Visual Studio Online

So, you have migrated TFS projects from Visual Studio Online and now you want to make certain improvements to the newly migrated projects. Well, you're out of luck. When you try to customize work item types of the existing TFS projects migrated from the cloud, you will receive an error, even though you have all the necessary permissions. So, what exactly is the problem here?!? Apparently, there is something in the migration process that causes certain metadata to go missing from the security tables when TFS contents are moved from Visual Studio Online to the on-premises servers. As a result, TFS simply does not know that you are allowed to modify the project template.

To solve the problem, you will have to call Microsoft technical support. They will send you the fix. Hopefully, they will integrate that fix into the migration process, so this kind of thing does not happen again. Hopefully…

Putting the Developer in DevOps

The term DevOps has been getting a lot of play lately. While it’s possible (albeit unlikely) that DevOps is a passing fad, my personal opinion is that it’s the next logical step in the maturation of the software development process. Which means that, as a developer, it behooves you to become aware of the tasks that make up DevOps and the options which are available to help you accomplish them.

In case it wasn’t immediately apparent, DevOps is a portmanteau of the words “Developer” and “Operations”. The overarching idea is that consideration is given to the needs of the system administrators during the design and development phase of a project. In some cases, this might mean that the administrators themselves work alongside the developers. But at a minimum, the developer must have an understanding of the needs of the system administrator after the application goes live and bake the appropriate hooks and instrumentation into the code before it goes live.

This approach is different from the ‘traditional’ approach. For many (most??), the Dev side of the process involves the creation of the software. Ops, on the other hand, are viewed as simply the people who have to deal with the artifact of the developer’s creative genius once it’s in the hands of real people. These two groups have been treated, by management and users alike, as separate entities. Or is that enmities?

However, in the real world, this division is not optimal. It is well documented that the majority of an application’s life will be spent in production. Doesn’t it make sense to include functionality in the application that helps to ensure that this life will be productive, helpful and informative? Of course it does. But being obvious doesn’t mean that it has come to pass. Yet.

But by taking the more holistic view that operational functionality is actually just one more aspect of an application’s functionality, it makes sense to address delivery and instrumentation right from the start. This results in a more robust product and, ultimately, provides more value to the users.

At ObjectSharp’s At the Movies event in May, I had the opportunity to demo a new component that will provide some of the functionality necessary to integrate your application with operations. Application Insights is an SDK that is available across most Microsoft platforms, including Web (MVC and Forms), Windows Phone, and Windows Store. Its primary requirement is that a live Internet connection is necessary to send the instrumentation data to a central repository. You (that is, the Dev of DevOps) can submit custom, hierarchical categories, along with corresponding metrics, to the repository. And the repository can be visualized using the portal that is available at Visual Studio Online.


As you might expect, there are more features available to help with DevOps, including server performance tracking, availability monitoring for your Web site (either by hitting a page or by running a simple script) and even a live data stream of events while you are debugging your application. And Application Insights is a locus of regular innovation, with new versions of the SDK being released on a regular cadence. Almost too regular, if you catch my meaning.

You can learn more about Application Insights at the Visual Studio Online site and if you would like to see my demo (or any of the other sessions) from At the Movies, you can find it on Channel 9.

More on fixing reports after Visual Studio Online export

In my previous post on fixing reports after exporting a Visual Studio Online project collection to an on-premises server, I forgot to mention that we fixed the reports by running SQL queries created by a local SQL ninja who wished to stay anonymous (like all ninjas do.) He created the SQL scripts before Microsoft provided the update script that I included in my previous post and, frankly, his scripts were more comprehensive (no offense to Microsoft support, as they have been very helpful too.) Anyways, here is the customer-supplied script that helped us resolve the warehouse errors and fix the reports after exporting the Visual Studio Online project collection:

-- Preview work items where [Changed Order] in WorkItemsLatest disagrees with WorkItemsAre
select l.Id, a.[Changed Order], l.[Changed Order]
from WorkItemsLatest l
inner join WorkItemsAre a on l.Id = a.Id and a.[Changed Order] <> l.[Changed Order]

-- Sync WorkItemsLatest with the [Changed Order] values from WorkItemsAre
update WorkItemsLatest set [Changed Order] = a.[Changed Order]
from WorkItemsLatest l
inner join WorkItemsAre a on l.Id = a.Id and a.[Changed Order] <> l.[Changed Order]

-- Preview the most recent WorkItemsWere revisions whose [Changed Order] is ahead of WorkItemsLatest
select w.id, w.[Changed Order], l.[Changed Order]
from WorkItemsWere w
inner join WorkItemsLatest l on w.id = l.id and w.[Changed Order] > l.[Changed Order]
inner join (select id, max(rev) as maxrev from WorkItemsWere group by id) x on w.id = x.id and w.rev = x.maxrev

-- Pull those revisions back to just below the matching WorkItemsLatest value
-- (cast binary(8) to bigint, subtract 1, cast back)
update WorkItemsWere set [Changed Order] = cast((cast((cast(l.[Changed Order] as binary(8))) as bigint) - 1) as binary(8))
from WorkItemsWere w
inner join WorkItemsLatest l on w.id = l.id and w.[Changed Order] > l.[Changed Order]
inner join (select id, max(rev) as maxrev from WorkItemsWere group by id) x on w.id = x.id and w.rev = x.maxrev

-- Preview WorkItemsWere rows whose [Changed Order] is beyond the database's current timestamp
select id, rev, [Changed Order] from WorkItemsWere where [Changed Order] > @@dbts

-- Fix those rows by setting [Changed Order] to just below the value of the following revision
update WorkItemsWere set [Changed Order] = cast((cast((cast(z.[Changed Order] as binary(8))) as bigint) - 1) as binary(8))
from
(select w.id, w.[Changed Order], x.rev
 from WorkItemsWere w
 inner join (select id, rev, [Changed Order] from WorkItemsWere where [Changed Order] > @@dbts) x
 on w.id = x.id and x.rev = w.rev - 1) z
inner join WorkItemsWere ww on z.id = ww.id and z.rev = ww.rev

 

Again, while running the script may have helped us, please RUN THE SCRIPT AT YOUR OWN RISK! AND, PLEASE, DO NOT FORGET TO BACK UP THE PROJECT COLLECTION DATABASE BEFORE YOU RUN THE SCRIPT!!!
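For the backup itself, something as simple as the following works (a sketch only; the server instance, collection database name, and backup path are placeholders for your own):

# Back up the project collection database before touching it (requires the SQLPS / SqlServer module)
Backup-SqlDatabase -ServerInstance "MyTfsSqlServer" -Database "Tfs_DefaultCollection" -BackupFile "E:\Backups\Tfs_DefaultCollection.bak"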

SharePoint keeps prompting for credentials

One of our clients had an unusual problem accessing SharePoint. When SharePoint was accessed using Internet Explorer, the user kept getting prompted for credentials. At the same time, the user was able to access SharePoint using Firefox or Chrome without a problem. Weird, right? Your credentials should let you in (or not) no matter which browser you're using. Google/Bing searching did not help…

The solution to the problem ended up having nothing to do with SharePoint. In the end, the problem was caused by the simple fact that the Active Directory user had the "User Must Change Password at Next Logon" option turned on, and only Internet Explorer picked up on that. Once we removed the "User Must Change Password at Next Logon" flag, everything went back to normal. Another problem solved.
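If you suspect the same culprit, the flag is easy to check: accounts with "User Must Change Password at Next Logon" have pwdLastSet set to 0 in Active Directory. A quick sketch (the user name is hypothetical, and the ActiveDirectory module is assumed to be installed):

Import-Module ActiveDirectory

# pwdLastSet of 0 means "User Must Change Password at Next Logon" is turned on
Get-ADUser -Identity "jsmith" -Properties pwdLastSet |
    Where-Object { $_.pwdLastSet -eq 0 }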

I still wonder, though, why Firefox and Chrome ignored the fact that the user needed to change his/her password while Internet Explorer was the only browser that picked up on it. Any thoughts?