Putting the Developer in DevOps

The term DevOps has been getting a lot of play lately. While it's possible (albeit unlikely) that DevOps is a passing fad, my personal opinion is that it's the next logical step in the maturation of the software development process. That means that, as a developer, it behooves you to become aware of the tasks that make up DevOps and the options available to help you accomplish them.

In case it wasn't immediately apparent, DevOps is a portmanteau of the words "Developer" and "Operations". The overarching idea is that consideration is given to the needs of the system administrators during the design and development phases of a project. In some cases, this might mean that the administrators themselves work alongside the developers. But at a minimum, the developer must understand the needs of the system administrator after the application goes live and bake the appropriate hooks and instrumentation into the code before it ships.

This is different from the "traditional" approach. For many (most??), the Dev side of the process involves the creation of the software. Ops, on the other hand, are viewed simply as the people who have to deal with the artifact of the developer's creative genius once it's in the hands of real people. These two groups have been treated, by management and users alike, as separate entities. Or is that enmities?

However, in the real world, this division is not optimal. It is well documented that the majority of an application's life will be spent in production. Doesn't it make sense to include functionality in the application that helps to ensure that this life will be productive, helpful, and informative? Of course it does. But being obvious doesn't mean that it has come to pass. Yet.

But by taking the more holistic view that operational functionality is actually just one more aspect of an application's functionality, it makes sense to address delivery and instrumentation right from the start. This results in a more robust product and, ultimately, provides more value to the users.

At ObjectSharp's At the Movies event in May, I had the opportunity to demo a new component that provides some of the functionality necessary to integrate your application with operations. Application Insights is an SDK that is available across most Microsoft platforms, including Web (MVC and Forms), Windows Phone, and Windows Store. Its primary requirement is a live Internet connection, which is needed to send the instrumentation data to a central repository. You (that is, the Dev of DevOps) can submit custom, hierarchical categories, along with corresponding metrics, to the repository. And the repository can be visualized using the portal available at Visual Studio Online.
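To give you a taste of the Dev side, here is a minimal sketch of submitting custom telemetry. One caveat: this uses the TelemetryClient API from a later release of the Application Insights SDK than the one demoed at the event, and the instrumentation key, event, and metric names below are placeholders, not from the demo.

// A minimal sketch, not the exact VSO-era API surface. Requires the
// Microsoft.ApplicationInsights NuGet package.
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

TelemetryConfiguration.Active.InstrumentationKey = "<your-instrumentation-key>";
var telemetry = new TelemetryClient();

// Slash-separated names are one way to express a custom hierarchy of categories.
telemetry.TrackEvent("Checkout/Payment/Accepted");

// A corresponding metric that the portal can aggregate and chart.
telemetry.TrackMetric("Checkout/ItemsPerOrder", 4);

// Telemetry is batched in memory; flush it before the process exits.
telemetry.Flush();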

As you might expect, there are more features available to help with DevOps, including server performance tracking, availability monitoring for your Web site (either by hitting a page or by running a simple script), and even a live data stream of events while you are debugging your application. And Application Insights is a locus of regular innovation, with new versions of the SDK being released on a regular cadence. Almost too regular, if you catch my meaning.

You can learn more about Application Insights at the Visual Studio Online site, and if you would like to see my demo (or any of the other sessions) from At the Movies, you can find it on Channel 9.

More on fixing reports after Visual Studio Online export

In my previous post on fixing reports after exporting a Visual Studio Online project collection to an on-premises server, I forgot to mention that we fixed the reports by running SQL queries created by a local SQL ninja who wished to stay anonymous (like all ninjas do). He created the SQL scripts before Microsoft provided the update script that I included in my previous post and, frankly, his scripts were more comprehensive (no offense to Microsoft support, as they have been very helpful too). Anyway, here is the customer-supplied script that helped us resolve the warehouse errors and fix the reports after exporting a Visual Studio Online project collection:

-- Find rows where WorkItemsLatest disagrees with WorkItemsAre on [Changed Order]
select l.Id, A.[Changed Order], l.[Changed Order] from WorkItemsLatest L
inner join WorkItemsAre A on l.Id = A.Id and a.[Changed Order] <> l.[Changed Order]

-- Sync WorkItemsLatest to the WorkItemsAre values
update WorkItemsLatest set [Changed Order] = a.[Changed Order]
from WorkItemsLatest L
inner join WorkItemsAre A on l.Id = A.Id and a.[Changed Order] <> l.[Changed Order]

-- Find historical rows (at their latest revision) stamped later than the current row
select w.id, w.[Changed Order], l.[Changed Order] from workitemswere w
inner join workitemslatest l on w.id = l.id and w.[Changed Order] > l.[Changed Order]
inner join (select id, max(rev) as maxrev from workitemswere
group by id) x on w.id = x.id and w.rev = x.maxrev

-- Re-stamp those rows to just below the current row's [Changed Order]
-- ([Changed Order] is a binary(8) timestamp, so subtract 1 by round-tripping through bigint)
Update workitemswere set [Changed Order] = cast((cast((cast(l.[Changed Order] as binary(8))) as Bigint)-1) as binary(8))
from workitemswere w
inner join workitemslatest l on w.id = l.id and w.[Changed Order] > l.[Changed Order]
inner join (select id, max(rev) as maxrev from workitemswere
group by id) x on w.id = x.id and w.rev = x.maxrev

-- Find historical rows stamped in the future relative to the database timestamp
select id, rev, [Changed Order] from WorkItemsWere where [Changed Order] > @@dbts

-- Re-stamp each future-dated row to just below the [Changed Order] of the revision that follows it
Update WorkItemsWere set [Changed Order] = cast((cast((cast(z.[Changed Order] as binary(8))) as Bigint)-1) as binary(8))
from
(select w.id, w.[Changed Order], x.rev from WorkItemsWere w
inner join (select id, rev, [Changed Order] from WorkItemsWere where [Changed Order] > @@dbts) x
on w.id = x.id and x.rev = w.rev-1) z
inner join workitemswere ww on z.id = ww.id and z.rev = ww.rev

Again, while running the script may have helped us, please RUN THE SCRIPT AT YOUR OWN RISK! AND, PLEASE, DO NOT FORGET TO BACK UP THE PROJECT COLLECTION DATABASE BEFORE YOU RUN THE SCRIPT!!!

SharePoint keeps prompting for credentials

One of our clients had an unusual problem accessing SharePoint. When SharePoint was accessed using Internet Explorer, the user kept getting prompted for credentials. At the same time, the user was able to access SharePoint using Firefox or Chrome without a problem. Weird, right? Your credentials should let you in (or not) no matter which browser you're using. Google/Bing searching did not help…

The solution to the problem ended up having nothing to do with SharePoint. In the end, the problem was caused by the simple fact that the Active Directory user had the "User Must Change Password at Next Logon" option turned on, and only Internet Explorer picked up on that. Once we removed the "User Must Change Password at Next Logon" flag, everything went back to normal. Another problem solved.

I still wonder, though, why Firefox and Chrome ignored the fact that the user needed to change his/her password while Internet Explorer was the only browser that picked up on it. Any thoughts?

Fixing reports in exported Visual Studio Online project collection

As Visual Studio Online has gone to general availability (i.e. no more free rides), a lot of people have used the TFSCONFIG CLOUDIMPORT command to export their TFS projects from Visual Studio Online to an on-premises TFS server. That tool makes the export a very smooth and actually easy process. But that's not what we're here to talk about. We're here to talk about fixing reports once you have finished the export. The CLOUDIMPORT command only imports TFS content and does not deal with reports at all, so the reports will have to be fixed manually.

First of all, you need to recreate the reports on the Report Server. By the way, you need to upload the reports from the TFS Scrum template. You can recreate the reports using the TFS Administrator's Toolkit or the all-powerful PowerShell; one way to script the upload is sketched below.
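As an illustration only (not the exact approach we used), here is a rough C# sketch of pushing a report definition to the report server through the SSRS ReportService2010 SOAP API. It assumes an old-style web reference generated from the ReportService2010.asmx endpoint; the server URL, folder, and report names are hypothetical.

// A rough sketch, assuming a web reference named ReportService2010 generated
// from http://<server>/ReportServer/ReportService2010.asmx.
using System.IO;
using System.Net;
using ReportService2010;

var rs = new ReportingService2010
{
    Url = "http://myreportserver/ReportServer/ReportService2010.asmx",
    Credentials = CredentialCache.DefaultCredentials
};

// Read one .rdl from the Scrum template's report pack; the path is hypothetical.
byte[] definition = File.ReadAllBytes(@"C:\ScrumReports\Sprint Burndown.rdl");

Warning[] warnings;
rs.CreateCatalogItem(
    "Report",                                   // catalog item type
    "Sprint Burndown",                          // report name
    "/TfsReports/DefaultCollection/MyProject",  // target folder on the server
    true,                                       // overwrite if it already exists
    definition,
    null,                                       // no extra properties
    out warnings);

Once the reports are uploaded and the warehouse/cube has been processed, you will notice that some of the reports do not work. If you check the warehouse processing status, you will see that an error is being thrown during warehouse processing: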

System.Data.SqlClient.SqlException: Cannot create compensating record. Missing historic data. Predecessor of work item(s)…

Very confusing error… It points to some kind of mismatch between revisions. How can that be? How can the data get "corrupted" during the export? And, more importantly, what data got "corrupted"? To get an answer to the last question, try running the following SQL statement against your project collection database:

Select *
FROM [dbo].[WorkItemsLatest] L
JOIN [dbo].[WorkItemsAre] A
ON L.[ID] = A.[ID]
    AND L.[Rev] = A.[Rev]
WHERE L.[Changed Order] <> A.[Changed Order]

If the query returns any records, then we have a problem. Most likely, all of the returned records have messed-up revisions. Potentially, that could be a lot of records. To fix the problem, Microsoft provided us with the following SQL query:

UPDATE L
SET L.[Changed Order] = A.[Changed Order]
FROM [dbo].[WorkItemsLatest] L
JOIN [dbo].[WorkItemsAre] A
ON L.[ID] = A.[ID]
    AND L.[Rev] = A.[Rev]
WHERE L.[Changed Order] <> A.[Changed Order]

While running the script may have helped us, please RUN THE SCRIPT AT YOUR OWN RISK! AND, PLEASE, DO NOT FORGET TO BACK UP THE PROJECT COLLECTION DATABASE BEFORE YOU RUN THE SCRIPT!!!

After running the script, manually refresh the warehouse and cube, and you should be fine. Good luck.

UPDATE: See a few more details on how we've fixed the reports after exporting TFS project collection from Visual Studio Online at http://blogs.objectsharp.com/post/2014/06/14/More-on-fixing-reports-after-Visual-Studio-Online-export.aspx

Release Management for Visual Studio custom training and more…

Over the next month, I'm planning to write a custom one-day course on Release Management for Visual Studio. Here is a draft course outline:

Module 1: Introduction to Release Management for Visual Studio

  • An overview of the features of Release Management
  • How to reduce the risks and costs associated with releasing software
  • A look at the Build-Deploy-Test-Release process

Module 2: Installing and configuring Release Management for Visual Studio

  • Installing and configuring the Release Management Server and Client
  • Installing Deployment Agents
  • Configuring roles and permissions in Release Management

Module 3: Configuring Release Paths and Release Templates

  • Defining servers and environments
  • Configuring release path stages
  • Understanding release templates
  • Setting up a deployment procedure for each stage
  • Configuring a build definition to use the Release Management process template

Module 4: Release Management Execution

  • Use of Tokens
    • Token descriptions
    • Token replacement
  • Execution of a Full Release Cycle
    • Trigger manual release
    • Approvals
    • The deployment log
    • Audit and history
  • How to Approve a Request
    • Release Management console
    • Notifications
    • Release Management Web client
    • Deferred deployment
  • How to Release from TFS
    • Configuration of a build definition to use the Release Management Default Template
    • Configuration of an Application Version to allow triggering a Release from the Build
  • Debugging a Deployment
    • Internal logic of a deployment
    • Different logs available

 

The course will be available for public and private training at the end of July 2014. After writing this course, I'm planning to author a more comprehensive three-day DevOps course in which we will cover the TFS build server, Release Management, and Application Insights from Microsoft Azure. The three-day course should be available around September 2014.

If you're interested in these courses, please contact ObjectSharp training at 416-649-3690. :)

Error upgrading TFS 2010 to TFS 2013

I was helping another client upgrade from TFS 2010 to TFS 2013, as I have done tens (if not hundreds) of times before, and I thought I had seen it all. All possible errors, warnings… all the weird stuff. I was wrong. There is always something. Something that can go wrong. Something new to learn. Anyway, back to the upgrade. As I was saying, I was upgrading TFS 2010 to TFS 2013. It began as a fairly straightforward upgrade. You build a new TFS 2013 environment, detach the TFS 2010 project collection (if you're moving from SQL Enterprise to Standard, back up the database and remove encryption/compression first), back up the database, restore the database, and finally attach the TFS 2010 project collection to the new TFS 2013 environment. Easy, right? Well, I forgot to mention fixing reports, SharePoint sites, build definitions, etc. But still, a very straightforward process. Except, this time.

This time, when I was attaching the TFS 2010 project collection to the newly built and awesome TFS 2013 environment, I got an error: "TF400744: An error occurred while executing the following script: WorkItemTrackingToDev11Beta2Custom.sql. Failed batch starts on the line 1. Statement line: 35. Script line: 35. Error: 4922 ALTER TABLE ALTER COLUMN SeqId failed because one or more objects access this column." I had never seen this error before, and there wasn't much information on it online either. So I tried re-running the project collection attach process again to make sure that we had not missed anything. Got the same error again. Disappointing.

After some digging, a lot of digging actually, I still did not find anything. I knew that the problem must lie within the database; the database must have gotten corrupted somehow. But I couldn't put my finger on it. So, I called Microsoft support. We dug some more. Then, some more… And, finally, we found the problem. Apparently, the project collection database had been manually modified: an index had been created on one of the columns (the SeqID column), and that was preventing the upgrade process from altering that column. That was it. That was the reason we lost hours trying to solve the mystery of the failing upgrade… So, the lesson is: DO NOT make manual changes to any TFS databases. It might seem like a good and perfectly harmless idea at the time, but it's not. It never is.

From Requirements to Deployment: The Modern SW Developer using TFS

On May 8th, the Scotiabank Theatre on Richmond Street in Toronto was the scene of another successful "ObjectSharp At the Movies" event. The event was recorded, so you can watch it again, share it with colleagues, or, if you couldn't make the presentation, watch it for the first time. Below is a quick synopsis of what I presented. Click here to enjoy the video.

My goal for the morning was simple. As MC, I needed to keep things moving, entertain the audience during lulls, and show the 500+ registered attendees how great it will be when they upgrade to (or start using) TFS 2013, all in under 30 minutes.

I started things off with a slide showing Team Explorer when TFS is your repository and when Git is your repository. Not much to say on the subject, as Colin covered it in his presentation.

The new build process in 2013 includes the ability to shell out to a batch file or PowerShell script from several steps, including pre-build, post-build, pre-test, and post-test. This is a great feature, and one I had been adding to build processes by hand since 2010.

A lot of people didn't like the fact that Pending Changes was incorporated into Team Explorer. Several of the Team Explorer windows, like Pending Changes and Builds, can be torn off Team Explorer and float as their own windows.

My Work is not new in 2013, but it is worth mentioning. This window is a great view of all your Work in Progress, Suspended Work, Available Work Items, and Code Reviews. Watch the video to see how you can easily switch context using suspended work and change the query behind Available Work Items to show the work that is assigned to you.

Check out CodeLens, inserted right in your code on classes and methods, showing you references to the code, the changesets in which it was changed, and the work items associated with those changesets. This is a great connection from your code directly into TFS.

Release Management is the biggest and best new feature of TFS 2013. Watch and see how a code change can easily be deployed out to multiple environments, all starting from your build.

Again, click here to enjoy the video.

MVP Consumer Camp (May 29th 2014)

Check out this great event that Microsoft is organizing at the Microsoft Store at Square One Shopping Centre in Mississauga – the first Canadian MVP Consumer Camp, on Thursday, May 29th from 4pm to 9pm. Some of my fellow MVPs will be there answering tech questions, showing off demos, and highlighting the unique features of Microsoft devices. There will be prize draws, Q&A sessions, snacks, and refreshments.

MVPs are exceptional, independent community leaders recognized for sharing their passion, technical expertise, and real-world knowledge of Microsoft products with others. We speak at events, answer questions online, and have awesome technical blogs!

For those of you who haven't been to a Microsoft Store yet, they are amazing! They have a huge selection of the latest products and gadgets, with experts who can answer all of your questions. If you can't make the event, definitely try to drop in to a store to try the latest Xbox game, check out Windows 8.1 and its great touch features, or buy the latest and greatest Windows Phone. The Nokia Lumia 1020 has an unbelievable camera, by the way ;)

Do you have any questions about Surface, Windows, Office, Windows Phone, or Xbox? Do you want to learn how to get the most out of your gadgets? There will be an MVP there who can provide answers! Hope to see you there!

Register here!

Error while migrating from Visual Studio Online to an on premise TFS

As I'm sure we all know, Visual Studio Online recently went to general availability, which means that very soon you will have to pay to use it. We all knew it was coming… and now you might have to decide whether you should stay in the cloud or move your TFS to your internal network. To be fair, I think that, in some cases, it could very well be worth it to pay for hosting TFS in the cloud instead of hosting it internally. However, in other cases it might be cheaper for customers to host TFS internally. Those users need to move their TFS content from the cloud to an on-premises server. Kudos to Microsoft for providing an easy-to-use Data Export tool (see http://msdn.microsoft.com/en-us/library/7cb80f0d-0119-4277-82e8-719a8db1796e for more info) to migrate your Visual Studio Online contents to an on-premises TFS server. I wish they made that tool available permanently, instead of just for a short time… but I'm not complaining.

Anyway, for the most part this is a fairly easy-to-use tool, and it worked just fine almost every time I used it. But once, while trying to migrate Visual Studio Online contents to an on-premises TFS server, I got the following error: "TF400711: Error occurred while executing servicing step download bacpac from azure storage for component cloudimport during cloudimport: the remote server returned an error: (403) forbidden". In my case, this error was caused by the simple fact that the export package created in Visual Studio Online had expired. I forgot that the package is only valid for 10 days and gets deleted from Microsoft servers once it expires. So, simply start the export process again in Visual Studio Online, get a new export package, and run the TFSCONFIG CLOUDIMPORT command again using the new package.

Mocking and Unit Testing

Last Thursday, I had the opportunity to give a presentation on unit testing in general, and mocking using Moq specifically, at the London .NET User Group meeting. Many thanks to Tom Walker for his efforts in organizing the group, as well as to the people who took time out of their evening to attend. For the purpose of preserving the effort for posterity (not that any posterity will actually care), the slide deck is available at http://www.slideshare.net/LACanuck/unit-testing-and-mocking-using-moq, while the source code used for the demos can be found at http://1drv.ms/1kQJWhb.
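To give a flavour of what the talk covered, here is a minimal sketch of stubbing a dependency with Moq. It is not taken from the demo code; the IPriceService interface, the Checkout class, and the test framework choice (xUnit) are all invented for illustration.

// A minimal Moq sketch; all names here are hypothetical, not from the demos.
using Moq;
using Xunit;

public interface IPriceService
{
    decimal GetPrice(string sku);
}

public class Checkout
{
    private readonly IPriceService _prices;
    public Checkout(IPriceService prices) { _prices = prices; }

    public decimal Total(params string[] skus)
    {
        decimal total = 0;
        foreach (var sku in skus) total += _prices.GetPrice(sku);
        return total;
    }
}

public class CheckoutTests
{
    [Fact]
    public void Total_sums_prices_from_the_service()
    {
        // Arrange: stub the dependency instead of using a real implementation.
        var prices = new Mock<IPriceService>();
        prices.Setup(p => p.GetPrice("apple")).Returns(1.50m);
        prices.Setup(p => p.GetPrice("pear")).Returns(2.25m);

        // Act
        var total = new Checkout(prices.Object).Total("apple", "pear");

        // Assert: check both the result and the interaction with the mock.
        Assert.Equal(3.75m, total);
        prices.Verify(p => p.GetPrice(It.IsAny<string>()), Times.Exactly(2));
    }
}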

As always, if you have any questions or comments, you’re welcome to drop me an email or contact me on Twitter (@LACanuck).