Problems Publishing Unit Tests from VSTS

Earlier today, a colleague had an issue with publishing unit test results into a TFS instance. The publish step, normally done manually at the click of a button, was no longer available: the Publish button was disabled. There was no error message indicating what, if anything, was wrong, which made identifying the problem a challenge, to put it mildly.

The solution, at least to identifying what the problem was, is to use the command line version of MSTest. If you execute the command MSTest /? in a command window, you will see that there are a number of options which can be used to execute a set of unit tests and publish the results to a TFS server. For example, the following command executes the unit tests in the TestLibrary.dll assembly and publishes the results to the TFS server located at http://TfsServer:8080.

MSTest /nologo /testcontainer:"TestLibrary.dll" /runconfig:"localtestrun.testrunconfig"
/resultsfile:"TestLibraryResults.trx" /test:TestLibrary /publish:http://TfsServer:8080
/publishbuild:"DemoTestBuild_20081103.1" /teamproject:"DemoProject" /platform:"Any CPU" /flavor:Debug

In this particular situation, running MSTest generated an error indicating that the drop location for the build could not be created. The error was, thankfully, quite easy to correct, but difficult to identify without using the command line tool.

More Thoughts on the Cloud

One of the more farsighted thoughts on the implications of cloud computing is the concern about vendor lock-in. Tim Bray mentioned it in his Get in the Cloud post:

Big Issue · I mean a really big issue: if cloud computing is going to take off, it absolutely, totally, must be lockin-free. What that means is that if I’m deploying my app on Vendor X’s platform, there have to be other vendors Y and Z such that I can pull my app and its data off X and it’ll all run with minimal tweaks on either Y or Z.


I’m simply not interested in any cloud offering at any level unless it offers zero barrier-to-exit.

This idea was also commented on by Dare Obasanjo here. It was Dare who originally pointed me at Tim's post.

My take on the vendor lock-in problem is two-fold. First is the easier one to deal with - the platform on which the application is running. As it sits right now, use of Azure is dependent on you being able to publish an application. The destination for the application is a cloud service, but that is not a big deal. You can just as easily publish the application to your own servers (or server farm). The application which is being pushed out to the cloud is capable of being deployed onto a different infrastructure.

Now, there are aspects of the cloud service which might place some significant requirements on your target infrastructure. A basic look at the model used by Azure indicates that a worker pattern is being used. Requests arrive at the service and are immediately queued. The requests are then processed in the background by a worker. The placement of the request in the queue helps to ensure the reliability of the application, as well as the ability to scale up on demand. So if you created an infrastructure that was capable of supporting such a model, then your lock-in at the application level doesn't exist. Yes, the barrier is high, but it is not insurmountable. And there is the possibility that additional vendors will take up the challenge.
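The pattern itself is easy to sketch. Here is a minimal version in plain Python (no Azure APIs involved; the queue, worker and request names are invented for illustration): requests go straight onto a queue, and a background worker drains it.

```python
import queue
import threading

# Incoming requests are enqueued immediately; a background worker drains
# the queue. The queue decouples accepting a request from processing it,
# which is what makes it possible to add more workers when load increases.
requests = queue.Queue()
results = []

def worker():
    while True:
        item = requests.get()
        if item is None:              # sentinel: shut the worker down
            break
        results.append(item.upper())  # stand-in for real processing
        requests.task_done()

t = threading.Thread(target=worker)
t.start()

for r in ["order-1", "order-2"]:      # requests arrive and are queued at once
    requests.put(r)

requests.join()                       # wait until the queue is drained
requests.put(None)
t.join()
print(results)                        # ['ORDER-1', 'ORDER-2']
```

Because the front end only ever enqueues, a spike in traffic costs you queue depth rather than dropped requests, and scaling up is a matter of starting more workers against the same queue.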

The second potential for lock-in comes from the data. Again, this becomes a matter of how you have defined your application. Many companies will want to maintain their data within their premises. In the Azure world, this can be done through ADO.NET Data Services. In fact, this is currently (I believe) the expected mechanism. The data stores offered by Azure are not intended to be used for large volumes of data. At some point, I expect that Azure will offer the ability to store data (of the larger variety) within the cloud. At that point, the spectre of lock-in becomes solid. And you should consider your escape options before you commit to the service. But until that happens, the reality is that you are still responsible for your data. It is still yours to preserve, backup and use.
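To make the data-access point concrete: ADO.NET Data Services exposes data over plain HTTP URIs with query options such as $filter and $top, so the application's dependency is on a URI convention rather than a proprietary wire protocol. A small sketch of building such a query (the service root and entity set names here are hypothetical):

```python
from urllib.parse import quote

def data_service_query(service_root, entity_set, filter=None, top=None):
    # Build an ADO.NET Data Services style query URI. $filter and $top are
    # real query options; the service root and entity set are made up.
    url = f"{service_root}/{entity_set}"
    options = []
    if filter is not None:
        options.append("$filter=" + quote(filter))
    if top is not None:
        options.append("$top=" + str(top))
    if options:
        url += "?" + "&".join(options)
    return url

print(data_service_query("http://example.com/Northwind.svc", "Customers",
                         filter="Country eq 'Canada'", top=10))
```

Because the contract is just HTTP plus those conventions, pointing the same client at a different host, on-premises or in the cloud, is a configuration change rather than a rewrite, which is exactly the escape option you want to preserve.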

The crux of all this is that the cloud provides pretty much the same lock-in that the operating system does now. If you create an ASP.NET application, you are now required to utilize IIS as the web server. If you create a WPF application, you require either Silverlight or .NET Framework on the client. For almost every application choice you make, there is some form of lock-in. It seems to me that, at least at the moment, the lock-in provided by Azure is no worse than any other infrastructure decision that you would make.

Summarizing the Cloud Initiative

So it's the last few hours of PDC for this year. Which means that pretty much all of the information that can be shoved into my brain has been. It also means that it's a pretty decent moment to be summarizing what I've learned.

Obviously (from its presence in the initial keynote and the number of sessions) cloud computing was the big news. This was also one of the more talked-about parts of the conference, and not necessarily for a good reason. Many people that I have talked to walked out of the keynote wondering exactly what Azure was. Was it web hosting? If so, what's the point? It's not like there aren't other companies doing the same thing. Could it be more than web hosting? If so, that wasn't made very clear from the keynote. In other words, I was not exactly chomping at the bit to try out Azure.

But it's here at the end of the week. And I've had a chance to see some additional sessions and talk to a number of people about what Azure is capable of and represents for Microsoft. That has tempered my original skepticism a little. But not completely.

In the vision of Azure that was presented, the cloud was intended to be a deployment destination in the sky. Within that cloud, there was some unknown infrastructure that you did not need to be aware of. You could configure the number of servers up or down depending on the expected traffic to your site. As you change the configured value, your application is provisioned accordingly. This is nice for companies that need to deal with spikes in application usage, or that don't have (or don't want to have) the support personnel for their infrastructure.

However, there are some limitations to the type of applications which can fit into this model. For example, you need to be able to deploy the application. This implies that you have created the application to the point where you can publish it to the Azure service. The published application might include third-party components (a purchased ASP.NET control, for example), but can't use a separate third-party application (such as Community Server).

As well, you need to be able to access the application through the web. You could use a Silverlight front end to create the requests. You could use a client-based application to create the requests. But, ultimately, there is a Web request that needs to be sent to the service. I fully expect that Silverlight will be the most common interface.

So if you have applications that fit into that particular model, then Azure is something you should look at. There are some 'best practices' that you need to consider as part of your development, but they are not onerous. In fact, they are really the kind of best practices that you should already be using. As well, you should remember that the version of Azure introduced this past week is really just V1.0. You have to know that Microsoft has other ideas for pushing applications to the cloud. So even if the profile doesn't fit you right now, keep your ears open. I expect that future enhancements will only work to envelop more and more people in the cloud.

WPF Futures

I just got out of the WPF Roadmap talk. I found the future for WPF less interesting than the present, although that could very well be because the speed of innovation coming from that group has been very high over the past couple of years. In fact, the 'roadmap' is as much about the present as it is about the future.

Mention was made of the fact that the SP1 release of .NET 3.5 actually included a fair bit of 'new' functionality for WPF. As well, they have just released some new controls that are in different states. There is a DataGrid, a Calendar control and a DatePicker that are officially released. And there is a brand new ribbon control that is (I believe) in CTP mode.

As well, there is also a Visual State Manager available for WPF. This is actually quite cool, as the VSM solves a problem that WPF has (and that Silverlight has already addressed). Specifically, it could be challenging to get the visual state of a control/page/form matched up with the event or data triggers that would be raised within the application. The Visual State Manager lets you associate a storyboard with a particular state and then set the state programmatically, at which point WPF runs the storyboard.
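The idea is easy to model outside of WPF. In this toy Python version (none of these names come from the WPF API; it only illustrates the concept), each named state owns a 'storyboard', and switching state runs it, so the event handler only has to know the state name:

```python
# A toy version of the Visual State Manager idea: each named state owns a
# "storyboard" (here just a callable); switching state runs the storyboard.
class StateManager:
    def __init__(self):
        self._storyboards = {}
        self.current_state = None
        self.log = []                     # records which storyboards ran

    def register(self, state, storyboard):
        self._storyboards[state] = storyboard

    def go_to_state(self, state):
        if state == self.current_state:   # no transition, nothing to run
            return False
        self.current_state = state
        self._storyboards[state](self)    # run the state's storyboard
        return True

vsm = StateManager()
vsm.register("Normal", lambda m: m.log.append("fade to normal"))
vsm.register("MouseOver", lambda m: m.log.append("highlight border"))

vsm.go_to_state("MouseOver")   # the event handler just names the state...
print(vsm.log)                 # ['highlight border'] ...and the storyboard runs
```

The win is the same as in WPF: the trigger-handling code and the animation definitions stop being tangled together, because the only shared vocabulary between them is the state name.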

Both the ribbon control and the VSM are available right now.

The future for WPF seemed less clear. A list of 'potential' controls was presented, along with some general comments about improvements in rendering, integration with touch, etc. In other words, the future appears to be now. I guess that lack of specificity is to be expected when a product group is right at the beginning of its development cycle.

Silverlight 2.0 Penetration

I'm sure that you know that Silverlight 2.0 was released a couple of weeks ago. ScottGu made an interesting comment indicating that Silverlight (that would be 1.0 or 1.1) was installed on 1 in 4 computers that are connected to the Internet, and that over the next month those computers would be updated to 2.0. Also that they had already upgraded 100 million computers. These numbers surprised me, only in that I didn't expect that penetration had reached that level.

It also provides a number that might encourage people to start developing externally facing applications using Silverlight. Which, given some of the capabilities, is a good thing.

Development Changes

The presenter for the development changes related to .NET and Windows 7 is ScottGu. Who else?

First off, Microsoft will be releasing a WPF ribbon control later this week. That is very nice, especially since the ribbon is the new hotness when it comes to the user interface.

Demos also mentioned the integration of touch and jump lists, which are Win7-specific functionality. Jump lists (context-sensitive commands available through the thumbnail for an application) are simply commands which are implemented and registered by the application. Touch integration includes the ability to tie gestures to commands. So, ultimately, commands become the main mechanism for integrating WPF and Win7, at least at this level.
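A quick sketch of what 'commands as the integration point' means in practice: the application registers named commands once, and both a jump-list entry and a gesture end up dispatching through the same registry. All of the names below are invented for illustration; nothing here is a Win7 or WPF API.

```python
# Sketch of "commands as the integration point": the application registers
# named commands once; jump-list entries and touch gestures both dispatch
# to the same registry, so new entry points need no new application logic.
commands = {}

def register_command(name, handler):
    commands[name] = handler

def dispatch(source, name, log):
    # 'source' could be "jumplist" or "gesture"; both paths converge here.
    log.append(f"{source} -> {name}")
    commands[name](log)

log = []
register_command("OpenRecent", lambda l: l.append("opened recent document"))
register_command("ZoomIn", lambda l: l.append("zoomed in"))

dispatch("jumplist", "OpenRecent", log)   # invoked from the taskbar thumbnail
dispatch("gesture", "ZoomIn", log)        # invoked from, say, a pinch gesture
print(log)
```

The point of the shape is that the shell (or the touch subsystem) only ever sees command names, so adding a new entry point is registration, not new plumbing.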

Today, some additional WPF controls are being released. This includes a DataGrid and a DatePicker control, as well as a Visual State Manager (a feature that greatly improves the creation of interfaces that depend upon events within the UI).

.NET 4.0 includes support for the dynamic language runtime.

VS2010 includes improvements to the WPF designer. It also appears that VS2010 was built using WPF (at least the UI portion). Other areas of improvement include a streamlined test-driven development (TDD) model, and better extensibility within the editor (and the IDE itself). For the latter functionality, check out the Managed Extensibility Framework (MEF). The idea behind MEF is to allow for better visualization of data, even from external sources such as TFS, within the IDE. The demo included the ability to grab bug information from TFS and display it while looking at the code associated with the bug.

Windows 7 Brick-a-brack

A couple of notes on how Windows 7 will improve O/S and application performance and some additional features.

If you reduce the granularity of a timer, you can improve power consumption.

They are trying to reduce the memory and processor requirements for Windows. Sinofsky (the speaker) held up his current machine, a 1GHz, 1GB system on which Windows 7 uses only half the RAM when started.

A VHD can be created from within Windows (yeah!!!!!). As well, a VHD can be added to a running instance of Windows. And the machine can be booted from a VHD using the boot manager. This has the potential to be quite cool, especially for people like me who go from system to system on a regular basis.

Built-in zoom-in and zoom-out. Useful for me when I'm training.

To give you an idea of what the release schedule for Windows 7 is, the version that is available is basically the Milestone 3 version. There are still basically two steps to go. Next up, after Milestone 3, is the Beta version, which is expected to be out at the beginning of next year. While M3 is criteria complete in terms of performance, it is not feature complete. The beta version will be feature complete when it comes out. Microsoft is basically looking for us to use Windows 7 and provide feedback that will impact bug fixes and the release schedule.

That having been said, there is no hard timeline for when Windows 7 will be released. The only comment beyond that was "we're sticking with the idea of roughly 3 years from the Vista release". Which, if memory serves, means that we're talking about Q4 2009 or Q1 2010.

Windows 7 includes Touch Support

First off, I find touch cool. I couldn't really use it to do my work, but I love the possibilities for improving user experiences.

Windows 7 has built-in support for touch in a number of ways. First, applications don't need to explicitly support touch in order to be used by touch. Second, the interface with touch goes beyond using your finger as a mouse. It includes the concept of 'gestures'.

This is tough to explain in words, but let me try. Say you were looking at a Word document and you wanted to scroll down. You could 'flick' your fingers down and it would scroll as if you had started to roll a wheel. This is quite similar to the interface that an iPhone has. Along the same line, you can go back in IE by 'flicking' to the left. The subtlety of the interface fits nicely with my idea of the kinds of animation that a good WPF application should have.

Ray Ozzie Keynote - Day two

The starting point for his presentation was to talk about how the Internet has become ubiquitous. And how there is a benefit to having proximity between the hardware, the software and the user. He also mentioned the phone as one of the software platforms that are part of most people's lives, which is not a surprise considering the comments about the Live Mesh made at Mix '08.

Also interesting that Windows 7 is being referred to as the 'client' operating system. The intent is obviously to set up the idea that the next version of Windows is intended to be the 'client' to the Azure 'service'.

The highlights for Windows 7 include:

  • Changes to the taskbar to include thumbnails of the running instances of an application, as well as the ability to manage an instance (e.g. close it, open a recently used document, etc.) from the thumbnail.
  • Improvements to the searching capabilities within the machine, as well as across the machines in your home network.
  • Improvements to automatic connection while in your home network. This includes discovery of other Windows 7 machines on the network. And, more importantly, the default printer changes to a printer within your home network (instead of having to remember to change from your printer at work).

These last two points indicate the focus for Windows 7 improvements: trying to smooth the integration of computers within a home network.

More to follow.

ASP.NET 4.0 Futures

One way that you can tell a product is getting more mature is by the type of features that are included with each new release. By this measure, ASP.NET is getting to be positively adult-like.

I just sat through the session on ASP.NET 4.0 futures. The main announcements are surprisingly tame. Well, tame by former ASP.NET standards. And not particularly revolutionary.

There are, of course, other reasons for that. The ASP.NET team has been working on a release schedule that is based on Internet time. In other words, you can't wait for three years between releases when the rest of the world is innovating every 6-9 months. That's one of the reasons that you saw an MVC release onto CodePlex. It's why you saw Atlas before AJAX. And why the AJAX Control Toolkit exists.

As a result, the advancements are more along the lines of incorporation and integration of these features. MVC is officially part of the product. As is the AJAX Control Toolkit. All good things, but definitely not earth-shaking.

There are a couple of changes on the technology side. There is support for jQuery, a JavaScript library that simplifies DOM manipulation and calling back to the server. The binding used in AJAX looks much more like the WPF binding syntax. That last point made me wonder how long until there is a convergence of ASP.NET and WPF when it comes to application markup.

There are a number of other announcements, regarding JavaScript integration, better granularity when it comes to enabling and disabling ViewState (yeah!), and more control over how server controls deal with CSS markup. If you spend your business hours working with ASP.NET, some of this will probably fall into the 'finally' category. But for me, there was nothing that I found particularly compelling.