Dropping Cookies in IE7

I was asked an unusual question yesterday about cookies, Silverlight and WCF. The scenario was that a Silverlight application was being used in a consumer-facing situation. The application itself communicates with the server using WCF. The service which is the target of the communication uses ASP.NET Authentication to authenticate the user. It’s an implementation detail (but critical to this post) that the method of storing the authentication token is a cookie called .ASPXAUTH.

In a normal (that is, working) scenario with Silverlight, the credentials are sent to the server and an .ASPXAUTH cookie is returned. The browser strips off the cookie and stores it. On any subsequent requests, Silverlight creates a request and sends it to the server through the browser's networking API. The browser is responsible for determining which, if any, cookies should be sent with the request and adding them to the outgoing header. In other words, the Silverlight application has no specific knowledge of or interaction with the .ASPXAUTH cookie.
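To make that split concrete, here is a minimal sketch of what the client-side call typically looks like (OrderServiceClient and ordersGrid are hypothetical names, not from the actual application). Notice that there is no cookie handling anywhere in the application code:

    // Hypothetical proxy generated by "Add Service Reference" in a Silverlight project.
    // Nothing here touches .ASPXAUTH; the browser's networking stack attaches it for us.
    var client = new OrderServiceClient();

    client.GetOrdersCompleted += (sender, e) =>
    {
        if (e.Error == null)
        {
            ordersGrid.ItemsSource = e.Result;  // bind the results to the UI
        }
    };

    client.GetOrdersAsync();  // the request goes out through the browser, cookies and all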

As you would expect, this mechanism works the vast majority of the time. If it didn’t, I think it would have been a significant story long before now. But my questioner was running into a situation where the Silverlight application was unable to communicate with the server even after authentication was performed. What’s worse, this behavior was only happening on IE7. When Silverlight was run through Firefox, it worked exactly as it was supposed to.

The diagnostic step in a situation like this is to use Fiddler (or whatever your favorite TCP trace application is) to view the raw messages. And what was seen is that although the authentication response had the .ASPXAUTH cookie in it, any requests sent back to the server after authentication did not. Given what I've already explained about the processing of cookies with Silverlight requests, this eliminates the Silverlight application as the most likely culprit. But it also makes you scratch your head, as we can be pretty certain it's not a widespread failure of IE7 to process cookies.

The answer lies in a strange bug in IE7. It turns out that if a domain name has an underscore in it, IE7 doesn't persist the cookies. Let me repeat that, because it's such a bizarre-sounding problem. In IE7, if the domain name has an underscore ('_') in it, then any cookies returned from the domain will not be persisted. Which also means that subsequent requests will be 'cookie-free'.

I'm guessing that most domain names don't have an underscore, which is why this bug didn't get widespread notice. In this particular case, the domain was one used for development, which would keep the problem from being a production issue. But I have no reason to believe that the bug would be restricted to a local problem. Deploy an 'underscored' domain name to the public Internet and no authentication, shopping carts or other state information can be saved.

Fortunately, the solution was a simple one. If the domain name in the endpoint configuration is replaced with the raw IP address, IE7 is more than happy to save the cookie. I wouldn't be surprised if an entry in your local hosts file would have the same effect. And the final solution would be to have your domain administrator create a DNS alias entry…one that doesn't have an underscore, of course.
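Just to illustrate the workaround (the proxy name, address and IP below are made up), the only thing that changes is the host portion of the endpoint address, whether you do it in ServiceReferences.ClientConfig or in code:

    using System.ServiceModel;

    // Before: http://my_dev_server/Services/OrderService.svc -- the underscore makes IE7 drop the cookie
    // After:  the raw IP address (or an alias without an underscore) and the cookie persists again
    var client = new OrderServiceClient();
    client.Endpoint.Address = new EndpointAddress("http://192.168.1.10/Services/OrderService.svc");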

Silverlight 2.0 Penetration

I'm sure that you know that Silverlight was released a couple of weeks ago. ScottGu made an interesting comment indicating that Silverlight (that would be 1.0 or 1.1) was installed on 1 in 4 computers that are connected to the Internet. And that over the next month, those computers would be updated to 2.0. Also that they had already upgraded 100 million computers. These numbers surprised me, only in that I didn't expect penetration to have reached that level.

It also provides a number that might encourage people to start developing externally facing applications using Silverlight. Which, given some of the capabilities, is a good thing.

ASP.NET 4.0 Futures

One way that you can tell that a product is getting more mature is by the type of features that are included with a new release. By this measure, ASP.NET is getting to be positively adult-like.

I just sat through the session on ASP.NET 4.0 futures. The main announcements are surprisingly tame. Well, surprising by former ASP.NET standards. And not particularly revolutionary.

There are, of course, other reasons for that. The ASP.NET team has been working on a release schedule that is based on Internet time. In other words, you can't wait for three years between releases when the rest of the world is innovating every 6-9 months. That's one of the reasons that you saw an MVC release onto CodePlex. It's why you saw Atlas before AJAX. And why the AJAX Control Toolkit exists.

As a result, the advancements are more along the lines of incorporation and integration of these features. MVC is officially part of the product. As is the AJAX Control Toolkit. All good things, but definitely not earth-shaking.

There are a couple of changes on the technology side. There is support for jQuery, a JavaScript library that simplifies working with the HTML DOM and with JSON data. The binding used in AJAX looks much more like the WPF binding syntax. That last point made me wonder how long until there is a convergence of ASP.NET and WPF, when it comes to application markup.

There are a number of other announcements, regarding JavaScript integration, better granularity when it comes to enabling and disabling ViewState (yeah!), and more control over how server controls deal with CSS markup. If you spend your business hours working with ASP.NET, some of this will probably fall into the 'finally' category. But for me, there was nothing that I found particularly compelling.

Real World AJAX with ASP.NET Session

I have to give credit to Nikhil Kothari. He just ran a session the way that I like to see sessions run. As I read the title and abstract for the session, I was expecting not much in the way of new information. I actually attended the session mostly because it was the only one in this time slot that might be of interest.

Nikhil started out with just some basic stuff on AJAX. But very quickly he got past that and the session became very interesting. The idea is that if you plan on using AJAX in a "real" application, there are a number of issues that need to be addressed. Simple issues to identify, but complex to deal with. For example:

History

Say you have a page that makes a number of AJAX calls, updating the screen each time. Then you click on the back button. The typical action is to go to the page prior to the initial view of the current page. But what you generally want to do is go to the results from the previous AJAX call.

The solution is to use an AJAX UpdateHistory control. More details can be found on Nikhil's weblog here. It requires some work by the developer to determine when the browser history needs to be updated, but if you put the effort in, your users will love you. OK, not really. This is one of those features that users will ignore if it's there, because your site is working as they think it should. But if you don't have it, they will hate you. It's almost the same thing.
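Nikhil's UpdateHistory control is one way to get there; the same idea also appears as the history support that later shipped on ScriptManager in ASP.NET 3.5 SP1. Just to sketch the shape of the work involved (this uses the ScriptManager history API rather than the UpdateHistory control itself, and the control and method names are invented), the developer decides when a history point is worth recording and how to rebuild the page from one:

    // In the page's code-behind. Requires EnableHistory="true" on the ScriptManager in the markup.
    // After an async postback that changed what the user sees, record a history point.
    protected void SearchButton_Click(object sender, EventArgs e)
    {
        BindResults(searchBox.Text);  // hypothetical method that refreshes the UpdatePanel
        ScriptManager.GetCurrent(this).AddHistoryPoint("query", searchBox.Text);
    }

    // When the user hits Back (or Forward), restore the view for that history point.
    protected void ScriptManager1_Navigate(object sender, HistoryEventArgs e)
    {
        string query = e.State["query"] ?? string.Empty;
        searchBox.Text = query;
        BindResults(query);
    }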

Indexability

If you have a page that uses AJAX to retrieve and display information, you have a problem with making that page appear in the appropriate place within Google. After all, Google depends on static data for its indexing.

The pattern that Nikhil is suggesting is to create a static page that contains all of the data. Make the data static on the page and put it into a hidden div. Then create a separate frame and use script to pull the data from the hidden div into the frame. When viewed normally, the data will appear in the frame. When crawled by a spider, the script is not executed, so the data is still available for indexing. This pattern might not work in all situations, but there are enough where it does to make it worth being aware of. Again, a blog post (here) provides some additional details.

Overall, a nice presentation with some useful takeaways.

An Old Problem is New Again

I ran into a problem yesterday that brought back memories. And not the good kind of memories either.

I'm doing some work using TypeConverters. In particular, I'm creating a Windows Forms control that functions in a manner similar to a PropertyGrid, but with a user interface that looks more like a typical Windows form. You know, with labels and text boxes. Because this is a generic control (that is, the fields which get displayed depend on the object associated with the control), I'm doing a lot of reflection to retrieve the properties that are required. And I'm using reflection (the GetValue/SetValue methods) to interact with the object. This last part (the SetValue method in particular) means that I'm using type converters to move the string values that are in the text boxes into the actual properties themselves.

Now type converters are quite useful. They expose a ConvertFromString method that (not surprisingly) takes a string value and converts it to a value of the associated type. If the string is not valid for the destination type, an exception is thrown.
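Put together, the flow inside the control looks roughly like this (the method and parameter names are invented for the sketch):

    using System.ComponentModel;
    using System.Reflection;

    // Inside the control: push the text from one of the generated text boxes back into the bound object.
    private void ApplyValue(object target, string propertyName, string text)
    {
        PropertyInfo property = target.GetType().GetProperty(propertyName);
        TypeConverter converter = TypeDescriptor.GetConverter(property.PropertyType);

        // ConvertFromString turns the displayed text into a value of the property's type...
        object value = converter.ConvertFromString(text);

        // ...and reflection pushes that value into the object.
        property.SetValue(target, value, null);
    }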

Problem 1: the raised exception is System.Exception. I have no idea why the exception surfaced from ConvertFromString is an Exception. The InnerException is a FormatException, which is the type of exception that I would have expected. But no, ConvertFromString throws Exception. This forces me to use a pattern that I loathe - try {...} catch (Exception) { }

Whenever I teach, I always harp on this. You should (almost) never catch the general Exception object. You should only catch specific Exceptions. But because of ConvertFromString, I have to add to my list of cases where catch (Exception) is required.
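So the conversion ends up wrapped like this - a sketch of the pattern I'm being forced into, not something I'm recommending (MarkFieldAsInvalid is a hypothetical feedback hook in the control):

    using System;
    using System.ComponentModel;
    using System.Reflection;

    private void ApplyValueSafely(object target, string propertyName, string text)
    {
        PropertyInfo property = target.GetType().GetProperty(propertyName);
        TypeConverter converter = TypeDescriptor.GetConverter(property.PropertyType);

        try
        {
            property.SetValue(target, converter.ConvertFromString(text), null);
        }
        catch (Exception ex)  // the dreaded general catch - ConvertFromString leaves no alternative
        {
            if (ex.InnerException is FormatException)
            {
                MarkFieldAsInvalid(propertyName);  // hypothetical: flag the offending text box
            }
            else
            {
                throw;
            }
        }
    }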

But why should this matter, you ask. TypeConverter exposes an IsValid method. You should be able to check the validity of the string value prior to calling ConvertFromString. This would completely eliminate the need to catch any exceptions at all. And it's the best practice/recommended/ideal way to go.

Problem 2: Int32Converter.IsValid("bad") returns true.

Think about that last one for a second.

According to the type converter for Int32, the string 'bad' is a valid integer value. In my world, not so much. If you spelunk the various classes in the .NET Framework, you find out that IsValid is supposed to be overridden in the various converter classes. But the numeric converters don't bother to do so. As a result, the IsValid that actually services my request just returns true regardless of whether the string is valid or not.

What's worse, this is a known bug that will never be resolved. Apparently, making IsValid work for Int32Converter is considered a breaking change.

So my suggestion (courtesy of Dave Lloyd) is to add a new method to TypeConverter. Call it IsReallyValid. IsReallyValid could then be implemented properly without introducing a breaking change. It could take the CultureInfo object necessary to truly determine whether a string is valid. For the base type converter, it could simply return true.

Heck, let's go a step further and mark IsValid as being Obsolete. That would force people to move to IsReallyValid and correct any problems in their applications. And maybe, in a couple of versions, Microsoft could reintroduce a working version of IsValid and mark IsReallyValid as being obsolete. In this way, four versions from now (that's about 10 developer years), IsValid will work the way it's supposed to (and everyone would expect it to).
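In the meantime, the closest I can get from the outside is an extension method that behaves the way IsReallyValid ought to - a sketch only, and obviously it just hides the same try/catch under the covers:

    using System;
    using System.ComponentModel;
    using System.Globalization;

    public static class TypeConverterExtensions
    {
        // What IsValid should have done all along: actually attempt the conversion.
        public static bool IsReallyValid(this TypeConverter converter,
                                         CultureInfo culture, string text)
        {
            try
            {
                converter.ConvertFromString(null, culture, text);
                return true;
            }
            catch (Exception)
            {
                return false;
            }
        }
    }

With that in place, TypeDescriptor.GetConverter(typeof(int)).IsReallyValid(CultureInfo.CurrentCulture, "bad") returns false instead of lying (or throwing).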

Looking for Mr (or Ms) GoodDeveloper

Business has been booming of late at ObjectSharp. Don't know whether it's the weather or the business cycle, but our recent company barbeque had more new faces than I've seen in many years. And we haven't lost any of the old faces either.

And yet it doesn't seem to end. At the moment, we're looking to add some consultants to our team. Specifically, we have the need for someone with Windows Forms experience, either in creating commercial-grade user interfaces on their own or with the CAB application block.

If you (or someone you know) has that experience and is looking to join a fun team of top-notch developers, drop me your (or your friend's) resume. You'll get a chance to work on projects that use cutting edge technology. You'll learn about (and sometimes use) technologies that aren't yet available. And, if you have the interest, we have six MVPs on staff to help you get your own designation.

If you don't fit into this skill set, fret not. There will be others coming along in the very near future. Just keep your eye tuned to this blog. 

Relieving File in Use Problems

It's quite possible this is old news to people, but it took me more than a couple of minutes to find the solution. I was using the FromFile method on the Image class to create a new Image object. When I was done with the Image, I needed to move the file. But it turns out that the file was still 'being used by another process'.

The solution is simple, but it didn't come immediately to mind because I'm so used to basically ignoring the scope of variables. The trick is to call Dispose immediately rather than waiting for garbage collection to do its thing. Once Dispose has been invoked, you can manipulate the underlying image file as you need to.
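In code, the fix is just a matter of where the Dispose call (or a using block) sits relative to the file operation; a minimal sketch with made-up paths:

    using System.Drawing;
    using System.IO;

    // Image.FromFile keeps the underlying file locked for the lifetime of the Image.
    using (Image image = Image.FromFile(@"C:\Temp\incoming.jpg"))
    {
        // ... work with the image here ...
    }   // Dispose runs here and releases the lock on the file

    // Now the file can be moved (or deleted) without the 'being used by another process' complaint.
    File.Move(@"C:\Temp\incoming.jpg", @"C:\Temp\processed\incoming.jpg");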

WPF Found a Home!

The more I see about Silverlight, both 1.0 and 1.1, the more I realize that WPF has found a client to service. From the moment I heard about XAML and WPF, I questioned where it was going to fit in the real world. A large part of the 'coolness' of XAML revolved around transitions and transformations. But business applications don't really need that. Business applications are, generally speaking, not flashy, demo-candy-ful systems. So this disconnect between the needs of business and the strengths of XAML left me wondering.

While I don't believe that WPF is going to replace Windows Forms anytime in the near future, I think that it's pretty obvious that the raison d'etre of XAML is to support Silverlight.

But it's important for developers to realize that XAML/WPF/Silverlight is still new technology. The one thing I haven't seen here at MIX is an integrated development environment. Orcas doesn't have a XAML designer...it has a link to edit the XAML in Expression Blend. The data binding story in XAML is a step back from what ASP.NET developers gained in .NET 2.0. Jasper and Astoria are interesting technologies, but the security and error management stories are still being developed. In other words, temper the excitement over some very, very slick technology with the realization that we still have a little time before it becomes 'real'.

Drilling Down on the Silverlight Hype

First off, I do think that Silverlight is a very cool piece of technology. Want to see just how astounding? Check out the Top Banana demo application that was part of yesterday's MIX keynote. You can also find it here.

What I found most astounding is that Top Banana is a browser-based application with cross-browser support.

Silverlight is also touting that it allows .NET to run on the various browsers. Which it will do at version 1.1. Keep that in mind. The Silverlight that was released as a beta version yesterday (albeit with a "Go Live" license) does not have .NET support. Programming against the object model is done using JavaScript. The version that contains .NET support was released as an alpha version without the Go Live license. In other words, the cross-platform .NET support is only available as a preview.

Not that it won't be cool when it becomes 'real', but keep that in mind when you're looking at the various demos. Remember to ask which version was used to put the demo together.

Identifying Formatting Errors with Expression Web

One of the more interesting aspects of Expression Web is its ability to help debug CSS errors. I'm in a talk on Designing with Expression Web and the story is about how to use Expression Web to identify problems in a web page. The web page is loaded and the problematic element is selected. One of the windows is a rules summary pane that includes all of the formatting rules that apply to the selected item, in the order in which they occur.

So the process of identifying the problem is finding the particular rule that actually causes the unexpected formatting. And having tried to suss this out on my own in the past, I can greatly appreciate the benefits of having such a tool available.