Generic Implementation of INotifyPropertyChanged on ADO.NET Data Services (Astoria) Proxies with T4 Code Generation

Last week Mike Flasko from the ADO.NET Data Services (Astoria) team blogged about what’s coming in V1.5, which will ship prior to VS 2010. I applaud these out-of-band releases.

One of the new features is support for two-way data binding in the proxy classes generated by the client library. Out of the box, these classes currently neither implement INotifyPropertyChanged nor project query results into ObservableCollection.

Last week at the MVP Summit I had the chance to see a demo of this and other great things coming down the road from the broader Data Programmability team. It seems like more and more teams are turning to T4 templates for code generation, which is great for our extensibility purposes. At first I was hopeful that the team had implemented these proxy generation changes by switching to T4 templates along with a corresponding “better” template. Unfortunately, this is not the case and we won’t see any T4 templates in V1.5. It’s too bad – would it really have been that much more work to invest the time in a T4 template than to add new switches to DataSvcUtil and new code generation (along with testing that code)?

Anyway, after seeing some other great uses of T4 templates coming from product teams for VS 2010, I thought I would invest some of my own time to see if I couldn’t come up with a way of implementing INotifyPropertyChanged on my own. The problem with the existing code gen is that while there are partial methods created and called for each property setter (i.e. FoobarChanged()), there is no generic event fired that would allow us to in turn raise an INotifyPropertyChanged.PropertyChanged event. So you could manually add this for each and every property on every class – but it’s tedious.

I couldn’t have been the first person to think of doing this, and after a bit of googling, I confirmed that. Alexey Zakharov’s post on generating custom proxies with T4 has been completely ripped off, er, was the inspiration for this derivative work. What I didn’t like about Alexey’s solution was that it completely overwrote the proxy client. I would have preferred a solution that just implemented the partial methods in a partial class to fire the PropertyChanged event. That way, any changes, improvements, etc. to the core MS codegen can still be picked up down the road. Of course, Alexey’s template is the better solution if there are indeed other things you want to customize about the proxies in their entirety – things that can’t be accomplished with a partial class.

What I did like about Alexey’s solution is that it uses the service itself to query the service metadata directly. I had planned on using reflection to accomplish the same thing, but in hindsight it would be difficult to generate a partial class for a class I’m currently reflecting on in the same project (of course). Duh.

So what do you need to do to get this solution working?

  1. Add the MetadataHelper.tt file to the project where you have your reference/proxies to the data service. You will want to make sure there is no custom tool associated with this file – it’s just included as a reference by the next one. This file wraps up all the calls to get the metadata. I’ve made a couple of small changes to Alexey’s version: added support for Byte and Boolean (there was a typo in AZ’s).
  2. Copy the DataServiceProxy.tt file to the same project. If you have more than one data service, you’ll want one of these files for each reference. So for starters you may want to rename it accordingly. You are going to need to edit this bad boy as well.
  3. There are two options you’ll need to specify inside of the proxy template, as shown below. The MetadataUri should be the URI of your service suffixed with $metadata. I’ve found that if your service is secured with integrated authentication, then the metadata helper won’t pass those credentials along, so for the purposes of code generation you’d best leave anonymous access on. The second option is the Namespace. You will want to use the same namespace used by your service reference. You might have to do a Show All Files and drill into the Reference.cs file to see exactly what that is.
     var options = new {
         MetadataUri = "http://localhost/ObjectSharpSample.Service/SampleDataService.svc/$metadata",
         Namespace = "ObjectSharp.SampleApplication.ServiceClient.DataServiceReference"
     };
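
For a sense of what happens when the template runs, here is a heavily simplified sketch of the kind of T4 that generates the partial classes. The GetEntities call and the entity/property shapes here are placeholders standing in for whatever MetadataHelper.tt actually exposes, not its real API:

<#@ template language="C#" #>
<#@ output extension=".cs" #>
<#@ include file="MetadataHelper.tt" #>
using System.ComponentModel;

namespace <#= options.Namespace #>
{
<# foreach (var entity in GetEntities(options.MetadataUri)) { #>
    public partial class <#= entity.Name #> : INotifyPropertyChanged
    {
        public event PropertyChangedEventHandler PropertyChanged;

        private void OnPropertyChanged(string property)
        {
            var handler = PropertyChanged;
            if (handler != null)
            {
                handler(this, new PropertyChangedEventArgs(property));
            }
        }
<#     foreach (var property in entity.Properties) { #>
        // one implementation per partial method declared by the stock proxy
        partial void On<#= property.Name #>Changed()
        {
            OnPropertyChanged("<#= property.Name #>");
        }
<#     } #>
    }
<# } #>
}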

That’s it. When you save your file, should everything work, you’ll have a .cs file generated that implements, through a partial class, the INotifyPropertyChanged interface. Something like…

public partial class Address : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    private void OnPropertyChanged(string property)
    {
        var handler = PropertyChanged;
        if (handler != null)
        {
            handler(this, new PropertyChangedEventArgs(property));
        }
    }

    partial void OnAddressIdChanged()
    {
        OnPropertyChanged("AddressId");
    }
    partial void OnAddressLine1Changed()
    {
        OnPropertyChanged("AddressLine1");
    }
}
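
To see why the partial-class approach works, it helps to look at the other half of the pairing. The stock Astoria-generated proxy already declares and calls partial methods from each property setter; simplified, and with the member names here being illustrative rather than the exact generated code, it looks something like this:

public partial class Address
{
    private int _AddressId;

    public int AddressId
    {
        get { return _AddressId; }
        set
        {
            OnAddressIdChanging(value);
            _AddressId = value;
            OnAddressIdChanged();   // our partial class turns this into PropertyChanged("AddressId")
        }
    }

    partial void OnAddressIdChanging(int value);
    partial void OnAddressIdChanged();
}

Because partial methods compile away to nothing when no implementation exists, the generated proxy pays no cost until our generated partial class supplies the bodies.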

My Biggest Disappointment at the MVP Summit

I’ve just come home from spending the last three days in Redmond at the MVP Summit. For those who might not be aware, the Summit is an annual event that Microsoft hosts for its Most Valuable Professionals (MVPs). The MVP designation is given to people who have contributed in a positive way to the community through speaking, blogging, answering questions in forums or organizing user groups and conferences at the local level. At the Summit, the various product groups get the opportunity to demonstrate some of the futures for their products in order to solicit feedback. For me, the chance to meet and talk with product group members is actually one of the main benefits of being an MVP. They are people who are passionate about the code they write and who love to hear about the good and the bad.

However, the futures being discussed are really just that: futures. We’re not talking about what’s going to be in VS2010. The feature list for that has been set in stone for a while and is generally well known. Instead, we’re talking about what might be coming in the next version of Visual Studio. Or Silverlight. Or ASP.NET. Or Data Programmability. These futures have not, for the most part, even been designed, much less coded. So to talk with us about this, MVPs at the Summit (and, indeed, all MVPs) have to sign a non-disclosure agreement (NDA). This means that we cannot discuss with anyone outside of the MVP community what we have seen and heard until the information becomes public.

Today’s technology, combined with the outgoing personalities of MVPs, makes this restriction a challenge. Normally when I’m at a conference, I’m live blogging the session that I’m in. Or I’m twittering my schedule. Can’t do that here. It gives me itchy fingers, but the NDA is taken quite seriously. Even the code names for various projects are considered NDA, a problem for the person who unthinkingly twittered one while in sessions on Monday.

So that inability to share is my biggest disappointment. Not an unexpected one (I’ve been to the Summit before and am under NDA constraints constantly), but still a source of sadness nonetheless. But let me just say that I’ve already written some blog posts that will be published once the details of the products are made public in the near future. Hopefully that little tidbit of foreshadowing won’t get the NDA police on my trail.

Clearing the Debugging Tooltip

On the scale of annoyances, this is not a huge one. But the solution is so simple, that it’s worth a quick post.

Have you ever been in the middle of debugging an application and used the hover functionality to display the current value of a variable? And expanded multiple levels to find the specific property that you’re looking for? Of course you have.

But have you ever been frustrated when the debugging view obscures a piece of code that you’re interested in? So that you have to close the debugging tooltip to see the code and then navigate through the hierarchy to get back to what you were looking for? Odds are that it has happened to you. It certainly happens to me frequently enough.

Fortunately, there is a solution. While the debugging tooltip is displayed, simply press the Ctrl key. This instantly makes the tooltip transparent. And when you release the Ctrl key, the tooltip reappears. Simple and elegant, just the way all solutions should be.

Thanks to Dave Lloyd and Brian Rasmussen for the information (same tip at different times).

Changing Generated Code in VS2008

Have you ever been dissatisfied with the code that is automatically generated by Visual Studio in response to various commands that you perform? For example, in VS2005, I got used to using the prop snippet to create properties. This snippet created a property declaration complete with a private backing variable. However, in VS2008, the prop snippet was modified to create a property declaration in the form of the automatic property declaration. That, to me, was annoying.
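
To illustrate the change, here is roughly what each version of the snippet expands to (the type and names are, of course, whatever you tab through):

// VS2005's prop snippet: a property with a private backing variable
private int myVar;

public int MyProperty
{
    get { return myVar; }
    set { myVar = value; }
}

// VS2008's prop snippet: an automatic property
public int MyProperty { get; set; }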

(Note to Microsoft: Please treat the existing snippets as part of what you look at for backwards compatibility. Create more snippets, sure. But please don’t change what the current snippets do…especially when the ‘better’ snippets are not actually better.)

Or, consider the fact that, as more and more work is done in WPF, the need to ensure that business classes implement INotifyPropertyChanged grows. Adding or modifying a snippet to include the code to call PropertyChanged could be quite useful. It certainly has been for me.
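
For example, a customized prop snippet might expand to something like the following. This is just a sketch; it assumes the class already defines an OnPropertyChanged(string) helper that raises the event:

// a property that raises PropertyChanged, as a customized snippet might emit;
// assumes the class defines an OnPropertyChanged(string) helper
private string name;

public string Name
{
    get { return name; }
    set
    {
        if (name != value)
        {
            name = value;
            OnPropertyChanged("Name");
        }
    }
}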

Little known by most developers is the location for the snippets that Visual Studio provides. Or, what might be more interesting to intrepid developers, the snippets used by Visual Studio refactoring. All of these are found in %ProgramFiles%\Microsoft Visual Studio 9.0\VC#\Snippets\1033. The keyword snippets can be found in the Visual C# subdirectory, while the refactoring snippets can be found in the Refactoring subdirectory. Take a look at what’s there and you may be surprised how you can increase your productivity for mundane (and frequent) tasks.

“Fixing” the WPF Designer

If you use WPF on a regular basis, then I can take a good guess at the esteem in which you hold the designer. While it has its place, hard-core WPF devs use XAML directly. It’s much easier to get the result that you’re looking for quickly. The problem I’ve always had is that when you open a WPF file, even when you have disabled the design portion, there is still a significant time lag.

Thanks to Fabrice Marguerie, who wrote this blog post describing how to eliminate that annoying delay.

It’s nice to have less friction between me and my XAML. :)

Excel automatically defines column types

So to set up this problem, an application that I'm currently working on needs to process data that is stored in an Excel spreadsheet. The creation of the spreadsheet is actually performed by a scientific instrument, so my ability to control the format of the output is limited. The instrument samples liquids and determines a quantity. That quantity is placed into the spreadsheet. If the quantity in the liquid is not successfully read, then the literal "Too low" is placed into the sheet. The application opens up the spreadsheet and loads up a DataSet using the OleDb classes (OleDbConnection and OleDbDataAdapter).

This seemed like a fine setup. At least, until some of the numeric values in the spreadsheet were not being read. Or, more accurately, the values in the spreadsheet were not making it into the DataSet. Head-scratching, to say the least.

After some examination, the problem became apparent. When the values were being successfully read, the column in the DataSet had a data type of Double. When the values were not being successfully read, the column in the DataSet had a data type of String. And the difference between success and failure was nothing more than the contents of the spreadsheet.

Now the obvious path to follow is how the data type of the column is determined. Some research brought me to what I believe is the correct answer. The Excel driver looks at the contents of the first 8 cells in the column. If the majority are numeric, then the column's data type is set to Double/Integer. If the majority are alphabetic, then the column becomes a string.

Of course, this knowledge didn't really help me. As I said at the outset, my ability to control the format of the spreadsheet was limited. So I needed to be able to read the numeric data even if the column was a string. And, at present, the cells containing numbers in a column marked as a string were returned as String.Empty.

The ultimate solution is to add an IMEX=1 attribute to the connection string. This attribute causes all of the data to be treated as strings, avoiding the cell-scanning process entirely. And, for reasons of which I'm still not certain, it also allowed the numeric data to be read in and processed. A long and tortuous route, yes, but the problem was eventually solved.
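
For reference, a minimal sketch of the resulting code; the file path and sheet name are placeholders for your own workbook:

using System.Data;
using System.Data.OleDb;

// IMEX=1 tells the driver to treat mixed-type columns as text,
// so every cell arrives as a string and nothing gets dropped
string connectionString =
    @"Provider=Microsoft.Jet.OLEDB.4.0;" +
    @"Data Source=C:\Data\Instrument.xls;" +
    @"Extended Properties=""Excel 8.0;HDR=Yes;IMEX=1""";

using (OleDbConnection connection = new OleDbConnection(connectionString))
{
    OleDbDataAdapter adapter =
        new OleDbDataAdapter("SELECT * FROM [Sheet1$]", connection);
    DataSet data = new DataSet();
    adapter.Fill(data);   // columns come back as strings; parse the numerics yourself
}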

Walking the WPF Control Hierarchy

I just finished teaching a Silverlight/WPF course last week. As is true in almost every course I teach, there were questions from people who are trying to use the technology in the real world. It's what I love about teaching...that I always learn something about how people use the tools they have been given.

In this particular case, the problem was relating to walking the control hierarchy in a user control. The user control contained a DataGrid and the goal was to get from a particular cell within the DataGrid back to the user control itself. The code that was written to do so looks approximately like the following.

public UserControl GetContainingUserControl(FrameworkElement element)
{
   // keep climbing until we reach the containing UserControl
   if (element is UserControl)
      return (UserControl)element;
   else
      return GetContainingUserControl(element.Parent as FrameworkElement);
}

The problem with the code is that, in the case specified, the Parent becomes null before the UserControl is found. For most developers, this is a strange and unexpected turn of affairs. For most hierarchies, it seems completely reasonable to presume that every element in a hierarchy will have a non-null parent except the root of the tree. But if you read over the description of the Parent property closely, you will see that it could be null. From MSDN,

Parent may be a null reference (Nothing in Visual Basic) in cases where an object was instantiated, but is not attached to an object that eventually connects to the Silverlight RootVisual, or the application object.

Ouch. Sure it's documented, but still. Ouch.

Fortunately, there is a solution, found in an incredibly useful class named VisualTreeHelper. This class exposes a number of methods that can be used to navigate up and down the visual hierarchy in a WPF interface component. There is, for example, a GetChild method that retrieves a particular child element. As well, of particular interest for this example, there is a GetParent method that retrieves the parent of a given element. This method will not return a null parent until the top of the visual hierarchy is reached.

In other words, VisualTreeHelper.GetParent(element) can be used in place of element.Parent in the above method, with the result being more in line with expectations.
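
A sketch of the revised method, with a null check added so that hitting the top of the visual tree fails gracefully rather than recursing into a null element:

using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;

public UserControl GetContainingUserControl(DependencyObject element)
{
   if (element == null)
      return null;   // reached the root without finding a UserControl
   if (element is UserControl)
      return (UserControl)element;
   // GetParent walks the visual tree, which stays non-null all the way up
   return GetContainingUserControl(VisualTreeHelper.GetParent(element));
}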

Whew.

First Post of the Year and a Gift for You

Now that the holidays are over (which weren't particularly fun for me this year, due to a persistent bout with sinusitis), it's time to get back to the posting. And to start things off, let me offer a deal to any of you who might be thinking about going to DevTeach Vancouver at the beginning of June (8th to the 12th). Jean-Rene Roy, the organizer of the conference, has offered 50% off the registration cost to the first 30 people who register with the following code: DEVT50OFFVAN. Also, the registration needs to be done prior to Feb 10th.

If you've never been to a DevTeach conference, you don't know what you're missing. This is easily the top .NET developer-focused conference in Canada. They get big name speakers presenting on the latest and greatest of technologies. As well, the setup for the conference is such that the speakers are much more accessible than any other conference I've been to. So not only will you be able to hear familiar luminaries, but you'll also get the ability to speak with them one-on-one. A great deal at full price, this becomes an incredible opportunity at half-price. So if you were just thinking of going, let this offer make your mind up for you.

Technology Predictions and Trends for 2009

I’m sure we’re going to look back at 2009 and say “it was the best of times, it was the worst of times” and it will no doubt be interesting. Here are my predictions…

Social Networking Everywhere

Although online social networking companies are already struggling with diminished valuations, in 2009 we’ll see social networks break out of their silos and become essential platform elements that find their way into other online applications such as travel, e-commerce, job-posting boards, online dating services, CRM services, web-based email systems, etc. Blogging is also changing – slowing down, in fact. Micro-blogging, with status-update-esque features in Facebook, Windows Live, and of course the explosion of Twitter, will take on an even larger role. It’s as true today as it was back in 1964 when fellow Canadian Marshall McLuhan wrote “The Medium Is The Message”.

The Death of Optical Media

Okay, so you’ll still be able to walk into a video store to rent a DVD or buy a spindle of blanks at your grocery store, but make no mistake about it – the death march is on, and that includes you too, Blu-ray. Blu-ray will never see the adoption curve that DVDs had. They thought they won when HD-DVD died, but if winning means dying last, then sure, you won. We’ll increasingly be renting our movies on demand through our cable boxes, and on our converged PCs and Xbox 360s via services like Netflix. Along with this, the rest of us will start to realize we don’t really need to own our libraries of movies. With iPod penetration as high as it is, it may take longer to realize we don’t need to own our music either – frankly, we don’t own it anyway, even though the pricing models try to convince us we do. I won’t go out and predict the death of DRM; frankly, I think 2009 may be the year where DRM starts to get more tolerable once we are clearly renting our music and movies. The Zune Pass is making some inroads here, but until Apple starts offering similar subscription pricing, this may take a bit longer.

The MacBook Air may have been a bit ahead of the curve in dropping the optical drive, but get used to it. Expect more vendors to do the same as they reduce size or cram in additional batteries or hard drives.

The Rise of the NetBook

If 2009 is the year of doing more with less, then this will surely be the NetBook’s year. Mainstream hardware manufacturers hate these and their small profit margins, but Acer and Intel will be raking it in, building market share if not large bottom lines. Who knows, MS may learn to love the NetBook if they can get Acer to start shipping Windows 7 on them this year as well. Be prepared to see these everywhere in 2009, but don’t expect to see Apple make one (ever).

Zune Phone

The big story at the end of 2008 has been the global suicide of the original Zune 30s. I predict that tomorrow they shall rise from the dead, though it might take until the 2nd for everybody to figure out that they need to entirely drain the battery. The big news is that there won’t be a Zune phone with the MS brand name on it, but the Zune UI will come to Windows Mobile (6.5?), turning legions of touch-based smartphones into music players almost as good as an iPhone. The bad news is that without an App Store to vet software quality, crapware will continue to be the source of reliability issues for the Windows Mobile platform. The good news is that without an App Store, Windows Mobile users will have lots of choice in the software for their devices, not to mention lots of choice in devices, carriers and plans. The battle between Good and Evil may morph into the battle between Reliability and Choice.

Touch Everywhere

Get your head out of the gutter; that’s not what I meant. What I did mean is that 12-24 months from now, it will be difficult to purchase a digital frame, LCD monitor or phone without an onscreen touch capability. Windows 7 will light these devices up, and we’ll stop thinking about the differences between Tablet PCs and notebooks as they converge into a single device. With the advent of Silverlight, WPF and Surface computing, MS has been banging the “user experience” drum for a while now, but when touch starts to be the expectation and not the exception, we’ll have to re-engineer our applications to optimize for the touch experience. This may turn out to be bigger than the mouse, or even a windowed operating system.

Flush with Flash

In 2008 we were teased with solid state hard drives, but with less than stellar performance at outrageous prices, they’ve been on the fringe. In 2009, prices and read/write times will both come down on solid state drives, but with the increased capacity of USB memory sticks (32 GB, 64 GB and up), we likely won’t see SSDs hitting the mainstream this year. Instead I think we’ll see an increase in the number of people keeping their entire lives on USB flash memory sticks. Hopefully we’ll see sync & backup software such as Windows Live Sync, ActiveSync, Windows Home Server, etc. become more aware of these portable memory devices that may get synced from any device in your mesh.

Camera flash will need a new format, as SDHC currently maxes out at 32 GB and the demand for HD video recording on still and video cameras keeps growing. In the meantime, we’re seeing rock-bottom prices on 2 GB cards. Maybe somebody will come out with an SD RAID device that lets us plug in a bank of 2 GB SD cards.

Growing up in the Cloud

Cloud computing is going to be a very long-term trend, and I think we’ll only see baby steps towards it in 2009. In the consumer space we’ll see more storage of digital media in the cloud, online backup services, and the move of many applications to the cloud. Perfect for your Touch Zune Phone and Touch NetBook without an optical drive, eh? IT shops will take a bit longer to embrace the cloud. Although many IT data centers are largely virtualized already, applications are not all that virtual today, and that doesn’t seem to be changing soon, as developers have not wholeheartedly adopted SOA practices, addressed scalability and session management issues, nor adopted concepts such as multi-tenancy. As we do more with less in 2009, we won’t see that changing much, as a lot of software out there will be in “maintenance mode” during the recession.

Maybe, Just Maybe, this is the year of the Conveniently Connected Smart Client

Adobe AIR and Silverlight are mainstreaming web-deployed, automatically updated rich client desktop apps. It’s hard to take advantage of touch interfaces and massive portable flash storage from within a browser. All of these other trends can influence Smart Client applications, potentially to a tipping point. We’ll hopefully see out-of-browser, cross-platform Silverlight applications in 2009 to make this an easy reality on the MS stack.

Incremental, Value-Based and Agile Software Development

Many of my customers began large-scale rewrites of their key software assets in 2008, many of them against my recommendations. For most of my key customers in 2008 and into 2009, I’m an advocate of providing incremental value in short iterative releases, not major rewrites that take 6+ months to develop. Even if your application is written in PowerBuilder 6 or Classic ASP, avoid the temptation to rewrite any code that won’t see production for 4 months or longer. We can work towards componentized software by refactoring legacy assets and providing key integration points, so that we can release updated modules towards a gradual migration. It is difficult for software teams in this economy to produce big-bang, “boil the ocean”, cathedral-building projects. We simply can’t predict what our project’s funding will be 4 months from now, or whether we’ll be owned by another company, scaled down, outsourced or just plain laid off. That is, of course, unless you work for the government. Government spending will continue if not increase in 2009, but still, try to spend our taxpayer money wisely by delivering short incremental software releases. It allows you to build trust with your customers, mark a line in the sand and move onward and upward, and lets you move quickly in times of fluid business requirements and funding issues.

Incremental, Value-Based software development isn’t easy. It takes lots of work, creative thinking, and more interop and integration work than one would prefer. It might easily seem like an approach that costs more in the long term, and in some cases you could be right. But if a company has to throw out work in progress after 6-8 months, or never sees the value of it because of other changing business conditions, then what have you saved? Probably not your job, anyway.

My Book is now Available

In the excitement of PDC, it slipped my mind to let everyone know that the book on which I was a co-author was actually shipped at the beginning of October. The title is the terse, yet incredibly descriptive MCTS Self-Paced Training Kit (Exam 70-503): Microsoft® .NET Framework 3.5 Windows® Communication Foundation (PRO-Certification). There is a bidding war for the movie rights and I'm hoping that George Clooney plays me in the adaptation. :)

For those of you wondering how the actual release might have slipped my mind, the reason is that I'm not involved in the steps that take place at the end of the publishing process. Most of the book was written in the first half of the year. Since July, I have been reviewing chapters and responding to editor notes. But since the middle of August my tasks have been done. And, I'm afraid, when it comes to book writing, once I'm done, I mentally move on to the next task. So I wasn't even sure when the publication date was. But it was released and, based on the numbers that I've seen so far, it seems to be doing quite well. If any of you have the chance to read it, I'd be thrilled to hear any feedback (both good and bad).