.NET 2.0 Whidbey Presentation in Toronto

On December 9th, Dave Lloyd and I will present an in-depth sneak peek at the next release of .NET: Visual Studio .NET 2.0 (code-named "Whidbey"). This should be a really fun presentation, but seating is pretty limited. Adam Gallant from Microsoft is going to come by and also demo Longhorn and Avalon. He's getting daily builds, so hopefully he'll show us some cool stuff that wasn't in the PDC build and is maybe even newer than some of the stuff they showed us at PDC. Hope to see some of you there.

NAntContrib "slingshot" vs. NAnt "solution"

I was trying to build the latest NAntContrib project so I could take advantage of their Slingshot task, which automagically converts a Visual Studio solution (*.sln) and projects (*.csproj - not sure about *.vbproj) into a handy-dandy NAnt *.build file, complete with all references, source inclusions, dependencies, and debug, release, and clean targets. Nice.

The only problem with this approach, of course, is that you need to run it daily if you don't want your NAnt builds to break when a new project or reference is added to your solution. Fortunately, NAntContrib exposes Slingshot not only as a command-line tool but also as a native NAnt task. The pain is that NAntContrib doesn't have a "stable release", only a "nightly build" - which, of course, is not built against the NAnt stable release. So I had to throw away the NAnt stable release I'd been using and opt for the latest nightly build of it too. After I got both to compile successfully, I started to battle goofy Slingshot issues like having to map web projects to a local hard-drive path, avoiding spaces in my paths, and ReallyReallyReallyLongPaths. I ended up using subst (e.g. subst W: C:\SomeReallyLongPath) to map a directory to a drive letter to keep it short. All this to create a build file that gets thrown away each and every night.

While browsing through the new things added to NAnt since the stable release I was accustomed to, I discovered a new task: solution. It compiles a whole solution (.sln) file directly - no generation of a build file required. My build file literally goes from hundreds, if not thousands, of lines to this:

<target name="Debug">
    <solution solutionfile="ObjectSharp.Framework.sln" configuration="Debug" />
</target>

This compiles our entire framework - all 8 projects referenced in the .sln file, in the right order, with the same dependencies a developer gets inside the VS.NET IDE. What a concept: share the build files between your build server and your IDE so that there are no surprises or impedance mismatches. It's such a great idea that MS is doing the same thing with MSBuild in Whidbey. I wonder if MSBuild will have add-on tasks like NAntContrib's - for Visual SourceSafe, sending email, running NUnit, executing SQL, and so on. I like my NAnt - not sure I'll be able to break free for MSBuild.
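For context, here's what the whole build file can collapse to. This is only a sketch - the project name, default target, and the Release target are my additions; the solution task line itself is exactly the one above:

```xml
<?xml version="1.0"?>
<project name="ObjectSharp.Framework" default="Debug">

    <target name="Debug">
        <solution solutionfile="ObjectSharp.Framework.sln" configuration="Debug" />
    </target>

    <target name="Release">
        <solution solutionfile="ObjectSharp.Framework.sln" configuration="Release" />
    </target>

</project>
```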

MSDN Experimental Annotations Service uses RSS

I miss Win95 WinHelp. In particular, I was sad to see in Win98 that HTML Help had not kept the annotation feature - the ability to add your own notes to a help topic, any help topic. These were stored in a local .ann file next to the help file, if memory serves.

During his PDC keynote, Eric Rudder mentioned and briefly showed some stuff they were doing with the Longhorn SDK to enable threaded annotations - kind of like discussions attached to a help topic. So I've stumbled on what promises to be a cool site: lab.msdn.microsoft.com. One of the play things is the MSDN Annotations Service. It requires the download of a small plug-in for your browser.

It basically works like this...

You visit any page (in theory) on the MSDN site (including the Longhorn SDK) and you get an annotations window at the bottom. This lets you add your own comments. Nice. The cool thing is, you get to see other users' annotations as well. These annotations are not stored in a local .ann file - no, they are stored on the Microsoft site.

Maybe you don't want other people to see your goofy code snippets. Fortunately, you can subscribe to your own feeds - as long as they are exposed as RSS, like, say, this blog. If you want to attach an entry to a page you're visiting, simply paste the page's URL into your blog entry (like so: http://longhorn.msdn.microsoft.com/lhsdk/ref/ns/microsoft.build.buildengine/c/target/target.aspx).

The annotation service allows you to subscribe to a feed. While you are looking at a given page - like the one above - if the subscribed feed contains a URL to that page, then presto, it shows up as an annotation. Very cool. The stipulation is that the relevant tag in the RSS XML feed has to contain an anchor with that URL.
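As a sketch of what such a feed item might look like (the blog URL here is made up, and I'm assuming the anchor lives in the item's description; the SDK URL is the one from above):

```xml
<item>
  <title>My notes on the Target element</title>
  <link>http://myblog.example.com/archive/target-notes.aspx</link>
  <description>
    Some notes on &lt;a href="http://longhorn.msdn.microsoft.com/lhsdk/ref/ns/microsoft.build.buildengine/c/target/target.aspx"&gt;the Target topic&lt;/a&gt;.
  </description>
</item>
```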

So does MS listen in on your subscribed feeds? No - that's what the small utility plug-in is for. It's all done on the client.

Yet another creative use of RSS. I'm also told that the MS-provided annotations are scraped from newsgroups.

Using a DLL from a Web Service

I had the distinct pleasure of trying to incorporate a non-COM-compliant DLL into a web service yesterday. Along with the issues associated with marshalling parameters (which I'll cover in a separate blog entry), I also had to get the web service to find the DLL that needed to be loaded. I would have thought that simply placing it into the bin directory under the virtual root would be sufficient. Apparently not. Even after doing so, I still got a "could not load DLL" message.

The correct answer is that the DLL needs to be someplace in the directories listed in PATH.  And by default, the bin directory is not in this list. 

For those who are unfamiliar with the process, the PATH is built from a combination of sources. First, the system path, as defined in the System Environment Variables in the System administration applet, is included. Then the user path, defined in the same place, is appended. Finally, any directories added to the PATH through Autoexec.bat are included.

So if you plan on using a DLL in a web service, make sure that either the DLL is installed someplace along your PATH or your PATH is modified to include the web service's bin directory.
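A quick way to catch this ahead of time is to check whether the DLL is actually reachable from the process PATH before the P/Invoke call ever fires. This is only a sketch - legacy.dll and DoWork are hypothetical stand-ins for the real DLL and its export:

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;

public class NativeMethods
{
    // "legacy.dll" and DoWork are hypothetical names. The DLL is only
    // resolved when DoWork is first called, at which point Windows
    // searches the directories on the PATH, as described above.
    [DllImport("legacy.dll")]
    public static extern int DoWork(int input);
}

public class PathDiagnostics
{
    // Walk the process PATH and report whether a file is reachable from it --
    // the check that would have caught the "could not load DLL" error early.
    public static bool IsOnPath(string fileName)
    {
        string path = Environment.GetEnvironmentVariable("PATH");
        if (path == null)
            return false;

        foreach (string dir in path.Split(Path.PathSeparator))
        {
            if (dir.Length > 0 && File.Exists(Path.Combine(dir.Trim(), fileName)))
                return true;
        }
        return false;
    }

    public static void Main()
    {
        if (!IsOnPath("legacy.dll"))
            Console.WriteLine("legacy.dll is not on the PATH; the P/Invoke call will fail.");
    }
}
```

If the check fails, either copy the DLL into a directory already on the PATH or add the bin directory to the PATH, as described above.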

Loop Optimization and the .NET JIT Compiler

The other day, a colleague asked me a performance question involving loops and optimization.

The question was as follows:

When looping through an array in a for loop, should the array's upper bound (array.Length) be stored in a separate variable before the start of the loop, so that the expression doesn't get evaluated on each pass through the loop?

For example:

int len = theArray.Length;
for (int i = 0; i < len; i++) {
  …
}

vs.

for (int i = 0; i < theArray.Length; i++) {
  …
}

 

On the surface, the first form seems to make perfect sense; however, I seemed to remember reading something that suggested the optimal solution was actually the second, because the compiler already has an algorithm for dealing with this.

After several minutes of debate, we decided to check the web and found this interesting article by Emmanuel Schanzer of Microsoft, which suggests that “optimizations like this haven't been helpful for nearly 10 years: modern compilers are more than capable of performing this optimization for you. In fact, sometimes things like this can actually hurt performance.”

Although this sounds counter-intuitive, it really makes sense once you read the article and think about it. In the .NET case, the JIT recognizes the i < theArray.Length comparison and can eliminate the per-iteration array bounds check; hoisting the length into a local variable can defeat that recognition.

The moral of this story?  Don't try to out think the compiler.

Sparkle to kill Flash? I think not.

Mary Jo Foley speculates that Avalon is a Macromedia “Flash Killer”.

So I guess that would mean MS is going to extract the Avalon graphics subsystem from Longhorn - including WinFX, the .NET Framework API on top of it - into a handy-dandy ActiveX "Avalon Player" control that we can all embed in our web pages. That's cool. I guess that will also mean MS will back-port it to Windows 98, Windows ME, and Mac OS 9.x.

Maybe that article should have been posted on slashdot.

C# vs. VB.NET Interfaces

Visual Basic has an interesting syntax for implementing interface members.

Public Interface IAdd
    Function Execute(ByVal i As Integer, ByVal j As Integer) As Integer
End Interface

Public Interface ISubtract
    Function Execute(ByVal i As Integer, ByVal j As Integer) As Integer
End Interface

Public Class Calculator
    Implements IAdd, ISubtract

    Public Function Add(ByVal i As Integer, ByVal j As Integer) As Integer Implements IAdd.Execute
        Return i + j
    End Function

    Public Function Subtract(ByVal i As Integer, ByVal j As Integer) As Integer Implements ISubtract.Execute
        Return i - j
    End Function

End Class

The key here is that the function names don't have to be the same as they are in the interface definition. I have two interfaces that both expose an Execute method; they are implemented separately as Add and Subtract. Thanks, VB team, for the Implements keyword. C# is a different story - and up until a few days ago I didn't think you could do this at all in C#. Here you go:

namespace InterfacesCS
{
    interface IAdd
    {
        int Execute(int i, int j);
    }

    interface ISubtract
    {
        int Execute(int i, int j);
    }

    public class Calculator : IAdd, ISubtract
    {
        #region IAdd Members

        public int Execute(int i, int j)
        {
            return i + j;
        }

        #endregion

        #region ISubtract Members

        int InterfacesCS.ISubtract.Execute(int i, int j)
        {
            return i - j;
        }

        #endregion
    }
}

I used the auto-magic interface template generator built into the VS.NET C# editor. After you type ISubtract in the class declaration, wait for a little tool tip that tells you to hit TAB for a default implementation template. You'll notice the fully qualified function name "InterfacesCS.ISubtract.Execute" on the second declaration of Execute. The whipper-snappers out there will also notice that the second method is not public. So it's private, you might say. The class view and object browser would agree with you. But don't try putting "private" in front of the declaration - the compiler won't like that. And don't even think of trying to call it directly. The only way to call this second method, either internally to the class (why not? it's private) or externally, is by casting the reference to the interface, like so:

Calculator calc = new Calculator();
int result = ((ISubtract)calc).Execute(5, 4);

 

If you really want to call the method directly without casting, you'll have to create a wrapper method and expose it publicly - or privately, if that suits your needs:

public int Subtract(int i, int j)
{
    return ((ISubtract)this).Execute(i, j);
}

Thanks to Alex Bershadski for pointing me to a few code snippets in the MS Press C# book about this. The book could really do a better job of identifying the limitations of C# in this regard. But at least it can be done - up until Tuesday, I didn't think it was possible.

EnforceConstraints

Have you ever tried to update a DataSet and received this message?

Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.

Not a problem in a small DataSet. But if your DataSet is more complex, sometimes it's not easy to see the problem right away.

There is a property on the DataSet called EnforceConstraints. When you set it to True, the DataSet will try to enforce all the constraints it contains (non-null columns, foreign keys, unique keys). The problem with this exception is that it tells you very little. You know you have violated a constraint - but which one? The way to find out is to look at the row and column errors.

Below is a method that will attempt to apply the constraints and write the errors to the output window. I hope someone out there finds it useful.

Public Shared Sub GetDataSetErrors(ByVal ds As System.Data.DataSet)
    Try
        ds.EnforceConstraints = True
    Catch ex As Exception
        Debug.WriteLine("DataSet errors: " & ds.DataSetName)

        For Each table As DataTable In ds.Tables
            Dim ErrorRows As DataRow() = table.GetErrors()

            For Each row As DataRow In ErrorRows
                Debug.WriteLine("Table: " & table.TableName)
                Debug.WriteLine("  Row Error: " & row.RowError)

                Dim ErrorColumns As DataColumn() = row.GetColumnsInError()

                For Each column As DataColumn In ErrorColumns
                    Debug.WriteLine("  Column: " & column.ColumnName)
                    Debug.WriteLine("    Error: " & row.GetColumnError(column))
                Next
            Next
        Next
    End Try
End Sub
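To see the exception and the row errors in action, here's a small self-contained repro. It's C# rather than VB, and the table and column names are invented; the error-walking at the end is the same drill as the helper above:

```csharp
using System;
using System.Data;

public class EnforceConstraintsDemo
{
    // Build a DataSet with a duplicate value in a unique column,
    // snuck in while EnforceConstraints was turned off.
    public static DataSet BuildBrokenDataSet()
    {
        DataSet ds = new DataSet("Demo");
        DataTable orders = ds.Tables.Add("Orders");
        DataColumn id = orders.Columns.Add("Id", typeof(int));
        orders.Constraints.Add(new UniqueConstraint("UniqueId", id));

        ds.EnforceConstraints = false;
        orders.Rows.Add(new object[] { 1 });
        orders.Rows.Add(new object[] { 1 }); // duplicate key
        return ds;
    }

    public static void Main()
    {
        DataSet ds = BuildBrokenDataSet();
        try
        {
            // Throws: "Failed to enable constraints. One or more rows contain
            // values violating non-null, unique, or foreign-key constraints."
            ds.EnforceConstraints = true;
        }
        catch (ConstraintException)
        {
            // Walk the row errors to find out which constraint was violated.
            foreach (DataTable table in ds.Tables)
            {
                foreach (DataRow row in table.GetErrors())
                {
                    Console.WriteLine(table.TableName + ": " + row.RowError);
                }
            }
        }
    }
}
```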

Zero Touch and the Cache

I had a chance to travel to Calgary earlier this week to speak at the Microsoft Bigger Better Basic conference.  One of the sessions I was scheduled for included a number of demos of the Smart Client technology.  So I'm sitting on the plane running through the demos just to make sure that I don't...er...mess up in front of 600 people, and I get to the Zero Touch Deployment (ZTD) demo.  This is a straightforward demo showing what happens when you drop a .NET application onto a virtual directory.  The deployment is done by clicking on a link in a simple HTML page.  No sweat, right?

So I get to the part where the HTML page is displayed, click on the link and... nothing happens.  That strikes me as a little strange, so I start checking the normal stuff.  Does the file exist?  Is it really a .NET application?  Does the link in the HTML page point to the right place?  All looks good.  So now I move on to the more esoteric possibilities.  Do I have the appropriate permissions defined in the Framework Configuration Wizard?  Are there any settings on the virtual directory that might be giving me grief?  Has ASP.NET been installed for the virtual directory?  Still nothing.

Now I'm beginning to pull my hair out.  After another 15 minutes or so of puzzling, I try to view the source for the HTML page.  Nothing happens.  Fortunately, this is a situation I'm familiar with: if you can't view the source for a web page, your temporary Internet cache is full.  And then it hits me.  No room in the Internet cache means the downloaded application doesn't have a place to live.  So I go into Internet Options, clear out my cache, and ZTD works like it's supposed to.  No damage (other than the hairs still stuck between my fingers).

But this does bring out one of the biggest weaknesses in the ZTD model.  The downloaded application lives in the cache, along with pages of other stuff from the Internet.  It is too easy for a user to clear out the cache, thus removing the benefit of offline functionality.  I believe this situation will be corrected in one of the future products (I can't remember whether it's Indigo, Whidbey, or Longhorn).  The correction is to cache ZTD applications in a different location and allow a generally more granular level of cache clearing.  It can't come too soon for my taste.  ZTD is nice, but in the hands of the uninitiated, it has the potential to generate some help desk traffic.

October 2003 Visual Studio .NET Documentation Update

It was Eric Rudder who mentioned in his keynote that there was a new update to all the online help for Visual Studio, including something like 5,000 new samples. He mentioned this update as if it had been around for months. Well, it was posted on Oct 22, 2003, in case you missed it.

It must be good - it's an 80 MB download.