Loop optimization and the .NET JIT compiler

The other day, a colleague asked me a performance question involving loops and optimization.

The question was as follows:

When looping through an array in a for loop, should the array's upper bound calculation (array.Length) be placed in a separate variable before the start of the loop so that the expression does not get evaluated each time through the loop?

For example:

int l = theArray.Length;
for(int i = 0; i < l; i++){
  …
}

vs.

for(int i = 0; i < theArray.Length; i++){
  …
}

 

On the surface, this seems to make perfect sense; however, I seemed to remember reading something that suggested that the optimal solution was actually the second form, because the compiler already has an algorithm for dealing with this.

After several minutes of debate, we decided to check the web, and found this interesting article by Emmanuel Schanzer of Microsoft, which suggests that “optimizations like this haven't been helpful for nearly 10 years: modern compilers are more than capable of performing this optimization for you. In fact, sometimes things like this can actually hurt performance.”

Although this sounds counter-intuitive, it really makes sense once you read the article and think about it.
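If you want to see it for yourself, here is a rough timing harness of my own (not from the article; the array size and the TickCount-based timing are arbitrary choices). As I understand it, the JIT recognizes the i < theArray.Length pattern and can eliminate the per-element bounds check on theArray[i], and hoisting the length into a local can actually get in the way of that.

// A minimal sketch of the two loop forms (my own harness, not from the article).
// The raw numbers matter less than the point: the JIT recognizes the
// i < theArray.Length pattern and can drop the per-element bounds check,
// which hoisting the length into a local may prevent.
using System;

class LoopTiming
{
    static void Main()
    {
        int[] theArray = new int[10000000];
        long sum = 0;

        int start = Environment.TickCount;
        int l = theArray.Length;                      // length hoisted into a local
        for (int i = 0; i < l; i++) sum += theArray[i];
        Console.WriteLine("Hoisted length:      {0} ms", Environment.TickCount - start);

        start = Environment.TickCount;
        for (int i = 0; i < theArray.Length; i++)     // length evaluated in the condition
            sum += theArray[i];
        Console.WriteLine("Length in condition: {0} ms", Environment.TickCount - start);

        Console.WriteLine(sum);                       // keep sum live so the loops aren't optimized away
    }
}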

The moral of this story?  Don't try to out-think the compiler.

Sparkle to kill Flash? I think not.

Mary Jo Foley speculates that Avalon is a Macromedia “Flash Killer”.

So I guess that would mean MS is going to extract the Avalon graphics subsystem from Longhorn, including the WinFx/.NET Framework that is the API to it, all into a handy-dandy ActiveX “Avalon Player” control, and we can all embed that in our web pages. That's cool. I guess that will also mean that MS will back-port this to Windows 98, Windows ME, and Mac OS 9.x.

Maybe that article should have been posted on slashdot.

C# vs. VB.NET Interfaces

Visual Basic has an interesting syntax for implementing interface members.

Public Interface IAdd
    Function Execute(ByVal i As Integer, ByVal j As Integer) As Integer
End Interface

Public Interface ISubtract
    Function Execute(ByVal i As Integer, ByVal j As Integer) As Integer
End Interface

Public Class Calculator
    Implements IAdd, ISubtract

    Public Function Add(ByVal i As Integer, ByVal j As Integer) As Integer Implements IAdd.Execute
        Return i + j
    End Function

    Public Function Subtract(ByVal i As Integer, ByVal j As Integer) As Integer Implements ISubtract.Execute
        Return i - j
    End Function

End Class

The key here is that the function names don't have to be the same as they are in the interface definition. I have two interfaces that both expose an Execute method. They are implemented separately as Add and Subtract. Thanks, VB team, for the “Implements” keyword. C# is a different story - and up until a few days ago I didn't think you could do this at all in C#. Here you go:

interface IAdd
{
    int Execute(int i, int j);
}

interface ISubtract
{
    int Execute(int i, int j);
}

public class Calculator : IAdd, ISubtract
{
    #region IAdd Members

    public int Execute(int i, int j)
    {
        return i + j;
    }

    #endregion

    #region ISubtract Members

    int InterfacesCS.ISubtract.Execute(int i, int j)
    {
        return i - j;
    }

    #endregion
}

I used the auto-magic interface template generator built into the VS.NET C# editor. After you type ISubtract on the class declaration, wait for a little tooltip that tells you to hit TAB for a default implementation template. You'll notice the fully qualified function name “InterfacesCS.ISubtract.Execute” on the second declaration of “Execute”. The whippersnappers out there will also notice that the second method is not public. So it's private, you might say. The class view and object browser would agree with you. Don't try putting “private” in front of the declaration, though - the compiler won't like that. And don't even think of trying to call it directly. The only way to call this second method, either internally to the class (why not, it's private) or externally, is by casting the reference to the interface, like so:

Calculator calc = new Calculator();

int result = ((ISubtract)calc).Execute(5,4);
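Just to spell out which method an unqualified call picks (a quick sketch of my own; sum and diff are made-up locals): the public Execute is the implicit IAdd implementation, so calling it straight off the instance does the addition.

Calculator calc = new Calculator();
int sum = calc.Execute(5, 4);                // 9 - binds to the public (IAdd) implementation
int diff = ((ISubtract)calc).Execute(5, 4);  // 1 - the explicit ISubtract implementation needs the cast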

 

If you really want to call the method directly without casting, you'll have to create a wrapper method and expose it publicly (or privately, if that suits your needs).

public int Subtract(int i, int j)
{
    return ((ISubtract)this).Execute(i, j);
}

Thanks to Alex Bershadski for pointing me to a few code snippets in the MS Press C# book about this. The book could really do a better job of identifying the limitations of C# in this regard. At least it can be done. Up until Tuesday, I didn't think it was possible.

EnforceConstraints

Have you ever tried to update a DataSet and received this message?

Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.

Not a problem in a small DataSet, but if your DataSet is more complex, sometimes it's not easy to see the problem right away.

There is a property on the DataSet called EnforceConstraints. When you set this to True, the DataSet will try to enforce all the constraints in the DataSet (non-null columns, foreign keys, unique keys). The problem with this exception is that it tells you very little. You know you have violated a constraint, but which one? The way to find out is to look at the row and column errors.

Below is a method that will attempt to apply the constraints and write the errors out to the output window. I hope someone out there finds it useful.

Public Shared Sub GetDataSetErrors(ByVal ds As System.Data.DataSet)

    Try
        ds.EnforceConstraints = True
    Catch ex As Exception
        Debug.WriteLine("DataSet errors: " & ds.DataSetName)

        For Each table As DataTable In ds.Tables

            Dim ErrorRows As DataRow()
            ErrorRows = table.GetErrors()

            For Each row As DataRow In ErrorRows
                Debug.WriteLine("Table: " & table.TableName)
                Debug.WriteLine(" Row Error: " & row.RowError)

                Dim ErrorColumns As DataColumn()
                ErrorColumns = row.GetColumnsInError()

                For Each column As DataColumn In ErrorColumns
                    Debug.WriteLine("Column: " & column.ColumnName)
                    Debug.WriteLine(" Error: " & row.GetColumnError(column))
                Next

            Next

        Next

    End Try

End Sub
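And in case you live on the C# side of the house, here's a rough usage sketch of the same idea (the DataSet name and the DataSetHelper class holding the method above are made up; put the shared sub wherever suits you):

// A rough C# usage sketch (assumed names): relax checking while loading,
// then let the helper report exactly which constraint is being violated.
using System.Data;

class Demo
{
    static void Main()
    {
        DataSet ds = new DataSet("Northwind");   // hypothetical DataSet
        ds.EnforceConstraints = false;           // turn checking off while the tables are filled
        // ... fill ds.Tables from your data adapters here ...
        DataSetHelper.GetDataSetErrors(ds);      // re-enables constraints and writes any
                                                 // violations to the output window
    }
}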

Zero Touch and the Cache

I had a chance to travel to Calgary earlier this week to speak at the Microsoft Bigger Better Basic conference.  One of the sessions I was scheduled for included a number of demos of the Smart Client technology.  So I'm sitting on the plane running through the demos just to make sure that I don't...er...mess up in front of 600 people, and I get to the Zero Touch Deployment (ZTD) demo.  This is a straightforward demo showing what happens if you move a .NET application onto a virtual directory.  The deployment is done by clicking on a link in a simple HTML page.  No sweat, right?

So I get to the part where the HTML page is displayed, click on the link and...nothing happens.  That strikes me as a little strange, so I start checking the normal stuff.  Does the file exist?  Is it really a .NET application?  Does the link in the HTML page point to the right place?  All looks good.  So now I go on to the more esoteric possibilities.  Do I have the appropriate permissions defined in the Framework Configuration Wizard?  Are there any settings in the virtual directory that might be giving me grief?  Has ASP.NET been installed in the virtual directory?  Still nothing.

Now I'm beginning to pull my hair out.  After another 15 minutes or so of puzzling, I try to view the source for the HTML page.  Nothing happens.  Fortunately, this is a situation that I'm familiar with.  If you can't view the source for a web page, your temporary Internet cache is full.  And then it hits me.  No room in the Internet cache means that the downloaded application doesn't have a place to live.  So I go into the Internet Options, clear out my cache, and ZTD works like it's supposed to.  No damage (other than those hairs that are still stuck between my fingers).

But this does bring out one of the biggest weaknesses in the ZTD model.  The downloaded application lives in the cache, along with all the other pages of stuff from the Internet.  It is too easy for a user to clear out the cache, thus removing the benefit of off-line functionality.  I believe that this situation will be corrected in one of the future products (I can't remember whether it's Indigo, Whidbey or Longhorn).  The correction is to cache the ZTD applications in a different location and to allow a generally more granular level of cache clearing.  Can't come too soon for my taste.  ZTD is nice, but in the hands of the uninitiated, it has the potential to generate some help desk traffic.

October 2003 Visual Studio .NET Documentation Update

It was Eric Rudder who mentioned in his keynote that there was a new update to all the online help for Visual Studio, including something like 5,000 new samples. He mentioned this update like it had been around for months. Well, it was posted on Oct 22/2003, in case you missed it.

It must be good - it's an 80 MB download.

MSBuild vs. NAnt

Correct me if I'm wrong, but isn't MSBuild a knock-off of NAnt, or of any other derivative of Ant?

Is the only reason MS created such a build tool so that internal MS employees could use it? Do their employee contracts prohibit them from using open source tools? That's my speculation. Please correct me if I'm wrong.

It's not a problem if I'm right - but why not just use it internally at MS? Why make it public for everybody's use? We already have a decent tool, and we get the source code to boot - not to mention community support.

I didn't get a chance to go to the MSBuild presentation (TLS347), so I apologize in advance if I'm just not getting it. I have looked at the slides, but there isn't much there - nothing that I don't see in NAnt.

I'm really looking for the killer reason to stop using NAnt and switch to MSBuild. Maybe the answer is this:

MSBuild will be the core build engine underneath Visual Studio and will share project file formats with VS.NET. Certainly this has caused me some grief with NAnt in the past, mostly with VB.NET, since slingshot (the last time I looked) only converted csproj files to build files and not vbproj. So I think the most important thing about MSBuild is not MSBuild itself, but that the C# and VB.NET teams will have to collaborate (or already have) on a unified .proj file format. Hopefully the NAnt contributors will write an XslTransform. As an aside, I hear that this level of integration won't be available for C++ developers. Now getting three teams to talk to each other - that's just impossible. I bet the .proj file format changes again when the C++ team gets around to supporting MSBuild.

In addition to the VS.NET integration, there seems to be only one other significant difference between NAnt and MSBuild: MSBuild includes some kind of “full inner-task dependency” that is supposed to allow for incremental builds. While I might care about this (slightly) for the IDE experience, it's less important for a nightly build scenario - IMHO.

ASP.NET and the Event Log

Today's tidbit revolves around enabling the ASP.NET user to generate entries into the event log.  In an ideal world (hint, hint Microsoft designers), this would be a relatively straightforward process.  Or at least one that didn't require a direct hack into the registry.  But that is not the case at the moment.  So without further ado, here are the steps involved in enabling the ASP.NET user to create event log entries.

1. Launch RegEdit
2. Navigate to HKEY_LOCAL_MACHINE\SYSTEM\
    CurrentControlSet\Services\EventLog\
3. From the menu, select Edit->Permissions
4. Click the Add button and type ASPNET.  (If ASP.NET is running under a different user id, use that id instead.)
5. Click OK.
6. Select the newly added user from the list (ASP.NET Machine User by default).
7. Click on Full Control in the Allow column.
8. Click OK.

It is usually a good idea at this point to restart IIS with the IISReset command (Start | Run | IISReset).
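With the permissions in place, the code side is just the standard System.Diagnostics.EventLog calls. Here's a rough sketch (the "MyWebApp" source name is made up, and creating the source on the fly is exactly what needed the registry change above):

// A rough sketch of writing an event log entry from ASP.NET once the
// registry permissions have been granted. "MyWebApp" is a made-up source name.
using System.Diagnostics;

public class LogHelper
{
    public static void WriteToEventLog(string message)
    {
        const string source = "MyWebApp";

        // Creating the source writes under the EventLog registry key,
        // which is why the ASPNET account needed the extra permission.
        if (!EventLog.SourceExists(source))
        {
            EventLog.CreateEventSource(source, "Application");
        }

        EventLog.WriteEntry(source, message, EventLogEntryType.Information);
    }
}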

For those concerned with the security hole that has been opened up: once these changes are implemented, the ASP.NET user has full control over the Application event log.  Worst case scenario, a bad process could fill up the event log or delete existing log entries.  However, as far as security breaches go, these are fairly minor, especially when compared to the benefits of being able to write log entries.

PDC Aftermath

It's now the Monday after PDC. I just woke up.

What a week! I didn't get a blog entry in on Thursday - it was a very busy day. Talked to some ADO Microsofties about DataSets and Object Spaces. Sat in on a few of the last sessions. Did some shopping. Headed for the airport. For the next 12 hours we travelled.

Now I have to sit down and watch all the presentations I wanted to go to but couldn't get into, or where I picked something else instead.

I hope the DVD comes soon.

 

Bigger Better Basic

Over the next month we are co-presenting with Microsoft at their cross-Canada tour called Bigger Better Basic. ObjectSharp will be in 5 cities: Calgary, Vancouver, Toronto, Montreal and Ottawa.

Come on out.