The rumours have been swirling. Now the truth is out.
On Saturday, January 14, there will be a Toronto Code Camp. You can register/find more information/hang out at http://www.torontocodecamp.com/
If you're a developer looking for in-depth content from the people who know (that would be other developers), then the code camp is the place to be. If you're a developer who has in-depth content that other developers could use, this is your chance to shine. Either way, it will be a blast. Clear your schedule now to be there.
…you hear a presenter say “In the past…er…I mean in the version that’s about to ship”
One of the cool things about being an MVP is that I get invited to the annual MVP Summit. One of the bad things about being at the Summit is that it is covered by an NDA. Which means I don’t get to blog about anything that I’m seeing here. That having been said, I think I can get away with this.
Yesterday there was a morning of executive keynote presentations. It was in a large facility, complete with big screens, translators and closed captioning. Closed captioning that was hand-typed as the executive was speaking. By someone who isn’t intimately familiar with technological terms. The following is a brief list of the mistypings:
Lynn Yucks = Linux
urine testing = unit testing
martyr = smarter
Almost seems Freudian, doesn’t it?
I had the opportunity earlier this week to listen in on an MSDN chat covering the enhancements to C# that were unveiled at PDC. “Listen” being an imprecise term given that it was a chat, you understand.
I had two main takeaways from the chat. First, there was a ton of interest in LINQ and the various inner and outer workings of that. I’ll post some observations on that shortly. The second thing I learned was that there seemed to be a high level of confusion over exactly what inferred types are bringing to the table. I think that the real problem is that they don’t bring nearly as much as people expect/want/are afraid of. Please allow me to put the underlying thought into my own words.
In its simplest form, an inferred type looks like the following:
var a = 1;
The ‘var’ keyword is used in place of a data type. At compile time, it is inferred that a is intended to be an integer. From that point forward, the compiler treats a as if it had been defined as:
int a = 1;
So much for the simplest case. And, in fact, using this as the example is a bit of a problem, because no one should ever write code this way. In a case this simple, you should use the second form rather than the first. This makes it difficult to understand when inferred types are even useful. So let me pull out the example that was used in the chat:
Dictionary<string, List<Order>> peopleOrders = new Dictionary<string, List<Order>>();
Now everyone who thinks this is easy to read hold up your hand. In this blog, where there is no syntax coloration, it’s hard to figure out what the name of the variable is, much less what the type is. When this code is read, the programmer will have to give it a second or third glance to get what is going on. And since variable declarations are not normally that important to the business logic of a method, all this syntax does is add friction to the process of understanding. So consider the inferred version:
var peopleOrders = new Dictionary<string, List<Order>>();
Isn’t this much easier to digest? I’m sure that even those people who put their hands up for that last question will agree. And that, ultimately, is the purpose of inferred types: to reduce the friction involved in reading code. Strong typing is still in place. Programs are still type-safe. Casting exceptions will still be thrown when appropriate. In other words, inferred types are just a way to pretty up the syntax of C#. And given some of the code that I’ve looked at in the past, anything that helps keep modules cleaner is worth it.
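To make the “strong typing is still in place” point concrete, here’s a minimal, compilable sketch. The names and types are stand-ins of my own (the chat example used an Order class that isn’t defined here, so I’ve substituted strings):

```csharp
using System;
using System.Collections.Generic;

class VarDemo
{
    static void Main()
    {
        // The compiler infers Dictionary<string, List<string>> here;
        // 'var' is purely a compile-time convenience, not a variant type.
        var peopleOrders = new Dictionary<string, List<string>>();
        peopleOrders["Alice"] = new List<string> { "order-001" };

        // Strong typing still applies. Uncommenting the next line
        // produces a compile-time error, not a runtime one:
        // peopleOrders = "not a dictionary";

        Console.WriteLine(peopleOrders["Alice"][0]);
    }
}
```

The declaration line reads almost as easily as the untyped languages people worry ‘var’ is sneaking in, but the variable is every bit as strongly typed as the verbose version.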
One of the things that really turns my head when it comes to software is great customer service. I have been using a product called ClearContext for about six months now. It is a tool that allows you to prioritize the email that arrives in your Outlook in-box. I have found it quite effective in whittling my in-box down from almost 1000 messages to less than 75.
But this post isn’t intended to be an ad for ClearContext. Instead, it was their service that set them apart. A little while ago, I lost the hard drive on which I ran Outlook. Which means I also lost the licensing information for ClearContext. While I could download and run the product without the license, the nag screen eventually got to me. On Saturday at 1:33 pm, I sent an email to ClearContext asking if they could resend my registration key. Before 1:45 pm, I had received the keys I requested.
Why is this so incredible? Because I don’t get the sense that ClearContext is a huge company. It’s certainly not a Dell, Microsoft or Oracle, with support staff available 24x7. But they gave me exactly what I wanted when I wanted it. You can’t do any better than that, and it’s certainly worth a plug in my blog.
I’ve gotten my hands a little more dirty with SQL 2005 over the last couple of months. I’ve used CLR triggers and stored procs to implement some functionality that would have been difficult to do using T-SQL. I’ve worked with the XML datatype to generically extend the fields that can be persisted in a table. In other words, I’ve played with some of the newer and potentially more contentious features of SQL 2K5.
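For a flavor of what the CLR side of that looks like, here’s a minimal stored procedure sketch. The method name is made up, and this is only the C# half; deploying it still requires CREATE ASSEMBLY and CREATE PROCEDURE ... EXTERNAL NAME on the SQL Server side:

```csharp
using Microsoft.SqlServer.Server;

public class StoredProcedures
{
    // Marks the method as a SQL Server CLR stored procedure.
    [SqlProcedure]
    public static void SendGreeting()
    {
        // SqlContext.Pipe sends messages and result sets back to the
        // calling connection, much as PRINT or SELECT would in T-SQL.
        SqlContext.Pipe.Send("Hello from the CLR");
    }
}
```

Anything more elaborate than string manipulation or external calls is where this approach pays off; simple set-based work is still better left to T-SQL.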
While cruising through a number of blogs today, I came across this post from Don Demsak. In it, he refers to an eWeek article DBAs Bar Door Against Big Bad .Net Wolf.
My own experiences suggest that this tone is just the beginning. I recently had a chance to hear a presentation on the Web Service functionality that SQL Server provides. For the uninitiated, it is possible to define an HttpEndpoint in SQL Server. On this endpoint, methods can be defined that expose stored procedures. Interesting functionality to include in SQL Server, but what I found most interesting was the angle that the speaker took in support of the idea of exposing your SQL Server machine to the Internet. The suggestion was that, with ASP.NET Web services, the DBA has no idea what data or functionality the ignorant (my word, but their tone) developer created. By exposing their own Web services, the DBA could take control over the exposed data by eliminating the middle man.
My first instinct was ‘whoa’. The approach seemed a little extreme. Maybe I’m just an ignorant developer, but for most of my professional life, the idea of having the machine running the database server exposed to the Internet, even if it was ‘only’ port 80, was never considered a good idea. And the HttpEndpoint doesn’t include support for any of the WS-* standards. It is basically just the same sort of Web service that was available back in ASP.NET 1.0, before the introduction of WSE.
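For reference, exposing a stored procedure this way looks roughly like the following T-SQL. This is a sketch from memory; the endpoint, method, site and database names are all invented, and the exact option list varies:

```sql
-- Sketch of a SQL Server 2005 SOAP endpoint; all names are hypothetical.
CREATE ENDPOINT Orders_Endpoint
    STATE = STARTED
AS HTTP (
    PATH = '/sql/orders',
    AUTHENTICATION = (INTEGRATED),
    PORTS = (CLEAR),
    SITE = 'MyServer'
)
FOR SOAP (
    WEBMETHOD 'GetOrders' (NAME = 'OrdersDb.dbo.GetOrders'),
    WSDL = DEFAULT,
    BATCHES = DISABLED,
    DATABASE = 'OrdersDb'
);
```

Note that everything above happens inside the database server itself, which is exactly why the idea of pointing it at the Internet made me nervous.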
Does anyone else think this feature is a good idea? Am I just being short-sighted and/or narrow-minded?
One of the problems with keeping one eye on the next version of software is getting too comfortable with the code name. I still say Whidbey instead of VS.NET 2005, and it will take me a while yet before that problem goes away. And Microsoft has just added another synonym to that problem. Please welcome Windows Vista
as the ‘real’ name for that which has been called Longhorn for, what, about two years? Yeah, like I’m going to be able to make that adjustment quickly.
I don’t normally post on what’s going on in my personal life. For the most part, nobody would care. Not to mention that my life just isn’t that interesting. However, having just had our 5-month-old chocolate Labrador spayed, I found this particular comic
(found at Basketcase Comix) to be the biggest laugh I’ve had in a few weeks. So how could I not pass it on?
I spent part of my weekend perusing blogs and other light bedtime reading. In doing so, I came across an interesting entry by Rocky Lhotka. He discusses the Mort persona and its impact on the features found in VB.NET. I agree with his observation that the continuous suggestion that VB is not a ‘real’ language has driven a large number of Morts to C#. I did a number of the MSDN Whidbey User Group tours across Canada and was surprised at how often C# was the language of choice even when the development team was already familiar with VB. So the trend of adding Mort-desired functionality is only going to pick up speed.
So is this a good thing for C# developers? I, for one, believe that it is. Many of the productivity advancements in Visual Studio came up through the VB.NET ranks. Improved IntelliSense. Edit and continue. Having Mort push on the C# IDE will only improve it over time. If you’re disappointed by the impurities that might be added to C#, don’t fret. You can always take up Ruby.
Have a weblog? Take the following survey and help out an MIT research project.
BTW, the 'Cameron' I was thinking of is my son. He needs freeing from the enslavement he feels from being 7. :)