Reflections on the PDC Day 1 Keynote

Bill Gates gave a pretty typical high-level keynote to open the conference this morning. He talked about the past, how far we've come, how now is the most exciting time, and how we're in the most exciting industry. Not that I disagree, but I swear I've heard this keynote before.

After Bill, a series of VPs and architects ran through more product details, and things started to get much more interesting. Chris Capossela gave an end-user rundown of Windows Vista and Office 12, both of which will be released at the same time near the end of 2006.

The UI is just stunning (as it always is in these demos). It was also nice to see the QuickSearch text box integrated throughout both products. Not unlike Google Desktop Search, and using the same engine as MSN Desktop Search, the QuickSearch text box gives context-sensitive searching within the application. If you're in a document explorer, you can search there for documents. If you're in the Start menu, you can easily search for applications (and recent documents). If you're in Outlook, you can easily search your inbox, contacts, etc. Of course you can do broad computer-wide searches too, but that context is nice.

Chris also showed off Sidebar, which isn't really big news, but he also showed the audience Sideshow. Sideshow uses the same dockable gadgets that Sidebar does, but reuses them on what I can only describe as a Pocket PC-like device built into the cabinetry of your laptop. This lets you check real-time information (email, appointments, etc.) without turning on or booting up your laptop. Expedia had a nice gadget working in Sideshow that showed up-to-the-minute flight status. Nice.

RSS is also taking a prominent position in Vista and Office. An RSS store was announced that will hold subscribed RSS feed content. The content will be downloaded automatically on a regular schedule and will be available to Sidebar, Outlook, IE7, and your own applications. Cool.

Office 12 has a new user interface that aims to make more of its features discoverable. At first glance I wasn't all that excited about this, but I'll reserve my judgment until I play around with it. The quick "wizard"-like features were absolutely stunning though.

The integration between Outlook and SharePoint is quite impressive. We're all accustomed to having our email/contacts/appointments stored offline in our Outlook store. With Office 12, you can sync with any SharePoint folder to keep those files in your local store. Sweet. Better yet, there's a special new SharePoint list for sharing PowerPoint decks. When you upload a PowerPoint file, an item appears in the list for each slide. From within PowerPoint, I can create a new deck and pull individual slides from the SharePoint server. You can optionally have it keep a slide up to date, so if a new version is uploaded to the server, you'll automatically get it. Corporate plagiarism has just become so much easier.

After Chris's "consumer" demos, Jim Allchin came out with Don Box, Chris Anderson, Anders Hejlsberg and Scott Guthrie. Jim started by giving demos of some interesting plumbing bits. One cool thing in Vista is SuperFetch, a preloaded memory cache of things you'll likely need; unlike typical hard drive caches, it bases its decisions on an analysis of your behavior over days, weeks, and months to determine what an idle machine should be preloading. The second part of his demo tied in very nicely: he showed that any USB memory stick can be plugged into a Vista machine, which will automatically start using it as expanded virtual RAM. That totally rocks for laptops, which can quickly max out their RAM capacity.

Don and Anders went on to talk about some big news, namely the Language Integrated Query (LINQ) project. LINQ provides a query engine on top of XML, object, and relational data stores using a common query language reminiscent of SQL. No, this isn't an O/R mapping tool, but you can see why they may have wanted to delay ObjectSpaces until they got LINQ out the door. I'll have more on this in my blog in the coming days. Attendees at PDC are getting LINQ bits to try out; don't forget to stop by the track lounge to pick up a copy of a LINQ whitepaper.
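
To give a flavour of it, here's a minimal sketch of a LINQ query over plain in-memory objects. A caveat: the PDC preview bits use a System.Query namespace, while I've written this against the System.Linq naming so it stands on its own; treat the namespace details as assumptions.

using System;
using System.Linq;

class LinqSketch
{
 static void Main()
 {
  string[] cities = { "Toronto", "Montreal", "Vancouver", "Winnipeg" };

  // One SQL-like query syntax over objects; the same pattern is
  // meant to extend to XML (XLinq) and relational stores (DLinq).
  var longNames = from city in cities
                  where city.Length > 7
                  orderby city
                  select city.ToUpper();

  foreach (string name in longNames)
   Console.WriteLine(name); // MONTREAL, VANCOUVER, WINNIPEG
 }
}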

Next, Don and Chris messed around with Indigo, and they also created a goofy Avalon application. Scott Guthrie came out to show off Atlas, a set of cross-browser JavaScript libraries and server-side ASP.NET 2.0 controls that make Ajax-style programming a snap.

To close out the lengthy presentation, Jim brought out a few other people to demonstrate complete applications for the wow factor, including Microsoft Max and a kiosk application created for The North Face.

UPDATE: Dinesh Kulkarni gives some inside scoop on how ObjectSpaces is dead, or rather has morphed into DLinq. It would appear ObjectSpaces is not something you'll see built down the road on top of LINQ.

Metro Toronto .NET User Group Meeting September 8th: Managed Code in Sql Server 2005

On September 8th I'll be speaking at the .NET User Group here in Toronto. I'll be talking about how developers can take advantage of SQL Server 2005's ability to host managed code. Full abstract and registration details are here.

DevTeach Conference in Montreal

I'm going to be heading out in a couple of weeks to DevTeach in Montreal. In addition to my regular session talk on DataSets, I'll also be participating in an architecture panel discussion as part of the Groupe d’usagers Visual Studio Montréal Software Architecture Special Interest Group's Special Software Architecture Meeting. The meeting is open to conference attendees, members of the user group, and anybody else for $5. Here are the details...

Speaker: Joel Semeniuk, Microsoft Regional Director, Winnipeg

Subject: Software architecture from the trenches


Architecture is the soul of our software. Software architecture truly helps to define our success: if our architecture fails us, our software fails us. But what makes a good architecture? What truly drives architectural decisions? Is one architecture better than another? In this session we will explore and discuss some of these questions while taking a close look at a few real-world examples. In each real-world scenario we will explore the resulting architecture and review the constraints the project faced both during design and during the production and maintenance phases. We will also look retrospectively at each architecture presented and discuss ways it could be improved upon with Microsoft .NET 2.0.


Joel Semeniuk is a founder and VP of Software Development at ImagiNET Resources Corp, a Manitoba-based Microsoft Gold Partner in E-Commerce and Enterprise Systems. Joel is also the Microsoft Regional Director for Winnipeg, Manitoba. With a degree in Computer Science from the University of Manitoba, Joel has spent the last twelve years providing educational, development and infrastructure consulting services to clients throughout North America. Joel is the author of "Exchange and Outlook: Constructing Collaborative Solutions" from New Riders Publishing and a contributing author of "Microsoft Visual Basic .NET 2003 KickStart" from SAMS. Joel has also acted as a technical reviewer on many other books and regularly writes articles for .NET Magazine and Exchange and Outlook Magazine on a variety of infrastructure and development topics. Reach Joel by email at joels@imaginets.com.


Followed by a software architecture expert panel:

Beth Massi, Software Architecture MVP

Joel Semeniuk, Software Architecture MVP, Microsoft Regional Director Winnipeg

Barry Gervin, Software Architecture MVP, Microsoft Regional Director Toronto

Mario Cardinal, Software Architecture MVP

Carol Roy, Microsoft Canada .NET architecture specialist for the public sector


The well-known Nick Landry (MVP, .NET Compact Framework) will act as moderator.


Come hear these experts talk about software architecture hot topics.  You'll also have the chance to ask questions and talk to the panelists.


Monday June 20th, 5:30PM to 9:30PM

Location: Sheraton Centre, 1201 Boulevard René-Lévesque West

Cost: free for all DevTeach attendees and Groupe d’usagers Visual Studio Montréal members; $5 for non-members and non-DevTeach attendees.

Note: this session will be held in English

More info: www.guvsm.net or http://www.devteach.com/BonusSession.asp


When is a database a service?

Do you consider your database a service? It's worthwhile to review the tenets of service-oriented architecture. The first two tenets (boundaries are explicit, and services are autonomous) are probably the most relevant to my question.

If you do all of your data access through stored procedures, then you might say your database boundary is explicit.

If your database doesn't depend on other services or applications to exist properly, then you could say that your database is autonomous. That's a little tricky. Although we may use stored procedures to access functionality in our database, we may have well-known practices requiring that we call the ap_decrease_inventory proc after we call the ap_ship_order proc to make sure our database values are all in check. I wouldn't call a database autonomous if it has to rely on these external rules being enforced.
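
To make that concrete, here's a minimal ADO.NET sketch of the convention at work (the connection string and the @OrderId parameter are my own assumptions; the proc names come from the example above). Every caller has to know to make the second call, which is exactly what undermines autonomy:

using System;
using System.Data;
using System.Data.SqlClient;

class ShippingClient
{
 public static void ShipOrder(int orderId)
 {
  using (SqlConnection conn = new SqlConnection(
   "Data Source=.;Initial Catalog=Orders;Integrated Security=SSPI"))
  {
   conn.Open();

   // The explicit boundary: callers only ever see stored procedures.
   SqlCommand ship = new SqlCommand("ap_ship_order", conn);
   ship.CommandType = CommandType.StoredProcedure;
   ship.Parameters.AddWithValue("@OrderId", orderId);
   ship.ExecuteNonQuery();

   // The autonomy leak: every caller must remember this second call,
   // or inventory silently drifts out of sync.
   SqlCommand decrement = new SqlCommand("ap_decrease_inventory", conn);
   decrement.CommandType = CommandType.StoredProcedure;
   decrement.Parameters.AddWithValue("@OrderId", orderId);
   decrement.ExecuteNonQuery();
  }
 }
}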

I'm going to avoid discussing the last two tenets because I think they are a bit too pure for my question. I'm really just trying to differentiate between two types of databases that I see out there. For my purposes, I refer to these as Databases as Services and Databases as File Systems.

Databases as Services are typically well encapsulated and contain business rules. These databases might be supporting several client applications. You probably take great care with these databases, designing them carefully, perhaps with modeling tools, and encapsulating the persistence function with stored procedures, functions, triggers, etc. You may or may not have a well-defined data access layer in your client applications. You might consider all the stored procs to be your data access layer, so you might call your procs directly from the UI and/or business layers of your application, but that really depends on how well your client application is written. From your database's perspective, you don't really care so much, since it's a well-protected service that operates autonomously.

Databases as File Systems are much less strategic. They serve one purpose only: to save stuff from your application. You probably (hopefully) have a well-defined data access layer in your application; that may even be an Object Relational Mapping (ORM) tool. You probably designed the database to support the persistence of the objects in your application, and, to generalize, you probably only have one application using this database. The most important thing, though, is that all of your business rules should be in your application(s). This type of database doesn't mean you don't have DB-side logic such as stored procedures or triggers. You may decide for optimization reasons that some code needs to live closer to the tables, and that's okay, so long as you realize it's harder to reuse some of that logic in higher layers of your application and you're comfortable having your logic live on multiple platforms.
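
By contrast, here's a rough sketch of the Database as File System style, with the shipping rule from earlier living in the application layer instead of in a convention around procs. All of the type names here are hypothetical:

using System.Collections.Generic;

// Hypothetical domain types, for illustration only.
public class Product { public string Sku; public int Inventory; }
public class OrderLine { public Product Product; public int Quantity; }
public class Order { public List<OrderLine> Lines = new List<OrderLine>(); }

// Stand-in persistence layer; in practice this wraps ADO.NET or an ORM.
public class OrderDataAccess
{
 public void Save(Order order)
 {
  // Writes rows only; no business logic lives down here.
 }
}

public class OrderService
{
 private OrderDataAccess dataAccess = new OrderDataAccess();

 // The shipping rule lives in the application layer, not in a
 // convention about which procs to call in which order.
 public void Ship(Order order)
 {
  foreach (OrderLine line in order.Lines)
   line.Product.Inventory -= line.Quantity;

  dataAccess.Save(order);
 }
}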

Stored procedures are increasingly being used to add encapsulation to our databases; performance is no longer the main rationale for them. And increasingly, we are seeing advanced services in our databases: 4GL code such as Java and .NET managed code is making its way in, along with user-defined types and objects, and with the next version of SQL Server we're seeing a full-fledged message queuing mechanism in Service Broker. You can even host web services directly in SQL Server 2005.

Is your database a service? Which camp do you fall into? Unfortunately, I think many people live somewhere in between, and not by design. Most of the architectural decisions here should be motivated by where you decide to draw your boundary for strategic reasons, not by what is handy at the moment. I'd like to see people make this decision more consciously and remain committed to it. What are your thoughts?

From Rainier to Orcas and beyond.

I've spent the past three days traveling to and from Redmond to visit with a few of the developer tools teams as part of a Software Design Review (SDR). These are a kind of focus group, with the intention of getting qualitative information from folks about what they'd like to see in upcoming development tools. Hopefully I'll be able to talk more about the content after PDC in the fall, so stay tuned. It was a refreshing trip: normally I'm traveling either to learn or to teach, but on this trip I was there to discuss and influence, and I got a real sense of just how carefully Microsoft listens to the community.

It's fitting that I drove down to Redmond from Vancouver, passing through the town of Everett and past Whidbey and Orcas islands (part of the San Juan Islands). These names are probably familiar to some of you as the code names for Visual Studio 2003 (Everett), 2005 (Whidbey), and beyond (Orcas). For the sake of completeness we should add Rainier, the code name for Visual Studio 2002. Geographically, these run from southeast to northwest, passing more or less through Redmond.

Not unlike the development of these versions, the journey between these stops is a winding road, taking you over hill and vale and across several bodies of water. What's after Orcas? Well, the next leg of the journey is as ambitious as the version that follows Orcas: Hawaii. If you look at this path on a map, you'll see that it is indeed quite a leap.

The most interesting thing that happened is that I realized that come Orcas, I'm likely only going to be interested in coding in Visual Basic, and not C#.

15% off of MCSD/MCAD Certification Exams @ Pearson/VUE

Use the following voucher number: MSAU113E1020. Good until August 31, 2005.

Not your father's NGEN

Reid "NGEN" Wilkes has a good article in MSDN Magazine about NGEN 2.0, coming in Whidbey. NGEN is the command-line tool, around since .NET 1.0, that pre-compiles MSIL into native PE code. NGENing your assemblies as part of deployment can be a good performance kick for large applications.

So what are the highlights?

  • When a dependent shared assembly is updated, your executable's NGEN cached image is invalidated. The new NGEN service can queue up requests to do "across the board" re-NGENs to update your cached image. To accomplish this, NGEN keeps track of all of these dependencies, so when an update is deployed - bam - things are queued up and recompiled when your machine has idle time. Way cool.
  • NGEN 2.0 blows out your generic parameterized classes into pre-compiled code (almost 100% of the time), so any performance drag from generic expansion at runtime is gone if you NGEN.

Reid makes some good arguments for taking the time to NGEN your code, describing the downside of the JITter and the overhead it imposes on memory compared to loading native images. What scares me even more is all the dynamic compilation coming in ASP.NET applications. Isn't this going to preclude using NGEN on ASP.NET applications in Whidbey?

But alas, these performance improvements are not always a silver bullet, and in some cases JITting is more performant. The bottom line: test to make sure your assumption that NGEN improves your performance is correct.

Temperature User Defined Type for SQL Server Yukon

In SQL Server 2005 (Yukon) you can create user-defined types using managed code. User-defined types can be used when declaring variables and table columns. They are generally considered to make sense only for scalar types: things that can be sorted, compared, etc., not unlike the native numeric types and datetime. This is not a fixed requirement, but it's definitely the primary scenario being addressed by UDTs.

Here is an example of a temperature type, which is in fact a scalar. All UDTs have to be inserted as strings (i.e. in quotes). Not surprisingly, part of the contract for implementing a user-defined type is providing ToString and Parse methods. This example lets the user insert values in Kelvin, Fahrenheit, or Centigrade by suffixing the entry with K, F, or C respectively. Kelvin is the default if you don't include a suffix, so 273, 273K, 0C, and 32F are all equivalent entries. All entries are converted to and stored as Kelvin. That makes it easy to keep the type byte-ordered for easy sorting and comparison; however, this implementation also keeps track of the input format, so if you enter 0C, you'll get 0C back by default. You can call the Centigrade, Fahrenheit, or Kelvin properties to get the measurement scheme you desire. And since the type is byte-ordered, you can sort by columns of this type and you'll get something like...

0K, -272C, 32F, 0C, 273K, 274K, 100C, all ordered by their underlying storage, which in Kelvin is:
0, 1, 273, 273, 273, 274, 373.

Finally, I have also declared a TempAdder user-defined aggregate to round out the example. I'm no scientist, and I'm not sure temperatures are exactly additive, so this is really just for demonstration purposes. The UDA is relatively self-explanatory. You are responsible for keeping a variable around to accumulate values into; the Accumulate method gets called for every item in the result set being aggregated. There is an Init method for initializing your variables, and a Terminate method, called when the result set is finished, from which you return the result. The odd one out is the Merge method, which takes another aggregator. Because of parallel query processing, it is possible for two threads to individually deal with different parts of the result set; when one thread finishes its part of the aggregation, it passes its aggregator to the other so they can be merged. For this reason I expose my interim total as a public property on the aggregator, so I have access to it in the Merge method. If you are calculating averages, for example, you may need the count as a public property as well.

The bottom line is that creating user-defined types and aggregates is pretty straightforward; any complexity will come from the actual complexity of your types. Consider a length type that must work with multiple measurement schemes (metric and imperial), multiple units of measurement (cm, m, km), and even multiple units at one time (i.e. 4' 10"). Don't even get me started on 2"x4"s, which aren't actually 2" by 4". The nice thing about UDTs is that they let you encapsulate a lot of this code deep down in your storage layer, allowing your business applications to deal with things more abstractly.

using System;
using System.Data.Sql;
using System.Data.SqlTypes;

[Serializable]
// IsByteOrdered = true lets the server sort and compare on the raw
// serialized bytes, which is why everything is normalized to Kelvin.
[SqlUserDefinedType(Format.SerializedDataWithMetadata, IsByteOrdered = true, MaxByteSize = 512)]
public class Temperature : INullable
{
 
 private double _kelvin;      // canonical storage: always Kelvin
 private bool n = true;       // null flag; true until a value is assigned
 private string inputFormat;  // "K", "C", or "F": how the value was entered

 public string InputFormat
 {
  get { return inputFormat; }
  set { inputFormat = value; }
 }

 public double Kelvin
 {
  get { return _kelvin; }
  set
  {
   n = false;
   _kelvin = value;
  }
 }

 public double Centigrade
 {
  get { return (_kelvin - 273); }
  set
  {
   n = false;
   _kelvin = value + 273;
  }
 }

 public double Fahrenheit
 {
  get { return (this.Centigrade*1.8 + 32); }
  set
  {
   n = false;
   this.Centigrade = (value - 32)/1.8;
  }
 }

 public override string ToString()
 {
  string s = this.Kelvin + "K";

  if (this.InputFormat == "C")
  {
   s = this.Centigrade + "C";
  }
  else if (this.InputFormat == "F")
  {
   s = this.Fahrenheit + "F";
  }

  return s;
 }

 public bool IsNull
 {
  get
  {
   // n flips to false as soon as any value is assigned
   return n;
  }
 }

 public static Temperature Null
 {
  get
  {
   Temperature h = new Temperature();
   return h;
  }
 }

 public static Temperature Parse(SqlString s)
 {
  if (s.IsNull || s.Value.ToLower().Equals("null"))
   return Null;
  Temperature u = new Temperature();
  string ss = s.ToString();

  if (ss.EndsWith("C"))
  {
   u.Centigrade = double.Parse(ss.Substring(0, ss.Length - 1));
   u.InputFormat = "C";
  }
  else if (ss.EndsWith("F"))
  {
   u.Fahrenheit = double.Parse(ss.Substring(0, ss.Length - 1));
   u.InputFormat = "F";
  }
  else if (ss.EndsWith("K"))
  {
   u.Kelvin = double.Parse(ss.Substring(0, ss.Length - 1));
   u.InputFormat = "K";
  }
  else
  {
   u.Kelvin = double.Parse(ss);
   u.InputFormat = "K";
  }

  return u;
 }

}

using System;
using System.Data.Sql;
using System.Data.SqlTypes;
using System.Data.SqlServer;


[Serializable]
// A user-defined aggregate implements the Init / Accumulate / Merge /
// Terminate contract described above.
[SqlUserDefinedAggregate(Format.SerializedDataWithMetadata, MaxByteSize = 512)]
public class TempAdder
{
 // Called once before aggregation begins; reset the running total.
 public void Init()
 {
  total = new Temperature();
  total.Kelvin = 0;
 }

 private Temperature total;

 public Temperature Total
 {
  get { return total; }
  set { total = value; }
 }

 // Called once for each row in the set being aggregated.
 public void Accumulate(Temperature Value)
 {
  this.Total.Kelvin = this.Total.Kelvin + Value.Kelvin;
 }

 // Called when parallel query processing splits the set across threads;
 // folds another aggregator's interim total into this one.
 public void Merge(TempAdder Group)
 {
  this.Total.Kelvin = this.Total.Kelvin + Group.Total.Kelvin;
 }

 // Called when the set is finished; returns the final result.
 public Temperature Terminate()
 {
  return this.Total;
 }
}
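
To round things out, here's a hypothetical client-side smoke test. It assumes the assembly has already been deployed and the type registered (CREATE ASSEMBLY / CREATE TYPE), and that a Readings table with a Temperature column already exists; none of that setup is shown, and the connection string is an assumption. Inserting the string forms goes through Parse, and the ORDER BY works because the type is byte-ordered:

using System;
using System.Data.SqlClient;

class TemperatureSmokeTest
{
 static void Main()
 {
  using (SqlConnection conn = new SqlConnection(
   "Data Source=.;Initial Catalog=Scratch;Integrated Security=SSPI"))
  {
   conn.Open();

   // Each string is converted to a Temperature via Parse on insert.
   foreach (string reading in new string[] { "0K", "-272C", "32F", "274K" })
   {
    SqlCommand insert = new SqlCommand(
     "INSERT INTO Readings(Reading) VALUES (@r)", conn);
    insert.Parameters.AddWithValue("@r", reading);
    insert.ExecuteNonQuery();
   }

   // Sorting uses the byte-ordered Kelvin representation, while
   // ToString() renders each value in its original input format.
   SqlCommand query = new SqlCommand(
    "SELECT Reading.ToString() FROM Readings ORDER BY Reading", conn);
   using (SqlDataReader rdr = query.ExecuteReader())
   {
    while (rdr.Read())
     Console.WriteLine(rdr.GetString(0)); // 0K, -272C, 32F, 274K
   }
  }
 }
}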



Modeling as a Productivity Enhancer in Visual Studio Team System

It's been a theme for me over the past couple of weeks: people keep telling me they can't afford the time to do modeling. If you've done a lot of UML modeling, you know what I'm talking about. But it doesn't have to be that way. Just to ward off the UML flames: even when UML modeling seems like a waste of time, it may actually pay for itself if you uncover a requirement or a design flaw that you wouldn't have otherwise. Certainly, catching this kind of thing earlier rather than later pays for itself. But that's not what I'm talking about.

In my days as an ERwin/ERX user, data modeling was done in that tool not only because it was good to visualize something quickly before committing to code; using ERwin/ERX was just plain faster than cutting DDL code manually, or heck, even than using the database diagramming in SQL Server. One simple feature was foreign key migration: you drew a relationship from parent to child and bam, it copied down the PK from the parent table and set it up as an FK on the child table.

For the VSTS Class Designer (or any of the designers, for that matter) to be successful, MS needs to make using them just plain faster than stubbing out the code manually. Visual Studio has a great editor, so they have their work cut out for them. It gets even better with things like code snippets. Why not enable some of the code snippets in the Class Designer? I can still create a single field wrapped up by a public property faster in the code editor using a snippet (in C#, "prop") than using the Class Designer, but I don't see why they couldn't add support for that in the Class Designer too.
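
For reference, here's roughly what the C# "prop" snippet stubs out once you tab through and fill in the placeholders (the class and names here are mine):

public class Counter
{
 // What "prop" expands to: a private backing field
 // wrapped up by a public property.
 private int count;

 public int Count
 {
  get { return count; }
  set { count = value; }
 }
}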

A feature I stumbled upon last week was the ability to override a member. Simply right-click on the descendant class, select "Override Member", and you'll see a list of members from the ancestor that are overridable. Select the member to override and bam, you have the code stub. This reminds me a bit of the Inherited Class Skeleton Generator. These are the kinds of productivity features that can make models/designers more usable, even if just for productivity's sake.
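
The stub it generates is just the standard override skeleton you'd otherwise type by hand, something like the following (the types and member here are hypothetical):

public class Animal
{
 public virtual string Describe()
 {
  return "some animal";
 }
}

public class Dog : Animal
{
 // What "Override Member" stubs out on the descendant class;
 // the designer drops you here to fill in the body.
 public override string Describe()
 {
  return base.Describe();
 }
}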

There are obviously some types of edits that are better performed in the code editor, such as actually editing code fragments. Other types of edits are more appropriately performed in the model, such as stubbing out the API/members of a class, overriding members, etc. And let's not forget the types of edits that are much better done in a WYSIWYG designer, such as the Windows or Web Forms designers.

One thing I'd like to see come to the Class Designer is a flattened-out view of a class that collapses all inherited members into one layer. I'll call this the consumer (or IntelliSense) view of a class. It's helpful for viewing the entire interface of a class so I can see how the whole thing makes sense to a user. I would propose a greyed-out notation, or perhaps a lock icon, or something similar to the online help view of a class.

Copy & Paste Support in VSTS Class Designer/Whitehorse

You may have noticed that the refactoring menus you see in the code editor are also available in the Class Designer. Furthermore, you can copy & paste things from one class to another. So if you copy a property from one class to another, not only do you get the property added to the class, but also all the code in your getters and setters. OK, this is a nice touch that can help with a refactoring effort that requires moving stuff around, but don't consider this code reuse :)