Back Home exploring SQL Genetics

Over the past month I've been doing some Whidbey/Yukon training. First in Ottawa, then in Toronto (home), and most recently last week in Calgary - a city I hadn't visited in 10 years, and one that has grown up a lot since then. I really enjoyed Calgary.

I'm spending the day getting re-acquainted with my wife and our daughters. My 6 year old used a word this morning I hadn't heard her use before. She was showing me a fancy painting she did at school and, while comparing her technique to her peers', said that she “used the minimum amount of colors that the teacher suggested”. You know where I'm going with this, don't you?

I asked her what “minimum” meant and she said “the least amount of work”. Ok, she's my daughter, no DNA testing required. But to further test her understanding I asked her what the minimum of 1, 4 and 7 was, and she correctly chose 1. She's no math genius, so I was happy she got it right. She even knew that the maximum was 7. I figured I'd blow her mind (and her mother's, who was listening) by asking her what the sum was. I nearly fell off my chair when she said 12.

So I'm happy to report that SQL skills are indeed part of the human genome. Next week we'll try her on correlated subqueries.

Getting Schema Information from a Database

Yesterday in my class I was going over the new GetSchema API in Whidbey and I learned from one of my students about another technique that I wasn't aware of - SQL ANSI 92 Information Schema Views. They are also supported in Oracle.

There are almost too many ways of getting database schema information, but I'll try to summarize them here to see how they compare. Depending on what you want to do, one of these will be more appropriate for you.

  • A common technique is to execute a SQL statement and describe the result set. Execute a “SELECT * FROM known_table_name”. Some APIs would have you add a “WHERE 1=2” if you can't just “describe” the result set without incurring row-retrieval payload. You can take advantage of this technique using the DataReader's GetSchemaTable method. With this technique you are really getting the schema of a query, not per se the underlying table structure. As such, it doesn't have to be constrained to a “*” or single-table query either.
  • Query the SQL Server system tables - this is close to the metal, which can burn you if you don't mind the built-in fragility. This technique is likely to become obsolete in SQL Server Yukon with the new “sys.*” views.
  • Slightly better than using the system tables would be using system stored procedures such as sp_tables.
  • Use the ANSI SQL-92 standard Information Schema views, such as “SELECT * FROM Northwind.INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = N'Customers'”. Much nicer, and generic across anything that implements this standard (SQL Server 7.0 and up, and Oracle 9i? and up).
  • In .NET 1.0, you could also use the GetOleDbSchemaTable method of an OleDbConnection. This is somewhat hard-coded to an RDBMS and requires an OleDb provider.
  • In .NET 2.0, you can use the new GetSchema method on the connection object. In theory, the idea is that you get schema information specific to the provider - so things that are specific to the provider can be made available/discoverable. But I'll warn you, it's a pretty abstract API. It's also remotely similar at the root level to the MSXML 4.0 “GetSchema” method, in that a) the methods have the same name, and b) they accept an argument naming the schema to return. In theory, you'll get more accurate schema information for the thing you are connecting to. Collections can be defined by the provider, which differs from GetOleDbSchemaTable, which normalizes all db objects into a known set of objects. That may not work so well for new providers, object-relational databases, or who knows what next. A quick sketch comparing a few of these techniques follows below.
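
To make the comparison concrete, here's a minimal C# sketch of three of these techniques side by side. It's a sketch, not production code: the connection string and the Customers table are illustrative only, and the GetSchema call assumes the Whidbey SqlClient provider (collection names like "Tables" are defined by the provider).

using System;
using System.Data;
using System.Data.SqlClient;

class SchemaDemo
{
 static void Main()
 {
  // Illustrative connection string - adjust for your own server/database.
  string connectionString = "Server=(local);Database=Northwind;Integrated Security=SSPI;";

  using (SqlConnection cn = new SqlConnection(connectionString))
  {
   cn.Open();

   // 1. Describe a query's result set with the DataReader's GetSchemaTable.
   //    SchemaOnly (or a WHERE 1=2 clause) avoids pulling any rows back.
   SqlCommand cmd = new SqlCommand("SELECT * FROM Customers", cn);
   using (SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.SchemaOnly))
   {
    DataTable querySchema = rdr.GetSchemaTable();
    foreach (DataRow col in querySchema.Rows)
     Console.WriteLine("{0} ({1})", col["ColumnName"], col["DataType"]);
   }

   // 2. The ANSI SQL-92 INFORMATION_SCHEMA views.
   cmd.CommandText = "SELECT COLUMN_NAME, DATA_TYPE FROM INFORMATION_SCHEMA.COLUMNS " +
                     "WHERE TABLE_NAME = N'Customers'";
   using (SqlDataReader rdr = cmd.ExecuteReader())
   {
    while (rdr.Read())
     Console.WriteLine("{0} ({1})", rdr.GetString(0), rdr.GetString(1));
   }

   // 3. The new Whidbey GetSchema API - the "Tables" collection is provider-defined.
   DataTable tables = cn.GetSchema("Tables");
   foreach (DataRow row in tables.Rows)
    Console.WriteLine(row["TABLE_NAME"]);
  }
 }
}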

Temperature User Defined Type for SQL Server Yukon

In SQL Server 2005/Yukon you can create user defined types using managed code. User defined types can be used in declaring variables and table columns. They are generally considered to only make sense for scalar types - things that can be sorted, compared, etc., not unlike the native numeric types and datetime. This is not however a fixed requirement, but it's definitely the primary scenario being addressed by UDTs.

Here is an example of a temperature type, which is in fact a scalar. All UDTs have to be inserted as strings (i.e. in quotes). Not surprisingly, as part of the contract to implement a user defined type you should have a ToString and a Parse method. This example allows the user to insert values in either Kelvin, Fahrenheit, or Centigrade by suffixing the entry with K, F, or C respectively. The default is Kelvin should you not include a suffix for the scheme, so 273, 273K, 0C, and 32F are all equivalent entries. All entries are converted to and stored in Kelvin. This makes it easy to have the type byte orderable for easy sorting/comparisons; however, this implementation also keeps track of the input format. So if you enter 0C, you'll get 0C back by default. You can however call the x.Centigrade, x.Fahrenheit, or x.Kelvin properties to get the measurement scheme you desire. And since the type is byte orderable, you can sort by columns of this type and you'll get something like....

0K, -272C, 32F, 0C, 273K, 274K, 100C, all ordered by their underlying storage, which in Kelvin is:
0, 1, 273, 273, 273, 274, 373.
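
To see that in action, here's a hedged sketch of exercising the type from client code, assuming the Temperature UDT has already been deployed to the database and that a hypothetical Readings table with a Temp column of that type exists (the table, column, and connection string are made up for illustration, and the beta T-SQL syntax may shift before release):

using System;
using System.Data.SqlClient;

class TemperatureDemo
{
 static void Main()
 {
  using (SqlConnection cn = new SqlConnection("Server=(local);Database=Test;Integrated Security=SSPI;"))
  {
   cn.Open();

   // UDT values go in as strings; Parse picks the scheme from the suffix.
   SqlCommand insert = new SqlCommand(
    "INSERT INTO Readings (Temp) VALUES ('0C'); " +
    "INSERT INTO Readings (Temp) VALUES ('32F'); " +
    "INSERT INTO Readings (Temp) VALUES ('274K');", cn);
   insert.ExecuteNonQuery();

   // Because the type is byte ordered, ORDER BY sorts on the underlying Kelvin
   // storage, while ToString echoes each value back in the format it was entered in.
   SqlCommand select = new SqlCommand(
    "SELECT Temp.ToString() FROM Readings ORDER BY Temp", cn);
   using (SqlDataReader rdr = select.ExecuteReader())
   {
    while (rdr.Read())
     Console.WriteLine(rdr.GetString(0));
   }
  }
 }
}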

Finally, I have also declared a TempAdder user defined aggregate to round out the example. I'm no scientist, but I'm not sure that temperatures are exactly additive, so this is really just for example purposes. The UDA is relatively self explanatory. You are responsible for keeping a variable around to accumulate values into. That leads into the Accumulate method, which gets called for every item in the result set to be aggregated. There is an Init method for initializing your variables, and finally a Terminate method, which is called when the result set is finished and you can return the result. The odd one out is the Merge method, which takes another aggregator. Because of parallel query processing, it is possible for two threads to individually deal with different parts of the entire result set. For this reason, when one thread finishes its part of the aggregation, its aggregator gets passed to the other so they can be merged. That's why I expose my interim total as a public property on the aggregator - so I have access to it in the Merge method. Likewise, if you are calculating averages for example, you may need the count exposed as a public property as well.

Bottom line - creating user defined types and aggregates is pretty straightforward. Any complexity will be a result of the actual complexity of your types. Consider a length type that must work with multiple measurement schemes (metric and imperial) and multiple units of measurement (cm, m, km), but also multiple units of measurement at one time (i.e. 4' 10”). Don't even get me started on 2”x4”s, which aren't actually 2” by 4”. The nice thing about UDTs is that they allow you to encapsulate a lot of this code deep down in your storage layer, allowing your business applications to deal with things more abstractly.

using System;
using System.Data.Sql;
using System.Data.SqlTypes;

[Serializable]
[SqlUserDefinedType(Format.SerializedDataWithMetadata, IsByteOrdered = true, MaxByteSize = 512)]
public class Temperature : INullable
{
 
 private double _kelvin;      // value is always stored internally in Kelvin
 private bool n = true;       // null flag: true until a value is assigned
 private string inputFormat;  // "K", "C" or "F" - the scheme the value was entered in

 public string InputFormat
 {
  get { return inputFormat; }
  set { inputFormat = value; }
 }

 public double Kelvin
 {
  get { return _kelvin; }
  set
  {
   n = false;
   _kelvin = value;
  }
 }

 public double Centigrade
 {
  get { return (_kelvin - 273); }
  set
  {
   n = false;
   _kelvin = value + 273;
  }
 }

 public double Fahrenheit
 {
  get { return (this.Centigrade*1.8 + 32); }
  set
  {
   n = false;
   this.Centigrade = (value - 32)/1.8;
  }
 }
 public override string ToString()
 {
  string s = this.Kelvin + "K";

  if (this.InputFormat == "C")
  {
   s = this.Centigrade + "C";
  }
  else if (this.InputFormat == "F")
  {
   s = this.Fahrenheit + "F";
  }

  return s;
 }

 public bool IsNull
 {
  get
  {
    // n stays true until Parse or one of the setters assigns a value
    return n;
  }
 }

 public static Temperature Null
 {
  get
  {
   Temperature h = new Temperature();
   return h;
  }
 }

 // Accepts entries like "273", "273K", "0C" or "32F" and normalizes them to Kelvin
 public static Temperature Parse(SqlString s)
 {
  if (s.IsNull || s.Value.ToLower().Equals("null"))
   return Null;
  Temperature u = new Temperature();
  string ss = s.ToString();

  if (ss.EndsWith("C"))
  {
   u.Centigrade = double.Parse(ss.Substring(0, ss.Length - 1));
   u.InputFormat = "C";
  }
  else if (ss.EndsWith("F"))
  {
   u.Fahrenheit = double.Parse(ss.Substring(0, ss.Length - 1));
   u.InputFormat = "F";
  }
  else if (ss.EndsWith("K"))
  {
   u.Kelvin = double.Parse(ss.Substring(0, ss.Length - 1));
   u.InputFormat = "K";
  }
  else
  {
   u.Kelvin = double.Parse(ss);
   u.InputFormat = "K";
  }

  return u;
 }

}

using System;
using System.Data.Sql;
using System.Data.SqlTypes;
using System.Data.SqlServer;


[Serializable]
[SqlUserDefinedAggregate(Format.SerializedDataWithMetadata, MaxByteSize = 512)]
public class TempAdder
{
 // Called once before aggregation starts - reset the running total
 public void Init()
 {
  total = new Temperature();
  total.Kelvin = 0;
 }

 private Temperature total;

 public Temperature Total
 {
  get { return total; }
  set { total = value; }
 }

 // Called once for each value in the set being aggregated
 public void Accumulate(Temperature Value)
 {
  this.Total.Kelvin = this.Total.Kelvin + Value.Kelvin;
 }

 // Called to combine partial totals when the query is processed in parallel
 public void Merge(TempAdder Group)
 {
  this.Total.Kelvin = this.Total.Kelvin + Group.Total.Kelvin;
 }

 // Called when the result set is finished - return the aggregated value
 public Temperature Terminate()
 {
  return this.Total;
 }
}



DevTeach June 18-22, Montreal and a $50.00 Rebate Code!

DevTeach is a really nice conference. I went to it for the first time last year and it's a very intimate and interactive conference. While smaller than your TechEds and PDCs, it manages to attract a very good set of speakers: Kevin McNeish, Brian Noyes, Julia Lerman, Don Kiely, Patrick Hynds, Carl Franklin, Mario Cardinal, Ted Neward, Nick Landry, Etienne Tremblay, Sam Gentile, Jim Duffy, Guy Barrette, Eric Cote, Markus Egger, Kate Gregory and me too. I'll be doing a DataSet tips and tricks talk - but mostly just so I can get a free pass to go and see all the other great talks. This year it is being held once again in Montreal, which is a beautiful city with a great night life.

And last but not least, here is a rebate code for $50.00: "TO000OBJSHARP". You can register at http://www.devteach.com

VS Live Toronto 2005 - 10% Discount Priority Code

VS Live is coming to Toronto again - April 13-16th. This year, the event will be right downtown on the lake at the Westin Harbour Castle hotel. Use priority code “BARRY” for a 10% discount when you register. You'll also save an additional $250 if you register early, by March 16th. Toronto is a great place for a conference, and especially when you factor in the exchange rate, it's very affordable for Americans.

I'm speaking at this event, covering some of the methodology customization support in Visual Studio Team System and some new ADO.NET stuff too. Should be a lot of fun.

Get your Free Visual Studio Tools for Office custom applications here.

Ok, this is very cool - it's not often somebody gives you something for free.

Is your company upgrading to Office 2003 between June & July of this year?

Could you envision a customized solution built on top of Office - perhaps using Visual Studio Tools for Office - to solve some business need? If so, not only can we help with that, but Microsoft Canada is willing to foot the bill for the development effort. This is a great risk-free way to try out Visual Studio Tools for Office projects - either the current 2003 or 2005 versions.

For more info, drop me an email. bgervin@objectsharp.com

What are you doing for your summer job?

We have a very exciting internship project coming up this summer for a university student, perhaps co-op - but not mandatory.

This high-profile project is to develop a software system to monitor the various systems in a “green” home that is completely off the power grid. The house is fed by batteries charged by solar and wind (and a backup generator). There are many other systems in place to optimize power usage throughout the house. Although this home is off the grid, it's not off the internet, so the software will have to publish its information and allow remote access via its 2-way satellite system. The software will also be developed in .NET, likely using the current Whidbey beta.

You are a university student with some combination of electrical engineering and computer science. You are a strong, “quality-oriented” programmer. You have strong design skills and are good at listening and capturing requirements. The position will be located in southern Ontario.

If you are interested, please send your resume to bgervin@Objectsharp.com .

.NET Celebrity Auction

Be a sport and click on this link:

http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=5552696499

Then make a generous bid. If you win, you'll get an hour (or more) of help from a .NET guru/celebrity (or possibly me). Better still, you'll also be helping tsunami relief efforts.

The top bid gets to pick their consultant, then the next, and so on and so on. If you are in southern Ontario and you get me, I'll make it up to you by coming to your office for a whole day to hang out - and I'll bring donuts. What will I do? I can tell you everything I know about Visual Studio Team System (breaking all kinds of NDA rules, etc.), try to convince you to use DataSets, do some code reviews, help debug something nasty, defrag your hard drive, organize your mp3s, tell you what DataGrid Girl is really like, whatever.

I'm visiting Vancouver, Calgary, Ottawa, Montreal over the next 3 months so if you live/work near there, my offer stands, pending my schedule. I'll also be in Orlando possibly in June (for TechEd), LA in Sept (for PDC), and Chicago in August, so ditto on those as well.

For more info on how it all works....

http://www.stephenforte.net/owdasblog/#a61b646aa-ca24-47ef-b013-012bf852f79d

And finally, special thanks to the other RD's who are volunteering their time (especially all those fellow Canadians). Last but not least, special thanks to Stephen Forte and Julia Lerman for organizing this.

SqlConnection woes on the Compact Framework over WiFi

Last night at the Toronto .NET Users Group talk I did on mobility, a gentleman had a question about directly connecting to an enterprise SQL Server database from a Pocket PC using the Compact Framework. His users work on a shop floor and sometimes they lose their WiFi signal. While he doesn't keep the SqlConnection open the whole time, it seems that once the WiFi signal comes back his application can't connect to the database using SqlConnection.Open anymore - despite the fact that he can still ping the server.

He was correctly doing a SqlConnection.Close in a finally clause around his data access, but I suspect that even though pooling is not supported on the Compact Framework, this is not doing a proper disconnect of the physical connection (a SQL Profiler session would probably tell you that for sure). A using block in C#, however, will ensure that the SqlConnection is disposed of immediately and will properly terminate the connection. So if you lose and reacquire your WiFi connection between calls, you'll be starting a brand new connection the next time. It's still a good idea to put all of your actual work in a try/catch/finally block with a SqlConnection.Close in the finally.

using (SqlConnection cn = new SqlConnection(CONNECTION_STRING))
{
 try
 {
  cn.Open();
  // do some work with the connection
 }
 catch (SqlException ex)
 {
  // error handling
 }
 finally
 {
  cn.Close();
 }
}

CTTDNUG UG Tomorrow Night: Building Pocket PC Applications with the Compact Framework and SQL CE

As part of the continuing MSDN User Group Tour, I'll be speaking at the Canadian Technology Triangle .NET User Group in Waterloo on Thursday October 7th (tomorrow night). There is a new location for the meeting at Agfa (formerly Mitra) in Waterloo. All the details are here.

Correction: Adam Gallant is going to be the speaker at this event tomorrow night. Sorry for the confusion. It should be a good talk.