Windows Live Writer

I finally got around to building a MetaWeblog API Handler for this site, so I can use Windows Live Writer.  It certainly was an interesting task.  I wrote code for XML, SQL Server, File IO, and Authentication to get this thing working.  It’s kinda mind-boggling how many different pieces were necessary to get the Handler to function properly.

All in all, the development was really fun.  Most people would give up on the process once they realize what's required to debug such an interface.  But it got my chops in shape.  It's not every day you have to use a network listener to debug code.  It's certainly not something I would want to do every day, but every so often it's pretty fun.

While preparing, there were a couple of procedures I thought might be tricky to work out.  One in particular was automatically uploading images placed in a post to my server.  I could have stuck with the manual process I started out with: FTP'ing the images to the server, figuring out their URLs, and inserting the img tags by hand.  Or I could let Live Writer and the Handler do all the work.  Ironically, this procedure took the least amount of code of all of them:

public string NewMediaObject(string blogId, string userName, string password,
    string base64Bits, string name)
{
    // Live Writer sends the image base64-encoded; decode it, drop it in
    // the site's media folder, and hand back the public URL for the img tag.
    string mediaDirectory =
        HttpContext.Current.Request.PhysicalApplicationPath + "media/blog/";

    if (authUser(userName, password))
    {
        File.WriteAllBytes(mediaDirectory + name, Convert.FromBase64String(base64Bits));
        return Config.SiteURL + "/media/blog/" + name;
    }
    else
    {
        throw new Exception("Cannot Authenticate User");
    }
}

Now it's a breeze to write posts.  It even adds drop shadows to images.


Live Writer also automatically creates a thumbnail of the image, and links to the original.  It might be a pain in some cases, but it’s easily fixable.

All I need now is more topics that involve pictures.  Kittens optional. :)

ADO.NET Entity Framework and SQL Server 2008

Do you remember the SubSonic project? The Entity Framework is kind of like that. You can create an extensible and customizable data model from just about any source. It takes the boilerplate coding out of developing Data Access Layers.

Entity is designed to separate how data is stored from how data is used. It's what's called an Object-Relational Mapping (ORM) framework. You point the framework at the source, tell it what kind of business objects you want, and poof: you have an object model. Entity is also designed to play nicely with LINQ; you can use it as a data source when querying with LINQ. In my previous post, the query used NorthwindModEntities as a data source, which is an Entity object.
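Here's a minimal sketch of that kind of query. I'm assuming the NorthwindModEntities context and a Customers entity set from the previous post; the names the designer generates for you may differ.

using System;
using System.Linq;

class LinqToEntitiesSketch
{
    static void ListLondonCustomers()
    {
        // Query the conceptual model with LINQ; the context translates
        // this into SQL against the underlying store.
        using (var db = new NorthwindModEntities())
        {
            var londonCustomers = from c in db.Customers
                                  where c.City == "London"
                                  orderby c.CompanyName
                                  select c;

            foreach (var customer in londonCustomers)
                Console.WriteLine(customer.CompanyName);
        }
    }
}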

Entity Framework architecture diagram (courtesy of Wikipedia)

The architecture, as laid out in the diagram:

  • Data source specific providers, which abstract the ADO.NET interfaces to connect to the database when programming against the conceptual schema.
  • Map provider, a database-specific provider that translates the Entity SQL command tree into a query in the native SQL flavor of the database. It includes the store-specific bridge, the component responsible for translating the generic command tree into a store-specific command tree.
  • EDM parser and view mapping, which takes the SDL specification of the data model and how it maps onto the underlying relational model, and enables programming against the conceptual model. From the relational schema, it creates views of the data corresponding to the conceptual model. It combines information from multiple tables into a single entity, and splits an update to an entity into multiple updates to whichever tables contributed to that entity.
  • Query and update pipeline, which processes queries, filters, and update requests, converting them into canonical command trees that are then turned into store-specific queries by the map provider.
  • Metadata services, which handle all metadata related to entities, relationships and mappings.
  • Transactions, to integrate with transactional capabilities of the underlying store. If the underlying store does not support transactions, support for it needs to be implemented at this layer.
  • Conceptual layer API, the runtime that exposes the programming model for coding against the conceptual schema. It follows the ADO.NET pattern of using Connection objects to refer to the map provider, using Command objects to send the query, and returning EntityResultSets or EntitySets containing the result (see the sketch after this list).
  • Disconnected components, which locally cache datasets and entity sets so the ADO.NET Entity Framework can be used in an occasionally connected environment.
    • Embedded database: ADO.NET Entity Framework includes a lightweight embedded database for client-side caching and querying of relational data.
  • Design tools, such as the Mapping Designer, which are also included with the ADO.NET Entity Framework and simplify the job of mapping a conceptual schema to the relational schema, specifying which properties of an entity type correspond to which table in the database.
  • Programming layers, which expose the EDM as programming constructs that can be consumed by programming languages.
  • Object services, which automatically generate code for CLR classes that expose the same properties as an entity, thus enabling instantiation of entities as .NET objects.
  • Web services, which expose entities as web services.
  • High level services, such as reporting services which work on entities rather than relational data.
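To make that conceptual layer API bullet concrete, here's a hedged sketch of the Connection/Command pattern it describes, using the EntityClient provider and Entity SQL. The connection string name NorthwindModEntities is carried over from my earlier example, so treat the specific names as assumptions.

using System;
using System.Data;
using System.Data.EntityClient;

class ConceptualLayerSketch
{
    static void Main()
    {
        // The Connection object refers to the map provider via the
        // model's named connection string.
        using (var conn = new EntityConnection("name=NorthwindModEntities"))
        {
            conn.Open();

            // The Command object sends an Entity SQL query against the
            // conceptual schema, not against the physical tables.
            using (EntityCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText =
                    "SELECT c.CompanyName " +
                    "FROM NorthwindModEntities.Customers AS c " +
                    "WHERE c.City = 'London'";

                // EntityDataReader requires SequentialAccess.
                using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
                {
                    while (reader.Read())
                        Console.WriteLine(reader["CompanyName"]);
                }
            }
        }
    }
}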

DataSet Serialization: Smaller & Faster

DataSets serialize naturally to XML quite well, and you have lots of control over that. Typed DataSets have an XSD with some properties that control it (and of course you can do it programmatically too). But one of the common problems with remoting DataSets is that the default binary serialization is actually just the XML serialization as ASCII. Crude. Some people have even used this fact to extrapolate that the DataSet is internally XML, which isn't true.
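You can see the problem for yourself in a few lines; this throwaway sketch just builds a small DataSet, runs it through the BinaryFormatter, and reports the size:

using System;
using System.Data;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class DataSetSizeDemo
{
    static void Main()
    {
        // Build a small throwaway DataSet.
        DataSet ds = new DataSet("Orders");
        DataTable table = ds.Tables.Add("Order");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Product", typeof(string));
        for (int i = 0; i < 100; i++)
            table.Rows.Add(i, "Widget " + i);

        // "Binary" serialize it with the default formatter.
        MemoryStream ms = new MemoryStream();
        new BinaryFormatter().Serialize(ms, ds);
        Console.WriteLine("Serialized size: {0} bytes", ms.Length);
        // Dump ms.ToArray() to a file and open it: the payload is mostly
        // the DataSet's XML wrapped in a thin binary envelope.
    }
}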

This is improved in Whidbey, where the DataSet finally gets a true binary wire format.
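The opt-in there is a single property; a sketch, reusing the ds and ms from the example above:

// Whidbey-only: flip the DataSet into true binary serialization
// before handing it to the BinaryFormatter.
ds.RemotingFormat = SerializationFormat.Binary;
new BinaryFormatter().Serialize(ms, ds);

But until then, what's a Serializer to do?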

Lots of people have done some great work to do customized binary serialization of datasets. To mention a few:

  • Microsoft Knowledge Base Article 829740 by Ravinder Vuppula, where he demonstrates his DataSetSurrogate class, which wraps a DataSet and converts the internal items into array lists that the framework can then binary-serialize by default. It also contains a ConvertToDataSet method so that you can reverse the process, turning a deserialized surrogate back into a DataSet (sketched after this list).
  • Dino Esposito demonstrates a GhostSerializer for a DataTable in his MSDN article, which does a similar ArrayList conversion.
  • Richard Lowe's Fast Binary Serialization
  • Dominic Cooney's Pickle
  • Angelo Scotto's CompactFormatter goes a step further with a serialization technique that doesn't rely on the BinaryFormatter at all, so it works on the Compact Framework and is even more compact than the BinaryFormatter.
  • Peter Bromberg builds on top of the CompactFormatter to support compression using Mike Krueger's ICSharpCode SharpZipLib in-memory zip libraries.
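To give a feel for the surrogate approach, here's a hedged sketch of a round trip through the KB 829740 pattern. DataSetSurrogate and its ConvertToDataSet method come from the KB sample code, not the framework, so you'd need that class in your project:

using System.Data;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class SurrogateRoundTrip
{
    // The surrogate's fields are plain arrays and ArrayLists, so the
    // default BinaryFormatter output is genuinely binary (and smaller).
    static byte[] SerializeCompact(DataSet ds)
    {
        DataSetSurrogate surrogate = new DataSetSurrogate(ds);
        MemoryStream ms = new MemoryStream();
        new BinaryFormatter().Serialize(ms, surrogate);
        return ms.ToArray();
    }

    // Reverse the process: deserialize the surrogate, then convert it
    // back into a real DataSet.
    static DataSet DeserializeCompact(byte[] bytes)
    {
        MemoryStream ms = new MemoryStream(bytes);
        DataSetSurrogate surrogate =
            (DataSetSurrogate)new BinaryFormatter().Deserialize(ms);
        return surrogate.ConvertToDataSet();
    }
}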