What Makes us Want to Program? Part 3

In my second post I discussed my run-in with ASP, and how PHP was far better.  I ended the post talking about an invitation to a Microsoft event.  It was an interesting event: Greg and I were the only people under 30 there.  When that's a 15-year difference, things get interesting, especially when you need your mother to drive you there…  The talk was a comparison between Microsoft-based technologies and Linux-based technologies.  The presenter was a 10-year veteran of IBM who had worked on their Linux platform and then moved to Microsoft.  For the life of me I can't remember his name.

His goal was simple: disprove myths about Linux costs versus Windows costs.  It was a very compelling argument.  The event was based around the Windows Compare campaign.  Around this time Longhorn (the Longhorn that turned into Vista, not Server 2008) was in pre-beta, soon to go beta, and after discussing it with Greg, we decided to probe the presenter for information about Longhorn.  In a situation like that, the presenter either gets mad or becomes really enthusiastic about the question.  He certainly didn't get mad.

Throughout the rest of the talk the presenter made some jokes at Greg's and my expense, all in good fun.  Encouraged by that, we decided to go one step further at one of the breaks and ask how we could get the latest Longhorn build.  The conversation went something like this:

Me: So how do people get copies of the latest build for Longhorn?
Presenter: Currently those enrolled in the MSDN Licensing program can get the builds.
Me: Ok, how does one join such a licensing program?
Presenter: Generally you buy them.
Me: How much?
Presenter: A couple thousand…
Me: Ok, let me rephrase the question.  How does a student, such as myself or my friend Greg here, get the latest build of Longhorn when we don't have an MSDN subscription, nor the money to buy said subscription?
Presenter: *Laughs* Oh.  Go talk to Alec over there and tell him I said to give you a student subscription.
Me:  Really?  Cool!

Six months later Greg and I somehow got MSDN Premium subscriptions.  We had legal copies of almost every piece of Microsoft software ever commercially produced.  Visual Studio 2005 was still in beta, so I decided to try it out.  I had been less than impressed with Visual Studio 2003, but I really liked ASP.NET, so I wanted to see what 2005 had in store.  At the time PHP was still my main language, but after the 2005 beta I immediately switched to C#.  I had known about C# for a while and understood the language fairly well; it was .NET 1.1 that never took for me.  That, and I didn't have a legal copy of Visual Studio 2003 at the time.

Running a Longhorn beta build with the Visual Studio 2005 beta installed, I started playing with ASP.NET 2.0 and built some pretty interesting sites.  The first was a wiki-type site designed for medical knowledge (hey, it takes a lot to kill a passion of mine).  It never saw the light of day on the interweb, but it certainly was a cool site.  After that came a bunch of test sites I used to experiment with the data controls.

It wasn’t until the release of SQL Server 2005 that I started getting interested in data, which I will discuss in my next post.

Windows Live Writer

I finally got around to building a MetaWeblog API Handler for this site, so I can use Windows Live Writer.  It certainly was an interesting task.  I wrote code for XML, SQL Server, File IO, and Authentication to get this thing working.  It’s kinda mind-boggling how many different pieces were necessary to get the Handler to function properly.

All in all, the development was really fun.  Most people would give up on the process once they realized what's required to debug such an interface, but it got my chops in shape.  It's not every day you have to use a network listener to debug code.  It's certainly not something I would want to do every day, but every so often it's pretty fun.

While preparing, there were a couple of procedures I thought might be tricky to work out.  One in particular was automatically uploading images placed in a post to my server.  I could have stuck with the manual process I started out with: FTPing the images to the server, figuring out their URLs, and manually inserting the img tags.  Or I could let Live Writer and the Handler do all the work.  Ironically, this procedure took the least amount of code of all of them:

public string NewMediaObject(string blogId, string userName, string password,
                             string base64Bits, string name)
{
    string mediaDirectory =
        HttpContext.Current.Request.PhysicalApplicationPath + "media/blog/";

    if (authUser(userName, password))
    {
        File.WriteAllBytes(mediaDirectory + name, Convert.FromBase64String(base64Bits));
        return Config.SiteURL + "/media/blog/" + name;
    }
    else
    {
        throw new Exception("Cannot Authenticate User");
    }
}

Now it's a breeze to write posts.  It even adds drop shadows to images:

[Screenshot: an uploaded image with a drop shadow]

Live Writer also automatically creates a thumbnail of the image, and links to the original.  It might be a pain in some cases, but it’s easily fixable.

All I need now is more topics that involve pictures.  Kittens optional. :)

ADO.NET Entity Framework and SQL Server 2008

Do you remember the SubSonic project?  The Entity Framework is kind of like that.  You can create an extensible and customizable data model from any type of source.  It takes the boilerplate coding away from developing data access layers.

Entity is designed to separate how data is stored from how data is used.  It's called an object-relational mapping (ORM) framework.  You point the framework at the source, tell it what kind of business objects you want, and poof: you have an object model.  Entity is also designed to play nicely with LINQ; you can use it as a data source when querying with LINQ.  In my previous post, the query used NorthwindModEntities as a data source: it is an Entity object.
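As a rough sketch of what that looks like (the entity and property names here are assumptions based on the standard Northwind schema, not code from my previous post):

```csharp
// Hypothetical query against a generated Entity Framework model of
// Northwind.  NorthwindModEntities is the generated object context;
// Customers, City, and CompanyName are assumed from the Northwind schema.
using (var context = new NorthwindModEntities())
{
    var londonCustomers = from c in context.Customers
                          where c.City == "London"
                          select c;

    foreach (var customer in londonCustomers)
    {
        Console.WriteLine(customer.CompanyName);
    }
}
```

The point is that nothing in the query mentions tables, joins, or connection strings; you work entirely against the generated object model.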

Entity Framework
Courtesy of Wikipedia

The architecture, as defined in the picture:

  • Data source-specific providers, which abstract the ADO.NET interfaces to connect to the database when programming against the conceptual schema.
  • Map provider, a database-specific provider that translates the Entity SQL command tree into a query in the native SQL flavor of the database.  It includes the store-specific bridge, the component responsible for translating the generic command tree into a store-specific command tree.
  • EDM parser and view mapping, which takes the SDL specification of the data model and how it maps onto the underlying relational model, and enables programming against the conceptual model.  From the relational schema, it creates views of the data corresponding to the conceptual model.  It aggregates information from multiple tables into a single entity, and splits an update to an entity into multiple updates to whichever tables contributed to that entity.
  • Query and update pipeline, which processes queries, filters, and update requests, converting them into canonical command trees that are then turned into store-specific queries by the map provider.
  • Metadata services, which handle all metadata related to entities, relationships, and mappings.
  • Transactions, to integrate with the transactional capabilities of the underlying store.  If the underlying store does not support transactions, support has to be implemented at this layer.
  • Conceptual layer API, the runtime that exposes the programming model for coding against the conceptual schema.  It follows the ADO.NET pattern of using Connection objects to refer to the map provider, Command objects to send the query, and EntityResultSets or EntitySets to contain the results.
  • Disconnected components, which locally cache datasets and entity sets for using the ADO.NET Entity Framework in an occasionally connected environment.
    • Embedded database: the ADO.NET Entity Framework includes a lightweight embedded database for client-side caching and querying of relational data.
  • Design tools, such as the Mapping Designer, which simplify the job of mapping a conceptual schema to the relational schema and specifying which properties of an entity type correspond to which table in the database.
  • Programming layers, which expose the EDM as programming constructs that can be consumed by programming languages.
  • Object services, which automatically generate code for CLR classes that expose the same properties as an entity, enabling instantiation of entities as .NET objects.
  • Web services, which expose entities as web services.
  • High-level services, such as reporting services, which work on entities rather than relational data.

LINQ and SQL Server 2008

No, Zelda is not back.  LINQ stands for Language Integrated Query.  It's a set of query operators that can be called from any .NET language to query, project, and filter data from almost any type of data source: arrays, databases, IEnumerables, Lists, and so on, including third-party data sources.  It's pretty neat.
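To make that concrete, here is a minimal LINQ to Objects sketch over a plain array; no database required, just the standard query operators:

```csharp
// LINQ to Objects: filter, order, and project an in-memory array.
int[] numbers = { 5, 3, 8, 1, 9, 2 };

var smallSquares = from n in numbers
                   where n < 5      // keep 3, 1, 2
                   orderby n        // order them 1, 2, 3
                   select n * n;    // project to squares

foreach (var square in smallSquares)
{
    Console.WriteLine(square);      // prints 1, 4, 9
}
```

The same where/orderby/select operators work unchanged whether the source is an array, a List, or a database provider.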

Essentially, LINQ pulls the data into data objects, which can then be used as you would use a business object.  The data object is predefined by a LINQ provider.  Out of the box you have LINQ to SQL, LINQ to XML, and LINQ to Objects providers.  Once you define the data object based on the provider, you can start querying data:

[Screenshot of a LINQ to SQL query]


Within the foreach loop, the Customers class is a data class that was defined based on the LINQ to SQL provider.  In this case, the database was Northwind.
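For readers who can't see the screenshot, the shape of such a query is roughly this (the NorthwindDataContext name and entity properties are assumptions; LINQ to SQL generates them from the database):

```csharp
// Hypothetical LINQ to SQL query.  NorthwindDataContext and the
// Customers entity class are assumed to be generated by the designer.
using (var db = new NorthwindDataContext())
{
    var germanCustomers = from c in db.Customers
                          where c.Country == "Germany"
                          select c;

    foreach (var customer in germanCustomers)
    {
        Console.WriteLine(customer.CompanyName);
    }
}
```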

Syntactically, LINQ is very much like SQL, mainly because they both work on the same principle: query (possibly) large amounts of data and act on it appropriately.  SQL is designed to work with large datasets; most other languages work iteratively.  So SQL was a good language to mimic.

However, there is a small problem that I see with LINQ.  If I'm doing all the querying in the DAL instead of in stored procedures within the database, and I need to modify a query for performance reasons, the DAL has to be recompiled and redistributed to each application out in the field.  That could be 10,000 different instances.  Wouldn't it make more sense to keep the query within a stored procedure?  Just a thought...

.Net 2.0 Configuration Articles

I recently looked into creating my own .NET configuration section by implementing a custom configuration handler, which leverages the System.Configuration classes.  Piece of cake, I thought...  I had written my own configuration handlers in .NET 1.1, which was straightforward using the IConfigurationSectionHandler interface.  When using this interface, all you need to do is create your object from the XmlNode that is passed in.  Pretty simple!

I knew System.Configuration had been enhanced from v1.1 to v2.0 (resulting in its own assembly), but I didn't realise the implementation had changed so radically.  When all you are doing is reading data from configuration, the only real change in moving from 1.1 to 2.0 (or higher) is to use the ConfigurationManager class instead of the ConfigurationSettings class.  This doesn't give you much insight into what else has changed.  It turns out that if you now want to create your own configuration section, you need to derive from the right base configuration classes and override the right methods, which is pretty hard to work out from the MSDN documentation because it's not too crash hot!  So I stumbled upon three fantastic articles by Jon Rista, which put me on track in no time...  If my brief experience with the configuration classes and their lack of documentation is anything to go by, Jon has obviously spent a lot of time banging his head so we don't have to.  Thanks Jon!
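For comparison, the 2.0-style approach in its simplest form looks something like this; the section name and property here are made up purely for illustration:

```csharp
using System.Configuration;

// Hypothetical custom section, e.g. <mySection cacheTimeout="30" /> in
// app.config, registered under <configSections>.
public class MySection : ConfigurationSection
{
    // The ConfigurationProperty attribute maps the XML attribute
    // "cacheTimeout" onto this strongly typed property.
    [ConfigurationProperty("cacheTimeout", DefaultValue = 30, IsRequired = false)]
    public int CacheTimeout
    {
        get { return (int)this["cacheTimeout"]; }
        set { this["cacheTimeout"] = value; }
    }
}

// Reading it back out:
// var section = (MySection)ConfigurationManager.GetSection("mySection");
```

Instead of hand-parsing an XmlNode as in 1.1, you declare properties and let the base ConfigurationSection class handle parsing and validation.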

MVP Insider - Q & A with Justin Lee

Well, it seems this month I'm up for being interviewed. Here's the link to my interview.

Follow up on Entity Framework talk at Tech Ed 2008

Last week at TechEd I gave a talk about building data access layers with the Entity Framework.  I covered various approaches, from not having a data access layer at all to fully encapsulating the Entity Framework, with some hybrid approaches along the way.

I gave the first instance of this on Tuesday and then a repeat on Thursday.

To those who saw the first instance of this on Tuesday....

you unfortunately got an abbreviated and disjointed version, for which I apologize.  After I queued up my deck about 15 minutes prior to the talk, I left the room for a minute while people filed in; while I was out, one of the event staff shut down my deck and restarted it from a different folder on the recording machine, and didn't tell me.  I was about a third of the way into my presentation when I realized I had the wrong version of the deck.  At the time I had no idea why this version was running, so I wasn't going to fumble around looking for the correct one.  Given the change in the order of things, I'm not sure if switching decks at that point would have made things better or worse.  I still had no idea why this had happened when I gave the talk again on Thursday, but then the same thing almost happened again; this time I caught the event staff shutting down my deck and restarting it (from an older copy).  Bottom line: sorry to those folks who saw the earlier version.

The complete deck and demo project are attached.  It is a branch of the sample that is part of the Entity Framework Hands-on Lab that was available at the conference, which is included in the .NET 3.5 Enhancements (aka SP1) training kit.  You will need the database for that project, which is not included in my download.

Download the training kit here.


WPF for Developers and Lead Designers Course

Rob Burke is teaching a WPF training course through Toronto-based consultancy ObjectSharp. The course is called “Windows Presentation Foundation for Developers and Lead Designers,” and, as the title suggests, it offers a hands-on experience designed to give developers and lead designers the knowledge, background, tips and references they’ll need to build smart client applications using the Windows Presentation Foundation.

Rob enjoyed the process of training a team of developers and designers to use WPF, and this course is the result of turning that material into something ObjectSharp could offer more widely.

The inaugural course offering is currently scheduled for August 13th-15th.  If you're interested in taking part, you can find more information about the course on ObjectSharp's site.  And if August 13th is too long to wait, or you're interested in an on-site course, please contact Julie James, ObjectSharp's Training Manager.

More on Rob Burke.

Pex 0.5 Released

What is Pex?

Pex generates test inputs that cover all, or at least many, of the corner cases in your .NET code.  These test inputs are plugged into parameterized unit tests that you write.  The result is a small unit test suite, where each unit test calls a parameterized unit test with particular test inputs.  There is a great picture on the main Pex page that illustrates this process.

Pex supports other unit test frameworks, since the unit tests that Pex generates can be executed by other frameworks without Pex.  Out of the box, Pex comes with support for MSTest, the unit test framework of Visual Studio.  For support for other unit test frameworks, please look at the Pex Extensions project.

Parameterized unit tests have been around for quite some time already, under several names -- row tests, data-driven tests, theories, etc.

What is really unique about Pex is that it analyzes your .NET code, instruction by instruction, to understand what your code is doing. Then, in a fully automatic way, Pex computes relevant test inputs that trigger the corner cases of the code. When you write assertions, Pex will try to come up with test inputs that cause an assertion to fail.
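To make "parameterized unit test" concrete, here is a small sketch in Pex's style.  The attribute and assumption APIs shown reflect my reading of the Pex documentation and may differ in detail between versions:

```csharp
using Microsoft.Pex.Framework;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// A parameterized unit test: instead of hard-coding inputs, you state a
// property that should hold for all inputs, and Pex generates concrete
// test cases that exercise the corner cases.
[TestClass]
[PexClass]
public partial class StringTests
{
    [PexMethod]
    public void SubstringRoundTrip(string s, int start)
    {
        // Constrain the inputs Pex is allowed to generate.
        PexAssume.IsNotNull(s);
        PexAssume.IsTrue(start >= 0 && start <= s.Length);

        string tail = s.Substring(start);

        // The property under test: the tail has the expected length.
        Assert.AreEqual(s.Length - start, tail.Length);
    }
}
```

Each concrete input Pex finds becomes an ordinary MSTest unit test that calls this method, which is why the generated suite runs without Pex installed.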

Feedback

To ask questions, get help, or just give feedback, please take a look at our mailing lists.

Links

Homepage: http://research.microsoft.com/pex

Download: http://research.microsoft.com/pex/downloads.aspx

Nikolai Tillmann's Blog: http://blogs.msdn.com/nikolait

Peli de Halleux's Blog: http://blog.dotnetwiki.org/

Release of Microsoft Source Analysis for C#

Source Analysis, also known as StyleCop, analyzes C# source code to enforce a set of best practice style and consistency rules.

Source Analysis for C# can be downloaded here: https://code.msdn.microsoft.com/Release/ProjectReleases.aspx?ProjectName=sourceanalysis.

Source Analysis Blog: http://blogs.msdn.com/sourceanalysis