AzureFest Follow-Up Links & Videos

Cory Fowler stands beside the big screen in Microsoft Canada's MPR room

This past Saturday, December 11th, Microsoft and ObjectSharp hosted AzureFest, a community event to raise interest in and teach a little bit about Microsoft's cloud platform, Windows Azure.

My colleague Cory Fowler gave an introductory rundown of the Azure platform and pricing, and then demonstrated for those in attendance how to go about Creating an Account and Deploying an Azure Application.

The best part is that our good friends at Microsoft Canada offered $25 in User Group Funding for each person in attendance who followed along on their laptop to activate an Azure account and deploy the sample application.

Now Held Over!

The even better part is that MS Canada is extending the offer online until December 31st for anybody who goes through this process to activate an account and deploy the sample application. We've got the instructions for you here, and it will take you approximately 15 minutes to go through the videos and deploy the sample.

  1. Download the application package that you’ll need for the deployment here.
  2. Create an Azure Introductory Account (5 minutes). You’ll need
    • a Windows Live ID (if you don't have one, click here for instructions)
    • a Valid Credit Card (don’t worry, in step 4 we’ll show you how to shut down your instance before you get charged).
    • navigate to www.Azure.com and follow along with these instructions
      Signing up for Windows Azure
  3. Deploy the Nerd Dinner Application (8 minutes)
    • follow along with these instructions
      Deploying the Nerd Dinner Package to Azure
    • email a screenshot of your deployed application showing the URL and the name of your user group to cdnazure@microsoft.com
    • Specify TVBug, Metro Toronto .NET UG, CTTDNUG, Architecture UG, East of Toronto .NET UG, Markham .NET UG, etc.
  4. Tear down the application to avoid any further charges (2 minutes)
    • Tearing down a Windows Azure Service

Here are the slides from AzureFest

Stay tuned here for the next part of our video blogs where we will review:

  • Deploying a SQL Database to Azure
  • Installing the Azure Tools for Visual Studio and SDK
  • Deploying ASP.NET Applications from within Visual Studio

Technology Predictions and Trends for 2009

I'm sure we're going to look back at 2009 and say "it was the best of times, it was the worst of times" and it will no doubt be interesting. Here are my predictions….

Social Networking Everywhere

Although online social networking companies are already struggling with diminished valuations, in 2009 we'll see social networks break out of their silos and become essential platform elements that find their way into other online applications such as travel, e-commerce, job-posting boards, online dating services, CRM services, web-based email systems, etc. Blogging is also changing, slowing down in fact. Micro-blogging with status update-esque features in Facebook, Windows Live, and of course the explosion of Twitter will take on even larger roles. It's as true today as it was back in 1964 when fellow Canadian Marshall McLuhan wrote "The Medium Is The Message".

The Death of Optical Media

Okay, so you'll still be able to walk into a video store to rent a DVD or buy a spindle of blanks at your grocery store, but make no mistake about it – the death march is on, and that includes you too, Blu-ray. Blu-ray will never see the adoption curve that DVDs had. They thought they won when HD DVD died, but if winning means dying last, then sure, you won. We'll increasingly be renting our movies on demand through our cable boxes, on our converged PCs and Xbox 360s via services like Netflix. Along with this, the rest of us will start to realize we don't really need to own our libraries of movies. With iPod penetration as high as it is, it may take longer to realize we don't need to own our music either – frankly, we don't own it anyway, even though the pricing models try to convince us we do. I won't go out and predict the death of DRM; frankly, I think 2009 may be the year where DRM starts to get more tolerable once we are clearly renting our music and movies. The Zune Pass is making some inroads here, but until Apple starts offering similar subscription pricing, this may take a bit longer.

The MacBook Air may have been a bit ahead of the curve in dropping the optical drive, but get used to it. Expect more vendors to do the same as they reduce size or cram in additional batteries or hard drives.

The Rise of the NetBook

If 2009 is the year of doing more with less, then this will surely be the NetBook's year. Mainstream hardware manufacturers hate these and their small profit margins, but Acer and Intel will be busy building market share, if not large bottom lines. Who knows, MS may learn to love the NetBook if they can get Acer to start shipping Windows 7 on them this year as well. Be prepared to see these everywhere in 2009, but don't expect to see Apple make one (ever).

Zune Phone

The big story at the end of 2008 has been the global suicide of the original Zune 30s. I predict that tomorrow they shall rise from the dead, but it might take until the 2nd for everybody to figure out that they need to entirely drain the battery. The big news is that there won't be a Zune phone with the MS brand name on it, but the Zune UI will come to Windows Mobile (6.5?), turning legions of touch-based smart phones into music players almost as good as an iPhone. The bad news is that without an App Store to vet software quality, crapware will continue to be the source of reliability issues for the Windows Mobile platform. The good news is that without an App Store, Windows Mobile users will have lots of choice in the software for their devices, not to mention lots of choice in devices, carriers and plans. The battle between Good and Evil may morph into the battle between Reliability and Choice.

Touch Everywhere

Get your head out of the gutter; that's not what I meant. What I did mean is that 12-24 months from now, it will be difficult to purchase a digital frame, LCD monitor or phone without an onscreen touch capability. Windows 7 will light these devices up, and we'll stop thinking about the differences between Tablet PCs and notebooks as they converge into a single device. With the advent of Silverlight, WPF and Surface computing, MS has been banging the "user experience" drum for a while now, but when touch starts to be the expectation and not the exception, we'll have to re-engineer our applications to optimize for the touch experience. This may turn out to be bigger than the mouse or even a windowed operating system.

Flush with Flash

In 2008 we've been teased with solid state hard drives, but with less than stellar performance at outrageous prices, they've been on the fringe. In 2009, prices and read/write times will both come down on solid state drives, but with the increased capacity of USB memory sticks (32 GB, 64 GB and up), we likely won't see SSDs hitting the mainstream this year. Instead I think we'll see more people keeping their entire lives on USB flash memory sticks. Hopefully we'll see sync & backup software such as Windows Live Sync, ActiveSync, Windows Home Server, etc. become more aware of these portable memory devices that may get synced from any device in your mesh.

Camera flash will need a new format, as SDHC is currently maxed out at 32 GB. With the increase in demand for HD video recording on still and video cameras, we'll need a new format. As such, we're seeing rock-bottom prices on 2 GB cards now. Maybe somebody will come out with an SD RAID device that lets us plug in a bank of 2 GB SD cards.

Growing up in the Cloud

Cloud computing is going to be a very long-term trend, and I think we'll only see baby steps toward it in 2009. In the consumer space we'll see more storage of digital media in the cloud, online backup services and the move of many applications to the cloud. Perfect for your touch Zune phone and touch NetBook without an optical drive, eh? IT shops will take a bit longer to embrace the cloud. Although many IT data centers are largely virtualized already, applications are not all that virtual today, and that doesn't seem to be changing soon: developers have not wholeheartedly adopted SOA practices, addressed scalability and session management issues, or adopted concepts such as multi-tenancy. As we do more with less in 2009, we won't see that changing much, as a lot of software out there will be in "maintenance mode" during the recession.

Maybe, Just Maybe, this is the year of the Conveniently Connected Smart Client

Adobe AIR and Silverlight are mainstreaming web-deployed and web-updated rich client desktop apps. It's hard to take advantage of touch interfaces and massive portable flash storage from within a browser. All of these other trends can influence Smart Client applications, potentially to a tipping point. Hopefully we'll see out-of-browser, cross-platform Silverlight applications in 2009 to make this an easy reality on the MS stack.

Incremental, Value-Based and Agile Software Development

Many of my customers began large-scale rewrites of their key software assets in 2008, many of them against my recommendations. For most of my key customers in 2008 and into 2009, I'm an advocate of providing incremental value in short iterative releases, not major rewrites that take 6+ months to develop. Even if your application is written in PowerBuilder 6 or Classic ASP, avoid the temptation to rewrite any code that won't see production for 4 months or longer. We can work towards componentized software by refactoring legacy assets and providing key integration points so that we can release updated modules towards gradual migration. It is difficult for software teams in this economy to produce big-bang, "boil the ocean", build-a-cathedral type projects. We simply can't predict what our project's funding will be 4 months from now, or whether we'll be owned by another company, scaled down, outsourced or just plain laid off. That is, of course, unless you work for the government. Government spending will continue if not increase in 2009, but still, try to spend our taxpayer money wisely by delivering short incremental software releases. It allows you to build trust with your customers, mark a line in the sand and move onward and upward, and lets you move quickly in times of fluid business requirements and funding issues.

Incremental, Value-Based software development isn't easy. It takes lots of work, creative thinking, and more interop and integration work than one would prefer. It might easily seem like an approach that costs more in the long term, and in some cases you could be right. But if a company has to throw out work in progress after 6-8 months, or never sees the value of it because of other changing business conditions, then what have you saved? Probably not your job anyway.

Aspiring Architect - Web cast series

With all this heat, I almost wrote "perspiring". Why not beat the heat and stay cool inside while watching these web casts from MS Canada targeting aspiring architects. With the predicted shortage in IT in the upcoming years, we're sure to see an influx of junior resources into our industry. This is a good opportunity for developers to transition into architecture roles and leverage their existing skill set.

The Aspiring Architect Series 2008 builds on last year's content and covers a number of topics that are important for architects to understand. So it would be a great idea to watch last year's recordings if you haven't already. Links are available here: http://blogs.msdn.com/mohammadakif/archive/tags/Aspiring+Architects/default.aspx.

Upcoming sessions are as follows:

June 16th, 2008 – 12:00 p.m. to 1:00 p.m. – Introduction to the aspiring architect Web Cast series

http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032380836&Culture=en-CA

June 17th, 2008 – 12:00 p.m. to 1:00 p.m. – Services Oriented Architecture and Enterprise Service Bus – Beyond the hype

http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032380838&Culture=en-CA

June 18th, 2008 – 12:00 p.m. to 1:00 p.m. – TOGAF and Zachman, a real-world perspective

http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032380840&Culture=en-CA

June 19th, 2008 – 12:00 p.m. to 1:00 p.m. – Services Oriented Architecture (Web Cast in French)

http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032380842&Culture=en-CA

June 20th, 2008 – 12:00 p.m. to 1:00 p.m. – Interoperability (Web Cast in French)

http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032380844&Culture=fr-CA

June 23rd, 2008 – 12:00 p.m. to 1:00 p.m. – Realizing dynamic systems

http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032380846&Culture=en-CA

June 24th, 2008 – 12:00 p.m. to 1:00 p.m. – Web 2.0, beyond the hype

http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032380848&Culture=en-CA

June 25th, 2008 – 12:00 p.m. to 1:00 p.m. – Architecting for the user experience

http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032380850&Culture=en-CA

June 26th, 2008 – 12:00 p.m. to 1:00 p.m. – Conclusion and next steps

http://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032380852&Culture=en-CA

Live vs. Google: My Personal Verdict

About a month ago, Live.com released an update to their search engine and I took it upon myself to write down my observations of Live as compared to Google. Although the features seemed to be a pretty good leap in many areas, I concluded that the only way to see which one was better was to change my default engine to Live.com for a week or two and give it a try.

Well, it's been a month now and Live.com is still my search engine. That's not to say I stopped using Google entirely, however. I would say that perhaps 10% or less of the time, I felt frustrated by not finding what I was looking for on Live.com and cross-searched on Google.com. Of those cases, I would say only half of the time I found something useful on Google.com that wasn't found on Live.com. These aren't hard numbers, just an anecdotal feel from my experience. When Google.com was my default search engine, my failed searches were probably at about the same rate, as was Live.com's ability to turn up something useful where Google.com did not.

So the verdict for me is that for core text searches, the differences were negligible. At no time did I feel that Live.com's performance was slower, and I always found the image, video & map search to be superior on Live.com. These features alone are reason enough for me to leave Live.com as my default for the time being.

ObjectSharp is going to experiment with some Live.com ads. I'm not expecting any kind of serious traffic to come through ads on Live.com so this is mostly a research project to get familiar with their ad engine.

Live Search is actually getting pretty good these days!

Live Search got a refresh recently and it's actually pretty good; dare I say, maybe even better than Google.

  • Performance: The load times are very snappy now, and it feels pretty much on par with Google's performance.
  • Image Search: Functionality in Image Search has been superior for some time, with the ability to size the thumbnails, and I quite like the "add to scratchpad" feature but wish it was available for video and web content as well. The ability to refine the search by size, aspect ratio, colour, style (photo vs. illustration) and face (i.e. head & shoulders, just face, etc.) is brilliant. It's not 100% accurate, but it is still quite useful. Google can do a subset of this, but frankly, I never noticed it until I went looking for it. These features are more discoverable in Live. I have to say that I'm in love with the fact that I can preview the context of the pages split-screen so I don't leave my list of results. As far as relevancy goes, I'm going to have to give it a try for a week. Google does a better job finding pictures of me, but Live does a better job of finding black & whites of Darryl Sittler. Live found 12 good b&w's of Sittler while Google found absolutely nothing of relevance. Unfortunately, Live fails my vanity image search and Google prevails with more pictures of me.
  • Video Search: I honestly had never used video search on Live or Google until this blog entry. Live has a nice feature where it can play videos from the results page just by hovering over the thumbnails. It can't do it for every video type - some you have to click on - but it is a lovely feature when it works, which from what I can tell is about 60% of the time. Google only manages to get 10 thumbnails per page where Live gets twice that. Live brought back a few extra total results. It seems like Google favours primarily YouTube.
  • Map Search: Mileage seems to vary greatly depending on where you are searching. Live has the great integrated 3D maps with buildings, contoured terrain and bird's eye view vs. Google with Street View as its killer feature. For Canadians, Google hasn't made any of these features available yet. Live has thrown us Canadians a few bones with bird's eye view available in Vancouver, Calgary, Toronto, and Montreal. Live also gives us some traffic data in Vancouver, Toronto, Montreal, and Ottawa.

    Live Maps has some great directional tools as well. I like the feature where I can just click on a point and ask for 1-click directions - which will give me generic instructions for coming to a location from all major available directions/arteries. This is great if you want to include directions on an invitation but you don't know where everybody is coming from. The Live Maps collection editor and the ability to easily add pushpins and draw (with distances) on the map is great.

    One thing I do like about Google Maps is the ability to take a set of directions and drag segments around to tell it how I really want to get somewhere. All in all though, as a Canadian I much prefer Live Maps.


  • Plain Old Web Relevancy: Plain old web results are perhaps the most important criteria. It will be impossible to provide any meaningful observations after spending less than an hour using Live Search, so I'm going to give Live a shot as my default search engine for the next week and see how it does, but here are a few immediate observations:
    • Live wins on relevancy for the "Barry Gervin" vanity search, easily. By the bottom of the first page (items 7-9), Google is showing results that are irrelevant - in fact, there's no mention of Barry or Gervin on any of those pages.
    • Searching for some things that I know are on MSDN Forums - posted yesterday - Google seems to be crawling that site faster than Live. Google still has a dedicated newsgroups search engine. I used to use this a lot more than I currently do, so I have mixed emotions about Live Search here. In fact, in some of my tests for searching things that I knew existed in newsgroups, Live Search actually referred me to Google Groups - so it does appear that Live is indexing the Google Groups search engine. I think the thing I find most appealing about Google Groups is not the content that it finds, but the way it presents it, showing me what group a result was found in, and then making the detailed results essentially a web-based NNTP client. Live could do much more than it is doing now, that's for sure; I'm just not sure that it really matters.
    • I'm making more and more use of Google Alerts to have things I am constantly tracking or researching searched and emailed to me. Live has no equivalent to this.

I'm going to give Live Search a trial as my default search engine for the next week or so and see how it goes. I'm optimistic.

The Entity Framework vs. The Data Access Layer (Part 1: The EF as a DAL)

In Part 0: Introduction of this series, after asking the question "Does the Entity Framework replace the need for a Data Access Layer?", I went on at length about the qualities of a good data access layer. Since that time I've received quite a few emails from people interested in this topic. So without further ado, let's get down to the question at hand.

So let's say you go ahead and create an Entity Data Model (*.edmx) in Visual Studio and have the designer generate for you a derived ObjectContext class and, for each of your tables, an entity class derived from EntityObject. This one-to-one mapping of tables to entity classes is quite similar to LINQ to SQL, but the mapping capabilities move well beyond this to support advanced data models. This is at the heart of why the EF exists: Complex Types, Inheritance (Table per Type, Table per Inheritance Hierarchy), Multiple Entity Sets per Type, a Single Entity Mapped to Two Tables, Entity Sets mapped to Stored Procedures, or mapping to a hand-crafted query expressed as either SQL or Entity SQL. The EF has a good story for a conceptual model over top of our physical databases, using XML magic in the form of the edmx file - and that's why it exists.

So to use the Entity Framework as your data access layer, define your model and then let the EdmGen.exe tool do its thing with the edmx file at compile time: we get the csdl, ssdl, and msl files - plus the all-important code-generated entity classes. Using this pattern with the Entity Framework, our data access layer is complete. It may not be the best option for you, so let's explore the qualities of this solution.

To be clear, the assumption here is that our data access layer in this situation is the full EF Stack: ADO.NET Entity Client, ADO.NET Object Services, LINQ to Entities, including our model (edmx, csdl, ssdl, msl) and the code generated entities and object context. Somewhere under the covers there is also the ADO.NET Provider (SqlClient, OracleClient, etc.)


To use the EF as our DAL, we would simply execute code similar to this in our business layer.

var db = new AdventureWorksEntities();
var activeCategories = from category in db.ProductCategory
                       where category.Inactive != true
                       orderby category.Name
                       select category;

How Do "EF" Entities Fit In?

If you're following along, you're probably asking exactly where the query code above should be placed. For the purposes of our discussion, "business layer" could mean a business object or some sort of controller. The point to be made here is that we need to think of Entities as something entirely different from our Business Objects.

Entity != Business Object

In this model, it is up to the business object to ask the Data Access Layer to project entities, not business objects, but entities.

This is one design pattern for data access, but it is not the only one. A conventional business object that contains its own data, and does not separate that out into an entity, can suffer from tight bi-directional coupling between the business and data access layers. Consider a Customer business object with a Load method. Customer.Load() would in turn instantiate a data access component, CustomerDac, and call the CustomerDac's Load or Fill method. To encapsulate all the data access code needed to populate a Customer business object, the CustomerDac.Load method would require knowledge of the structure of the Customer business object, and hence a circular dependency would ensue.
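To make that coupling concrete, here's a rough sketch of the two classes pointing at each other (the Customer and CustomerDac shapes are simplified and purely illustrative, not generated code):

// Business layer assembly
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }

    public void Load(int id)
    {
        // The business object depends on the data access component...
        var dac = new CustomerDac();
        dac.Load(this, id);
    }
}

// Data access layer assembly
public class CustomerDac
{
    public void Load(Customer customer, int id)
    {
        // ...and the data access component needs to know the shape of the
        // Customer business object in order to populate it - a circular
        // dependency if these live in separate assemblies.
        customer.Id = id;
        customer.Name = "(populated from the database here)";
    }
}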

The workaround, if you can call it that, is to put the business layer and the data access layer in the same assembly - but there goes decoupling, unit testing and separation of concerns out the window.

Another approach is to invert the dependency. The business layer would contain data access interfaces only, and the data access layer would implement those interfaces, and hence have a reverse dependency on the business layer. Concrete data access objects are instantiated via a factory, often combined with configuration information used by an Inversion of Control container. Unfortunately, this is not all that easy to do with the EF generated ObjectContext & Entities.
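A minimal sketch of that inversion, reusing the Customer business object from the sketch above - the ICustomerRepository and SqlCustomerRepository names are mine for illustration, and the factory/IoC wiring is omitted:

// Business layer assembly: owns the interface only.
public interface ICustomerRepository
{
    Customer GetById(int id);
}

public class CustomerService
{
    private readonly ICustomerRepository _repository;

    // The concrete repository is supplied by a factory or an IoC container,
    // so the business layer never references the data access assembly.
    public CustomerService(ICustomerRepository repository)
    {
        _repository = repository;
    }

    public Customer LoadCustomer(int id)
    {
        return _repository.GetById(id);
    }
}

// Data access layer assembly: implements the interface, and therefore
// depends on the business layer - the dependency is inverted.
public class SqlCustomerRepository : ICustomerRepository
{
    public Customer GetById(int id)
    {
        // Query the database (via EF, classic ADO.NET, etc.) and map the
        // results onto a Customer business object here.
        return new Customer { Id = id };
    }
}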

Or, you do as the Entity Framework implies and separate entities from your business objects. If you've used typed DataSets in the past, this will seem familiar to you. Substitute ObjectContext for SqlConnection and SqlDataAdapter, and the pattern is pretty much the same.

Your UI presentation layer is likely going to bind to your entity classes as well. This is an important consideration. The generated entity classes are partial classes and can be extended with your own code. The generated properties (columns) on an entity also have hooks for changing and changed events, so you can wire those up to perform some column-level validation, as sketched below. Notwithstanding, you may want to limit your entity customizations to simple validation and keep the serious business logic in your business objects. One of these days, I'll do another blog series on handling data validation within the Entity Framework.
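For example, the generated classes expose partial method hooks such as On<Property>Changing that you can implement in your own half of the partial class. Something along these lines, assuming a hypothetical ProductCategory entity with a Name property:

using System;

// Our half of the designer-generated partial class (hypothetical entity).
public partial class ProductCategory
{
    // Called by the generated Name property setter before the value changes;
    // a reasonable spot for simple, column-level validation only.
    partial void OnNameChanging(string value)
    {
        if (string.IsNullOrEmpty(value))
            throw new ArgumentException("Category name cannot be empty.");

        if (value.Length > 50)
            throw new ArgumentException("Category name is limited to 50 characters.");
    }
}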

How does this solution stack up?

How are database connections managed?

Thumbs up: Using the Entity Framework natively, the ObjectContext takes care of opening and closing connections for you - as needed when queries are executed, and during a call to SaveChanges. You can get access to the native ADO.NET connection if need be, to share a connection with other non-EF data access logic. The nice thing, however, is that for the most part connection strings and connection management are abstracted away from the developer.

Thumbs down: A word of caution, however. Because the ObjectContext will create a native connection, you should not wait for the garbage collector to free that connection up, but rather ensure that you dispose of the ObjectContext either explicitly or with a using statement.
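A minimal sketch of that, using the AdventureWorksEntities context from the earlier example:

using System.Linq;

// Scoping the ObjectContext with a using block releases the underlying
// connection deterministically instead of waiting on the garbage collector.
using (var db = new AdventureWorksEntities())
{
    var activeCategories = (from category in db.ProductCategory
                            where category.Inactive != true
                            orderby category.Name
                            select category).ToList();

    // Note the ToList(): materialize the results before the context
    // (and its connection) is disposed at the end of the block.
}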

Are all SQL Queries centralized in the Data Access Layer?

Thumbs down: By default the Entity Framework dynamically generates store-specific SQL on the fly, and therefore the queries are not statically located in any one central place. Even to understand the possible queries, you'd have to walk through all of your business code that hits the Entity Framework to understand all of the potential queries.

But why would you care? If you have to ask that question, then you don't care. But if you're a DBA, charged with the job of optimizing queries, making sure that your tables have the appropriate indices, then you want to go to one central place to see all these queries and tune them if necessary. If you care strongly enough about this, and you have the potential of other applications (perhaps written in other platforms), then you likely have already locked down the database so the only access is via Stored Procedures and hence the problem is already solved.

Let's remind ourselves that sprocs are not innately faster than dynamic SQL, however they are easier to tune and you also have the freedom of using T-SQL and temp tables to do some pre-processing of data prior to projecting results - which sometimes can be the fastest way to generate some complex results. More importantly, you can revoke all permissions to the underlying tables and only grant access to the data via Stored Procedures. Locking down a database with stored procedures is almost a necessity if your database is oriented as a service, acting as an integration layer between multiple client applications. If you have multiple applications hitting the same database, and you don't use stored procedures - you likely have bigger problems. 

In the end, this is not an insurmountable problem. If you are already using Stored Procedures, then by all means you can map those in your EDM. This seems like the best approach, but you could also embed SQL Server (or other provider) queries in your SSDL using a DefiningQuery.
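For what it's worth, once a stored procedure is mapped as a function import in the EDM, calling it from the business layer looks just like calling any other method on the generated ObjectContext. Something like the following, where GetActiveProductCategories is a hypothetical function import, not part of the sample model above:

using System.Linq;

using (var db = new AdventureWorksEntities())
{
    // A stored procedure mapped as a function import surfaces as a strongly
    // typed method on the generated context, returning entity instances.
    var categories = db.GetActiveProductCategories().ToList();
}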

Do changes in one part of the system affect others?

It's difficult to answer this question without talking about the possible changes.

Thumbs up: Schema Changes: The conceptual model and the mapping flexibility, even under complex scenarios, is a strength of the Entity Framework. Compared to other technologies on the market, with the EF your chances are as good as they're going to get that a change in the database schema will have minimal impact on your entity model, and vice versa.

Thumbs up: Database Provider Changes: The Entity Framework is database agnostic. Its provider model allows for easily changing from SQL Server, to Oracle, to MySQL, etc. via connection strings. This is very helpful for ISVs whose product must support running on multiple back-end databases.

Thumbs down: Persistence Ignorance: What if the change you want in one part of the system is to change your ORM technology? Maybe you don't want to persist to a database, but instead call a CRUD web service. In this pure model, you won't be happy. Both your entities and your ObjectContext inherit from base classes in the Entity Framework's System.Data.Objects namespace. With references to these littered throughout your business layer, decoupling yourself from the Entity Framework will not be an easy task.

Thumbs down: Unit Testing: This is only loosely related to the question, but you can't talk about persistence ignorance without talking about unit testing. Because the generated entities do not support the use of Plain Old CLR Objects (POCO), this data access model is not easily mocked for unit testing.

Does the DAL simplify data access?

Thumbs up: Dramatically. Compared to classic ADO.NET, LINQ queries can be used for typed results and parameters, complete with IntelliSense against your conceptual model, and with no worries about SQL injection attacks.

Thumbs up: As a bonus, what you also get is query composition across your domain model. Usually version 1.0 of a conventional non-ORM data access layer provides components for each entity, each supporting CRUD behaviour. Consider a scenario where you need to show all of the Customers within a territory, and then you need to show the last 10 orders for each Customer. Now I'm not saying you'd do this, but what I've commonly seen is that while somebody might write a CustomerDac.GetCustomersByTerritory() method, and they might write an OrderDac.GetLastTenOrders(), they would almost never write an OrderDac.GetLastTenOrdersForCustomersInTerritory() method. Instead they would simply iterate over the collection of customers found by territory and call GetLastTenOrders() over and over again. Obviously this is "good" reuse of the data access logic, however it does not perform very well.

Fortunately, through query composition and eager loading, we can cause the Entity Framework (or even LINQ to SQL) to use a nested subquery to bring back the last 10 orders for each customer in a given territory in a single round trip, single query. Wow! In a conventional data access layer you could, and should write a new method to do the same, but by writing yet another query on the order table, you'd be repeating the mapping between the table and your objects each time.
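Here's a rough sketch of what that composed query might look like, assuming a model with Customer and Order entity sets, a Territory navigation property, and an Orders collection (none of which are in the AdventureWorks snippet above):

using System.Linq;

using (var db = new AdventureWorksEntities())
{
    // One LINQ query: customers in a territory, each projected together with
    // their last 10 orders. EF can translate this into a single store query
    // with a nested subquery rather than N+1 round trips.
    var customersWithRecentOrders =
        (from customer in db.Customer
         where customer.Territory.Name == "Northwest"
         select new
         {
             Customer = customer,
             LastTenOrders = customer.Orders
                                     .OrderByDescending(o => o.OrderDate)
                                     .Take(10)
         }).ToList();
}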

Layers, Schmayers: What about tiers?

Thumbs down: EDM-generated entity classes are not very tier-friendly. The state of an entity - whether it is modified, new, or to be deleted, and which columns have changed - is managed by the ObjectContext. Once you take an entity and serialize it out of process to another tier, it is no longer tracked for updates. While you can re-attach an entity that was serialized back into the data access tier, because the entity itself does not serialize its changed state (a.k.a. a diffgram), you cannot easily achieve full round-trip updating in a distributed system. There are techniques for dealing with this (one is sketched below), but it is going to add some plumbing code between the business logic and the EF... and make you wish you had a real data access layer, or something like Danny Simmons' EntityBag (or a DataSet).
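As a rough illustration of the kind of plumbing involved, here's one way the v1 API lets you re-attach a detached entity, assuming you kept (or re-fetched) the original copy alongside the modified one coming back from the client. Treat this as a sketch under those assumptions, not the one true pattern:

// 'original' is the entity as it left the data tier; 'modified' is the
// copy that came back from the client tier with the user's edits.
public void UpdateCategory(ProductCategory original, ProductCategory modified)
{
    using (var db = new AdventureWorksEntities())
    {
        // Re-attach the original values to the context...
        db.AttachTo("ProductCategory", original);

        // ...then copy the client's values over top; the context compares
        // them and marks only the changed properties as modified.
        db.ApplyPropertyChanges("ProductCategory", modified);

        db.SaveChanges();
    }
}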

Does the Data Access Layer support optimistic concurrency?

Thumbs up: Out of the box, yes, handily - thanks to the ObjectContext tracking state and the change tracking hooks injected into our code-generated entity properties. However, keep in mind the caveat with distributed systems: you'll have more work to do if your UI is separated from your data access layer by one or more tiers.
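A minimal sketch of handling a concurrency violation, assuming at least one property in the model has its ConcurrencyMode set to Fixed (the ProductCategoryID property name is just for illustration):

using System.Data;          // OptimisticConcurrencyException
using System.Data.Objects;  // RefreshMode
using System.Linq;

using (var db = new AdventureWorksEntities())
{
    var category = db.ProductCategory.First(c => c.ProductCategoryID == 1);
    category.Name = "Touring Bikes";

    try
    {
        db.SaveChanges();
    }
    catch (OptimisticConcurrencyException)
    {
        // Somebody else changed the row since we read it. Here we choose
        // "client wins": refresh the store values, keep our changes, retry.
        db.Refresh(RefreshMode.ClientWins, category);
        db.SaveChanges();
    }
}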

How does the Data Access Layer support transactions?

Thumbs up: Because the Entity Framework builds on top of ADO.NET providers, transaction management doesn't change very much. A single call to ObjectContext.SaveChanges() will open a connection and perform all inserts, updates, and deletes across all entities that have changed, across all relationships, all in the correct order... and, as you can imagine, in a single transaction. To make transactions more granular than that, call SaveChanges more frequently or have multiple ObjectContext instances, one for each unit of work in progress. To broaden the scope of a transaction, you can manually enlist using a native ADO.NET provider transaction or by using System.Transactions.
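For example, a TransactionScope can wrap the SaveChanges call along with any other work that should commit or roll back together - a sketch:

using System.Transactions;

// Broaden the transaction beyond a single SaveChanges by wrapping the work
// in a TransactionScope; the EF connection enlists in the ambient transaction.
using (var scope = new TransactionScope())
using (var db = new AdventureWorksEntities())
{
    // ...make changes to entities here, possibly alongside other ADO.NET
    // work whose connections enlist in the same scope...
    db.SaveChanges();

    // Nothing commits until the scope is completed.
    scope.Complete();
}

Keep in mind that enlisting more than one connection in the same scope can escalate it to a distributed transaction via the DTC.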

Some References on Extensible XML Schema Design

This topic came up in a WCF training session (i.e. ObjectSharp's WCF course) I am currently delivering so I wanted to post a few references on the subject here:

The first is "Designing Extensible, Versionable XML Formats" by Dare Obasanjo
http://www.xml.com/pub/a/2004/07/21/design.html?page=1

The second, slightly more academic treatment is "Extensibility, XML Vocabularies, and XML Schema", by David Orchard, an extensibility guru who has much to say on the topic:
http://www.xml.com/pub/a/2004/10/27/extend.html

And then finally a series of writings by Tim Ewald which are chock full of extremely powerful Versioning advice:

First his "Services and the Versioning Problem" article in the MS Architecture Journal:
http://msdn2.microsoft.com/en-us/arcjournal/bb245674.aspx

And then a series of blog entries:
http://pluralsight.com/blogs/tewald/archive/2006/04/14/21733.aspx
http://pluralsight.com/blogs/tewald/archive/2006/04/25/22704.aspx
http://pluralsight.com/blogs/tewald/archive/2006/05/05/22961.aspx
http://pluralsight.com/blogs/tewald/archive/2006/05/05/22963.aspx

The Entity Framework vs. The Data Access Layer (Part 0: Introduction)

So the million-dollar question is: does the Entity Framework replace the need for a Data Access Layer? If not, what should my Data Access Layer look like if I want to take advantage of the Entity Framework? In this multi-part series, I hope to explore my thoughts on this question. I don't think there is a single correct answer. Architecture is about trade-offs, and the choices you make will be based on your needs and context.

In this first post, I provide some background on the notion of a Data Access Layer as a frame of reference and, specifically, identify the key goals and objectives of a Data Access Layer.

While Martin Fowler didn't invent the pattern of layering in enterprise applications, his Patterns of Enterprise Application Architecture is a must-read on the topic. Our goals for a layered design (which may often need to be traded off against each other) should include:

  • Changes to one part or layer of the system should have minimal impact on other layers of the system. This reduces the maintenance involved in unit testing, debugging, and fixing bugs and in general makes the architecture more flexible.
  • Separation of concerns between user interface, business logic, and persistence (typically in a database) also increases flexibility, maintainability and reusability.
  • Individual components should be cohesive and unrelated components should be loosely coupled. This should allow layers to be developed and maintained independently of each other using well-defined interfaces.

Now to be clear, I'm talking about a layer, not a tier. A tier is a node in a distributed system, which may include one or more layers. When I refer to a layer, I'm referring only to the logical separation of code that serves a single concern, such as data access. It may or may not be deployed into a separate tier from the other layers of a system. We could then begin to fly off on tangential discussions of distributed systems and service-oriented architecture, but I will do my best to keep this discussion focused on the notion of a layer. There are several layered application architectures, but almost all of them in some way include the notion of a Data Access Layer (DAL). The design of the DAL will be influenced by whether the application architecture distributes the DAL into a separate tier.

In addition to the goals of any layer mentioned above, there are some design elements specific to a Data Access Layer common to the many layered architectures:

  • A DAL in our software provides simplified access to data that is stored in some persisted fashion, typically a relational database. The DAL is utilized by other components of our software so those other areas of our software do not have to be overly concerned with the complexities of that data store.
  • In object- or component-oriented systems, the DAL typically will populate objects, converting rows and their columns/fields into objects and their properties/attributes (see the sketch after this list). This allows the rest of the software to work with data in an abstraction that is most suitable to it.
  • A common purpose of the DAL is to provide a translation between the structure or schema of the store and the desired abstraction in our software. As is often the case, the schema of a relational database is optimized for performance and data integrity (i.e. 3rd normal form), but this structure does not always lend itself well to the conceptual view of the real world or the way a developer may want to work with the data in an application. A DAL should serve as a central place for mapping between these domains so as to increase the maintainability of the software and provide isolation between changes in the storage schema and/or the domain of the application software. This may include the marshalling or coercing of differing data types between the store and the application software.
  • Another frequent purpose of the DAL is to provide independence between the application logic and the storage system itself, such that if required, the storage engine could be switched for an alternative with minimal impact on the application layer. This is a common scenario for commercial software products that must work with different vendors' database engines (i.e. MS SQL Server, IBM DB2, Oracle, etc.). With this requirement, sometimes alternate DALs are created for each store so they can be swapped out easily. This is commonly referred to as Persistence Ignorance.
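To ground that a bit, here's a stripped-down example of the kind of translation a hand-rolled DAL ends up doing - the CUST table, its columns and the Customer class are all made up for illustration:

using System.Collections.Generic;
using System.Data.SqlClient;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CustomerDac
{
    private readonly string _connectionString;

    public CustomerDac(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Translates rows in the CUST table into Customer objects so the rest
    // of the application never has to see the storage schema.
    public List<Customer> GetByTerritory(int territoryId)
    {
        var customers = new List<Customer>();

        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "SELECT CUST_ID, CUST_NAME FROM CUST WHERE TERRITORY_ID = @territoryId",
            connection))
        {
            command.Parameters.AddWithValue("@territoryId", territoryId);
            connection.Open();

            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    customers.Add(new Customer
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }

        return customers;
    }
}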

Getting a little more concrete, there are a host of other issues that also need to be considered in the implementation of a DAL:

  • How will database connections be handled? How will their lifetime be managed? A DAL will have to consider the security model. Will individual users connect to the database using their own credentials? This may be fine in a client-server architecture where the number of users is small. It may even be desirable in those situations where there is business logic and security enforced in the database itself through the use of stored procedures, triggers, etc. It may, however, run incongruent to the scalability requirements of a public-facing web application with thousands of users. In these cases, a connection pool may be the desired approach.
  • How will database transactions be handled? Will there be explicit database transactions managed by the data access layer, or will automatic or implied transaction management systems such as COM+ Automatic Transactions or the Distributed Transaction Coordinator be used?
  • How will concurrent access to data be managed? Most modern application architectures will rely on optimistic concurrency to improve scalability. Will it be the DAL's job to manage the original state of a row in this case? Can we take advantage of SQL Server's rowversion (timestamp) column, or do we need to track every single column?
  • Will we be using dynamic SQL or stored procedures to communicate with our database?

As you can see, there is much to consider just in generic terms, well before we start looking at specific business scenarios and the wacky database schemas that are out in the wild. All of these things can and should influence the design of your data access layer and the technology you use to implement it. In terms of .NET, the Entity Framework is just one data access technology. MS has been kind enough to bless us with many others, such as LINQ to SQL, DataReaders, DataAdapters & DataSets, and SQLXML. In addition, there are over 30 third-party Object Relational Mapping tools available to choose from.

OK, so if you're not familiar with the design goals of the Entity Framework (EF), you can read all about it here or watch a video interview on Channel 9 with Pablo Castro, Britt Johnson, and Michael Pizzo. A year after that interview, they did a follow-up interview here.

In the next post, I'll explore the idea of the Entity Framework replacing my data access layer and evaluate how this choice rates against the various objectives above. I'll then continue to explore alternative implementations for a DAL using the Entity Framework.

Visual Studio 2008 and Windows Server 2008 for aspiring Architects

Are you an architect or an aspiring architect interested in learning what's new this year for the MS platform?

On February 11th, 2008 come to the MS Canada Office in Mississauga and visit yours truly and some other dazzling speakers to learn more about Visual Studio 2008 and Windows Server 2008.

You can choose either the morning or afternoon session as your schedule permits. Only a 1/2 day out of your busy schedule and you'll know everything you need! Ok, well maybe not everything, but I hope that you'll be inspired to take the next steps in learning about technologies such as LINQ, Windows Communication Foundation and Windows Presentation Foundation and how you can make the most of these technologies in your applications.

Register Here.

TSPUG Presentation: Using Visual Studio 2008 for Developing SharePoint Workflows (and other stuff)

Last night I gave a presentation to the Toronto SharePoint Users Group on using Visual Studio 2008 to build SharePoint Workflows. I also covered a little bit on LINQ to SharePoint and WCF/WF integration at the end. Attached are my slides. Enjoy.