Architecture with Visual Studio 2010 At The Movies

This is just a quick follow-up post from my demo at Visual Studio 2010 At The Movies last week. Thanks to everybody who came out, especially those from out of town.

I didn’t use any slides, but I have thought about recording my demo here. I may still get to that; in the meantime, here are a few resources that will help you learn more about the architecture tools in Visual Studio 2010 Ultimate Edition:

  • The .NET Pet Shop that I used as a sample is available for download.
  • Walkthrough: MSDN How-To’s on Modeling in 2010
  • Blogs: Cameron Skinner, Peter Provost and, last but not least, Chris Lovett, who has some awesome video demos and tips for dealing with large diagrams. He also provides samples for those who are interested in learning more about Directed Graph Markup Language (DGML) for creating their own diagram generators.
  • The Patterns and Practices Team has released a set of templated layer diagrams for various reference architectures.

Join ObjectSharp for Silverlight on the Silver Screen – July 9 – Scotiabank Theatre Toronto

Silverlight 3 will soon be released.  And to properly celebrate the excitement of its release, ObjectSharp is teaming up with Microsoft to present an action-packed first look at the UX3 platform, live from the Scotiabank Theatre in Toronto. 

As one of the first companies to be featured on Microsoft’s Silverlight gallery, our consultants will share with you their deep knowledge of the next generation of tools.  Whether you are a designer, developer, or purely a marketing geek, you will not want to miss this blockbuster event.  You will see feature-rich demonstrations of Silverlight, Expression Blend, SketchFlow, and  Windows 7 touch technology.  You will also see how these tools can be used to dazzle your customers and gain attention for your brand.

For Developers and Designers:

  • See in-depth demonstrations of Silverlight 3, Expression Blend, and Windows 7 touch technology.
  • Learn how to quickly design user interactions with Microsoft SketchFlow
  • Take Designer/Developer work flow to the next level with Visual Studio Team System
  • Learn how to cut your boss’s head off and paste it onto other people’s bodies with Expression Studio


For CTOs and Marketing Managers

  • Understand the benefits of creating line-of-business applications with Silverlight and .NET RIA Services
  • Learn how to integrate Rich Media and Advertising with the Microsoft Platform
  • See Touch technology and natural user interfaces bring kiosk applications to life with Windows 7 and WPF

Technologies You Will See:

  • Silverlight 3 featuring WPF & XAML
  • Expression Blend 3 featuring SketchFlow
  • Windows 7 featuring Touch
  • Microsoft Office SharePoint Server 2007 (MOSS) for external-facing web sites
  • Visual Studio 2010 Team System

Register Online   |   Watch the Movie Trailer

Generic Implementation of INotifyPropertyChanged on ADO.NET Data Services (Astoria) Proxies with T4 Code Generation

Last week, Mike Flasko from the ADO.NET Data Services (Astoria) team blogged about what’s coming in V1.5, which will ship prior to VS 2010. I applaud these out-of-band releases.

One of the new features is support for two-way data binding in the proxy classes generated by the client library. Out of the box, these classes currently do not implement INotifyPropertyChanged, nor do queries project into ObservableCollections.

Last week at the MVP Summit I had the chance to see a demo of this and other great things coming down the road from the broader Data Programmability Team. It seems like more and more teams are turning to T4 templates for code generation, which is great for our extensibility purposes. At first I was hopeful that the team had implemented these proxy generation changes by switching to T4 templates along with a corresponding “better” template. Unfortunately, this is not the case, and we won’t see any T4 templates in V1.5. It’s too bad – would it really have been that much more work to invest the time in T4 templates than to add new switches to datasvcutil and new code generation (along with testing that code)?

Anyway, after seeing some other great uses of T4 templates coming from product teams for VS 2010, I thought I would invest some of my own time to see if I couldn’t come up with a way of implementing INotifyPropertyChanged all on my own. The problem with the existing code gen is that while there are partial methods created and called for each property setter (i.e. FoobarChanged()), there is no generic event fired that would allow us to in turn raise an INotifyPropertyChanged.PropertyChanged event. So you could manually add this for each and every property on every class – but it’s tedious.

I couldn’t have been the first person to think of doing this, and after a bit of googling, I confirmed that. Alexey Zakharov’s post on generating custom proxies with T4 has been completely ripped off, er, inspirational in this derivative work. What I didn’t like about Alexey’s solution was that it completely overwrote the proxy client. I would have preferred a solution that just implemented the partial methods in a partial class to fire the PropertyChanged event. This way, any changes, improvements, etc. to the core MS code gen can still be expected down the road. Of course, Alexey’s template is the better solution if there are indeed other things you want to customize about the template in its entirety, should you find that what you need to accomplish can’t be done with a partial class.

What I did like about Alexey’s solution is that it uses the service itself to query the service metadata directly. I had planned on using reflection to accomplish the same thing, but in hindsight it would be difficult to generate a partial class for a class I’m currently reflecting on in the same project (of course). Duh.

So what do you need to do to get this solution working?

  1. Add the file to the project where you have your reference/proxies to the data service. You will want to make sure there is no custom tool associated with this file – it’s just included as a reference by the next one. This file wraps up all the calls to get the metadata. I’ve made a couple of small changes to Alexey’s version: added support for Byte and Boolean (fixing a typo in AZ’s).
  2. Copy the file to the same project. If you have more than one data service, you’ll want one of these files for each reference. So for starters you may want to rename it accordingly. You are going to need to edit this bad boy as well.
  3. There are two options you’ll need to specify inside the proxy template. The MetadataUri should be the URI of your service suffixed with $metadata. I’ve found that if your service is secured with integrated authentication, the metadata helper won’t pass those credentials along, so for the purposes of code generation you’d best leave anonymous access on. The second is the Namespace: you will want to use the same namespace used by your service reference. You might have to do a Show All Files and drill into the Reference.cs file to see exactly what that is.

     var options = new {
         MetadataUri = "http://localhost/ObjectSharpSample.Service/SampleDataService.svc/$metadata",
         Namespace = "ObjectSharp.SampleApplication.ServiceClient.DataServiceReference"
     };
That’s it. When you save your file, should everything work, you’ll have a .cs file generated that implements, through a partial class, the INotifyPropertyChanged interface. Something like…

public partial class Address : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    private void OnPropertyChanged(string property)
    {
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(property));
    }

    partial void OnAddressIdChanged() { OnPropertyChanged("AddressId"); }
    partial void OnAddressLine1Changed() { OnPropertyChanged("AddressLine1"); }
}

Technology Predictions and Trends for 2009

I’m sure we’re going to look back at 2009 and say “it was the best of times, it was the worst of times” and it will no doubt be interesting. Here are my predictions…

Social Networking Everywhere

Although online social networking companies are already struggling with diminished valuations, in 2009 we’ll see social networks break out of their silos and become essential platform elements that make their way into other online applications such as travel, e-commerce, job-posting boards, online dating services, CRM services, web-based email systems, etc. Blogging is also changing, slowing down in fact. Micro-blogging, with status-update-esque features in Facebook, Windows Live, and of course the explosion of Twitter, will take on an even larger role. It’s as true today as it was back in 1964 when fellow Canadian Marshall McLuhan wrote “The Medium Is The Message”.

The Death of Optical Media

Okay, so you’ll still be able to walk into a video store to rent a DVD or buy a spindle of blanks at your grocery store, but make no mistake about it – the death march is on, and that includes you too, Blu-ray. Blu-ray will never see the adoption curve that DVDs had. They thought they won when HD-DVD died, but if winning means dying last, then sure, you won. We’ll increasingly be renting our movies on demand through our cable boxes, on our converged PCs and Xbox 360s via services like Netflix. Along with this, the rest of us will start to realize we don’t really need to own our libraries of movies. With iPod penetration as high as it is, it may take longer to realize we don’t need to own our music either – frankly, we don’t own it anyway, even though the pricing models try to convince us we do. I won’t go out and predict the death of DRM; frankly, I think 2009 may be the year where DRM starts to get more tolerable once we are clearly renting our music and movies. The Zune Pass is making some inroads here, but until Apple starts offering similar subscription pricing, this may take a bit longer.

The MacBook Air may have been a bit ahead of the curve in dropping the optical drive, but get used to it. Expect more vendors to do the same as they reduce size or cram in additional batteries or hard drives.

The Rise of the NetBook

If 2009 is the year of doing more with less, then this will surely be the NetBook’s year. Mainstream hardware manufacturers hate these and their small profit margins, but Acer and Intel will be raking it in building market share if not large bottom lines. Who knows, MS may learn to love the NetBook if they can get Acer to start shipping Windows 7 on them this year as well. Be prepared to see these everywhere in 2009, but don’t expect to see Apple make one (ever).

Zune Phone

The big story at the end of 2008 has been the global suicide of the original Zune 30s. I predict that tomorrow they shall rise from the dead, though it might take until the 2nd for everybody to figure out that they need to entirely drain the battery. The big news is that there won’t be a Zune phone with the MS brand name on it, but the Zune UI will come to Windows Mobile (6.5?), turning legions of touch-based smart phones into music players almost as good as an iPhone. The bad news is that without an App Store to vet software quality, crapware will continue to be the source of reliability issues for the Windows Mobile platform. The good news is that without an App Store, Windows Mobile users will have lots of choice in the software for their devices, not to mention lots of choice in devices, carriers and plans. The battle between Good and Evil may morph into the battle between Reliability and Choice.

Touch Everywhere

Get your head out of the gutter; that’s not what I meant. What I did mean is that 12-24 months from now, it will be difficult to purchase a digital frame, LCD monitor or phone without an onscreen touch capability. Windows 7 will light these devices up, and we’ll stop thinking about the differences between Tablet PCs and notebooks as they converge into a single device. With the advent of Silverlight, WPF and Surface computing, MS has been banging the “user experience” drum for a while now, but when touch becomes the expectation and not the exception, we’ll have to re-engineer our applications to optimize for the touch experience. This may turn out to be bigger than the mouse, or even the windowed operating system.

Flush with Flash

In 2008 we’ve been teased with solid state hard drives, but with less-than-stellar performance at outrageous prices, they’ve been on the fringe. In 2009, prices and read/write times will both come down for solid state drives, but with the increased capacity of USB memory sticks (32 GB, 64 GB and up), we likely won’t see SSDs hitting the mainstream this year. Instead I think we’ll see an increase in the behavior of people keeping their entire lives on USB flash memory sticks. Hopefully we’ll see sync & backup software such as Windows Live Sync, ActiveSync, Windows Home Server, etc. become more aware of these portable memory devices that may get synced from any device in your mesh.

Camera flash will need a new format, as SDHC is currently maxed out at 32 GB. With the increasing demand for HD video recording on still and video cameras, we’ll need a new format. As such, we’re seeing rock-bottom prices on 2 GB cards now. Maybe somebody will come out with an SD RAID device that lets us plug in a bank of 2 GB SD cards.

Growing up in the Cloud

Cloud computing is going to be a very long term trend. I think we’ll only see baby steps in 2009 towards this goal. In the consumer space we’ll see more storage of digital media in the cloud, online backup services and the move of many applications to the cloud. Perfect for your Touch Zune Phone and Touch NetBook without an optical drive eh? IT shops will take a bit longer to embrace the cloud. Although many IT Data centers are largely virtualized already, applications are not all that virtual today and that doesn’t seem to be changing soon as developers have not whole-heartedly adopted SOA practices, addressed scalability and session management issues nor adopted concepts such as multi-tenancy. As we do more with less in 2009, we won’t see that changing much as a lot of software out there will be in “maintenance mode” during the recession.

Maybe, Just Maybe, this is the year of the Conveniently Connected Smart Client

Adobe AIR & Silverlight are mainstreaming web-deployed and web-updated rich client desktop apps. It’s hard to take advantage of touch interfaces and massive portable flash storage from within a browser. All of these other trends can influence Smart Client applications, potentially to a tipping point. We’ll hopefully see out-of-browser, cross-platform Silverlight applications in 2009 to make this an easy reality on the MS stack.

Incremental, Value-Based and Agile Software Development

Many of my customers began large-scale re-writes of their key software assets in 2008, many of them against my recommendations. For most of my key customers in 2008 and into 2009, I’m an advocate of providing incremental value in short iterative releases, not major re-writes that take 6+ months to develop. Even if your application is written in PowerBuilder 6 or Classic ASP, avoid the temptation to rewrite any code that won’t see production for 4 months or longer. We can work towards componentized software by refactoring legacy assets and providing key integration points so that we can release updated modules towards gradual migration. It is difficult for software teams in this economy to produce big-bang, “boil the ocean”, build-a-cathedral type projects. We simply can’t predict what our project’s funding will be 4 months from now, or if we’ll be owned by another company, scaled down, outsourced or just plain laid off. That is, of course, unless you work for the government. Government spending will continue, if not increase, in 2009, but still, try to spend our taxpayer money wisely by delivering short incremental software releases. It allows you to build trust with your customers, mark a line in the sand and move onward and upward, and lets you move quickly in times of fluid business requirements and funding issues.

Incremental, value-based software development isn’t easy. It takes lots of work, creative thinking, and more interop and integration work than one would prefer. It might easily seem like an approach that costs more in the long term, and in some cases you could be right. But if a company has to throw out work in progress after 6-8 months, or never sees the value of it because of other changing business conditions, then what have you saved? Probably not your job, anyway.

DevTeach Montreal December 1-5, 2008 – Coupon Enclosed

DevTeach Montreal is less than a month away but it’s not too late to register. This is a great conference with sessions covering .NET FX, Future, SQL Server, VSTS/Team System, Silverlight, Agile Development, Software Architecture, ASP.NET.

Aside from the great content, build up your professional network by rubbing shoulders with the speakers in an intimate conference. The list of speakers is particularly impressive this year. From MS you’ll get to see Elisa Flasko and Carl Perry from the Data Programmability Team, Beth Massi and Yair Alan Griver. Of course .NET Rockers Carl Franklin & Richard Campbell will also be there with fellow MS Regional Directors Tim Huckaby, Joel Semeniuk, Stephen Forte, Jim Duffy, Guy Barrette and yours truly.

But wait, there’s more. Each attendee will get full copies of Visual Studio 2008 Pro and Expression Web 2.0, along with the entire DVD set covering all sessions from this year’s TechEd 2008 Developers conference in Orlando. It’s Shamwow!

If you attended TechDays, your included coupon is now worth $350 off the price of DevTeach. If you didn’t, you can use the code TO000OBJSHARP, good for $50 off. Sign up here.

VSTS Development & Database Editions to Merge!

An early Christmas present from our friends in Redmond! Beginning October 1, 2008, subscribers to VSTS Developer Edition and Database Edition will have access to the additional SKU. This was reported this week over at the MSDN VS 2010 & .NET Framework 4.0 overview page, but you don’t have to wait until Visual Studio 2010 is released.

Well, that is just an awesome announcement, and kudos to MS for listening to the community feedback about the multiple hats people wear on projects. I’m sure that this will lead to more teams actually using the Database Edition functionality on their projects, which offers great source control integration for their database scripts, not to mention T-SQL unit testing and test data generation.

Aspiring Architect - Web cast series

With all this heat, I almost wrote "perspiring". Why not beat the heat, and stay cool inside while watching these web casts from MS Canada targeting aspiring architects. With the predicted shortage in IT in the upcoming years, we're sure to see an influx of junior resources into our industry. This is a good opportunity for developers to transition into architecture roles to leverage their existing skill set.

The Aspiring Architect Series 2008 builds on last year’s content and covers a number of topics that are important for architects to understand. So it would be a great idea to watch last year’s recordings if you haven’t already; links are available here.

Upcoming sessions are as follows:

June 16th, 2008 – 12:00 p.m. to 1:00 p.m. – Introduction to the aspiring architect Web Cast series

June 17th, 2008 – 12:00 p.m. to 1:00 p.m. – Services Oriented Architecture and Enterprise Service Bus – Beyond the hype

June 18th, 2008 – 12:00 p.m. to 1:00 p.m. – TOGAF and Zachman, a real-world perspective

June 19th, 2008 – 12:00 p.m. to 1:00 p.m. – Services Oriented Architecture (Web Cast in French)

June 20th, 2008 – 12:00 p.m. to 1:00 p.m. – Interoperability (Web Cast in French)

June 23rd, 2008 – 12:00 p.m. to 1:00 p.m. – Realizing dynamic systems

June 24th, 2008 – 12:00 p.m. to 1:00 p.m. – Web 2.0, beyond the hype

June 25th, 2008 – 12:00 p.m. to 1:00 p.m. – Architecting for the user experience

June 26th, 2008 – 12:00 p.m. to 1:00 p.m. – Conclusion and next steps

Follow up on Entity Framework talk at Tech Ed 2008

Last week at TechEd I gave a talk about building data access layers with the Entity Framework. I covered various approaches, from not having a data access layer at all to full encapsulation of the Entity Framework, and some hybrid approaches along the way.

I gave the first instance of this on Tuesday and then a repeat on Thursday.

To those who saw the first instance of this on Tuesday....

you unfortunately got an abbreviated and disjointed version, for which I apologize. After I queued up my deck about 15 minutes prior to the talk, I left the room for a minute while people filed in, and while I was out, one of the event staff shut down my deck and restarted it from a different folder on the recording machine without telling me. I was about a third of the way into my presentation when I realized that I had the wrong version of the deck. At the time, I had no idea why this version of the deck was running, so I wasn't going to fumble around looking for the correct one. Given the change in the order of things, I'm not sure if changing decks at that point would have made things better or worse. I still had no idea why this had happened when I gave the talk again on Thursday, but then the same thing almost happened again – this time I caught the event staff shutting down my deck and restarting it (from an older copy). Bottom line: sorry to those folks who saw the earlier version.

The complete deck and demo project is attached. It is a branch of the sample from the Entity Framework Hands-on Lab that was available at the conference, which is included in the .NET 3.5 Enhancements (aka SP1) training kit. You will need the database for that project, which is not included in my download.

Download the training kit here.


Visual Studio 2008 SP1 Beta & SQL Server 2008

A quick heads up to let you know that the VS 2008 Service Pack 1 Beta is now available (links below). It typically takes a couple of months from this point before we'll see a final release.

This Service Pack includes cool new features.

One interesting point is that MS is going to simultaneously ship SQL Server 2008, which actually has a hard dependency on SP1.

I thought I’d take a moment to highlight some new features in SQL Server 2008 that devs would care about.

  • Change Data Capture: Async “triggers” capture the before/after snapshot of row-level changes and write them to Change Tables that you can query in your app. They aren’t real triggers, as this asynchronously reads the transaction log.
  • Granular control of encryption, right through to the database level without any application changes required.
  • Resource Governor – very helpful when you allow users to write ad hoc queries/reports against your OLTP database. Allows a DBA to assert resource limits & priorities.
  • Plan Freezing – allows you to lock down query plans to promote stable query plans across disparate hardware, server upgrades, etc.
  • New Date and Time data types – no longer do you have to manually parse the date or the time out of a DateTime just to get the piece you actually want.
  • DateTimeOffset – a time-zone-aware datetime.
  • Table-Valued Parameters for procs – ever want to pass a result set as an arg to a proc? (See the C# sketch after this list.)
  • HierarchyId – a new system type for storing nodes in a hierarchy, implemented as a CLR user-defined type.
  • FileStream data type – allows blob-ish data to be surfaced in the database but physically stored on the NTFS file system, with complete transactional consistency with the relational data and backup integration.
  • New geographic data support – store spatial data such as polygons, points and lines, along with long/lat data types.
  • MERGE SQL statement – allows you to insert, or update if a row already exists.
  • New Reporting Services features – such as access to reports from within Word & Excel, and better SharePoint integration.
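Table-valued parameters deserve a quick illustration. Here’s a minimal, untested sketch of what the client-side call should look like from C#; the table type dbo.OrderIdList and the proc dbo.GetOrderDetails are hypothetical names you would have to create on the server first.

using System.Data;
using System.Data.SqlClient;

public static class TvpSample
{
    public static void GetOrderDetails(string connectionString)
    {
        // Build the rows to send; the column must match the table type's definition.
        var ids = new DataTable();
        ids.Columns.Add("OrderId", typeof(int));
        ids.Rows.Add(42);
        ids.Rows.Add(43);

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetOrderDetails", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            SqlParameter parameter = command.Parameters.AddWithValue("@OrderIds", ids);
            parameter.SqlDbType = SqlDbType.Structured;   // marks the parameter as a TVP
            parameter.TypeName = "dbo.OrderIdList";       // the server-side table type

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // consume the result set
                }
            }
        }
    }
}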

Personally, I haven't spent any time with SQL Server 2008 yet, but that's a great set of new features that I can hardly wait to start using in real-world applications.


  • VS 2008 SP1:

  • .NET 3.5 SP1:

  • Express 2008 with SP1:

  • TFS 2008 SP1:

Feedback Forum:


The Entity Framework vs. The Data Access Layer (Part 1: The EF as a DAL)

In Part 0: Introduction of this series, after asking the question "Does the Entity Framework replace the need for a Data Access Layer?", I waxed lengthy about the qualities of a good data access layer. Since that time I've received quite a few emails from people interested in this topic. So without further ado, let's get down to the question at hand.

So let's say you go ahead and create an Entity Data Model (*.edmx) in Visual Studio and have the designer generate for you a derived ObjectContext class and an entity class for each of your tables, derived from EntityObject. This one-to-one table-to-entity mapping is quite similar to LINQ to SQL, but the mapping capabilities move well beyond this to support advanced data models. This is at the heart of why the EF exists: Complex Types, Inheritance (Table per Type, Table per Inheritance Hierarchy), Multiple Entity Sets per Type, a Single Entity Mapped to Two Tables, Entity Sets mapped to Stored Procedures, or mapping to a hand-crafted query expressed as either SQL or Entity SQL. The EF has a good story for a conceptual model over top of our physical databases, using XML magic in the form of the edmx file – and that's why it exists.

So to use the Entity Framework as your data access layer, define your model and then let the EdmGen.exe tool do its thing to the edmx file at compile time, and we get the csdl, ssdl, and msl files – plus the all-important code-generated entity classes. Using this pattern of usage for the Entity Framework, our data access layer is complete. It may not be the best option for you, so let's explore the qualities of this solution.

To be clear, the assumption here is that our data access layer in this situation is the full EF Stack: ADO.NET Entity Client, ADO.NET Object Services, LINQ to Entities, including our model (edmx, csdl, ssdl, msl) and the code generated entities and object context. Somewhere under the covers there is also the ADO.NET Provider (SqlClient, OracleClient, etc.)


To use the EF as our DAL, we would simply execute code similar to this in our business layer.

var db = new AdventureWorksEntities();
var activeCategories = from category in db.ProductCategory
                       where category.Inactive != true
                       select category;

How Do "EF" Entities Fit In?

If you're following along, you're probably asking exactly where the query code above gets placed. For the purposes of our discussion, "business layer" could mean a business object or some sort of controller. The point to be made here is that we need to think of Entities as something entirely different from our Business Objects.

Entity != Business Object

In this model, it is up to the business object to ask the Data Access Layer to project entities, not business objects, but entities.

This is one design pattern for data access, but it is not the only one. A conventional business object that contains its own data, and does not separate that out into an entity, can suffer from tight bi-directional coupling between the business and data access layers. Consider a Customer business object with a Load method. Customer.Load() would in turn instantiate a data access component, CustomerDac, and call the CustomerDac's Load or Fill method. To encapsulate all the data access code needed to populate a Customer business object, the CustomerDac.Load method would require knowledge of the structure of the Customer business object, and hence a circular dependency would ensue.
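To make that circular dependency concrete, here's a rough sketch of the conventional pattern, using the hypothetical Customer/CustomerDac names from above:

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }

    public void Load(int id)
    {
        // The business object depends on the data access component...
        var dac = new CustomerDac();
        dac.Fill(this, id);
    }
}

public class CustomerDac
{
    public void Fill(Customer customer, int id)
    {
        // ...and the data access component needs to know the structure of
        // Customer to populate it, so it depends right back on the business
        // layer. There's the circular dependency.
        customer.Name = "(value read from the database)";
    }
}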

The workaround, if you can call it that, is to put the business layer and the data access layer in the same assembly - but there goes decoupling, unit testing and separation of concerns out the window.

Another approach is to invert the dependency. The business layer would contain data access interfaces only, and the data access layer would implement those interfaces, and hence have a reverse dependency on the business layer. Concrete data access objects are instantiated via a factory, often combined with configuration information used by an Inversion of Control container. Unfortunately, this is not all that easy to do with the EF generated ObjectContext & Entities.
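A sketch of that inversion, with illustrative names throughout; the repository interface lives in the business assembly, and the EF-specific implementation lives in the data access assembly:

using System.Linq;

// Lives in the business layer assembly:
public interface ICustomerRepository
{
    Customer GetById(int id);
}

// Lives in the data access assembly, which references the business layer:
public class EfCustomerRepository : ICustomerRepository
{
    public Customer GetById(int id)
    {
        using (var db = new AdventureWorksEntities())
        {
            // Query the EF entity, then map it onto the business object
            // (entity set and property names are illustrative).
            var entity = db.Customer.First(c => c.CustomerID == id);
            return new Customer { Id = entity.CustomerID, Name = entity.Name };
        }
    }
}

// A factory or IoC container then hands the business layer the concrete type,
// so it never references the data access assembly directly, e.g.:
//     ICustomerRepository repository = container.Resolve<ICustomerRepository>();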

Or, you do as the Entity Framework implies and separate entities from your business objects. If you've used typed DataSets in the past, this will seem familiar to you. Substitute ObjectContext for SqlConnection and SqlDataAdapter, and the pattern is pretty much the same.

Your UI presentation layer is likely going to bind to your Entity classes as well. This is an important consideration. The generated Entity classes are partial classes and can be extended with your own code. The generated properties (columns) on an entity also have partial methods created for changing and changed events, so you can wire those up to perform some column-level validation. Notwithstanding, you may want to limit your entity customizations to simple validation and keep the serious business logic in your business objects. One of these days, I'll do another blog series on handling data validation within the Entity Framework.
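A simple guard in your own partial class might look like this (the property name is illustrative):

using System;

public partial class Address
{
    // Called by the generated property setter before the new value is applied.
    partial void OnAddressLine1Changing(string value)
    {
        if (string.IsNullOrEmpty(value))
            throw new ArgumentException("AddressLine1 cannot be empty.");
    }
}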

How does this solution stack up?

How are database connections managed?

[Thumbs up] Using the Entity Framework natively, the ObjectContext takes care of opening & closing connections for you – as needed when queries are executed, and during a call to SaveChanges. You can get access to the native ADO.NET connection if need be, to share a connection with other non-EF data access logic. The nice thing, however, is that for the most part, connection strings and connection management are abstracted away from the developer.

[Thumbs down] A word of caution, however. Because the ObjectContext will create a native connection, you should not wait for the garbage collector to free that connection up, but rather ensure that you dispose of the ObjectContext either explicitly or with a using statement.
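In practice, that looks like this; note that the results are materialized with ToList() before the context (and its connection) goes out of scope:

using (var db = new AdventureWorksEntities())
{
    var activeCategories = (from category in db.ProductCategory
                            where category.Inactive != true
                            select category).ToList();
    // work with activeCategories...
}   // context disposed; the underlying connection is released here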

Are all SQL Queries centralized in the Data Access Layer?

[Thumbs down] By default the Entity Framework dynamically generates store-specific SQL on the fly, and therefore the queries are not statically located in any one central location. Even to understand the possible queries, you'd have to walk through all of your business code that hits the Entity Framework.

But why would you care? If you have to ask that question, then you don't care. But if you're a DBA, charged with the job of optimizing queries and making sure that your tables have the appropriate indices, then you want to go to one central place to see all these queries and tune them if necessary. If you care strongly enough about this, and you have the potential of other applications (perhaps written on other platforms) hitting the same database, then you have likely already locked down the database so the only access is via stored procedures, and hence the problem is already solved.

Let's remind ourselves that sprocs are not innately faster than dynamic SQL; however, they are easier to tune, and you also have the freedom of using T-SQL and temp tables to do some pre-processing of data prior to projecting results – which can sometimes be the fastest way to generate complex results. More importantly, you can revoke all permissions on the underlying tables and grant access to the data only via stored procedures. Locking down a database with stored procedures is almost a necessity if your database is oriented as a service, acting as an integration layer between multiple client applications. If you have multiple applications hitting the same database and you don't use stored procedures, you likely have bigger problems.

In the end, this is not an insurmountable problem. If you are already using Stored Procedures, then by all means you can map those in your EDM. This seems like the best approach, but you could also embed SQL Server (or other provider) queries in your SSDL using a DefiningQuery.

Do changes in one part of the system affect others?

It's difficult to answer this question without talking about the possible changes.

[Thumbs up] Schema Changes: The conceptual model and the mapping flexibility, even under complex scenarios, are a strength of the Entity Framework. Compared to other technologies on the market, with the EF your chances are as good as they're going to get that a change in the database schema will have minimal impact on your entity model, and vice versa.

[Thumbs up] Database Provider Changes: The Entity Framework is database agnostic. Its provider model allows for easily changing from SQL Server, to Oracle, to MySQL, etc. via connection strings. This is very helpful for ISVs whose product must support running on multiple back-end databases.

[Thumbs down] Persistence Ignorance: What if the change you want in one part of the system is to change your ORM technology? Maybe you don't want to persist to a database at all, but instead call a CRUD web service. In this pure model, you won't be happy. Both your entities and your ObjectContext inherit from base classes in the Entity Framework's System.Data.Objects namespace. With references to these littered throughout your business layer, decoupling yourself from the Entity Framework will not be an easy task.

[Thumbs down] Unit Testing: This is only loosely related to the question, but you can't talk about persistence ignorance without talking about unit testing. Because the generated entities are not Plain Old CLR Objects (POCO), this data access model is not easily mocked for unit testing.

Does the DAL simplify data access?

[Thumbs up] Dramatically. Compared to classic ADO.NET, LINQ queries give you typed results & parameters, complete with IntelliSense against your conceptual model, and no worries about SQL injection attacks.

[Thumbs up] As a bonus, what you do get is query composition across your domain model. Usually version 1.0 of a conventional non-ORM data access layer provides components for each entity, each supporting CRUD behaviour. Consider a scenario where you need to show all of the Customers within a territory, and then you need to show the last 10 orders for each Customer. Now I'm not saying you'd do this, but what I've commonly seen is that while somebody might write a CustomerDac.GetCustomersByTerritory() method, and they might write an OrderDac.GetLastTenOrders(), they would almost never write an OrderDac.GetLastTenOrdersForCustomersInTerritory() method. Instead they would simply iterate over the collection of customers found by territory and call GetLastTenOrders() over and over again. Obviously this is "good" reuse of the data access logic; however, it does not perform very well.

Fortunately, through query composition and eager loading, we can cause the Entity Framework (or even LINQ to SQL) to use a nested subquery to bring back the last 10 orders for each customer in a given territory in a single round trip, single query. Wow! In a conventional data access layer you could, and should write a new method to do the same, but by writing yet another query on the order table, you'd be repeating the mapping between the table and your objects each time.
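A rough sketch of what that composed query might look like in LINQ to Entities; the entity set and property names here are illustrative AdventureWorks-style names:

var results = from customer in db.Customer
              where customer.TerritoryID == territoryId
              select new
              {
                  Customer = customer,
                  LastTenOrders = customer.SalesOrderHeader
                                          .OrderByDescending(o => o.OrderDate)
                                          .Take(10)
              };

One query, one round trip, and the order-to-object mapping lives in exactly one place: the model.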

Layers, Schmayers: What about tiers?

[Thumbs down] EDM-generated entity classes are not very tier-friendly. The state of an entity, whether it is modified, new, or to be deleted, and which columns have changed, is managed by the ObjectContext. Once you take an entity and serialize it out of process to another tier, it is no longer tracked for updates. While you can re-attach an entity that was serialized back into the data access tier, the entity itself does not serialize its change state (aka diffgram), so you cannot easily achieve full round-trip updating in a distributed system. There are techniques for dealing with this, but they are going to add some plumbing code between the business logic and the EF... and make you wish you had a real data access layer, or something like Danny Simmons' EntityBag (or a DataSet).
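One of those plumbing techniques, sketched against the EF v1 API (updated stands in for the entity deserialized back from the client tier; entity set and key names are illustrative):

using (var db = new AdventureWorksEntities())
{
    // Load the current copy so the context is tracking an entity with this key...
    var current = db.ProductCategory.First(
        c => c.ProductCategoryID == updated.ProductCategoryID);

    // ...then overlay the values that came back; the context computes the delta.
    db.ApplyPropertyChanges("ProductCategory", updated);
    db.SaveChanges();
}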

Does the Data Access Layer support optimistic concurrency?

[Thumbs up] Out of the box, yes, handily, thanks to the ObjectContext tracking state and the change tracking calls injected into our code-generated entity properties. However, keep in mind the caveat with distributed systems: you'll have more work to do if your UI is separated from your data access layer by one or more tiers.
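When a concurrent update is detected, SaveChanges throws, and you decide whose values win. A minimal sketch (conflictedEntity stands in for whichever entity failed to save):

try
{
    db.SaveChanges();
}
catch (OptimisticConcurrencyException)   // System.Data
{
    // ClientWins: re-read the store values (refreshing the concurrency token)
    // while keeping this user's changes, then try the save again.
    db.Refresh(RefreshMode.ClientWins, conflictedEntity);   // RefreshMode is in System.Data.Objects
    db.SaveChanges();
}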

How does the Data Access Layer support transactions?

[Thumbs up] Because the Entity Framework builds on top of ADO.NET providers, transaction management doesn't change very much. A single call to ObjectContext.SaveChanges() will open a connection, perform all inserts, updates, and deletes across all entities that have changed, across all relationships, all in the correct order... and, as you can imagine, in a single transaction. To make transactions more granular than that, call SaveChanges more frequently or keep multiple ObjectContext instances, one for each unit of work in progress. To broaden the scope of a transaction, you can manually enlist using a native ADO.NET provider transaction or by using System.Transactions.
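A sketch of the System.Transactions route (requires a reference to the System.Transactions assembly):

using (var scope = new TransactionScope())
{
    using (var db = new AdventureWorksEntities())
    {
        // ...make changes to entities...
        db.SaveChanges();   // enlists in the ambient transaction
    }

    // Anything else done here (another ObjectContext, a raw SqlCommand, ...)
    // enlists in the same transaction.

    scope.Complete();   // without this call, everything rolls back
}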