ASP.NET Application Deployment Best Practices – Part 2

In my previous post I started a list of best practices that should be followed for deploying applications to production systems.  This is a continuation of that post.

  • Create new Virtual Application in IIS

Right-click [website app will live in] > Create Application

Creating a new application provides each ASP.NET application its own sandbox environment. The benefit to this is that site resources do not get shared between applications. It is a requirement for all new web applications written in ASP.NET.
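
On IIS 7 and later the same step can be scripted rather than clicked through in the manager. A minimal sketch using appcmd — the site name and physical path here are placeholders, adjust them to your environment:

%windir%\system32\inetsrv\appcmd add app /site.name:"MyWebsite" /path:/myApp /physicalPath:"D:\webapps\myApp"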

  • Create a new application pool for Virtual App
    • Right click on Application Pools and select Add Application Pool
    • Define name: “apAppName” - ‘ap’ followed by the Application Name
    • Set Framework version to 2.0
    • Set the Managed Pipeline mode: Most applications should use the default setting

An application pool is a distinct process running on the web server. It segregates processes and system resources in an attempt to prevent errant web applications from allocating all system resources. It also prevents any nasty application crashes from taking the entire website down. It is also necessary for creating distinct security contexts for applications. Setting this up is essential for high availability.
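
Again, on IIS 7 and later this can be scripted with appcmd. A rough sketch, reusing the placeholder names from above and spelling out the default Integrated pipeline explicitly:

rem Create the pool, then bind the virtual application to it
%windir%\system32\inetsrv\appcmd add apppool /name:"apAppName" /managedRuntimeVersion:v2.0 /managedPipelineMode:Integrated
%windir%\system32\inetsrv\appcmd set app "MyWebsite/myApp" /applicationPool:"apAppName"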

  • Set the memory limit for application pool

There is a finite amount of available resources on the web servers. We do not want any one application to allocate them all. Setting a reasonable max per application lets the core website run comfortably and allows for many applications to run at any given time. If it is a small lightweight application, the max limit could be set lower.
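
In IIS 7 this lives under the application pool's recycling settings as the Private Memory Limit (in KB). As a sketch, capping the pool above at roughly 500 MB would look like the following — the number is only an example, tune it per application:

%windir%\system32\inetsrv\appcmd set apppool "apAppName" /recycling.periodicRestart.privateMemory:512000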

  • Create and appropriately use an app_Offline.htm file

Friendlier than an ASP.NET exception screen (aka the Yellow Screen of Death)

If this file exists, it will automatically stop all traffic into a web application. Aptly named, it is best used when server updates occur that might take the application down for an extended period of time. It should be styled to conform to the application's look and feel. Best practice is to keep the file in the root directory of the application, renamed to app_Online.htm, so it can easily be found and renamed back if an emergency update were to occur.

  • Don’t use the Default Website instance
    • This should be disabled by default
    • Either create a new website instance or create a Virtual Application under existing website instance

Numerous vulnerabilities in the wild assume that the default website instance is in use, which creates reasonably predictable attack vectors because the default properties are well known. Disabling this instance and creating new instances immediately mitigates a number of attacks.

  • Create two Build Profiles
    • One for development/testing
    • One for production

Using two build profiles is very handy for managing configuration settings such as connection strings and application keys. It lessens the manageability issues associated with developing web applications remotely. This is not a necessity, though it does make development easier.

  • Don’t use the wwwroot folder to host web apps

Define a root folder for all web applications other than wwwroot

As with the previous comment, there are vulnerabilities that use the default wwwroot folder as an attack vector. A simple mitigation to this is to move the root folders for websites to another location, preferably on a different disk than the Operating System.

These two lists sum up what I believe to be a substantial set of best practices for application deployments.  The intent was not to create a list of development best practices, or to prescribe a development model to follow, but to serve strictly as an aid for deployment.  Defining development models should be left to you or your department.

ASP.NET Application Deployment Best Practices – Part 1

Over the last few months I have been collecting best practices for deploying ASP.NET applications to production.  The intent was to create a document that described the necessary steps needed to deploy consistent, reliable, secure applications that are easily maintainable for administrators.  The result was an 11 page document.  I would like to take a couple excerpts from it and essentially list what I believe to be key requirements for production applications.

The key is consistency.

  • Generate new encryption keys

The benefit to doing this is that internal hashing and encryption schemes use different keys between applications. If an application is compromised, any private keys that are recovered will have no effect on other applications. This is most important in applications that use Forms Authentication, such as the members’ section. The Key Generator app uses the built-in .NET key generation code in RNGCryptoServiceProvider.
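
For reference, here is a minimal sketch of that kind of key generation — it fills a buffer from RNGCryptoServiceProvider and hex-encodes it so the result can be pasted into a machineKey element. The key lengths shown are typical choices, not requirements:

using System;
using System.Security.Cryptography;
using System.Text;

class KeyGenerator
{
    // Returns a cryptographically random key of the given byte length as a hex string.
    static string CreateKey(int byteLength)
    {
        byte[] buffer = new byte[byteLength];
        RNGCryptoServiceProvider rng = new RNGCryptoServiceProvider();
        rng.GetBytes(buffer);

        StringBuilder hex = new StringBuilder(byteLength * 2);
        foreach (byte b in buffer)
        {
            hex.AppendFormat("{0:X2}", b);
        }
        return hex.ToString();
    }

    static void Main()
    {
        Console.WriteLine("validationKey: " + CreateKey(64)); // e.g. HMACSHA1 validation key
        Console.WriteLine("decryptionKey: " + CreateKey(32)); // e.g. AES decryption key
    }
}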

  • Version and give Assemblies Strong Names

Use the AssemblyInfo.cs file:

[assembly: AssemblyTitle("NameSpace.Based.AssemblyTitle")]
[assembly: AssemblyDescription("This is My Awesome Assembly…")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("My Awesome Company")]
[assembly: AssemblyProduct("ApplicationName")]
[assembly: AssemblyCopyright("Copyright © 2009")]
[assembly: AssemblyTrademark("TM Application Name")]
[assembly: AssemblyCulture("en-CA")]

Strong names and versioning are the backbone of .NET assemblies. They help distinguish between different versions of an assembly, and attach copyright attributes to code we have written internally. This is especially helpful if we decide to sell any of our applications.
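
One thing the excerpt above leaves out is the version number itself. Assuming the usual AssemblyInfo.cs layout, the version attributes sit right alongside the ones shown (the key file used for the strong name is typically set on the project's Signing tab, or generated with the Strong Name Tool described in a later point):

[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]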

  • Deploy Shared Assemblies to the GAC
    • Assemblies such as common controls
    • gacutil.exe -i "g:\dev\published\myApp\bin\myAssembly.dll"

If any assemblies are created that get used across multiple applications they should be deployed to the GAC (Global Assembly Cache). Examples of this could be Data Access Layers, or common controls such as the Telerik controls. The benefit to doing this is that we will not have multiple copies of the same DLL in different applications. A requirement of doing this is that the assembly must be signed and use a multipart name.

  • Pre-Compile Site: [In Visual Studio] Build > Publish Web Site

Any application that is in production should be running in a compiled state. What this means is that the application should not have any code-behind files or App_Code class files on the servers. This limits damage if our servers are compromised, as an attacker will not be able to modify the source.
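
The same publish step can be scripted with the aspnet_compiler tool that ships with the framework. A sketch with hypothetical paths — the -v switch is the application's virtual path, and the last argument is where the pre-compiled output lands:

aspnet_compiler -p "g:\dev\myWebApp" -v /myWebApp "g:\published\myWebApp"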

  • Encrypt SQL Connections and Connection Strings

Encrypt SQL Connection Strings

Aspnet_regiis.exe -pe connectionStrings -site myWebSite -app /myWebApp

Encrypt SQL Connections

Add ‘Encrypt=True’ to all connection strings before encrypting

SQL connection strings contain sensitive data such as username/password combinations for access to database servers. These connection strings are stored in web.config files, which sit in plain text on the server. If malicious users access these files, they have credentials to the servers. Encrypting the strings prevents the config section from being read.

However, encrypting the connection string is only half of the issue. SQL transactions are transmitted across the network in plain-text. Sensitive data could be acquired if a network sniffer was running on a compromised web server. SQL Connections should also be encrypted using SSL Certificates.
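
As a before picture, this is roughly what the plain-text section looks like with Encrypt=True added, prior to running aspnet_regiis against it (server, database, and credential names are placeholders):

<connectionStrings>
  <add name="MyAppConnection"
       connectionString="Data Source=dbServer;Initial Catalog=MyAppDb;User ID=myAppUser;Password=...;Encrypt=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>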

  • Use a key file generated by the Strong Name Tool:

C:\Program Files\Microsoft SDKs\Windows\v7.0A\bin\sn.exe

sn.exe -k g:\dev\path\to\app\myAppKey.snk

Signing an assembly provides validation that the code is ours. It will also allow for GAC deployment by giving the assembly a signature. The key file should be unique to each application, and should be kept in a secure location.

  • Set retail=”true” in machine.config

<configuration>
  <system.web>
    <deployment retail="true"/>
  </system.web>
</configuration>

In a production environment, applications should not show exception details or trace messages. Setting the retail attribute to true is a simple way to turn off debugging and tracing and force the application to use friendly error pages.
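
For reference, on a default install the machine.config for the 2.0 runtime sits under %WINDIR%\Microsoft.NET\Framework\v2.0.50727\CONFIG (with a Framework64 counterpart on 64-bit servers).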

In Part 2 I continue with more best practices for deploying to a production environment.

Resources for Students who Hate School

I hated school.  Technically, I’m still enrolled in college.  Bachelor of Business Management.  Blech.  I figured at least with business, I would learn something useful later in life.  I chose against Comp. Sci. for a few reasons.  One being that I know a couple of PhDs who know nothing about building applications in the real world.

In Comp. Sci., you learn how to build data structures, and how to make Mandelbrot Sets compute faster.  In business, you learn why people buy stuff.  Or more appropriately, you learn how to get people to buy your stuff.

Seeing as I learned (taught myself?) about things like linked-lists and pointers while in grade 10-ish, and wrote/re-wrote/re-re-wrote Mandelbrot Set builders as a final project in grade 11, I think I can safely say I would be bored as all hell in University.  Not to mention all the theory.  Comp. Sci. is all about theory.  Maybe 10% is actually coding.  F-that.

Business is inherently hands-on.

I like hands-on.  It’s tangible.

The only problem I had was finding resources.  My programming teachers were pretty cool, and were always willing to help me with algorithms that confused me, as well as with extra-curricular programs when something just wasn’t jiving.  But I had cool teachers.  Not everyone is as lucky as I was.  And even then, my teachers weren’t thinking in C# or ASP.NET every day like I tended to do.  Trying to ask them why something trivial like

<asp:TextBox ID="txtUsername">

didn’t compile was kinda painful.  I usually got a response along the lines of “what’s the colon for?”.  I always felt funny trying to explain the quasi-XML structure of ASP.NET to teachers.  This left me in the lame position of needing to find help elsewhere.  Forums are great, but separating the wheat from the chaff is a waste of time.  Enter stackoverflow.com (4 years late, mind you) and you get answers quickly.  I like it.  I use it all the time.  I’d like to think that those who are willing to look for resources will find the site fairly easily.  However, there is another site out there that not too many people know about.  It’s the Microsoft Student Experience site.  Yeah yeah, brainwash them early.  I drank the kool-aid early.

Part of the website is dedicated to the DreamSpark program.  Free, fully-licensed Microsoft products.  Nuff said.


The other half of the site is dedicated to students.  Good thing, given the name.  Not just students studying software development either.  All students.  It provides tangible resources for students.  Stories, tutorials, and templates look to be the main content.  It’s all surprisingly good stuff too.  It ranges from school studies to general life, to post-school life.


These resources may help those students who are struggling with school – at any level.  There are students out there with lots of potential.  Let’s not see it go to waste.

The Fine Line Between Insanity and Clarity

The BBSM (Building Broadband Service Manager) is a Windows 2000 box that acts as a gateway to the internet for customer access.  It handles that login page you see when you connect to the open WiFi network.  It is the most convoluted piece of [insert noun here].  The guy who signs my paycheck asked me a few weeks back to redesign said login page in keeping with corporate designs.  It was also requested that it be mobile-browser friendly.  Classic ASP, running JScript (yes, JScript), in IIS 5 on Windows 2000 behind ISA Server 2000.  The new layout was done in about an hour, and it looks pretty good.  It has been 3 weeks and I still can't get the freakin mobile code working.

In a moment of insanity (clarity?) I got the bright idea to install .NET on the box and rewrite all the pages from scratch.  Rewriting took a couple hours, and the mobile support works.  Go to set it up on the box (which must be done via USB key, via Ops guy, via physically walking to box in DataCentre {which I don't have access to}) and come to find permission errors for the ASPNET account doing COM stuff.  Needless to say I hate COM Interop with a passion.  I even sunk to the level of giving the ASPNET account full admin privileges.  Turns out Windows 2000 does not like COM Interop either.

"It looks nice if you use a laptop" was my statement to the boss.  His response was "everyone is using PDA's and their iPhones.  Maybe 10 customers use laptops."

Moral of the story: If the original code was written in the same year you turned 11, run.  Quickly.

Consultation to Salary – Theoretical Head Banging Meets the Real World

A few weeks ago, six or so, I was offered a position as a Software Developer for the Woodbine Entertainment Group.  The position looked appealing so I accepted the job offer.  I am in a probationary period for the next four months and a bit.  Anything I say can be grounds for firing me.  Never liked that part about non-contract jobs.  Ah well.

Woodbine is an interesting company.  I knew very little about it until I got word of the job.  Seems I was the only one in Canada who didn’t know the company.  My grandmother, who moved to California 50 years ago, knew about the company.  Even used to bet there – well, the Woodbine Race Track, before it moved.  It has an interesting history.

It is migrating to be a Microsoft shop, from a more Novell focused infrastructure.  We are working towards standardizing on .NET for our custom applications.

The one thing that caught my eye with Woodbine is that the company is the technology leader for horse racing.  Not just in Canada, but throughout the world.  Our services can let you place a bet live, on a track in Australia, and see results immediately.  Can you imagine the infrastructure required for such a feat?  It’s sweet!  The business people behind this are really keen on letting technology do its thing, so we can make money.  Lots of money.  See our Annual Reports on that.  Check back for the latest numbers.

Now, some of you may have noticed that our Corporate Portal is written in what looks to be Classic ASP.  For all intents and purposes, it is.  Archive.org shows the portal went live in 2001 and had a major rebuild in 2003.  Since then, incremental changes have taken place, most of which have been built using ASP.NET.  We are working on the new portal.  All I can say at the moment is: it’s going to be awesome.  So awesome that a new word will need to be created to contain all of its awesomeness.  HorsePlayer Interactive is pretty amazing, but I’d like to think this new site will be just that much more awesomer.  Yes, I said awesomer.

As for the nature of this site, it won’t change.  I’ll still post my thoughts and experiences.  I might need to change stories a little to protect the innocent, but it’s all in good fun.  I may be forced to post details of how horse racing actually works, because I’m still not sure I get all the facets of it.  In time.

More to follow.

Print-Ready Cheat Sheets for Web Developers

Here’s a great link I thought I’d share.

http://www.backtoessentials.com/tools/40-useful-print-ready-cheat-sheet-for-web-developers/

It contains links to 40 great cheat sheet PDFs, so you can download them and print them right off.

Windows LiveID Almost OpenID

The Windows Live team announced a few months ago that their Live ID service will be a new provider for the OpenID system.  The Live team was quoted:

Beginning today, Windows Live™ ID is publicly committing to support the OpenID digital identity framework with the announcement of the public availability of a Community Technology Preview (CTP) of the Windows Live ID OpenID Provider.

You will soon be able to use your Windows Live ID account to sign in to any OpenID Web site.

I saw the potential in OpenID a while ago, long before I heard about Microsoft’s intentions.  The only problem was that I didn’t really find a good way to implement such a system on my website.  Not only that, I didn’t really have a purpose for doing such a thing.  The only reason anyone would need to log into the site would be to administer it.  And seeing as I’m the only person who could log in, there was never a need.

Then a brilliant idea hit me.  Let users create accounts to make comment posting easier.  Originally, a user would leave a comment, and I would log in to verify comments, at which point the comment would actually show up.  Sometimes I wouldn’t log in for a couple days, which meant no comments.  So now, if a user wants to post a comment, all they have to do is log in with their openID, and the comment will appear.

Implementing OpenID

I used the ExtremeSwank OpenID Consumer for ASP.NET 2.0.  The beauty of this framework is that all I have to do is drop a control on a webform and OpenID functionality is there.  The control handles all the communications, and when the authenticating site returns its data, you access that data through the control’s properties.  To handle the authentication on my end, I tied the values returned from the control into my existing Forms Authentication mechanism:

if (OpenIDControl1.UserObject != null)
{
    // First time we see this OpenID identity? Create a matching Membership account.
    if (Membership.GetUser(OpenIDControl1.UserObject.Identity) == null)
    {
        string email = OpenIDControl1.UserObject.GetValue(SimpleRegistrationFields.Email);

        string username = "";
        if (HttpContext.Current.User.Identity != null)
        {
            username = HttpContext.Current.User.Identity.Name;
        }
        else
        {
            username = OpenIDControl1.UserObject.Identity;
        }

        MembershipCreateStatus membershipStatus;
        MembershipUser user = Membership.CreateUser(
            username,
            RandomString(12, false),   // throwaway password; login happens via OpenID
            email,
            "This is an OpenID Account. You should log in with your OpenID",
            RandomString(12, false),
            true,
            out membershipStatus);

        if (membershipStatus != MembershipCreateStatus.Success)
        {
            lblError.Text = "Cannot create account for OpenID Account: " + membershipStatus.ToString();
        }
    }
}
That’s all there is to it.

What Makes us Want to Program? Part 4

In my previous post, I started talking about using Microsoft technologies over PHP and open source technologies.  There were a couple of reasons why I chose to make the move.  First, from a development perspective, everything was object oriented.  PHP was just getting started with OOP at the time, and it wasn’t all that friendly.  Second, development time was generally cut at least in half, because of the built-in controls of ASP.NET.  Third, the end result was a richer application experience for the same reason.  The final reason comes down to the data aspect.

Pulling data from a database in PHP wasn’t easy to do.  The built-in support was for MySQL, with very little, if anything, for SQL Server.  In a lot of cases that isn’t always a bad thing.  MySQL is free.  You can’t argue with that.  However, MySQL wasn’t what you would call ACID compliant.  That is, MySQL did not guarantee transactions that are Atomic, Consistent, Isolated, and Durable.  Essentially, when data goes missing, there is nothing you can do about it.  SQL Server, on the other hand, is very much ACID compliant.  This is something you want.  Period.

Once .NET 2.0 was released, a whole new paradigm came into play for data in a web application.  It was easy to access!  Little to no boilerplate code was necessary for data access anymore.  Talk about a selling point.  Especially when the developer in question is 16 going on 17.

Now that I didn’t need to worry about data access code, I could start working on figuring out SQL.  At the time T-SQL scared the crap out of me.  My brain just couldn’t work around datasets.  The idea of working with multiple pieces of data at once was foreign.  I understood single-valued iterations.  A for loop made sense to me.  SELECTs and JOINs confused me.  Mind you, I didn’t start Statistics in math until the following year.  Did SQL help with statistics, or did statistics help me finally figure out SQL?  It’s a chicken-and-egg paradox.

So here I am, 17 years old, understanding multiple languages, building dozens of applications, and attending developer conferences all the while managing my education in High School.  Sweet.  I have 3 years until the next release of Visual Studio comes out.  It was here that I figured I should probably start paying more attention in school.  It’s not so much that I wasn’t paying attention, it’s just that I didn’t care enough.  I put in just enough effort to skate through classes with a passing mark.  It was also at this point in time that I made an interesting supposition.

Experts tend to agree that people who are programming geniuses are also good at math and critical thinking or reasoning.  Not one or the other, but both.  Now I’m not saying I’m a programming genius, but I suck at math.  It was just never in the cards.  But, according to all those High School exams and the psychological profiling they gather from them, my Critical Thinking and Reasoning skills are excellent.  Top 10% in Canada according to the exam results.  My math skills sit around top 20-30% depending on the type.

Neurologists place this type of thinking in the left hemisphere of the brain.  The left brain is associated with verbal, logical, and analytical thinking. It excels in naming and categorizing things, symbolic abstraction, speech, reading, writing, arithmetic.  Those who live in the left brain are very linear.  Perfect for a software developer.

The supposition I made had more to do with the Pre-Frontal Cortex of the brain.  It does a lot of work, some of which is planning complex cognitive behaviors.  Behaviors like making a list, calculating numbers, abstracting thoughts, etc.  It plans out the processes our brains use to get things done.  This is true for both sides of the brain.  So, suppose you are left brain-oriented.  You are predisposed to be good at development.  Now, suppose your Pre-Frontal Cortex is very well developed, more so than the average person.  It could be reasoned that part of being a programming genius is having a well developed Pre-Frontal Cortex.

So why does this make us want to program?  Find out in Part 5.

What Makes us Want to Program? Part 3

In my second post I discussed my run-in with ASP, and how PHP was far better.  I ended the post talking about an invitation to a Microsoft event.  This was an interesting event.  Greg and I were the only people under 30 there.  When that’s a 15-year difference, things get interesting.  Especially when you need your mother to drive you there…  The talk was a comparison between Microsoft-based technologies and Linux-based technologies.  The presenter was a 10-year veteran of IBM, working on their Linux platform, who then moved to Microsoft.  For the life of me I can’t remember his name.

His goal was simple.  Disprove myths around Linux costs versus Windows costs.  It was a very compelling argument.  The event was based around the Windows Compare campaign.  It was around this time that Longhorn (Longhorn that turned into Vista, not Server 2008) was in pre-beta soon to go beta, and after discussing it with Greg, we decided to probe the presenter for information about Longhorn.  In a situation like that, the presenter either gets mad, or becomes really enthusiastic about the question.  He certainly didn’t get mad.

Throughout the rest of the talk, the presenter made some jokes at Greg’s and my expense, which was all in good fun.  Based on that, we decided to go one step further and ask, at one of the breaks, how we could get the latest Longhorn build.  The conversation went something like this:

Me: So how do people get copies of the latest build for Longhorn?
Presenter: Currently those enrolled in the MSDN Licensing program can get the builds.
Me: Ok, how does one join such a licensing program?
Presenter: Generally you buy them.
Me: How much?
Presenter: A couple thousand…
Me: Ok, let me rephrase the question.  How does a student, such as myself and my friend Greg here, get the latest build of Longhorn when we don’t have an MSDN subscription, nor the money to buy said subscription?
Presenter: *Laughs* Oh.  Go talk to Alec over there and tell him I said to give you a student subscription.
Me:  Really?  Cool!

Six months later Greg and I somehow got MSDN Premium Subscriptions.  We had legal copies of almost every single piece of Microsoft software ever commercially produced.  Visual Studio 2005 was still in beta, so I decided to try it out.  I was less than impressed with Visual Studio 2003, but really liked ASP.NET, so I wanted to see what 2005 had in store.  At the time PHP was still my main language, but after the 2005 beta, I immediately switched to C#.  I had known about C# for a while, and understood the language fairly well.  It was .NET 1.1 that never took for me.  That, and I didn’t have a legal copy of Visual Studio 2003 at the time.

Running a Longhorn beta build, with Visual Studio 2005 beta installed, I started playing with ASP.NET 2.0, and built some pretty interesting sites.  The first was a Wiki type site, designed for medical knowledge (hey, it takes a lot to kill a passion of mine).  It never saw the light of day on the interweb, but it certainly was a cool site.  Following that were a bunch of test sites that I used to experiment with the data controls.

It wasn’t until the release of SQL Server 2005 that I started getting interested in data.  Which I will discuss in my next post.

Windows Live Writer

I finally got around to building a MetaWeblog API Handler for this site, so I can use Windows Live Writer.  It certainly was an interesting task.  I wrote code for XML, SQL Server, File IO, and Authentication to get this thing working.  It’s kinda mind-boggling how many different pieces were necessary to get the Handler to function properly.

All-in-all the development was really fun.  Most people would give up on the process once they realize what’s required to debug such an interface.  But it got my chops in shape.  It’s not every day you have to use a Network Listener to debug code.  It’s certainly not something I would want to do everyday, but every so often it’s pretty fun.

While in the preparation process, there were a couple of procedures that I thought might be tricky to work out.  One in particular was automatically uploading images placed in the post to my server.  I could have stuck with the manual process I started out with, which involved FTPing the images to the server, figuring out the URL for them, and manually inserting the img tag.  Or, I could let Live Writer and the Handler do all the work.  Ironically, this procedure took the least amount of code out of all of them:

public string NewMediaObject(string blogId, string userName, string password,
    string base64Bits, string name)
{
    // Images arrive base64-encoded; write them to the media folder and hand back the public URL.
    string mediaDirectory = HttpContext.Current.Request.PhysicalApplicationPath + "media/blog/";

    if (authUser(userName, password))
    {
        File.WriteAllBytes(mediaDirectory + name, Convert.FromBase64String(base64Bits));
        return Config.SiteURL + "/media/blog/" + name;
    }
    else
    {
        throw new Exception("Cannot Authenticate User");
    }
}

Now it’s a breeze to write posts.  It even adds drop shadows to images.


Live Writer also automatically creates a thumbnail of the image, and links to the original.  It might be a pain in some cases, but it’s easily fixable.

All I need now is more topics that involve pictures.  Kittens optional. :)