Because the hosting provider I was using for Syfuhs.net was less than stellar (names withheld to protect the innocent), I’ve decided to move the blog portion of this site to blogs.objectsharp.com.
With any luck the people subscribing to this site won’t see any changes, and any links directly to www.syfuhs.net should 301 redirect to blogs.objectsharp.com/cs/blogs/steve/.
As I learned painfully during the last migration to DasBlog, permalinks break easily when switching platforms. With any luck, I will have that resolved shortly.
Please let me know as soon as possible if you start seeing issues.
Cheers!
A few posts back I started talking about what it would take to create a new application
for Windows Phone 7. I’m not a fan of learning from trivial applications
that don’t touch the technologies I would be using in the real world,
so I thought I would build a real application that someone can use.
Since this application uses a well-known dataset, I get lucky: I already
have my database schema, and it is reasonably well designed. My first
step is to get the data to the phone, so I will use WCF Data Services and an Entity Model.
I created the model and imported just the necessary tables. I called the model
RaceInfoModel.edmx, and the entity container is named RaceInfoEntities. This is ridiculously
simple to do.
The next step is to expose the model to the outside world in an XML format
through a Data Service. I created a WCF Data Service and made a few config changes:
using System;
using System.Data.Services;
using System.Data.Services.Common;

namespace RaceInfoDataService
{
    public class RaceInfo : DataService<RaceInfoEntities>
    {
        public static void InitializeService(DataServiceConfiguration config)
        {
            if (config == null)
                throw new ArgumentNullException("config");

            config.UseVerboseErrors = true;
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            //config.SetEntitySetPageSize("*", 25);
            config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
        }
    }
}
This too is reasonably simple. Since it’s a web service, I can hit it from a
web browser and I get a list of available datasets:
This isn’t a complete list of available items, just a subset.
At this point I can package everything up and stick it on a web server. It could
technically be ready for production, if you were satisfied with not having any access
control on reading the data. In this case, let’s say for argument’s sake that
I was able to convince the powers that be that everyone should be able to access it.
There isn’t anything confidential in the data, and we provide the data in other services
anyway, so all is well. Actually, that’s kind of how I would prefer it anyway. Give
me Data or Give me Death!
Now we create the Phone project. You need to install the latest build of the
dev tools, which you can get here: http://developer.windowsphone.com/windows-phone-7/.
Install it, then create the project. You should see:
The next step is to make the Phone application actually able to use the data.
Here it gets tricky. Or really, here it gets stupid. (It had better be fixed
by RTM or else *shakes fist*)
For some reason, the Visual Studio 2010 Phone 7 project type doesn’t allow you to
automatically import services. You have to generate the service class manually.
It’s not that big a deal since my service won’t be changing all that much, but nevertheless
it’s still a pain to regenerate it manually every time a change comes down the pipeline.
To generate the necessary class run this at a command prompt:
cd C:\Windows\Microsoft.NET\Framework\v4.0.30319
DataSvcUtil.exe
/uri:http://localhost:60141/RaceInfo.svc/
/DataServiceCollection
/Version:2.0
/out:"PATH.TO.PROJECT\RaceInfoService.cs"
(Formatted to fit my site layout)
Include that file in the project and compile.
UPDATE: My bad, I had already installed the reference, so this won’t
compile for most people. The Windows Phone 7 runtime doesn’t have the System.Data
namespace available that we need. Therefore we need to install them… They
are still in development, so here is the CTP build http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=b251b247-70ca-4887-bab6-dccdec192f8d.
You should now have a compile-able project with service references that looks something
like:
We have just connected our phone application to our database! All told, it took
me 10 minutes to do this. Next up we start playing with the data.
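To give a sense of where that leaves us, here is a minimal sketch of loading data on the phone with the generated service class. The RaceInfoEntities context comes from the DataSvcUtil output; the Race entity and the Races entity set are hypothetical names, standing in for whatever tables were actually imported.

```csharp
// Sketch only: assumes the generated RaceInfoService.cs exposes a
// RaceInfoEntities context and a hypothetical "Races" entity set.
using System;
using System.Data.Services.Client;

public class RaceDataLoader
{
    private readonly RaceInfoEntities context =
        new RaceInfoEntities(new Uri("http://localhost:60141/RaceInfo.svc/"));

    public DataServiceCollection<Race> Races { get; private set; }

    public void Load()
    {
        // DataServiceCollection loads asynchronously, which is required on
        // the phone; the LoadCompleted event fires when the data arrives.
        Races = new DataServiceCollection<Race>(context);
        Races.LoadCompleted += (s, e) =>
        {
            if (e.Error != null)
            {
                // Handle the failure (e.g. tell the user).
                return;
            }
            // Bind Races to the UI here.
        };
        Races.LoadAsync(new Uri("/Races", UriKind.Relative));
    }
}
```

Because DataServiceCollection implements change notification, binding it directly to a ListBox keeps the UI in sync as pages of results come in.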
Earlier this morning, Microsoft launched Visual Studio 2010. Woohoo! Here’s
the gist:
Watch the Keynote and Channel 9 Live here: http://www.microsoft.com/visualstudio/en-us/watch-it-live
Get the real bits here (if you have an MSDN license): http://msdn.microsoft.com/en-ca/subscriptions/default.aspx
Get the trial bits here:
-
Microsoft Visual Studio 2010 Professional
-
Microsoft Visual Studio 2010 Ultimate
-
Microsoft Visual Studio Team Foundation Server
Get the Express versions here: http://www.microsoft.com/express/
All the important stuff you want/need to know about Visual Studio 2010 development: http://msdn.microsoft.com/en-ca/ff625297.aspx
Enjoy!
A few minutes ago I finalized my first CodePlex project. While working
on the ever-mysterious Infrastructure 2010 project, I needed to integrate the Live
Meeting API into an application we are using. So I decided to stick it into
its own assembly for reuse.
I also figured that since it’s a relatively simple project, and because for the life
of me I couldn’t find a similar wrapper, I would open source it. Maybe there
is someone out there who can benefit from it.
The code is ugly, but it works. I suspect I will continue development, and clean
it up a little. With that being said:
-
It needs documentation (obviously).
-
All the StringBuilder stuff should really be converted to XML objects.
-
It needs cleaner exception handling.
-
It needs API versioning support
-
It needs to implement more API functions
Otherwise it works like a charm. Check
it out!
In my previous post I started a list of best practices that should be followed when
deploying applications to production systems. This is a continuation of that post.
-
Create new Virtual Application in IIS
Right-click [website app will live in] > Create Application
Creating a new application provides each ASP.NET application its own sandbox environment.
The benefit to this is that site resources do not get shared between applications.
It is a requirement for all new web applications written in ASP.NET.
-
Create a new application pool for Virtual App
-
Right click on Application Pools and select Add Application Pool
-
Define name: “apAppName” - ‘ap’ followed by the Application Name
-
Set Framework version to 2.0
-
Set the Managed Pipeline mode: Most applications should use the default setting
An application pool is a distinct process running on the web server. It segregates
processes and system resources in an attempt to prevent errant web applications from
allocating all system resources. It also prevents any nasty application crashes from
taking the entire website down. It is also necessary for creating distinct security
contexts for applications. Setting this up is essential for high availability.
-
Set the memory limit for application pool
There is a finite amount of available resources on the web servers. We do not want
any one application to allocate them all. Setting a reasonable max per application
lets the core website run comfortably and allows for many applications to run at any
given time. If it is a small lightweight application, the max limit could be set lower.
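For reference, the limit can also be set from the command line. This is a sketch assuming IIS 7’s appcmd; the pool name and the 200 MB (204800 KB) figure are example values only.

```bat
REM Recycle the pool whenever its private memory exceeds 204800 KB (200 MB).
%windir%\system32\inetsrv\appcmd.exe set apppool "apAppName" ^
    /recycling.periodicRestart.privateMemory:204800
```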
-
Create and appropriately use an app_Offline.htm file
Friendlier than an ASP.NET exception screen (aka the Yellow Screen of Death)
If this file exists it will automatically stop all traffic into a web application.
Aptly named, it is best used when server updates occur that might take the application
down for an extended period of time. It should be styled to match the application.
Best practice is to keep the file in the root directory of the application, renamed
to app_Online.htm; that way it can easily be found if an emergency update occurs.
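As a starting point, a minimal app_Offline.htm might look like the following; the wording and styling are placeholders to be replaced with the application’s own.

```html
<!DOCTYPE html>
<html>
<head><title>Down for Maintenance</title></head>
<body>
  <h1>We’ll be right back</h1>
  <p>This application is briefly offline for scheduled maintenance.</p>
  <!-- Note: pad the file to more than 512 bytes, or Internet Explorer
       will replace it with its own "friendly" error page. -->
</body>
</html>
```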
-
Don’t use the Default Website instance
-
This should be disabled by default
-
Either create a new website instance or create a Virtual Application under existing
website instance
Numerous vulnerabilities in the wild make certain assumptions that the default website
instance is used, which creates reasonably predictable attack vectors given that default
properties exist. If we disable this instance and create new instances it will mitigate
a number of attacks immediately.
-
Create two Build Profiles
-
One for development/testing
-
One for production
Using two build profiles is very handy for managing configuration settings such as
connection strings and application keys. It lessens the manageability issues associated
with developing web applications remotely. This is not a necessity, though it does
make development easier.
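If you are on Visual Studio 2010, web.config transforms are one way to implement the two profiles. This sketch assumes a Release transform; the connection string name and server are made-up examples.

```xml
<!-- Web.Release.config: applied when publishing with the Release profile -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Swap the development connection string for the production one -->
    <add name="MyAppDb"
         connectionString="Data Source=prodSqlServer;Initial Catalog=MyAppDb;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
  <system.web>
    <!-- Strip debug="true" from the production build -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>
```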
-
Don’t use the wwwroot folder to host web apps
Define a root folder for all web applications other than wwwroot
As with the previous comment, there are vulnerabilities that use the default wwwroot
folder as an attack vector. A simple mitigation to this is to move the root folders
for websites to another location, preferably on a different disk than the Operating
System.
These two lists sum up what I believe to be a substantial set of best practices for
application deployments. The intent was not to create a list of development best
practices, or to prescribe a development model, but to serve strictly as an aid for deployment.
It should be left to you or your department to define development models.
Over the last few months I have been collecting best practices for deploying ASP.NET
applications to production. The intent was to create a document that described
the necessary steps needed to deploy consistent, reliable, secure applications that
are easily maintainable for administrators. The result was an 11-page document.
I would like to take a couple excerpts from it and essentially list what I believe
to be key requirements for production applications.
The key is consistency.
-
Generate new encryption keys
The benefit to doing this is that internal hashing and encrypting schemes use different
keys between applications. If an application is compromised, the private keys that
can get recovered will have no effect on other applications. This is most important
in applications that use Forms Authentication such as the member’s section. This Key
Generator app is using built-in .NET key generation code in the RNGCryptoServiceProvider.
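The core of such a generator is small. This is a hedged sketch using RNGCryptoServiceProvider; the 64- and 24-byte key lengths are example values for the machineKey validation and decryption keys.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Sketch: generate random validationKey/decryptionKey values for the
// <machineKey> element. The key lengths (64 and 24 bytes) are examples.
class KeyGen
{
    static string CreateKey(int byteLength)
    {
        var buffer = new byte[byteLength];
        using (var rng = new RNGCryptoServiceProvider())
            rng.GetBytes(buffer);

        // Hex-encode the random bytes for use in web.config.
        var hex = new StringBuilder(byteLength * 2);
        foreach (byte b in buffer)
            hex.AppendFormat("{0:X2}", b);
        return hex.ToString();
    }

    static void Main()
    {
        Console.WriteLine("validationKey=\"{0}\"", CreateKey(64));
        Console.WriteLine("decryptionKey=\"{0}\"", CreateKey(24));
    }
}
```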
-
Version and give Assemblies Strong Names
Use AssemblyInfo.cs file:
[assembly: AssemblyTitle("NameSpace.Based.AssemblyTitle")]
[assembly: AssemblyDescription("This is My Awesome Assembly…")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("My Awesome Company")]
[assembly: AssemblyProduct("ApplicationName")]
[assembly: AssemblyCopyright("Copyright © 2009")]
[assembly: AssemblyTrademark("TM Application Name")]
[assembly: AssemblyCulture("en-CA")]
Strong names and versioning are the backbone of .NET assemblies. They help distinguish
between different versions of assemblies, and provide copyright attributes for code
we have written internally. This is especially helpful if we decide to sell any of
our applications.
-
Deploy Shared Assemblies to the GAC
-
Assemblies such as common controls
-
gacutil.exe -i "g:\dev\published\myApp\bin\myAssembly.dll"
If any assemblies are created that get used across multiple applications they should
be deployed to the GAC (Global Assembly Cache). Examples of this could be Data Access
Layers, or common controls such as the Telerik controls. The benefit to doing this
is that we will not have multiple copies of the same DLL in different applications.
A requirement of doing this is that the assembly must be signed and use a multipart
name.
-
Pre-Compile Site: [In Visual Studio] Build > Publish Web Site
Any application that is in production should be running in a compiled state. What
this means is that any application should not have any code-behind files or App_Code
class files on the servers. This will limit damage if our servers are compromised,
as the attacker will not be able to modify the source.
-
Encrypt SQL Connections and Connection Strings
Encrypt SQL Connection Strings
Aspnet_regiis.exe -pe connectionStrings -site myWebSite -app /myWebApp
Encrypt SQL Connections
Add ‘Encrypt=True’ to all connection strings before encrypting
SQL Connections contain sensitive data such as username/password combinations for
access to database servers. These connection strings are stored in web.config files
which are stored in plain-text on the server. If malicious users access these files
they will have credentials to access the servers. Encrypting the strings will prevent
the ability to read the config section.
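For completeness, the matching decrypt switch is -pd, which restores the section to plain text when you need to edit it:

```bat
REM Decrypt the connectionStrings section for editing, then re-encrypt with -pe.
Aspnet_regiis.exe -pd connectionStrings -site myWebSite -app /myWebApp
```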
However, encrypting the connection string is only half of the issue. SQL transactions
are transmitted across the network in plain-text. Sensitive data could be acquired
if a network sniffer was running on a compromised web server. SQL Connections should
also be encrypted using SSL Certificates.
-
Use key file generated by Strong Name Tool:
C:\Program Files\Microsoft SDKs\Windows\v7.0A\bin\sn.exe
“sn.exe -k g:\dev\path\to\app\myAppKey.snk”
Signing an assembly provides validation that the code is ours. It will also allow
for GAC deployment by giving the assembly a signature. The key file should be unique
to each application, and should be kept in a secure location.
-
Set retail=”true” in machine.config
<configuration>
<system.web>
<deployment retail="true"/>
</system.web>
</configuration>
In a production environment, applications should not show exception errors or trace
messages. Setting the retail property to true is a simple way to turn off debugging
and tracing, and to force the application to use friendly error pages.
In part 2 I continue my post on more best practices for deployment to a production
environment.
Last week Silverlight 3.0 was released. In Toronto, ObjectSharp put
on a very cool launch event, with lots of great demos and compelling reasons to start
using Silverlight immediately. I was impressed, but I’m a Microsoft fan-boy
(fan-boi?), so that doesn’t count. It was certainly fitting that ObjectSharp
proposed using Silverlight for some parts of our new website www.woodbineentertainment.com,
seeing as they won the bid to build the new site. I saw the potential, as did
a few others on the team. However, some executives did not see the benefit.
I respect their opinion, somewhat because I have to – they can fire me after all,
and mostly because they have business sense on their side.
The company is very much on the cutting edge of technology in a few respects, but
very conservative in the way we choose technology. For instance, our new site
will be built on Microsoft Office SharePoint Server 2007. I’d wager there are
fewer than a hundred publicly facing websites on the internet that use MOSS (probably
due to complexity and cost), yet we chose to use it because of the potential to further
develop it in future iterations.
Silverlight on the other hand is a different story. Recent reports peg Silverlight
penetration at around 25-30% of all browsers. Whether or not this is accurate,
who knows. It’s the only data available. Flash penetration is at 96%.
Now, in my opinion 25% growth in 2 years on Silverlight’s part is impressive.
Flash has been around for nearly 2 decades. There is definitely a correlation
to be made in there somewhere.
At this point, I was sold on using Silverlight. The execs still weren’t.
Seeing as Silverlight is a browser plug-in, it must be installed in some way, shape,
or form. At 25%, that means our customer demographic would have around 10% penetration.
That is terrible. Getting them to install a plug-in to view site content is
a tough sell. The executives didn’t want to scare away customers by making them
install the plug-in. SharePoint doesn’t need a browser plug-in.
And herein lies the Catch-22
To expand our marketed audience, we build on Silverlight to give them more content
that is better authored to their needs. In doing so, we lose customers because
they need to install the plug-in. There is no metric at this point in time to
help us extrapolate the difference. There is a reasonable risk involved with
using such cutting-edge technology. We will use it when browser penetration
is high enough, yet browser penetration won’t grow if sites like ours don’t use Silverlight.
Ah Well
I’m a technology risk taker. I live on the bleeding edge. I run Exchange
2010 beta, on Server 2008 virtualized on Hyper-V, with IIS7 running this site, browsed
by IE8 on Windows 7 RC, and authored in Office 2007 (2010 if Microsoft would give
me the flippin bits!). The company, not so much. Risk is good – as long
as you can mitigate it properly. I can manage my risk, as it’s not the end of
the world if something here crashes. I don’t lose an audience. If the
company can’t market to its customers because the tools in use are too new, it will
lose its audience. Period. And that means lost revenue.
Maybe we can convince the execs in Phase II.
Reposted without* permission from the Canadian
IT Pro blog.
I
just wanted to post a reminder that the Windows 7 Beta is set to expire on July 1st,
2009. What does that mean? Well it isn’t going to explode, eat your data
or lock you out. What is going to happen is that the PC will force you to reboot
every two hours. But have no fear there is a way to fix this, simply install
the Windows 7 Release Candidate which you can still download.
While an upgrade isn’t supported, and I strongly recommend a clean install,
you can find
a workaround that will allow you to do an in place upgrade.
Grab the Release
Candidate here!
* I never asked. I doubt they will care. Correct me if I am wrong, Rodney!
A couple days ago Daniel Shapiro offered 10
people Virtual Servers hosted by Rack Force.
I jumped on the offer, as I’ve been wanting to migrate this website to its own privately
hosted server. It really came down to never having the time to test out hosts,
so this was a perfect opportunity. Shortly thereafter I found out Exchange 2010
hit beta, and I wanted to run it through its paces.
After installing Active Directory, I installed the beta, which went really smoothly.
Given that, I decided to update the DNS MX records for syfuhs.net to
point to this server.
One thing I didn’t realize is you have to set up Receive Connectors and Send Connectors.
The wording is kinda misleading, so I ended up setting my first Send Connector to
only route mail going to syfuhs.net from syfuhs.net. Not so useful. The
Receive connector was the same way. However, this is all similar to Exchange
2007.
Now some pictures:
Outlook
Web Access
Exchange
Management Console
IIS Manager Hosting Outlook Web Access
In my second post I discussed
my run-in with ASP, and how PHP was far better. I ended the post talking about
an invitation to a Microsoft event. This was an interesting event. Greg
and I were the only people under 30 there. When that’s a 15 year difference,
things get interesting. Especially when you need your mother to drive you there…
The talk was a comparison between Microsoft based technologies and Linux based technologies.
The presenter was a 10 year veteran of IBM, working on their Linux platform, who then
moved to Microsoft. For the life of me I can’t remember his name.
His goal was simple. Disprove myths around Linux costs versus Windows costs.
It was a very compelling argument. The event was based around the Windows
Compare campaign. It was around this time that Longhorn (Longhorn that turned
into Vista, not Server 2008) was in pre-beta soon to go beta, and after discussing
it with Greg, we decided to probe the presenter for information about Longhorn.
In a situation like that, the presenter either gets mad, or becomes really enthusiastic
about the question. He certainly didn’t get mad.
Throughout the rest of the talk, the presenter made some jokes at Greg’s and my
expense, which was all in good fun. Based on that, we decided to go one step
further and ask, at one of the breaks, how we could get the latest Longhorn build.
The conversation went something like this:
Me: So how do people get copies of the latest build for Longhorn?
Presenter: Currently those enrolled in the MSDN Licensing program can get
the builds.
Me: Ok, how does one join such a licensing program?
Presenter: Generally you buy them.
Me: How much?
Presenter: A couple thousand…
Me: Ok, let me rephrase the question. How does a student, such as myself
and my friend Greg here, get the latest build of Longhorn when we don’t have an
MSDN subscription, nor the money to buy said subscription?
Presenter: *Laughs* Oh. Go talk to Alec over there and tell him I said
to give you a student subscription.
Me: Really? Cool!
Six months later Greg and I somehow got MSDN Premium Subscriptions. We had
legal copies of almost every single piece of Microsoft software ever commercially
produced. Visual Studio 2005 was still in beta, so I decided to try it out.
I was less than impressed with Visual Studio 2003, but really liked ASP.NET, so I
wanted to see what 2005 had in store. At the time PHP was still my main language,
but after the beta of 2005, I immediately switched to C#. I had known about
C# for a while, and understood the language fairly well. It was .NET 1.1 that
never took for me. That, and I didn’t have a legal copy of Visual Studio 2003
at the time.
Running a Longhorn beta build, with Visual Studio 2005 beta installed, I started playing
with ASP.NET 2.0, and built some pretty interesting sites. The first was a Wiki
type site, designed for medical knowledge (hey, it takes a lot to kill a passion of
mine). It never saw the light of day on the interweb, but it certainly was a
cool site. Following that were a bunch of test sites that I used to experiment
with the data controls.
It wasn’t until the release of SQL Server 2005 that I started getting interested in
data. Which I will discuss in my next post.