Getting the Data to the Phone

A few posts back I started talking about what it would take to create a new application for the new Windows Phone 7.  I’m not a fan of learning from trivial applications that don’t touch the technologies I would be using in the real world, so I thought I would build a real application that someone can use.

Since this application uses a well-known dataset, I get a bit lucky: I already have my database schema, and it’s reasonably well designed.  My first step is to get the data to the phone, so I will use WCF Data Services and an Entity Model.  I created the model and imported just the necessary tables.  I called the model RaceInfoModel.edmx, and the entity container is named RaceInfoEntities.  This is ridiculously simple to do.

The next step is to expose the model to the outside world as XML through a Data Service.  I created a WCF Data Service and made a few config changes:

using System.Data.Services;
using System.Data.Services.Common;
using System;

namespace RaceInfoDataService
{
    // The generic argument is the entity container generated from the model (RaceInfoEntities).
    public class RaceInfo : DataService<RaceInfoEntities>
    {
        public static void InitializeService(DataServiceConfiguration config)
        {
            if (config == null) throw new ArgumentNullException("config");
            config.UseVerboseErrors = true;
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            //config.SetEntitySetPageSize("*", 25);
            config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
        }
    }
}

This too is reasonably simple.  Since it’s a web service, I can hit it from a web browser and I get a list of available datasets:

[Image: the service root document listing the available entity sets]

This isn’t a complete list of available items, just a subset.

At this point I can package everything up and stick it on a web server.  Technically it could be ready for production, if you were satisfied with having no access controls on reading the data.  In this case, let’s say for argument’s sake that I was able to convince the powers that be that everyone should be able to access it.  There isn’t anything confidential in the data, and we provide the data through other services anyway, so all is well.  Actually, that’s kind of how I would prefer it anyway.  Give me Data or Give me Death!

Now we create the Phone project.  You need to install the latest build of the dev tools, which you can get here: http://developer.windowsphone.com/windows-phone-7/.  Install it, then create the project.  You should see:

[Image: the new Windows Phone 7 project open in Visual Studio 2010]

The next step is to make the Phone application actually able to use the data.  Here it gets tricky.  Or really, here it gets stupid.  (It better be fixed by RTM or else *shakes fist*)

For some reason, the Visual Studio 2010 Phone 7 project type doesn’t let you add a service reference automatically.  You have to generate the service class manually.  It’s not that big a deal since my service won’t be changing all that much, but it’s still a pain to regenerate it manually every time a change comes down the pipeline.  To generate the necessary class, run this at a command prompt:

cd C:\Windows\Microsoft.NET\Framework\v4.0.30319
DataSvcUtil.exe
     /uri:http://localhost:60141/RaceInfo.svc/
     /DataServiceCollection
     /Version:2.0
     /out:"PATH.TO.PROJECT\RaceInfoService.cs"

(Formatted to fit my site layout)

Include that file in the project and compile.

UPDATE: My bad, I had already installed the reference, so this won’t compile for most people.  The Windows Phone 7 runtime doesn’t include the System.Data.Services client libraries that the generated code needs, so we have to install them ourselves…  They are still in development, so here is the CTP build: http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=b251b247-70ca-4887-bab6-dccdec192f8d.

You should now have a compilable project with service references that looks something like this:

[Image: Solution Explorer showing the project with the generated service class included]

We have just connected our phone application to our database!  All told, it took me 10 minutes to do this.  Next up we start playing with the data.
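
As a teaser for that next post, here is a rough sketch of what consuming the generated context from the phone could look like.  This is only a sketch: it assumes the generated context class is called RaceInfoEntities (named after the entity container) and that the model exposes a hypothetical Races entity set with a Race entity type; your generated names will depend on the model.

// A minimal sketch, assuming a generated RaceInfoEntities context and a hypothetical "Races" entity set.
using System;
using System.Data.Services.Client;

public class RaceViewModel
{
    private readonly RaceInfoEntities context =
        new RaceInfoEntities(new Uri("http://localhost:60141/RaceInfo.svc/"));

    private DataServiceCollection<Race> races;

    public void LoadRaces()
    {
        races = new DataServiceCollection<Race>(context);

        // Fires once the asynchronous query returns.
        races.LoadCompleted += (s, e) =>
        {
            if (e.Error != null)
            {
                // Handle the failure (show a message, retry, etc.)
                return;
            }
            // The collection is now populated; since DataServiceCollection is an
            // ObservableCollection, anything bound to it updates automatically.
        };

        // Kick off the asynchronous load against the Races entity set.
        races.LoadAsync(new Uri("Races", UriKind.Relative));
    }
}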

Testing Code Highlighting

Teeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeesting.

For those curious, this is the only way I could figure out to do a permanent redirect in ASP.NET 3.5 and lower.

using System;

namespace newtelligence.DasBlog.Web
{
    public partial class rss : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Issue a 301 and point the client at the real feed endpoint.
            Response.Status = "301 Moved Permanently";
            Response.AddHeader("Location", "/SyndicationService.asmx/GetRss");
        }
    }
}
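
For what it’s worth, ASP.NET 4 later added a built-in way to do this, so on the newer runtime the whole page should collapse to something like the following sketch:

protected void Page_Load(object sender, EventArgs e)
{
    // ASP.NET 4 and later: built-in permanent (301) redirect.
    Response.RedirectPermanent("/SyndicationService.asmx/GetRss");
}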

Upgrade to DasBlog

One of the downsides to using a Blog Engine one wrote while in High School is that you are using a blog engine that someone wrote while in High School. 

I got tired of writing a new feature every time I wanted to do anything different.  The logical approach was to upgrade to a bigger and better system, and DasBlog was the engine of choice.

With any luck I’ll stick with it, and won’t need to write any more code for this site in the near future.

Data as a Service and the Applications that consume it

Over the past few months I have seen quite a few really cool technologies released or announced, and I believe they have very real potential in many markets.  A lot of companies that exist outside the realm of software development rarely have the opportunity to use such technologies.

Take for instance the company I work for: Woodbine Entertainment Group.  We have a few different businesses, but as a whole our market is Horse Racing.  Our business is not software development.  We don’t always get the chance to play with or use some of the new technologies released to the market.  I thought this would be a perfect opportunity to see what it will take to develop a new product using only new technologies.

Our core customer pretty much wants race information.  We have proof of this by the mere fact that on our two websites, HorsePlayer Interactive and our main site, we have dedicated applications for viewing races.  So let’s build a third race browser.  Since we already have ways of viewing races from your computer, let’s build this one on the new Windows Phone 7.

The Phone – The application

This seems fairly straightforward.  We will essentially be building a Silverlight application.  Let’s take a look at what we need to do (in no particular order):

  1. Design the interface – Microsoft has loads of guidance on following the Metro design language.  In future posts I will talk about possible designs.
  2. Build the interface – XAML and C#.  Gotta love it.
  3. Build the Business Logic that drives the views – I would prefer to stay away from this; suffice it to say I’m not entirely sure how proprietary this information is.
  4. Build the Data Layer – Ah, the fun part.  How do you get the data from our internal servers onto the phone?  Easy, OData!

The Data

We have a massive database of all the races at all the tracks that you can wager on through our systems.  The data updates every few seconds as the tracks send changes for things like cancellations or runner odds.  How do we push this data out to the outside world for the phone to consume?  We create a WCF Data Service:

  1. Create an Entities Model of the Database
  2. Create Data Service
  3. Add Entity reference to Data Service (See code below)
 
    public class RaceBrowserData : DataService<RaceInfoEntities> // generic argument: the entity container from step 1
    {
        public static void InitializeService(DataServiceConfiguration config)
        {
            if (config == null) throw new ArgumentNullException("config");
            config.UseVerboseErrors = true;
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            //config.SetEntitySetPageSize("*", 25);
            config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
        }
    }

That’s actually all there is to it for the data.

The Authentication

The what?  Chances are the business will want to limit application access to only those who have accounts with us, especially if we did something like add the ability to place a wager on a race.  There are lots of ways to lock this down, but the simplest approach in this instance is to use a Secure Token Service (STS).  I say this because we already have a user store and an STS, and duplication of effort is wasted effort.  We create an STS Relying Party (the application that connects to the STS):

  1. Go to the STS and get the Federation Metadata.  It’s an XML document that tells relying parties what the STS can do for them.  In this case, we want to authenticate and get the available roles.  Each piece of information the STS returns is referred to as a Claim, so the role returned is a claim as defined by the STS.  Somewhat inaccurately, the exchange would go like this:
    1. App: Hello! I want these Claims for this user: “User Roles”.  I am now going to redirect to you.
    2. STS: I see you want these claims, very well.  Give me your username and password.
    3. STS: Okay, the user passed.  Here are the claims requested.  I am going to POST them back to you.
    4. App: Okay, back to our own processes.
  2. Once we have the Metadata, we add the STS as a reference to the Application, and call a web service to pass the credentials.
  3. If the credentials are accepted, we get back the claims we asked for, which in this case would be the available roles.
  4. If the user has the role to view races, we go into the Race view; a minimal sketch of that check follows this list.  (All users would have this role, but adding roles is a good thing if we needed to distinguish between wagering and non-wagering accounts.)
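
To make that last step a bit more concrete, here is a minimal sketch of what the role check could look like on the relying-party side with Windows Identity Foundation.  The "RaceViewer" role name is purely hypothetical, and this assumes WIF has already populated the principal after the sign-in described above:

using System.Threading;
using Microsoft.IdentityModel.Claims;

public static class RaceAccess
{
    public static bool CanViewRaces()
    {
        // After the STS sign-in, WIF swaps in a claims-based principal.
        var principal = Thread.CurrentPrincipal as IClaimsPrincipal;
        if (principal == null)
            return false;

        // "RaceViewer" is a hypothetical role claim issued by the STS.
        return principal.IsInRole("RaceViewer");
    }
}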

One thing I didn’t mention is how we lock down the Data Service.  That’s a bit trickier, and better suited to another post on the actual Data Layer itself.

So far we have laid the ground work for the development of a Race Browser application for the Windows Phone 7 using the Entity Framework and WCF Data Services, as well as discussed the use of the Windows Identity Foundation for authentication against an STS.

With any luck (and permission), more to follow.

Installing IIS 7.5 on Windows 7 from the Command Line

This is more of a place for me to store something I use fairly often, but can never remember off the top of my head.  This script, when run as administrator, will install all the features of IIS for developing on Windows 7.  Mind you, this is the prettified* version so it’s web-readable.

START /WAIT DISM /Online /Enable-Feature
/FeatureName:IIS-ApplicationDevelopment
/FeatureName:IIS-ASP
/FeatureName:IIS-ASPNET
/FeatureName:IIS-BasicAuthentication
/FeatureName:IIS-CGI
/FeatureName:IIS-ClientCertificateMappingAuthentication
/FeatureName:IIS-CommonHttpFeatures
/FeatureName:IIS-CustomLogging
/FeatureName:IIS-DefaultDocument
/FeatureName:IIS-DigestAuthentication
/FeatureName:IIS-DirectoryBrowsing
/FeatureName:IIS-FTPExtensibility
/FeatureName:IIS-FTPServer
/FeatureName:IIS-FTPSvc
/FeatureName:IIS-HealthAndDiagnostics
/FeatureName:IIS-HostableWebCore
/FeatureName:IIS-HttpCompressionDynamic
/FeatureName:IIS-HttpCompressionStatic
/FeatureName:IIS-HttpErrors
/FeatureName:IIS-HttpLogging
/FeatureName:IIS-HttpRedirect
/FeatureName:IIS-HttpTracing
/FeatureName:IIS-IIS6ManagementCompatibility
/FeatureName:IIS-IISCertificateMappingAuthentication
/FeatureName:IIS-IPSecurity
/FeatureName:IIS-ISAPIExtensions
/FeatureName:IIS-ISAPIFilter
/FeatureName:IIS-LegacyScripts
/FeatureName:IIS-LegacySnapIn
/FeatureName:IIS-LoggingLibraries
/FeatureName:IIS-ManagementConsole
/FeatureName:IIS-ManagementScriptingTools
/FeatureName:IIS-ManagementService
/FeatureName:IIS-Metabase
/FeatureName:IIS-NetFxExtensibility
/FeatureName:IIS-ODBCLogging
/FeatureName:IIS-Performance
/FeatureName:IIS-RequestFiltering
/FeatureName:IIS-RequestMonitor
/FeatureName:IIS-Security
/FeatureName:IIS-ServerSideIncludes
/FeatureName:IIS-StaticContent
/FeatureName:IIS-URLAuthorization
/FeatureName:IIS-WebDAV
/FeatureName:IIS-WebServer
/FeatureName:IIS-WebServerManagementTools
/FeatureName:IIS-WebServerRole
/FeatureName:IIS-WindowsAuthentication
/FeatureName:IIS-WMICompatibility
/FeatureName:WAS-ConfigurationAPI
/FeatureName:WAS-NetFxEnvironment
/FeatureName:WAS-ProcessModel
/FeatureName:WAS-WindowsActivationService

*Interesting that “prettified” is a word according to Live Writer.

ADFS 2.0 Windows Service Not Starting on Server 2008

I’ve been working on getting a testable ADFS environment set up for evaluation and development.  Basically, because of laziness (and timeliness), I’m using Windows Virtual PC to host Server 2008 guests for testing.  I didn’t have the time to set up a fully working x64 environment, so I couldn’t go to R2.

One of the issues I’ve been running into is that the Windows Service won’t start properly.  Or rather, at all.  It’s running into a timing issue when running as Network Service: it’s timing out while waiting for a network connection.  More Googling with Bing returned the fix for me from here.

In the file [C:\Program Files\Active Directory Federation Services 2.0\Microsoft.IdentityServer.Servicehost.exe.config] add this entry to it:

<runtime>
    <generatePublisherEvidence enabled="false"/> 
</runtime>
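
For clarity, that <runtime> element sits directly under the <configuration> root of the config file, next to whatever sections are already there; roughly:

<configuration>
  <!-- existing sections stay as they are -->
  <runtime>
    <generatePublisherEvidence enabled="false"/>
  </runtime>
</configuration>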

Other places have noted that this isn’t a problem on R2.  I haven’t tested this yet, so I don’t know if it’s true.

IIS 7 Certificate Request Completion breaking with "ASN1 bad tag value met 0x8009310b"

It only took a couple of quick searches Googling with Bing: in IIS 7, if you create a certificate request, have it issued by a CA, and then complete the request, you may find it blows up with this message box:

CertEnroll::CX509Enrollment::p_InstallResponse: ASN1 bad tag value met. 0x8009310b (ASN: 267)

All it means is that the CA that issued the certificate isn’t trusted on the server.  I came across this in a test environment I was building.  I had a Domain with CA Services, and a server that existed outside the domain.  I used the domain CA to create the certificate, but because the web server wasn’t part of the domain, it didn’t trust the CA.

My fix was to add the CA as a trusted Root Authority on the web server.
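
If you would rather script that than click through the Certificates MMC snap-in, something along these lines (run from an elevated prompt) should do it.  The file name here is just a placeholder for the CA certificate you exported:

certutil -addstore Root IssuingCA.cer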

C# Dynamic Type Conversions

I’ve been looking at ways of parsing types and values from text without having to do switch/case statements or explicit casting.  So far, based on my understanding of statically typed languages, this seems to be impossible in a statically typed language.

<Question> Is this really true?</Question>

Given my current knowledge, my way of bypassing this is to use the new dynamic type in .NET 4.  It allows me to implicitly assign an object without having to cast it.  It works by bypassing the type checking at compile time.

Here’s a fairly straightforward example:

static void Main(string[] args)
{
	Type boolType = Type.GetType("System.Boolean");
	Console.WriteLine(!parse("true", boolType));

	Type dateTimeType = Type.GetType("System.DateTime");

	DateTime date = parse("7/7/2010", dateTimeType);
	Console.WriteLine(date.AddDays(1));

	Console.ReadLine();
}

static dynamic parse(string value, Type t)
{
	// Convert.ChangeType returns object; returning it as dynamic
	// defers the conversion check to the call site at runtime.
	return Convert.ChangeType(value, t);
}

Now, if I were to do something crazy and call

DateTime someDate = parse("1234", Type.GetType("System.Int32"));

a RuntimeBinderException would be thrown because you cannot implicitly convert between an int and a DateTime.
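
For contrast, here is roughly what the non-dynamic version of the earlier date example looks like; Convert.ChangeType hands back an object, so every call site ends up with an explicit cast:

// Without dynamic, every call site needs an explicit cast to the target type.
object parsed = Convert.ChangeType("7/7/2010", Type.GetType("System.DateTime"));
DateTime date = (DateTime)parsed;
Console.WriteLine(date.AddDays(1));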

It certainly makes things a little easier.

SQL Server 2008 R2 Launch Event – Application Lifecycle Management

Unfortunately I will be unable to attend the ALM presentation later this afternoon, but luckily I was able to catch it in Montreal last week.

When I think of ALM, I think of the development lifecycle of an application – whether it be agile or waterfall or whatever floats your boat – that encompasses all parts of the process.  We’ve had tools over the years that help us manage each section or iteration of the process, but there were some obvious pieces missing.  What about the SQL?  Databases are essential to pretty much all applications developed nowadays, yet for a long time we didn’t have much in the way of tooling to help streamline and manage the process of developing the database pieces.

Enter ALM for SQL Server.  DBAs are now given all the tools and resources developers have had for a while.  It’s now easier to manage packaging and deployment of databases, keep SQL scripts under source control, and do something really cool: database schema versioning.

I have a story: sometime over the last couple of years, a developer wrote a small application that monitored changes to database schemas through triggers and then synced the changes to SVN.  This was pretty cool.  It allowed us to watch what changed when things went south.  The problem was that it wasn’t entirely reliable, it relied on some internal pieces being added to the database manually, and it made finding changes through SVN tricky.

With ALM, versioning of databases happens before deployment.  Changes are stored in TFS, and it’s possible to roll back certain changes fairly easily.  Certain changes. :)

That’s pretty cool.

SQL Server 2008 R2 Launch – PowerPivot

We just finished the SQL Server 2008 R2 Launch Keynote.  That’s quite a mouthful.  One of the problems I saw with this release was that not a lot of people knew what went into it.  R2 products are strange in that people just sort of assume they are nothing more than Service Pack releases.  Well, this isn't actually the case.

There were some really cool things shown in the keynote this morning, PowerPivot being my all-time favorite.  Excel on Analysis Services steroids would be an apt description.  Analyzing huge sets of data within Excel used to be tricky because you ran into the million-row limit.  That was only half the problem, though; performance was painful if you had that many rows in Excel -- sluggish at best.  PowerPivot helps change this dramatically.  The demo just shown pulled 100,000,000 rows of data into Excel, filtered down to 19 rows, and then filtered back up to the initial set -- all within a second.  That's pretty cool.

Just don't hit print.