In a previous post we looked at what it takes to actually write a Security Token Service. If we already knew what the STS offered and required, we could set up a relying party fairly easily. However, we don’t always know what is going on, and that’s where federation metadata comes in. It gives us a basic breakdown of the STS so we can interact with it.
Now, if we are building a custom STS, nothing is creating this metadata for us. We could do it manually by hardcoding everything in an XML file and then signing it, but that gets ridiculously tedious after you have to make changes for the third or fourth time – which will happen. A lot. The better approach is to generate the metadata automatically, so in this post we will do just that.
The first thing you need to do is create an endpoint. There is a well-known path of /FederationMetadata/2007-06/FederationMetadata.xml that is generally used, so let’s use that. There are a lot of ways to generate dynamic content, and in Programming Windows Identity Foundation, Vittorio uses a WCF service:
[ServiceContract]
public interface IFederationMetadata
{
    [OperationContract]
    [WebGet(UriTemplate = "2007-06/FederationMetadata.xml")]
    XElement FederationMetadata();
}
It’s a great approach, but for some reason I prefer the way that Dominick Baier creates the endpoint in StarterSTS. He uses an IHttpHandler and a web.config entry to create a handler:
<location path="FederationMetadata/2007-06">
  <system.webServer>
    <handlers>
      <add
        name="MetadataGenerator"
        path="FederationMetadata.xml"
        verb="GET"
        type="Syfuhs.TokenService.WSTrust.FederationMetadataHandler" />
    </handlers>
  </system.webServer>
  <system.web>
    <authorization>
      <allow users="*" />
    </authorization>
  </system.web>
</location>
As such, I’m going to go that route. Let’s take a look at the implementation for the handler:
using System.Web;

namespace Syfuhs.TokenService.WSTrust
{
    public class FederationMetadataHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            context.Response.ClearHeaders();
            context.Response.Clear();
            context.Response.ContentType = "text/xml";

            MyAwesomeTokenServiceConfiguration
                .Current.SerializeMetadata(context.Response.OutputStream);
        }

        public bool IsReusable { get { return false; } }
    }
}
All the handler is doing is writing metadata out to a stream, which in this case is the response stream. You can see that it is doing this through the MyAwesomeTokenServiceConfiguration class, which we created in the previous article. The SerializeMetadata method creates an instance of a MetadataSerializer and writes the entity to the stream:
public void SerializeMetadata(Stream stream)
{
    MetadataSerializer serializer = new MetadataSerializer();
    serializer.WriteMetadata(stream, GenerateEntities());
}
The entities are generated through a collection of tasks:
private EntityDescriptor GenerateEntities()
{
    if (entity != null)
        return entity;

    SecurityTokenServiceDescriptor sts = new SecurityTokenServiceDescriptor();

    FillOfferedClaimTypes(sts.ClaimTypesOffered);
    FillEndpoints(sts);
    FillSupportedProtocols(sts);
    FillSigningKey(sts);

    entity = new EntityDescriptor(new EntityId(string.Format("https://{0}", host)))
    {
        SigningCredentials = this.SigningCredentials
    };

    entity.RoleDescriptors.Add(sts);

    return entity;
}
Once the entity is generated, an object called a SecurityTokenServiceDescriptor is created to describe the STS. At that point it’s just a matter of filling in the data and defining the credentials used to sign the metadata:
private void FillSigningKey(SecurityTokenServiceDescriptor sts)
{
    KeyDescriptor signingKey
        = new KeyDescriptor(this.SigningCredentials.SigningKeyIdentifier)
        {
            Use = KeyType.Signing
        };

    sts.Keys.Add(signingKey);
}

private void FillSupportedProtocols(SecurityTokenServiceDescriptor sts)
{
    sts.ProtocolsSupported.Add(new System.Uri(WSFederationConstants.Namespace));
}

private void FillEndpoints(SecurityTokenServiceDescriptor sts)
{
    EndpointAddress activeEndpoint
        = new EndpointAddress(string.Format("https://{0}/TokenService/activeSTS.svc", host));

    sts.SecurityTokenServiceEndpoints.Add(activeEndpoint);
    sts.TargetScopes.Add(activeEndpoint);
}

private void FillOfferedClaimTypes(ICollection<DisplayClaim> claimTypes)
{
    claimTypes.Add(new DisplayClaim(ClaimTypes.Name, "Name", ""));
    claimTypes.Add(new DisplayClaim(ClaimTypes.Email, "Email", ""));
    claimTypes.Add(new DisplayClaim(ClaimTypes.Role, "Role", ""));
}
That, in a nutshell, is how to create and sign a basic metadata document. There is a lot more information you can put into it, and you can find more things to work with in the Microsoft.IdentityModel.Protocols.WSFederation.Metadata namespace.
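To give a taste of what else can go in there, here is a rough sketch of adding a passive (browser-based) endpoint and contact information. This is an illustration only: the passive endpoint address, the FillOptionalMetadata method, and the contact details are made up, and the exact types are assumed to mirror the rest of the Microsoft.IdentityModel.Protocols.WSFederation.Metadata classes used above.

```csharp
// Sketch only: endpoint address and contact details are hypothetical.
private void FillOptionalMetadata(EntityDescriptor entity, SecurityTokenServiceDescriptor sts)
{
    // Advertise a passive sign-in endpoint alongside the active WS-Trust one.
    sts.PassiveRequestorEndpoints.Add(
        new EndpointAddress(string.Format("https://{0}/TokenService/passiveSTS.aspx", host)));

    // Contact information so consumers of the metadata know who to yell at.
    ContactPerson contact = new ContactPerson(ContactType.Technical);
    contact.EmailAddresses.Add("sts-admin@example.com");
    entity.Contacts.Add(contact);
}
```

Anything you add this way ends up in the signed document automatically the next time the handler serializes it.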
A few posts back I started talking about what it would take to create a new application
for the new Windows Phone 7. I’m not a fan of learning from trivial applications
that don’t touch on the same technologies that I would be using in the real world,
so I thought I would build a real application that someone can use.
Since this application uses a well-known dataset I kind of get lucky, because I already have a reasonably well designed database schema. My first step is to get it to the phone, so I will use WCF Data Services and an Entity Model. I created the model and imported just the necessary tables. I called the model RaceInfoModel.edmx, and the entity container is named RaceInfoEntities. This is ridiculously simple to do.
The next step is to expose the model to the outside world as XML through a Data Service. I created a WCF Data Service and made a few config changes:
using System;
using System.Data.Services;
using System.Data.Services.Common;

namespace RaceInfoDataService
{
    public class RaceInfo : DataService<RaceInfoEntities>
    {
        public static void InitializeService(DataServiceConfiguration config)
        {
            if (config == null)
                throw new ArgumentNullException("config");

            config.UseVerboseErrors = true;
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            //config.SetEntitySetPageSize("*", 25);
            config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
        }
    }
}
This too is reasonably simple. Since it’s a web service, I can hit it from a web browser and get a list of available datasets:
This isn’t a complete list of available items, just a subset.
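Under the covers, that list is an AtomPub service document. The entity set names below are assumptions about the model, but the overall shape is what WCF Data Services returns from the service root:

```xml
<?xml version="1.0" encoding="utf-8"?>
<service xml:base="http://localhost:60141/RaceInfo.svc/"
         xmlns:atom="http://www.w3.org/2005/Atom"
         xmlns="http://www.w3.org/2007/app">
  <workspace>
    <atom:title>Default</atom:title>
    <!-- One <collection> per exposed entity set; names here are hypothetical. -->
    <collection href="Races">
      <atom:title>Races</atom:title>
    </collection>
    <collection href="Drivers">
      <atom:title>Drivers</atom:title>
    </collection>
  </workspace>
</service>
```

Appending an entity set name to the service URI (e.g. /RaceInfo.svc/Races) returns the actual data as an Atom feed.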
At this point I can package everything up and stick it on a web server. It could technically be ready for production, if you were satisfied with having no access controls on reading the data. In this case, let’s say for argument’s sake that I was able to convince the powers that be that everyone should be able to access it. There isn’t anything confidential in the data, and we provide the data in other services anyway, so all is well. Actually, that’s kind of how I would prefer it anyway. Give me data or give me death!
Now we create the Phone project. You need to install the latest build of the
dev tools, and you can get that here http://developer.windowsphone.com/windows-phone-7/.
Install it. Then create the project. You should see:
The next step is to make the phone application actually able to use the data. Here it gets tricky. Or really, here it gets stupid. (It had better be fixed by RTM or else *shakes fist*)
For some reason, the Visual Studio 2010 Phone 7 project type doesn’t allow you to
automatically import services. You have to generate the service class manually.
It’s not that big a deal since my service won’t be changing all that much, but nevertheless
it’s still a pain to regenerate it manually every time a change comes down the pipeline.
To generate the necessary class run this at a command prompt:
cd C:\Windows\Microsoft.NET\Framework\v4.0.30319
DataSvcutil.exe
/uri:http://localhost:60141/RaceInfo.svc/
/DataServiceCollection
/Version:2.0
/out:"PATH.TO.PROJECT\RaceInfoService.cs"
(Formatted to fit my site layout)
Include that file in the project and compile.
UPDATE: My bad, I had already installed the reference, so this won’t compile for most people. The Windows Phone 7 runtime doesn’t have the System.Data namespace available that we need, so we need to install the client assemblies separately. They are still in development, so here is the CTP build http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=b251b247-70ca-4887-bab6-dccdec192f8d.
You should now have a compilable project with service references that looks something like:
We have just connected our phone application to our database! All told, it took
me 10 minutes to do this. Next up we start playing with the data.
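To give an idea of where this goes next, here is a minimal sketch of pulling data on the phone with the generated context. The RaceInfoEntities context is assumed to come from the generated RaceInfoService.cs; the Race entity, the Races entity set, and racesListBox are all hypothetical names for illustration.

```csharp
// Sketch only: entity names and the UI control are assumptions.
var context = new RaceInfoEntities(
    new Uri("http://localhost:60141/RaceInfo.svc/"));

var races = new DataServiceCollection<Race>();

// LoadCompleted fires once the async request returns; bind the results to the UI.
races.LoadCompleted += (s, e) =>
{
    if (e.Error == null)
        racesListBox.ItemsSource = races;
};

// Kick off the asynchronous query against the OData service.
races.LoadAsync(context.Races);
```

Everything on the phone is asynchronous, which is why the results are consumed in the LoadCompleted handler rather than inline.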
I finally got around to building a MetaWeblog API Handler for this site, so I can
use Windows Live Writer. It certainly was an interesting task. I wrote
code for XML, SQL Server, File IO, and Authentication to get this thing working.
It’s kinda mind-boggling how many different pieces were necessary to get the Handler
to function properly.
All in all, the development was really fun. Most people would give up on the process once they realized what’s required to debug such an interface, but it got my chops in shape. It’s not every day you have to use a network listener to debug code. It’s certainly not something I would want to do every day, but every so often it’s pretty fun.
During preparation there were a couple of procedures that I thought might be tricky to work out. One in particular was automatically uploading images placed in a post to my server. I could have stuck with the manual process I started out with, which involved FTP’ing the images to the server, figuring out their URLs, and manually inserting the img tags. Or, I could let Live Writer and the Handler do all the work. Ironically, this procedure took the least amount of code of all of them:
public string NewMediaObject(string blogId, string userName, string password,
    string base64Bits, string name)
{
    string mediaDirectory
        = HttpContext.Current.Request.PhysicalApplicationPath + "media/blog/";

    if (authUser(userName, password))
    {
        File.WriteAllBytes(mediaDirectory + name, Convert.FromBase64String(base64Bits));
        return Config.SiteURL + "/media/blog/" + name;
    }
    else
    {
        throw new Exception("Cannot Authenticate User");
    }
}
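For context, Live Writer invokes this over XML-RPC. The request body it posts looks roughly like the following; all the values here are made up for illustration, and the base64 payload is truncated. Per the MetaWeblog spec the media object arrives as a single struct, so the XML-RPC layer has to map its members onto the handler’s parameters.

```xml
<?xml version="1.0"?>
<methodCall>
  <methodName>metaWeblog.newMediaObject</methodName>
  <params>
    <param><value><string>blog-1</string></value></param>
    <param><value><string>someUser</string></value></param>
    <param><value><string>somePassword</string></value></param>
    <param>
      <value>
        <struct>
          <!-- The media object struct: file name, MIME type, and raw bytes. -->
          <member><name>name</name><value><string>screenshot.png</string></value></member>
          <member><name>type</name><value><string>image/png</string></value></member>
          <member><name>bits</name><value><base64>iVBORw0KGgo...</base64></value></member>
        </struct>
      </value>
    </param>
  </params>
</methodCall>
```

The string the handler returns becomes the URL of the uploaded image, which Live Writer then drops into the post’s img tag.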
Now it’s a breeze to write posts. It even adds drop shadows to images:
Live Writer also automatically creates a thumbnail of the image, and links to the
original. It might be a pain in some cases, but it’s easily fixable.
All I need now is more topics that involve pictures. Kittens optional. :)
Do you remember the SubSonic project? The Entity Framework is kind of like that. You can create an extensible and customizable data model from almost any type of source. It takes the boilerplate coding away from developing Data Access Layers.
Entity is designed to separate how data is stored from how data is used. It's called an Object-Relational Mapping framework. You point the framework at the source, tell it what kind of business objects you want, and poof: you have an object model. Entity is also designed to play nicely with LINQ. You can use it as a data source when querying with LINQ. In my previous post, the query used NorthwindModEntities as a data source. It is an Entity object.
(Architecture diagram courtesy of Wikipedia)
The architecture, as defined in the picture:
- Data source specific providers, which abstract the ADO.NET interfaces to connect to the database when programming against the conceptual schema.
- Map provider, a database-specific provider that translates the Entity SQL command tree into a query in the native SQL flavor of the database. It includes the Store specific bridge, which is the component responsible for translating the generic command tree into a store-specific command tree.
- EDM parser and view mapping, which takes the SDL specification of the data model and how it maps onto the underlying relational model, and enables programming against the conceptual model. From the relational schema it creates views of the data corresponding to the conceptual model. It aggregates information from multiple tables into an entity, and splits an update to an entity into multiple updates to whichever tables contributed to that entity.
- Query and update pipeline, which processes queries, filters, and update requests to convert them into canonical command trees, which are then converted into store-specific queries by the map provider.
- Metadata services, which handle all metadata related to entities, relationships, and mappings.
- Transactions, to integrate with the transactional capabilities of the underlying store. If the underlying store does not support transactions, support for them needs to be implemented at this layer.
- Conceptual layer API, the runtime that exposes the programming model for coding against the conceptual schema. It follows the ADO.NET pattern of using Connection objects to refer to the map provider, using Command objects to send the query, and returning EntityResultSets or EntitySets containing the result.
- Disconnected components, which locally cache datasets and entity sets for using the ADO.NET Entity Framework in an occasionally connected environment.
- Embedded database: ADO.NET Entity Framework includes a lightweight embedded database for client-side caching and querying of relational data.
- Design tools, such as the Mapping Designer, which simplify the job of mapping a conceptual schema to the relational schema and specifying which properties of an entity type correspond to which table in the database.
- Programming layers, which expose the EDM as programming constructs that can be consumed by programming languages.
- Object services, which automatically generate code for CLR classes that expose the same properties as an entity, enabling instantiation of entities as .NET objects.
- Web services, which expose entities as web services.
- High level services, such as reporting services, which work on entities rather than relational data.