Over on the Geneva forums a question was asked:
Does anyone have an example of how to change the HomeRealmDiscovery Page in ADFSv2 to accept an e-mail address in a text field and based upon that (actually the domain suffix) select the correct Claims/Identity Provider?
It's pretty easy to modify the HomeRealmDiscovery page, so I thought I'd give it a go.
Based on the question, two things need to be known: the email address and the home realm URI. Then we need to translate the email address to a home realm URI and pass it on to ADFS.
This could be done a couple of ways: we could keep a list of email addresses and their related home realms, or a list of email domains and their related home realms. For the sake of this being an example, let's do both.
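Conceptually, an exact email match takes precedence and the domain suffix is the fallback. A minimal in-memory sketch of that precedence (the dictionaries and realm URIs here are made up for illustration; the real version queries the SQL tables described below):

```csharp
using System;
using System.Collections.Generic;

public static class RealmLookup
{
    // Hypothetical in-memory stand-ins for the EmailAddress and Domain tables
    private static readonly Dictionary<string, string> EmailRealms =
        new Dictionary<string, string> { { "alice@contoso.com", "urn:contoso:adfs" } };

    private static readonly Dictionary<string, string> DomainRealms =
        new Dictionary<string, string> { { "fabrikam.com", "urn:fabrikam:adfs" } };

    public static string Find(string email)
    {
        string realm;

        // An exact email match takes precedence...
        if (EmailRealms.TryGetValue(email, out realm))
            return realm;

        // ...otherwise fall back to the domain suffix after the '@'
        string domain = email.Substring(email.IndexOf('@') + 1);
        if (DomainRealms.TryGetValue(domain, out realm))
            return realm;

        throw new ApplicationException("No home realm found for " + email);
    }
}
```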
I've created a simple SQL database with three tables:
Each entry in the EmailAddress and Domain tables has a pointer to the home realm URI (you can find the schema in the zip file below).
Then I created a new ADFS web project and added a new entity model to it:
From there I modified the HomeRealmDiscovery page to do the check:
// Copyright (c) Microsoft Corporation. All rights reserved.
public partial class HomeRealmDiscovery : Microsoft.IdentityServer.Web.UI.HomeRealmDiscoveryPage
{
    protected void Page_Init(object sender, EventArgs e) { }

    protected void PassiveSignInButton_Click(object sender, EventArgs e)
    {
        string email = txtEmail.Text;
        if (string.IsNullOrEmpty(email))
        {
            SetError("Please enter an email address");
            return;
        }

        try
        {
            // SelectHomeRealm is inherited from the base HomeRealmDiscoveryPage
            SelectHomeRealm(FindHomeRealmByEmail(email));
        }
        catch (ApplicationException)
        {
            SetError("Cannot find home realm based on email address");
        }
    }

    private string FindHomeRealmByEmail(string email)
    {
        using (AdfsHomeRealmDiscoveryEntities en = new AdfsHomeRealmDiscoveryEntities())
        {
            var emailRealms = from e in en.EmailAddresses where e.EmailAddress1.Equals(email) select e;
            if (emailRealms.Any()) // email address exists
                return emailRealms.First().HomeRealm.HomeRealmUri; // navigation property from the entity model

            // email address does not exist; fall back to the domain suffix
            string domain = ParseDomain(email);
            var domainRealms = from d in en.Domains where d.DomainAddress.Equals(domain) select d;
            if (domainRealms.Any()) // domain exists
                return domainRealms.First().HomeRealm.HomeRealmUri;

            // neither email nor domain exist
            throw new ApplicationException();
        }
    }

    private string ParseDomain(string email)
    {
        return email.Substring(email.IndexOf("@") + 1);
    }

    private void SetError(string p)
    {
        lblError.Text = p;
    }
}
If you compare this to the original code, you'll notice some changes. I removed the code that loaded the original home realm drop-down list, as well as the code that chose the home realm based on the drop-down list's selected value.
You can find my code here: http://www.syfuhs.net/AdfsHomeRealm.zip
Ever have one of those days where you swear that you've written something, but can't find it? I could have sworn that I wrote this article before. Ah well.
It makes a lot of sense to use ACS to manage Identity Providers. It also makes sense to use Active Directory for letting users sign in to your cloud application. Therefore we would hope that ACS and ADFS play nicely together. It turns out they do. In a previous post I talked about federating ACS and ADFS, where ACS is an identity provider to ADFS. Now let's reverse it. We want users to be redirected to ACS, then to ADFS to sign in.
First things first. Let's log into our ACS namespace, navigate to the Identity Provider section, and then Add an Identity Provider:
From there we want to select what type of provider to use, and in this case we will select WS-Federation:
We are now provided with a form to fill out. There are five properties: Display Name, WS-Federation metadata, Login Link text, Image Url, and Email domain names.
Display name is fairly straightforward. What do you want the internal name of this IdP to be?
Next we need to provide a link to the Federation Metadata document that ADFS provides. The path is https://adfs.domain.com/FederationMetadata/2007-06/FederationMetadata.xml.
Then we give it a public name, such as "ObjectSharp Internal Users".
If we want to use an image instead of showing text, we can provide a path to the image.
Finally we are asked for a semicolon-separated list of email domains. This may seem a bit confusing at first. Basically, it allows us to filter the IdP out of the Home Realm Discovery page and instead requires that the user enter their email address. That way, instead of seeing the "ObjectSharp Internal Users" link, we are presented with a text box where we enter an email address like email@example.com. ACS will then look up the domain in its list, and if there is a reference to it, it will redirect to the IdP.
This takes care of the ACS bit. Just like in the previous post, each party needs to be told about the other, so we need to tell ADFS that ACS will be calling. This is pretty simple: we just need to add a relying party to ADFS using the ACS metadata. You can find the ACS metadata under Application Integration:
There isn't much to federating ADFS to ACS and vice-versa.
When you start working with Windows Azure in your spare time there are quite a few things that you miss.
I knew that it was possible to manage Windows Azure with multiple accounts, but since I was the only one logging into my instance, I never bothered to look into it. Well as it turns out, I needed to be able to manage Azure from a separate Live ID. It's pretty simple to do. You get into your subscription, navigate to User Management under the Hosted Services tab, and then you add a new Co-Admin.
Turns out that you can't manage ACS this way though. You don't have access to the namespaces as the Co-Admin. Crap. That's really what I wanted to manage with the separate account. After a minute of swearing at the control panel, I logged into ACS with my original account and looked around.
Aha! It was staring me right in the face:
There is a full MSDN article on how to deal with Portal Administrators.
Upon clicking the link you are given a list of current administrators. I wanted to add one.
When you add an administrator you are given a list of Identity Providers to choose from. Interesting.
This means that I can manage this ACS namespace using any IdP that I want. I already have ADFS created as an IdP, so I'm going to use it. Getting Single Sign-On is always a bonus.
It asks for a claim type. When the ACS management portal receives a token, it will look for this claim type and compare its value to the Identity claim value. If it matches, you are authorized to manage the namespace. I chose email address; it seemed simple enough. To log in I just navigate to https://syfuhs2.accesscontrol.windows.net/, which gives me the default Home Realm Discovery page:
I've already preconfigured ACS to redirect any email addresses with the objectsharp.com domain to our ADFS instance. Once I click submit it redirects to ADFS, I authenticate using Windows Authentication, and then I'm back at the ACS Control Panel. The next time I go to log in, a cookie will be there and the Home Realm Discovery page will see that I logged in with ADFS last time, so it will list that option first:
It just so happens that ObjectSharp is Awesome.
Now how cool is that?
One of the cornerstones of ADFS is the concept of federation (one would hope anyway, given the name), which is defined as a user's authentication process across applications, organizations, or companies. Or simply put, my company Contoso is a partner with Fabrikam. Fabrikam employees need access to one of my applications, so we create a federated trust between my application and their user store, so they can log into my application using their internal Active Directory. In this case, via ADFS.
So let's break this down into manageable bits.
First we have our application. This application is a relying party to my ADFS instance. By now hopefully this is relatively routine.
Next we have the trust between our ADFS and our partner company's STS. If the company had ADFS installed, we could just create a trust between the two, but let's go one step further and give anyone with a Live ID access to this application. Therefore we need to create a trust between the Live ID STS and our ADFS server.
This is easier than most people may think. We can just use Windows Azure Access Control Services (v2). ACS can be set up very easily to federate with Live ID (or Google, Yahoo, Facebook, etc), so we just need to federate with ACS, and ACS needs to federate with Live ID.
Creating a trust between ADFS and ACS requires two parts. First we need to tell ADFS about ACS, and second we need to tell ACS about ADFS.
To explain a bit further, we need to make ACS a Claims Provider to ADFS, so ADFS can call on ACS for authentication. Then we need to make ADFS a relying party to ACS, so ADFS can consume the token from ACS. Or rather, so ACS doesn't freak out when it sees a request for a token for ADFS.
This may seem a bit confusing at first, but it will become clearer when we walk through the process.
First we need to get the Federation Metadata for our ACS instance. In this case I've created an ACS namespace called "syfuhs2". The metadata can be found here: https://syfuhs2.accesscontrol.windows.net/FederationMetadata/2007-06/FederationMetadata.xml.
Next I need to create a relying party in ACS, telling it about ADFS. To do that browse to the Relying party applications section within the ACS management portal and create a new relying party:
Because ADFS natively supports trusts, I can just pass in the metadata for ADFS to ACS, and it will pull out the requisite pieces:
Once that is saved you can create a rule for the transform under the Rule Groups section:
For this I'm just going to generate a default set of rules.
This should take care of the ACS side of things. Next we move into ADFS.
Within ADFS we want to browse to the Claims Provider Trusts section:
And then we right-click > Add Claims Provider Trust
This should open a Wizard:
Follow through the wizard and fill in the metadata field:
Having Token Services that properly generate metadata is a godsend. Just sayin'.
Once the wizard has finished, it will open a Claims Transform wizard for incoming claims. This is just a set of claims rules that get applied to any tokens received by ADFS. In other words, what should happen to the claims within the token we receive from ACS?
In this case I'm just going to pass any claims through:
In practice, you should write a rule that filters out any extraneous claims that you don't necessarily trust. For instance, if I were to receive a role claim with a value "Administrator" I may not want to let it through because that could give administrative access to the user, even though it wasn't explicitly set by someone managing the application.
Once all is said and done, you can browse to the RP, get redirected for authentication, and be presented with this screen:
After you've made your first selection, a cookie will be generated and you won't be redirected to this screen again. If you select ACS, you then get redirected to the ACS Home Realm selection page (or directly to Live ID if you only have Live ID).
When you set up ADFS as an IdP for SAML relying parties, you are given a page that allows you to log into the relying parties. There is nothing particularly interesting about this fact, except that it could be argued that the page allows for information leakage. Take a look at it:
There are two important things to note:
- I'm not signed in
- I can see every application that uses this IdP
I'm on the fence about this one. To some degree I don't care that people know we use ADFS to log into Salesforce. Frankly, I blogged about it. However, this could potentially be bad because it can tell an attacker about the applications you use, and the mechanisms you use to authenticate into them.
This is definitely something you should consider when developing your threat models.
Luckily, if you do decide that you don't want the applications to be visible, you can make a quick modification to the IdpInitiatedSignOn.aspx.cs page.
There is a method called SetRpListState:
protected void SetRpListState(object sender, EventArgs e)
{
    RelyingPartyDropDownList.Enabled = OtherRpRadioButton.Checked;
    ConsentDropDownList.Enabled = OtherRpRadioButton.Checked;
}
To get things working I made two quick modifications. First I added the following line of code to that method:
OtherRpPanel.Visible = this.IsAuthenticated;
Then I added a line to the Page_Init method:
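Taken together, the two modified methods would look something like this. The exact line added to Page_Init isn't shown in the original, so the Page_Init call below is my assumption that the same visibility check is applied on load:

```csharp
protected void Page_Init(object sender, EventArgs e)
{
    // ... existing initialization ...
    OtherRpPanel.Visible = this.IsAuthenticated; // assumed: apply the same check on load
}

protected void SetRpListState(object sender, EventArgs e)
{
    RelyingPartyDropDownList.Enabled = OtherRpRadioButton.Checked;
    ConsentDropDownList.Enabled = OtherRpRadioButton.Checked;
    OtherRpPanel.Visible = this.IsAuthenticated; // the line added to this method
}
```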
Now unauthenticated users just see this:
And authenticated users see everything as expected:
You could extend this further and add some logic to look into the App Settings in the web.config to quickly and easily switch between modes.
One of the projects that’s been kicking around in the back of my head is how to make Windows Phone 7 applications able to authenticate against a Windows domain. This is a must have for enterprise developers if they want to use the new platform.
There were a couple ways I could do this, but keeping with my Claims-shtick I figured I would use an STS. Given that ADFS is designed specifically for Active Directory authentication, I figured it would work nicely. It should work like this:
Nothing too spectacularly interesting about the process. In order to use ADFS though, I need the correct endpoint. In this case I’m using
That takes care of half of the problem. Now I actually need to make my application call that web service endpoint.
This is kind of a pain, because WP7/Silverlight don't support the underlying protocol, WS-Trust.
Theoretically I could just add that endpoint as a service reference and build up all the pieces, but that is a nightmare scenario because of all the boilerplate around security. It would be nice if there was a library that supported WS-Trust on the phone.
As it turns out, Dominick Baier came up with a solution: he converted a project from the Identity training kit that was initially designed for Silverlight. As he mentions, there were a few gotchas, but overall it worked nicely. You can download his source code and play around.
I decided to take it a step further though. I didn’t really like the basic flow of token requests, and I didn’t like how I couldn’t work with IPrincipal/IIdentity objects.
First things first though. I wanted to start from scratch, so I opened the identity training kit and looked for the Silverlight project. You can find it here: [wherever you installed the kit]\IdentityTrainingKitVS2010\Labs\SilverlightAndIdentity\Source\Assets\SL.IdentityModel.
Initially I thought I could just add it to a phone project, but that was a bad idea; there were too many build errors. I could convert the project file to a phone library, but frankly I was lazy, so I just created a new phone library and copied the source files between projects.
There were a couple references missing, so I added System.Runtime.Serialization, System.ServiceModel, and System.Xml.Linq.
This got the project built, but will it work?
I copied Dominick’s code:
private void button1_Click(object sender, RoutedEventArgs e)
{
    _client = GetWSTrustClient("[…]", // STS endpoint elided in the original
        new UsernameCredentials("username", "password"));

    var rst = new RequestSecurityToken(WSTrust13Constants.KeyTypes.Bearer)
    {
        AppliesTo = new EndpointAddress("[…]")
    };

    _client.IssueCompleted += client_IssueCompleted;
    _client.IssueAsync(rst);
}

void client_IssueCompleted(object sender, IssueCompletedEventArgs e)
{
    _client.IssueCompleted -= client_IssueCompleted;
    if (e.Error != null)
        throw e.Error;

    var token = e.Result;
    button2.IsEnabled = true;
}

private WSTrustClient GetWSTrustClient(string stsEndpoint, IRequestCredentials credentials)
{
    return new WSTrustClient(new WSTrustBindingUsernameMixed(),
        new EndpointAddress(stsEndpoint), credentials);
}
To my surprise it worked. Sweet.
This left me wanting more though. In order to access any of the claims within the token I had to do something with the RequestSecurityTokenResponse (RSTR) object. Also, how do I make this identity stick around within the application?
The next thing I decided to do was figure out how to convert the RSTR object to an IClaimsIdentity. Unfortunately this requires a bit of XML parsing. Talk about a pain. Helper class it is:
public static class TokenHandler
{
    // SAML 1.1 assertions use the 1.0 namespace
    private static XNamespace ASSERTION_NAMESPACE
        = "urn:oasis:names:tc:SAML:1.0:assertion";

    private const string CLAIM_VALUE_TYPE
        = "http://www.w3.org/2001/XMLSchema#string"; // bit of a hack

    public static IClaimsPrincipal Convert(RequestSecurityTokenResponse rstr)
    {
        return new ClaimsPrincipal(GetClaimsIdentity(rstr));
    }

    private static ClaimsIdentity GetClaimsIdentity(RequestSecurityTokenResponse rstr)
    {
        XDocument responseDoc = XDocument.Parse(rstr.RequestedSecurityToken.RawToken);
        XElement attStatement = responseDoc.Element(ASSERTION_NAMESPACE + "Assertion")
            .Element(ASSERTION_NAMESPACE + "AttributeStatement");

        var issuer = responseDoc.Root.Attribute("Issuer").Value;
        ClaimCollection claims = new ClaimCollection();

        foreach (var c in attStatement.Elements(ASSERTION_NAMESPACE + "Attribute"))
        {
            string attrName = c.Attribute("AttributeName").Value;
            string attrNamespace = c.Attribute("AttributeNamespace").Value;
            string claimType = attrNamespace + "/" + attrName;

            foreach (var val in c.Elements(ASSERTION_NAMESPACE + "AttributeValue"))
                claims.Add(new Claim(issuer, issuer, claimType, val.Value, CLAIM_VALUE_TYPE));
        }

        return new ClaimsIdentity(claims);
    }
}
Most of this is just breaking apart the SAML-goo. Once I got all the SAML assertions I generated a claim for each one and created a ClaimsIdentity object. This gets me a step closer to how I wanted things, but keeping the identity around within the application is still up in the air. How can I keep the identity for the lifetime of the application? I wanted something like Thread.CurrentPrincipal but the phone platform doesn’t let you access it.
There was a class, TokenCache, that was part of the original Silverlight project. This sounded useful. It turns out it's a Get/Add wrapper for a Dictionary&lt;&gt;. It's almost useful, but I want to be able to access this cache at any time. A singleton sort of solves the problem, so let's try that. I added this within the TokenCache class:
public static TokenCache Cache
{
    get
    {
        lock (_sync)
        {
            if (_cache == null)
                _cache = new TokenCache();
            return _cache;
        }
    }
}

private static TokenCache _cache;
private static object _sync = new object();
Now I can theoretically get access to the tokens at any time, but I want to make the access part of the base Application object. I created a static class called ApplicationExtensions:
public static class ApplicationExtensions
{
    public static IClaimsPrincipal GetPrincipal(this Application app, string appliesTo)
    {
        var rstr = app.GetPrincipalToken(appliesTo);
        if (rstr == null)
            throw new ArgumentException("Token cannot be found to generate principal.");

        return TokenHandler.Convert(rstr);
    }

    public static RequestSecurityTokenResponse GetPrincipalToken(this Application app, string appliesTo)
    {
        // the Get/Add method names here are assumptions based on the cache's Dictionary wrapper
        return TokenCache.Cache.Get(appliesTo);
    }

    public static void SetPrincipal(this Application app, RequestSecurityTokenResponse rstr)
    {
        // assumes the RSTR exposes the AppliesTo address it was issued for
        TokenCache.Cache.Add(rstr.AppliesTo.ToString(), rstr);
    }
}
It adds three extension methods to the base Application object. Now it’s sort of like Thread.CurrentPrincipal.
How does this work? When the RSTR is returned I can call:
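The call itself isn't reproduced above, but given the SetPrincipal extension method it would presumably be something like this, from within the IssueCompleted handler:

```csharp
// e.Result is the RequestSecurityTokenResponse returned by the STS
Application.Current.SetPrincipal(e.Result);
```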
Accessing the identity is a two-part process.
If I just want to get the identity and its claims I can call:
var principal = Application.Current.GetPrincipal("https://troymcclure/webapplication3/");
IClaimsIdentity ident = principal.Identity as IClaimsIdentity;
If I want to reuse the token as part of a web service call I can get the token via:
var token = Application.Current.GetPrincipalToken("https://troymcclure/webapplication3/");
There is still quite a lot to do in order for this to be production ready code, but it does a pretty good job of solving all the problems I had with domain authentication on the Windows Phone 7 platform.
Every couple of weeks I start up Autoruns to see what new stuff has added itself to Windows startup and whatnot (screw you Adobe – you as a software company make me want to swear endlessly). Anyway, a few months ago, around the time the latest version of Windows Live Messenger and its suite RTM'ed, I poked around to see if anything new was added. Turns out there was:
A new credential provider was added!
Not only that, it turns out a couple Winsock providers were added too:
I started poking around the DLLs and noticed that they don't do much. Apparently you can use smart cards for WLID authentication. I suspect that's what the credential provider and associated Winsock provider are for, as well as part of WLID's sign-on helper so credentials can be managed via the Credential Manager:
Ah well, nothing too exciting here.
Skip a few months and something occurred to me. Microsoft was able to solve part of the Claims puzzle. How do you bridge the gap between desktop application identities and web application identities? They did part of what CardSpace was unable to do because CardSpace as a whole didn’t really solve a problem people were facing. The problem Windows Live ran into was how do you share credentials between desktop and web applications without constantly asking for the credentials? I.e. how do you do Single Sign On…
This got me thinking.
What if I stepped this up a smidge? Instead of logging into Windows Live Messenger with my credentials, why not log into Windows itself with my Windows Live credentials?
Yes, Windows. I want to change this:
Question: What would this solve?
Answer: At present, nothing ground-breakingly new. For the sake of argument, let's look at how this would be done, and I'll (hopefully) get to my point.
First off, we need to know how to modify the Windows logon screen. In older versions of Windows (versions older than 2003 R2) you had to do a lot of heavy lifting to make any changes to the screen. You had to write your own GINA which involved essentially creating your own UI. Talk about painful.
With the introduction of Vista, Microsoft changed the game when it came to custom credentials. Their reasoning was simple: they didn’t want you to muck up the basic look and feel. You had to follow their guidelines.
As a result we are left with something along the lines of these controls to play with:
The logon screen is now controlled by Credential Providers instead of the GINA. There are two providers built into Windows by default, one for Kerberos or NTLM authentication, and one for Smart Card authentication.
The architecture looks like:
When the Secure Attention Sequence (CTRL + ALT + DEL / SAS) is invoked, Winlogon switches to a different desktop and instantiates a new instance of LogonUI.exe. LogonUI enumerates all the credential provider DLLs from the registry and displays their controls on the desktop.
When I enter in my credentials they are serialized and supposed to be passed to the LSA.
Once the LSA has these credentials it can then do the authentication.
I say “supposed” to be passed to the LSA because there are two frames of thought here. The first frame is to handle authentication within the Credential Provider itself. This can cause problems later on down the road. I’ll explain why in the second frame.
The second frame of thought applies when you need to use custom credentials, need to do some funky authentication, and then save the associated identity token somewhere. This becomes important when other applications need your identity.
You can accomplish this via what’s called an Authentication Package.
When a custom authentication package is created, it has to be designed in such a way that applications cannot access stored credentials directly. The applications must go through the pre-canned MSV1_0 package to receive a token.
For the Windows Live logon I asked about earlier, we would need to develop two things: a Credential Provider, and a custom Authentication Package.
The logon process would work something like this:
- Select Live ID Credential Provider
- Type in Live ID and Password and submit
- Credential Provider passes serialized credential structure to Winlogon
- Winlogon passes credentials to LSA
- LSA passes credential to Custom Authentication Package
- Package connects to Live ID STS and requests a token with given credentials
- Token is returned
- Authentication Package validates the token and saves it to a local cache
- Package returns authentication result back up call stack to Winlogon
- Winlogon initializes user’s profile and desktop
I asked before: What would this solve?
This isn’t really a ground-breaking idea. I’ve just described a domain environment similar to what half a million companies have already done with Active Directory, except the credential store is Live ID.
On it’s own we’ve just simplified the authentication process for every home user out there. No more disparate accounts across multiple machines. Passwords are in sync, and identity information is always up to date.
What if Live ID set up a new service that let you create access groups for things like home and friends, and you could create file shares as appropriate? Then you could extend Windows 7 Homegroup sharing based on those access groups.
Wait, they already have something like that with Skydrive (sans Homegroup stuff anyway).
Maybe they want to use a different token service.
Imagine if the user was able to select the “Federated User” credential provider that would give you a drop down box listing a few Security Token Services. Azure ACS can hook you up.
Imagine if one of these STSs was something everyone used *cough* Facebook *cough*.
Imagine the STS was one that a lot of sites on the internet use *cough* Facebook *cough*.
Imagine if the associated protocol used by the STS and websites were modified slightly to add a custom set of headers sent to the browser. Maybe it looked like this:
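The mock-up itself isn't reproduced here, but the idea is a response carrying custom headers that advertise what token the site accepts and where to post it. These headers are hypothetical; only Relying-Party-Accepting-Token-Type is named in the text:

```http
HTTP/1.1 200 OK
Content-Type: text/html
Relying-Party-Accepting-Token-Type: urn:oasis:names:tc:SAML:2.0:assertion
Relying-Party-Token-Reply-Url: https://application.example.com/token
```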
Finally, imagine if your browser was smart enough to intercept those headers, look up the user's token, check whether it matched the "Relying-Party-Accepting-Token-Type" header, and then POST the token to the given reply URL.
Hmm. We’ve just made the internet SSO capable.
Now to just move everyone’s cheese to get this done.
I’ve always found Kerberos to be an interesting protocol. It works by way of a trusted third party which issues secured tickets based on an authentication or previous session. These tickets are used as proof of identity by asserting that the subject is who they claim to be. Claims authentication works on a similar principle, except instead of a ticket you have a token. There are some major differences in implementation, but the theory is the same. One of the reasons I find it interesting is that Kerberos was originally developed in 1983, and the underlying protocol called the Needham-Schroeder protocol, was originally published in 1978.
There have been major updates over the years, as well as a change to fix a man-in-the-middle attack in the Needham-Schroeder protocol in 1995, but the theory is still sound. Kerberos is the main protocol used in Windows networks to authenticate against Active Directory.
The reason I bring it up is because of a comment I made in a previous post. I made an assertion that we don’t necessarily abstract out the identity portion of our applications and services.
Well, it occurred to me that up until a certain period of time, we did. In many environments there was only one trusted authority for identity. Whether it was at a school, in a business, or within the government, there was no concept of federation. The walls we created were for a very good reason. The applications and websites we created were siloed and the information didn't need to be shared. As such, we created our own identity stores in databases and LDAP directories.
This isn’t necessarily a problem because we built these applications on top of a foundation that wasn’t designed for identity. The internet was for all intents and purposes designed for anonymity. But here is where the foundation became tricky: it boomed.
People wanted to share information between websites and applications, but the data couldn’t be correlated back to the user across applications. We are starting to catch up, but it’s a slow process.
So here is the question: we started with a relatively abstract process of authentication by way of the Kerberos third party, and then moved to siloed identity data. Why did we lose the abstraction? Or more precisely, during this boom, why did we allow our applications to lose this abstraction?
Food for thought on this early Monday.
Over on the Claims-Based Identity blog, they announced a whitepaper was just released on using ADFS 2 to federate with Shibboleth 2 and the InCommon Federation. I just started reading through it, but it looks really well written.
Here is the abstract of the paper itself:
Through its support for the WS-Federation and Security Assertion Markup Language (SAML) 2.0 protocols, Microsoft® Active Directory® Federation Services 2.0 (AD FS 2.0) provides claims-based, cross-domain, Web single sign-on (SSO) interoperability with non-Microsoft federation solutions. Shibboleth® 2, through its support for SAML 2.0, enables cross-domain, federated SSO between environments that are running Microsoft and Shibboleth 2 federation infrastructures.
You can download the whitepaper in .docx format.
What is Shibboleth?
The Shibboleth System is a standards based, open source software package for web single sign-on across or within organizational boundaries. It allows sites to make informed authorization decisions for individual access of protected online resources in a privacy-preserving manner.
What is InCommon?
InCommon eliminates the need for researchers, students, and educators to maintain multiple passwords and usernames. Online service providers no longer need to maintain user accounts. Identity providers manage the levels of their users' privacy and information exchange. InCommon uses SAML-based authentication and authorization systems (such as Shibboleth®) to enable scalable, trusted collaborations among its community of participants.
When SharePoint 2010 was developed, Microsoft took extra care to include support for a claims-based identity model. There are quite a few benefits to doing it this way, one of which is that it simplifies managing identities across organizational structures. So let's take a look at adding a Security Token Service as an Authentication Provider to SharePoint 2010.
First, Some Prerequisites
You have to use PowerShell for most of this. You wouldn't/shouldn't be adding too many providers to SharePoint all that often, so there isn't a GUI for this. Also, the claims that SharePoint will know about must be known during setup. This isn't that big a deal, but…
Telling SharePoint about the STS
Once you've collected all the information you need, open up PowerShell as an Administrator and add the SharePoint snap-in on the server.
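That snap-in is the standard SharePoint one:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell
```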
Next we need to create the certificate and claim mapping objects:
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("d:\path\to\adfsCert.cer")
$claim1 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.microsoft.com/ws/2008/06/identity/claims/role" -IncomingClaimTypeDisplayName "Role" -SameAsIncoming
$claim2 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming
There should be three lines. They will be word-wrapped.
The certificate is pretty straightforward: it is the public key of the STS. The claims are also pretty straightforward. There are two claims: the roles of the identity, and the email address of the identity. You can add as many as the STS will support.
Next we define the realm of the Relying Party, i.e. the SharePoint server.
$realm = "urn:" + $env:ComputerName + ":adfs"
By using a URN value you can mitigate future changes to addresses. This becomes especially useful in an intranet/extranet scenario.
Then we define the sign-in URL for the STS. In this case, we are using ADFS:
$signinurl = "https://[myAdfsServer.fullyqualified.domainname]/adfs/ls/"
Mind the SSL.
And finally we put it all together:
New-SPTrustedIdentityTokenIssuer -Name "MyDomainADFS2" -Description "ADFS 2 Federated Server for MyDomain" -Realm $realm -ImportTrustCertificate $cert -ClaimsMappings $claim1,$claim2 -SignInUrl $signinurl -IdentifierClaim $claim2.InputClaimType
This should be a single line, word-wrapped. If you wanted to, you could just call New-SPTrustedIdentityTokenIssuer and then fill in the values one at a time. This might be useful for debugging.
At this point SharePoint now knows about the STS, but none of the sites are set up to use it.
Authenticating SharePoint sites using the STS
For good measure, restart SharePoint/IIS. Go into SharePoint Administration, create a new website, and select Claims Based Authentication at the top:
Fill out the rest of the details and then, when you get to Claims Authentication Types, select Trusted Identity Provider and then select your STS. In this case it is my ADFS server:
Save the site and you are done. Try navigating to the site and it should redirect you to your STS. You can then manage users as you would normally with Active Directory.