Outrageous Claims: SP Claims and Auth Will Lag Behind the Industry

Principal Architect Thomas Carpe shares his thoughts and opinions on the state of the art in SharePoint security, including predictions about things to come. This blog post is part of a continuing series leading up to and following the official launch of Liquid Mercury Solutions' new product Beowulf Identity Server.

Okay, this isn't really fair, since it's really more a case of predicting the present.


To be honest, I was completely caught off guard back in 2013 when the new version of SharePoint was released into the wild without even mediocre support for basic things like FederationMetadata.xml, token encryption, or a half-decent people picker for claims. I'd previously assumed that developing anything in this area was a lost cause, since Microsoft could easily catch up, and whatever they implemented would inevitably become the standard.

Seems that I was mistaken about where they'd put their energy, and this got me thinking about why SharePoint, which was among the first Microsoft products to fully embrace the claims authentication model, would be so slow to mature.

First thing that comes to mind is that SharePoint really suffers from early adopter syndrome. Back in 2010, when claims authentication was still pretty new, SharePoint was one of the first products to implement its own Secure Token Service. Unlike other web applications that can be easily adapted to use an external claims service, this STS still serves as the backbone of SharePoint security to this day, even when external providers are in the mix. At that time it was built with a still-beta version of Windows Identity Foundation. Likewise, when SharePoint 2013 was developed, it was one of the first MS products built on .NET 4.5 - but at that point WIF still hadn't been fully integrated into the .NET Framework, though parts of it had.

Lately I've been doing a lot of digging around in SharePoint's STS using Reflector, and I can see that a lot of design choices were made here without interoperability or extensibility in mind.

Just as one example, let's take the relationship between SPTrustedLoginProvider and the STS itself. Leaving aside the unusual naming convention (sometimes it's a trusted identity token issuer and other times it's called a login provider), it's interesting to note that much of the information actually needed to federate with another provider isn't part of this object, but has to be read from the STS itself. Compare this design with ADFS, which also serves as a kind of STS but provides a Relying Party configuration structure, wherein practically everything that you need to form a relationship between ADFS and another server is stored in one location.

Additionally, a lot of critical functionality here is internal and sealed. While I have never been shy about using reflection to invoke critical methods where needed, this is going to make life difficult for anyone who wants to develop capabilities that require these functions. Just from a support perspective alone, it means that you can't count on Microsoft not to change these functions later on - though from the look of things most of this stuff has not changed much in the past few years. IMHO, MS would do well to open up some of these classes and methods, since sealing them doesn't really provide much in the way of code security anyway. Until they do, it will always be a race to make sure that any patches they release don't radically change things.

Finally, the last reason that I think MS will continue to lag behind others in terms of supporting claims in SharePoint comes down to one simple thing. Microsoft's SharePoint strategy is cloud-first, and the fact is that what federation they needed to support SharePoint Online access via WAAD and externally shared MS accounts has already been implemented. Plus, they have their roadmap in place for SSO using ADFS. So, in essence, they have no impetus to make major improvements to the way this is being done. Sure, there'll continue to be improvements in the API for apps, client side code, etc. But don't expect future versions of SharePoint to be oriented around major usability enhancements for authentication - at least until there's something in it for Microsoft.

This op-ed piece is by no means the end of the story. What experiences have you had with configuring SharePoint security, and do you agree or disagree that a lot of ground will continue to be left uncovered? Leave your opinion in the comments.

Outrageous Claims: SP Advanced Security Config Will Get Easier

Principal Architect Thomas Carpe shares his thoughts and opinions on the state of the art in SharePoint security, including predictions about things to come. This blog post is part of a continuing series leading up to and following the official launch of Liquid Mercury Solutions' new product Beowulf Identity Server.

I feel a bit like Thomas Veil from Nowhere Man when I find myself saying "I know they will. They have to." I guess what I'm really trying to say here is that implementing security configurations for SharePoint is still too difficult.

Take for example the blog post from Wictor Wilen on setting up SharePoint 2013 with Thinktecture Identity Server. It's a great article, but it's typical of a configuration between two identity products: there are a ton of settings to consider, and some of it can only be done through complicated PowerShell commands.

Likewise, our own product Beowulf Identity Server has faced similar challenges in early deployments. The product is great; however, there are still reams of documentation on how to set it up. Don't get me wrong; I'm all for having complete documentation. Still, you know you're in for it when one of the first things you need to tell folks is the laundry list of skills they're likely to need to configure your product.

So when I say that advanced SharePoint security will get simpler, understand that where we're starting from is truly very complicated. As the demand for more security-focused installations grows, those companies that thrive in this space will need to find creative ways to do more with the resources they have on hand in what is already a pretty tight labor market for a niche skill set. From where I sit, this means making the product easier to install and configure, whether that means creating an MSI package, PowerShell administration commands, a setup wizard, or all of the above.

Further, since some of this complexity comes from the SharePoint side of things, and Microsoft isn't really going to make that easier, the community and vendors will have to pick up the slack. (see Improvements to SharePoint Claims Authentication and Security Will Lag Behind the Industry for reasons why.) 

Wizards and installers can give you a basic set of options that will work for most customers with typical needs, but they can't tell you what is the best practice in your particular circumstance. It's important to remember that wherever you find a security wizard, you'll probably find a security loophole there too. Let's just hope that people will do the right thing and not rely on self-signed certificates and other default settings. However, I would not bet the SharePoint farm on this being the case. 

At the end of the day, IT security itself isn't going to get any easier. I think we'll see security solutions and products that offer a basic set of turn-key options; anything advanced or unique to your organization will be left for experts to figure out.

 

Real World Claims 2: Provision ADFS and SharePoint Demo Environment

This post is part 2 of a series, and is a companion to my presentation titled Real World Claims in ADFS and SharePoint 2010. This presentation will be given Saturday, April 13th, 12 noon at SharePoint Saturday, The Conference, Washington, DC April 11-13, 2011. Table of contents, additional links, slides, video recording, and other funny liner notes will be posted in Part 1 as they become available.

Planning an Enterprise Class Demo Environment for SharePoint and ADFS
First, I need a demo environment. I know I'm not gonna have much fun trying to show off our cool stuff unless I can squeeze SharePoint, Active Directory, and ADFS onto my laptop somehow. Quite possibly I won't even have network access while I am giving this demo – a big problem for a solution based heavily on Public Key Infrastructure. Plus, a lot has changed in the time since SharePoint 2010 first went RTM back in May of last year. We've had a service pack for Windows 2008 R2 and one for SharePoint too, as well as a major release of ADFS. While we witnessed many bugs and issues over the past year, I want to be sure I'm presenting on the current state of the art. So, I begin by downloading new ISOs for everything. (There doesn't appear to be an image of SharePoint with SP1 included in the install, but no biggie.)

What's the environment going to look like?

Server 1: DEMO\MasterControl
  • Active Directory, ADFS, SQL Server 2008 R2, [optional] Certificate Authority
  • normally you would put these services on multiple machines
  • typically installed behind the company firewall

Server 2: DEMO\SmartyPants
  • IIS, SharePoint 2010, ADFS Proxy
  • would normally be a WFE with an SP application server on another box
  • single server will suffice for this demo
  • if web facing, typically installed in the DMZ

I like this configuration, because it should be easy enough to do with only two VMs, but it will effectively mimic some of the inter-machine network traffic you would normally expect to see in the real world. The presentation is called "Real World Claims" after all. I could potentially re-use this demo environment for a multitude of purposes, including BI solutions or other stuff too.

One thing I want to talk about in a bit more detail here. At the time of this writing, I do not know 100% what the effect of taking these machines off the network will be. There are two things about my plan that concern me somewhat: 1) I am configuring this environment as a sub-domain of my forest, instead of an independent forest, and 2) I am using the certificate authority on the root domain of my forest. I've done some basic research into this, and it is my belief that certificates and CRLs (certificate revocation lists) are cached on the servers for a sufficient period of time such that – if I leave this system up and running at my hotel – I can disconnect it from the internet long enough to deliver my demo without any adverse impact. Time will tell, but later this week I plan to test this before it would be too late to formulate a backup plan. (Actually, my backup plan would be to install a standalone CA on my demo box, but even under those circumstances I am not certain how Kerberos will react.)

Now I need to install all these things. Having done this many, many times I can tell you it isn't all that exciting. Here's the basic rundown.

Step 1: Core Install and Configuration on MasterControl
Not much to say here. Install Windows Server 2008 R2 Standard Edition <yawn />     

 

Can somebody tell me, why the heck is there a close/cancel button on this window? Now we configure the network, install all updates, and rename the machine. Keep in mind that these need to work when disconnected from the Internet. For now, I am using local addresses 10.11.12.80 and 10.11.12.81, but eventually I may switch to 192.168.10.1 and 192.168.10.2.
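If you'd rather script the TCP/IP settings than click through the adapter properties dialogs, something like this works from an elevated prompt. The adapter name and subnet mask here are assumptions for my demo (adjust to whatever your VM shows), and I point DNS at the machine itself since it's about to become the DC/DNS server:

netsh interface ip set address "Local Area Connection" static 10.11.12.80 255.255.255.0
netsh interface ip set dns "Local Area Connection" static 10.11.12.80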

Next, run dcpromo to promote the server to an Active Directory domain controller.

 


In retrospect, I am having second thoughts about whether it was wise to join this demo environment to my existing forest. I guess we'll find out!

 

 

 

Oh, hey, I should probably make a second physical site for these servers.

 


Notice that I made this DC a global catalog server, because I'll be running it some physical distance from the forest.

 

 

 

One other thing I did was configure a new physical site in the domain, because I'm going to run these demo machines on my laptop in a [potentially] disconnected scenario.

Okay, inside joke: the site location named "I'm on a Boat" above is a reference to the song by The Lonely Island featuring T-Pain, sorta similar to "Like a Boss". At Liquid Mercury, we've been talking for over a year about doing a code camp on a Caribbean cruise! And now, you know the reference #nullreferenceexception.

Step 2: Install ADFS 2.0 on MasterControl
After you download and run the ADFS 2.0 RTW package, the installer will ask whether you want to install an ADFS Server or an ADFS Proxy. For this machine we're going to install the server. In this case, we do the GUI install, and we know that ADFS will install to the Windows Internal Database. Later, I'll probably want to upgrade this server to use SQL Server instead (more info: here). Other than those two choices, it's a pretty brain-dead installer. Then, we open the admin tool and do the real work.


After the installer completes, you can find the ADFS 2.0 Management console in the Start Menu under Administrative Tools. To start the Configuration Wizard, click the link shown above next to the [extremely obvious] red arrow.

 

 


Now, personally, I'm a glutton for punishment, so I'm going to install this machine as the first member of a farm, even though I know that for the purposes of my demo it'll be the only machine to host ADFS - and I've effectively limited my load balancing options by using the WID instead of SQL.

 


Here, I'll use the default certificate for the Default Web Site in IIS, which happens to be the domain controller server certificate. Later I can generate a new certificate especially for SSL. The name of the ADFS web site is not glamorous. If this were a production environment, I'd possibly choose a cleaner name like "login.somecompany.com", but this will suffice for now, and I can always change it later.

Next, I create an OU in AD called Services, which is where I'll keep all the service accounts I need to create. Then I create an account in that OU. I call my account "ADFS Service" with logon "DEMO\adfs-service". Keeping this account in its own OU with other service accounts helps me find them quickly and manage Group Policy Objects more easily if I need them later on. (Hint: I will.)
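For the record, the same thing can be scripted with the Active Directory PowerShell module that ships on a 2008 R2 DC. This is just a rough sketch with my demo's distinguished names filled in:

Import-Module ActiveDirectory
# Create the OU that will hold service accounts
New-ADOrganizationalUnit -Name "Services" -Path "DC=demo,DC=colossusconsulting,DC=com"
# Create the ADFS service account inside that OU
New-ADUser -Name "ADFS Service" -SamAccountName "adfs-service" `
    -Path "OU=Services,DC=demo,DC=colossusconsulting,DC=com" `
    -AccountPassword (Read-Host -AsSecureString "Password for adfs-service") `
    -Enabled $true -PasswordNeverExpires $true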

 

 


Finally, we confirm settings and click Finish to run the wizard, and ... <drumroll />

 


Oh crap. Better fix that. :-(

 


Now that the installer is finished, we need to address that error we got. Run these commands to register SPNs for the NetBIOS and DNS names of the ADFS server:

SETSPN -A HTTPS/mastercontrol.demo.colossusconsulting.com DEMO\adfs-service
SETSPN -A HTTPS/mastercontrol DEMO\adfs-service

You also want to trust the ADFS service account for delegation. This is something I often forget to do. To accomplish this, jump into Start > Administrative Tools > AD Users and Computers, find the ADFS service account, and view its properties.

 


Note that the Delegation tab only appears as an option after an account's SPN is defined using the commands shown earlier.
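If you prefer PowerShell to clicking through the Delegation tab, this should be equivalent to ticking the basic "trust for delegation" option (again, assuming the Active Directory module is available):

Import-Module ActiveDirectory
# Trust the ADFS service account for (unconstrained) delegation
Set-ADAccountControl -Identity "adfs-service" -TrustedForDelegation $true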

 


After you complete the wizard, you'll see that IIS now has the virtual directories for /adfs and /adfs/ls under Default Web Site.

Something to keep in mind here, even though it doesn't apply on a new server: if you were doing anything special with Default Web Site that caused you to reconfigure it, the ADFS wizard is just going to drop its stuff right on top of your other changes. Be watchful for this, since they might have conflicting settings.

I'll kick the can down the road a bit farther on the rest of ADFS configuration we'll need to do.

Step 3: Configure a Certificate Authority
I had considered using a standalone CA on my demo domain controller, but eventually I decided it wasn't needed and uninstalled it.

In my case, I'm going to use the Certificate Authority on the root of my forest. This works for me because I already have one set up, and I've tested it a few times with other development environments. I can be reasonably sure it's going to work when it's time to run my demo.

Creating a Certificate Template for ADFS / STS
For the most part, we're able to get by with either existing certificates that were created by AD when our servers were provisioned, or by creating certificates through the interface provided by IIS. But, Secure Token Services (and in particular ADFS) require a few special tweaks that will deviate slightly from the Web Server template provided out-of-the-box in AD Certificate Authority.

1. Specifically, ADFS wants you to use a certificate with a 2048 bit or higher key strength. Depending on your version of Windows and how long the CA has been in service, you might still be generating SSL certificates with 1024 bit keys.
2. As you'll also see, we'll need to be able to export the private keys for this certificate. This is something that is not enabled on the Web Server template.
3. Lastly, the version of your certificate template can make a big difference. 2008-compatible templates allow you to select signature hash algorithms like SHA-512 or MD5, neither of which is supported by ADFS 2.0. In fact, ADFS is extremely fickle about which certificates it is willing to use – as we'll see. So, you need to be careful about the type of template you create, or risk losing a night of sleep to troubleshooting - like I did.


To request and issue such a certificate, we'll need to create a certificate template. Do this by opening Certificate Authority, then right-click Certificate Templates, and choose Manage.

 

 


Find the Web Server template, right-click, and then click Duplicate Template.

You will be asked to choose between Windows Server 2003 Enterprise and Windows Server 2008 Enterprise. This seemingly innocuous dialog box is actually quite insidious.

IMPORTANT: I have no Windows 2003 servers left in my forest, so I normally would have no issue with picking the second option here. However, according to comments left here:

  "If you are generating certificates from AD-CS, make sure to request  the certificates using a template that supports a Windows Windows 2003 Enterprise CA. If you use a Windows Server 2008 CA template, the Federation server will fail to start and report a generic private key error message in the logs (Event ID 133)"

 

"ADFS2 is written in managed c#, using X509Certificates2 - X509Certificates2 don't support CNG (CAPI Next Generation) - this means (by extension) that you can't use certificates which rely on CNG - using 2003 templates as Ross suggests above will ensure you get CAPI keys, not CNG keys."

Actually, what I did here was to create one of each.

I never could get the 2008 version to work, so for the sake of this walkthrough, assume we are using the Legacy STS template, regardless of what you see in the screen capture. ;-)

Creating the 2008 Template: Secure Token Service

 


Pick a good name for the template, "Secure Token Server". It's definitely okay to publish this in AD.

 


Indicate that keys should be exportable, because we'll need this later.

 


Set 2048 bit key to meet ADFS minimum requirement. I have complained before that SHA-1 is a weak hash that should be destroyed. SHA-256 is expensive, but for my demo I don't really care.

Warning: DO NOT USE SHA-512!

From Microsoft: "AD FS 2.0 does not support the use of certificates with other hash methods, such as MD5 (the default hash algorithm that is used with the Makecert.exe command-line tool). As a security best practice, we recommend that you use SHA-256 (which is set by default) for all signatures."

 

Creating the 2003 Template: Legacy STS

 

 


Set 2048 bit key to meet ADFS minimum requirement. Notice there is no Cryptography tab and that you have no opportunity to select the hash algorithm.

 

The other tabs can be left as their default options.

Making the Template Available for Requests
Okay, now back in Certification Authority MMC:

 


Right-click Certificate Templates, once again. This time, click New > Certificate Template to Issue.

 


Choose the template(s) we just created.

If you have multiple CAs in your domain, you should do this for each one that you want to be able to issue this type of certificate.

And that's it. Now, you can request a certificate from your server that will meet the fickle needs of ADFS.
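By the way, if you'd rather not click through the MMC or IIS to request the certificate, certreq can do it from the command line. This is only a sketch: the CertificateTemplate value must match the template's *internal* name (typically the display name with spaces removed - check the template's properties), and I'm assuming "LegacySTS" with my demo FQDN as the subject.

; request.inf
[NewRequest]
Subject = "CN=mastercontrol.demo.colossusconsulting.com"
KeyLength = 2048
Exportable = TRUE
MachineKeySet = TRUE
[RequestAttributes]
CertificateTemplate = LegacySTS

Then generate, submit, and install the certificate:

certreq -new request.inf request.req
certreq -submit request.req mastercontrol.cer
certreq -accept mastercontrol.cer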

Step 4: Install SQL Server on MasterControl
The next step is to install SQL Server 2008 R2 with Reporting and Analysis (no PowerPivot... yet... maybe someday).

 


I wasn't sure whether ADFS had installed a version of SQL Express that would mess with the SQL install, so I did the needful. The SKUUPGRADE=1 command-line switch ensures the installer will not fail if it finds a conflict between an existing SQL version and the one being installed. (Turns out it was totally unnecessary. Oh well.)

I got the usual expected warnings:

  • "Boo hoo! Installing SQL on a domain controller isn't recommended.", and
  • "Hey dipstick, don't forget to configure Windows Firewall so people can connect to your databases!"


Next, I picked the components that I thought we might need.

I created 3 new accounts in the OU I created earlier for service accounts:

  • DEMO\sql-service
  • DEMO\sqlreports-service
  • DEMO\sqlanalysis-service

Next I chose Windows Authentication only, added my admin account, and provided credentials for the service accounts. Finally, I chose to "install but do not configure" Report Server. I want to save that for when I prepare the same machine as a BI demo server at some future date.

Click Next, Next, Next, and watch the blue bars grow!

As a preliminary step, I made some aliases and host records in DNS, in preparation for using this SQL Server with Kerberos and SharePoint. SPDB resolves to the name of the SQL server, in case we need to move the SQL databases later on. (But, because it's a CNAME, we should use SetSPN against the actual A record for the server itself. See more Kerberos stuff farther down in this article for details.)
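In case it's useful, dnscmd can create those records from the command line on the DC. These are my demo names and addresses, obviously; the A record line is just an example of the same syntax for host records:

dnscmd mastercontrol /RecordAdd demo.colossusconsulting.com spdb CNAME mastercontrol.demo.colossusconsulting.com
dnscmd mastercontrol /RecordAdd demo.colossusconsulting.com smartypants A 10.11.12.81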

 


I tested my DNS host names and CNAMEs to be sure they work.

We have to run these two commands to register Service Principal Names for the SQL Server NetBIOS and FQDN names:

setspn -A MSSQLSvc/mastercontrol:1433 DEMO\sql-service
setspn -A MSSQLSvc/mastercontrol.demo.colossusconsulting.com:1433 DEMO\sql-service

 

And don't forget to visit AD afterwards and enable the DEMO\sql-service (SQL Server service account) for delegation. The DEMO\MASTERCONTROL$ (computer account) already has this right because it's a domain controller, but you would need to set it if you installed SQL on a separate server.

Step 5: Install OS on SmartyPants
Having completed the basic installation for our domain controller and database server, we can move on to our web server. This machine will run SharePoint 2010 and an ADFS Proxy.

It's a good thing I did this while I was waiting for other stuff to finish earlier, or I'd be taking a nap now. The steps are essentially identical to Step 1 above. Windows Server 2008 R2 Standard Edition is fine here too. Do your network configuration, rename the machine, and join the server to the DEMO domain.

And, for the love of God, Montresor! Change the time zone on your servers, before it's too late!

Step 6: Install ADFS 2.0 Proxy on SmartyPants
Same comatose installer, except this time we tell it we're installing a proxy instead of a server.

Unlike the domain controller, which already had a certificate, no SSL certificate has been set up on the SharePoint box yet. We need to do that now in the usual way, by requesting a certificate from the domain CA in IIS.
 

 

 


Here your Common Name needs to be the DNS name you will use for the web site. Fill in all the other information as required. (Note: the *correct* value for State is the full name, not the postal abbreviation.)

Little Known Fact(s): There's actually a "Johnson Cave" in the vicinity of Knoxville, TN
ASHPD = Aperture Science Handheld Portal Device

   

   

 
I had a little trouble getting the certificate issued, because my DC's CA wasn't set up to let the admin from the sub-domain request certificates. But, a quick jump over to the CA and I was able to issue the certificate manually and export it to a shared folder where I could use "Complete Certificate Request" (right click on the screen above) to bring it into IIS.

Set the bindings for Default Web Site in IIS to use the new certificate for SSL transactions on port 443.

Now, start the ADFS admin tool and the configuration wizard should begin automatically.

 


Remember, the ADFS 2.0 Management console is located in Start Menu > Administrative Tools.

 


Here, enter the DNS name for the ADFS Server created earlier.

 


At this point, ADFS wants an account to establish a trust relationship. If you make sure that DEMO\adfs-service has admin on MasterControl, then you can use it here. If not, you'll have to fall back on the DEMO\Administrator account instead. :-(

 

I went into Services and changed the account for the ADFS proxy to run as DEMO\adfs-service instead of NETWORK SERVICE. This is just my preference.

 

After finishing the wizard, we still need to go into DNS configuration on the domain controller and create a DNS record for "login.aperturelabs.local"

 

Make sure you update Network adapter properties and set the DNS server for SMARTYPANTS to point to the IP for MASTERCONTROL. Do a couple ping tests to be sure that it works.

Okay. So far, so good.

A Special Note from Captain Hindsight: As I started using the ADFS proxy, I noticed some event messages in my Audit logs telling me that the firewall was blocking connections to the ADFS service. You can head this problem off at the pass by following a similar procedure to allow connections to the ADFS service, the way I describe for the SQL service on the other box. You can read about this in detail under Pre-Requisites for Configuring SharePoint a few sections down from here.
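The quick version is a firewall rule on MASTERCONTROL that lets the proxy reach the federation service. Assuming everything rides over HTTPS on the default port 443 (verify that against your own endpoints), a rule like this does the trick:

netsh advfirewall firewall add rule name="ADFS HTTPS in" dir=in action=allow protocol=TCP localport=443 remoteip=localsubnet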

Now, keep in mind that if you run SharePoint on SSL, you're going to need another IP address on SMARTYPANTS besides the one you already configured, and another SSL certificate to boot! But, we're going to kick that down the road a ways and figure it out later, when we really need it.

 


Note that now IIS has the /adfs and /adfs/ls virtual directories that we should expect.

One benefit to using proxies is that we can put custom branding on each separate proxy, even when we are using a single AD/ADFS server as our identity claims provider.

Step 7: Install SharePoint Server 2010 on SmartyPants


This screen should be familiar to everybody, right?

 


Install Pre-requisites. Even though we've installed every Windows Update and started with Service Pack 1, there are still components missing. Later on, we'll update WIF to the newer version.

 


Later on, I'll need to verify that we have the latest and greatest WIF on here.

Run the SP2010 installer. Have your product key ready at this point.

 


Since we put SQL Server on MASTERCONTROL, we're technically doing a Farm install. I don't know about you, but I don't think I've ever actually hit that top button on this screen.

 


Once again, we need to specify that we are *not* using SQL Express in this scenario.

 


Beyond these steps, the actual installer itself is also quite boring. Tell you what – let me go get myself a cup of coffee while this thing runs.

 


Before we run the configuration wizard, we're going to install Service Pack 1 and the latest CU for SharePoint - so hold off on running the Configuration Wizard for now.

There was an update in June that came out right on the heels of SP1, and there may have been other ones since then. In any case the June CU had a bit of a rocky start and you may need to download a refresh. According to Microsoft, "If you installed a version of the June 2011 cumulative update that you downloaded prior to June 30, 2011, you need to download and install the update refresh from the download center."

The SharePoint Update Center reports the following as of this writing in early August 2011:

  • Latest Service Pack: SP1 (KB 2510690 and KB 2510766)
    • SP Foundation SP1 direct download
    • SP Server SP1 direct download
  • Latest Cumulative Update: June CU Refresh (KB 2553023)
    • SP Foundation CU direct download
    • SP Server CU direct download
  • Latest Public Update: July PU (KB 2582185) download from Windows Update(?)
    On the day I was setting this up, the hotfix request system was temporarily down. Fun!

 


So, a few hours later, I actually had all the files I needed.

There's one thing I see a lot of newbies mess up (and even some people with experience forget this occasionally), so I will underline it in flashing red double-bold typeface: you need to install the update(s) for SharePoint Foundation as well as the ones for SharePoint Server. There, now that I've gotten that out of my system, I can continue.

Step 8: Pre-requisites for Configuring SharePoint
We need to make sure we meet some prerequisites in order for SharePoint to connect to SQL Server. Otherwise, you're going to get stuck at the database screen of the PSCONFIG wizard, shown below.

 

Correct DNS naming, permissions, and open ports are all needed for this to work correctly. (I can't tell you how often a SharePoint install has gone over its time budget because I found out too late that the network admins had everything tightly locked down.)

DNS Guidelines for SQL Server
Here are some tips on how to go about providing a name for the SQL server.

  • Earlier, when I installed SQL Server, I created an alias called SPDB. This was so we can redirect it later if needed - for example, if we want to create a third VM dedicated to SQL Server.
  • The SPNs for such DNS based pseudo-aliases should point to A (host) records, not CNAMEs. If you use a CNAME to define your pseudo-alias for SQL, set your SPN to the A (host) record. (See my notes about Kerberos below.)
  • In this case, since I'm not changing SQL port numbers, I can create this alias in DNS alone. If you use custom ports, you'll need to create a *real* SQL alias using the SQL Server Configuration Manager. To do so requires installing the SQL client connectivity tools on the SharePoint machine

Creating Service Accounts in AD
I created more accounts in AD:

  • DEMO\spdba
  • DEMO\spservice
  • DEMO\spadmin

DEMO\spdba is my SharePoint database access account that I'll use when I run PSCONFIG. I'll use the other accounts later on for the default services and farm admin accounts.

Granting Correct Security Roles in SQL Server
For the install wizard to run, I need to give the database access account both the dbcreator and securityadmin server roles in SQL Server. To add a login and grant the correct permissions, open SQL Management Studio on MASTERCONTROL, then connect to the server and expand Security > Logins.
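If you'd rather script this part than click through Management Studio, a sqlcmd one-liner on MASTERCONTROL does the same thing. This is just a sketch using my demo account name:

sqlcmd -S mastercontrol -E -Q "CREATE LOGIN [DEMO\spdba] FROM WINDOWS; EXEC sp_addsrvrolemember 'DEMO\spdba', 'dbcreator'; EXEC sp_addsrvrolemember 'DEMO\spdba', 'securityadmin';"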

 

 

 

Opening SQL Ports on the Firewall
We also need to make a rule in Windows Firewall on MASTERCONTROL to let our SQL traffic go from SharePoint to SQL Server. You'll recall that earlier we got a warning about this. Do this under Control Panel > Windows Firewall > Advanced Settings (look in the links on the left hand side).



We need to add a rule under Inbound Rules to let SQL traffic into the machine.

 

 


Choose All Programs and hit the Customize button next to Services.

 


Find the MSSQLSERVER service. Optionally, you can repeat this process to make rules for other SQL services you want open.

 


This is just a demo, so keep it simple, stupid.

 


Hit Add button for the second pick-list to add a range.

 


Limiting access to the local subnet reduces our exposure while requiring fewer configuration changes later on.

 


Allow all traffic to open this service completely. Since we're making this rule to allow just the local network access, that should be secure enough.

 


Limiting access to the domain network should suffice.

 


Give your rule a name and explanation.

Ports may need to be opened in your hardware firewall as well. The way I set up the Windows Firewall here will work regardless of the server's port settings.
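For the command-line inclined, a roughly equivalent rule (scoped by port rather than by service, which is a slightly different trade-off than the wizard steps above, and assuming SQL stays on the default 1433) looks like this:

netsh advfirewall firewall add rule name="SQL Server TCP 1433 in" dir=in action=allow protocol=TCP localport=1433 remoteip=localsubnet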

Step 9: Configure SharePoint
Run the PSConfig.exe configuration wizard from Start > SharePoint 2010 Products and Technologies > SharePoint 2010 Configuration Wizard.

 

 


This is the essential screen I talked about earlier. The server name, firewalls, user account, and permissions have to be right in order to get past this point.

 


Don't forget to store your farm passphrase someplace safe, like in KeePass or an unencrypted text file on the root of C drive. ;-)

 


I want Kerberos to work, so I chose Negotiate and configured a custom port number (53281) instead of a random one. A fixed port makes it easier to assign an SPN to the Central Administration web site. However, before I do, I need to confirm that I know the account that'll be used as its application pool.

Little known fact: 53280 and 53281 are the addresses for the screen border and background colors on the Commodore 64 home computer. I use these because since early childhood I've had thousands of these now useless trivialities memorized, so I prefer to put them to work when assigning ports.

 


Continuing, you'll get the "well, duh" warning at this point. Umm, yeah. Pretty sure I just said basically the same thing a couple paragraphs ago. :-) I know we are doing things the hard way, as I'll explain later.

 

 


And it's time to refill my coffee!

Oh yeah, one other thing. You have to do an IISRESET at this point too. Although in a moment you'll see why we'll have to do a lot more than that. ;-)

Step 9(a): Configuring and Testing Kerberos
This is where you hit the roof, since you just locked yourself out of Central Admin. ;-) When the SharePoint configuration wizard finishes, it will report success. However, when it takes you to the Central Administration web site, you'll try to log in 3 times, and it'll fail into a blank browser page. This is because Kerberos is not properly set up for the Central Administration web site.

If you are desperate, one quick workaround that is *very* temporary, and also not a very good idea, is to turn on Kernel Mode Authentication in IIS for the Central Admin site. It'll get you into the site, and earn you a lecture from Spencer Harbar. ;-)

Another thing you can do is use PowerShell to turn off Negotiate/Kerberos and revert to NTLM. (At the time of this writing, I looked for such a script and couldn't find one.) By the way, I also like the suggestion that some people have made, to set up Central Administration for the first time in NTLM mode, and then extend the web application into a new site that uses Kerberos. That way, you always have the NTLM version to fall back on if Kerberos fails, and you can use the extended application to configure Central Admin as a load balanced site. As cool as that is, it's totally unnecessary for this demo, so I'm saving it for some other time.
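Since I couldn't find a ready-made script for the NTLM fallback, here's a rough sketch of what one might look like. I haven't tested this against my demo farm, so verify the -AuthenticationMethod behavior on a throwaway environment before relying on it:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
# Find the Central Administration web application
$ca = Get-SPWebApplication -IncludeCentralAdministration | Where-Object { $_.IsAdministrationWebApplication }
# Switch the Default zone from Negotiate (Kerberos) back to NTLM - untested sketch
Set-SPWebApplication -Identity $ca -Zone Default -AuthenticationMethod NTLM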

Configure the SPN for Central Admin
A better idea (at least in the long run) is to get Kerberos working, starting with the SPN setup.

 


Let's go into IIS and figure out the application pool account.

 


Go to Advanced Settings for the Central Administration web site, then look for Identity property.

So we have our answer there; it's the same account we put into the configuration wizard a few minutes ago for SharePoint database access.

I ran these commands to make an SPN for the CA web site:
setspn -A HTTP/smartypants:53281 DEMO\spdba
setspn -A HTTP/smartypants.demo.colossusconsulting.com:53281 DEMO\spdba
setspn -A HTTP/smartypants DEMO\spdba
setspn -A HTTP/smartypants.demo.colossusconsulting.com DEMO\spdba

The first time I did this, I only used the first two commands, and it didn't work the way I'd intended. What followed was an exercise in frustration that shouldn't surprise anyone who's ever configured Kerberos before – or tried. After installing KerbTray and WireShark on both servers, several reboots, and hours of lost productivity later, I finally got enough information out of the system to tell me why I was unable to log in to Central Admin. Ultimately, though I tried several things, I couldn't log in until I added the last two lines (the final two shown above), which correspond to port 80. Pretty weird, I must admit. (I should do some research to figure out why that might be.)

 

Once you finally do get into Central Admin, don't forget to add an AAM for the FQDN. :-)
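If you'd rather script the AAM than click through the Alternate Access Mappings page, something like this should do it (the URLs are just my demo values; adjust to match your farm):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
# Register the FQDN as an internal URL in the Default zone for Central Admin
New-SPAlternateURL -WebApplication "http://smartypants:53281" -Url "http://smartypants.demo.colossusconsulting.com:53281" -Zone Default -Internal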

In a little while, I'll have to configure more of these SPNs for the SharePoint web sites, but I haven't created those application pool accounts yet – all in good time.

Other Stuff I Tried That Helped - Maybe
As an additional step during my troubleshooting process, I changed the following as described on MSDN Blogs, in C:\Windows\System32\inetsrv\config\applicationHost.config under the system.webServer > security > authentication section:

<windowsAuthentication enabled="true" useAppPoolCredentials="true">
  <providers>
    <add value="Negotiate" />
    <add value="NTLM" />
  </providers>
</windowsAuthentication>

Is it really necessary? I'm not sure, since by itself it didn't fix the problem I was trying to troubleshoot.

Note that in the above case, the command described in the MSDN blog article failed:

C:>c:\windows\system32\inetsrv\Appcmd set config "SharePoint Central Administration v4" /section:windowsauthentication /useAppPoolCredentials:true /commit:MACHINE
ERROR ( message:Configuration error
Filename: \\?\C:\inetpub\wwwroot\wss\VirtualDirectories\12592\web.config
Line Number: 0
Description: The configuration section 'system.webServer/security/authentication/windowsAuthentication' cannot be read because it is missing a section declaration. )

I also added reverse DNS entries for both machines. Maybe it helped, maybe not. But, certainly this is something you should do.

Also, if I haven't mentioned it yet (I'm pretty sure I did), make sure all your servers are set to the same time zone and have their time set correctly. Windows installs default to Pacific Time (Redmond-centric as Microsoft is), so if the rest of your domain is running in EST that won't help. Time synchronization issues are a common cause of Kerberos failures.
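For what it's worth, both fixes are one-liners from an elevated prompt (Eastern time in my case); run them on every server:

tzutil /s "Eastern Standard Time"
w32tm /resync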

Don't forget to visit AD and trust the DEMO\spdba and DEMO\SMARTYPANTS$ accounts for delegation like before. I'll note here that Spence says this isn't needed, and he's probably right. Frankly, I had enough trouble getting Kerberos working to begin with. This is just a demo environment! (Later, I plan to experiment with this and update this article accordingly.)

Another Side Note about Kerberos
Many folks - who are much more experienced than I am on this topic - report that use of CNAMEs causes problems with Kerberos. They might very well be right. However, I haven't had any issues come up yet. If I learn different, I'll let you know.

My rule of thumb is to assume that Kerberos (or maybe it's Internet Explorer) will resolve a CNAME and replace it with the underlying A record's DNS name *before* completing the process that involves looking up the SPN. So, from Kerberos' point of view, it's like the CNAME doesn't even exist. Thus, if you use a CNAME in DNS, don't use it when defining your SPNs; use the A record that it points to instead.

I've never tried this with IE 6; IE 6 sucks, so if this doesn't work that's honestly just one more reason to stop using it forever.

And, if it isn't obvious, I'm certainly not suggesting that you should point 10 SharePoint sites with host headers to a single A record, because then you'd either end up with 10 sites using one application pool - or duplicate SPNs that just won't work. Use A records for your web sites; my CNAME trick is something I do pretty much just for the SQL Server.
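While you're troubleshooting, setspn can also show you what's actually registered. These are query-only commands, so they're safe to run anywhere:

setspn -L DEMO\sql-service
setspn -Q MSSQLSvc/mastercontrol.demo.colossusconsulting.com:1433
setspn -X

-L lists everything registered to an account, -Q checks whether a specific SPN exists and which account holds it, and -X hunts for duplicate SPNs across the domain.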

Further Reading for the Truly Masochistic
Getting Kerberos to work is a pretty common problem, and troubleshooting Kerberos is so far outside the scope of this article, it isn't even funny. If you need help, here are some resources I used to fix it.

  • Everything Spencer Harbar Has Ever Written on Kerberos (that man is a SharePoint god!)
  • Troubleshooting Kerberos in a SharePoint environment (Part 1)
  • Troubleshooting Kerberos in a SharePoint environment (Part 2)
  • Troubleshooting Kerberos in a SharePoint environment (Part 3)
  • Troubleshooting the Kerberos error KRB_AP_ERR_MODIFIED

Step 10: Setting up SharePoint Services
Now, finally we can get to Central Administration to run the [second] configuration wizard.

 

Then, click on Launch the Farm Configuration Wizard (It's the only option shown, so IMHO it wasn't worth the screen shot.)

 


Here I entered the credentials for the "DEMO\spservice" account, and just left all the checkboxes in their default state.

 


<Yawn /> So much time has been wasted fixing Kerberos that it's now dark outside - time to grab a cold beer and soldier on into the night.

 


Success! But I bet that User Profile Synchronization Service probably *still* won't work ;-)

We've had a long day installing a whole bunch of software and dealing with the little issues that came up along the way. But, we've got all the groundwork laid out for the good stuff to follow! Tomorrow, I'll go into detail on how to provision the SharePoint web application(s) needed for the demo to work.

Real World Claims Part 3: Setting Up the Demo Web Sites

This post is part 3 of a series, and is a companion to my presentation titled Real World Claims in ADFS and SharePoint 2010. This presentation will be given Saturday, April 13th, 12 noon at SharePoint Saturday, The Conference, Washington, DC April 11-13, 2011. Table of contents, additional links, slides, video recording, and other funny liner notes will be posted in Part 1 as they become available.

 

In the previous installment, I laid the groundwork for a two server demonstration environment. We installed the OS, Active Directory, ADFS, SQL Server, an ADFS Proxy, SharePoint, and even got Kerberos configured. We have almost everything needed to start demonstrating the real power of Claims Based Authentication.

However, our battle station is not yet fully operational. We need to set up some sites, deploy some custom code, and configure everything before we can pass the BBQ sauce.

Clearing the Way for the Future


Now, I have a personal grudge against the settings used by the wizard, so the first thing I'll do is delete the "SharePoint – 80" web application. I didn't have to. It just felt good.

Setting an Application Pool Account


I want to create a new managed account for the web site's application pool account, DEMO\ap-aperture. This time, I want SharePoint to manage the password for me.

But this blew up in my face. I got an error saying "Access denied. Only machine administrators are allowed to create administration service job definitions of type", which is funny because I was logged in as DEMO\Administrator and Domain Admins are included in the SMARTYPANTS\Administrators group. C'est la vie! I add the account explicitly.

But, now I get "Account DEMO\ap-aperture has already been registered" and then the dreaded "Object reference not set to an instance of an object" error when I go to the managed accounts page! My managed account exists, and it's corrupted! We no worries I guess; been there done that. You need PowerShell to back it out.

$a = Get-SPManagedAccount -Identity "DEMO\ap-aperture"
$a.Delete()

I get back into the web page and do my thing. And I get the error AGAIN. :-( I add DEMO\spdba and DEMO\spservice to the Administrators group and prepare to try again.

This is why I like creating these accounts in PowerShell to begin with. To heck with the managed accounts web page!

$cred = Get-Credential
New-SPManagedAccount -Credential $cred
$i = Get-SPManagedAccount -Identity "DEMO\ap-aperture"
Set-SPManagedAccount -Identity $i -AutoGeneratePassword

If you run this on a clean farm and domain you'll get the "The password does not meet the password policy requirements…" error. You need to make a new GPO for your service accounts OU in Active Directory to disable the "minimum password age" policy. Do a "gpupdate /force" on the servers to make this change take effect.

 

 

 

 

On yet another side note, for your managed accounts with automated password change, you need to make sure you leave these two boxes unchecked in Active Directory. If you don't, bad things will happen! Maybe not today, maybe not tomorrow, but soon, and when they do you'll have a hell of a time recovering your corrupted list of managed accounts.

More info at:

The Sean Blog Updating Passwords on SharePoint 2010
Configure Automatic Password Change (SharePoint Foundation 2010)

Planning the Site Configuration


Before we create a new site that will make use of ADFS, we need to make some configuration changes to our environment. Why? Because claims-based authentication through any Secure Token Service, whether it's SharePoint or ADFS, requires SSL to be secure. We already used port 443 for the only IP we had when we set up the ADFS Proxy.

Let's take a step back for a second and think about the architecture we're trying to configure.

I want to demonstrate the purpose of using Claims in the real world. One such purpose is for extranets, where you don't necessarily want to give external users accounts in Active Directory – like that contractor guy, who's probably a spy sent over by Black Mesa to infiltrate our network! Wait, maybe I'm getting a bit paranoid, but it'd be good at least to not have to manage accounts for all those people who don't even work here. All those "forgotten" passwords – let 'em be someone else's problem. Yeah!!

To accomplish this, I need a web site that will be exposed on the public Internet at the address http://www.aperturelabs.local (Yes, I know this address won't work in the real Internet, silly! It's a demo.) When the user logs into the site, I want them to be directed to the ADFS Proxy which will be hosted as https://login.aperturelabs.local and will act as the front facing web site for the ADFS server https://mastercontrol.demo.colossusconsulting.com. Once the user is logged in, they can't stay at the public, unencrypted, site anymore. At this point, I'd like them to be sent to https://www.aperturelabs.local.

Why do we start at an unencrypted public site and then move to a site that uses SSL after logging in? Well, there are two reasons. In this demo, I want to show that a single site can serve double duty as both an extranet site and a public facing web site. There are certainly times when you would not want to do this, and instead you can just build these two sites using completely separate content. The second reason is that claims based authentication is not secure unless the site uses SSL.

Also, I'd like to have a version of the site that doesn't rely on ADFS or SSL at all, so that older technologies (like WebDAV) can still work when needed, and so I can still access the site if everything in ADFS goes pear shaped at some future point. I'll host that one at http://internal.aperturelabs.local, and assume for this demo that the network is locked down to prevent access to it from outside the Aperture Science Enrichment Center.

Now, I understand this is the first time you're hearing any of this. That's because I only thought of it just now. Otherwise, I'd have set things up the right way to begin with. But I think you'll agree it helps to make what we're trying to cook here a bit more clear.

There are some problems with this design, like the fact that ADFS is going to light-your-lemons-on-fire-and-throw-them-back-at-you if you tell it to return the user to a site that's different than the one they left in the first place. That's a limitation of the architecture we had to overcome. But, I'm getting ahead of myself. I'll cover those details when we get to talking about the code.


Table 1: a chart showing what the general configuration of the sites will look like

On the Liquid Mercury Solutions web site, we happen to use a similar configuration. However, the SSL web site uses a noticeably different DNS name, extranet.liquidmercurysolutions.com, so it's much more obvious to the user that they've logged into a secure site. Here I'm favoring simplicity, or at least the appearance of it.

Okay. So remember in Part 2 when I said we'd need more IP addresses later? No? Well, too bad. Go to Control Panel > Network > Change Adapter Settings > Local Area Connection > Properties > TCP/IP IPv4 > Properties > Advanced. Here we need to add an additional IP address. For the time being I will use 10.11.12.82.

 


Our plan also requires that we update DNS. Here is what it should look like now.

Creating Our New Web Applications


Before I begin, it'd be a good idea to create SPNs for this application pool account and DNS name, so Kerberos will work on the new site we're going to create.

setspn -A HTTP/www.aperturelabs.local DEMO\ap-aperture
setspn -A HTTPS/www.aperturelabs.local DEMO\ap-aperture

To create the web application, I go to Central Administration > Manage Web Applications > Ribbon > New. We're going to start by creating the secure site at "www". Why? I don't know exactly, but if I think of a reason I'll put it here… Oh yeah! I remember now; it's because you can't pick anything except the Default zone when you make the first site for a web application, and I want to make the authenticated site the default site.

 

 

 

 

And so on, and so on…. Ugh…

 

(From here down, the default values are fine.)

 

Thank you SharePoint. You broke SnagIt. :-(

 

Next, I created the site collection at the root of the application. I used the Enterprise Wiki publishing template. This seems like a likely scenario for a company portal. :-)

 

Of course you have to add this site to Trusted Sites in IE.

 

Next, I changed the master page from v4.master to nightandday.master. I'd really like to crank out some custom branding for my demo. For now, I'll have to make do with just a couple enhancements. We have a lot we still need to do.

To summarize the above in a copy-pasta friendly format:

  • New IIS Web Site Name: "SharePoint - www.aperturelabs.local (443)"
  • Port: "443"
  • Host Header: "www.aperturelabs.local"
  • Path: This should be auto-generated for you, ending in www.aperturelabs.local443
  • Allow Anonymous: No
  • Use SSL: Yes
  • Enable Windows Authentication: Checked (Yes, we're not going to let users log into this site, but I have a good reason to leave this active that I will share later.)
  • Integrated Windows Auth: Negotiate/Kerberos
  • Trusted Claims Providers: We have none to pick from yet. Return to this later.
  • Sign-in Page: Use the default sign-in page. We'll change this later on.
  • Public URL: "https://www.aperturelabs.local"
  • Zone: Default

A few tweaks to the UI, and we have ourselves the beginning of a little corporate intranet!

I'll come back later and snazz it up a bit, time permitting.
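For reference, here's roughly what that same web application looks like if you script it instead of clicking through Central Admin. Treat this as a sketch: the application pool name is made up, and you should double-check the authentication provider options against your own requirements before running it.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
# Claims-based Windows authentication; leaving out -DisableKerberos should keep Negotiate (Kerberos) on
$ap = New-SPAuthenticationProvider -UseWindowsIntegratedAuthentication
New-SPWebApplication -Name "SharePoint - www.aperturelabs.local (443)" `
    -HostHeader "www.aperturelabs.local" -Port 443 -SecureSocketsLayer `
    -Url "https://www.aperturelabs.local" `
    -ApplicationPool "AperturePortal-AppPool" `
    -ApplicationPoolAccount (Get-SPManagedAccount "DEMO\ap-aperture") `
    -AuthenticationProvider $ap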

 

Okay, obviously, I need to do something about that SSL certificate. We can't be getting a bunch of security warnings in our presentation.

After trying and failing to generate a domain certificate from IIS for the second time, I've decided to add DEMO\Domain Admins to the security settings for the Web Server certificate template on my domain controller for the root domain of the forest. Perhaps I should install a CA on the DEMO domain controller? Time will tell. I can always reissue my certificates if I need to.

 


The important setting here is that CN must match your web site's DNS name.

Changing the security setting works, and I change the certificate for the site in IIS. I realize too late I typed the wrong CN into the request, and have to remove it from IIS, revoke the erroneous cert, and re-request a correct one. Finally, the security warnings are gone.

I can't say this often enough. Public key infrastructure is the key to getting claims based solutions to work. If you can't figure out how to generate valid certificates, don't understand what a chain of authority is, or just generally are a bit fuzzy on PKI (the infrastructure behind SSL), then you are going to have a heckuva time working with Claims.

Now that I have the core web site set up, I need to do some work to set the other supporting sites up. This is done in Central Admin > Manage Web Applications, except instead of hitting New in the ribbon to create a new web application, we're going to click the link for the www.aperturelabs.local web application (so it's highlighted). Then, click Extend in the ribbon.

The dialog is very similar to the New Web Application dialog, but with fewer options, since many settings must be the same across all the web sites for a single web application.

  • New IIS Web Site Name: "SharePoint - www.aperturelabs.local (Public)"
  • Port: "80"
  • Host Header: "www.aperturelabs.local"
  • Path: This will be auto-generated for you.
  • Allow Anonymous: Yes
  • Use SSL: No
  • Enable Windows Authentication: Checked (Yes, we're not going to let users log into this site, but I have a good reason to leave this active that I will share later.)
  • Integrated Windows Auth: Negotiate/Kerberos
  • Trusted Claims Providers: We have none to pick from yet. Return to this later.
  • Sign-in Page: Use the default sign-in page. We'll change this later on.
  • Public URL: "http://www.aperturelabs.local"
  • Zone: Internet

 

And, the same process for the internal-only web site.

  • New IIS Web Site Name: "SharePoint - internal.aperturelabs.local"
  • Port: "80"
  • Host Header: "internal.aperturelabs.local"
  • Path: This will be auto-generated for you.
  • Allow Anonymous: No
  • Use SSL: No
  • Enable Windows Authentication: Checked
  • Integrated Windows Auth: Negotiate/Kerberos
  • Trusted Claims Providers: We have none to pick from yet. Return to this later.
  • Sign-in Page: Use the default sign-in page. We'll change this later on.
  • Public URL: "http://internal.aperturelabs.local"
  • Zone: Intranet


Now go into IIS and edit the bindings for www.aperturelabs.local and Default Web Site as follows:


Table 2: SSL enabled web site bindings

 


When you're done, it should look like this. You won't need to adjust any bindings for the other sites, because they will use the host headers defined when you created/extended the application.

Browse to each web site to test that it is working properly. You'll note that if you browse to the SSL web site from within IIS, it takes you to https://10.11.12.82 instead of https://www.aperturelabs.local. That's okay. It's a normal thing that happens when IIS doesn't have any host headers configured for a web site, and you can ignore it. Just enter the URL into your browser manually.

But, our Public site still challenges us to log in. We need to do something about that.

Go to Web Application Management, click on the web application, and then click Anonymous Policy in the Ribbon. Change the settings for the Internet zone to Deny Write as shown.

 

You could leave this unrestricted, especially if you want to open up blog comments to the public or something similar. But doing so requires that you consider the security implications of what you're doing. Otherwise, some random Google surfer might inadvertently disable GlaDOS's morality core and enable her deadly neurotoxin emitters - and we can't have that.

Let's enable anonymous access to the public facing site. We can do this within the new web site itself from Site Actions > Site Settings > People and Groups. Adding a user policy in CA would make it more difficult for the Site Collection Admins to accidentally revoke the permission, but the globally applied (non-override-able) permissions would be extremely dangerous. (Besides, I tried it in Central Admin, and IUSR wouldn't resolve properly there anyway.)

 
Here we enter the anonymous access account for IIS by typing IUSR and hitting the Check Names icon.

 


But, we're not completely done. We still need to go to Site Actions > Site Settings > Site Permissions to fully enable anonymous access. In order to get this button to appear in the ribbon, you actually have to log in to the public web site at http://www.aperturelabs.local. That's because anonymous access wasn't enabled on any of the other extended web sites.

 


Here I chose Entire Web Site, but you can lock it down more if your needs differ.

 


Now when I visit http://www.aperturelabs.local, I'm no longer asked for credentials.

For our partner portal, this is exactly what I am hoping for. If we were content to use Active Directory as our sole login option and have our users self-select their access level through the URL they put in the browser, we could leave everything just as it is. So far, we haven't really even started using Claims. What we have done is provision a single SharePoint application, with multiple ways of gaining access to it.

We're innovators - Aperture Science Innovators - and science marches on! We've built the foundation, so to speak, of a bright tomorrow.

And that is *exactly* when I'm gonna configure the rest of this crap – tomorrow.

(This is Thomas Carpe. We're done here!)

Real World Claims Part 1: Prelude to a SharePoint Presentation

This post is the first of a multi-part series, and is a companion to my presentation titled Real World Claims in ADFS and SharePoint 2010. This presentation was given Saturday, April 13th, 12 noon at SharePoint Saturday, The Conference, Washington, DC April 11-13, 2011. Additional links, slides, video recording, and other funny liner notes will be posted here as they become available.

 

I sit here at my computer on the Saturday only a week before the main event. It's been weeks, maybe even months, since I originally submitted my idea for a presentation at SharePoint Saturday. And while there were many things that prevented me from working on this presentation sooner, including my own nature as a chronic procrastinator, everything I've been doing for pretty much the whole year has been leading up to this moment.

The cursor blinks, expectantly. I imagine that it must be disappointed in me. Is it restless?

Where to begin? Do I start by trying to flesh out my slides? Do I review the notes I've been carefully collecting all this time? Sometimes staring at a blank page and figuring out what to write first can be the hardest part.

I fix myself a rum'n'Coke, and I head out to the office. It's raining, drearily, producing an early dusk. Our office is a small earth-sheltered room with about 300 square feet, carved from the back half of our 3 car garage and fondly referred to as "The Bat Cave" by friends of the business. The garage was utterly useless for parking cars, and entirely too useful for collecting large piles of useless stuff. Now, it serves as the nerve center for Liquid Mercury Solutions. There's a large U shaped array of desks, comfortable chairs, monitor arrays connected to nothing. The people who're entirely too accustomed to sitting elbow to elbow – and their laptops – are all gone for the weekend.

I sit down at my desk and fire up Pandora – Daft Reactor Society channel. It plays Mad as Hell by The Vandals, and I turn it up to 11. Howard Beale's screams echo into the empty space around me. Next stop: "The Zone".

(Links to other parts of this series will be posted here as they become available.)