Microsoft Cloud News Round Up

We Told You So: Microsoft Cloud Leads in Gartner Magic Quadrant, Yet Again 

Gartner has once again ranked Microsoft as a “Leader,” this time in its 2017 Magic Quadrant for Business Intelligence and Analytics Platforms. Microsoft also earned Leader placement in Gartner’s Magic Quadrant for Content Collaboration Platforms, where Gartner highlighted multiple key benefits of OneDrive for Business for companies that already use Microsoft’s productivity and collaboration tools (e.g., Office 365, SharePoint).

Our founder and CEO, Thomas Carpe, remarks, “Finally, business advisors officially recognize what Liquid Mercury Solutions has known for years – that Microsoft’s cloud services are quickly becoming the standard by which all others will be compared. Where Office 365 once provided an advantage to businesses willing to take a chance on the cloud, soon enough it’ll become ubiquitous in business.”

Microsoft’s optimism and determination seem to promise a growing trend of advancements in this area. Kamal Hathi, Microsoft Power BI general manager, wrote “We’re humbled by this recognition for the innovation we’ve delivered with Microsoft Power BI in the past year … But more importantly, we’re encouraged by the progress we’ve made as a community in executing against the ambitious goal set when Power BI was made generally available only a short time ago in July 2015: Provide BI to more people than ever before, across all roles and disciplines within organizations.”

Alara Rogers responds, “Microsoft has actually been an industry leader in BI for many years. Excel is the world’s most widely used BI tool by a large margin. Where Microsoft struggles is in communicating to customers about the power and abilities of their platform, especially when we see lots of changes like the shift from traditional tools like PowerPivot and SQL Server Reporting Services to a cloud platform like Power BI. There’s room for improvement, but things are headed in the right direction.”

Whatever your opinion, it’s safe to expect even bigger and better collaborative innovations in the years to come. 

Read More: 

https://blogs.microsoft.com/firehose/2017/02/16/microsoft-named-a-leader-in-gartner-magic-quadrant-for-business-intelligence-and-analytics-platforms/ 

https://mspoweruser.com/microsoft-named-leader-2017-gartner-magic-quadrant-content-collaboration-platforms/ 

  

Office 365 Takes the Lead Over Traditional Office for the First Time 

This week marks the first time that the number of Office 365 users has exceeded the number of users of traditional versions of Office, making the cloud suite the clear favorite both among users and at Microsoft. That trend will only continue as Microsoft plans to officially end support for Office 2007 later this fall.

But what’s made Office 365 such a success? Let’s look at a few benefits. 

  • For starters, thanks to OneDrive for Business and SharePoint Online, no one needs to suffer the frustration of losing files that weren’t backed up because someone was too lazy to keep the laptop charged.
  • The applications and settings people rely on can be synced across devices. That means next time your computer has a spontaneous meltdown, you can just switch to another one, easy as pie.
  • Files can be shared with anyone, both inside the company and with vendors, partners, and customers. No need to clutter folks’ email with bulky and potentially risky attachments. 
  • Multiple people can edit a Word document or Excel spreadsheet simultaneously and it updates in real time. It’s so much easier than passing it around the office for review and revision. 
  • Thanks to applications like Delve, everything you need can be found in one place, so say goodbye to having 27 windows open at once.
  • Best of all, your information is kept safe using the latest encryption, and you can protect your account with more than just a password by using multi-factor authentication, such as a code sent to your phone.

Office 365 has all the perks you wished Office XP had back in 2002. No wonder the trend of moving everything to the cloud is here to stay, because it works beautifully. 

 

Microsoft Welcomes Our Robot Overlords 

There has been buzz about Microsoft creating an AI supercomputer, and people in the industry have strong opinions on how this will impact the future. Will it be chaos like The Terminator come to life? Or will we all fall in love like Joaquin Phoenix did in Her?

OpenAI, a non-profit AI research organization, has teamed up with Microsoft to make AI accessible to the masses. Azure is the crucial component that will take this idea and cultivate it into a reality.

Microsoft already has tools within this platform that will assist in this development (e.g., Azure Batch, Azure N-Series Virtual Machines, and Azure Machine Learning), but it is also in the process of creating further technology to aid AI research.

Some of these advancements have already come to light, including the upcoming Hardware Microservices via Azure. Microsoft aims to make FPGA (field-programmable gate array) processing power accessible to developers sometime next year, working toward a cloud that is 100% configurable. There are major perks to this type of access, including increased speed and functionality.

What the heck is an FPGA? Thomas Carpe explains, “Simply put, FPGAs are hardware, like your graphics card for example. Unlike special-purpose hardware, they can be programmed and reconfigured as needed using software, potentially including AI. Thus, they’re super-fast, and can facilitate machine learning.”

This all sounds wonderful, but some think relying this heavily on AI technology is going too far. World-renowned icons in the tech and science communities have conflicting ideas about what this means for the future of civilization.

Elon Musk has spoken out against AI, referring to it as “our greatest existential threat” almost three years ago. He has since taken a precautionary role as a backer of OpenAI.

Stephen Hawking made a point to compare the speed of biological evolution versus advancements in AI to show that AI would eventually outgrow us. 

Mark Zuckerberg seems to favor the idea of a world more heavily dependent on AI. He believes it could create vast improvements in everyday scenarios like healthcare and transportation.

Where do you stand on the subject? Will you embrace AI? How would you like to maximize the cloud with the new capabilities of Hardware Microservices via Azure? Let us know what you think in the comments below.

Learn More: 

http://www.techrepublic.com/article/microsoft-partners-with-openai-to-advance-ai-research-with-azure/ 

http://www.zdnet.com/article/how-microsoft-plans-to-turn-azure-into-an-ai-cloud/ 

http://observer.com/2015/08/stephen-hawking-elon-musk-and-bill-gates-warn-about-artificial-intelligence/ 

http://fortune.com/2017/07/26/mark-zuckerberg-argues-against-elon-musks-view-of-artificial-intelligence-again/ 

https://channel9.msdn.com/Blogs/Azure/Why-OpenAI-chose-Azure+video 

https://fossbytes.com/satya-nadella-microsoft-is-turning-azure-into-the-first-ai-supercomputer/ 

 

Stay Ahead of Emerging Cloud Security Threats  

Recently, a massive cybersecurity attack on Office 365 targeted several Fortune 500 companies. How?! 

Skyhigh Networks explained that the attackers made slow, deliberately discreet login attempts against the accounts of “…high value targets, including more than 100,000 failed Office 365 logins from 67 IP addresses and 12 networks. The attempts in all targeted 48 different organizations.”

Evidence suggests the attackers already knew some employee names and passwords from phishing campaigns, and tried different combinations of usernames and passwords based on that.

Your business can be vulnerable too! Do you reuse the same easy password for everything? Do you interact with spam emails? Do you have a basic username-password authentication system? If you answered yes to any of these questions, you need to up your security game. 

Don’t worry, you’re not alone. But that’s exactly how and why something like this could happen to your business soon. Embrace the modern world and get educated on how you can protect your data. 

In recent years, enterprises and SMBs alike have moved huge amounts of sensitive information to the cloud. Almost 60% of all sensitive corporate data in the cloud is stored in Office 365. Additionally, Office 365 works on a myriad of devices, which makes it even more appealing to users.

The downside to this is that it’s also the hackers’ bull’s eye. It is often said “[w]ith great power comes great responsibility” …to protect your data. 

Slawomir Ligier, senior vice president of engineering at Skyhigh, elaborates on this: “While companies traditionally have invested extensively in perimeter security, those without a dedicated cloud security solution will lack visibility and control for a growing category of attacks. Enterprise cloud providers secure their infrastructure, but the ultimate responsibility to control access to sensitive data lies with the customer.”

Thomas Carpe goes on to say, “Many existing security experts, as well as their tools and standards, are seriously behind the times when it comes to incorporating the cloud into their security plans. Where our customers have sensitive data, we must consider not just things like their firewalls or patching Windows, but also whether they’re subscribing to the right mix of cloud services to fully protect themselves.”

Let that sink in for a moment. 

Protect your business! Ready to upgrade your security? Contact Liquid Mercury Solutions today to set yourself up with high-quality cloud security and data protection to fit your business needs.

Read More About It: 

https://www.infosecurity-magazine.com/news/widespread-bruteforce-office-365/ 

 

Microsoft Renaming Kiosk Plans to Frontline Worker Plans 

For years, we’ve struggled to explain to our clients what a Kiosk plan is, often calling it the “deskless worker” plan instead of using Microsoft’s preferred name. Now Microsoft seems to be catching on to the longstanding communication gap. This week, they announced a name change for the K1 plan, which will henceforth be known as… wait for it… the F1 plan!

What’s the F for? Well, all joking aside (and, yes, we’ve had some good-natured fun at Microsoft’s expense), the F is for Frontline Worker – but it could easily mean Field, Fleet, Factory, First-line, or one of those other words that starts with the same letter. 

Whichever way you spell it or whatever you decide it stands for, the F1 plan is still the cheapest way to get your non-IT, non-administrative employees into Office 365. The price is the same at $4 a month, and while the plan doesn’t include a copy of Office, it does have email, Skype, and access to OneDrive and SharePoint – which is fine, since a key requirement of the F1 plan is that the user doesn’t have their own PC. The F1 plan is perfect for users who’ll access Office 365 primarily via their smartphone or tablet, and who may use a shared computer (kiosk) on occasion.

So, if just the name is changing, what do Office 365 subscribers need to know? Not a heck of a lot, just keep it in mind when you get your next billing statement. Nothing’s really changed at all, so you’re not getting “F”ed. ;-) 

Office 365 Security and You - Ransomware

Minecraft creeper: 'That's a very nice file share you've got there - be a shame if something happened to it'

Since I'll be on a SharePoint security panel speaking at next week's Federal IT Security Conference, I wanted to do a couple blog posts this week about cloud security.

I'm going to leave discussion of Windows zero-days, Strontium / Fancy Bear / Apartment 2B etc. for another time. There's already plenty of FUD going around about that topic. If you're not sure whether you're protected, you can update to the Windows 10 Anniversary Update and you'll be covered. The easiest way I know to do that is to buy a Win 10 Enterprise E3 subscription from us for $6.50 a month; throw in Enterprise Mobility Suite and Symantec Endpoint Protection Cloud and you'd still be spending only $19 a month. That's about all there is to that, so let's move on.

Instead, I want to take some time this week to talk about recent (albeit non-federal) security challenges that we see our Office 365, and particularly SharePoint Online, customers facing. Specifically, two questions I'm being asked a lot lately are "Can Office 365 protect me from ransomware?" and "Can we control when and where people can connect to Office 365?"

Today, I'll be talking about ransomware. When I come back for Part 2 we'll talk about controlling access to Office 365. Part 3 will talk specifically about securing SharePoint in the cloud.

Part 1: All About Ransomware and Office 365

Q: "I've heard of this new thing called 'ransomware'. What is it?"

Firstly, for those of you who don't know, I'll explain what ransomware is all about, and then I'll tell you what you can do about it.

Maybe I should've done this post for Halloween, because ransomware is scary stuff. Ransomware is like any other virus or malware, but with a twist. It does something much more insidious than just infecting your computer, turning it into a zombie, or deleting random files.

Ransomware uses our own security defenses against us, by applying encryption on us against our will and then attempting to extort money from us to undo the damage.

So how does that work? Well, what if somebody put a lock on your door and then demanded $100 from you to remove the lock so you could get inside your home? It's sort of the same thing. Once the ransomware infects your system, it will open whatever documents it can get access to, scramble the contents with a secret key known only to it, and save them. It then sends the key to organized cyber-criminals and alerts you to contact them and make payment arrangements to unlock your files.

Q: "Am I in danger from ransomware?"

Yes. Yes you are.

Be afraid. Be very afraid.

But seriously, Dusty, Seth, and I saw our first case of a client affected by ransomware back in 2014 - and it wasn't pretty. This was in the spring, around the time Microsoft ended support for Windows XP. We had a client who - despite our advice - was dragging their feet about buying new Windows 7 PCs because of the cost involved. As a result, ransomware got into one PC, spread to their other workstations and servers, and then proceeded to extort and threaten their employees. That's when we got the call for help.

For the three of us, it was two hellish days working double shifts to purge the virus (from slow outdated machines), restore backup files, and clean up the mess that totaled over 100 hours and ten thousand dollars in labor charges - services for which we were never fully paid. I would never wish this fate on any client, and I hope to never receive such an emergency call again in my lifetime.

Fast forward two years, and we're seeing an increasing number of customers telling us that they have contracted ransomware. Everybody's reaction is a bit different. Some folks are willing and able to simply walk away from their lost files, while other businesses face a real and existential threat to their continued operations.

Any way you look at it, ransomware is a problem very similar to having your hard drive crash.

But hard drives are pretty reliable; they tend to fail from heavy use, if they are dropped, or when they get very old.

Unlike hardware failure, ransomware *wants* to be a problem for you, and there are organized teams of cyber-criminals all over the world who are actively working every day to try and find new ways to infect you with it.

If you are not working to stay ahead of this threat, it will eventually get the better of you.

Q: "What kind of information is at risk from ransomware?"

Ransomware is smart enough to go after files that you use, like Word, Excel, or PDFs while leaving program files like EXEs and DLLs alone. It can also distinguish between files you access often and files that you haven't opened in years and aren't likely to ever notice.

Ransomware can detect attached portable USB drives and find network shared folders that you have access to, so if you're infected then any folder you have access to is at risk even if it isn't necessarily on your computer. I have personally witnessed ransomware that attacked a network file server at one company and scrambled their case files for literally hundreds of customers.

Q: "Should I pay the ransom?"

Generally speaking, I want to say that you should never negotiate with terrorists - or criminals. That's a nice sentiment, and it sounds good in the movies. But in reality I think maybe that's a bit naïve.

Your best bet of course is to have a backup strategy in place and simply recover a working copy of your files from the backup. Only do this after you have thoroughly scanned, found, and cleaned the ransomware from all of your computers. Otherwise, you're putting your backup copies at risk by accessing them, which may let the ransomware know where they are too.

If you don't have a backup of your files, then paying the ransom might be your only option.

In such a case, definitely do not give a criminal your credit card info when they ask for it. That'd be dumb. If they run your card for the ransom, you can expect the info will also circulate into databases of cards to be used for fraud later. If you must pay the ransom, purchase a pre-paid Visa gift card to do it. Some credit card companies will provide a temporary card number you can use for a one-time online purchase. If you have that option, it's a good idea.

Q: "I already own a firewall. Doesn't that mean I'm protected?"

Having a firewall alone is not enough unless you also have anti-virus software on all your PCs and devices. More commonly these days it is called "endpoint protection", because the threat landscape has grown to include not only viruses but also malware, ransomware, zombies, and more.

Think of it this way. Your firewall is like building a wall around a city. It doesn't make sense to have a wall to protect yourself if you don't also have soldiers inside the wall who can react to intruders. In this case, the story of the Trojan Horse is very appropriate; you must have a layer of defense inside your walled city to protect yourself in case a threat does get a foothold inside the gates.

Having anti-virus software installed is like posting guards at important bases like your armory, grain store, or government center - or having a soldier boarding in each person's house. Anyone who has ever looked at how much CPU is used by their anti-virus software understands that it may be necessary, but it's also another mouth to feed.

We also need to account for the way that mobility has affected computer security. Today, we have laptops, tablets, and smart phones that come and go freely from within our fire-walled city and out into the wide, wide world. To extend our city metaphor, it is now a bustling metropolis with merchants and travelers coming and going at all times; and the freedom to travel has become a key aspect to life that we all benefit from. We connect to Wi-Fi networks at our friends' homes or the local coffee shop, as well as cellular data networks. Then we return to our own network, usually without much fuss. Unfortunately, we also potentially bring whatever plague we've exposed ourselves to from outside back with us when we return.

Protecting the desktop doesn't need to be an expensive proposition either. It costs only $4/month per user to purchase Symantec Endpoint Protection Cloud, and Microsoft's advanced security tools that are part of Enterprise Mobility Suite and/or the Office 365 E5 plan add only $8.70 and $15 per month (compared to the E3 plan), respectively. This is something we can help you purchase and deploy, so please do reach out to us if you want to get this set up for your organization.

Modern IT security now also includes the concept of active network defense, which takes the fight from the PCs to the network itself. These are next-generation Ethernet or Wi-Fi switches that can detect and block communications known to come from viruses, malware, etc. This is a lot like making the roads in your city unfriendly to invaders by having police on patrol. These new technologies haven't really filtered down to the consumer and small business market yet, but I expect that will happen fairly soon.

I hope that I've been able to explain why having a network firewall alone isn't enough to protect you from security threats out there today. While endpoint protection does add a cost and can sometimes limit PC performance, it's still very much a necessary evil. Meanwhile, new products are being developed that can do even more, so it may be time soon to start looking at replacing your old equipment.

Q: "Can ransomware affect files in Office 365?"

I get this question a lot, both from existing customers and from those considering Office 365 as a possible solution for protecting themselves from ransomware.

The answer is complicated, because really "it depends". I'm sorry if that sounds like consulting-speak, so let me explain what I mean.

Firstly, let me start by saying that we haven't yet observed any instance of ransomware in the wild that directly targets Office 365. But this alone doesn't mean these files are completely safe.

Let's say, for example, that you are using OneDrive for Business. You have a copy of your files in Office 365, and a synced copy is also on your local C: drive. If the ransomware encrypts the file on your local drive, OneDrive for Business sees that change as no different than if you had opened the file yourself in Word and saved some edits. It will then sync the [bad] changes to the cloud and overwrite the file there.

Furthermore, if ransomware infects the Microsoft Office desktop software like Outlook, Word, or Excel, then it could theoretically corrupt the process by which files are saved, regardless of where you're saving them. In fact, Microsoft Office has its own layer of file encryption called Azure Rights Management. It's not difficult to imagine a possible exploit that might somehow subvert that mechanism - or replace it with one where you don't have the keys.

So in both cases, I would say that while we don't know of any ransomware - yet - that can log in to your Office 365 account and use that access to reach your emails or documents stored in SharePoint, it is still technically possible that your files stored in the cloud are not completely out of reach.

Q: "I was thinking of buying Office 365 and moving my files to the cloud to protect them. Does what you say mean that it won't work and I shouldn't do that?"

Not at all. Moving your files to Office 365 is a good first step, and it has lots of other benefits besides security.

For starters, you'd be taking advantage of Microsoft's advanced data protection strategy. Microsoft also keeps a 15-day backup window on some types of data. As a first line of defense, these are going to be a lot more secure and reliable than saving files on a USB drive in your office - even if you just look at it from a hardware perspective.

To really cover yourself, you should always have a backup strategy in place.

If your needs are minimal and cost is a big concern, that might just involve occasionally copying important emails or files to a local drive, then unplugging it from the network and sticking it in a drawer or safety deposit box. Of course, doing things this way takes time and work. There are better options.

Third-party backup solutions for Office 365 have been around for a while. These aren't expensive - most will back up both email and SharePoint/OneDrive for Business files for just $5/month/user. Compared to other cloud backup platforms, these can be cost effective alternatives. They also add the benefit that your data isn't entirely with Microsoft, so you can feel more secure knowing that you are not keeping all your eggs in one basket.

So, if you are looking for a way to escape the threat of ransomware, Office 365 may still be a good option for you - as long as you're prepared to purchase a bit more than just the basic Office 365 plan itself.

About the Author

Thomas is an acknowledged expert on information security, the creator of Beowulf Identity Server, and will be speaking on a panel about SharePoint security November 8th at FITSI.org's First Annual Federal IT Security Conference. You can follow him on Twitter and LinkedIn - but if you really want to connect, your best bet is probably to call us at 410-633-5959.

An Army of One Asks "SharePoint, What Is It Good For?" - Using SharePoint in One Person Companies

It's dangerous to go alone. Take SharePoint. Recently, we've been getting a lot of new customers who are sole proprietors of their businesses. This isn't too unusual; many businesses are one-person shops with no employees. For example, while it isn't unusual to eventually take on assistants, many tax preparation specialists, accountants, architects, lawyers, IT folks, marketing gurus, and business consultants start out as just an individual going into business for themselves. I personally went this route; rather than take on a full-time job, I operated as an independent contractor for nearly 15 years.

Liquid Mercury has always been a company based on helping our customers get the best value out of SharePoint. This used to mean mostly Fortune 500 companies and government agencies. Then, Office 365 came along and has greatly increased the audience for whom SharePoint is accessible. Now, even a single person business can buy an Office 365 Business Premium plan for $12.50 a month and get access to SharePoint.

There's a lot of interest in the platform, and one question that people in business for themselves ask us more than any other is "What's the point to SharePoint when you haven't got anybody to share with?"

At first, the answer wasn't entirely obvious, even to me, so I thought it might be worth sharing a few tips on how sole proprietors can get the most out of the SharePoint component of their Office 365 service.

Author's Note: This article got to be much more involved than I expected. So I've decided to break it up into two parts. In this, part 1, I'll go over the first three tips, which are primarily about benefits you can achieve for yourself. In the next part, I'll go into depth about ideas that can help you when working with your customers.

Tip #1: Develop a Filing System

When I think about how to use SharePoint in a one-person office, the first thing that comes to mind for me is to simply get better organized with all the documents needed to operate the business every day.

Any business will have these. There will be invoices and communications from vendors that need to be scanned, filed, and paid out. Possibly, there will be invoices sent to customers. You may have to write your own contracts and then keep track of variations as you negotiate with your customers. Perhaps you'll need to write quotes or formal proposals in order to win the business. There might be status reports and time sheets.

You can certainly organize all these documents into folders. That's how people have been doing it for years. I'll give you one good example of why this might not be the best option in the long run.

Suppose you decide that your filing system will be organized by customer. One folder per customer, no problem. To keep clutter from piling up, under each customer folder you create a folder for time sheets, work logs, and invoices; a folder for documents the customer shares with you (so you can honor that NDA you signed); and one for the original proposal and agreement (so you can remember what you promised to do for them). You did remember to scan the signed copy of your contract and put it there, right?

Anyway, suppose you hire some help for a large customer. You need to share documents for that customer with your hired help. But there are certain details you'd prefer to keep in house, such as how much the customer is paying you, those confidential/proprietary documents, etc.

Now you also want to hire a bookkeeper to help you convert work activity into invoices. This person needs access across all your customers, but only to the financial stuff, not the contracts or project documents.

You start to think that maybe it would've been better to organize the top-level folders by document type first, and then have sub-folders for each customer. Over time you change the way you're organizing your files, coming up with newer and better categorizations - but you don't really have time to go back and reorganize the historical documents. What you need now is called a "matrix". What you actually have is probably better classified as a "mess".

But what does SharePoint do to resolve this problem?

SharePoint lets you attach any number of properties to a document. These are called Fields, and they work exactly like you'd expect fields in a database or columns in a spreadsheet to work. You can have a Field for which customer a document relates to, and a different Field for the purpose of the document. Say that later you decide to add a follow-up date to keep track of work on certain documents. With SharePoint, you can add that easily at any point down the road.
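If you're comfortable with a little scripting, Fields don't even have to be created one at a time in the browser. Here's a minimal sketch using the free PnP PowerShell module (an optional add-on, not something built into Office 365); the site URL, library name, and field names are made-up examples:

    # Connect to the site (prompts for your Office 365 credentials)
    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/mybusiness" -Credentials (Get-Credential)

    # A text Field for which customer a document relates to
    Add-PnPField -List "Documents" -DisplayName "Customer" -InternalName "Customer" -Type Text -AddToDefaultView

    # A choice Field for the purpose of the document
    Add-PnPField -List "Documents" -DisplayName "Purpose" -InternalName "DocPurpose" -Type Choice -Choices "Proposal","Contract","Invoice","Timesheet"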

Of course, we wouldn't just enter extra data about files for the fun of it. Learning to file things in a way that is completely different than what we've been taught to do for the past 25 years takes a certain amount of discipline. New skills will have to be learned and new work habits developed. For this effort, there must be a proportionate reward.

As it turns out, there is such a benefit. Fields are useful because you can then create something called a View. Views let you show only the documents that meet certain criteria. For example, "Show me only the proposals where I won the business." or "Show me only the invoices where the customer hasn't paid me yet." Things can also be set up so that your bookkeeper won't be confused by all those non-invoice documents you have to track, because from their point of view (no pun intended) these can be completely hidden. So, you can start to see how Views would be very useful indeed, and worth the effort of putting data into Fields on almost all your documents.
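For the script-minded, a View can be created the same way. This sketch (PnP PowerShell again, using the same made-up field names as above plus a hypothetical Yes/No field called Paid) builds the "unpaid invoices" view just described; the CAML passed to -Query is how SharePoint expresses the filter:

    # Show only invoices that haven't been paid yet
    Add-PnPView -List "Documents" -Title "Unpaid Invoices" `
        -Fields "Name","Customer","Modified" `
        -Query "<Where><And><Eq><FieldRef Name='DocPurpose'/><Value Type='Choice'>Invoice</Value></Eq><Eq><FieldRef Name='Paid'/><Value Type='Integer'>0</Value></Eq></And></Where>"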

Tip #2: Find Things Faster, Easier

One thing that SharePoint has always done pretty well is search. (Hey, you SharePoint experts, don't laugh; I am serious.) Since the first version back in 2001, I have been very impressed that SharePoint was able to crawl all the documents on my entire network, including file shares, and bring back results that often times I'd completely forgotten even existed.

This was no small accomplishment, and SharePoint's ability to uncover hidden gems has only gotten better with time.

Quick Benefits Right Out of the Box

Today, in Office 365 we have something called Delve, which will show you not only what documents you've been working on, but also a timeline of your work with thumbnail representations of what those documents actually look like. Most one-person shops are not running a traditional server with an enterprise version of SharePoint, so I feel pretty safe saying that, for the purpose of this article, most interested readers will have access to Delve.

Here's a screen from Delve showing my recent documents.

Also, many people do not realize that OneDrive for Business is essentially SharePoint with another face. Yes, OneDrive lets you sync files to your local hard drive. However, when you browse the web site to look at the copies of your documents that are stored in the cloud, that web site is a SharePoint web site and those documents are stored in SharePoint Libraries. As a result, they are also searchable in SharePoint and will show up in Delve.

So, you can get a tremendous benefit without any extra effort at all on your part simply by choosing to save your documents into SharePoint or OneDrive for Business.

Taking Search to the Next Level

Combined with proper use of the Fields we talked about in Develop a Filing System, SharePoint search can find documents based not only on their content, but also on how they were categorized using the data in those Fields. For example, just like you can create a View to show certain types of documents within a Library, you can use Search to surface documents stored on any SharePoint site.

This feature has many practical applications, especially for larger businesses, but the most compelling for a sole proprietor will likely be digging through lots of documents to find the one you need - as quickly as possible. Imagine for example that search results can be filtered by a specific customer, by a set of products that they relate to, or let's say... maybe by whether you remembered to scan and upload the final signed version.
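To make that concrete, here's what such a query might look like in script form, again using PnP PowerShell. The Customer property here is hypothetical; out of the box, custom Fields have to be mapped to managed search properties (an admin step not shown here) before they can be used in a keyword query like this:

    # KQL search: only PDFs tagged with this customer, anywhere in the tenant
    Submit-PnPSearchQuery -Query "Customer:Contoso AND filetype:pdf AND IsDocument:true"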

Tip #3: Create Standard Operating Procedures

Almost every one person shop starts out with the idea that if you build a better mouse trap, people will beat a path to your door. Yet, in the course of business, we often fall into a trap ourselves. We discover that we're spending more time being a bookkeeper, bill collector, contract writer, office clerk, tech support, etc. rather than the thing we went into business to do.

Eventually, if you are going to stay focused on your mission, your one person business is going to take on hired help. That could mean employees or it could mean contracting with other specialty firms.

Either way, how you go about getting your work done is something that will need to be documented and shared. Without proper documentation of your processes, it becomes much more difficult to identify those parts of your work that can be effectively retooled, delegated, or outsourced to make your operation as efficient and competitive as it can be.

If you get to the point where you're successful enough that you are forced to grow, then you'll have no choice but to try and explain to other people what you want them to do and how you want it done.

Take it from me, it will be better for you if you start writing these things down before that day comes.

I learned the hard way that rapid business growth can be every bit as dangerous as a period of decline. In fact growth can trigger missteps, leading to long term problems and the ultimate downfall of a small business. Growth can turn many strategies that help the tiny business survive into bad habits that hold it back. Growth puts such a strain on the leadership of a business, that it might make one reconsider why they went into business for themselves in the first place.

By documenting your business processes before you're busting at the seams, you can go a long way towards making sure that once you're simply too busy to train new employees, there'll be a guidebook they can follow to help you get the most out of hiring them.

So, enough about why you need to be writing SOPs before you actually think you need them. How exactly does SharePoint fit in with helping you define your business processes?

Unstructured Notes

The first step is having a ready-to-share platform for writing things down as you think of them. At this stage, your ideas may not even be fully formed, so getting things on record quickly without interrupting your other work is essential.

For the unstructured piles of stuff I tend to generate at this stage, I use OneNote. OneNote is great because I never have to remember to hit Save, and it makes it relatively easy to record the web site where I found whatever helpful bit of information I'm working with. It has lots of features that are helpful for taking down information quickly.

Okay, but you don't actually need SharePoint to use OneNote. It's part of Office and you could simply save your Notebook files to your laptop, or if you're really cloud savvy you can put them into OneDrive.

SharePoint sites include something called a Site Notebook. Site Notebooks are simply OneNote Notebooks that are already saved to a SharePoint library, set up for sharing with team members, and web accessible. If you start with a Site Notebook rather than creating a new Notebook some other way, then no extra steps are needed to start sharing the notes you take there.

Say that all you do when you start your business is create one SharePoint Team Site for each hat you have to wear - accounting, marketing, sales, management, and operations. Then, open the Site Notebook for each site in OneNote so you have a central place to start taking notes. When the time comes that you're ready to bring on some outside help, just share access to the appropriate Team Site, and they'll have your notes too.

By the way, there's a nice thing about sharing Notebooks this way. Two people can edit the notes at the same time and see one another's changes in real time.

Structured Documentation

Suppose you get to the point where you want to formalize your notes a bit further into something your assistant can use to help you perform some business tasks that come up fairly often. There are a couple things you can do in SharePoint that might be a better choice than using OneNote.

The first option is to create a Wiki Library. Wikis are web sites where you can quickly post and edit information directly on the web page. For example, this can be useful for creating and updating a company FAQ, employee policy handbook, etc. It's a bit easier to lock down a Wiki so that only certain people can make changes but everyone can read it. Wikis have the advantage that you have more control over how you structure the pages and the navigation between them, and that users won't need any special knowledge beyond how to get to the web page in a browser. Wiki pages also show up as individual entries in search results (see Tip #2: Find Things Faster, Easier) rather than one search result for an entire Notebook.

The other option for structured information is to copy your notes into Word documents. For example, if you wanted to create an Employee Handbook this might be the way to go. Personally, I find that if a process has a lot of diagrams, pictures, or screen shots, then creating the Word document is a lot easier than the work involved with uploading all those images to a Picture Library in SharePoint so they can be used on a Wiki. It's also easier to create a PDF from a Word document than a Wiki or Notebook, so if your process is something you'll have to share with people who don't have either Office or access to your SharePoint site, you might want that option. Word documents also show up in search as one result per document.

Defining a Process

When I talk about defining a business process, a lot of people will immediately jump to thinking about workflows. Workflows in SharePoint provide a way to marshal a process through several steps, with notifications for people when their step comes up.

Let me just get this out front: developing a workflow is not necessarily a great idea. There are several reasons. Firstly, workflows add overhead to a process. In addition to completing a task, you often have to report to the workflow that the task has been completed. Second, workflows define a process rather rigidly. This becomes a problem if your process changes fairly often - or, worse yet, if you don't even have the process fully defined. These issues are most obvious when you're a single-person operation and need to track your own work.

SharePoint does provide some ways to improve your processes without forcing yourself into taking on a cumbersome system to track every step of what you do.

For example, Task Lists are a great way to plan a project and keep tabs on the steps involved so that you don't lose track of your progress. Over the years we've built a number of SharePoint add-ons to Tasks that let you do things like copy a set of template tasks to a new Task List, manage multiple projects within a single Task List, and more.

Microsoft recently released a tool called Planner that comes with some Office 365 subscriptions. We really like Planner! It shows a lot of potential, and in many ways it is easier to use than the SharePoint Task List. We wonder what Microsoft's plan for SharePoint Tasks will be in the long term, now that there are two different ways to accomplish essentially the same thing. Even so, Planner is a new product with several caveats and limitations that make it less amazing than we'd like it to be. For the moment, there are still times when choosing SharePoint Tasks instead is a valid option.

Screenshot of Microsoft Planner in Office 365

Beyond Task Lists, there are other ways to use SharePoint to structure your processes. Many people do not know that you can create a custom List in SharePoint very easily. These Lists can hold any kind of information you can imagine. For example, you could record a list of product prices, or a series of trade show events that are important to your business. You can even build a customer relationship management database using SharePoint.
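As a sketch of how little effort that takes, here's the product price example as a few lines of PnP PowerShell (the list and field names are invented for illustration):

    # Create a custom list and give it a price column
    New-PnPList -Title "Product Prices" -Template GenericList
    Add-PnPField -List "Product Prices" -DisplayName "Price" -InternalName "Price" -Type Currency -AddToDefaultView

    # Add an item the same way you'd add a row to a spreadsheet
    Add-PnPListItem -List "Product Prices" -Values @{ "Title" = "Widget, Deluxe"; "Price" = 19.99 }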

Next Time in Part II

I'll post again soon about the next three tips, which are primarily about how you present your tiny rowboat of a company when you're working with all the tugs, oil tankers, and cruise liners of the world.

  • Tip #4: Look Bigger Than You Are
  • Tip #5: Share Documents, Securely
  • Tip #6: Structure Customer Service and Interactions

I hope you'll join us. Please consider subscribing to the blog to get notification for the next part and other content that might be of interest to you.

As always, if you use SharePoint or you're considering Office 365 for your one-person operation or army of employees, please don't hesitate to contact me, or visit us at http://www.liquid-hg.com/cloud to learn more about what we offer and how we can help you.

Something New is Springing Up at Liquid Mercury

Today’s technology marketplace is constantly changing. Larger IT departments are working with smaller budgets, and in-the-cloud capabilities are giving smaller businesses abilities they’ve never had before. Disruptive technologies have everyone feeling a little bit irritable, and somebody keeps moving their cheese. The overall result is strong down-market pressure across the entire industry.

Many companies tell us that they’re now working actively to reduce their recurring monthly costs for cloud based solutions such as PaaS, SaaS, and hosting. As such, they’re seeking to return to the earth with solutions that – while they may represent a larger investment in the short term – allow them to control the terms under which they incur costs for initiatives such as upgrades, maintenance, and support.

In short, our clients are reporting that cloud based solutions simply provide too much functionality for their money; they want to do less with less.

Liquid Mercury Solutions is constantly striving to stay ahead of these emerging business trends. In response to overwhelming requests from you the customer (usually made in the form of late or non-payment) we’ve made an important decision to diversify our offerings.

Announcing Liquid Mercury Farms, a new venture devoted to getting our head out of the cloud.

“I thought you were talking about server farms.”
– LMS founder and CEO Thomas Carpe seen with Attila (left) and Seth (right)
As an alternative to moving to the cloud, Liquid Mercury Farms offers a broad array of ground-based solutions for “subsistence IT”. For example, our premier line of Liquid Mercury Eggs* is Grade A extra-large. They’re an excellent source of protein, delicious with toast, and the perfect add-on once you’ve uploaded bacon into SharePoint.

Our farm is also highly secured, with all equipment stored behind electrified barbed-wire fencing. Production servers are kept in locked cages, and all gates require two-tractor authentication.

The farm is fault tolerant, offering five nines of capacity – that’s almost 4 dozen eggs a day. Farm infrastructure has been fully optimized for production layers. Also, our cluck-through ratio is off the chart. 

Best of all, our support staff work for chicken feed.

To make licensing Liquid Mercury Farms’ products as pain-free as possible, we’re now accepting sacks of potatoes and fresh dairy in addition to our usual methods of payment. So, at the risk of beating a dead horse, why not give us a call and save a few bucks?

*Please note that Liquid Mercury Eggs contain no actual mercury. Happy April 1st!


Outrageous Claims: SP Advanced Security Config Will Get Easier

Principal Architect Thomas Carpe shares his thoughts and opinions on the state of the art in SharePoint security, including predictions about things to come. This blog post is part of a continuing series leading up to and following the official launch of Liquid Mercury Solutions' new product Beowulf Identity Server.

I feel a bit like Thomas Veil from Nowhere Man when I find myself saying "I know they will. They have to." I guess what I'm really trying to say here is that implementing security configurations for SharePoint is still too difficult.

Take, for example, the blog post from Wictor Wilen on setting up SharePoint 2013 with Thinktecture Identity Server. It's a great article, but it's typical of a configuration between two identity products in that there are a ton of settings to consider, and some of it can only be done through complicated PowerShell commands.

Likewise, our own product Beowulf Identity Server has faced similar challenges in early deployments. The product is great; however, there are still reams of documentation on how to set it up. Don't get me wrong; I'm all for having complete documentation. Still, you know you're in for a rough time when one of the first things you need to tell folks is the laundry list of skills they'll likely need to configure your product.

So when I say that advanced SharePoint security will get simpler, understand that where we're starting from is truly very complicated. As demand for more security-focused installations grows, the companies that thrive in this space will need to find creative ways to do more with the resources they have on hand, in what is already a pretty tight labor market for a niche skill set. From where I sit, this means making the product easier to install and configure, whether that means creating an MSI package, PowerShell administration commands, a setup wizard, or all of the above.

Further, since some of this complexity comes from the SharePoint side of things, and Microsoft isn't really going to make that easier, the community and vendors will have to pick up the slack. (See Improvements to SharePoint Claims Authentication and Security Will Lag Behind the Industry for reasons why.)

Wizards and installers can give you a basic set of options that will work for most customers with typical needs, but they can't tell you what is the best practice in your particular circumstance. It's important to remember that wherever you find a security wizard, you'll probably find a security loophole there too. Let's just hope that people will do the right thing and not rely on self-signed certificates and other default settings. However, I would not bet the SharePoint farm on this being the case. 

At the end of the day, IT security itself isn't going to get any easier. I think we'll see security solutions and products that offer a basic set of turn-key options, with anything advanced or unique to your organization left for experts to figure out.

 

SharePoint for Mere Mortals: What Can Be Done in SharePoint 101

SharePoint can do a lot of things, and because of that it is hard to describe accurately without using a lot of technical jargon.  So let us start with something simple that everyone is familiar with and expand from there, to help people understand what you can do with SharePoint.  Let us look at an invite list.

 

Invite lists can be for anything, but for our example we will use an invite list for a wedding.  First, think about all the people who would need to see it: you first and foremost, your significant other, maybe your parents, maybe their parents, maybe your wedding coordinator, and so on.  SharePoint can store that list so that everyone sees the most up-to-date version, and there is no need to combine "this list" with "that list," or compare two lists to see which one is newer.  There is only one list, stored in one location, and whoever has permission can go and look at it.

But there is more: what about adding people to and removing people from the list?  You certainly would not want the caterers to be able to add or remove people, but you may want to let them see the list.  In SharePoint, you can control that sort of thing with "permission settings."  You can determine who is able to view the list, who can edit it, and who can grant or revoke other people's ability to do those things.  Basically, do you want to give your soon-to-be mother-in-law the ability to let her friend edit your invite list?  How about the ability to take away your own ability to view and edit it?

So now you have your list, and the contents are constantly (or sparingly) being adjusted to show the most up-to-date information.  There is more we can do with this.  In this list we can store each invitee's address, whether or not they are coming, their meal choice, and who they are related to.  This sort of data is called "metadata."  Essentially, it is data about data.  This can be helpful for sorting or information gathering.  With a few simple commands in SharePoint you can get a quick count of how many people are coming who are vegetarian or want the steak dinner.  You can find out which side of the family has more people coming, or you can find out how many of the people coming buy "the good gifts."  The limit is set by you and what kind of metadata you would find useful.

 

So that is just a little taste of something very simple that SharePoint can do.  SharePoint can also automatically send out reminder e-mails based upon your criteria, build a webpage for your wedding, save all the wedding pictures from every guest afterwards, display maps, give out directions - all sorts of details that are involved with weddings or businesses.  But this is just a simple introduction for now.  We can expand on this at a later date.

AgilePoint Announces Office 365 and Forms Capabilities at SPC14

Well, it's that time of year again where all the SharePoint product companies trot out to Las Vegas to strut their stuff.

Today, we have a big announcement from the SPC 2014 Keynote Sponsor, AgilePoint.

AgilePoint - SharePoint Conference New Product Highlights

In this release, there are two things I noticed right away that we've been eagerly awaiting for a long time: 1) AgilePoint support for Office 365, not just as something that can be manipulated by workflow, but in a fully integrated fashion similar to Nintex Workflow; and 2) an alternative to InfoPath forms that emphasizes responsive web design.

As readers of our blog will know, we're quite fond of AgilePoint's product. One of the difficulties we've faced in working with it, however, is that it didn't really play well with customers working in Office 365. We're happy to see that's now a possibility, and we'll be putting together some demonstrations in the next few weeks, as we definitely want to take this out for a test drive and see what's possible.

CloudPrep 2014 Development Update

I wanted to take a few minutes today to talk about what we've been doing since late January on CloudPrep and the PowerShell commands for file migration and management of SharePoint Online.

The first thing I can say is that one of our most difficult choices was picking an e-commerce platform and licensing API for our product. Even though we plan to keep our licensing fairly simple, we wanted to have options for future products as well as for many of the items we also sell through our partners.

This turned out to be more challenging than I imagined, but we have settled on using Fast Spring and LogicNP Crypto License. Perhaps in some future post I will talk about those more from a software developer's perspective. What I can say today is that it will be at least a couple weeks before we can get a working prototype of the licensing server and the store online, so we have had to push the release back closer to the end of March or early April, mostly for that reason.

Meanwhile, we have been developing features for the different editions of CloudPrep 2014. Progress on that front continues at a rapid pace and I am pretty satisfied with the way our tools are maturing.

When we decided to produce this software, we planned to release the lite and standard editions first and follow up with premium and professional features later this spring. Given where we are putting our development efforts, though, I was a bit surprised to see that probably all four editions of CloudPrep will be available at the same time.

Now for the geeky stuff. Here's some of what's been happening as we've been building.

Features we've essentially completed:

  • Upload an entire folder or a specific set of files to a document library
    We've tested that these commands work against network drives and UNC paths. Take that, OneDrive!
  • Preserve metadata about the local file system that the document was uploaded from
  • Created and Modified dates on files are preserved, though we did find that with larger files there are limits to what we can accomplish here
  • You can specify the content type for uploaded files, root folders, and sub-folders - including Document Set and its child content types
  • A bunch of other random stuff including commands for manipulating SharePoint lists and reports to make sure that file uploads won't exceed SharePoint limits


We've noticed that Office 365 throws us a lot of connectivity errors that we don't normally see in on-premises SharePoint environments. If you've been copying files using the standard UI or OneDrive, some of these errors might be hidden from you. However, they're readily apparent if you're using Web Folders (WebDAV) or the Client Side Object Model to connect. We see unexpected dropped connections quite often, and certain upload methods will time out on files that are too big, which required some fun workarounds. Different methods are needed for files under 2MB, under 35MB, and larger.
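To give a flavor of what that branching looks like, here's a simplified sketch (not our production code) of choosing an upload method by file size using the client object model (CSOM) from PowerShell. It assumes the CSOM DLLs are installed and that $ctx is an already-authenticated Microsoft.SharePoint.Client.ClientContext:

    Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

    function Send-FileToLibrary($ctx, $library, $path) {
        $ctx.Load($library.RootFolder)
        $ctx.ExecuteQuery()
        $size = (Get-Item $path).Length
        $name = [System.IO.Path]::GetFileName($path)
        if ($size -lt 2MB) {
            # Small files: send the bytes inline with the request
            $info = New-Object Microsoft.SharePoint.Client.FileCreationInformation
            $info.Url = $name
            $info.Content = [System.IO.File]::ReadAllBytes($path)
            $info.Overwrite = $true
            $library.RootFolder.Files.Add($info) | Out-Null
            $ctx.ExecuteQuery()
        }
        elseif ($size -lt 35MB) {
            # Mid-size files: stream the content in a single request
            $stream = [System.IO.File]::OpenRead($path)
            try {
                $url = $library.RootFolder.ServerRelativeUrl + "/" + $name
                [Microsoft.SharePoint.Client.File]::SaveBinaryDirect($ctx, $url, $stream, $true)
            }
            finally { $stream.Dispose() }
        }
        else {
            # The largest files need a resumable, chunked approach with
            # retries - omitted here, since that's where the fun workarounds live
            throw "Too large for this simplified example: $name"
        }
    }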

Our path was also complicated by the fact that on certain Office 365 sites, our rights come from delegated admin privileges. This is the preferred way for consultants to get rights to help clients manage SharePoint Online, so we figure a lot of folks who are interested in CloudPrep are seeing this phenomenon as well. When you log in to a client's Office 365 site with delegated admin using the credentials from your own Office 365 account, you sometimes see the access denied page; log in again a few seconds later, and everything is fine. Our code had to expect and handle this contingency.
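In code, that contingency amounts to wrapping a retry loop around every server call. A stripped-down sketch of the idea:

    # Retry a CSOM call, since delegated admin logins sometimes
    # fail with access denied on the first attempt
    function Invoke-WithRetry([scriptblock]$Action, [int]$MaxTries = 3) {
        for ($i = 1; $i -le $MaxTries; $i++) {
            try {
                & $Action
                return
            }
            catch {
                if ($i -eq $MaxTries) { throw }
                Write-Warning "Attempt $i failed: $($_.Exception.Message); retrying..."
                Start-Sleep -Seconds (5 * $i)   # back off a little longer each time
            }
        }
    }

    # Usage:
    Invoke-WithRetry { $ctx.ExecuteQuery() }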

Another thing that we did not expect is that we're seeing some reasonable evidence that Office 365 uploads are being throttled. Most of the time, file transfers seem to be limited to about 300KB/sec; there are days when the transfer speed is even slower than that, sometimes by half. As such, it is difficult for us to estimate file upload times, and we're having to improve our algorithms to take these fluctuations and sea changes into account.

As for the cause, we can't say if this is something Microsoft is doing, or if it comes from the erosion of net neutrality. We do wonder if Comcast or other providers may be limiting traffic to Office 365 in order to give their own offerings a competitive advantage or just to control their own costs. I expect we'll be doing some tests in the near future, and we've been kicking around some ways to circumvent these bandwidth caps - at least partially. One test we did in January showed that if we took half our files to a different physical location, we were able to upload them to SharePoint Online in about half the time it would have taken if we'd uploaded them all from one server.

One thing that became clear during early development was that the disconnected nature of cloud storage introduces random problems along the way. As a result, in any large set of documents to be moved to the cloud, there will be some which, for one reason or another, are not successfully copied. We started by trying to get this failure rate as low as possible, down to less than 0.25% of files in most cases. We did a lot of work in early February to improve the code and reach this threshold.

Even so, we needed to be able to easily run multiple passes on any file copy operation and track the results. Our first prototypes had to crawl the Document Library in SharePoint one folder and file at a time. This proved to be incredibly slow, and it quickly became apparent that we needed to be able to gather status information for thousands of files at a time if we wanted to home in on only those which required an update from the local copy. This is something we added to the code base about a week ago, and we're now in the process of replacing some of our early code to use the new file comparison analytics logic.
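
If you're wondering what "thousands of files at a time" looks like, the trick is CAML paging. This is a hedged sketch of the idea, not our exact code; it assumes $ctx is a connected ClientContext and a library named "Documents", and pulls the whole library in 2,000-item pages instead of walking folders one by one.

  $list = $ctx.Web.Lists.GetByTitle("Documents")
  $query = New-Object Microsoft.SharePoint.Client.CamlQuery
  $query.ViewXml = "<View Scope='RecursiveAll'><RowLimit>2000</RowLimit></View>"
  do {
      $items = $list.GetItems($query)
      $ctx.Load($items)
      $ctx.ExecuteQuery()
      foreach ($item in $items) {
          # FileRef and Modified are what we compare against the local copies.
          "{0}`t{1}" -f $item["FileRef"], $item["Modified"]
      }
      # Carry the paging cookie forward to fetch the next page.
      $query.ListItemCollectionPosition = $items.ListItemCollectionPosition
  } while ($query.ListItemCollectionPosition -ne $null)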

As a side note, a bevy of SharePoint management features found their way into our PowerShell library simply because we had customers who needed them in short order. For example, we now have the ability to take a View from any SharePoint List and make a copy of it in the same List or a different one even on another SharePoint Site. Of course, one must be very careful with this kind of power, since creating Views with field references that don't exist in the List will certainly break the View if not the entire List itself. When we've added sufficient safety checks, we'll open the capability up as part of the CloudPrep product.
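
For the technically curious, the general shape of a view copy in CSOM looks something like the sketch below. The view title is a made-up example, $srcList and $targetList are assumed to be already-retrieved List objects, and note the complete absence of the safety checks just mentioned.

  $srcView = $srcList.Views.GetByTitle("By Project")
  $ctx.Load($srcView)
  $ctx.Load($srcView.ViewFields)
  $ctx.ExecuteQuery()

  # Recreate the view from its parts; fields that don't exist in the target will break it!
  $vci = New-Object Microsoft.SharePoint.Client.ViewCreationInformation
  $vci.Title = "By Project (copy)"
  $vci.ViewFields = @($srcView.ViewFields)
  $vci.Query = $srcView.ViewQuery
  $vci.RowLimit = $srcView.RowLimit
  $null = $targetList.Views.Add($vci)
  $ctx.ExecuteQuery()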

This week, we introduced the concept of using a hash algorithm to test whether files in SharePoint match those on our local drive. Use of a hash in addition to checking the file size and date stamps of a document ensures that the document has been uploaded into SharePoint and that it has not been corrupted in the process. We developed this ability in order to add credibility to Office 365 migrations where we may be moving hundreds of thousands or even millions of files, and we need to establish that the migration process has been completed satisfactorily. This capability can also be used to perform duplicate file detection, and we may develop a follow-on product or feature to do just that later on.
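
Conceptually, the check itself is simple; the sketch below shows the idea. The "FileMD5" field name is illustrative (not a built-in column), $item is assumed to be a loaded list item for the file, and Get-FileHash requires PowerShell 4 or later - older hosts can call [System.Security.Cryptography.MD5] directly.

  $localHash = (Get-FileHash -Path $localPath -Algorithm MD5).Hash
  if ($localHash -eq $item["FileMD5"]) {
      Write-Output "OK: the local copy and the SharePoint copy match."
  } else {
      Write-Warning "Mismatch: $localPath should be re-uploaded."
  }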

Next week, we're planning to work on some important features that we feel are a must for getting this product to where we want it to be.

The first is to make sure that we can translate between Active Directory permissions on the local file system and users in SharePoint. The primary purpose here is to preserve meaningful data for the Created By and Modified By fields in SharePoint; this is something we can't do yet. As part of this process, we'll be introducing PowerShell commands to add new users into SharePoint sites and manage groups. For most customers, this is probably of limited use. However, those with several hundred users or groups to manage will find it much easier to deal with these via PowerShell instead of the SharePoint admin web pages. For consultants, it will make migrations faster by speeding up the time it takes to implement the security configuration. Our goal here is to lower the cost of our migration services.
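
The CSOM end of this is the easy half; the hard part we still have to build is mapping NTFS account names to claims logins. Here's a hedged sketch of just the SharePoint side, with a made-up login and an assumed already-loaded list item in $item:

  $user = $ctx.Web.EnsureUser("i:0#.f|membership|jane@contoso.com")
  $ctx.Load($user)
  $ctx.ExecuteQuery()

  # Stamp both Created By (Author) and Modified By (Editor) on the item.
  $userValue = New-Object Microsoft.SharePoint.Client.FieldUserValue
  $userValue.LookupId = $user.Id
  $item["Author"] = $userValue
  $item["Editor"] = $userValue
  $item.Update()
  $ctx.ExecuteQuery()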

The next things we'll tackle after that will be:

  • Download documents from SharePoint to the local drive
  • Assign metadata from a CSV file as you upload documents
  • Flatten a folder structure as you upload it


These are harder to do than you might think. I'll post more on this in coming weeks, including our challenges and progress updates.
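
As a taste of why, consider flattening: once the folders go away, two files named Report.docx collide. Here's one simple local-side naming policy, sketched in plain PowerShell, that suffixes a counter. (It can still collide with a pre-existing "Report (1).docx"; handling that case is part of what makes the real feature harder than it looks.)

  $seen = @{}
  Get-ChildItem "\\fileserver\share\Projects" -Recurse |
      Where-Object { -not $_.PSIsContainer } |
      ForEach-Object {
          $name = $_.Name
          if ($seen.ContainsKey($name)) {
              $seen[$name]++
              $base = [System.IO.Path]::GetFileNameWithoutExtension($name)
              $ext  = [System.IO.Path]::GetExtension($name)
              $name = "{0} ({1}){2}" -f $base, $seen[$name], $ext
          } else {
              $seen[$name] = 0
          }
          # $name is the file's target name in the flattened library.
          "{0} -> {1}" -f $_.FullName, $name
      }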

Announcing CloudPrep 2014 Migration Toolkit for SharePoint Online

We do a lot of Office 365 migrations. Most of these are for businesses with fewer than 50 employees. This should surprise nobody except maybe Microsoft, who seemed to be slow to realize that their cloud platform would have the most appeal to companies with limited budgets – or that most jobs in the US are provided by small businesses. Go figure.

Over the years, I’ve written several times about the challenges of moving from a conventional file store to Office 365. Fact is, it’s just not simple to do. It really makes sense to have an experienced IT professional help you make the move. I like helping customers make the switch, but doing so has presented interesting challenges for my business that I’m sure other SharePoint consultants share too.

Firstly, there are great third-party tools out there for migrating files. We often use ShareGate and Content Matrix from MetaLogix. MetaVis is another great company with feature-rich tools. The fact is that even though these tools are great, they are also quite expensive. Because they're so feature-rich, really knowing a tool is a skillset of its own – and that makes good IT people hard to find when I need them to do a job. We also run up against serious limitations when trying to use these tools; sometimes we cannot find a way to migrate the files in exactly the way we want to.

Second, some of my clients already have a part-time IT person or managed services company that helps them service their PCs and on-premises servers. Traditionally, we're a SharePoint consultancy, and we never set out to try and replace other IT folks; they need work too. They have the relationship with my customer, and the local presence needed for on-site work. Over the years, I've seen that customers prefer to have their own local IT provider for most small requests. We needed to find a way to coexist with these other businesses in a way that would benefit us both.

Back in 2012, at the behest of a marketing consultant (who gave me lots of advice that was either bad or that I couldn't follow at the time), I created a small tool called CloudPrep. This tool wasn't much; I never had much confidence in it, and so I never really promoted it. But it did the work of renaming files that SharePoint didn't like, and combined with WebDAV it was enough to get 20 to 50 GB of customer files into the cloud in a few days' time. I released it into the wild, and CloudPrep has been getting downloaded a few times a week – mostly, to my chagrin, by other Office 365 consultants. Lesson learned, and another checkmark next to finding a way to coexist with other IT providers; there are more of you than there are of me!

One problem I’ve noticed is that Office 365 migration budgets are small – I mean really tiny! That’s weird when you consider that for a 25 person company the ROI could be hundreds of thousands of bucks. But, we have been in an economic slump for something like 5 years now. I guess that takes its toll; even if you knew it would make you a thousand dollars next month, you can’t spend $100 today unless you have it to spare. Some companies are reluctant to spend even a few thousand to plan and execute.

There are a few tools that are in the “beer money” range. I tried FilesToGo once – and only once. It lacked some features that seem obvious to me, and their absence made my client extremely angry. It didn't have a lot of options either; one size fits all. I won't discourage anyone from using it if it meets your needs, but I'm not going to risk my relationship with my clients on it. I am honestly surprised that after all this time, there's nothing else in its price range.

I guess you could say that I've gotten fed up with this situation. Yet again we had a migration to do where the current tools on the market couldn't meet our needs within the client's budget. That story gets old.

So, the boys in the lab and I finally built our own!

Announcing CloudPrep 2014! Forget everything you ever knew about that crappy tool we made back in 2012, because this is something at a whole new level.

CloudPrep 2014 is not one of those big expensive tools with a fancy GUI. It’s a set of PowerShell command-lets that work with SharePoint Online and your local file system. These commands and the sample scripts provided with them are designed to empower IT people and make migrating files to and from SharePoint Online a piece of cake.

These tools don’t replace an IT person or their experience. You’ll still need an experienced consultant to tell you how to organize your files, use metadata, overcome or avoid SharePoint Online limitations, and of course actually use the tools. You needed all that before anyway. The difference is that now much of this can be provided by your own experienced IT staff; or if you’re an IT consultant yourself, you can use our tool and make your small-business and small-budget migrations a breeze instead of a quagmire.

Our commands fall into four basic categories: planning, preparation, file migration, and SharePoint management. We're still putting the finishing touches on the product now. We're hoping to have the Lite and Standard editions released to market sometime in February, with the Premium and Professional versions available as soon as March or April.

In the meantime, please take a look at our feature matrix and proposed pricing structure. There's still time to collect some feedback. So, if you have a feature you'd like to see that isn't here, leave us a comment and let us know. Even if your feature doesn't make it in by the launch date, we're planning to add even more later. We'll entertain any reasonable suggestion – except charging more for the product.

Like what you see and can’t wait to try it out? Contact us and I’ll give you a 15% discount if you purchase during the early access period.

CloudPrep 2014 Feature Matrix and Proposed Pricing

Feature                                                                      Lite       Standard   Premium    Professional
---------------------------------------------------------------------------------------------------------------------------
Release Date                                                                 Feb        Feb        March      April
Proposed Price                                                               Free       $285       $576       $1,092 (+$300 per tenant over 2)
Number of Office 365 Tenants                                                 Unlimited  Unlimited  Unlimited  Unlimited
Number of Site Collections                                                   Unlimited  Unlimited  Unlimited  Unlimited
Requires PowerShell 2.0 or Higher                                            Yes        Yes        Yes        Yes
Requires SharePoint Client Connectivity                                      Yes        Yes        Yes        Yes
1 Year Support and Updates (renewable annually)                              -          Yes        Yes        Yes
Supported OS: Windows Server 2008 or 2008 R2                                 N/A        Yes        Yes        Yes
Supported OS: Windows XP                                                     N/A        -          ??         ??
Supported OS: Windows Server 2003                                            N/A        -          ??         ??

Planning and Reporting
Sizes and Numbers of Items by Folder, Extension, etc.                        Yes        Yes        Yes        Yes
Check for Potentially Illegal File Types                                     -          Yes        Yes        Yes
Folder and File Path Length Checking                                         -          Yes        Yes        Yes
Permissions Checking for Local Files                                         -          -          Yes        Yes
Target URL Length Check Report                                               -          -          Yes        Yes
Upload Time Estimates                                                        -          -          -          Yes

File Preparation
File Renaming for Illegal Characters                                         Yes        Yes        Yes        Yes
File Renaming for Illegal Paths (_files, _forms)                             Yes        Yes        Yes        Yes
Preserve Author and Editor for Uploaded Files                                -          Yes        Yes        Yes
Check for and Automatically ZIP Files with Illegal Extensions (EXEs, etc.)   -          -          Yes        Yes
Check for and Automatically ZIP "_files" Folders                             -          Yes        Yes        Yes

Migrate and Manage Files
Supports Network Mapped Drives                                               Yes        Yes        Yes        Yes
Supports Network UNC Paths                                                   Yes        Yes        Yes        Yes
Upload Entire Folder to Document Library                                     Yes        Yes        Yes        Yes
Upload Specific File to Document Library                                     -          Yes        Yes        Yes
Download Document Library to Folder                                          -          Yes        Yes        Yes
Download Specific File                                                       -          Yes        Yes        Yes
Warns if Source Exceeds 5,000 Items                                          -          Yes        Yes        Yes
Warns if Target URL Length Too Long                                          -          Yes        Yes        Yes
Specify Content Type for Uploaded Documents                                  -          Yes        Yes        Yes
Specify Content Type for Top-Level Folder                                    -          -          Yes        Yes
Specify Content Type for Sub-Folders                                         -          -          Yes        Yes
Support for Document Sets                                                    -          -          -          Yes
Flatten Folder Structure with Duplicate Filename Handling                    -          -          Yes        Yes
Flatten Folder Structure at 1 or More Levels Deep                            -          -          -          Yes
Convert Folder Names to Metadata Fields                                      -          -          Yes        Yes
Create Source URL Field for Uploaded Files                                   -          -          Yes        Yes
Create MD5 Hash Field for Uploaded Files                                     -          -          -          Yes
Export Metadata to CSV File when Downloading Files                           -          -          -          Yes
Synchronize Local and Cloud Files using File Modified Time                   -          -          Yes        Yes
Synchronize Local and Cloud Files using File Modified Time + MD5 Hash        -          -          -          Yes

Automation Features
PowerShell Command-lets                                                      Yes        Yes        Yes        Yes
Unattended Execution                                                         -          Yes        Yes        Yes

SharePoint Management & Development
Create and Edit SharePoint Users                                             -          Yes        Yes        Yes
Set Common Properties for Lists and Document Libraries                       -          Yes        Yes        Yes
Create and Edit Columns in Lists and Document Libraries                      -          Yes        Yes        Yes
Create and Edit Views in Lists and Document Libraries                        -          -          Yes        Yes
Copy a View to the Same or a Different List, Library, or Site                -          -          Yes        Yes
Import and Export Site Columns                                               -          -          Yes        Yes
Import and Export Content Types                                              -          -          Yes        Yes
Import and Export Views                                                      -          -          Yes        Yes
Add and Remove Users, Groups, and Permission Sets                            -          -          Yes        Yes

Key: Yes = included; - = not included; ?? = under evaluation; N/A = not applicable.

CloudPrep Lite
This edition is a good fit for small file migration needs and try-before-you-buy. You can use it to do basic reporting on the structure of your files, rename files that are known to cause problems during migration, and upload folder structures to your SharePoint Online document libraries. In most cases it has a 99.7% or better success rate, and it produces a handy report so that your remaining files can be uploaded manually.
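
Incidentally, if you want a taste of what basic structure reporting means, a few lines of generic PowerShell will approximate it (this is not the CloudPrep implementation, just the same idea): counts and total sizes by file extension.

  Get-ChildItem "\\fileserver\share" -Recurse |
      Where-Object { -not $_.PSIsContainer } |
      Group-Object Extension |
      Sort-Object Count -Descending |
      Select-Object Name, Count,
          @{ Name = 'SizeMB'; Expression = { [math]::Round((($_.Group | Measure-Object Length -Sum).Sum / 1MB), 1) } }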

CloudPrep Standard
This edition includes a standard set of features designed to help you move files into Office 365 with a minimum amount of difficulty. You can upload and download large file collections without having to stand by the computer, perform multiple upload/download passes, and specify a default content type for files. Run it from anywhere, including various versions of Windows Server. We also include some additional pre-migration reporting tools that help to identify problems before you migrate your files.

CloudPrep Premium
For the seasoned SharePoint admin or IT professional, this edition includes features that will help you get the most out of Office 365 in the cloud. We include even more reports to give you a 360 degree view into any potential file migration issues. The file upload tool includes a variety of features for setting metadata and flattening folder structures.

CloudPrep Professional
This edition enables the true Office 365 IT professional to handle migrations for multiple clients. It has all the features of the Premium edition, plus advanced content type features including support for Document Sets. It also includes the ability to create an MD5 hash field for uploaded files, which helps in detecting duplicate files and in determining that two files are not the same even when their date stamps match.

Lessons from the Field for Migrating to Office 365

Recently, I’ve talked a bit about how companies can save money in lots of places by moving to the cloud with Office 365, and I’ve also described some of the complexities involved in moving large file shares to SharePoint. Today, I’d like to take a few minutes to talk about some of the lessons learned on some of our Office 365 migration projects over the past several months.

Getting Good Information Up Front is A Challenge
As SharePoint developers, we’re used to working with the IT departments of larger organizations (say 500 to 5000 employees) as we develop solutions. However, with Office 365 customers, many times we’re not working directly with IT folks. The customer may have a managed service provider for desktop support, a part-time IT contractor, and some clients do not even have their own IT staff at all.

Needless to say, planning a move to Office 365 requires us to take stock of a great many technical details. It’s not surprising that folks outside of IT might miss the importance of the myriad trivial details involved.

But getting these facts wrong during the early stages can lead to incorrect estimates and costly mistakes down the road. It’s important to get the discovery right.
Here are some things customers should pay careful attention to when gathering information in the pre-project planning phase.

Basic Planning
Make a User Inventory
Know how many users you plan to have. We’re going to need their contact information, including phone and e-mail, because more than likely this information isn’t up to date in Active Directory. From there we can talk about what plans are best for your users.

Make a Workstation and Mobile Inventory
Know how many desktop PCs, laptops, and mobile devices you’ll be configuring. It’s also important to know what kind of mobile devices will be used and how many of each type.

Make a Server Inventory
Know exactly what servers you have, what operating system and version they run on, and exactly what purposes they serve (file shares, print server, domain controller, e-mail, etc.). If you do not know these things, you should consider paying for a 1 to 3 day evaluation to document all of your systems.

What Will You Turn Off After Migration?
Part of calculating the cost is understanding the benefits you get in return for it. If you’re not sure that a system can be fully disabled after moving to the cloud, that’s something we can help you figure out.

Will You Need Any Servers You Don’t Have?

For example, if you are synching Active Directory users to Office 365, you need a server to run this on - though it needn’t be very powerful. If you have applications running on servers that you otherwise want to decommission, you may need a server in the cloud to replace them. Likewise, if your security needs are high, you’ll want to have a CipherPoint Eclipse or F-5 Big IP running in the cloud in front of Office 365.

Domain Registration
You should verify that all your domain names are still current and that you have access to the DNS registration. We’ve occasionally had customers who have some older DNS names that were being used for e-mail aliases, and they weren’t able to migrate them fully because they’d lost the ability to manage the domain name. Check on these beforehand and avoid unpleasant surprises.

Remote Access
Some companies have VPN; this is ideal. Some do not and have to rely on clunky terminal servers or third-party services such as TeamViewer or LogMeIn. If you’re in the latter circumstance or haven’t set anything up at all, we should talk about what is likely to cause issues for the folks doing the migration work, because not all of these services are created equal.

What’s Your Actual Available Bandwidth?
Knowing if you have a T-1, cable modem, or DSL is helpful; it’s not the end of the story. We’ll want to perform some bandwidth tests at different times of the day in order to account for the connectivity that your company is already using. In general, migrations that have to be pushed to the evening or weekends will take longer.

Test for Equipment Bottlenecks
It’s also worth pointing out that some older equipment can actually be slower than the Internet connection can handle. Early on, we can do a trial run with a few files or a single mailbox in order to determine if there are going to be unexpected problems due to slow hard drives and outdated or overloaded servers.

E-mail Migration Planning
Know Your E-mail Server
Whether you’re using Exchange, Lotus, or some other server, it helps to know what we’re dealing with. We’ll need to know how many users you have, how big their mailboxes are, and what distribution lists you’re using. It’s not unusual to find a few people in a company with mailboxes approaching 20GB (or bigger!). Anything at this size is going to take a lot longer to move than usual, and that needs to be taken into account.

Great Firewall of Spam
For the mail server, the above is a good start, but not enough. You need to identify if you have an anti-spam appliance (e.g. Barracuda) or service (e.g. Postini) in front of your mail server. You probably won’t need it after moving to Office 365, but if you want us to make it a part of the move we need to know ahead of time.

E-mail Archives
Most people do not think about this, but Outlook Archives (*.PST) files do not move automatically to the cloud. One of the best approaches we’ve found is to copy their contents up into Exchange Online so that you’ll have access to them everywhere you go. If you’re using archives, it’s important to know this so we can take them into account when looking at mailbox sizes, migration plans, etc.

File Migration Planning
Make a File Inventory
Know where your files are, how big they are, what you will move, and what you might leave behind. Professionals have tools that can help to analyze your files and better determine the cost to migrate. However, these tools are only helpful if we have the opportunity to run them against all the files that will be moved.

Public Folders
If you use Exchange Public Folders, you will need to have those files copied down into a regular file share so they can be moved into SharePoint. Exchange Online does not support public folders, which have been phased out in recent versions of Exchange. When we determine the size of the file stores you’ll be moving, these files need to be included.

How Will the Migration Team Access Files?
Depending on the remote access method and the speed of your Internet connection, in some cases it may actually be faster to copy your files to a portable drive and FedEx them to us rather than have us try to copy them from your office. This also provides the fringe benefit of being able to split the migration up across multiple sites, which can make everything go faster.

Dealing with the Unexpected
Obviously, there’s no such thing as a crystal ball, and that’s even more true in IT. Aside from the things I talked about above, little things can go awry during the project. It’s important to remember that migrating to Office 365 is a big change from the way companies used to work back in the 90s. Be ready to expect and deal with the unexpected.
Here are some things we’ve seen happen in the middle of a project that can really get things out of whack.

Slippage
Sometimes it just takes longer to move files or e-mail than it seems like it should. It really helps to know exactly what we’re moving in the first place, but if your estimate and schedule were written sight unseen, before we had access to the servers, then there are probably baked-in assumptions that may prove to be wrong.

Even if we did a 1 day triage visit at the start of the project, sometimes the technology can make fools of us all. I had one customer where most mail moved over fine, but then one user’s mail dragged on for weeks on end simply because their outdated server would not provide it any faster.

Needless to say, schedule creep can be very disruptive. As a result, we’ve learned to base our schedules on being 95% complete – anything more can be managed as ongoing support and needn’t cause everything else to back up waiting for it.

Limits of File Migration Tools
To move files into SharePoint is not a drag and drop operation. Fortunately, there are many good products on the market, and the state of the art is constantly changing. But, these products are not what I’d call mature - partly because Microsoft keeps changing the Office 365 platform itself. Over the years, we’ve seen file migration tools for SharePoint Online that don’t copy the date stamps on your documents, tools with poor or quirky support for Document Sets, and tools with draconian restrictions on the size of files that can be copied.

If we are copying a large volume of files, it is not uncommon that we may need to do a test run and then start over. We try to account for this in our estimates, but it’s not a perfect science. Tools are great, but if a tool or product does not get the results we want, we may have to switch tactics. This is not a sign of the coming apocalypse. Be prepared for this to be a part of the process.

Limits of E-mail Migration Tools
If you are migrating from Exchange 2007 or better, Microsoft has some great built-in tools to make this possible. There are good third-party solutions for other platforms. Each of these has its own limitations. For example, Microsoft tools may not do well on extremely large mailboxes. Third-party tools may be more robust, but they will take almost twice as long because they have to copy from the source and then copy to Office 365, whereas Microsoft has the benefit of running their tool in the same local network.

Limits of SharePoint
SharePoint is like any complex software product; it has boundaries. There are limits on the amount of storage you can have in a Site Collection, and limits on the number of items you can effectively put in a List or Library. Our job as consultants is to come up with plans and designs that avoid as many of these as possible. Still, it’s important to understand that Microsoft is constantly changing Office 365 – usually for the better. There have been times that we tried out a particular approach for organizing content and then had to change tactics because one of our assumptions proved to be incorrect.

Here are some examples of fiddly details that have sometimes pushed us around:

  • Flat views don’t work in large libraries (> 5,000 items), even though you’d think they should be limited to the current folder.
  • In large libraries, indexes must be created before the library grows past 5,000 items (see the sketch after this list).
  • Document Sets can only have one view inside the Document Set itself.
  • Nesting folders within Document Sets is quirky.
  • You cannot easily change the look and feel of the “my-sites” part of SharePoint.
  • And many more…
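
On the indexing point in the second bullet, the fix is cheap if you apply it early. Here's a hedged CSOM sketch that marks a column as indexed before the library grows past the threshold; it assumes a connected ClientContext in $ctx, and the library and column names are examples.

  $list = $ctx.Web.Lists.GetByTitle("Documents")
  $field = $list.Fields.GetByInternalNameOrTitle("Project")
  $field.Indexed = $true
  $field.Update()
  $ctx.ExecuteQuery()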


Shifting Requirements
Migrating to Office 365 is a big change. Training and discovery are a part of the process, and so you might learn something about the platform that you did not know at the beginning.

Likewise, we may learn something about your business that was not clear at the start and this could cause us to change our recommendations. Stay nimble and flexible; these moments can be opportunities to improve rather than a cause of stress.

15 Things: a Day in the Life of a SharePoint Life Coach

Recently, we launched a new service called SharePoint Life Coach. This service is designed to be of value to customers who need help with SharePoint but have a limited budget to work with. To help folks understand this service better, I'd like to describe what sets it apart and some of the questions we answer in a typical session.

With traditional consulting, you the customer tell us what to do and then we tell you how long it will take.  We then run off and accomplish these things for you, and sometimes we work with you along the way. Some consulting is about making recommendations, some is about troubleshooting. In general, the focus is to bring you a finished product, whether that deliverable is a document, a working system, or a piece of code. Of course, all of this is billed by the hour, and having a consultant working full-time is beyond reach for many companies.

SharePoint Life Coach service differs from traditional consulting in a couple of important ways.

  • The customer sets the pace for sessions, based on their time and budget.
  • Sessions follow a semi-structured format, so that desired material can be fit within the allocated time.
  • The focus is on consistency, and results will emerge - the idea is to have regular sessions over an extended period that help ensure better results.
  • The deliverable is you - our approach is "teach a person to fish and they'll eat for a year".


As we bring new folks into the Life Coach system, one of the first things we do is to set up topics for that all-important first session. Sometimes the hardest thing is knowing where to begin. Many times, our customers come to us after just getting started with Office 365 and SharePoint Online. They quickly realize that SharePoint is a very complicated product, and that there is more to managing it than just pulling some levers on the Office 365 management portal web site.

Over time, we've found that folks are asking some of the same questions over and over again. On some topics, we start to feel a bit like a broken record. Though we've answered those questions many times, the answer varies from customer to customer based on their specific story - for example, the size of their company, the tech savvy of their staff, etc. As a result, while there are common themes, there is no one-size-fits-all solution for these things. Being able to tailor these recommendations to your needs is what having a SharePoint Life Coach is all about.

Here are some of the popular topics that people have asked for:

  1. What are some of the pitfalls that I should avoid while working in SharePoint?
  2. What training do my end-users need to work effectively in SharePoint?
  3. How do I keep SharePoint from becoming a mess?
  4. What is the right way to structure my SharePoint web site and sub-sites?
  5. When should I use a sub-site, a list, a library, a document set, or folders?
  6. I know folders are bad, but my users love them; how do we cope?
  7. Everyone misses the network shared drive. How can people work with files quickly in SharePoint?
  8. I want to use SharePoint as an Intranet for my company; what kinds of content and things should I put on it?
  9. What's the best way to structure users, groups, and permissions?
  10. I've heard of SharePoint governance. It turns out it was a 500 page document. Is there anything for small businesses that is like governance-lite?
  11. Should I buy a file migration tool, move my files by hand, or just hire someone to do it for me?
  12. Should I use an Outlook Shared Calendar or a SharePoint Calendar?
  13. Can I organize our list of customers in SharePoint?
  14. How long does it take for stuff to show up in Search? Why aren’t my PDF files showing up?

Who Really Has the Best SharePoint Workflow Product?

I came across this blog article today, asking the question "Who has the best SharePoint Workflow Product?" This seems to have gotten a lot of attention, and so far I see that over 4,500 people have voted. That's some serious interest!

I sometimes get this question from our customers, and this is particularly challenging for me because often the correct answer is "it depends". Sure it sounds like a copout, but it's really not a very simple question.

It gets even a bit more complex for us, because we partnered with Nintex and AgilePoint, and needless to say Thanksgiving dinner can get a little bit awkward if I were to try and declare a unilateral favorite. But, read on and you'll see there's a reason that things played out that way.

I'm going to do my best to approach this question as impartially as I can. I will be very candid. From my point of view, the three workflow products mentioned in the article - AgilePoint, Nintex, and K2 - are certainly the best of breed among SharePoint workflow products. There's also Bamboo, Datapolis Workbox, HarePoint, MetaStorm, and Global360 to name just a few; but I really feel like most of these have missed their chance to take a leadership position in this space, in one way or another.

So here it goes: the good, the bad, and the ugly of SharePoint workflow and third-party products.

Why Not OOTB SharePoint Workflow?


You can't have a serious discussion of third-party workflow products in SharePoint without asking the obvious question, "Why not use SharePoint workflow in the first place?" Personally, I am not a fan of SharePoint's so-called out-of-the-box workflow, for a lot of reasons.

OK - deep breath, inhale…

The first thing that jumps out at me is the way that Microsoft has absolutely bungled SharePoint workflow when you look at what they've done over the past ten years. In SharePoint 2001, they had workflow, but in SharePoint Portal Server 2003 they took it away completely. In 2007 they brought workflow back, using something like Outlook rules to help end users develop simple workflows, or Workflow Foundation in Visual Studio for the really complex stuff. These had serious limitations and neither could be effectively created by analysts alone, so in SharePoint 2010 they introduced some Visio capabilities - but then totally dropped the ball by taking away any ability to do simple workflows with loops or anything like "go back to step 2". I was sure they'd get it right in SharePoint 2013, so I was horrified to learn that they have completely revamped the workflow system so that now 2010 workflows and 2013 workflows are completely different and incompatible - and that in the 2013 version there are a significant number of actions you can no longer do that worked in the 2010 version. To me, this is not a stable and mature part of the platform; to leverage it will be like building on shifting sand and you should be prepared to rebuild everything in a couple of years if you go this way.

More so, SharePoint's native workflow cannot handle complex, recursive, or long-running flow patterns. Some processes are just too complex, long-running, or rapidly changing to be supported by SharePoint's native workflows without a great deal of custom code - unless you use a third-party workflow product or, in some cases, a full-blown Business Process Management (BPM) suite.

As for Visual Studio workflow, custom code is expensive and time-consuming both to create and to maintain on an ongoing basis, so the best practice in almost any situation where the problem can be solved by either custom code or an existing product on the market is to use the existing product.

Finally, even for SharePoint Designer there's a valid point: if you have developers available to do SharePoint workflow in Visual Studio or SharePoint Designer, there is a very, very, very good chance that there's something (anything!) else they could be doing instead which would give you a better return on their development time. Thus, we strongly recommend that you pick at least one third-party workflow product.

That being said, let's move on and take a look at some products!

Nintex Workflow


Nintex is an excellent choice if you have workflows that are more complex than can be easily done out-of-the-box with SharePoint. It's comparatively easy to set up and use when you contrast it with anything K2 has to offer. As a result, Nintex is commonly used to supplement SharePoint workflow in a great many SharePoint farms.

 

I haven't recently looked for any market data, but I'm pretty sure that Nintex is by far the most successful SharePoint product in terms of pure sales, and as a partner we love the Nintex web site for its ability to give us the resources we need to market and demonstrate their product. You will have absolutely no difficulty finding a reseller in your area to help you with professional services - although I hope that you'll just call on us instead. We'd be happy to give you a demo. Personally. ;-)

Nintex has pretty good integration with systems outside of SharePoint. Off the top of my head I know that we can use it to do most basic tasks within SharePoint, plus we can call web services from outside systems or manipulate databases. These features are pretty easy to use, but I would not say that your average SharePoint user or site owner will necessarily know how to leverage them. That said, most business users will be able to get by doing things purely inside of SharePoint.

Nintex has EXCELLENT support for the cloud. They have a version of their product that runs on Amazon Web Services and integrates with Office 365 SharePoint Online. At this writing, I'm not aware of any other workflow product for SharePoint that can claim this.

As far as downsides go, I'd say that the Nintex architecture suffers from the same issues that SharePoint workflow does, so on SP 2010 or older your workflows are going to run on the web front end (WFE) and will consume resources there. As a result, you may need to add more WFEs to your farm the more you use it. This is mighty convenient for Nintex, since they license the product per WFE in your farm.

One other thing to note is that Nintex can't really handle the high complexity in some of the processes that we develop for our clients. We're talking about long-running processes that could take months or over a year to complete and have hundreds of steps. You see things like this in government agencies a lot. I've done really mind-bogglingly complex ones for NIH, FAA, and most recently USDA. Personally, I wouldn't want to try to use Nintex to solve these sorts of problems.

AgilePoint (particularly Genesis Edition)


Considering all the workflow products for SharePoint, probably the main thing to point out about AgilePoint is that it is so much more than just a workflow engine. There just aren't that many players out there in SharePoint workflow who can honestly claim they are a fully functional business process management system, or BPMS.

 

As a result, AgilePoint workflows can be changed while running. A long-running flow will not be "orphaned" by changes to the process that occur while it is in progress. This is perhaps the very best feature. As a rule, if your process has 25 or more steps and is completed over the span of a month or more, you should strongly consider AgilePoint.

And yet, in contrast to many other BPMS systems, AgilePoint is designed exclusively for the Microsoft .NET framework and relies heavily on the MS product suite for its creation and implementation, rather than using proprietary tools. AgilePoint uses Microsoft Visio to design workflows and InfoPath to create forms, so any office with full Microsoft Office licensing already has all the tools AgilePoint requires. It integrates natively into SharePoint workflow; AgilePoint workflows can be deployed to SharePoint at least as easily as SharePoint's own native workflows can be deployed from SharePoint Designer.

AgilePoint Genesis installs natively alongside (and co-exists with) other SharePoint workflows. It supports every known pattern of dynamic and ad-hoc workflow identified by the BPM industry and provides 36 different functions for interacting with SharePoint. More are available with the Enterprise edition, and the possibilities with custom AgileParts are virtually limitless. This functionality leads us to conclude that AgilePoint sports A+ level “tight integration” with SharePoint. Users will never know their process has left the SharePoint server, yet AgilePoint will not negatively impact SharePoint performance in any way.

As a company, AgilePoint's primary focus is on workflow, and the product is designed to make creation and modification of workflows easily accessible to business users, rather than requiring high levels of programming skill. A business analyst with strong knowledge of Visio can be trained to create a fairly complex workflow within half a day. Workflow activities can be based on InfoPath forms created by anyone with the technical savvy to create forms in Microsoft Access. Most process revision consists of moving objects on a diagram and doesn’t require a developer at all.

Call me a total geek, but one cannot discuss the strengths of AgilePoint without at least mentioning some of the obscure but important technical aspects that make it a truly impressive product. For one, AgilePoint’s model is declarative, meaning that almost no code is generated to drive the workflow process, only XML; this is in sharp contrast to many BPMS as well as the MS Workflow Foundation engine (SharePoint, K2, Nintex), which all use a high amount of dynamically generated source code to drive the workflow logic. In fact, AgilePoint actually uses the Visio document format itself to drive its workflow engine, so the process is literally running the exact same flow-charts drawn by the business analysts and developers! Another advantage is that AgilePoint is one of only a very few pure-play .NET BPMS out there in the market. Also, the product is built entirely on .NET; there is no part of the product which inherits from COM, as many older and more well-established players in the market still do (over 10 years after .NET’s debut).

That’s not to say you can’t program against it if you want to; developers can write full-featured extensions in .NET, and they often know tricks to make InfoPath and SharePoint do things that go well beyond out-of-the-box capabilities. We've found that many additional things can be done if you're willing to add custom web services to the mix (also true for Nintex, to be completely fair). For example, we built a set of web services for one of our clients that allows them to move and copy Document Sets around in SharePoint using AgilePoint, and it also implements structured creation of new team sites, which is an important aspect of SharePoint governance.

Finally, the very low cost of AgilePoint's Genesis product is a significant advantage, putting it within reach of smaller companies and even single-project budgets. AgilePoint's Enterprise Edition is traditionally a product costing five figures; however, they recently reduced their pricing quite substantially to be competitive in the SharePoint market. For 100 users, a typical annual fee for Genesis with AgileReports and InfoPath support would be less than $5k, and governments and non-profits get even better pricing. They've also proven to be flexible about selling additional components a-la-carte from the higher editions of the product if you only need a few. It's worth pointing out that even at the Enterprise price, it holds its own nicely against many six and even seven figure alternatives.

For up to date AgilePoint pricing or other product information, please fill out our short request form. You'll be taken to their product information and download page afterwards. If you decide to download the free version, please let them know we sent you.
By now you probably realize that I truly love working with this product. So, I will mention a couple of disadvantages, just to prove I am being completely honest.

Firstly, I have to say that while AgilePoint comes closer to the promise of developer-free workflow than just about anyone else does, their system is still quite complex, and you will need the help of an experienced consultant to really make it sing. (I swear I am not saying that just so that you'll hire us.) For simple workflows, you will be fine following the basic patterns, for which there are many demos, and I think most business users could probably make minor adjustments to processes. This is where a key AgilePoint strength can become a bit of a weakness, because it really lets you build amazingly complicated workflows. Once something gets that complex, of course it is going to require a specialist.

Also, AgilePoint does have a runs-in-the-cloud option, but it lags behind Nintex in terms of support for Office 365. Last we heard, you can't initiate a workflow instance from inside a SharePoint Online list or document library. However, their support for Office 365 sites as part of a workflow that starts in some other way is pretty good. If you're running a hybrid farm scenario with one foot on-premises and one foot in the cloud, you might be able to work around this. Also, their technical team is pretty savvy, and I live in hope that they might catch up pretty soon.

Another drawback is that AgilePoint Genesis is reliant on InfoPath. That could be a strength, depending on how you look at it. Microsoft has promised that InfoPath will be a part of SharePoint until at least 2020, but they've pretty effectively bumbled the message to customers and partners alike about what we should use instead of InfoPath. AgilePoint does have their own forms engine that is part of their enterprise product, and we're hoping to see some flavor of that included into Genesis edition so we can offer an option for folks trying to actively avoid InfoPath in their solutions.

One final note is that we've learned that very, very large forms can cripple the ability to do parallel processing. This is because each step in the AgilePoint process is a view in the same InfoPath document, and two people can't edit the document at the same time. However, it's possible to work around this issue by designing your processes with this limitation in mind.

All in all, we find that AgilePoint pros far outweigh the cons. If you want a six-figure BPMS at a four-figure price and would like to avoid spending millions of dollars to support a system that might see fewer actual workflows implemented on it than I have fingers on one hand, skip the big boys and build it in AgilePoint.

K2


K2 BlackPearl and BlackPoint, its lightweight SharePoint version, are great products built on .NET technology and well suited to strong integration with SharePoint. K2 has been around a long time, and as a result their product has a great feature set. They were the dominant player in SharePoint workflow until Nintex came along and ate their lunch as people made the switch from 2007 to 2010.

K2 has good integration with products that are not SharePoint. In fact, I'd describe their flagship product as a standalone workflow product that just happens to play really well with SharePoint. As such, you won't have any serious issues using it to connect to Oracle or other non-Microsoft systems - though it is built on Microsoft technology, so it's going to be stronger in that scenario.

In some ways, I feel a little bit guilty - as if my review of K2 should be a little bit longer. However, simply put, they're far too expensive for my taste. It costs a lot to buy, there aren't that many people who know it really well, and development isn't lightweight enough to hand to business users, so there will always be a development cost for using it.

My recommendation is that if you already leverage K2 in your enterprise, then using it in SharePoint is a no-brainer; if you haven't already got it in-house, you should weigh it against the other options available.

(More of my thoughts on this are now in the comments; thanks to the community for challenging my thinking on this.)

MetaStorm


As a BPMS platform, MetaStorm has considerable strengths. Its primary focuses are on forms creation and business process modeling (i.e., analyzing and optimizing a flow that is not well understood in order to improve it). Its proprietary forms creation mechanisms are fairly robust, and they are fully integrated with its process flows. In addition, it can be integrated with Microsoft Office - a toolbar at the top allows work in MS Office applications to be integrated into MetaStorm processes, once the MetaStorm client has been installed. MetaStorm's design philosophy was to create a "One-stop shop" where flows, forms, reports and dashboards can all be created and managed within the same interface. For those who are adept with that interface, this can be an enormous advantage.

However, MetaStorm's weaknesses make it less than ideal for managing the workflow within SharePoint.

To begin with, while both products make the claim that they are integrated with SharePoint, it is very important to point out that MetaStorm is only “loosely” integrated with SharePoint. It offers web parts that are "windows" into the MetaStorm engine, allowing access to forms and dashboards, but these web parts can't be used to create MetaStorm elements; they merely interact with them. The actual forms and processes are housed entirely within the MetaStorm server, and users of the web parts are frequently directed to external web pages within that server. Sometimes web users are forced to accept functionality that is much more limited than that provided by the MS Office add-on.

The connections to SharePoint processes are not native and need considerable configuration and technical expertise. While MetaStorm processes and forms used solely within that product can indeed be developed by mid-level Information Workers, the ability to wire MetaStorm flows to SharePoint at various connection points requires strong developer-level skills; it is our opinion that it's a tool best suited for large organizations where an entire IT department exists to create and modify workflow, where that department can be trained on the use of a specialized, proprietary tool. There would be a substantial technical cost for most organizations to acquire these additional skills on top of skills in SharePoint development.

On a recently completed project, we approached the company to show us how we might manipulate SharePoint sites, documents, and other assets from within MetaStorm. What we found was that this always came down to custom code. OpenText does have some impressive libraries of scripts that can be used for this purpose, and they seem willing enough to share; but I keep coming back to this - it is more code, and it will need to be maintained.

Finally, we could find no example of anyone leveraging InfoPath as the form repository with MetaStorm as the BPMS nor could OpenText point us to one, although this issue may pale compared to the complete confusion regarding Microsoft's plan vis-a-vis the future of InfoPath.

MetaStorm has been around for a while under different names and companies. It has both a Java version and a .NET version. Parts of the .NET version of their process engine pre-date .NET and require higher-level developer knowledge to program or troubleshoot. What will happen to MetaStorm in the future is really unclear to us, since OpenText also owns a couple of other workflow products, including the product formerly known as Global360.

For these reasons, we don't generally recommend trying to implement SharePoint workflow in MetaStorm. It's not necessarily a bad product, but it just doesn't seem like the right product for the job 99% of the time.

Other BPMS Products


The vast majority of BPMS products come from the IBM technology space, are written in Java, and they typically do not integrate with SharePoint at all. This makes the set of developer skills required to build and maintain flows in those products far different from the set needed for managing SharePoint. Many are also cost prohibitive. In an environment where SharePoint already exists this would certainly drive up costs beyond what is reasonable. In general, I don't think it's such a great idea to use these systems combined with SharePoint - YMMV.

SharePoint Workflow? Why Not Zoidberg?


If I had to pick a favorite from the list above, I would have a very hard time choosing between AgilePoint and Nintex. So, here's where I have to ask the question that I do not hear people asking very often: if these different products have such different strengths and weaknesses, why not simply use more than one?

I happen to think that's a great idea. Use Nintex for your quick-and-dirty, self-service, six-guns-blazing SharePoint workflows that work really well with the laissez-faire approach to SharePoint collaboration - particularly in Office 365. Use AgilePoint to develop complex or long-running processes that will improve the maturity level of your organization and require continual adaptation and improvement. Especially when you look at prices for both AgilePoint Genesis and Nintex for Office 365, you'll see that you can probably fit both of them into your budget easily.

Did you like this article or find it helpful in making a decision? Do you work for one of these companies and feel like I didn't give your product a fair shake or left something out? Perhaps you've used one of these products in your organization and have an experience or opinion you'd like to share. Leave me something in the comments, subscribe to my blog (see upper right of this page), tell your friends about us, or give us a 5.0 on PinPoint - it's cheaper than buying me a beer and won't get lost in the mail.

----------------Comments from the old blog---------------------

Massimo
12/12/2013, 7:17:50 AM
Great write up and very useful for anyone wanting to make a decision on choosing the right workflow/BPM tool for their SharePoint. 
 
Thanks and keep them coming!


Jey
12/12/2013, 11:25:46 AM
K2 is a fantastic product. It provides simple and easy approach to bringing data, forms and workflow capabilities together into applications that are configured. Reuse is at the core and configuration is everywhere. This is where I see the product lending itself for people to learn it quickly and leverage it massively.


Dayv
12/12/2013, 11:46:04 AM
Your experience with K2 must be very outdated, as web based designers allow everyone to build processes. K2 also integerates with any .Net service application such CMS, SalesForce, SAS, etc., so there really are no limits.  
 
As to the price, well it is true that you get what you pay for. 
 
Too many of the SharePoint integrated BPMS products, especially those built on SharePoint Declarative Workflows (ah... extensions of SharePoint Designer), are too dependent on Microsoft not making any changes, and tend to break when SP Service Packs are rolled out.


Thomas Carpe
12/12/2013, 12:01:30 PM
Dayv and Jay, 
 
I agree with you that K2 is an excellent product. They've been around for several versions of SharePoint and so their feature set it robust and mature. 
 
I do not agree that it is a problem of you get what you pay for, as all these products are excellent for what they were designed to do. From my point of view, the main challenge with getting customers to adopt K2 has always been price, and that goes for any large scale product (take AvePoint's DocAve as one example). Especially since 2009, there's been a lot of downward pressure in the marketplace and with the appetization of the SharePoint market it is a challenge to get any but the largest enterprise to adopt a five or six figure solution no matter what bells, whistles, and unicorns are included in the box.  
 
Workflow (and the need for BPM) often starts at the project level and not in the enterprise - at least that's my experience where I have seen it succeed, and therefore the means are going to be on a much smaller scale in general. In particular, Office 365 customers have especially small budgets. 
 
I wouldn't say that my experience with K2 is limited, but I do admit that it is a bit out of date. The last time I used the product in a solution was on the SP2007 platform and at that time the engine was reliant on Workflow Foundation and thus had all the same fundamental flaws that Dayv describes regarding patching regimens and such. 
 
One thing I do feel like I need to rebut about your comments on declarative workflows: what you say about anything relying on SPD workflow, XOML, etc. is absolutely correct. One might say that same is true with WFE as a service packs and Microsoft product upgrades will almost certainly break workflow - just look at what happened with SP2010 vs. 2013 and workflow. However, AgilePoint's declarative model is their own XML schema and not based on XOML at all; therefore it has none of those drawbacks. In my view it has proven to be very reliable. 
 
It seems not we've heard a bit from some folks from K2, and I do appreciate that since my review of that product is a bit sparse and I think people need to hear about what it can do well in addition to where it falls short. I may take another look at the product if the opportunity presents itself.


Renier Britz
12/12/2013, 2:49:50 PM
Hi Thomas, 
 
Thank you for taking the time to put this post together.  
 
To be completely transparent, I work for K2.  
 
The first thing, you are correct by saying that K2 relies on Workflow Foundation our runtime execution engine is built on workflow foundation, to date we have never been disrupted by any updates on Windows Workflow Foundation. SharePoint ships with a workflow runtime that is also build on top of windows workflow foundation. You called out SP 2010 vs SP 2013 workflows, it’s not because of changes to Windows Workflow Foundation it is because of changes to the implementation of workflow runtime on top of workflow foundation. K2 is unaffected by changes made on the SharePoint workflow runtime as we don’t rely on the SharePoint workflow runtime and therefore don’t inherent the same set of limitations. 
 
The second point I would like to address: Pricing – as you mentioned “Workflow (and the need for BPM) often starts at the project level and not in the enterprise” I agree 100%. K2 pricing is competitive and allows organizations to start small, in many cases started on department level. We have more options in the works, lookout for major announcements in March ’14 at the K2 User Conference. 
 
Now back to what this is all about, workflow tools for SharePoint – With SharePoint 2013 Microsoft made a ton of existing enhancements, the app model being one of the most interesting changes. We had a choice, take what we have and make it work on SharePoint 2013 OR take full advantage of the changes and build something that will truly change the way people create forms and workflow-driven apps on the SharePoint platform (emphasis on create, this should not be a developer only play). The easiest way to get familiar with what K2 has to offer, go and have a look at the following recorded webcast: http://pages.k2.com/sharepoint2013beta.aspx 
 
If you have any questions let me know. 
 
Cheers, 
r


Thomas Carpe
12/12/2013, 3:15:45 PM
Renier, 
 
Thanks for sharing. I really do appreciate getting an outside perspective on K2. Our goal is always to make sure our customers get the right solution that works for them, which as I said before is one of the reasons that I may seem a bit ambivalent when it comes SharePoint workflow and third-party products. 
 
Going to what you said about WFE, you are correct and I am sorry if I was less than clear on this. WFE is a sub-structure which is different from SharePoint workflow in the same way that ASP.net is a sub-structure and not the same thing as the SharePoint API. What I was referring to in the comment about the move from 2010 to 2013 SP workflow is that there was a lot of shifting around in the way Microsoft implemented workflow between the two versions. 
 
Some of that may be, like you said, part of a fundamental shift in the way Microsoft wants developers to deliver on the platform. Back in July, I heard Ira Fuchs present on the differences between SP workflow in 2010 and 2013, and I have to say that I was not impressed by the loss of capabilities in the new model, or by MS basically saying that if you don't like the takeaways, well, you can still build SPD 2010 workflows instead. 
 
Maybe they will provide it later - maybe not. Either way, if you can't manipulate SharePoint with it, what was the point in Microsoft making the update in the first place? For now, all the third-party workflow products are safe until MS figures out how to do it right, but after 5+ versions I'm not holding my breath. 
 
At any rate, it's all part of a big shift to client-side code, and like many people in the SharePoint development world I have mixed feelings about that too - and I am not fully sure that I can say I trust MS to deliver a framework where we can exceed our clients' expectations, at least in the near-term future, since they usually take their time and a few tries before they do anything right. ;-) 
 
I do look forward to seeing what you guys are cooking for next year. Perhaps I will revisit this topic then, or take a more detailed look into K2 at that point. 
 
By the way, I maintained your product link in the comment. Make sure to hit your bosses up for a Christmas bonus and have a great holiday. ;-)


Steven Bretti
12/12/2013, 8:58:04 PM
K2 is a serious BPM product that allows for a number of capabilities that cover business needs very well. Forms, Data, Workflow and Reporting. It empowers the business users with simple user interfaces and no-code solutions. 
I think this is its biggest advantage, the ability to provide no-code solutions not just for the workflow, but also for capturing data through K2 smartforms, and any CRUD based requirements through K2 smartobjects without having to write code. Purely point and click through the UI. 
Yet it still has the capability to scale this out at a later stage to do more advanced business automation. 
This is important. You want to buy one product to cover the enterprise needs, rather than have multiple tools that you're paying multiple licences for. 
K2 can live within your SharePoint solution as a seamless application, or it can live on its own. It is not limited by SharePoint. It also provides integration options to common enterprise systems such as CRM, Salesforce and other LoB systems. 
 
Definitely a product worth considering in any BPM based solution, whether you are looking at it for a SharePoint based solution, or for your enterprise needs. I think it covers both very well, and priced accordingly.


Thomas Carpe
12/13/2013, 5:52:12 PM
Thank you, Steven, for your insights. 
 
I do think that what you're saying is probably true for any enterprise-class BPMS product, including AgilePoint Enterprise Edition. K2 also has a light-weight version that runs in SharePoint alone, as I understand it, so to be fair I think I'd compare *that* version to Genesis and Nintex. It just seems a bit unfair to me to judge different weapons manufacturers by comparing a tank to a rifle. ;-) 
 
I have to say, I've noticed that this blog has gotten a lot of attention from the folks at K2. Their marketing department must be really good about getting the word out. I certainly don't mind, since it's great exposure and I love a vigorous debate. It would also be cool to hear some more from the people in the original survey who voted for Nintex and AgilePoint. ^_^


Thomas Carpe
12/13/2013, 6:06:16 PM
Decided to check in on the original survey and see if things were still holding neck and neck or if there might be some trends. I was surprised to see today that AP is pulling ahead at almost 40%, and Nintex is not too far behind. 
 
You folks who love K2 may have a point, but the survey seems to be saying something slightly different. Well, it's a web poll, so I guess you can't take these things too seriously, right? ^_^


Gerhard
12/16/2013, 11:36:50 AM
Hi Thomas, 
 
Thanks for a very interesting article. I do not have any experience with Nintex or AgilePoint, but your comparisons and descriptions of them have been interesting. I do, however, have more than 10 years' experience with K2, and I'd like to add to your review of K2 - and in fact compare it more closely to what you have written about Nintex and AgilePoint. Especially since, as you said, you did not have a lot to write about K2, maybe this will be helpful.  
 
K2 vs Nintex 
Building simple SharePoint-only workflows in either tool, I guess, is going to be a matter of preference. However, the fact that Nintex workflows run on the WFE vs. K2 having its own dedicated execution engine is quite a big drawback. I guess that could even out if K2 is installed on a very small single-server footprint, but I really like the fact that K2 can easily scale, and a process that starts out simple now can grow as your organization's needs grow. I believe that Nintex workflow reporting data is only stored for something like 90 days - in K2 this is not an issue at all, as all data is stored in K2's own databases and can be kept indefinitely or archived after a period of time. The other drawback for Nintex is complicated workflows, as you mention - that should not be a problem with K2.  
 
K2 vs AP 
 
You mention long-running processes as a major feature and benefit; that is something K2 is also good at, as mentioned above. K2 processes are versioned, so by default process instances remain on the process version they started on (which is a good thing IMO). Tools exist to manage cross-version migrations if that is really necessary. The ability to create K2 workflows inside Visio has existed since the late 2000s, but I guess it never really caught on with customers; the K2 UI is already so user-friendly and easy to use that the Visio UI components disappeared from the scene, and frankly I do not miss them. So me personally… not too excited about building processes inside Visio in AgilePoint. It sounds almost like building forms inside MS Word - the tool does not fit the job. I have not seen AP's implementation of this, but that is just my opinion of it. Ease of use and non-developers making changes, while still having the flexibility to create amazing add-on components, can all be done in K2, just like in AP. Your point about needing a consultant to "really make it sing" - I like the way you put it, and that's also true for K2, but again it's probably fair enough, because with any such complicated platform you will need a specialist to really bring it to its full potential.  
 
So from the information on your blog, it seems to me that K2 is really not that different from Nintex or AgilePoint - and I guess you are right that you will have to know your requirements for a workflow product very well, and only with that in mind can you really decide amongst these contenders. Personally, my vote will go to K2, because I am confident that I can really build any solution on it and scale it well into the future.  

Thomas Carpe
12/16/2013, 12:29:47 PM
Gerhard, 
 
All your points are well taken. This was the best read on K2 that I have seen yet, and yes I think you really are helping to create a complete picture which is great. I've gotten a lot of good information about a product I admittedly knew less about. 
 
I'd like to throw in a few segue comments if I might, that were just sort of inspired by your remarks. 
 
The first is about Visio and its relationship to workflow. If I were diagramming a workflow or business process without any sort of BPA behind it, obviously I would diagram that in Visio using a flow chart. That's a no brainer. 
 
Over the years, I've used a variety of tools including BizTalk and yes sometimes even SharePoint Designer, and I've always been disappointed in their sad attempts to integrate with Visio. I never used the tools for Visio and K2 either, so I can't comment if they're better than the ones I've used. 
 
AgilePoint was the first product that I saw where I could basically map out my entire flow as a drawing and then publish it and refine the properties, etc. In that aspect my biggest frustration is often that I get a flow chart in Visio from a business analyst and I have to re-draw the entire thing using AP shapes. The thing that I really like about it is that when it shows the status of my workflow in SharePoint, it uses my actual Visio drawing to display it, including any custom shapes and comments I might add to it. There's an image up on this blog post as an example. Somehow people just really like seeing where their process is; and having the flexibility to display it in ways other than a cascading downward waterfall is great for me. If some other products can do that for me, yes, I would really like to see that. So, maybe you don't go squeeeeee over Visio, and that's OK, but I see real value in using it. 
 
The other thing I thought of here is that it does generally depend on the developer's confidence in a product. And confidence is generally a function of experience. We need those past projects to help us understand the capabilities and also the limits of the products we are working with. This is true of SharePoint, and I think it's also true of any product, including the workflow products we've been discussing here. In the end, we support those products where we have the most positive experiences and tend to drift away from products where we have negative experiences - or insufficient ones. And with that in mind, it's more than just a technical question but also a question of the marketplace. Though I worked with it before, my small firm couldn't find and win K2 projects at the time when we really started doing business, which was 2010. Thus, we never really developed that affection for the product. We got some lucky breaks working with AP and found that we were able to do some really impressive things with it, and at the same time there were a lot of clients asking for Nintex because their needs were less complex and they liked what the product could do. 
 
For this reason, I think you can't necessarily read what I've had to say all along as a recommendation per se; rather, it's a comparison based on my personal experience. If someone is looking for impartial analysis to choose a product, there's Forrester and Gartner, etc. My hope is that we can help people understand what we're able to work with as consultants and developers, and which types of projects we've found these products do a good job of meeting the requirements for. To that end, I think there's no right or wrong answer to the question at the top of this article. 
 
That being said, if we can all manage to argue about it for just a little bit longer, I think everyone will benefit from the free publicity. ;-)


Mike Fitzmaurice
12/20/2013, 11:22:35 AM
Full disclosure: I am an employee of Nintex. 
 
I love blog posts that spark discussion — especially when Nintex is part of that discussion. Looking amid the fray, what’s quite apparent is that the wonderful world of SharePoint workflow solutions is alive, well, and worthy of plenty of enthusiasm. And that’s a Good Thing.


Thomas Carpe
12/20/2013, 12:10:45 PM
Mike, 
 
Good to hear from you. I completely agree! More information is always a good thing, and I enjoyed hearing from everyone on this. 
 
As a Nintex person, I do have a question for you. Did the changes to SharePoint workflow architecture in SP2013 cause any significant changes in the way that Nintex Workflow runs or is licensed on SP2013 as opposed to 2010? For example, I understand that in 2013 all the workflow is supposed to run on its own service and not on the WFE anymore. So do you still sell it by the number of servers in the farm, and have you found that this change improved performance or scalability over the older version of SharePoint?


Rob
2/12/2014, 1:03:16 PM
Biggest downsides of Nintex: 
1. Workflow data and history are stored in multiple locations. If you run into any issues with a workflow or the databases, it's extremely difficult to manage and control the workflow data. Their best practices are extremely cumbersome and not database-friendly. 
2. Rollbacks are next to impossible. If you need to roll back a deployment, you will be SOL. You will need to copy the entire web app and Nintex DB onto another environment. 
3. Documentation seems to have been written by a crossword puzzle designer. Info and steps are broken up between God knows how many docs. Best of luck.


Luis
3/25/2014, 6:56:56 PM
I think you need to investigate SharePoint 2013 Workflow Manager and Service Bus more, as this is a highly scalable product - even more so than Nintex. Nintex runs on WFE servers, but with Workflow Manager you can have a separate farm for workflows, which makes it an amazing product. 
 
This comparison without comparing the OOTB product makes no sense to me.


Thomas Carpe
3/27/2014, 1:18:53 PM
Hi Luis, 
 
While we haven't done a great number of solutions based on SPD workflow lately, I can say that I've done a lot of these over the years. I understand there were a lot of improvements in the architecture of workflow on SP2013. However, my issues with it are more a question of what features existed in 2010 workflows that were not carried over to the new version. You should check out Ira Fuchs' presentations on SharePoint workflow; I agree with a lot of his points. Beyond that, it is just a matter of the fact that we're now several major versions into SharePoint and yet the workflow portion of the product has had major shifts with each one. From a business perspective it is a real challenge to get someone to invest in technology that will have a shelf life of just a few years. 
 
That being said, I do have some customers who are asking what they can do with OoTB workflow and if something comes along that changes my opinion I will be sure to share it on the blog.


MikeL
3/28/2014, 4:55:43 AM
Nintex is good, but only for simple workflows. Also, its forms application is terrible, from what I've heard. 
 
K2 - much better when it comes to more complex workflows, although with a high barrier to entry. Additionally, my customers say its forms application has lots of bugs and performance issues. 
 
AgilePoint is closer to BPM than simple workflow here, although Visio and their graphical designer have their limitations. 
 
You may want to check out WEBCON BPS. Forms and workflows (and business processes) in one application. Graphical designer, changes in processes can be made on the fly (no need to publish a new version), and you also get plenty of DMS capabilities along with OCR, barcodes, etc. On the downside: it doesn't support the cloud.


Alex
3/30/2014, 11:06:56 AM
Great review!!! 
Nowadays I actually need to decide between Nintex and AgilePoint for our company. 
 
I'm still a bit confused regarding the strength of Nintex compared to AgilePoint. Since AgilePoint uses an external and separate server - hence processing resources - apart from SharePoint itself (it's agnostic to SharePoint), and it costs about the same as Nintex, what's the question here? Would you say that Nintex is not up to managing complex and long processes? 
 
From the review it seems like AgilePoint is perfect if you have the time and human resources to invest in learning its foundations, and from the moment you get it, it's one league above Nintex for the same $ - so what's the dilemma? Or am I missing something here..... 
  Alex


Thomas Carpe
3/31/2014, 10:13:57 AM
First I want to thank everyone who has made this a very active thread since it was posted back in December. You guys rock. 
 
Alex, to answer your question, there are differences between the two products that might affect your choice. 
 
AgilePoint has a forms engine also, but until recently they were using InfoPath as their main forms engine for Genesis. The Nintex forms engine has been a part of the product for a long time. If you are looking for an escape from InfoPath, this might be your option although AgilePoint is catching up fast. 
 
Nintex has been ahead on Office 365 development for quite a while. We are still waiting eagerly for an offering from AgilePoint but unfortunately it is still vaporware at this point. 
 
Nintex has pretty good support but they are a huge company. If your needs are complex they will probably connect you with a random partner. AgilePoint is a smaller shop and you would get to know the development team personally. 
 
AgilePoint server is a separate install. It can be configured to run on one of the SharePoint servers if necessary. There are different options if you are running SP2010 or SP2013 because of the .NET framework 3.5/4.0 difference. IMHO, the 4.0 version is much better. The installer varies a bit among the different builds and sometimes there are issues setting up advanced features such as data export. Nintex installer is pretty tight, but the options are fairly standard - NW 2010 or NW 2013. 
 
There are designer differences. The Nintex designer is a web-based tool built into SharePoint. If you have Visio already, then using it for AgilePoint will not bother you. If you don't have it, then the extra licensing cost for it will be another obstacle. Nintex charges by the number of WFEs in your farm, and to the best of my knowledge there's no extra cost per user. 
 
Because, as you mentioned, it is more system-agnostic, AgilePoint will integrate well with more third-party systems. Both will do just about anything you want via calls to web services, if you have that option. 
 
And off the top of my head that's about all I could think of today, but I hope it will help you.


pravesh kumar sharma
4/15/2014, 2:40:37 AM
I agree with most of the points which you have mentioned in the blog. I have worked on SharePoint and the Skelta BPM tool for 8 years, from both a technical and functional point of view. I would like to add some points: 
 
1. Most of the enterprise tools have workflow capabilities (ERP, CRM, CCM, SCM, etc. - even MOSS), but those are very limited to their own suite. 
 
2. SP workflow does not have full process lifecycle capabilities (designing, deploying, monitoring, etc.). 
 
3. BPM engagements start from a business point of view, keeping many factors in mind (SOA, agility, etc.), whereas workflow initiatives most of the time come from IT and amount to customization of some small process. 
 
4. Workflow requires customization most of the time, whereas BPM suites have their own OOTB model for designing processes, reports, and UIs, plus ready-made LOB connectors. 
 
5. SP workflow is very limited to SharePoint artifacts like DLs, Lists, InfoPath, Outlook, etc., whereas BPM starts from automating processes by connecting machine to machine (SAP, Oracle, SQL, etc.) and people to people (Finance to HR to Procurement to Admin, etc.). 
 
6. BPM provides BAM components and monitoring tools for KPIs, MIS, and graphical reports on process and data, in order to optimize the process. 
 
7. Last but not least: BPM is a superset of workflow.  

Thomas Carpe
4/15/2014, 8:25:39 AM
Thank you Pravesh for your comments. I agree with basically everything you have said here. If there were a line representing a maturity model, SP workflow would be on the lower-left end of that line, and BPM would be at the upper-right. Products like Nintex sit somewhere in the middle. It is important to understand that BPM goes well beyond what SharePoint does - or tries to do - and that these different tools serve different needs.


Walter
4/17/2014, 1:44:43 PM
Excellent overview and comments. We are using SP2010 OOB and a big issue we have is that it's so hard to troubleshoot a misbehaving WF. I'd be interested in hearing from Nintex and K2 "light" users on how well or poorly those products might address this.


Ahmed Mostafa
5/14/2014, 3:30:27 AM
I like this article. I have used K2 on a number of engagements for my clients. What I really want to investigate is the ECM dimension. I believe the way SP stores fields and attachments together in SQL is a big drawback that affects overall performance. Have any of these vendors addressed this issue? How do they store their attachments and metadata?


Michael Mangan
11/10/2014, 9:24:11 AM
One of our specific requirements is to embed the workflow approval as a digital stamp on the document. Do any of the solutions support this?  
 
I really appreciate this blog. Thanks


Kamal
2/20/2015, 2:56:44 PM
Excellent article. Thanks, all. Hang around - everyone will learn from each other.


Shalin
3/6/2015, 4:53:31 AM
Flowchart-like diagrams can be drawn with many Visio alternatives as well. It's OK to use a Visio alternative if it is online, since then it's platform-independent.


Cierra Luke
4/24/2015, 2:51:32 AM
Hi Thomas, 
Thanks for this helpful post. I am a professional software solution provider at a web design and development company. We have been using Metastorm for the last 3 years, but now our CEO wants a new BPM software package to manage our workflow and processes. Can you suggest a good one? 
Thanks in advance


Thomas Carpe
4/24/2015, 10:57:26 AM
Thanks to everyone for the comments and questions. Haven't gotten the chance to come back here and reply very often, which I regret. Business has been getting quite active - perhaps the economy has finally really turned a corner? Will post some replies now to try and catch up. 
 
On the question of lite versions of Nintex and K2, I can't really make the comparison, since I have seen the Workgroup and cloud versions of Nintex but have not played around much with K2. An old friend of mine, Mark McGovern, recently went over to a new job at K2, and perhaps he can connect somebody with a person there who can speak to their product line. 
 
From what I know of the products we work with often, I would say that the workflow engine that drives the process will be the same in the lite version as in the enterprise edition. That's just good software development, because who wants to maintain two build sets, right? That being said, where I think you'll see differences is in things like the number of allowed users, what activities/actions you can perform in a workflow, etc. 
 
=== 
 
Moving on to SharePoint Foundation 2013 and the workflow product that Microsoft has produced. We've done some projects lately where this was required, because the customer did not have a budget to purchase another product. If you must go this way, I think you'll survive. But I still feel like the MS Workflow Manager is not as robust as any of the products I talked about in this blog, and probably even Bamboo and Datapolis have something to offer that it does not. 
 
The new version has the disadvantage of needing to be installed on a separate server from the SharePoint farm; otherwise, you're stuck with SP 2010 workflows. Yes, I realize maybe I am talking out of both sides of my mouth here, because I will say the same thing is a feature of products such as AgilePoint, which can be installed on their own server - however, it is possible to install AgilePoint on the SharePoint server if necessary, and they cohabit quite nicely. I've never had anyone recommend to me that this is possible with MSWM. So there's the distinction in my mind. 
 
I also know that many of the activities that were possible in SP2010 workflow are no longer available in WM/2013. I haven't looked into it lately, so maybe that's changed. However, the laundry list of takebacks was quite lengthy last I heard, and MS would have serious catching up to do to make it comparable to what it was in the previous version. Anyone who wants to know the current status should seek out Ira Fuchs, as he's the guy that I have learned so much from in the past and can tell you what Microsoft is doing now at a level of detail that goes beyond my own knowledge. 
 
If you find yourself in the unenviable position of working with SharePoint OOTB workflow, I believe you will find that it gets the job done well enough for simple processes. We use it all the time in Office 365, where it is a built-in part of the feature set and I don't have to worry over things like who installed the extra server on the farm. I like it quite a lot when I need a short workflow to produce the same kind of behavior on SharePoint Online that in past years I might have achieved with a List Item Event Receiver or Timer Job instead. But, like anything, you get what you pay for, and I think with SP workflow what you save by not buying a product is offset because it is back to IT people having to implement the workflow, and there's a lot that business users will not be able to figure out for themselves. Since all the big players now have cloud-capable versions of their workflow products, there's no reason to be stuck in this situation. 
 
Speaking of lightweight and OOTB workflow in SharePoint, one product that came up on my radar recently is KissFlow. This product uses Zapier to integrate with other cloud products and says it has a connector to Office 365. While they don't offer one yet for SharePoint, look for them to become a major force in the market if that ever changes. The current lack of a proper connection to SharePoint is a major shortcoming IMHO, and an opportunity for somebody. 
 
=== 
 
On the question of whether it is OK to develop workflow using online tools instead of Visio, I think this is just a question of personal preference. To be honest with you, I have tried many cloud platforms over the years - not just workflow - and for something really complex, I want to have the file on my desktop and not have to rely on a web browser to properly render it. I find cloud based development to be well over that magic 400ms threshold where my mind will wander and eventually I find myself checking email and standing up to get a cup of coffee. That being said, this is my criteria and for you the pros and cons may be different. Web based platforms have the advantage of being workable from anywhere regardless of what software you have. One thing I really like about AgilePoint by the way is that you can now choose either method to develop a workflow, so if you like Visio use Visio and if you like the browser use the browser. :-) 
 
=== 
 
On the most recent question about Metastorm. That's a tough one. I did a thorough analysis of Metastorm for a client a couple years ago, and I have to say that while it was a powerful product, it was definitely showing its age and there were lots of things that made it a poor fit for our project. 
 
That being said, without detailed criteria, it is impossible to say which product you should switch to if you're moving away from Metastorm. For example, you need to consider what processes you already have in that platform and whether you are using it like a BPMS, or whether the workflows you already have are actually quite simple. Obviously, if you have a lot built on the Metastorm platform, you are going to need something equally robust, which is going to take you toward the K2 and AgilePoint end of the spectrum. If your lack of satisfaction comes from feeling that you never fully utilized the product feature set, then maybe going with a simpler solution like Nintex or SP workflow would be the way to go. 
 
Hopefully this helps you realize that the problem is a really complex one and can't be easily solved with a one size fits all answer. "It depends." Classic consultant response, am I right? :-) So, go back and start asking the detailed questions and I think the answer will present itself. Best of luck!


Thomas Carpe
4/24/2015, 11:02:09 AM
I was reading my own old posts and wanted to chime in to remind everyone that you can now subscribe to MS Visio in Office 365, so that's another barrier to entry that's come down. If you need a quote for anything Office 365 related, follow the links to contact us and we'll do that for you. :-)


Brian Garnica
4/27/2015, 1:49:46 PM
It was really good to read this article!! Excellent job comparing the different products. We have been doing some research to offer the right option to our customer. 
 
Thanks.


Thomas Carpe
4/27/2015, 3:36:33 PM
Brian, thanks for the whuffie. Feedback about the blog posts is always appreciated, as they're a labor of love. Hope your customer makes a good choice; it is always a complex decision figuring out which horse to hitch your wagon to. ;-)


ali
7/8/2015, 3:16:23 AM
Just to make it clear, I don't work for any of these companies, but I have been a SharePoint developer from 2003 to 2013 so far. I have used K2 2003, Black Pearl, and now Nintex. Nintex is by far seriously bad for complex workflows, and time-consuming. K2 is far better for complex workflows that span over time. Plus, I don't care what people think of what they like; I care more about what businesses use.... In my experience in the UK, the majority of banks and consulting companies use K2. The place I work for at the moment is using Nintex, and they already regret it.


Thomas Carpe
7/8/2015, 12:08:11 PM
Ali, 
 
Good to hear from you. 
 
I am so surprised this post still attracts comments so long after I wrote it! It seems everyone has strong feelings about this stuff. 
 
I was more familiar with Black Pearl and Black Point back in the 2003-2007 days and really started drifting away from K2 in 2010. With AppIt from K2 released now, as well as AgilePoint's NX One cloud offering and new pricing models for Nintex on Office 365 too, I feel like maybe this topic is due for a revisit. There are so many more exciting possibilities to choose from this year than there've ever been before, especially if you don't have the luxury of managing your own SharePoint farm. 
 
I imagine that market share for the big workflow software products varies by country and type of company. It's my understanding that Nintex has the most market share overall when it comes to SharePoint - I wouldn't be shocked if Microsoft buys them outright - but there will be local differences in different places around the world and various sizes and types of companies. That being said, my feeling is that who's on top shouldn't be the way that companies decide what product to choose. You have to go with your experiences and those of others in the community, and I am grateful for everyone who shared here because it gives me a good sense of what people are seeing outside of my own little world. 
 
You're definitely not the first person I've heard express some negatives about a particular product. I think that each case is different, and you have to consider the pluses and minuses individually; the market and the best choice are a moving target. I still have to say that whatever the downsides may be, almost any product would be better than the OOTB experience in SharePoint. 

Geographically Dispersed SharePoint and Other Collaboration Tools

For those who may have missed the #CollabTalk TweetJam today, hosted by our good pal Mark McGovern @DocPointMark from MetaLogix and Christian Buckley @buckleyplanet, we're posting our Q&A from the event.

From the event description:

Many organizations are looking for ways to reduce costs, and improve performance associated with managing SharePoint between geographically-dispersed teams. While many organizations struggle to make their environments highly-available and performant, the breadth of SharePoint content available does not focus on SharePoint in high-performing, high-availability scenarios – and the purpose of this tweetjam is to share some of the community knowledge and expertise for these environments. We're assembling a great panel for this event including MVPs and other industry leaders.
Mark McGovern and Christian Buckley promise a blog about the event soon to recap the entire conversation. You can also see the full conversation, including responses from other participants, at http://twubs.com/CollabTalk. The tweetjam was also captured using the #CollabTalk hashtag, so you can use the Twitter platform of your choice. 

Q1. What are the top 3 issues geographically dispersed teams face when trying to collaborate?
A1 #1: Disconnected Teams: Updates in the field don’t make it back to the head office and updates at the home office don’t make it to the field.
A1 #2: Latency: Whether you have a single farm in one location, or separate farms with synchronization, somewhere there will be delays in getting data where it needs to go.
A1 #3: Networks: team members may be working in locations with limited bandwidth or inconsistent/poor connectivity.

Lots of chuckles about the poor quality of conference calls, background noise, heavy breathing, etc.

There was also a bit of back and forth about latency, and the fact that people not only do not understand the difference between bandwidth and latency, but also most folks cannot tell whether their performance issue is coming from the network or the browser.

It came in a bit late, but I really liked Michael Herman's garden hose metaphor for latency. No, I don't mean, "it's not how long your hose is; it's how you use it." In this case it's all about how long the hose is; in other words, latency is the time it takes water (information) to get down the hose (internet) after you turn it on. Bandwidth would be the size of your hose, I guess, and I need to stop making these metaphors, because I've got a dirty mind - and.. yeah. Movin' on!
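To put some rough numbers behind the hose, here's a minimal sketch in Python - the figures are invented for illustration, not measurements - showing why small, chatty requests are dominated by latency while big downloads are dominated by bandwidth:

    # Rough model: transfer time = round-trip latency + payload size / bandwidth.
    # All figures below are illustrative assumptions.
    LATENCY_S = 0.150            # 150 ms round trip, e.g. to a distant datacenter
    BANDWIDTH_BPS = 10_000_000   # a 10 Mbit/s link

    def transfer_time(payload_bytes: int) -> float:
        """Seconds to fetch one payload: one round trip plus transmission time."""
        return LATENCY_S + (payload_bytes * 8) / BANDWIDTH_BPS

    print(transfer_time(2_000))       # small AJAX call: ~0.15 s, almost all latency
    print(transfer_time(50_000_000))  # 50 MB file: ~40 s, almost all bandwidth

The takeaway: a fatter hose does nothing for the first case; only a shorter hose (lower latency) helps there, which is why page-load complaints often survive a bandwidth upgrade.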

Q2. How have social and mobile impacted your worldwide collaboration?
A2 #1: They’ve allowed us to reach out to the community for collaboration - at least as far as the dev and support side.
A2 #2: We are now able to share information with our clients as it happens, ex: tweeting key points of a SharePoint Saturday presentation in real time.
A2 #3: We see more new customers coming from outside of our core operating area, nationwide and abroad.

Q3. What technologies have you found that can improve geographically dispersed collaboration/communication?
A3 #1: Faster and more reliable cellular connections. This applies to broadband too.
A3 #2: Cloud services have come a long way in making this more manageable, ex: SharePoint and Lync Online.
A3 #3: There are also products/appliances and applications that do well to synchronize SharePoint content between farms.

We talked a bit here about the F-5 BigIP. Some folks on the TweetJam have had good success using this appliance. Liquid Mercury Solutions is an F-5 partner. We can sell, service, and implement F-5 based solutions using the BigIP.

We had a great discussion about security, and maybe we can convince HelloItsLiam to participate in a panel specifically about SharePoint security at some point soon. It suffices to say this is a topic that needs some more attention.

Q4. If your team is geographically dispersed, is your best option to move your data to the cloud? Why/why not?
A4: Sometimes: it depends on the type of data and the location of team members.
A4 Why #1: Having data in the cloud improves access, assuming the cloud provider has distributed their datacenters across a large geographical area.
A4 Why #2: Greater availability to resources - gets around firewalls, corporate networks / VPN, and political boundaries.
A4 Why Not: Some data is too sensitive to store in the cloud without a plan to protect it. Ex: defense, proprietary secrets, healthcare PII.

The cloud is great for some circumstances and not so much for others. If you are on the borderline between these two scenarios and would like to talk to us about securing your sensitive data in the cloud, we are a CipherPoint partner and can develop a solution for you using their on-premises or in-the-cloud product offerings. By the way, F-5 BigIP is another solution that can enhance cloud security for SharePoint content.

Q5. How do global collaboration teams deal with poor quality bandwidth/connections?
A5 #1: Use asynchronous communication channels like e-mail, Yammer, and Lync instead of Skype, etc.
A5 #2: SharePoint 2013 can reduce the amount of transfer using the MDS feature (Minimal Download Strategy).
A5 #3: Develop low-bandwidth tolerant branding (ex: Metro UI) and apps (server side vs. AJAX). There are optimizers for JavaScript and CSS, as well as ways to do this at the firewall/proxy/load-balancer.

Collaboration tools are an indispensable part of today's business world. How many of us could survive without GoToMeeting (or something like it), for example? What starts as a competitive advantage will eventually become the standard by which all businesses are judged, and tools like SharePoint are no exception to this. Someday, all restaurants will be Taco Bell. :-)

Yammer gets a lot of attention, but people are unhappy with the limited (read: "weak") integration between Yammer and SharePoint. In 2012, when Microsoft purchased Yammer, I shelved plans for a Yammer+SP product release in order to see what they would do. Seeing that they have not decided to eat our lunch, Liquid Mercury Solutions plans to release a stronger Yammer + SharePoint integration solution in the very near future. The best way to find out about this tool is to subscribe to our blog or newsletter (it's over there in the upper-right part of the page).

I made an additional comment here about using all the tools in the drawer - there's no silver bullet.

Q6. What are the best ways to maintain multiple systems/versions of your collaboration platform, such as SharePoint?
A6 #1: A communication plan with stakeholders + predictable schedule for updates / merges is essential to make sure everyone knows what they're seeing and how out of date it may be.
A6 #2: Assigning a "system of record" is extremely important to maintain one version of the truth.
A6 #3: There are tools in SharePoint like content syndication, cross-farm publishing, etc. - as well as a variety of third-party tools that fill this need nicely. Syncing at the SQL level is also an option, but less favored than it used to be.

Somebody said that they had 5 versions of a document to maintain. If this sounds like something you have to deal with on a regular basis, talk to us, because we may have a solution that will work for you.

Q7. What are the leading factors that restrict organizations from maintaining high-availability systems?
A7: Factors that limit orgs' use of high availability and DR include cost, bandwidth, product limits, undefined SLAs, lack of institutional support, and insufficient technical knowledge and/or best practices.

Money was the big winner on this question. There is always, always, always going to be a relationship between your budget and the capabilities you can obtain. My advice is to be up front with your IT professional about your budget, work with them to understand how to get what you need within your means, and don't set your expectations unnecessarily high.

Q8. How does a geographically dispersed infrastructure impact disaster recovery planning?
A8 #1: If the primary datacenter is impacted by a disaster, then the outlying datacenter will experience higher loads and in some cases becomes the system of record.
A8 #2: If it was not previously planned and drilled, then during major disasters (natural or civil), communication to outlying centers regarding tactics - or even the fact that there's a problem - can be confused or conflicted.
A8 #3: Sometimes switching back to normal after DR can be just as difficult.

Can't say it enough, when it comes to disaster preparedness "Drill, baby, drill!" ;-)

I hope you enjoyed our recap of today's #CollabTalk tweet jam. If you feel like I've left something out, or if you just want to throw your 2 cents in, leave us something in the comments. If you found this information helpful, please give us a 5 Star Rating on PinPoint, so we can reach more customers.

Know Your Meme: SharePoint Staffing Anti-patterns

Over the years, the Microsoft SharePoint ecosystem has given rise to new and wondrous beings. These mythical memes deserve some light shed upon them, lest one find oneself trapped in their terrible clutches or wandering the Development Wastelands, like Sir Percival, in search of an impossible quest.

Join me now as we explore the first of what may become many entries in the compendium of hiring anti-patterns for SharePoint.

The SharePoint Unicorn


The SharePoint Unicorn is someone who can do it all - and perfectly too! She or he is a project manager, consultant, architect, administrator, and developer all rolled into one; a one person wrecking crew; cheap-by-volume, quick, often very hard-working, and has near-perfect delivery; and a multitasking master who is so damned good that instead of costing money, they can recover the cost of the project before it's even finished. In other words, SharePoint Unicorns have so much ROI that their net billable rate is actually negative - no matter what they charge. ;-)

I won't deny they exist. I'm pretty sure my friend, fellow Baltimore SharePoint Users Group committee member, and founder of SharePoint-Careers.com, Shadeed Eleazer, is one such being; he actually coined the term "SharePoint Unicorn" back in 2011, along with Marie-Michelle Strah and Mack Sigman. People have sometimes accused me of being one, so much so that it's become sort of a running office joke. There's even a @Shareicorn on Twitter, although I am a bit peeved they haven't posted anything ever. You can go to any SharePoint Saturday, and if you swing a broom around too vigorously you'll probably knock 4 or 5 SP Unicorns over. Seriously, don't do that; we don't like getting hit in the head with a stick, no matter what your hiring manager says!

But, people like this are rare - and almost impossible to pin down. Much akin to the equine Unicorn of legend, which can only be approached by virginal maidens, one must be damned sexy to capture the SharePoint Unicorn. Don’t all you LMS clients feel super-special now?

These masters of their craft have worked hard to become so and know their own value. They want to work only on well managed, challenging, and interesting projects. They're typically well into their careers, and many would rather be consulting or going on speaking tours than developing in a cubicle for months on end. Many of them get scooped up by Microsoft or other best of breed SharePoint consulting firms. They're in very high demand, and thus probably not available for your random meetings. They will not be cheap.

A SharePoint Unicorn is not one of your cavalry horses. They will not be your full time employee - unless you're prepared to offer something truly unique. They're best leveraged as a precision instrument to ensure designs are sound, lead technical discussions, overcome specific obstacles, and find creative solutions to truly tough problems.

One last word of warning. If you think you are about to capture such an amazing creature, Ware the SharePoint Rockstar, which may be lurking just beneath a thin façade.

The SharePoint Rockstar


This clever beast is a brilliant mimic of the SharePoint Unicorn - a wolf in sheep's clothing, if you will. The SharePoint Rockstar is a young, intelligent individual who has the ability to do every aspect of a SharePoint development project (or at least they think they can, and tell you so). But once they are hired to consult, design, or architect, they begin to hoard technical responsibility and insist that you use only them for the entire project.

Like the movie Aliens, there may be a voracious Ego that dwells within the architect or developer, incubating until it erupts to insinuate itself into every aspect of development, eventually becoming too expensive to remove. It then transforms into a prima donna and/or drama queen, sowing dissent and causing friction within the whole project team. The Rockstar has a lot of talent and appeal, but ultimately they become more trouble than they're worth.

Possible indications that you have encountered such a beast: 1) you think you hired a SharePoint Unicorn; 2) their Ego arrives at the meeting 15 minutes prior to their person; 3) they're not helpful to or well-liked by the rest of your team; or 4) they tell you not to hire any other developers - they can do it all themselves. A little Ego can be a good thing. The problem is that the Rockstar does not play well with others.

Protect yourself by considering personality as well as a candidate's technical skills. Let your technical team interview as a group - and collect candid feedback from them. If you are interviewing the first person on your team, consider a relaxed social setting to see how they behave when their guard is down, and craft interview questions that make them reveal their personality, how they work with others, and how they share responsibility.

Given enough time, a SharePoint Rockstar may mature into a SharePoint Unicorn. Indeed, they can become an invaluable and irreplaceable asset - but at a terrible cost to you. Caveat Emptor.

The SharePoser, aka SharePoint Shyster, or SharePoint Fraud


These vile parasites may be masters of illusion - but they are very real! They are senior-seeming people who don’t know anything about SharePoint at all. They exist only to suck the lifeblood out of a company. As comedy, Jen Barber from the “IT Crowd” is a good example. But seriously, I once met a SharePoint Architect who was actually just a regular building architect and somehow got hired to lead the whole SharePoint development team.

Frighteningly enough, I’ve seen this kind of thing happen at more than one organization. It sounds like something that should be rare, but it is not. Lots of reasons for this, I suppose. Firstly, there's a legitimate shortage of qualified SharePoint talent – though that problem solves itself slowly over time. Secondly, lots of people are attracted to SharePoint because it is such a marketable skill. Many are ethical, hardworking folks - a few are not.

If you’re lucky, you only hire someone at a low level who sneaks through the interview by saying all the right things. It would be bad to hire a Windows Admin who doesn’t know how to add users to Active Directory, a SharePoint Developer who doesn’t know the difference between a Site Column and a List Column, or a SharePoint Architect who has to ask the developers if a Web Part Zone can hold more than one Web Part – but it’s not the end of the universe. Just fire the bum and move on! (By the way, these are all real life examples.)

The damage is really greatest if the SharePoser actually establishes their power base. One needs to be especially careful when qualifying those in leadership positions like project manager, architect, or consultant. The well-spoken Fraud flourishes when there’s no one to vet their technical skills and disqualify them during the hiring process. IMHO these creatures are the gateway to hell; in order to perpetuate the lie and insulate themselves from discovery, they often hire a whole department of equally inexperienced or incompetent people who can do the work - slowly - but are too beholden to them for the job to rat them out. Thus begins the downward spiral to the underworld of underperformance and waste.

Possible indicators of these individuals: 1) a well-padded resume; 2) unreachable contacts or not-so-professional references; 3) lots of buzzwords during the interview; 4) claiming to have "all the SharePoint certifications"; and/or 5) a secret copy of SharePoint for Dummies in their briefcase. Remember that insecurity is the hallmark (and weakness) of the Shyster, because they live in the constant shadow of being exposed.

Protect your company by vetting hires using real-world work simulations (instead of just interview questions), having a proven expert screen and qualify your team, verifying any Microsoft certifications, and hiring a variety of differently-skilled individuals who complement each other (and can keep each other honest).

Tune in next time for The Do’s and Don’ts of hiring a SharePoint Developer.

A SharePoint Success Story

Before I got to Liquid Mercury, and indeed before I even got into SharePoint at all, I was a contractor working for HHS's Health Resources and Services Administration (HRSA). When I started, we worked as the "Application Processing Center" for the NHSC Loan Repayment Program. It was a program for doctors who were willing to work in high-need areas, like free clinics in urban or rural settings, to get some help in paying back their student loans. The branch had about 40 people in total working on processing, analyzing, and approving about 11,000 applications per fiscal year. Then we made the transfer to SharePoint. Once everything got up and running, 50 of us were not only processing, tracking, and generating business intelligence on 11-15k applications, but we also had so much excess capacity that we absorbed the workload for 9 other programs as well! I cannot even begin to imagine how much money that saved the HRSA.

Allow me to start from the beginning and paint you a picture. Originally, for just one program (the loan repayment program), candidate applications were all hard copy, and tracking them involved Excel sheets - hundreds of them. We tried to keep everything on the computer, but most of the people in the chain would make notes on the hard copies, and these would never make it into the Excel files' notes - or worse, they would get lost entirely. In addition to the actual application process, we also had a team of contractors who assembled and collated the data from the applications for analysis and business intel. I would guess that the most ideal of candidates could be processed in about 2 weeks, but that only happened once; generally it took about 6 weeks to process an application. Needless to say, it was a fairly inefficient and labor-intensive process.

In 2010, we moved over to the SharePoint environment and moved the entire application and tracking process online. We eventually used SharePoint to manage our application intake through the website portal; an intranet to handle document storage and tracking; and business intelligence from automated report generation, analysis, and easy-to-digest metrics through management dashboards. Bottom line: we went from a clerical office to a technology office by getting rid of the receiving, assembly, and data entry teams, then tripling the size of our IT staff. Once the process was automated, application loss went to zero, redundant processes were heavily reduced, metrics and analysis were one click away, and thus utilization dropped, freeing up capacity. This meant that we started doing the same work for similar departments, like nursing and medical school scholarships. In the end we were doing 9 times the work for less than double the cost, and that is what made a SharePoint convert out of me.  

And yes, you should probably study IT in school.

SharePoint Quirks: Getting Past Content Type Errors

SharePoint is awesome. It really is. But it sure has its quirks. Case in point – you want to do something simple like deleting a content type. Seems like it should be pretty simple, but you get an error like "Content Types Still In Use" or "Can't Delete This Column While It's Part of a Content Type."

So you look at the documentation, but it's no help at all. It suggests you check SharePoint Manager to see where the content types and columns are being used. Good luck -- that's not going to help.

Frustrating!

But like so many things with SharePoint, the solution is actually painfully simple.

Why is SharePoint Such a Pain?

Before we get to the solution, first a little background. There's actually a pretty good reason why SharePoint is the way it is. It's to protect you from yourself.

Why do you need protection? Because it's awfully easy to delete content by mistake. Microsoft tries to prevent this by offering trash bins. LOTS of trash bins.

You can find your trash bin either in the left navigation or in the site collection settings if you're in the root of the site collection.

When you delete content, it goes to the appropriate trash bin for your user level. But here's the thing: only the site collection administrators can empty the end-user trash bins. If any deleted lists or columns are still in the trash bins, then any content types or columns that those deleted lists or columns point to cannot be deleted.

So what can you do about it?

A Simple Solution 

So, the easy, but hard-to-find solution? Have the site collection administrator delete all "End user recycle bin items" and then delete all items from the "Deleted from end user Recycle Bin."

 

It’s really that simple. You should now be able to delete your orphaned columns and content types.
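If you have a lot of site collections to clean out, this can also be scripted. Below is a minimal sketch in Python against the SharePoint REST API; the site URL and credentials are placeholders, the NTLM auth suits an on-premises farm (Office 365 needs a different auth flow), and the per-item DeleteObject() call mirrors the CSOM method of the same name - verify the endpoint shape against your version's REST documentation before relying on it.

    # Sketch: purge both stages of a site collection's recycle bin via REST.
    import requests
    from requests_ntlm import HttpNtlmAuth  # on-prem NTLM; O365 auth differs

    SITE = "https://sharepoint.example.com/sites/intranet"  # placeholder URL
    auth = HttpNtlmAuth("DOMAIN\\spadmin", "********")      # site collection admin
    headers = {"Accept": "application/json;odata=verbose"}

    # POST requests need a form digest, which /_api/contextinfo hands out.
    ctx = requests.post(f"{SITE}/_api/contextinfo", auth=auth, headers=headers)
    digest = ctx.json()["d"]["GetContextWebInformation"]["FormDigestValue"]

    # /_api/site/RecycleBin enumerates items across both recycle bin stages.
    resp = requests.get(f"{SITE}/_api/site/RecycleBin", auth=auth, headers=headers)
    resp.raise_for_status()

    for item in resp.json()["d"]["results"]:
        print("purging:", item["Title"])
        # Assumed endpoint shape; confirm before using in production.
        requests.post(
            f"{SITE}/_api/site/RecycleBin('{item['Id']}')/DeleteObject()",
            auth=auth,
            headers={**headers, "X-RequestDigest": digest},
        )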

********************************** 

Still having trouble? Liquid Mercury Solutions helps all kinds of businesses deal with the quirks of their SharePoint systems. Let us know how we can help.

Office 365 User Permissions Gotcha!

Congratulations! You convinced your client to sign up for Office 365. They subscribed to licenses for various apps like SharePoint Online and SQL Azure. The wireframes look good. You've created the SharePoint site structure. Wonderful! You're off to a great start. But… get ready for a big "gotcha!"

You decide to make use of the Content Type Syndication Hub. Wise move. You go into the SharePoint online site, activate the Hub and click on the link to the Hub… bzzzzzzzzzzzz! Access Denied.

Uh oh.

You've run into a little known gotcha in Microsoft Office 365.

According to the small-print regarding Permissions in Office 365, "The person who signs up for Office 365 for his or her organization automatically becomes the 'top-level administrator.'" Notice, though, this says nothing about the fact that it's also the only person/account who can access certain features of the Office 365 products.

At the time of this writing, the Content Type Hub and the Search Center are the only two features known to be provisioned with a default administrator (I've only encountered this in SharePoint Online so far). The Search Center is provisioned automatically when the root site collection is set up (the first time Office 365 is logged into by the person who created the account). The Content Type Hub is provisioned when the feature is activated in the Site Collection Features.

Microsoft's logic is that, like an on-premises server hosted at your facility, roles need to be delegated such that no single person (except the domain admin) has access to every role and resource. This is essential for many reasons, security being the biggest.

But unfortunately, this becomes illogical when you move to the cloud. Now, instead of increasing security, you're actually committing an IT sin by creating a single point of failure. When you're hosting your own server, you probably have access to the domain admin account (or at least to someone who does). But if you're in the cloud, you probably don't. And even if you do, you don't have access to the administrative console on Office 365.

Needless to say, as a SharePoint consultant, having to ask your client for permission to do things on the "very-first-ever" account is less than ideal. What if you lose access to that person at a critical juncture? (In my case, they went to the Caribbean for two weeks, yay!)

When we discovered this problem, the client was small and the need for the Content Type Hub was not particularly urgent. No harm, no foul. If your client's business depends on a web site that hosts thousands of transactions per day, you can see where that could make for some trouble all the way around.

Microsoft, unfortunately, says they will not reassign this account under any circumstances without explicit orders from the portal creator. But what if the Portal account is deleted by mistake? What if the account creator quits the business, leaves the country, or gets hit by a bus?

Just ask this guy what happens. Not pretty.

According to Microsoft, best practice is to delegate this control to other admins. Agreed!

But, your client may not be that savvy - or that motivated. Or your client may be a control freak. Maybe, like many of our clients, they only became your client *after* they created their O365 free trial account and got in over their head. I can list a million reasons this creates risk, but mostly, it's the reasons I can't think of that usually kill me.

So how to handle this issue? The best way would be to change this policy system-wide.

It's Microsoft, so don't hold your breath.

Another way is to ask the client to share their credentials with you after they've created the portal. This is also risky, and, like I said, some clients are reluctant to give up so much control.

A third way is to ask your client to let you create the portal for them. This is fairly low-risk and should work in many cases, but there are always those situations where you got involved too late - or where control issues rear their head, yet again.

The last reasonable solution is to ask the client to create a dummy Windows Live account (such as 365Admin), which they can use to create the portal and which they'll be comfortable sharing with you or any other vendor. We recommend doing this from the beginning; it is slightly more painful, but possible, to rename the "primary" account after the fact (and create a new account for the CEO).

Got any more ideas to improve these best practices or know of any other features in SharePoint Online or other Office 365 components that would have this issue? Post them in the comments and I'll stick them into a follow-up post. Hope this helps some of you avoid this weird gotcha. Getcha next time!

8 Must-Ask Questions When Implementing SharePoint Drop-off Libraries

A great new feature in SharePoint 2010 is the drop-off library, which provides a central place to upload documents that are programmatically redirected to a different location based on rules you can define.

For example, if you’ll be holding different types of documents that are all associated with a contract, you can simply add them to the drop-off library, which will automatically place the documents in the library of the correct contract based on the metadata.

In order to implement this feature correctly, there are eight key questions we ask as part of our process. It's all too easy to forget to answer them up front, resulting in a poor execution.

 

1. What content will be routed?

This is an obvious question we need to get out of the way. The questions that follow are not quite as obvious, but they're just as necessary to successfully create the drop-off library.

Sample answer: Contract related documents

 

2. Why is it being routed?

This question can help with defining the content type associated with the documents. A good descriptive name will help when setting up the rules that will trigger the library’s action, as well as later on as the site is maintained.

Sample answer: To keep documents associated with a contract in a common location.

 

3. What metadata fields are needed to differentiate the content?

A good, descriptive name for a metadata field should be something simple like "Contract_Number." But do not lose sight of SharePoint's other powerful features like filtering and searching. Ensuring that document type, salesperson, and date are associated with the document can help index the content so you can filter for the data you need after the document has been routed.

Sample answer:  Title, Contract Number, Type of document

 

4. How will the end-user provide this data?

How the data will be gathered from the end-user is something our clients must consider. Will the list of possible contract numbers grow over time? Or is this a set list that the end-user can choose from a drop-down? What rallies end-users behind the site is a balance between making it easy to associate documents with metadata and keeping the flexibility to adapt to unique scenarios. A drop-down list can be an easy way to define what type of document the end-user is uploading, but a text box might be required for a more unique answer like the title of the document.

Sample answer:  Type of documents listed in a drop down list. Contract numbers listed in a drop down list. Text field for title.

 

5. Is there any other metadata needed to associate with the content?

Again, do not lose sight of SharePoint's other powerful features that are facilitated with metadata. Searching and filtering views in the target location for the documents is something to keep in mind during the planning phase of creating the drop-off library.

Sample answer: Any additional metadata.

 

6. Which metadata fields are required? Optional?

It’s tempting to demand that the end-user provide every piece of metadata to ensure documents are easily searchable, but in practice this can be difficult or impossible. The end-user might simply enter any answer just to get through the upload form. For this reason it is important that you clearly understand why a field is required, and only require metadata that truly is needed.

Sample answer:  Contract_Number (required)

 

7. What is the target location for the documents?

If the library to which the content will be routed is outside of the current sub-site or even site collection, SharePoint makes it possible to route the documents to those destinations through Central Administration.

Sample answer:  Contract libraries

 

8. Should the routed content be moved, copied, or moved with a link placed in the drop-off library?

While setting up the connection we can define what specific send-to action the rule should take. The options are “move”, “copy”, and “move and leave a link.”

Sample answer:  Move

 

Conclusion

With these critical eight questions answered, you have all of the key information you need to create rules that will trigger the drop-off list to route content to the library based on metadata gathered from the end-user.
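For those who'd rather script the rule than click through the UI, here's a minimal sketch using the SharePoint 2010 server object model's content organizer API (Microsoft.Office.Policy.dll). The site URL, content type, library, and column names are hypothetical stand-ins for your own answers to the eight questions; the Column token in ConditionsString may also need the field's GUID and display name, depending on your build.

    using System;
    using Microsoft.SharePoint;
    using Microsoft.Office.RecordsManagement.RecordsRepository; // Microsoft.Office.Policy.dll

    class CreateRoutingRule
    {
        static void Main()
        {
            // Hypothetical site; assumes the Content Organizer feature is already active.
            using (SPSite site = new SPSite("http://sharepoint/sites/contracts"))
            using (SPWeb web = site.OpenWeb())
            {
                EcmDocumentRouterRule rule = new EcmDocumentRouterRule(web);
                rule.Name = "Route contract documents";                  // question 2
                rule.ContentTypeString = "Contract Document";            // question 1
                // Condition built from the metadata in question 3.
                rule.ConditionsString =
                    "<Conditions><Condition Column=\"Contract_Number\" " +
                    "Operator=\"IsNotEmpty\" Value=\"\" /></Conditions>";
                rule.TargetPath =
                    web.Lists["Contracts"].RootFolder.ServerRelativeUrl; // question 7
                rule.RouteToExternalLocation = false;                    // question 8: move
                rule.Priority = "5";
                rule.Enabled = true;
                rule.Update();
            }
        }
    }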

Real World Claims 2: Provision ADFS and SharePoint Demo Environment

This post is part 2 of a series, and is a companion to my presentation titled Real World Claims in ADFS and SharePoint 2010. This presentation will be given Saturday, August 13th, 12 noon at SharePoint Saturday, The Conference, Washington, DC, August 11-13, 2011. Table of contents, additional links, slides, video recording, and other funny liner notes will be posted in Part 1 as they become available.

Planning an Enterprise Class Demo Environment for SharePoint and ADFS
First, I need a demo environment. I know I'm not gonna have much fun trying to show off our cool stuff unless I can squeeze SharePoint, Active Directory, and ADFS onto my laptop somehow. Quite possibly I won't even have network access while I'm giving this demo – a big problem for a solution based heavily on Public Key Infrastructure. Plus, a lot has changed in the time since SharePoint 2010 first went RTM back in May of last year. We've had a service pack for Windows 2008 R2 and one for SharePoint too, as well as a major release of ADFS. While we witnessed many bugs and issues over the past year, I want to be sure I'm presenting on the current state of the art. So, I begin by downloading new ISOs for everything. (There doesn't appear to be an image for SharePoint with SP1 slipstreamed, but no biggie.)

What's the environment going to look like?

Server 1: DEMO\MasterControl
  • Active Directory, ADFS, SQL Server 2008 R2, [optional] Certificate Authority
  • normally you would put these services on multiple machines
  • typically installed behind the company firewall

Server 2: DEMO\SmartyPants
  • IIS, SharePoint 2010, ADFS Proxy
  • would normally be a WFE with an SP application server on another box
  • a single server will suffice for this demo
  • if web facing, typically installed in the DMZ

I like this configuration, because it should be easy enough to do with only two VMs, but it will effectively mimic some of the inter-machine network traffic you would normally expect to see in the real world. The presentation is called "Real World Claims" after all. I could potentially re-use this demo environment for a multitude of purposes, including BI solutions or other stuff too.

One thing I want to talk about in a bit more detail here. At the time of this writing, I do not know 100% what the effect of taking these machines off the network will be. There are two things about my plan that concern me somewhat: 1) I am configuring this environment as a sub-domain of my forest, instead of an independent forest, and 2) I am using the certificate authority on the root domain of my forest. I've done some basic research into this, and it is my belief that certificates and CRLs (certificate revocation lists) are cached on the servers for a sufficient period of time such that – if I leave this system up and running at my hotel – I can disconnect it from the internet long enough to deliver my demo without any adverse impact. Time will tell, but later this week I plan to test this before it would be too late to formulate a backup plan. (Actually, my backup plan would be to install a standalone CA on my demo box, but even under those circumstances I am not certain how Kerberos will react.)

Now I need to install all these things. Having done this many, many times I can tell you it isn't all that exciting. Here's the basic rundown.

Step 1: Core Install and Configuration on MasterControl
Not much to say here. Install Windows Server 2008 R2 Standard Edition <yawn />     

 

Can somebody tell me, why the heck is there a close/cancel button on this window? Now we configure the network, install all updates, and rename the machine. Keep in mind that these need to work when disconnected from the Internet. For now, I am using local addresses 10.11.12.80 and 10.11.12.81, but eventually I may switch to 192.168.10.1 and 192.168.10.2.

Next, run dcpromo to promote the server to an Active Directory domain controller.

 


In retrospect, I am having second thoughts about whether it was wise to join this demo environment to my existing forest. I guess we'll find out!

 

 

 

Oh, hey, I should probably make a second physical site for these servers.

 


Notice that I made this DC a global catalog server, because I'll be running it some physical distance from the forest.

 

 

 

One other thing I did is that I configured a new physical site on the domain, because I'm going to run these demo machines on my laptop in a [potentially] disconnected scenario.

Okay, inside joke: the site location named "I'm on a Boat" above is actually a song by The Lonely Island featuring T-Pain, sorta similar to "Like a Boss". At Liquid Mercury, we've been talking for over a year about doing a code camp on a Caribbean cruise! And now, you know the reference #nullreferenceexception.

Step 2: Install ADFS 2.0 on MasterControl
After downloading the ADFS 2.0 RTW, the installer will ask you whether you want to install an ADFS Server or an ADFS Proxy. For this machine we're going to install the server. In this case, we do the GUI install and we know that ADFS will install to the Windows Internal Database. Later, I'll probably want to upgrade this server to use SQL Server instead (more info: here). Other than those two choices, it's a pretty brain-dead installer. Then, we open the admin tool and do the real work.


After the installer completes, you can find the ADFS 2.0 Management console in the Start Menu under Administrative Tools. To start the Configuration Wizard, click the link shown above next to the [extremely obvious] red arrow.

 

 


Now, personally, I'm a glutton for punishment, so I'm going to install this machine as the first member of a farm, even though I know that for the purposes of my demo it'll be the only machine to host ADFS - and I've effectively limited my load balancing options by using the WID instead of SQL.

 


Here, I'll use the default certificate for the Default Web Site in IIS, which happens to be the domain controller server certificate. Later I can generate a new certificate especially for SSL. The name of the ADFS web site is not glamorous. If this were a production environment, I'd possibly choose a cleaner name like "login.somecompany.com", but this will suffice for now, and I can always change it later.

Next, I create an OU in AD called Services, which is where I'll keep all the service accounts I need to create. Then I create an account in that OU. I call my account "ADFS Service" with logon "DEMO\adfs-service". Keeping this account in its own OU with other service accounts helps me find them easily and manage Group Policy Objects more easily if I need them later on. (Hint: I will.)
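By the way, if you'd rather script the OU and account creation than click through AD Users and Computers, a rough sketch with System.DirectoryServices looks something like this. The LDAP path matches this demo; the password is obviously a placeholder.

    using System;
    using System.DirectoryServices; // reference System.DirectoryServices.dll

    class CreateServiceAccount
    {
        static void Main()
        {
            // Bind to the demo domain root.
            using (DirectoryEntry domain =
                new DirectoryEntry("LDAP://DC=demo,DC=colossusconsulting,DC=com"))
            {
                // Create the Services OU that will hold all the service accounts.
                DirectoryEntry servicesOu =
                    domain.Children.Add("OU=Services", "organizationalUnit");
                servicesOu.CommitChanges();

                // Create the ADFS service account inside it.
                DirectoryEntry user = servicesOu.Children.Add("CN=ADFS Service", "user");
                user.Properties["sAMAccountName"].Value = "adfs-service";
                user.CommitChanges();

                // Set a password and enable the account (0x200 = NORMAL_ACCOUNT).
                user.Invoke("SetPassword", new object[] { "placeholder-password-here" });
                user.Properties["userAccountControl"].Value = 0x200;
                user.CommitChanges();
            }
        }
    }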

 

 


Finally, we confirm settings and click Finish to run the wizard, and ... <drumroll />

 


Oh crap. Better fix that. :-(

 


Now that the installer is finished, we need to address that error we got. Run these commands to give SPNs to the NetBIOS and DNS names for the ADFS server:

SETSPN -A HTTPS/mastercontrol.demo.colossusconsulting.com DEMO\adfs-service
SETSPN -A HTTPS/mastercontrol DEMO\adfs-service

You also want to trust the ADFS service account for delegation. This is something I often forget to do. To accomplish this, jump into Start > Administrative Tools > AD Users and Computers, find the service account, and view its properties.

 


Note that the Delegation tab only appears as an option after an account's SPN is defined using the commands shown earlier.

 


After you complete the wizard, you'll see that IIS now has the virtual directories for /adfs and /adfs/ls under Default Web Site.

Something to keep in mind here, even though it doesn't apply on a new server: if you were doing anything special with Default Web Site that caused you to reconfigure it, the ADFS wizard is just going to drop its stuff right on top of your other changes. Be watchful for this, since they might have conflicting settings.

I'll kick the can down the road a bit farther on the rest of the ADFS configuration we'll need to do.

Step 3: Configure a Certificate Authority
I had considered using a standalone CA on my demo domain controller, but eventually I decided it wasn't needed and uninstalled it.

In my case, I'm going to use the Certificate Authority on the root of my forest. This works for me because I already have one set up, and I've tested it a few times with other development environments. I can be reasonably sure it's going to work when it's time to run my demo.

Creating a Certificate Template for ADFS / STS
For the most part, we're able to get by either with existing certificates that were created by AD when our servers were provisioned, or by creating certificates through the interface provided by IIS. But Security Token Services (and in particular ADFS) require a few special tweaks that deviate slightly from the Web Server template provided out-of-the-box in AD Certificate Services.

1. Specifically, ADFS wants you to use a certificate with a 2048 bit or higher key strength. Depending on your version of Windows and how long the CA has been in service, you might still be generating SSL certificates with 1024 bit keys.
2. As you'll also see, we'll need to be able to export the private keys for this certificate. This is something that is not enabled on the Web Server template.
3. Lastly, the version of your certificate template can make a big difference. 2008 compatible templates allow you to set up signature hash algorithms with SHA-512 or MD5, neither of which is supported by ADFS 2.0. In fact, ADFS is extremely fickle about what certificates it is willing to use – as we'll see. So, you need to be careful in the type of template you create, or risk losing a night of sleep to troubleshooting - like I did.


To request and issue such a certificate, we'll need to create a certificate template. Do this by opening Certificate Authority, then right-click Certificate Templates, and choose Manage.

 

 


Find the Web Server template, right-click, and then click Duplicate Template.

You will be asked to choose between Windows Server 2003 Enterprise and Windows Server 2008 Enterprise. This seemingly innocuous dialog box is actually quite insidious.

IMPORTANT: I have no Windows 2003 servers left in my forest, so I normally would have no issue with picking the second option here. However, according to comments left here:

  "If you are generating certificates from AD-CS, make sure to request  the certificates using a template that supports a Windows Windows 2003 Enterprise CA. If you use a Windows Server 2008 CA template, the Federation server will fail to start and report a generic private key error message in the logs (Event ID 133)"

 

"ADFS2 is written in managed c#, using X509Certificates2 - X509Certificates2 don't support CNG (CAPI Next Generation) - this means (by extension) that you can't use certificates which rely on CNG - using 2003 templates as Ross suggests above will ensure you get CAPI keys, not CNG keys."

Actually, what I did here was to create one of each.

I never could get the 2008 version to work, so for the sake of this walkthrough, assume we are using the Legacy STS template, regardless of what you see in the screen capture. ;-)

Creating the 2008 Template: Secure Token Service

 


Pick a good name for the template, "Secure Token Server". It's definitely okay to publish this in AD.

 


Indicate that keys should be exportable, because we'll need this later.

 


Set a 2048-bit key to meet the ADFS minimum requirement. I have complained before that SHA-1 is a weak hash that should be destroyed. SHA-256 is expensive, but for my demo I don't really care.

Warning: DO NOT USE SHA-512!

From Microsoft: "AD FS 2.0 does not support the use of certificates with other hash methods, such as MD5 (the default hash algorithm that is used with the Makecert.exe command-line tool). As a security best practice, we recommend that you use SHA-256 (which is set by default) for all signatures."

 

Creating the 2003 Template: Legacy STS

 

 


Set a 2048-bit key to meet the ADFS minimum requirement. Notice there is no Cryptography tab and that you have no opportunity to select the hash algorithm.

 

The other tabs can be left as their default options.

Making the Template Available for Requests
Okay, now back in Certification Authority MMC:

 


Right-click Certificate Templates, once again. This time, click New > Certificate Template to Issue.

 


Choose the template(s) we just created.

If you have multiple CAs in your domain, you should do this for each one that you want to be able to issue this type of certificate.

And that's it. Now, you can request a certificate from your server that will meet the fickle needs of ADFS.
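Before handing the new certificate to ADFS, it's worth a quick sanity check against requirements 1 and 3 above (exportability you'll discover soon enough when you try to export the key). A small sketch; the subject name is this demo's:

    using System;
    using System.Security.Cryptography.X509Certificates;

    class CheckAdfsCert
    {
        static void Main()
        {
            X509Store store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
            store.Open(OpenFlags.ReadOnly);
            foreach (X509Certificate2 cert in store.Certificates.Find(
                X509FindType.FindBySubjectName,
                "mastercontrol.demo.colossusconsulting.com", false))
            {
                Console.WriteLine("Key size: {0}", cert.PublicKey.Key.KeySize);            // want 2048 or more
                Console.WriteLine("Hash alg: {0}", cert.SignatureAlgorithm.FriendlyName);  // not SHA-512 or MD5
                Console.WriteLine("Private key present: {0}", cert.HasPrivateKey);
            }
            store.Close();
        }
    }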

Step 4: Install SQL Server on MasterControl
The next step is to install SQL Server 2008 R2 with Reporting and Analysis (no PowerPivot... yet... maybe someday).

 


I wasn't sure if ADFS had installed a version of SQL Express that would mess with the SQL install, so I did the needful. The SKUUPGRADE=1 command line switch ensures the installer will not fail if it finds a conflict between existing and installing SQL versions. (Turns out it was totally unnecessary. Oh well.)

I got the usual expected warnings:

  • "Boo hoo! Installing SQL on a domain controller isn't recommended.", and
  • "Hey dipstick, don't forget to configure Windows Firewall so people can connect to your databases!"


Next, I picked the components that I thought we might need.

I created 3 new accounts in the OU I created earlier for service accounts:

  • DEMO\sql-service
  • DEMO\sqlreports-service
  • DEMO\sqlanalysis-service

Next I chose Windows Authentication only, added my admin account, and provided credentials for the service accounts. Finally, I chose to "install but do not configure" Report Server. I want to save that for when I prepare this same machine as a BI demo server at some future date.

Click Next, Next, Next, and watch the blue bars grow!

As a preliminary step, I made some aliases and host records in DNS, in preparation for using this SQL Server with Kerberos and SharePoint. SPDB resolves to the name of the SQL server, in case we need to move the SQL databases later on. (But, because it's a CNAME, we should use SetSPN against the actual A record for the server itself. See more Kerberos stuff farther down in this article for details.)

 


I tested my DNS host names and CNAMEs to be sure they work.

We have to run these two commands to register Service Principal Names for the SQL Server NetBIOS and FQDN names:

setspn -A MSSQLSvc/mastercontrol:1433 DEMO\sql-service
setspn -A MSSQLSvc/mastercontrol.demo.colossusconsulting.com:1433 DEMO\sql-service

 

And don't forget to visit AD afterwards and enable the DEMO\sql-service (SQL Server service account) for delegation. The DEMO\MASTERCONTROL$ (computer account) already has this right because it's a domain controller, but it would need to be set if you installed SQL on a separate server.

Step 5: Install OS on SmartyPants
Having completed the basic installation for our domain controller and database server, we can move on to our web server. This machine will run SharePoint 2010 and an ADFS Proxy.

It's a good thing I did this while I was waiting for other stuff to finish earlier, or I'd be taking a nap now. The steps are essentially identical to Step 1 above. Windows Server 2008 R2 Standard Edition is fine here too. Do your network configuration, rename the machine, and join the server to the DEMO domain.

And, for the love of God, Montresor! Change the time zone on your servers, before it's too late!

Step 6: Install ADFS 2.0 Proxy on SmartyPants
Same comatose installer, except this time we tell it we're installing a proxy instead of a server.

Unlike the domain controller, which already had a certificate, no SSL certificate has been set up on the SharePoint box. We need to do that now in the usual way, by requesting a certificate from the Domain CA in IIS.
 

 

 


Here your Common Name needs to be the DNS name you will use for the web site. Fill in all the other information as required. (Note: the *correct* value for State is the full name, not the postal abbreviation.)

Little Known Fact(s): There's actually a "Johnson Cave" in the vicinity of Knoxville, TN
ASHPD = Aperture Science Handheld Portal Device

   

   

 
I had a little trouble getting the certificate issued, because my DC's CA wasn't set up to let the admin from the sub-domain request certificates. But, a quick jump over to the CA and I was able to issue the certificate manually and export it to a shared folder where I could use "Complete Certificate Request" (right click on the screen above) to bring it into IIS.

Set the bindings for Default Web Site in IIS to use the new certificate for SSL transactions on port 443.

Now, start the ADFS admin tool and the configuration wizard should begin automatically.

 


Remember, the ADFS 2.0 Management console is located in Start Menu > Administrative Tools.

 


Here, enter the DNS name for the ADFS Server created earlier.

 


At this point, ADFS wants an account to establish a trust relationship. If you make sure that DEMO\adfs-service has admin on MasterControl, then you can use it here. If not, you'll have to fall back on the DEMO\Administrator account instead. :-(

 

I went into Services and changed the account for the ADFS proxy to run as DEMO\adfs-service instead of NETWORK SERVICE. This is just my preference.

 

After finishing the wizard, we still need to go into DNS configuration on the domain controller and create a DNS record for "login.aperturelabs.local".

 

Make sure you update Network adapter properties and set the DNS server for SMARTYPANTS to point to the IP for MASTERCONTROL. Do a couple ping tests to be sure that it works.

Okay. So far, so good.

A Special Note from Captain Hindsight: As I started using the ADFS proxy, I noticed some event messages in my Audit logs telling me that the firewall was blocking connections to the ADFS service. You can head this problem off at the pass by following a similar procedure to allow connections to the ADFS service, the way I describe for the SQL service on the other box. You can read about this in detail under Pre-Requisites for Configuring SharePoint, a few sections down from here.

Now, keep in mind that if you run SharePoint on SSL, you're going to need another IP address on SMARTYPANTS besides the one you already configured, and another SSL certificate to boot! But, we're going to kick that down the road a ways and figure it out later, when we really need it.

 


Note that now IIS has the /adfs and /adfs/ls virtual directories that we should expect.

One benefit to using proxies is that we can put custom branding on each separate proxy, even when we are using a single AD/ADFS server as our identity claims provider.

Step 7: Install SharePoint Server 2010 on SmartyPants


This screen should be familiar to everybody, right?

 


Install Pre-requisites. Even though we've installed every Windows Update and started with Service Pack 1, there are still components missing. Later on, we'll update WIF to the newer version.

 


Later on, I'll need to verify that we have the latest and greatest WIF on here.

Run the SP2010 installer. Have your product key ready at this point.

 


Since we put SQL Server on MASTERCONTROL, we're technically doing a Farm install. I don't know about you, but I don't think I've ever actually hit that top button on this screen.

 


Once again, we need to specify that we are *not* using SQL Express in this scenario.

 


Beyond these steps, the actual installer itself is also quite boring. Tell you what – let me go get myself a cup of coffee while this thing runs.

 


Before we run the configuration wizard, we're going to install Service Pack 1 and the latest CU for SharePoint - so don't run the Configuration Wizard just yet.

There was an update in June that came out right on the heels of SP1, and there may have been other ones since then. In any case the June CU had a bit of a rocky start and you may need to download a refresh. According to Microsoft, "If you installed a version of the June 2011 cumulative update that you downloaded prior to June 30, 2011, you need to download and install the update refresh from the download center."

The SharePoint Update Center reports the following as of this writing in early August 2011:

  • Latest Service Pack: SP1 (KB 2510690 and KB 2510766)
    • SP Foundation SP1 direct download
    • SP Server SP1 direct download
  • Latest Cumulative Update: June CU Refresh (KB 2553023)
    • SP Foundation CU direct download
    • SP Server CU direct download
  • Latest Public Update: July PU (KB 2582185) download from Windows Update(?)
    On the day I was setting this up, the hotfix request system was temporarily down. Fun!

 


So, a few hours later, I actually had all the files I needed.

There's one thing I see a lot of newbies mess up (and even some people with experience forget this occasionally), so I will underline it in flashing red double-bold typeface: you need to install the update(s) for SharePoint Foundation as well as the ones for SharePoint Server. There, now that I've gotten that out of my system, I can continue.

Step 8: Pre-requisites for Configuring SharePoint
We need to make sure we meet some prerequisites in order for SharePoint to connect to SQL Server. Otherwise, you're going to get stuck at the database screen of the PSCONFIG wizard, shown below.

 

Correct DNS naming, permissions, and open ports are all needed for this to work correctly. (I can't tell you how often a SharePoint install has gone over its time budget because I found out too late that the network admins had everything tightly locked down.)

DNS Guidelines for SQL Server
Here are some tips on how to go about providing a name for the SQL server.

  • Earlier, when I installed SQL Server, I created an alias called SPDB. This was so we can redirect it later if needed - for example, if we want to create a third VM dedicated to SQL Server.
  • The SPNs for such DNS based pseudo-aliases should point to A (host) records, not CNAMEs. If you use a CNAME to define your pseudo-alias for SQL, set your SPN to the A (host) record. (See my notes about Kerberos below.)
  • In this case, since I'm not changing SQL port numbers, I can create this alias in DNS alone. If you use custom ports, you'll need to create a *real* SQL alias using the SQL Server Configuration Manager. To do so requires installing the SQL client connectivity tools on the SharePoint machine.

Creating Service Accounts in AD
I created more accounts in AD:

  • DEMO\spdba
  • DEMO\spservice
  • DEMO\spadmin

DEMO\spdba is my SharePoint database access account that I'll use when I run PSCONFIG. I'll use the other accounts later on for the default services and farm admin accounts.

Granting Correct Security Roles in SQL Server
For the install wizard to run, I need to give the database access account both the dbcreator and securityadmin server roles in SQL Server. To add a login and grant the correct permissions, open SQL Management Studio on MASTERCONTROL, then connect to the server and expand Security > Logins.
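If you'd rather script those clicks, a rough SMO equivalent would look something like this (a sketch, assuming default Windows authentication to the server; references the Microsoft.SqlServer.Smo assemblies):

    using Microsoft.SqlServer.Management.Smo; // Microsoft.SqlServer.Smo.dll and friends

    class GrantSharePointSqlRoles
    {
        static void Main()
        {
            Server server = new Server("MASTERCONTROL");

            // Create the Windows login for the SharePoint database access account.
            Login login = new Login(server, @"DEMO\spdba");
            login.LoginType = LoginType.WindowsUser;
            login.Create();

            // Grant the two fixed server roles PSCONFIG needs.
            server.Roles["dbcreator"].AddMember(@"DEMO\spdba");
            server.Roles["securityadmin"].AddMember(@"DEMO\spdba");
        }
    }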

 

 

 

Opening SQL Ports on the Firewall
We also need to make a rule in Windows Firewall on MASTERCONTROL to let our SQL traffic go from SharePoint to SQL Server. You'll recall that earlier we got a warning about this. Do this under Control Panel > Windows Firewall > Advanced Settings (look in the links on the left hand side).



We need to add a rule under Inbound Rules to let SQL traffic into the machine.

 

 


Choose All Programs and hit the Customize button next to Services.

 


Find the MSSQLSERVER service. Optionally, you can repeat this process to make rules for other SQL services you want open.

 


This is just a demo, so keep it simple, stupid.

 


Hit the Add button for the second pick-list to add a range.

 


Limiting access to the local subnet reduces our exposure while requiring fewer configuration changes later on.

 


Allow all traffic to open this service completely. Since we're making this rule to allow just the local network access, that should be secure enough.

 


Limiting access to the domain network should suffice.

 


Give your rule a name and explanation.

Ports may also need to be opened in your hardware firewall. The way I set up the Windows Firewall here will work regardless of the server's port settings.

Step 9: Configure SharePoint
Run the PSConfig.exe configuration wizard from Start > SharePoint 2010 Products and Technologies > SharePoint 2010 Configuration Wizard.

 

 


This is the essential screen I talked about earlier. The server name, firewalls, user account, and permissions have to be right in order to get past this point.

 


Don't forget to store your farm passphrase someplace safe, like in KeePass or an unencrypted text file on the root of C drive. ;-)

 


I want Kerberos to work, so I chose Negotiate and configured a custom port number (53281) instead of a random one. A fixed port makes it easier to assign an SPN to the Central Administration web site. However, before I do, I need to confirm that I know the account that'll be used as its application pool.

Little known fact: 53280 and 53281 are the addresses for the screen border and background colors on the Commodore 64 home computer. I use these because since early childhood I've had thousands of these now useless trivialities memorized, so I prefer to put them to work when assigning ports.

 


Continuing, you'll get the "well, duh" warning at this point. Umm, yeah. Pretty sure I just said basically the same thing a couple paragraphs ago. :-) I know we are doing things the hard way, as I'll explain later.

 

 


And it's time to refill my coffee!

Oh yeah, one other thing. You have to do an IISRESET at this point too. Although in a moment you'll see why we'll have to do a lot more than that. ;-)

Step 9(a): Configuring and Testing Kerberos
This is where you hit the roof, since you just locked yourself out of Central Admin. ;-) When the SharePoint configuration wizard finishes, it will report success. However, when it takes you to the Central Administration web site, you'll try to log in 3 times, and it'll fail into a blank browser page. This is because Kerberos is not properly set up for the Central Administration web site.

If you are desperate, one quick workaround that is *very* temporary, and also not a very good idea, is to turn on Kernel Mode Authentication in IIS for the Central Admin site. It'll get you into the site, and earn you a lecture from Spencer Harbar. ;-)

Another thing you can do is use PowerShell to turn off Negotiate/Kerberos and revert to NTLM. (At the time of this writing, I looked for such a script, and couldn't find one.) By the way, I also like the suggestion that some people have made, to set up Central Administration for the first time in NTLM mode, and then extend the web application into a new site that uses Kerberos. That way, you always have the NTLM version to fall back on if Kerberos fails, and you can use the extended application to configure Central Admin as a load-balanced site. As cool as that is, it's totally unnecessary for this demo, so I'm saving it for some other time.

Configure the SPN for Central Admin
A better idea (at least in the long run) is to get Kerberos working, starting by doing the SPN set up.

 


Let's go into IIS and figure out the application pool account.

 


Go to Advanced Settings for the Central Administration web site, then look for Identity property.

So we have our answer there; it's the same account we put into the configuration wizard a few minutes ago for SharePoint database access.

I ran these commands to make an SPN for the CA web site:
setspn -A HTTP/smartypants:53281 DEMO\spdba
setspn -A HTTP/smartypants.demo.colossusconsulting.com:53281 DEMO\spdba
setspn -A HTTP/smartypants DEMO\spdba
setspn -A HTTP/smartypants.demo.colossusconsulting.com DEMO\spdba

The first time I did this, I only used the first two commands, and it didn't work the way I'd intended. What followed was an exercise in frustration that shouldn't surprise anyone who's ever configured Kerberos before – or tried. After installing KerbTray and WireShark on both servers, several reboots, and hours of lost productivity, I finally got enough information out of the system to tell me why I was unable to log in to Central Admin. Ultimately, though I tried several things, I couldn't log in until I added the last two lines (shown above), which correspond to port 80. Pretty weird, I must admit. (I should do some research to figure out why that might be.)

 

Once you finally do get into Central Admin, don't forget to add an AAM for the FQDN. :-)

In a little while, I'll have to configure more of these SPNs for the SharePoint web sites, but I haven't created those application pool accounts yet – all in good time.

Other Stuff I Tried That Helped - Maybe
As an additional step during my troubleshooting process, I changed the following as described on MSDN Blogs, in C:\Windows\System32\inetsrv\config\applicationHost.config under the system.webServer > security > authentication section:

<windowsAuthentication enabled="true" useAppPoolCredentials="true">
  <providers>
    <add value="Negotiate" />
    <add value="NTLM" />
  </providers>
</windowsAuthentication>

Is it really necessary? I'm not sure, since by itself it didn't fix the problem I was trying to troubleshoot.

Note that in the above case, the command described in the MSDN blog article failed:

C:\>c:\windows\system32\inetsrv\Appcmd set config "SharePoint Central Administration v4" /section:windowsauthentication /useAppPoolCredentials:true /commit:MACHINE
ERROR ( message:Configuration error
Filename: \\?\C:\inetpub\wwwroot\wss\VirtualDirectories\12592\web.config
Line Number: 0
Description: The configuration section 'system.webServer/security/authentication/windowsAuthentication' cannot be read because it is missing a section declaration. )

I also added reverse DNS entries for both machines. Maybe it helped, maybe not. But, certainly this is something you should do.

Also, if I haven't mentioned it yet (I'm pretty sure I did), make sure all your servers are set to the same time zone and have their time set correctly. Windows installs default to Pacific Time (Microsoft-centric, naturally), so if the rest of your domain is running in EST that won't help. Time synchronization issues are a common cause of Kerberos failures.

Don't forget to visit AD and trust the DEMO\spdba and DEMO\SMARTYPANTS$ accounts for delegation like before. I'll note here that Spence says this isn't needed, and he's probably right. Frankly, I had enough trouble getting Kerberos working to begin with. This is just a demo environment! (Later, I plan to experiment with this and update this article accordingly.)

Another Side Note about Kerberos
Many folks - who are much more experienced than I am on this topic - report that use of CNAMEs causes problems with Kerberos. They might very well be right. However, I haven't had any issues come up yet. If I learn different, I'll let you know.

My rule of thumb is to assume that Kerberos (or maybe it's Internet Explorer) will resolve a CNAME and replace it with the underlying A record's DNS name *before* completing the process that involves looking up the SPN. So, from Kerberos' point of view, it's like the CNAME doesn't even exist. Thus, if you use a CNAME in DNS, don't use it when defining your SPNs; use the A record that it points to instead.
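If you want to see which name Kerberos will actually end up working with, one quick way is to resolve the alias in code; GetHostEntry chases the CNAME chain and hands back the canonical host name. A tiny sketch, using the SPDB alias from earlier:

    using System;
    using System.Net;

    class ResolveSqlAlias
    {
        static void Main()
        {
            // Chases the CNAME; HostName comes back as the canonical (A record) name.
            IPHostEntry entry = Dns.GetHostEntry("spdb");
            Console.WriteLine(entry.HostName);
            // In this demo that should print mastercontrol.demo.colossusconsulting.com,
            // which is why the SPN stays MSSQLSvc/mastercontrol...:1433, not MSSQLSvc/spdb.
        }
    }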

I've never tried this with IE 6; IE 6 sucks, so if this doesn't work that's honestly just one more reason to stop using it forever.

And, if it isn't obvious, I'm certainly not suggesting that you should point 10 SharePoint sites with host headers to a single A record, because then you'd either end up with 10 sites using one application pool - or duplicate SPNs that just won't work. Use A records for your web sites; my CNAME trick is something I do pretty much just for the SQL Server.

Further Reading for the Truly Masochistic
Getting Kerberos to work is a pretty common problem, and troubleshooting Kerberos is so far outside the scope of this article, it isn't even funny. If you need help, here are some resources I used to fix it.

  • Everything Spencer Harbar Has Ever Written on Kerberos (that man is a SharePoint god!)
  • Troubleshooting Kerberos in a SharePoint environment (Part 1)
  • Troubleshooting Kerberos in a SharePoint environment (Part 2)
  • Troubleshooting Kerberos in a SharePoint environment (Part 3)
  • Troubleshooting the Kerberos error KRB_AP_ERR_MODIFIED

Step 10: Setting up SharePoint Services
Now, finally we can get to Central Administration to run the [second] configuration wizard.

 

Then, click on Launch the Farm Configuration Wizard (It's the only option shown, so IMHO it wasn't worth the screen shot.)

 


Here I entered the credentials for the "DEMO\spservice" account, and just left all the checkboxes in their default state.

 


<Yawn /> So much time has been wasted fixing Kerberos that it's now dark outside - time to grab a cold beer and soldier on into the night.

 


Success! But I bet that User Profile Synchronization Service probably *still* won't work ;-)

We've had a long day installing a whole bunch of software and dealing with the little issues that came up along the way. But, we've got all the ground work laid out for the good stuff to follow! Tomorrow, I'll go into detail on how to provision the SharePoint web application(s) needed for the demo to work.

Real World Claims Part 1: Prelude to a SharePoint Presentation

This post is the first of a multi-part series, and is a companion to my presentation titled Real World Claims in ADFS and SharePoint 2010. This presentation was given Saturday, August 13th, 12 noon at SharePoint Saturday, The Conference, Washington, DC, August 11-13, 2011. Additional links, slides, video recording, and other funny liner notes will be posted here as they become available.

 

I sit here at my computer on the Saturday only a week before the main event. It's been weeks, maybe even months, since I originally submitted my idea for a presentation at SharePoint Saturday. And while there were many things that prevented me from working on this presentation sooner, including my own nature as a chronic procrastinator, everything I've been doing for pretty much the whole year has been leading up to this moment.

The cursor blinks, expectantly. I imagine that it must be disappointed in me. Is it restless?

Where to begin? Do I start by trying to flesh out my slides? Do I review the notes I've been carefully collecting all this time? Sometimes staring at a blank page and figuring out what to write first can be the hardest part.

I fix myself a rum'n'Coke, and I head out to the office. It's raining, drearily, producing an early dusk. Our office is a small earth-sheltered room of about 300 square feet, carved from the back half of our 3-car garage and fondly referred to as "The Bat Cave" by friends of the business. The garage was utterly useless for parking cars, and entirely too useful for collecting large piles of useless stuff. Now, it serves as the nerve center for Liquid Mercury Solutions. There's a large U-shaped array of desks, comfortable chairs, and monitor arrays connected to nothing. The people who're entirely too accustomed to sitting elbow to elbow – and their laptops – are all gone for the weekend.

I sit down at my desk and fire up Pandora – Daft Reactor Society channel. It plays Mad as Hell by The Vandals, and I turn it up to 11. Howard Beale's screams echo into the empty space around me. Next stop: "The Zone".

(Links to other parts of this series will be posted here as they become available.)

Did You Know: Include XSL In SharePoint Search Results

 


Maybe some of you already knew this, but I sure didn't until I tried it. Thanks to Ian Morrish for his post about the XSLT Standard Library for helping me put the pieces together.

So my little journey into customizing SharePoint Search Results started a few days ago. I won't go into a ton of detail because there are plenty of good blogs out there on how to do this, and anything I have to say on the subject will likely go into our internal knowledge base. It suffices to say that the method of using XSL to transform the rendering of search results is about a thousand times better than the crazy web part development that I had to do in 2003 in order to accomplish the same results.

But my headaches began when I decided to try and make use of XSLT 2.0 function capability to leverage some nifty string manipulations like those found here. I needed to use something like "substring-before-last" to trim the file extension off the end of document titles (really, URLs) that were coming from a search protocol handler for content outside of SharePoint.

Try as I might, I could not get it to work, and the error information for XSL validation is hidden from you, so my attempts to determine the cause were only causing me more grief. I could switch the stylesheet to version 2.0, add the necessary namespaces, and even declare functions with xsl:function. But whenever I would try to call the function (even with constants) I would get that monolithic web part error "Unable to display this Web Part. To troubleshoot the problem, open this Web page in a Windows SharePoint Services-compatible HTML editor such as Microsoft Office SharePoint Designer. If the problem persists, contact your Web server administrator."

So, in the end here is what I was able to do. I downloaded the entirety of the XSLTSL zip file and uploaded its contents into a document library on my SharePoint site using Explorer View. (I safely ignored the error I got from one file that wouldn't upload.) This makes all the XSL files web accessible, but I suppose there could be other approaches that would work just as well, like uploading these files into a subfolder under "/_layouts/".

Now, I tried making the xsl:import tag work, but there was no love there. However, xsl:include seems to work just fine. I wonder why the SharePoint folks don't make more use of includes in their own XSL. Though I guess it wasn't strictly necessary, it would've made for cleaner code and easier customization of things like Data Form Web Parts and such.

So, once you have your library chock full of stylesheets, modify the top-level tag of your Search Results XSL to look like this:

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:doc="http://xsltsl.org/xsl/documentation/1.0" xmlns:str="http://xsltsl.org/string" extension-element-prefixes="doc str">
Then, add includes just below the last "xsl:param" tag, like so:

<!-- Import libraries for common use -->
<xsl:include href="/XSL%20Library/xsltsl-1.2.1/string.xsl" />
You can add as many includes as you like. Generally I try to only include the ones I will actually be using. The URL for the include statement should point to the path of the desired XSL document on your site. I was able to get both absolute URLs (with protocol and DNS names) and root relative URLs (leading slash) to work, but I didn't try file system based URLs to see if they would work.

Now, you can use the templates defined in the included file just like they were declared directly in the web part itself.

In my example, I put this tag between the document title and the description.

</xsl:choose>
<!-- code added - TC -->
<div style="font-weight:bold; color: red">
  <xsl:call-template name="str:to-upper">
    <xsl:with-param name="text" select="title" />
  </xsl:call-template>
</div>
<!-- end code added - TC -->
<div class="srch-Description">
The results look like this:

Custom Search Results Screen Capture

If I were to dig through all these functions, I bet I could think up some fancy stuff to try. Better yet, why not make an XSL template library specific to SharePoint and share it with the community?

 

What tool should you use to Deploy SharePoint Projects?


At this point, I have tried them all. I hand rolled my solutions; used early versions of WSPBuilder; struggled with the interface for VSeWSS 1.0, 1.1 CTP, and 1.1 final; and even tried a few offbeat solutions.

Until this summer, every method had serious problems that made packaging and deployment problematic. Revisiting my posts, I see that I never mentioned this before, but I owe props to my colleague Ted Calhoon, who turned me on to the WSPBuilder Extensions. I switched over, and I never looked back.

http://www.codeplex.com/wspbuilder/Release/ProjectReleases.aspx?ReleaseId=16820

They are a bit counterintuitive, and the commands to flesh out the folder structure are well buried, but once you get the hang of them, they are the best around - at least until someone comes up with something better.

In the future, I'll do a walkthrough.

Update 5/1/2010: Microsoft has finally delivered a tool of their own with Visual Studio 2010. We've played with it a few times, and may still need to use the WSPBuilder extensions for some time to support our legacy projects. They're very similar, and it appears that the SharePoint team got this part right for the new version.

OMG W[o]TF CAML!? Choice Field Pet Peeve

Actually I had a different pet peeve today, but I'll blog that later. There'll be some code - and ice cream for those of you who get here while there's some left.
 
Okay, SharePoint can be awesome, but sometimes it can also be pretty damned stupid. Here is a classic example of that dichotomy.
 
This is the documentation for a CHOICES / CHOICE element collection in CAML, from the MSDN version of the WSS 3.0 SDK.
 
CHOICES Element (List)
Used to define several choices within a field for a drop-down list.

...

CHOICE Element (List)
Used to define a choice within a Choice field.

 
<CHOICES>
  <CHOICE Value="Text"></CHOICE>
  <CHOICE Value="Text"></CHOICE>
  ...
</CHOICES>

Attributes

Attribute: Value
Description: Optional Text. Can be used to specify an identifier for the choice.
Now, reading that, you might be led to believe that you could create a CHOICES collection like this to have SharePoint use value codes on the back end of its DropDownList's option tags in HTML.

<CHOICES>
  <CHOICE Value="MD">Maryland</CHOICE>
  <CHOICE Value="DE">Delaware</CHOICE>
  <CHOICE Value="VA">Virginia</CHOICE>
  <CHOICE Value="PA">Pennsylvania</CHOICE>
</CHOICES>

Well, you would be WRONG! Wrong, as in "You get NOTHING! Good day, sir!" wrong. In fact, as nearly as I can tell, the Value attribute of this element is less than useless. It's not invalid if you use it - sometimes. (Actually, sometimes it causes validation errors and sometimes it doesn't.) It [usually] doesn't break anything; it just doesn't do anything, and the documentation leads you to believe that it should, which I think has the potential to cause a huge waste of time.

So, a big WoTF (wag of the finger) to Microsoft, for fragging this one up royally. And, I guess with that comment, I shouldn't expect to be getting that MVP nomination I've always wanted any time soon, huh? Oh well!

Update
I dug into this problem a little deeper. At first, I believed that you could provide a multi-column style value-text pair in the CHOICE element text, similar to what you'd find in a ModStat field type. However, this turned out to be a pipe dream.

  <CHOICE>MD;#Maryland</CHOICE>

As far as I can tell so far, what is actually going on is that the XSD schema for WSS is actually not consistent with the online documentation. See the following from wss.xsd:

  <xs:complexType name="CHOICEDEFINITIONS" mixed="true">
    <xs:sequence>
      <xs:element name="CHOICE" type="xs:string" minOccurs="0" maxOccurs="unbounded" />
    </xs:sequence>
  </xs:complexType>

Well, it's no wonder that when I try to use the Value attribute in my custom content types and other features, I get nasty XML validation messages from Visual Studio telling me that the Value attribute is not valid in the CHOICE element - because it's not.

I don't recommend you try this on a production box, but what I did was to patch wss.xsd as follows.

<!-- BEGIN CHANGE BY TCARPE 11/1/2007 -->
  <xs:complexType name="CHOICEDEFINITIONS" mixed="true">
    <xs:sequence>
      <xs:element name="CHOICE" type="CHOICEDEFINITION" minOccurs="0" maxOccurs="unbounded" />
    </xs:sequence>
  </xs:complexType>
  <xs:complexType name="CHOICEDEFINITION">
    <xs:simpleContent>
      <xs:extension base="xs:string">
        <xs:attribute name="Value" type="xs:string" use="optional" />
      </xs:extension>
    </xs:simpleContent>
  </xs:complexType>
<!-- END CHANGE BY TCARPE 11/1/2007 -->

That made Visual Studio and VSeWSS behave themselves, but it's not clear to me yet whether it will actually have any effect on the behavior of the Choice field and its DropDownList. In theory it should, since I see CAML in the RenderTemplate for CHOICE that retrieves the Value attribute - if it exists.

From FLDTYPES.xml:

<HTML><![CDATA[<SCRIPT>fld = new ChoiceField(frm,]]></HTML>
<ScriptQuote><Property Select="Name"/></ScriptQuote>
<HTML>,</HTML>
<ScriptQuote><Property Select="DisplayName"/></ScriptQuote>
<HTML>,</HTML>
<ScriptQuote><Column/></ScriptQuote>
<HTML>); fld.format = "</HTML>
<Property Select="Format"/>
<HTML>"; </HTML>
<Switch>
  <Expr><Property Select="FillInChoice"/></Expr>
  <Case Value="TRUE">fld.fFillInChoice = true;</Case>
</Switch>
<ForEach Select="CHOICES/CHOICE">
  <HTML>fld.AddChoice(</HTML>
  <ScriptQuote><Property Select="."/></ScriptQuote>
  <HTML>, </HTML>
  <ScriptQuote><Property Select="Value"/></ScriptQuote>
  <HTML>);</HTML>
</ForEach>
<Switch>
  <Expr><Property Select="Required"/></Expr>
  <Case Value="TRUE">fld.fRequired = true;</Case>
</Switch>
<HTML><![CDATA[fld.IMEMode="]]></HTML>
<Switch>
  <Expr><Property Select="Type"/></Expr>
  <Case Value="Lookup"><HTML><![CDATA[inactive]]></HTML></Case>
  <Case Value="DateTime"><HTML><![CDATA[inactive]]></HTML></Case>
  <Case Value="GridChoice"><HTML><![CDATA[inactive]]></HTML></Case>
  <Case Value="Calculated"><HTML><![CDATA[inactive]]></HTML></Case>
  <Case Value="Currency"><HTML><![CDATA[inactive]]></HTML></Case>
  <Case Value="Number"><HTML><![CDATA[inactive]]></HTML></Case>
  <Case Value="User"><HTML><![CDATA[inactive]]></HTML></Case>
  <Case Value="Boolean"><HTML><![CDATA[inactive]]></HTML></Case>
  <Default><Property Select="IMEMode" HTMLEncode="TRUE"/></Default>
</Switch>
<HTML><![CDATA[";]]></HTML>
<HTML><![CDATA[fld.BuildUI();</SCRIPT>]]></HTML>

Clearly, somebody over there at MS was expecting this attribute to be declared - at least sometimes. I really have to wonder how this fell through the cracks, and if I'll be doing more harm than good with this particular tweak.

So, I just gave it a try to see what happens. Sadly, there is no joy in Mudville today. Mighty SharePoint has struck out. Using this:

    <CHOICES>
  <CHOICE Value="C">Client</CHOICE>
  <CHOICE Value="P">Partner</CHOICE>
  <CHOICE Value="S">Staff / Professional</CHOICE>
  <CHOICE Value="V">Vendor</CHOICE>
  <CHOICE Value="O">Other</CHOICE>
    </CHOICES>

SharePoint gives me:

     <option selected="selected" value="Client">Client</option>
     <option value="Partner">Partner</option>
     <option value="Staff / Professional">Staff / Professional</option>
     <option value="Vendor">Vendor</option>
     <option value="Other">Other</option>

I give up! I don't think anything short of creating a custom field type is going to solve this problem.

The ramifications of this "bug" are actually far-reaching and profound. For example, let's suppose you have a custom field (or site column) with custom properties defined, and you want to assign a value from a CHOICES collection to a property of an enumerable type in your own class derived from SPField. If the Value attribute were working as claimed, you could assign it the string equivalent of each value of the enumeration, then use a simple call to Enum.Parse() to do the conversion. Instead, you are left with the ugly choice of writing custom parsers. (Choose your poison: property attributes on the enum, or a hard-coded switch-case statement, as sketched below.) And you'll have to write one for each choice property you want to create this way. This increases the overall amount of time spent writing code and doing maintenance, which is the opposite of why we use SharePoint in the first place.
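For the morbidly curious, here's roughly what such a hand-rolled parser ends up looking like. The ContactRole enum and choice strings are hypothetical, mirroring the CHOICES example above:

    using System;

    public enum ContactRole { Client, Partner, Staff, Vendor, Other }

    public static class ContactRoleParser
    {
        // Maps the display text SharePoint actually stores back to the enum,
        // since the Value attribute never makes it into the stored data.
        public static ContactRole Parse(string choiceText)
        {
            switch (choiceText)
            {
                case "Client":               return ContactRole.Client;
                case "Partner":              return ContactRole.Partner;
                case "Staff / Professional": return ContactRole.Staff;
                case "Vendor":               return ContactRole.Vendor;
                case "Other":                return ContactRole.Other;
                default:
                    throw new ArgumentException("Unknown choice: " + choiceText);
            }
        }
        // If Value worked as documented, this whole class would collapse to
        // (ContactRole)Enum.Parse(typeof(ContactRole), storedValue);
    }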

For now, I guess the only thing I can really take away from this experience, other than frustration, is an affirmation of one of the great truisms of software engineering as coined by the irreverent Fred Brooks: "Documentation lies."

InfoPath Forms "Dropdown of Repeating Self" Bug Stomp


I was fooling around with InfoPath this weekend, trying to create a generic form and data definition for a flexible ad hoc workflow.

One thing I wanted in this form was the ability to specify dynamic routing between workflow steps. So, I created a repeating table for the steps and I also bound a repeating section to the same repeating group in the data. This let me summarize the steps in the table and then let the user specify the properties for each step in the details section below.

In the details section, I have dropdowns for Approve To, Reject To, and Overdue To that are meant to tell the workflow which step it should switch to when these events take place. Each of these dropdowns is meant to be populated by the list of steps.

Here is a shot of the designer:

Screenshot of the Designer View in InfoPath

So, in the data the XML looks like this:

Workflow Data Source Schema

Basically, I have the dropdowns bound to RouteTo in each case, and I want them to let the user pick from ID and Title for the entire list of WorkflowTask items.

This is where InfoPath barfed, showing a glaring weakness. When you bind the dropdown list's items to the form data and specify the repeating group as //WorkflowTasks/WorkflowTask, it *should* encode that as relative to the parent element, giving you the full list; instead, it encodes it relative to the current element (which turns out to be just '.'), giving you only the current WorkflowTask node.

I should also note that if you point the list to a repeating node that is outside the parent path of the node of your dropdown list, then it will behave itself normally and give you all the nodes under that group.

Though it may be okay in some cases, for a dropdown list this behavior is less than useless. The field browser should at least give you enough control to specify a different XPath query, which it does not, leaving you stuck with what the wizard lets you pick. I will completely ignore the obvious usefulness of an XPath expression complete with functions, etc. that could've easily allowed for dynamic filtering of the list items.

So, my workaround was to save the IP template as source files, and then go into the source for view1.xsl and find the specific XSL that does the transform for the list. In my case, it looks like this.

I search for "Route To", which is my display name for the dropdown. Below, look for code like:

<xsl:for-each select="."><-- This is the offending xpath
<option>
<xsl:attribute name="value">
<xsl:value-of select="@my:ID"/>
</xsl:attribute>
<xsl:if test="$val=@my:ID">
<xsl:attribute name="selected">selected</xsl:attribute>
</xsl:if>
<xsl:value-of select="@my:DisplayName"/>
</option>
</xsl:for-each>

So, to get the complete list of nodes when you are already inside of it, just swing up to the parent node and select all the desired sub-nodes, like so:

<xsl:for-each select="../my:WorkflowTask">

Once I am done, I open manifest.xsf and use Save As to get back to the original .xsn form template. It isn't clear to me how long this hack will hold, but I was able to preview the modified template, and even make changes to other parts of the form, without it reverting to the undesirable '.'.

Here's the preview of the form:

[Screenshot of the InfoPath form in preview mode]

It's not the most elegant solution in terms of maintainability, but it was a lot easier than writing custom code behind (and in some cases that isn't necessarily an option anyway), so I am glad to have found it.

Come to think of it, though I haven't tried it yet, you could hack the select with an XPath expression to filter the dropdown; see the sketch below. If anyone gives that a shot, shoot me an email or comment and let me know. (Send it to anything at thomascarpe.com until I get my blog comments working.)
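
As a quick, untested sketch of that idea: a predicate on the hacked select, reusing the my:ID attribute from the markup above, could (for example) keep a step from offering itself as a routing target:

<xsl:for-each select="../my:WorkflowTask[@my:ID != current()/@my:ID]">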

Update:
I did eventually see some weird-looking XSL in the dropdowns after a couple of successive load/save cycles. It caused the currently selected value to appear as an extra choice within the list, which was completely undesirable. I simply used the same hack as above to rid myself of the annoying extra option tag. :-)

Custom Fields and Content Types

In a word: BE CAREFUL.
 
Okay, that is actually two words. :-)
 
Just be careful about the naming conventions that you use and how you change them later. If you need to uninstall / reinstall a custom field and control, you will likely have problems with the lists that were built on top of it. You will also have problems with the SharePoint Site Column and Content Type management pages. Needless to say, this is NOT good.
 
Fortunately, it looks like the fields can be manually deleted using the SharePoint web services. I imagine it is also possible to do this from stsadm, but I never much liked command line tools, so I am trying now to create a little Windows Forms application that will do the job for me.
 
I will post my code back here when I get it stable. For now, just be careful about changing or removing custom field names and namespaces, etc.
 
Update:
 
Okay, there is only a very small amount of useful stuff online at this point, so I will try to be as specific as I can - for a Friday evening.
 
My Issue and Its Root Cause
 
Create a custom field control and field type and install them to SharePoint either by hand, with an MSI, or a WSP. In particular, I want to focus on the FLDTYPES_xyz.xml file (or whatever you decide to call yours). This is the file that tells SharePoint how to make use of your custom field control.
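
For reference, a skeleton of such a file looks roughly like the following; every name here is a hypothetical placeholder, and the TypeName line is the one this whole story turns on.

   <?xml version="1.0" encoding="utf-8" ?>
   <FieldTypes>
    <FieldType>
     <Field Name="TypeName">MyCustomField</Field>
     <Field Name="ParentType">Text</Field>
     <Field Name="TypeDisplayName">My Custom Field</Field>
     <Field Name="TypeShortDescription">My Custom Field</Field>
     <Field Name="UserCreatable">TRUE</Field>
     <Field Name="FieldTypeClass">MyNamespace.MyCustomField, MyAssembly, Version=1.0.0.0, Culture=neutral, PublicKeyToken=...</Field>
    </FieldType>
   </FieldTypes>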
 
Now, suppose you decide later that you don't like the name of one of your fields. Maybe you misspelled it, or maybe (like me) you just decided you didn't want any naming ambiguity between your custom fields and those other developers might provide. So, you rename your field in the definition. You think it will be okay because you're going to uninstall and reinstall your solution anyway.
 
Here is the gotcha. If you have used your custom field in any list, then a reference to it will have been added to the SPWeb.Fields collection, which is a master list of all the field definitions used across your entire web site. Additionally, if you have added it to any of your content types, then a reference will exist in the form of an SPFieldLink in the SPWeb.ContentTypes[x].FieldLinks collection.
 
If you remove the FLDTYPES.xml file that these references are relying upon, or change the names in it, then you'll see some pretty interesting behavior out of SharePoint! Hopefully what I am posting here will save somebody the entire workday that this issue cost me.
 
What manifests is that your lists, the content types admin page, and the site columns admin page will all be broken by this change. In particular, the site columns page gives the error:
The given key was not present in the dictionary.
Very helpful indeed - as helpful approaches zero. The other pages show other cryptic messages as well, but I did not capture them here. Maybe some time I will go back and repro the error so I can take some screenshots. Sadly, nothing I did to the CustomErrors settings on my server allowed me to see a stack trace.
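
(In hindsight, I suspect that's because SharePoint also gates stack traces behind the CallStack attribute of the SafeMode element in web.config; ASP.NET's customErrors alone isn't enough. If memory serves, both of the following need flipping before you'll see a real stack trace:)

   <customErrors mode="Off" />
   <SafeMode ... CallStack="true" ... >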
 
What you can do to fix this behavior is change the TypeName property back to what it was in your XML file - assuming you can remember what it was.
 
<Field Name="TypeName">MyOldTypeName</Field>
 
If you can't remember the name that is causing the problem, you can use the SharePoint API to find it.
 
Simply open your web site with an SPWeb object, then attempt to iterate through the SPWeb.Fields collection. You will immediately receive an exception along the lines of "FieldType named 'x' is not properly installed. Go to the Site Collection page in Site Settings to correct the problem." Of course this is bollocks, because that web page is broken by this very problem.
 
   using System.Collections.Generic;
   using Microsoft.SharePoint;

   string customColumnName = "My_x0020_Custom_x0020_Column";
   string contentTypeName = "MyContentType";
   using (SPSite site = new SPSite( "http://spdev" )) {
    using (SPWeb web = site.OpenWeb( "/site1" )) {
     // just some useful code for getting the list of names
     List<string> contentTypeNames = new List<string>();
     SPContentType ct = web.ContentTypes[contentTypeName];
     foreach (SPFieldLink link in ct.FieldLinks) {
      contentTypeNames.Add( link.Name );
     }
     List<string> fieldNames = new List<string>();
     // this is where the exception gets thrown, naming the missing field type
     foreach (SPField field in web.Fields) {
      fieldNames.Add( field.Title );
     }
     // delete the column from all content types
     ct.FieldLinks.Delete( customColumnName );
     ct.Update( true );
     // delete the site column / field (only succeeds once FLDTYPES is restored)
     web.Fields.Delete( customColumnName );
     web.Update();
    } // using SPWeb
   } // using SPSite

 
In fact, as it turns out, my first approach was totally wrong: I was trying to use the web service or API simply to remove the offending site column, since I didn't want to keep it anyway. I was able to get rid of the reference to the column in the content type, but SPWeb.Fields.Delete("FieldName") throws the same error I describe above, so you can't actually delete the column as long as it's still in a broken state. Nice!
 
But now you will have the name of the field type that can't be found - and hopefully some ability to stitch together a FLDTYPES.xml entry to match it, either using your own control or one of the base field controls from SharePoint. Having corrected the issue, you should be able to use the pages on Site Settings to delete the offending column and/or remove it from any content types.
 
However, bear in mind that if the column is in use in any lists, then it will be removed and any data housed there will be lost. If you need to keep your data, you will either have to find a way to move it into another field or else leave the naming conventions the way they were.
 
What I Learned About SharePoint from This Mess
About the most useful thing I learned today is that it's about 1,000 times easier to make programmatic changes to columns and content types using the API than using the web services. Sure, the web services are there, and they have their uses, I suppose. But you'll have to learn to write some CAML and get used to parsing your results with XPath queries, because the results don't deserialize (at least not easily).
 
I wrote a nifty applet that can connect to a web site using the Webs.asmx web service and let you browse site columns and even select one for deletion. Getting the UI populated was a chore, because the XML the web services return needed to be massaged to strip the namespaces so that SelectNodes would actually work. (It was either that or implement a NamespaceManager, which, let's just say, wasn't going well.) In the end, though I got all the code written to pick a column from a column group and delete it, it didn't work anyway! That was because of the issue that caused the exception I describe above, I guess. But across the web service boundary, the only error information you will see is "80004005 Operation Failed". Sadly, that's not very useful.
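
(For anyone braver than me, here is a minimal sketch of the NamespaceManager route I abandoned, assuming a proxy class named Webs generated from Webs.asmx with Add Web Reference; the trick is registering SharePoint's soap namespace so SelectNodes has a prefix to match:)

   using System;
   using System.Net;
   using System.Xml;

   Webs ws = new Webs(); // hypothetical proxy for http://spdev/_vti_bin/Webs.asmx
   ws.Url = "http://spdev/site1/_vti_bin/Webs.asmx";
   ws.Credentials = CredentialCache.DefaultCredentials;

   // GetColumns returns the site column definitions as one untyped XmlNode
   XmlNode columns = ws.GetColumns();

   // Register the namespace; a bare SelectNodes("//Field") matches nothing
   XmlNamespaceManager nsmgr = new XmlNamespaceManager(columns.OwnerDocument.NameTable);
   nsmgr.AddNamespace("sp", "http://schemas.microsoft.com/sharepoint/soap/");

   foreach (XmlNode field in columns.SelectNodes("//sp:Field", nsmgr)) {
    XmlAttribute displayName = field.Attributes["DisplayName"];
    if (displayName != null) Console.WriteLine(displayName.Value);
   }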
 
So, in the end, using the web services for this kind of administrative work is probably not worth it. The only real benefit you'd get for a lot of extra effort is that your admin tool could connect to servers across the internet even when nothing has been installed on the server side. If you control the server environment, it would probably be much better to write your own easy-to-use web service wrapped around the API functions you want to expose.
 
Another thing I learned is that the SharePoint web services still largely use CAML, passed as either strings or untyped XmlNode objects. These are somewhat difficult to work with, and I will have to do some research into how they can be serialized/deserialized into useful .NET objects or otherwise made easier to handle. On the bright side, I wrote a useful serialization tool for SPQuery CAML some time ago, and it looks like that will probably still be relevant, so I should go dust it off. :-)
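
(For the uninitiated, this is the kind of string-typed CAML I mean - a throwaway example against a hypothetical Tasks list, with no compile-time checking of any of it:)

   SPQuery query = new SPQuery();
   query.Query = "<Where><Eq><FieldRef Name='Status' /><Value Type='Text'>Open</Value></Eq></Where>";
   SPListItemCollection items = web.Lists["Tasks"].GetItems(query);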
 
Finally, I learned a lot about the naming conventions for columns and content types as they relate to the API, the web services, and the terminology on the web site itself. This can be pretty confusing, and there's not a lot of good documentation out there. But it's a little out of scope for today, and I'll have to put together another article on that topic some other time.
 
Now, it's off to New York to see my in-laws and hopefully spend a nice weekend with my head buried in my laptop and some comic books.
 
More Information:
Careful When Working with Sealed Site Columns
MSDN: SPFieldLinkCollection.Delete Method
MSDN: Webs.UpdateContentType Method
MOSS 2007: WSS 3.0: How to add/delete/update site columns by using SharePoint WebService
Tags: Architecture, studies, Tips, coding, sharepoint
