Error Creating WCF Connection for BCS Content Type in SharePoint Online


I started this week's Super SharePoint Detective Adventure while trying to follow Nick
Swan's blog article about creating a BCS External Content Type for CRM 2011. This kind of integration between SharePoint and CRM is something I've been wanting to prove out for our own use for a long time now, and Nick's approach (although it still involves a lot of "glue code") seems like the most reasonable one I've seen to date.

True, his article talks about SharePoint 2010, but he has another one based on 2013 and all the concepts look like they ought to be backward compatible. In fact, everything was going fairly well, until I got to the last step in SharePoint Designer where you actually create the ECT connected to your WCF service. Then, things just wouldn't work - no matter what I tried, SharePoint Designer kept giving me this error.

  • An error occurred while accessing WCF service URL: http://<myStagingGuid>
  • Connection to the WCF service that has connection name BcsTest cannot be established.
    • Unknown Error occurred. Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.

I could see that others online were having this problem too, as questions about it had been posted in a few places, but mysteriously, so far there was no answer:

  • MS Forums: Problem creating external content type from WCF service
  • Stack Overflow: Consume WCF Service in Office365 from window Azure

So, I did some experimentation. People say this works against on-premises SharePoint, so I point SPD at my on-prem SharePoint 2013 farm, and just as they say, everything works fine.

Maybe my types are too complex. After all, I am trying to pull a ton of data fields from a CRM Account. I create a much smaller web service with a data structure containing just a string with "Hello world!" and an integer ID of 1234. Still, I cannot create the data connection to this service either.

I spend some time trying to package up my data model into a *.bdcm file, and then do the export from my on-prem farm to SPO. No dice! This just breaks differently, because now when I go back to SP Designer my data connections are still broken, and my options from the SharePoint web UI are too limited to complete the configuration.

After beating my head against the wall all Friday evening - then stressing out all weekend about how I'm supposed to get a good hybrid solution with one part in the cloud and one part on our local network when I can't get the in the cloud part of it to work - I decided to open a ticket with support. At this point, I am not expecting great things to happen.

I guess it's a good thing that my experience did not live up to my expectations. I got the answer that I needed, and I didn't even have to wait a fortnight!

Update: Yay! Microsoft has released an official KB article for my issue. KB2879695: "Unknown Error occurred" error message when you try to create an External Content Type in SharePoint Online by using SharePoint Designer 2013 came out yesterday; interesting timing, indeed. ;-)

I was also surprised when I learned the cause of this problem, and its solution - or at least the workaround. More on the cause in a moment. I think it may surprise you. Here's the workaround for you, in glorious Technicolor.

  1. Find the spdesigner.exe.config file. In my case this was in C:\Program Files\Microsoft Office\Office15 because I am using the 64-bit version. You're all using 64-bit now, right? Right?
  2. Make a backup of this file, because you never know when you'll want to reverse this fix, like maybe on whatever day SharePoint 2016 finally rolls out.
  3. Add the runtime element below, so that your file looks like the following code, then save it.
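The code block from the original post appears to have been lost in migration. What follows is a sketch of the runtime element reconstructed from memory of KB2879695; the assemblyIdentity attributes (the assembly name and publicKeyToken) are assumptions here, so verify them against the KB article before relying on this.

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- Assumption: the BCS assembly identity below should be checked
           against KB2879695 before use. -->
      <dependentAssembly>
        <assemblyIdentity name="Microsoft.BusinessData"
                          publicKeyToken="71e9bce111e9429c"
                          culture="neutral" />
        <!-- Redirect the cloud's 16.x version back to the on-prem 15.x -->
        <bindingRedirect oldVersion="16.0.0.0" newVersion="15.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```

If your spdesigner.exe.config already contains a runtime or assemblyBinding element, merge the dependentAssembly entry into the existing one rather than adding a duplicate.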

    And that's it. Just restart SharePoint Designer and it will magically work.

So what's going on here? Basically, we're telling SPD that if it gets a reference to a version 16.* assembly, it should use version 15.* instead. It seems that the Office 365 team has incremented the version number for SharePoint Online from 15 to 16 for some reason. That's why the BCS connection works fine if you are pointing at your on-premises SharePoint farm.

Do they have a time machine into the not-so-distant future? Are they pilot testing the super-secret-squirrel beta version? Was somebody just a very bad typist working with an equally bad set of software testers? As Fox Mulder once said, "The truth is out there." But it turns out that it really isn't so exciting. I'm told that the reason this was done is to help differentiate the build running in Office 365 from the on-premises version.

I'll defer judgment for now on whether this was a particularly wise decision on Microsoft's part, but I encourage you to leave your opinions in the comments. Let's just say that their versioning change has had some unintended consequences, and my error was one of them.

And of course, if you think this was a particularly brilliant piece of detective work and want us to help on your next SharePoint project, you can reach me here.


SQL Azure Makes Database Administration Fun!

If the thought of setting up load balancing thrills you; if you just can't wait to optimize your hardware; if you're desperately eager to crawl around your data center making sure that your machines are all connected to the network; basically, if you're thrilled with the physical plant planning aspects of SQL Server… well… this is not the blog post for you. Go look at funny pictures of cats instead.

Okay, now that they're gone…

SQL Azure, a highly available and scalable cloud database service built on SQL Server technologies, separates the logical database from the physical hardware.  Your database becomes a Platonic ideal, floating in the astral plane of ideas. Yes, it physically lives somewhere, but aside from the proximity of your data center (and therefore how much latency you have to deal with), you have no idea where. More importantly, you don't care. You can focus your attention on the data itself, on optimizing it, not on physical server considerations.

Issues to Consider

Sure, there are quasi-physical issues you still have to consider.  For one, your cost will largely be determined by storage and transfer levels, and there's an upper limit to how much data you can have at all.

Also, performance degrades as size increases, to the point where some have found it makes more sense for their applications if they divide a 150 GB database (the maximum size allowed by SQL Azure) into three 50 GB databases. [1]
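The app-layer bookkeeping that a split like this forces on you can be sketched in a few lines. This is a hypothetical modulo-on-ID scheme, not anything SQL Azure provides for you, and the server names and databases are made up:

```python
# Sketch only: app-layer shard routing for one logical database split
# across three physical SQL Azure databases. Connection strings are
# hypothetical placeholders.
SHARDS = [
    "Server=tcp:shard0.database.windows.net;Database=AppDb0;",
    "Server=tcp:shard1.database.windows.net;Database=AppDb1;",
    "Server=tcp:shard2.database.windows.net;Database=AppDb2;",
]

def shard_for(record_id: int) -> str:
    """Pick which database holds a record.

    Every read AND write must route through this same function, because
    SQL Azure cannot join across databases: once a record lands in a
    shard, the app has to remember how to find it again.
    """
    return SHARDS[record_id % len(SHARDS)]
```

The point is less the modulo arithmetic than the discipline: any query that used to be a single JOIN now has to be decomposed at the application layer.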

A Different Mindset

It’s not just that you don't own the box; the box won't even do things you'd expect it to do if it were really a SQL Server.  When I used to maintain an application on shared SQL Server hosting, I still had the commands available to run a backup, because it was still SQL Server.

But if I tried it, I wouldn't actually get a backup, because the backup would try to write to the physical hardware that the server was on, and as a mere client of the hosted SQL Service, I had no access to that. 

SQL Azure has solved that problem by just getting rid of the backup command.  You can sync the data with a local SQL Server of your own. You can write queries that extract the data, and you can access it via your own SSMS client.

But you cannot actually do BACKUP (or RESTORE, for that matter.)  You don't have to manage the size of your logs (in fact, you can't.)  Your data lives in a world where the file system behind it is not just inaccessible, it’s invisible.

Is SQL Azure Mission Critical?

This can be a little scary.  It's hard to let go of the idea that your data is a thing: you can’t look at the physical object that houses it, you can't control the load balancing [2], and there's little you can do to improve uptime… I'll admit it, this would bother me if I were running mission-critical apps with highly sensitive proprietary data on SQL Azure.

I do, however, think that if I were doing that, someone would need to check my blood sugar for me [3], because I can't see any reason why any rational person would ever want to run a mission-critical app with highly sensitive data on SQL Azure.  That's not what it's for. [4]

I mean, I suppose you can.  Microsoft promises 5 9's of uptime and will restore your stuff from their backups if anything goes down.  You can purchase two Azure "servers", and Microsoft will fail them over to each other automatically.  You can run your own backups by syncing to a local copy. And your data's encrypted every time it goes anywhere, so you probably could run Very Important Apps on Azure if you wanted to. 

But odds are, if you actually have Very Important Apps to run, you can afford to buy on-premises SQL Server, or at least virtual SQL Server hosting on your very own VM.  Also, odds are, your Very Important App probably does not want to deal with potential latency issues that you have absolutely no control over whatsoever.

What SQL Azure Is Really For

SQL Azure democratizes web-capable relational databases, so that small businesses and hobbyists can afford them.  Access was never particularly web-capable (unless you're talking about Access Services on SharePoint… but if you have SharePoint, generally speaking, you have SQL Server), and MySQL, like most open source software, is harder to use for most people because it's created to the specifications of the highly technical folks who are creating it in their spare time for fun.

SQL Azure also reduces your on-premises SQL Server footprint, so you can cut down the amount of hardware you need to maintain and the amount of time your DBAs need to spend pretending to be network admins. 

Additionally, like all cloud services, SQL Azure will probably stay up during a problem that affects your own data centers, so you can use it for emergency failover or backup services. While you're bailing your office out after a hurricane, you can at least be confident that your web site was sorta kinda continuing to run and capture data in the background.

So it's a different way to think about your data -- but honestly, I never wanted to be a network admin anyway, so I think I can get used to a world where I might never have to trunc the logs ever again.


1. Actually… is that something normal DBAs ever have to do, or is it just me? You gotta imagine the performance degradation must be impressively awful… because you also cannot join across databases.  So if your data has been split into three db's, you'll need a lot of logic at the app layer to control where you're writing a given record and then remembering where to go back for it when you need it again.

2. This isn't 100% accurate.  If you purchase two SQL Azure "servers", Microsoft will allow them to load balance.  However, SQL Azure still handles the actual load balancing algorithms in the background, and you can't really control them.

3. I am, in fact, hypoglycemic.  So if you ever find me building mission-critical apps with highly sensitive proprietary data with a SQL Azure back end, please give me something to eat.  String cheese is great.  It actually ought to have protein in it; the idea that hypoglycemics need a candy bar to bring up their blood sugar is sort of similar to the idea of giving a guy with a hangover a screwdriver, in that maybe it will solve the immediate problem but in the long run it'll probably make things worse.

4. I'm going to be a big Star Trek geek here: don't give a Vulcan a job as a stand-up comic, don't employ a Klingon as a starship counselor, don't have a Q sitting around reading numbers to you off a screen, and don't use SQL Azure for what it's not good for.  When something is very valuable doing one thing, don't make it do a different thing it's really no good at.