Microsoft Cloud News Round Up

We Told You So: Microsoft Cloud Leads in Gartner Magic Quadrant, Yet Again 

The Gartner Magic Quadrant ranks Microsoft as a “Leader” in the BI and Analytics category for 2017. Gartner highlighted multiple key benefits of using Microsoft’s OneDrive for Business for companies that already use the productivity and collaboration tools offered by Microsoft (e.g. Office 365, SharePoint).  

Our founder and CEO, Thomas Carpe, remarks “Finally business advisors officially recognize what Liquid Mercury Solutions has known for years – that Microsoft’s cloud services are quickly becoming the standard by which all others will be compared. Where Office 365 once provided an advantage to businesses willing to take a chance on the cloud, soon enough it’ll become ubiquitous in business.” 

Microsoft’s optimism and determination seem to promise a growing trend of advancements in this area. Kamal Hathi, Microsoft Power BI general manager, wrote “We’re humbled by this recognition for the innovation we’ve delivered with Microsoft Power BI in the past year … But more importantly, we’re encouraged by the progress we’ve made as a community in executing against the ambitious goal set when Power BI was made generally available only a short time ago in July 2015: Provide BI to more people than ever before, across all roles and disciplines within organizations.” 

Alara Rogers responds, “Microsoft has actually been an industry leader in BI for many years. Excel is the world’s most widely used BI tool by a large margin. Where Microsoft struggles is in communicating to customers about the power and abilities of their platform, especially when we see lots of changes, like the shift from traditional tools such as PowerPivot and SQL Reporting Services to the cloud-based Power BI platform. There’s room for improvement, but things are headed in the right direction.” 

Whatever your opinion, it’s safe to expect even bigger and better collaborative innovations in the years to come. 

Read More: 


Office 365 Takes the Lead Over Traditional Office for the First Time 

This week marks the first time that the number of Office 365 users has exceeded the number of users of traditional versions of Office, making it the clear favorite both among users and at Microsoft. That trend will only continue as Microsoft plans to officially end support for Office 2007 later this fall. 

But what’s made Office 365 such a success? Let’s look at a few benefits. 

  • For starters, no one needs to suffer the frustration of losing files that weren’t backed up because someone was too lazy to keep the laptop charged, thanks to OneDrive for Business and SharePoint Online. 
  • All the relevant applications people use for themselves can be synced across devices. That means next time your computer has a spontaneous meltdown, you can just switch to another one, easy as pie. 
  • Files can be shared with anyone, both inside the company and with vendors, partners, and customers. No need to clutter folks’ email with bulky and potentially risky attachments. 
  • Multiple people can edit a Word document or Excel spreadsheet simultaneously and it updates in real time. It’s so much easier than passing it around the office for review and revision. 
  • Thanks to applications like Delve, everything you need can be found in one place, so say goodbye to having 27 windows open at once. 
  • Best of all, your information is kept safe using the latest encryption, and you can protect your account with more than just a password by using multi-factor authentication such as your phone. 

Office 365 has all the perks you wished Office XP had back in 2002. No wonder the trend of moving everything to the cloud is here to stay, because it works beautifully. 


Microsoft Welcomes Our Robot Overlords 

There has been buzz about Microsoft creating an AI supercomputer, and people in the industry have strong opinions on how this will impact the future. Will it be chaos, like The Terminator come to life? Or will we all fall in love with our AI, the way Joaquin Phoenix’s character did in Her? 

OpenAI, a non-profit AI research organization, has teamed up with Microsoft to make AI accessible to the masses. Azure is the crucial component that will take this idea and cultivate it into a reality. 

Microsoft already has tools within this platform that will assist in this development (e.g. Azure Batch, Azure N-Series Virtual Machines, and Azure Machine Learning), but it is in the process of creating further technology to aid in AI research. 

Some of these advancements have already come to light, including the upcoming Hardware Microservices via Azure. Microsoft aims to make FPGA (field-programmable gate array) processing power accessible to developers sometime next year, creating a cloud that is 100% configurable. There are major perks to having this type of access, including increased speed and functionality. 

What the heck is an FPGA? Thomas Carpe explains, “Simply put, FPGAs are hardware, like your graphics card for example. Unlike special-purpose hardware, they can be programmed and reconfigured as needed using software, potentially including AI. Thus, they’re super-fast, and can facilitate machine learning.” 

This all sounds wonderful, but some think that this conversion to and reliance on AI technology is going too far. World-renowned icons in the tech and science communities have conflicting ideas about what this means for the future of civilization. 

Elon Musk has spoken out against AI, referring to it as “our greatest existential threat” almost three years ago. He’s taken a precautionary role as a member of OpenAI. 

Stephen Hawking made a point to compare the speed of biological evolution versus advancements in AI to show that AI would eventually outgrow us. 

Mark Zuckerberg seems to favor the idea of a world more heavily dependent on AI. He believes it could create vast improvements in everyday scenarios like healthcare and transportation. 

Where do you stand on the subject? Will you embrace AI? How would you like to maximize the cloud with the new capabilities of Hardware Microservices via Azure? Let us know what you think in the comments below.

Learn More: 


Stay Ahead of Emerging Cloud Security Threats  

Recently, a massive cybersecurity attack on Office 365 targeted several Fortune 500 companies. How?! 

Skyhigh Networks explained that the attackers consistently tried variations in a skillfully discreet manner to get into the accounts of “…high value targets, including more than 100,000 failed Office 365 logins from 67 IP addresses and 12 networks. The attempts in all targeted 48 different organizations.” 

Evidence shows the attackers might have already known some employee names and passwords through phishing and tried different combinations of usernames and passwords based on that. 
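The pattern described above – a large number of failed logins spread across many accounts but only a handful of source IPs – is the classic signature of a password-spraying campaign, and it shows up clearly in login audit data. As a rough illustration (the log format, function name, and thresholds here are hypothetical, not from any particular product), a script like this could flag suspicious source addresses:

```python
from collections import defaultdict

# Hypothetical audit-log records: (source_ip, username, login_succeeded).
# In practice these would come from your Office 365 audit logs.
def flag_spraying_ips(events, min_failures=50, min_accounts=10):
    """Flag source IPs with many failed logins spread across many distinct
    accounts -- the signature of a password-spraying campaign."""
    failures = defaultdict(int)   # ip -> count of failed logins
    accounts = defaultdict(set)   # ip -> set of usernames targeted
    for ip, user, ok in events:
        if not ok:
            failures[ip] += 1
            accounts[ip].add(user)
    return sorted(
        ip for ip in failures
        if failures[ip] >= min_failures and len(accounts[ip]) >= min_accounts
    )

# Example: one IP hammering 60 different accounts, plus one ordinary typo.
events = [("10.0.0.9", f"user{i}@example.com", False) for i in range(60)]
events.append(("10.0.0.7", "alice@example.com", False))
print(flag_spraying_ips(events))  # -> ['10.0.0.9']
```

One user fat-fingering a password is normal; one IP failing against dozens of accounts is not, and that asymmetry is what the thresholds capture.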

Your business can be vulnerable too! Do you reuse the same easy password for everything? Do you interact with spam emails? Do you have a basic username-password authentication system? If you answered yes to any of these questions, you need to up your security game. 

Don’t worry, you’re not alone. But that’s exactly how and why something like this could happen to your business soon. Embrace the modern world and get educated on how you can protect your data. 

In recent years, enterprise corporations and SMBs alike have shifted huge amounts of sensitive information to the cloud. Almost 60% of all sensitive corporate data in the cloud is based in Office 365. Additionally, it works on a myriad of devices, which makes it even more appealing to users. 

The downside to this is that it’s also the hackers’ bull’s eye. It is often said “[w]ith great power comes great responsibility” …to protect your data. 

Slawomir Ligier, senior vice president of engineering at Skyhigh elaborates on this. “While companies traditionally have invested extensively in perimeter security, those without a dedicated cloud security solution will lack visibility and control for a growing category of attacks. Enterprise cloud providers secure their infrastructure, but the ultimate responsibility to control access to sensitive data lies with the customer.” 

Thomas Carpe goes on to say, “Many existing security experts, as well as their tools and standards, are seriously behind the times when it comes to incorporating the cloud into their security plans. Where our customers have sensitive data, we must consider not just things like their firewalls or patching Windows, but also whether they’re subscribing to the right mix of cloud services to fully protect themselves.” 

Let that sink in for a moment. 

Protect your business! Now, wanna upgrade your security? Contact Liquid Mercury Solutions today to set yourself up with high quality cloud security and data protection to fit your business needs. 

Read More About It: 


Microsoft Renaming Kiosk Plans to Frontline Worker Plans 

For years, we’ve struggled to explain to our clients what a Kiosk plan is, often calling it the “deskless worker” plan instead of using Microsoft’s preferred name. Now Microsoft seems to be catching on to the longstanding communication gap. This week, they announced a naming change for the K1 plan, which will henceforth be known as… wait for it… the F1 plan! 

What’s the F for? Well, all joking aside (and, yes, we’ve had some good-natured fun at Microsoft’s expense), the F is for Frontline Worker – but it could easily mean Field, Fleet, Factory, First-line, or one of those other words that starts with the same letter. 

Whichever way you spell it or whatever you decide it stands for, the F1 plan is still the cheapest way to get your non-IT, non-administrative employees into Office 365. The price is still the same at $4 a month, and while the plan doesn’t include a copy of Office, it does have email, Skype, and access to OneDrive and SharePoint – which is fine, since a key requirement of the F1 plan is that the user doesn’t have their own PC. The F1 plan is perfect for users who’ll access Office 365 primarily via their smartphone or tablet, and may use a shared computer (kiosk) on occasion. 

So, if just the name is changing, what do Office 365 subscribers need to know? Not a heck of a lot, just keep it in mind when you get your next billing statement. Nothing’s really changed at all, so you’re not getting “F”ed. ;-) 

Error Creating WCF Connection for BCS Content Type in SharePoint Online


I started this week's Super SharePoint Detective Adventure while trying to follow Nick Swan's blog article about creating a BCS External Content Type for CRM 2011. This kind of integration between SharePoint and CRM is something I've been wanting to prove out for our own use for a long time now, and Nick's approach (although it still involves a lot of "glue code") seems like the most reasonable one I've seen to date.

True, his article talks about SharePoint 2010, but he has another one based on 2013 and all the concepts look like they ought to be backward compatible. In fact, everything was going fairly well, until I got to the last step in SharePoint Designer where you actually create the ECT connected to your WCF service. Then, things just wouldn't work - no matter what I tried, SharePoint Designer kept giving me this error.

  • An error occurred while accessing WCF service URL: http://<myStagingGuid>
  • Connection to the WCF service that has connection name BcsTest cannot be established.
    • Unknown Error occurred. Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.

I could see that others online were having this problem too, as there are questions about it posted in a few places, but mysteriously, so far there was no answer:

  • MS Forums: Problem creating external content type from WCF service
  • Stack Overflow: Consume WCF Service in Office365 from window Azure

So, I did some experimentation. People say this works in on-premises SharePoint, so I point SPD at my on-prem SharePoint 2013 farm, and just as they say, everything works fine.

Maybe my types are too complex. After all I am trying to pull a ton of data fields from a CRM Account. I create a much smaller web service with a data structure containing just a string with "Hello world!" and an integer ID of 1234. Still, I cannot create the data connection to this service either.

I spend some time trying to package up my data model into a *.bdcm file, and then do the export from my on-prem farm to SPO. No dice! This just breaks differently, because now when I go back to SP Designer my data connections are still broken, and my options from the SharePoint web UI are too limited to complete the configuration.

After beating my head against the wall all Friday evening - then stressing out all weekend about how I'm supposed to get a good hybrid solution with one part in the cloud and one part on our local network when I can't get the in the cloud part of it to work - I decided to open a ticket with support. At this point, I am not expecting great things to happen.

I guess it's a good thing that my experience did not live up to my expectations. I got the answer that I needed, and I didn't even have to wait a fortnight!

Update: Yay! Microsoft has released an official KB article for my issue. KB2879695 – "'Unknown Error occurred' error message when you try to create an External Content Type in SharePoint Online by using SharePoint Designer 2013" – came out yesterday; interesting timing, indeed. ;-)

I was also surprised when I learned the cause to this problem, and its solution - or at least the workaround. More on the cause in a moment. I think it may surprise you. Here's the workaround for you, in glorious Technicolor.

  1. Find the spdesign.exe.config file. In my case this was in C:\Program Files\Microsoft Office\Office15, because I am using the 64-bit version. You're all using 64-bit now, right? Right?
  2. Make a backup of this file, because you never know when you'll want to reverse this fix, like maybe on whatever day SharePoint 2016 finally rolls out.
  3. Add the runtime element below, so that your file looks like the following code, then save it.
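
The code snippet itself didn't survive the move to this page, but the shape of the fix is an assembly binding redirect inside the config file's <configuration> element, mapping version 16 references back to version 15. The element below is my reconstruction – the assembly identity shown is illustrative, so copy the exact markup from KB2879695 rather than from here:

```xml
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <!-- Illustrative: redirect the SharePoint Online (v16) assembly
         reference back to the on-premises v15 assembly that SPD ships. -->
    <dependentAssembly>
      <assemblyIdentity name="Microsoft.BusinessData"
                        publicKeyToken="71e9bce111e9429c"
                        culture="neutral" />
      <bindingRedirect oldVersion="16.0.0.0" newVersion="15.0.0.0" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>
```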

And that's it. Just restart SharePoint Designer and it will magically work.

So what's going on here? Basically, we're telling SPD that if it gets a reference to a version 16.* assembly, it should use version 15 instead. It seems that the Office 365 team has incremented the version number for SharePoint Online to 16 for some reason. That's why the BCS connection will work fine if you are pointing to your on-premises SharePoint farm.

Do they have a time machine into the not-so-distant future? Are they pilot testing the super-secret-squirrel beta version? Was somebody just a very bad typist working with an equally bad set of software testers? As Fox Mulder once said, "The truth is out there." But it turns out that it really isn't so exciting. I'm told that the reason this was done is to help differentiate the build running in Office 365 from the on-premises version.

I'll defer judgment for now on whether this was a particularly wise decision on Microsoft's part, but I encourage you to leave your opinions in the comments. Let's just say that their workaround has had some unintended consequences, and my error was one of those.

And of course, if you think this was a particularly brilliant piece of detective work and want us to help on your next SharePoint project, you can reach me here.

Further reading / related articles:



SQL Azure Makes Database Administration Fun!

If the thought of setting up load balancing thrills you; if you just can't wait to optimize your hardware; if you're desperately eager to crawl around your data center making sure that your machines are all connected to the network; basically, if you're thrilled with the physical plant planning aspects of SQL Server… well… this is not the blog post for you. Go look at funny pictures of cats instead.

Okay, now that they're gone…

SQL Azure, a highly available and scalable cloud database service built on SQL Server technologies, separates the logical database from the physical hardware.  Your database becomes a Platonic ideal, floating in the astral plane of ideas. Yes, it physically lives somewhere, but aside from the proximity of your data center (and therefore how much latency you have to deal with), you have no idea where. More importantly, you don't care. You can focus your attention on the data itself, on optimizing it, not on physical server considerations.

Issues to Consider

Sure, there are quasi-physical issues you still have to consider.  For one, your cost will largely be determined by storage and transfer levels, and there's an upper limit to how much data you can have at all.

Also, performance degrades as size increases, to the point where some have found it makes more sense for their applications if they divide a 150 GB database (the maximum size allowed by SQL Azure) into three 50 GB databases.1
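
Splitting a logical database like that isn't free, though: as footnote 1 below points out, you can't join across databases, so the application itself has to know which database holds a given record and route every read and write accordingly. A minimal sketch of deterministic app-layer routing (the server and database names are hypothetical placeholders):

```python
# Minimal sketch of app-layer shard routing when one logical database has
# been split across three SQL Azure databases.
SHARDS = [
    "Server=tcp:myserver.database.windows.net;Database=app_shard0",
    "Server=tcp:myserver.database.windows.net;Database=app_shard1",
    "Server=tcp:myserver.database.windows.net;Database=app_shard2",
]

def shard_for(record_id: int) -> str:
    """Map a record ID to a shard deterministically, so the record is always
    written to -- and later read back from -- the same database."""
    return SHARDS[record_id % len(SHARDS)]

# The same ID always routes to the same connection string:
print(shard_for(1234) == shard_for(1234))  # True
```

Modulo routing is the simplest possible scheme, and it only works while the shard count stays fixed – changing len(SHARDS) would silently remap existing records to different databases.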

A Different Mindset

It’s not just that you don't own the box; the box won't even do things you'd expect it would do if it were really a SQL Server.  When I used to maintain an application that was hosted on shared SQL Server hosting, because it was still SQL Server, I had the commands available to run a backup.

But if I tried it, I wouldn't actually get a backup, because the backup would try to write to the physical hardware that the server was on, and as a mere client of the hosted SQL Service, I had no access to that. 

SQL Azure has solved that problem by just getting rid of the backup command.  You can sync the data with a local SQL Server of your own. You can write queries that extract the data and access them via your own SSMS client.

But you cannot actually do BACKUP (or RESTORE, for that matter.)  You don't have to manage the size of your logs (in fact, you can't.)  Your data lives in a world where the file system behind it is not just inaccessible, it’s invisible.

Is SQL Azure Mission Critical?

This can be a little scary.  It's hard to let go of the idea that your data is a thing – that you can't look at the physical object that houses it, that you can't control the load balancing2, and that there's little you can do to improve uptime… I'll admit it, this would bother me if I were running mission-critical apps with highly sensitive proprietary data on them via SQL Azure. 

I do, however, think that if I were doing that, someone would need to check my blood sugar for me3, because I can't see any reason why any rational person would ever want to run a mission-critical app with highly sensitive data on SQL Azure.  That's not what it's for.4

I mean, I suppose you can.  Microsoft promises 5 9's of uptime and will restore your stuff from their backups if anything goes down.  You can purchase two Azure "servers", and Microsoft will fail them over to each other automatically.  You can run your own backups by syncing to a local copy. And your data's encrypted every time it goes anywhere, so you probably could run Very Important Apps on Azure if you wanted to. 

But odds are, if you actually have Very Important Apps to run, you can afford to buy on-premises SQL Server, or at least virtual SQL Server hosting on your very own VM.  Also, odds are, your Very Important App probably does not want to deal with potential latency issues that you have absolutely no control over whatsoever.

What SQL Azure Is Really For

SQL Azure democratizes web-capable relational databases, so that small businesses and hobbyists can afford them.  Access was never particularly web-capable (unless you're talking about Access Services on SharePoint… but if you have SharePoint, generally speaking, you have SQL Server), and MySQL, like most open source software, is harder to use for most people because it's created to the specifications of the highly technical folks who are creating it in their spare time for fun.

SQL Azure also reduces the footprint of your on-premises SQL Server requirements, so you can cut down on the amount of hardware you need to maintain and the amount of time your DBAs need to spend pretending to be network admins. 

Additionally, like all cloud services, SQL Azure will probably be up during a problem that affects your own data centers, so you can use it for emergency failover or backup services. While you're busy bailing out your office after a hurricane, you can at least be confident that your web site was sorta kinda continuing to run and capture data in the background. 

So it's a different way to think about your data -- but honestly, I never wanted to be a network admin anyway, so I think I can get used to a world where I might never have to trunc the logs ever again.


1. Actually… is that something normal DBAs ever have to do, or is it just me? You gotta imagine the performance degradation must be impressively awful… because you also cannot join across databases.  So if your data has been split into three db's, you'll need a lot of logic at the app layer to control where you're writing a given record and then remembering where to go back for it when you need it again.

2. This isn't 100% accurate.  If you purchase two SQL Azure "servers", Microsoft will allow them to load balance.  However, SQL Azure still handles the actual load balancing algorithms in the background, and you can't really control them.

3. I am, in fact, hypoglycemic.  So if you ever find me building mission-critical apps with highly sensitive proprietary data with a SQL Azure back end, please give me something to eat.  String cheese is great.  It actually ought to have protein in it; the idea that hypoglycemics need a candy bar to bring up their blood sugar is sort of similar to the idea of giving a guy with a hangover a screwdriver, in that maybe it will solve the immediate problem but in the long run it'll probably make things worse.

4. I'm going to be a big Star Trek geek here: don't give a Vulcan a job as a stand-up comic, don't employ a Klingon as a starship counselor, don't have a Q sitting around reading numbers to you off a screen, and don't use SQL Azure for what it's not good for.  When something is very valuable doing one thing, don't make it do a different thing it's really no good at.