By Jean-Paul Bergeaux
Chief Technology Officer, SwishData
To summarize my last blog post: it's clear that public cloud computing is going to be revolutionary for small businesses and individuals, cutting their costs and giving them functionality that rivals the most developed and mature IT departments. That market segment is huge and will be profitable for cloud computing providers, considering that some 95 to 97 percent of companies fall into this category. However, the allure of savings for large organizations that can already use economies of scale to deploy modern virtualized data centers (i.e., private clouds) is questionable at best.
With that said, we're now a couple of years into the cloud-first mandate that pushed agencies to look at cloud solutions. The assumption was that the mandate would push agencies toward public clouds or software as a service (SaaS) so the government could grab its share of the promised pot of savings. However, most agencies serve users not in the hundreds but in the thousands or tens of thousands, sometimes many more. They fall squarely into the group that will have a tough time finding savings in SaaS offerings.
In addition to not delivering the promised savings, most SaaS services are still immature and introduce several problems, two of which collide with other mandates and put agencies in a bind if they follow the assumed "public" in the cloud mandate. The most notorious problem cloud services have is a lack of security compliance. Google is the main offender here, with GSA, NOAA and now the Department of the Interior unable to comply with security mandates without pulling out of expensive service contracts.
Another looming problem is information assurance. Google is an offender here too, but nearly all SaaS solutions currently have this issue in some form. It will be most telling when a Freedom of Information Act (FOIA) request ends up in court, because some agencies either can't comply at all or can't confirm that they have provided all information pertaining to the request. Most providers (including Google) cannot supply logs and tracking information, which opens these agencies up to accusations of hiding information and deleting records. Again, the only solution is to pull out of the expensive contracts some agencies have signed.
So why was the cloud push initiated? Who knows for sure. It wouldn’t be the first time that government officials didn’t do their homework before making a sweeping decision. It also wouldn’t be the first time that a government official pushed a mandate, law or project for some unknown reason.
Either way, a public cloud push is bad policy. The answer for agencies seems to be private clouds, which can satisfy the cloud mandate and capture the modernization and virtualization savings while still complying with other government mandates. So how do agencies begin that private cloud process and show leadership the light? In two weeks we'll come back to the cloud topic. Next week, we'll talk about Hadoop. See you then!
While you wait, check out these cloud-focused blog posts at the Data Performance Blog.
Posted by Jean-Paul Bergeaux on Jun 12, 2012 at 12:18 PM
By Jean-Paul Bergeaux
Chief Technology Officer, SwishData
Ever heard the story of the fisherman who was so focused on catching a big fish that he missed all the medium and small fish swimming around him? Many public cloud companies have hired highly paid salespeople to go find enterprise customers. Land the big fish. It makes a splash in the news and makes the salesperson lots of money. But there's a problem with this strategy: what about the easy-to-catch fish?
What is a public cloud, really? It's a modern virtualized data center paid for "as a service." Technologically, it's nothing different. Amazon, Google and others offer the same thing that an internal modern virtualized data center could offer, just as OPEX (operating expense) instead of CAPEX (capital expense). The theory is that the economies of scale of those public cloud companies will drive down costs that can be passed on to customers. For large enterprise customers, I take issue with this theory.
There may or may not be some savings to be had for large enterprise customers in the public cloud, but if there is, it won't be much. Here's my version of the cost curve on a per-user basis:

[Figure: cost per user vs. number of users, falling as the user count grows, with a large dot marking the inflection point]
That big dot, in my opinion, sits at about 2,500 users. The cost per user is high when the user count is low and is driven down as the user count grows and economies of scale kick in. Granted, this is my theory. But in my experience architecting solutions, at about 2,500 users you gain the ability to really exploit virtualization in a modern data center; anything above that is more about designing management controls, which don't yield nearly the cost savings that virtualization does.
In fact, most data centers could see an uptick in cost as their user counts get higher and higher. Anyone will tell you that too big can become just as bad as too small. Why do we think for a minute that public cloud companies are going to avoid this while trying to manage mega-data centers?
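The shape argued in the last two paragraphs, per-user cost falling with scale and then rising again as management overhead takes over, can be sketched as a toy model. The `cost_per_user` function and every number in it are my own illustrative assumptions, not figures from this post:

```python
# Toy model of the U-shaped per-user cost curve described above.
# All parameter values are invented for illustration, not measured data.
def cost_per_user(users: int,
                  fixed: float = 1_000_000,  # assumed data-center build-out cost
                  marginal: float = 300,     # assumed per-user hardware/licensing
                  mgmt: float = 0.16) -> float:
    """Fixed costs amortize as users grow, but management overhead
    grows with scale, so cost per user eventually rises again."""
    return fixed / users + marginal + mgmt * users

for n in (100, 500, 2_500, 10_000, 50_000):
    print(f"{n:>6} users: ${cost_per_user(n):,.0f} per user")
```

With these made-up parameters the minimum happens to land near 2,500 users; the point is the shape of the curve, not the specific dollar figures.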
It's crazy to focus on the big fish anyway. There are many smaller companies and organizations out there that will see tremendous cost savings and become hardcore believers in cloud computing as the adoption rate climbs. I looked up census data on companies with fewer than 2,500 employees, and the number was a staggering 97 percent of all companies. That's a cash cow for public cloud companies, and we haven't even talked about individuals, who don't have to be employed to want public cloud services.
In fact, I think it's a mistake not only to neglect smaller companies and government agencies, but even to let the larger IT organizations become customers. It's more likely that they will become disgruntled customers over time, as they realize they aren't saving as much as advertised and have lost control of their own IT, adding security and information assurance problems to boot. Why risk the public black eye Google got when NOAA Cloud Computing Program Manager Stephan Leeb slammed the company for not working with the agency to resolve ongoing security and information assurance issues on NOAA's Gmail contract?
Want to hear more from SwishData? Visit my Data Performance Blog, and follow me on Facebook and Twitter.
Posted by Jean-Paul Bergeaux on Jun 04, 2012 at 12:18 PM