When Cloud Computing Doesn’t Make Sense

Clouds, clouds, clouds. Everyone talks about Google-style cloud computing — software as services off in the Internet “cloud” — as the future.


But while cloud computing is a marketing triumph, new research from McKinsey & Company asserts that trying to adopt the cloud model would be a money-losing mistake for most large corporations. The research is being presented at a symposium on Wednesday afternoon, sponsored by the Uptime Institute, a research and advisory organization that focuses on improving the efficiency of data centers.

The McKinsey study, “Clearing the Air on Cloud Computing,” concludes that outsourcing a typical corporate data center to a cloud service would more than double the cost. The study uses Amazon.com’s Web services offering as its benchmark price for outsourced cloud computing, since Amazon’s service is the best known and its prices are published. On that basis, according to McKinsey, the total cost of the data center functions would be $366 a month per unit of computing output, compared with $150 a month for the conventional data center.
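For concreteness, here is a back-of-the-envelope sketch of those per-unit figures; the annualized difference is simple arithmetic from the numbers above, not an additional McKinsey figure.

```python
# Per-unit figures cited from the McKinsey study: outsourcing to a cloud
# service vs. running a conventional corporate data center.
CLOUD_PER_MONTH = 366      # $ per unit of computing output, per month
IN_HOUSE_PER_MONTH = 150   # $ per unit, conventional data center

ratio = CLOUD_PER_MONTH / IN_HOUSE_PER_MONTH
extra_per_year = (CLOUD_PER_MONTH - IN_HOUSE_PER_MONTH) * 12

print(f"cloud / in-house cost ratio: {ratio:.2f}x")   # ~2.44x, i.e. "more than double"
print(f"extra cost per unit per year: ${extra_per_year:,}")  # $2,592
```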

“The industry has assumed the financial benefits of cloud computing and, in our view, that’s a faulty assumption,” said Will Forrest, a principal at McKinsey, who led the study.

Owning the hardware, McKinsey states, is actually cost-effective for most corporations when the depreciation write-offs for tax purposes are included. And the labor savings from moving to the cloud model have been greatly exaggerated, Mr. Forrest says. The care and feeding of a company’s software, regardless of where it’s hosted, and providing help to users both remain labor-intensive endeavors.

Clouds, Mr. Forrest notes, can make a lot of sense for small and medium-sized companies, typically with revenue of $500 million or less.

Instead of chasing cloudy visions, McKinsey suggests, corporate technology managers should focus mainly on adopting one building-block technology of the cloud model: virtualization. Virtualization allows server computers to juggle more software tasks, increasing utilization and reducing capital and energy costs.

The average server utilization in a data center, according to McKinsey, is 10 percent. That can be fairly easily increased to 18 percent, the consulting firm says, by adopting virtualization software (EMC’s VMware is the leading vendor). With more aggressive adoption programs, servers in corporate data centers can reach up to 35 percent utilization, McKinsey said.
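To put those utilization figures in perspective, here is a rough consolidation sketch; the 1,000-server fleet is a hypothetical example, not a number from the study.

```python
# Back-of-the-envelope consolidation math for the utilization figures
# cited above (10% typical, 18% with basic virtualization, 35% with
# aggressive adoption). The 1,000-server baseline fleet is hypothetical.
import math

BASELINE_SERVERS = 1000
BASELINE_UTILIZATION = 0.10   # 10% average utilization today

# The total useful work stays constant; only how densely it is packed changes.
workload = BASELINE_SERVERS * BASELINE_UTILIZATION

for target in (0.18, 0.35):
    servers_needed = math.ceil(workload / target)
    print(f"at {target:.0%} utilization: ~{servers_needed} servers "
          f"({BASELINE_SERVERS - servers_needed} fewer)")
```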

“We should focus on things we know work now, and virtualization works,” said Kenneth Brill, executive director of the Uptime Institute.

Comments are no longer being accepted.

How one-sided can you get? It’s a study by McKinsey, a firm whose bread is buttered by huge companies with expensive, inefficient data centers to justify … being presented at a conference sponsored by an organization that gets its fees from data center operators, including Managed Services Providers – the other main competitor to cloud computing. So, uh, sure they’d say this. They sound worried.

Chris Dunne – MISight.Net April 15, 2009 · 11:05 am

Is the use of virtualisation just a way to create your own computing ‘Cloud’? There seems to be lots of talk of creating ‘Private Compute Clouds’; it would be interesting to see how the cost of a traditional data centre compares to a ‘Private Compute Cloud’ model.

I do agree that the pricing for current Cloud services is restrictive, but for smaller organisations that do not have the scale to make large investments in high-availability computing in-house, a good Cloud Computing Service is still the perfect answer.

Microsoft’s studies have pointed earlier to a similar concern over the price of cooling data centers. This study doesn’t reveal anything new, but it does put numbers on a snapshot in time of the cloud computing bubble.

ADAPTATION

The Google folks,

I believe’ll,

Change their tune

To “Do Evil.”

McKinsey is completely biased and desperate to hang onto their highly profitable billable hour model. There’s very little money to be made by them with cloud computing, ergo the “research”.

The game is up guys, your pricey MBAs are the equivalent of pablum.

This article only compares the hardware costs of on-premise management versus in the cloud, but a lot of the benefits of cloud computing come from the other areas, underneath the covers. I’m talking about redundant systems, disaster recovery, backup and failover, systems management and tuning, patching, security assessments, etc., etc. When you add all of those together, the benefits of the cloud far outweigh having those same systems on-premise. My 2 cents.

Surprising. Companies doing $500 million may want to think about their own infrastructure … is the study a joke?

To #5. McKinsey does not operate on the billable hour model. It charges clients a flat monthly fee for services depending on the size of the team.

Were the applications that were “outsourced” to the Cloud appropriately enhanced to take advantage of the economics and scale of the new cloud model? Such techniques as multi-tenancy and pay-as-you-go typically require quite a bit of redesign if not already baked into the application from the start. Not taking advantage of these essential Cloud capabilities renders this an apples-to-oranges comparison, and would significantly affect the results of the study.

Mike Puterbaugh – www.stratavia.com April 15, 2009 · 3:16 pm

Whether or not the research presented in the article is perceived to be biased, the idea of cloud/elastic/dynamic computing is here to stay, with all of the major IT systems management vendors tripping over themselves to bring ‘cloud-ready’ solutions to market.
Beyond the hype, the goal of many IT departments is to become truly business-driven, to respond to the needs of the business, as opposed to creating ‘one size fits all’ types of IT services. Many of our customers have already deployed (or are in the process of planning for) “internal clouds,” wherein IT is a shared service provider, allowing its ‘customers’ to use self-service portals to requisition pre-approved servers, application stacks, etc. These standards are then provisioned and maintained, largely via automation, allowing IT to become as streamlined as possible.

All of those years of buying more hardware and software than they ever needed are finally going to pay off for the large, complex organizations that require that level of computing.

Cloud computing might be likened to leasing a car. Sometimes, leasing is cost-effective. More often, for most people, it is not. There are potential issues with security in cloud computing. Also, you are dependent on the continued viability of X company to remain in business in a tough economy while also providing access to your data whenever you want it. In cloud computing, you are renting resources from someone else, and it is almost always more expensive over the long term to rent a resource than to purchase it yourself. You also need a big, fat internet pipe to move data back and forth effectively between you and the cloud.

Of course, the many cloud computing vendor proponents and their VCs (who seem to be represented in some of the previous comments here) don’t want you to think about these issues, as it could ruin their end game.

As to virtualization, this is something that has existed in the IBM mainframe world (Logical Partitions) for at least the last 30 years. It doesn’t matter how many virtual images you have or create, you still have only 100% of the root server hardware to work with. Virtual images are used for testing or for workload segregation, but creating one or more images does not give you more physical resources than you had originally. It just allows you to divide what you have. Furthermore, each image you create introduces extra overhead to support it, which comes out of the 100% physical resources that you had originally.

Part of the problem with the study is that it’s based on the Amazon model of Cloud… so-called Infrastructure-as-a-Service. The analysis may be correct for Infrastructure-as-a-Service – some apps will be more cost-effective in the data center and others will be more cost-effective with an IaaS provider.

But Platform-as-a-Service cloud offerings (e.g., Microsoft’s Windows Azure service) probably provide cost savings to a much higher proportion of apps — you don’t have to buy licenses and manage the OS, the database, the server software, the load balancing, etc. — you just manage your application code. That’s a big difference. For technical folks, it’s the difference between a machine instance and an application instance.

Regardless of whose numbers you do or don’t trust on Cloud Computing, it will certainly mean less work for high-quality IT pros in the Enterprise, as they become an expensive and seemingly unneeded accessory to these outsourced services. It will also mean yet a further reduction in the tiny amount of desktop support that many firms offer now for these shiny new wonder-apps. The bottom line may look good for the IT budget, but overall productivity will go down even further, as no one on site really has expertise or is able to provide contextual training on the apps. Think support at your firm is bad now? Just wait.
Why was it we moved away from terminals and mainframes to desktop computers and local servers in the first place?

I think the McKinsey study is on target for large firms that have the money to invest in creating their own private clouds. But we should remember that paradigm-shifting innovation rarely is born within the Global 2000 – they got to where they are by succeeding under the old models. I believe there will be a wave of newer firms that will fully take advantage of the new computing delivery models (cloud-hosted desktops, on-demand scalable computing resources) and will be able to achieve a level of profitability unknown to firms that are spending upwards of 10% of their revenue on providing computing for their employees. The model here is electricity – in the early days of electricity there were no standards and firms generated their own. Eventually, standards were adopted and the economies of scale were realized to such an extent that only a few firms in the world generate their own electricity for their own consumption. I think that this is the direction we are heading with computing. All of this is enabled by finally having a decent computing infrastructure – a.k.a. the Internet.

Balaji Sowmyanarayan April 15, 2009 · 8:23 pm

To cloud or not to cloud? The Hobson’s choice of either promoting mediocrity within the organization or supporting the mediocrity of the cloud supplier.
If an organization goes by the report, it is re-inventing the Cloud wheel using the virtualization arcs and spokes.
Big companies embracing the cloud will only create innovation-stalling oligopolies. Highly capital-intensive infrastructure controlled by a handful of operators is a sure path toward less innovation.

If the biggies don’t adopt cloud, that is the biggest service they can do to sustain innovation in the space.

To #1 and #5

McKinsey doesn’t make its dollars by selling IT solutions. As a former partner at an IT services firm, I can tell you that McKinsey often recommends solutions that minimize IT spending — sometimes leaving others to clean up their mess later on because their solutions are unrealistic about what it takes to get IT done. They don’t push spending on information technology and I doubt that they have an agenda.

In this case, McKinsey is simply giving good and logical advice to their primary customers — large corporations that already have significant scale. Note that the study recognizes that cloud computing may make sense for middle and small sized companies. One solution doesn’t fit everyone.

With efficient utilization of virtualization, cloud computing will become a non-issue. Cloud computing is just another passing phase. We had SaaS, Virtualization, Web 2.0, SOA, etc. This is just another techno-centric acronym for the industry to rake in $$$.
At the end of the day, cloud computing is not for everyone; that’s the message behind this article. It’s highly overrated and, more importantly, the risks can outweigh the benefits.

Alani Kuye

This study might not be the most thorough, but it is a very good start at providing tools to help decision makers appreciate what Clouds are.

As an afterthought, the Cloud is more like taking a hotel room: you use the facility but it never belongs to you, you cannot personalize it the way you want, it’s expensive in the long run, it’s very convenient for short-term usage, it’s always clean and tidy, but it’s never yours. Therefore, if you only live in a hotel room, you cannot express yourself (your business identity), nor can you develop your skills (your core business). As with every technology, there is a specific usage for each one, and the Cloud has its own, just like a hotel room makes sense in some situations (temporary usage)… but that’s my own view.

At Joyent, our cloud achieves CPU utilization rates of about 50% and RAM utilization rates of above 70%.

Server virtualization alone just does not get you there. You need to manage the entire application life-cycle and thus virtualize the whole data-center. Joyent’s software does that.

McKinsey misses several points

1 – Agility and flexibility matter more than cost savings.

People hire McKinsey to help them be innovative.

Cloud Computing increases the pace of innovation within organizations because it reduces the cost of experimenting. New projects no longer have to clear a massive ROI hurdle that is driven by the cost of racking and stacking expensive server hardware.

If the marketing guy wants to try out a new type of blog or an interactive site for a contest, he isn’t going to have to fight an IT department that is worried about bringing down the email system or the G/L.

2 – Hybrid public / private clouds will dominate.

What happens when IT departments can buy the software that cloud companies like Joyent use to run their clouds? We’ve only just started to sell these solutions, but the early results are very impressive.

Flexibility, agility, massive utilization jumps and secure bursting onto public clouds when needed.

Rod Boothby

Cloud computing will have lower costs as economies of scale kick in, as they did for nearly all technologies. McKinsey partners must be having a hard time meeting their annual bonuses if they have not factored this basic assumption into their cost projections. Cloud computing just converts this to a mass infrastructure from the present scenario, where you pay annual licenses for software that you use for less than 60% of the day, and hardware that you find obsolete in 3-4 years, which of course gives accountants a reason to help you with depreciation and tax benefits. Renting a computer in the sky is simpler – and you would not need any consultant to advise you on what configuration you need.

McKinsey has deep ties to the outsourcing industry in India, from their seminal paper in 1999, to the first concept knowledge center that helped start it, to their alumni across the outsourcing sector, who sustain a mutually symbiotic relationship, particularly in business research. Cloud computing actually helps with virtual teams – no need for server farms and IT bureaucracies – and Indian outsourcing can actually reduce a lot of costs along with American direct users. The intermediaries and consultants would be affected the most.

Indeed, I am speaking at Cloud Slam 09, precisely on how cloud computing can help lower the digital divide by giving high-powered computing to anyone with a thin-shell laptop and a browser. Developing countries need access to HPC to better plan their resources and growth in an environmentally optimized manner.

//www.decisionstats.com

McKinsey’s report is definitely a biased perspective. They, like many overpriced firms, are trying to justify their existence.

Likewise, I think “The Big Switch” is more on point in that many companies of all sizes have IT departments that are larger than needed. The cloud model reduces the amount of overhead in people needed.

Come on, folks, get real.

Cloud computing is the new marketing buzzword everyone will get tired of hearing in a year.

It’s all about control and the lack thereof once you hand off your IP to another company. Not very likely…

You bunch of sheep…

Cloud computing is just another term for “shared resources”; you’ve been able to do this for years on Unix systems.

Hi,

I apologize for what I’m saying, but this is a very perfunctory view of cloud computing. I’m sure there were many who said things like this when the radio, the television or even the computer was invented.

Cloud computing is an approach to computing technology that makes it possible for the IT industry to take another step forward. You’ll see that very soon it will not be possible to use standard classic servers to host any kind of business-class applications.

I would bet that the NYTimes, for example, is not hosted on a single server and uses some kind of clustered hosting technology.

— Dimitar
//www.fuscan.com

Amazon.com is just a wholesale provider of resources in a monopoly position right now. Their prices will have to adjust when competitors are operational.

The price per year assumes someone will completely eliminate hardware acquisition and replace it with virtual resources from AWS (Amazon Web Services).

This is NOT the cloud business case. The business case is to supplement, during peak demand over a short period of time, the physical resources of the locally operated private cloud.

In this case, we should consider the annual cost of the data center ($150 per unit of compute power per month) as the exact cost of a private cloud. What we SHOULD compare – assuming that we need double the compute power for one month a year – is the cost of incrementally buying hardware and using it only during peaks versus renting from Amazon for those times.

Scenario A: we need more resources for less than a year, say a peak of one month’s duration.

Buying: $150 × 12 = $1,800 per year per unit of compute power, of which the utilization is 1/12 ≈ 8%.
Renting from Amazon.com: $366 for one month only, 100% utilization.

Savings: $1,800 − $366 = $1,434 per year, or about 80% cost savings.

Scenario B: assuming the peak is 2 months over 1 year and doing the same calculations, the savings are $1,068 per year per unit of computing, or 59% cost savings.

Based on McKinsey’s cost data, it makes sense to rent peak-demand AWS resources for at most $1,800/$366 ≈ 4.9 months in one year. More than that, and we are better off buying our own private cloud.
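A minimal sketch of the buy-versus-rent arithmetic above, using the $150 and $366 per-unit monthly figures as given; the function and variable names are illustrative only.

```python
# Buy-vs-rent sketch for peak capacity, using the article's per-unit figures:
# $150/month for an owned data-center unit, $366/month for the same capacity on AWS.
OWN_PER_MONTH = 150
AWS_PER_MONTH = 366

def peak_savings(peak_months: int) -> tuple[float, float]:
    """Savings from renting the extra unit only during the peak instead of owning it all year."""
    buy_cost = OWN_PER_MONTH * 12            # own the extra unit for the full year
    rent_cost = AWS_PER_MONTH * peak_months  # rent it only for the peak months
    savings = buy_cost - rent_cost
    return savings, savings / buy_cost

for months in (1, 2):                        # Scenarios A and B above
    dollars, pct = peak_savings(months)
    print(f"{months}-month peak: save ${dollars:,.0f}/yr ({pct:.0%})")

# Break-even: renting stays cheaper while AWS_PER_MONTH * months < OWN_PER_MONTH * 12
print(f"break-even: {OWN_PER_MONTH * 12 / AWS_PER_MONTH:.1f} months")
```

Running it reproduces the numbers above: $1,434 (80%) for a one-month peak, $1,068 (59%) for two months, and a break-even of about 4.9 months.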