By Richard Waters in San Francisco
Published: March 25 2009 20:16 | Last updated: March 25 2009 20:16
A revolution that is sweeping through corporate data centres is the untold secret of modern business. The information factories of the digital age are going through a transition that is every bit as significant as the advent of the moving assembly line was to manufacturing nearly a century ago.
Companies that master the new techniques of information processing, or make the right bets about when to hand over control of that operation to someone else with greater skills, stand to reap significant benefits in the form of lower costs and greater usability of their data. But this all comes with significant risks: not only of falling behind as information processing enters a new industrial-scale era but, conversely, of losing control of one of a business’s key assets – its information.
The shift is happening largely out of view. Guarded by the priesthood of the IT department and locked away from prying eyes for security reasons, data centres operate beyond the average business manager’s realm of consciousness. The places where the inner workings of business are conducted – where invoices are processed, transactions recorded and corporate secrets stored – are often taken for granted.
Every now and then, though, something happens to shine a light on this arcane world. Such a moment came last week, with signs of a realignment among some of the world’s leading technology concerns.
Cisco Systems, the world’s biggest maker of networking equipment, said it would start selling servers, the back-room machines that are the workhorses of corporate computing, setting up a showdown with Hewlett-Packard and IBM. It also emerged that IBM was in the late stages of negotiations to buy Sun Microsystems, bringing together two of the biggest suppliers of servers and other technologies for data centres.
The maturing of the IT industry and a steep slide into recession provided the immediate impetus for these moves. But something else is at work. After a technology era characterised by the rise of the PC, a new centralisation is taking place in computing and the biggest suppliers of technology are being forced to respond. A catchphrase has been coined to describe this new approach: “cloud computing”.
If the past quarter-century was characterised by a decentralisation of computing, with information processing and storage placed on every desktop and laptop, the coming era is set to bring greater consolidation of computing power in “clouds”, or large-scale, distributed computing facilities. Even Microsoft, a company that came to dominate the PC era, is racing to create one of the world’s biggest computing clouds, although it insists this will co-exist with existing forms of personal computing for years to come.
The economies of scale that come from consolidating computing in fewer places, and the availability of fast internet connections that make it easy to tap into this resource, account for the shift. As a result, data centres – whether run by large companies or by internet services groups such as Google – are assuming an increased share of the world’s information processing workload.
There are striking signs of how this reconfiguration is taking shape. According to Rick Rashid, head of research at Microsoft, a handful of internet companies, including his employer as well as Google, Yahoo and Amazon.com, is already buying 20 per cent of the world’s output of servers. The massive new data centres these companies are building harness an amount of computing power that far exceeds anything assembled by private companies before, he adds.
These companies use much of this brute computing power to run their own online services, including internet search and electronic commerce. Increasingly, though, they are also offering their capacity as a substitute for the data processing and storage that takes place on their customers’ machines.
This comes in two forms. The first is finished software delivered over the internet. For instance, rather than store documents and run a word processor on your own PC, you can now just access Google Docs, which mimics many of the functions that Microsoft’s Office software has carried out in the past. In the business world, this approach is known as “software as a service” and involves the provision of corporate applications such as accounting or customer relationship management by companies including Salesforce.com and NetSuite.
Small companies in particular are turning to these services to escape the headache of maintaining their own technology. Patricia Seybold, founder of a small consulting business, is typical of those who no longer see any need to run their own IT: for e-mail, for instance, she relies on Google’s free Gmail service. “It’s much nicer getting something for free than having to pay Microsoft and manage a server,” she says.
The second new form of outsourced computing involves the provision of raw data processing power and storage capacity: companies buy access to someone else’s data centre to boost their own capacity at times of need, or even to replace it altogether. Amazon.com has emerged as the unlikely early leader in this business. More than half the online bookseller’s computing resources are being consumed by other companies, which run their own applications in its data centres, says Werner Vogels, Amazon’s chief technology officer. Customers include the New York Times and Nasdaq.
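The appeal of buying capacity only “at times of need” comes down to simple arithmetic: a company that owns its own servers must provision for its busiest day and pay for idle machines the rest of the time, while a customer of a service like Amazon’s pays only for what it uses. The sketch below illustrates that trade-off; all the figures in it are invented for illustration and do not reflect any provider’s actual prices.

```python
# Illustrative sketch of owning-for-peak versus renting-on-demand.
# All prices and demand figures are hypothetical.

def owned_cost(peak_demand, cost_per_server_per_day, days):
    # Owning means provisioning for the worst day,
    # then paying for idle capacity on every other day.
    return peak_demand * cost_per_server_per_day * days

def rented_cost(daily_demand, rate_per_server_hour):
    # Renting means paying only for the servers actually used each day.
    return sum(d * rate_per_server_hour * 24 for d in daily_demand)

# A month of mostly quiet traffic with a two-day spike (say, a news event).
demand = [10] * 28 + [100, 100]

own = owned_cost(max(demand), cost_per_server_per_day=24.0, days=len(demand))
rent = rented_cost(demand, rate_per_server_hour=1.5)  # a premium hourly rate

print(f"own for peak:   ${own:,.0f}")
print(f"rent on demand: ${rent:,.0f}")
```

Even at a premium hourly rate, renting wins here because the owned fleet sits mostly idle; the calculation reverses when demand is steady and close to peak, which is one reason large companies keep building data centres of their own.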
The same forces that are leading to the aggregation of computing power in the internet services companies are also prompting many big companies to centralise more of their internal computing resources, taking advantage of economies of scale that are bringing down the unit cost of information processing and storage. For instance, General Electric’s data centres now consume about 45 per cent of the company’s IT budget, up from 25 per cent three years ago, as it builds up a more centralised resource that can be used by all of its divisions, says Greg Simpson, GE’s chief technology officer.
While internet companies are creating “public clouds”, many big companies like GE say they are creating private ones of their own, even as they weigh the benefits of shifting some of their computing to the outside services suppliers.
There is a danger, however, in overstating the near-term impact of new developments of this nature. “The ‘cloud’ is definitely hyped right now,” says Dante Malagrino at Cisco. He describes the term as a “fancy way of saying ‘a network of resources’ ”. It is also the case that in IT, there is seldom anything really new under the sun. The centralisation trend is a return to life before the fragmentation brought about by PCs and the proliferation of department-level servers in business to handle tasks such as e-mail. Forerunners of the cloud have gone by names like “utility computing” and “on-demand computing”.
Sweeping visions of this kind are easy to lay out yet take years to unfold. Companies do not abandon their earlier investments in technology but layer new technologies on top of old, creating a patchwork of information architectures. Most data centres resemble archaeological digs. Nor will change happen overnight. Companies are deeply cautious about how they handle their information assets and do not experiment with new IT systems lightly. Concerns about security and reliability will act as a drag for years.
“Cloud computing” has become useful shorthand for a trend that has shown signs of accelerating as software-as-a-service and the outsourcing of raw computing power are adopted more widely. Yet there are reasons why some computing will continue to take place locally that go beyond a desire by users to keep more direct control of their own data. For a start, technological advances mean many companies will find they can come close to matching the scale economies of giant internet service companies, says Russ Daniels, chief technology officer of cloud computing at Hewlett-Packard. “There are fewer barriers to entry than there were two years ago,” he says.
Much will depend on how companies use the new resources – and how they meet the rising expectations of customers who have been taught to expect instant gratification by modern internet services. “All the things Google and Amazon do so well, many of our clients need to do themselves,” says Irving Wladawsky-Berger, a former IBM executive who led that company’s early internet efforts. “In order to be able to scale to deliver large-volume services, you need an industrialised approach.”
The idea of having to compete with Google, probably the most advanced engineering force on the planet, is likely to fill the average IT manager with dread. Yet the world of centrally delivered services has room for many players, says Mr Wladawsky-Berger. “Google is at the high end of scalability – not many companies will do that – but the world is full of other services.”
Being able to tap into massive data storage and processing power at low cost – and, equally important, having access to it at the drop of a hat rather than waiting for an IT department to procure and provide it – could also add a new level of intelligence to how companies deal with their customers. By analysing everything they know about a customer and relating this to what they know about other customers’ behaviour, companies should be able to make smarter assessments in real time, says Mr Daniels at Hewlett-Packard. “You can make better recommendations to customers because you have better insights,” he says.
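The kind of insight Mr Daniels describes, relating what a company knows about one customer to the behaviour of others, can be sketched in a few lines. The example below uses a simple overlap measure (Jaccard similarity) over purchase histories to suggest what a customer’s closest peer has bought; the customers and products are invented, and real systems use far richer models at far larger scale.

```python
# Minimal sketch of peer-based recommendation: suggest to one customer
# the items bought by the most similar other customer.
# Customer names and purchase data are invented for illustration.

def jaccard(a, b):
    """Overlap between two purchase histories, from 0 (none) to 1 (identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(target, histories):
    """Return items bought by the customer most similar to `target`
    that `target` has not bought yet."""
    best = max((c for c in histories if c != target),
               key=lambda c: jaccard(histories[target], histories[c]))
    return sorted(set(histories[best]) - set(histories[target]))

histories = {
    "alice": ["router", "switch", "firewall"],
    "bob":   ["router", "switch", "server"],
    "carol": ["printer", "scanner"],
}
print(recommend("alice", histories))  # bob is alice's closest peer
```

What cheap, on-demand computing changes is not the algorithm but the economics: comparing every customer against every other in real time becomes affordable when capacity can be rented by the hour.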
For managers, that same plentiful supply of IT could make it economical to try out new business ideas that would have fallen by the wayside before. “How many ideas do business managers have where they say, ‘If only we could try this out’?” asks Pat Kerpan, chief technology officer of Cohesive Flexible Technologies, a cloud company that assembles automated software.
Ultimately, the effects of this greater centralisation of computing power on business and society are hard to predict, says Microsoft’s Mr Rashid. Every shift such as this to a new computing architecture is accompanied by predictions that derive merely from past experience, he says: for instance, the idea that users will use word processing services “in the cloud” rather than software on their own PCs. Yet the most far-reaching effects of new technology are normally ones that were not anticipated.
According to Mr Rashid, that might include the ability to amass much larger volumes of data on scientific topics and subject them to deep analysis, potentially yielding advances in science that were impossible before.
As with many aspects of cloud computing, such visions are still more dream than reality. With economic as well as technological forces driving the new centralisation, though, they could take shape not only quickly but in surprising ways.
Behind the rise of cloud computing lies a process revolution that has been taking place on the information production line. It has been compared to the impact on the automotive industry of Toyota’s manufacturing system, which brought a step change in quality and a reduction in costs.
“The typical IT organisation is used to a highly customised way of doing things,” says Russ Daniels, chief technology officer of cloud computing at Hewlett-Packard. Adopting more standardised systems “will turn IT into more of a manufacturing process than an art”, he adds.
In turn, that could lead companies to decide they no longer want to invest in staying on the cutting edge of the information processing business. The danger, says Mr Daniels, is that this could leave companies without the skills needed to understand and direct their IT.
The catalyst for this new industrialisation of IT has been a simple but powerful idea: unchaining computing tasks from the physical machines on which they take place. Until recently, companies bought a new server each time they added a new application. The need to commission, house, maintain and power all those servers has become one of IT’s biggest costs.
The antidote to this has been a technology known as “virtualisation”, which makes it possible to run more than one application on each machine. In essence, each application is tricked by the virtualisation software into thinking it is running on its own dedicated server: many new “virtual machines” can exist on a smaller number of actual servers.
Breaking the link between a computing task and the hardware of individual computers not only leads to more efficient use of capacity: virtual machines can also be shifted between servers while they are running, or even between data centres, with no interruption.
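The efficiency gain from consolidation can be seen in a toy model. The sketch below is not virtualisation software; it simply packs lightly loaded applications, expressed as fractions of one physical server’s capacity, onto as few hosts as a first-fit rule allows, showing how one-server-per-application fleets shrink.

```python
# Toy illustration of consolidation: applications that once needed a server
# each are packed, as virtual machines, onto far fewer physical hosts.
# Uses first-fit bin packing by CPU share; the loads are invented.

def pack(vm_loads, host_capacity=1.0):
    """Assign each VM (a fraction of one host's capacity) to the first
    host with room, opening a new host only when none fits."""
    hosts = []  # each entry is the used capacity of one physical server
    for load in vm_loads:
        for i, used in enumerate(hosts):
            if used + load <= host_capacity:
                hosts[i] = used + load
                break
        else:
            hosts.append(load)
    return hosts

# Twelve lightly loaded applications: one server each before virtualisation.
vms = [0.15, 0.30, 0.10, 0.25, 0.20, 0.05, 0.35, 0.10, 0.15, 0.20, 0.25, 0.10]
hosts = pack(vms)
print(f"{len(vms)} applications on {len(hosts)} physical servers")
```

Real hypervisors must also balance memory, storage and network traffic, and leave headroom for spikes, but the basic arithmetic, many small workloads sharing a few large machines, is the same.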
This virtualisation trend has taken hold quickly over the past two to three years. A year ago, says Greg Simpson, chief technology officer at General Electric, his company bought a new server in 85 per cent of the instances when it had a new computing task to handle: by the end of last year, that proportion had fallen to 50 per cent, with “virtual machines” making up the difference.
It is a short step from there to shifting some computing tasks from a company’s own computers to those of an external service provider.
Copyright The Financial Times Limited 2009