Nicholas Carr, of IT Doesn’t Matter fame, talks about his new book, The Big Switch: Rewiring the World, from Edison to Google.
Q: What is this big switch you see coming? A: I think we’re at the early stages of a fundamental shift in the nature of computing. It is going from something that people and businesses had to supply locally, through their own machines and their own installed software, to much more of a utility model, where a lot of the computer functions we depend on are supplied from big central utilities over the Internet.
It’s analogous to what happened to mechanical power 100 years ago, when the electric utilities changed the nature of that resource and how businesses and people used it and received it.
Q: How did you come upon the electricity analogy for computing? A: It was pretty clear to me that we were in this kind of shift. I’m interested in the history of technology and had been reading a lot about the technologies of the Industrial Revolution. It struck me that the kind of radical shift that businesses, in particular, had to go through when they decided to close down their waterwheels or steam engines at their factories and trust an outside supplier to provide this essential resource reflected the kind of upheaval that people feel in computing when they begin to rely on and trust outside suppliers to supply another essential resource.
Q: You refer to both electricity and information technologies as general-purpose technologies. Has computing gained that status? A: Yes. General-purpose technology is a term that economists use to describe any technology that can be used for many, many different purposes. They’re very rare, and they’re very important in economic and business history for the simple reason that you can use them so broadly. I think it’s widely acknowledged now that the two most important general-purpose technologies in history really are electricity and computing.
Q: Processing is done in so many different ways and for so many different purposes. Is it really as general as you make out? A: The analogy between electricity and information technology works pretty well at an economic level, I think. When you start looking at a technological level, you see that there are, of course, major differences, and I’m not arguing that IT is like electricity in some fundamental technological way.
The main difference is that IT is extremely modular in a way that electricity wasn’t. With the electric utility, they produced the power, transmitted the power, and then everything on your side of the electric socket was your responsibility. With IT, all of the functions can be considered as individual modules. Raw processing can be done either locally or over the net; data storage, same thing; and all the applications - unlike electricity - can also be supplied either locally or over the grid.
But I do think that, if you break computing down to its essence - which is data processing, data storage, data transmission - that it is very much a general-purpose technology. Similar to the electric grid, you can build all sorts of applications or appliances on top of it to do all sorts of things.
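Carr’s point about modularity can be sketched in code. The idea is that a business process shouldn’t care whether a function such as storage is supplied by a local machine or by a utility over the net. This is a minimal illustration, not anything from the book; the class and endpoint names are invented:

```python
from abc import ABC, abstractmethod

class Storage(ABC):
    """A storage 'module': callers don't care where the bytes live."""
    @abstractmethod
    def put(self, key: str, value: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class LocalStorage(Storage):
    """Data kept on the company's own machinery."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

class UtilityStorage(Storage):
    """Data kept by a remote utility; the endpoint here is hypothetical."""
    def __init__(self, endpoint: str):
        self.endpoint = endpoint  # e.g. "https://storage.example.com"
    def put(self, key, value):
        # A real client would issue an HTTP request to self.endpoint here.
        raise NotImplementedError("network call sketched, not implemented")
    def get(self, key):
        raise NotImplementedError("network call sketched, not implemented")

def archive_invoice(store: Storage, invoice_id: str, body: bytes) -> None:
    # The business logic is identical whichever backend supplies storage.
    store.put(f"invoice/{invoice_id}", body)

backend = LocalStorage()  # swap in UtilityStorage(...) to go "over the grid"
archive_invoice(backend, "42", b"total: 100")
print(backend.get("invoice/42"))  # b'total: 100'
```

Because both backends satisfy the same interface, switching from local supply to utility supply is a one-line change for the caller, which is what makes IT far more interchangeable than mechanical power ever was.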
Q: Which will be more significant in the near future: scientific and engineering breakthroughs or economic forces? A: Ultimately, it’s the economics that really determine what people and companies do. It’s easy to lose sight of that, because it’s exciting to see technological breakthroughs and progress. But businesses are completely economic beasts. It’s going to be the economics of IT, and the central or local supply of IT, that determines how companies think about information technology in the future, and how this new utility industry matures and grows and the ultimate structure that it takes.
Q: What do you recommend to CIOs and CTOs? A: The first thing I recommend is staying aware of these new capabilities that are coming online and not viewing them as threatening. Even though they promise to fundamentally change the nature of IT, they’re not something to be feared, because companies, CIOs and IT professionals now have many more choices, and will continue to have many more choices, as online services get better and the suppliers build their own scale and increase their own sophistication and reliability.
So really this is a trend to be welcomed because suddenly, if you’re a company, you have more options in how you get the IT capabilities you need to operate your business.
Q: Do you think corporate IT departments will shrink? A: I think over the long term they will, and by long term I mean that this is a shift, like the one we saw with electricity, [playing out over] a decade or two, particularly for larger companies.
A lot of the jobs that are inside IT departments today, in fact the majority, are related to maintaining the internal assets - the machinery and the software that runs locally. Over time, those kinds of jobs will move from inside companies to the supplier side.
The IT department may shift more toward pure manager of information or connector of software services to business processes.
Q: Do you think the same thing might be true for some of the vendors? A: Yes, definitely. There are a couple of trends here. One is the central supply of IT - whether it’s raw computing, data storage or applications - which will tend to expand the workforce on the supply side. On the other hand, we’re seeing a fast move to more automated IT services through virtualisation and other types of trends, which will tend to push down the labour requirements. So we have two opposing but very tightly related trends.
Q: Explain the World Wide Computer and its programmability. A: I argue that the World Wide Web is turning into a World Wide Computer, which means that all the pieces of a computer that we used to maintain locally - the data processing chip, the data storage and the applications - can now be assembled from components that lie out on the Internet, and may be supplied by many different companies. In essence, that means that the Internet, like any other computer, becomes programmable.
If you’re an individual at home, you can go to Facebook and program the information flows, and you can manipulate what is in essence a software program to your own needs. You can program the Internet. I think companies now also have the capability to assemble the IT requirements for their business from all the components that lie out on the Internet or in their own datacentres that are hooked up to the Internet.
One of the big challenges for companies is to figure out how to program this great new shared machine in a way that fulfils their needs efficiently and flexibly and reliably.
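The "programming" Carr describes is less about writing algorithms than about composing independently supplied services into one business process. A minimal sketch of that idea follows; the three services and their responses are invented stand-ins for real web APIs a company might call over the Internet:

```python
# Composing the "World Wide Computer": three independent 'components',
# each a stand-in for a service another supplier might run remotely.

def lookup_customer(customer_id):
    # Stand-in for a hosted CRM service queried over the Internet.
    return {"id": customer_id, "city": "Boston"}

def check_inventory(city):
    # Stand-in for a warehouse service run by a second supplier.
    return {"Boston": 12}.get(city, 0)

def quote_shipping(city, units):
    # Stand-in for a logistics supplier's pricing endpoint.
    return 5.0 + 0.5 * units

def fulfil_order(customer_id, units):
    """Assemble the three components into one business process."""
    customer = lookup_customer(customer_id)
    stock = check_inventory(customer["city"])
    if stock < units:
        return {"status": "backorder"}
    return {"status": "ok", "shipping": quote_shipping(customer["city"], units)}

print(fulfil_order("c-1", 10))  # {'status': 'ok', 'shipping': 10.0}
print(fulfil_order("c-1", 20))  # {'status': 'backorder'}
```

The company's "program" is the `fulfil_order` glue; each component could live in its own datacentre, be swapped for a rival supplier's service, or move between local and utility supply without the overall workflow changing.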
Q: If companies are starting to use the Internet for data processing, is security a huge problem? A: I don’t think it’s a huge problem. The onus is on the suppliers to prove their reliability and security and earn the trust of the buyers, but my own feeling is that ultimately the utility model will offer greater security than we have today, because today our IT system is incredibly fragmented. Some companies and some individuals are very attuned to security and are very good at it, and others aren’t.
A lot of failures of security aren’t because of some central failure, they’re because of individual failures in taking appropriate care. As we move toward more of a utility model, and more and more data is supplied from big utilities whose entire existence depends on maintaining a high degree of security, I think we’ll ultimately see more secure data.
Q: Electric utilities have tended to be highly regulated. Do you see the same thing happening with computing? A: My opinion about that has changed quite a bit, even since I began writing the book. Originally I thought the modularity of computing implied that we could have a very diverse set of suppliers whose services would be joined together through a lot of industry standards. So my initial imagination of the utility industry was of a lot of different companies doing different specialised things and competing with each other in a way that you don’t see with electricity, which tends naturally to become a local monopoly. There’s no reason that computing needs to be a local monopoly, since you can supply these things in many different ways from many different places.
More recently, though, I think we’ve seen a lot of pressures to centralise and build utility datacentres of really massive scale, which requires a lot of money and a lot of expertise. That implies that we’ll see a great deal of centralisation in the industry. If that does come true, if we have monopolies or oligopolies begin to form, I think inevitably we’ll see more governmental regulation the way we see with other utilities.
Q: You claim that there was a democratising effect from the electric grid. Do you think the same thing will be true of the computing grid? A: Yes. Once you start supplying computing as shared services, you can gain great economies of scale and push down the price of computing even as you expand its availability.
The great advantage of this model is probably for smaller companies, which have been at a disadvantage to bigger ones because they haven’t been able to build big datacentres or put into place big ERP systems. As soon as you move to the utility system, you suddenly level the playing field and allow smaller companies to tap into the same kind of sophisticated computing operations that have been available to larger companies.