Following Wednesday's news story announcing that Moore's Law is dead, here's the rest of the interview from which that story was drawn.

We caught up with Gordon Moore, chairman emeritus of Intel, though now retired from active duty. Speaking late at night (his time) from Hawaii, he told an audience of European journalists, including Techworld, what he thinks of new technology and, in particular, the law that emerged from an article he wrote in Electronics Magazine back in April 1965, exactly 40 years ago.

Moore opened with some preliminary remarks:
This is the 40th anniversary of the publication of the article that was the basis for Moore's Law. It was a piece I wrote in response to a request to speculate as to what was going to happen in the next 10 years, from 1965 to 1975. Those were the very early days of integrated circuits (ICs), which were principally used by the military, where cost was less of a concern. ICs were expensive.

The principal theme in the article was that ICs were the route to inexpensive electronics because they were going to become much more complex. At the time, the most complex IC had 30 components in it, but in the lab we had one with 60 components. So I looked back and saw that we'd doubled the number each year, so I took this and extrapolated it for the next 10 years, saying that it would go to 10,000 on a chip -- it got the idea across that by integration the cost of transistors would come down. I didn't plan to be all that precise, but it turned out that way, and one of my colleagues dubbed it Moore's Law.

In 1975, I looked back and saw that progress could be broken down:

1. Components were getting smaller
2. Chips were getting bigger
3. We were squeezing the wasted space from chips

Since the wasted space had gone by 1975, I said that the number of components would double every two years. And that's the way it's been for the 30 years since, more or less.
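The two doubling cadences Moore describes -- roughly annual until 1975, then every two years -- can be sketched numerically. This is a minimal illustration, assuming the interview's figure of about 30 components in 1965 as the starting point; the function is a simplification, not a model from the original article:

```python
def components(year, base=30, base_year=1965):
    """Rough component count per chip under Moore's two doubling regimes."""
    if year <= 1975:
        # one doubling per year up to 1975
        return base * 2 ** (year - base_year)
    # after 1975: one doubling every two years
    return components(1975, base, base_year) * 2 ** ((year - 1975) // 2)

for y in (1965, 1970, 1975, 1985, 2005):
    print(y, components(y))
```

Even from this crude sketch, the count reaches roughly a billion components by 2005, which is the right order of magnitude for chips of that era.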

By making circuits more complex, not only does performance improve; reliability improves and cost drops too. What that means is that, if you're a generation behind, you suffer in performance and cost. So this has become a self-fulfilling prophecy. I'm amazed that we've been able to stay on this kind of exponential growth for so long, but the industry has done a marvellous job to get this far.

Q: How long will Moore's Law last?
A: It can't continue forever -- the nature of exponentials is that you push them out and eventually disaster approaches. But in terms of size, you can see that we're approaching the size of atoms, which is a fundamental barrier, but it will be two or three generations of chips before we get that far.

Q: The press thinks Moore's Law says that computer power doubles every 18 months -- who's to blame?
A: A lot of people attach Moore's Law to anything that increases exponentially. The impression probably came from [Intel's] David House, who saw that complexity doubled every two years but that each chip was faster, so he deserves the credit for the 18 months.
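House's 18-month figure falls out of simple arithmetic: if complexity doubles every two years and per-chip speed also improves, the combined performance doubles faster. A minimal sketch, assuming (purely for illustration) a speed-doubling period of 72 months, which is the value that reproduces the 18-month result:

```python
import math

def effective_doubling(complexity_months, speed_months):
    """Months for performance (complexity x speed) to double,
    given the doubling period of each factor."""
    rate = math.log(2) / complexity_months + math.log(2) / speed_months
    return math.log(2) / rate

# complexity doubles every 24 months, speed every 72 months (assumed)
print(round(effective_doubling(24, 72)))  # -> 18
```

The growth rates simply add: 1/24 + 1/72 = 1/18, so performance doubles every 18 months.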

Q: Is Moore's Law a self-fulfilling prophecy? Would things have been different without it?
A: It's hard to say -- I think it has become a very useful guide. At the start it didn't have much impact, but the first place I saw an impact was when the Japanese entered the memory business. It seemed then that the industry generally was moving in a random direction but once the Japanese got into memory they had a plan and were successful in taking a leading position in that area.

In that respect, it would have been different if we hadn't noticed that trend. I was lucky: I was just in a position to see things further out than most people, working for Fairchild, which was at the forefront of the technology industry.

Q: Is Moore's Law a menace or an opportunity?
A: The public benefits hugely from more capable electronics. Some participants find it daunting that equipment becomes obsolete before it wears out. But overall it's good.

Q: What potential does nano-technology have for replacing transistors?
A: I'm a sceptic. ICs were the result of a cumulative investment of over $100bn, so for something to replace that, springing full-blown as it were, is unlikely.

Electronics is a mature industry. We're already operating at well below 100nm, which is seen as the standard for nano-technology, so we're there already. Building things up from the bottom, atom by atom, comes from a different direction. It's not replacing ICs -- the technology is being applied in different fields such as gene chips to do bio-analysis very quickly, micro-machines in airbags and avionics, and micro-fluidics -- chemical labs on a chip.

Electronics though is a fundamental technology that's not likely to be replaced directly. There's a difference between making a small machine and connecting them by the billion. Nanotech will have an impact but it's not about replacing electronics in the foreseeable future.

Q: What about the political impact of computing -- did you foresee mass market computing?
A: Yes, in the original article I did, though I had no idea what it would look like. I thought it would be a screen that allowed a woman in the kitchen to store recipes.

Q: Has the military driven advances in computing over the years?
A: Well, the military was important when costs were very high. Through the 1960s, it gave them capabilities they couldn't get in any other way. Since then it hasn't had much impact, as the commercial business timeframe is so much faster than the pace at which military systems change -- they use obsolete electronics in modern military systems.

Q: What about the processor wars between IBM's G5 and Intel -- is that constructive?
A: Competition is good because it makes things happen more quickly -- I'd rather the competition weren't there of course but that's not the real world. We wouldn't be there if there weren't competition -- it keeps you on your toes.

Q: Isn't it software that's the problem rather than the hardware, especially from the consumer point of view?
A: It's hard to debug software until the hardware is functioning properly -- emulation performance is slow. So software tends to lag. The user interface should be simpler, but I don't know what it's going to look like -- people make improvements in the UI, but complexity and background tasks grow. It's a challenge -- but even so, we have complex systems that work pretty well.

Q: Will the fact that the head of Intel doesn't have a PhD change the company's culture?
A: When Andy Grove became an Intel consultant, I said he'd got over his PhD! Paul Otellini is steeped in the semiconductor industry after 30 years -- I don't think it's an issue, even though it took 15 years for me to discover I was pronouncing his name wrong.

Q: Will AI ever work?
A: Humans and computers don't do the same things well. Artificial intelligence has fallen behind expectations so we need to study how the brain works and use that knowledge. What intrigues me is good language recognition -- seeing words in context -- which will change how you relate to your computer but it's a long way off.

Q: Can wafer yields get to 100 per cent as you hoped in your original paper?
A: They've improved hugely -- we made a perfect wafer at Intel years ago. Yield was a huge problem early on, but it's tractable now.

Q: Where do the best new developments come from -- companies or universities, and where in the world?
A: From all over -- mainly companies. But companies have got out of strongly directed research -- that's being done in universities, which are a very important part of the system in keeping the really important new ideas coming. It's mainly coming from the US and Europe, though a lot is coming out of Japan -- but I'm not that close to it.

Q: What about power -- IBM wants to keep power down. What's Intel's position?
A: Intel is very concerned too. The Centrino chips have a lot of power management and features to keep power down. Desktops need it too. Something needs to be done, such as getting more performance without increasing clock speed and power consumption, so as to live within a practical power envelope.

Q: In the mobile age, isn't the Law obsolete because of power reasons, such as battery power in mobile phones?
A: You can keep power down without increasing the number of transistors. You can always find useful things to do with those transistors so I don't expect the complexity curve to slow down.

Q: Is there any new law for the next 40 years?
A: I'll rest on my laurels on this one! I'm not close enough now to make new predictions -- several things have been called Moore's Second Law but I can't take credit for any of them.

Q: How excited are you about next 40 years?
A: I wish I could be around to see it. The world has a lot of problems, but the technology will be mind-boggling -- compare the mid-60s with today and you get a sense of the rate of technological growth, and I don't see anything that will slow it down.