IT complexity is getting worse, and no one has a view of the big picture, according to technology experts at an IBM event.

Panellists from across the IT industry at IBM's Navigating Complexity conference in California painted a dire picture of IT systems growing ever more complex.

Harrick Vin, a vice president at outsourcing giant Tata Consultancy Services, noted that IT departments must deal with many problems, including security compliance, root cause analysis and overlapping functions.

"Unfortunately, dealing with these classes of problems is becoming harder and harder over time," said Vin, who is also a computer sciences professor at a US university. He cited a top-tier bank with more than 30,000 servers and 200,000 desktops.

The situation is compounded by the fact that different people deal with different parts of the overall problem in isolation, he said. "Essentially, what happens is we only have a silo-based understanding of what is going on."

Complexity has arisen through evolution, he added: operating systems, applications, and workload types and volumes keep changing. "The requirements that users impose onto these systems also continue to change," Vin said.

He added that systems must constantly adapt to changes. "The state of the art really is reactive firefighting," Vin said.

Peter Neumann of SRI International's Computer Science Laboratory said old mistakes keep being repeated, even though problems such as buffer overflows were solved long ago.

"The problem is that we keep going through the same problems over and over and over again," Neumann said. The Multics platform had fixed the buffer overflow problem in 1965, but people ignored it, he said. Meanwhile, helpful developer tools were not being used much.

Single points of failure have caused serious problems, such as the collapse of the ARPAnet in 1980, Neumann added.

He said what was really needed was "some sort of approach to complexity that talks about sound requirements," along with good software practice.