Earlier this summer, I attended GigaOm's Structure 2011 conference in San Francisco. It was a well-attended two-day event focused on cloud computing. Most of the big players were there -- from VMware CEO Paul Maritz and Amazon CTO Werner Vogels to Accenture's very own Gavin Michael.

One of the sessions was a panel of representatives from five VC firms discussing whether cloud is just another tech bubble in the making. Towards the end, a member of the audience asked the panel what they saw as the next wave of technologies beyond cloud computing and big data. Not surprisingly, different panelists gave different answers, but a few common themes emerged -- one of which was analytics.

Some of the panelists construed analytics narrowly, as applied to managing IT infrastructure -- for example, predicting changes to the network or data requests before they actually happen. Others believed that applications powered by analytics will be the biggest consumers of cloud computing, that the infrastructure to support large-scale analytics will become very cheap, and that the use of analytics will become pervasive.

I found these answers intriguing because, over the past several years, I have come to believe that analytics will drive the next wave of change in how we develop and architect systems. As computing infrastructure and the software components needed for analytics become cheaper and more readily available, analytics is ripe to be applied to the tools, frameworks and processes we use to develop software.

Let me give two simple examples. Remember the days when you had to run a profiler to capture execution timing, then sift through the logs to identify your application's performance bottleneck? Remember how time-consuming the entire process was? By combining test automation tools that drive the application's front end with machine-learning clustering algorithms that automatically sift through the execution timing logs, you can create an environment that continuously executes the application, profiles it and analyzes the execution profiles to pinpoint the bottleneck -- all with a few mouse clicks.
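To make the first example concrete, here is a minimal sketch of the clustering step. All function names and timings are hypothetical, and a simple one-dimensional k-means (k=2) stands in for whatever clustering algorithm a real tool would use: it splits per-function execution times into a "normal" cluster and a "hot" cluster, and the hot cluster points at the likely bottlenecks.

```python
from statistics import mean

def kmeans_1d(values, iters=50):
    """Two-cluster k-means over a list of numbers (e.g. execution times)."""
    centroids = [min(values), max(values)]  # seed with the extremes
    clusters = [[], []]
    for _ in range(iters):
        clusters = [[], []]
        for v in values:
            # assign each value to the nearest centroid
            idx = 0 if abs(v - centroids[0]) <= abs(v - centroids[1]) else 1
            clusters[idx].append(v)
        new = [mean(c) if c else centroids[i] for i, c in enumerate(clusters)]
        if new == centroids:  # converged
            break
        centroids = new
    return centroids, clusters

# Hypothetical profiler output: function name -> mean call time in milliseconds
timings = {
    "parse_request": 2.1, "render_page": 3.4, "auth_check": 1.8,
    "db_query": 48.7, "serialize": 2.9, "report_export": 52.3,
}

centroids, clusters = kmeans_1d(list(timings.values()))
hot = clusters[1] if centroids[1] > centroids[0] else clusters[0]
bottlenecks = sorted(f for f, t in timings.items() if t in hot)
print(bottlenecks)  # -> ['db_query', 'report_export']
```

In a real pipeline the `timings` dictionary would be produced automatically by the test-automation and profiling runs, so the whole loop -- execute, profile, cluster, report -- requires no manual log reading.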

Another example: today’s integrated development environments (IDEs), such as Eclipse, Visual Studio and Rational Jazz, already have built-in capabilities to monitor activity within the environment. These capabilities can be used to instrument the software development environment and to objectively monitor, collect and aggregate data on how a project is progressing and whether it will hit its expected targets for quality, resources and timing. What is currently missing is the analytical component that takes the collected data and forecasts the direction of the project against a benchmark.
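As a sketch of that missing analytical component, the snippet below fits a least-squares trend line to weekly progress data of the kind an IDE could collect, then forecasts whether the project will hit a benchmark by a target week. All of the numbers, and the choice of a simple linear model, are illustrative assumptions, not a prescription.

```python
def linear_fit(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical data aggregated from the IDE: cumulative features
# completed at the end of each week of the project so far.
weeks = [1, 2, 3, 4, 5, 6]
completed = [8, 15, 21, 30, 36, 44]

slope, intercept = linear_fit(weeks, completed)

# Benchmark: 80 features completed by week 10.
target_week, benchmark = 10, 80
forecast = slope * target_week + intercept
status = "on track" if forecast >= benchmark else "at risk"
print(f"Forecast at week {target_week}: {forecast:.1f} features ({status})")
```

A production version would replace the linear model with something that accounts for scope changes and team size, but even this naive forecast turns raw activity data into an objective early warning.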

Coupled with the increasing use of software development in the cloud, I expect analytics-powered software development tools, frameworks and environments to emerge quickly over the next several years. What do you think? Am I too optimistic?

By Edy Liongosari, Global Director of Research, Accenture Technology Labs