The next generation of Apache Hadoop, the open source software framework for batch processing of petabytes of data, is expected out this year, says a Yahoo executive.

Todd Papaioannou, vice president of cloud architecture at Yahoo, said that current iterations of Hadoop lack the ability to effectively manage resources across thousands of servers in a cluster. So developers are working on improving utilisation, scheduling and management of resources.

For example, the new architecture will include a global ResourceManager that tracks server availability and scheduling invariants, while a per-application ApplicationMaster runs inside the cluster and tracks the program semantics for a given job, Yahoo developer Arun Murthy wrote.
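The division of labour Murthy describes can be sketched as follows. This is an illustrative toy model only, not Hadoop code: the class names mirror the roles in the article, but the methods and data structures are hypothetical. The global ResourceManager holds the cluster-wide view of free capacity, while a per-job ApplicationMaster asks it for resources and keeps the job's own bookkeeping.

```python
# Illustrative-only sketch (not Hadoop's API): hypothetical classes showing
# the split of responsibilities between a global ResourceManager and a
# per-application ApplicationMaster.

class ResourceManager:
    """Global view of the cluster: which nodes have spare task slots."""

    def __init__(self, nodes):
        self.free_slots = dict(nodes)  # node name -> available task slots

    def allocate(self, wanted):
        """Grant up to `wanted` slots, drawing from whichever nodes are free."""
        granted = []
        for node, slots in self.free_slots.items():
            while slots > 0 and len(granted) < wanted:
                slots -= 1
                granted.append(node)
            self.free_slots[node] = slots
        return granted


class ApplicationMaster:
    """Per-job controller: knows the job's tasks, asks the RM for capacity."""

    def __init__(self, job_name, tasks):
        self.job_name = job_name
        self.pending = list(tasks)
        self.assignments = {}

    def run(self, rm):
        # The job's semantics (which tasks exist, where they run) stay here;
        # the ResourceManager only arbitrates cluster-wide capacity.
        containers = rm.allocate(len(self.pending))
        for task, node in zip(self.pending, containers):
            self.assignments[task] = node
        return self.assignments


rm = ResourceManager({"node1": 2, "node2": 1})
am = ApplicationMaster("wordcount", ["map-0", "map-1", "reduce-0"])
print(am.run(rm))  # {'map-0': 'node1', 'map-1': 'node1', 'reduce-0': 'node2'}
```

Keeping per-job state out of the global scheduler is what lets the design scale to thousands of servers: the ResourceManager only arbitrates capacity, so it does no work proportional to the number of tasks in any one job.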

Papaioannou said Yahoo contributed about 70% of the code for the current iteration of Hadoop and the Hadoop Distributed File System (HDFS). Earlier this year, Yahoo dropped its own distribution of Hadoop and began working more closely with the Apache Hadoop community, because it allows the open source community to help with development efforts, Papaioannou said.

Hadoop includes an open source implementation of MapReduce, a programming model that originated at Google for building parallel programs. It is MapReduce that enables Hadoop to perform parallel batch processing.
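The model itself is simple enough to sketch in a few lines of plain Python. This is not Hadoop code, just the shape of the idea: a map function emits key/value pairs, a shuffle step groups them by key, and a reduce function folds each group into a result. Hadoop's contribution is running the map and reduce phases in parallel across a cluster, with HDFS supplying the data.

```python
# Minimal single-machine sketch of the MapReduce model (word count).
from collections import defaultdict

def map_fn(line):
    """Map phase: emit a (word, 1) pair for every word in the input line."""
    for word in line.split():
        yield word, 1

def reduce_fn(word, counts):
    """Reduce phase: fold all values for one key into a single result."""
    return word, sum(counts)

def map_reduce(lines):
    groups = defaultdict(list)
    for line in lines:                 # map phase (parallel across the cluster in Hadoop)
        for key, value in map_fn(line):
            groups[key].append(value)  # shuffle: group intermediate values by key
    return dict(reduce_fn(k, v) for k, v in groups.items())  # reduce phase

print(map_reduce(["to be or", "not to be"]))
# {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

Because each map call depends only on its own input line, and each reduce call only on its own key's group, both phases parallelise across thousands of machines without coordination between tasks.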

"The next generation of HDFS will be more resilient, available and reliable," Papaioannou said. "We expect to put it all together in a release some time soon. That's an exercise of collaboration with the rest of the development community."

Yahoo also just launched a new project called HCatalog, a table metadata management layer for Hadoop.
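The idea behind such a metadata layer can be shown with a small sketch. This is purely illustrative and not the HCatalog API: a hypothetical registry records each table's schema and storage location once, so any tool reading the data can resolve the same definition by name instead of hard-coding file paths and formats.

```python
# Illustrative-only sketch of a shared table-metadata registry
# (hypothetical classes, not HCatalog's actual interface).

class MetadataStore:
    """Maps table names to their schema and storage location."""

    def __init__(self):
        self._tables = {}

    def create_table(self, name, columns, location):
        self._tables[name] = {"columns": columns, "location": location}

    def describe(self, name):
        return self._tables[name]


store = MetadataStore()
store.create_table("clicks", {"user": "string", "ts": "bigint"}, "/data/clicks")

# Any client of the store can now resolve the same schema and location:
print(store.describe("clicks")["location"])  # /data/clicks
```

A shared catalogue like this is what "drives different use cases": multiple tools operate on the same tables without each maintaining its own copy of the schema.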

"That will help drive different use cases," he said. "It just went into the Apache Incubator last week."