Last week, application-performance monitoring service provider New Relic launched an offering that allows customers to mine its operational data for business intelligence.

The new beta offering, called Insights, has been a big hit, according to the company. The service has been fielding queries against an average of 60 billion metrics a minute, with an average response time of 58 milliseconds.

In-memory database technology has been key in this new line of business, said New Relic CEO Lew Cirne. In-memory databases are relational databases that run entirely within the working memory, or RAM, of a server, or set of servers.
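The concept is easy to see in miniature with SQLite, which can hold an entire relational database in RAM for the life of a connection. Here is a minimal sketch in Python, purely for illustration; the table and values are hypothetical, and this shows the general idea rather than New Relic's purpose-built system:

```python
import sqlite3

# ":memory:" tells SQLite to keep the entire database in RAM.
# Nothing touches disk, and the data vanishes when the connection closes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (name TEXT, value REAL)")
conn.execute("INSERT INTO metrics VALUES ('response_ms', 58.0)")

# Reads are served straight from memory, with no disk I/O on the query path.
row = conn.execute(
    "SELECT value FROM metrics WHERE name = 'response_ms'"
).fetchone()
print(row[0])  # 58.0

conn.close()  # at this point the in-memory database is gone
```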

Cirne expects that the service, once live, could be used in all sorts of ways, such as for customer service, security and targeted marketing. As soon as a user has a question about some aspect of operations, the service can return detailed metrics on that topic, drawing on data captured just seconds before.

New Relic built the database from scratch and assembled a large cluster that can muscle through terabytes of data to quickly arrive at an answer. In-memory technology allows the service to provide answers within milliseconds, even when finding them involves combing through large sets of machine-generated data.

Once a boutique item for well-funded fast-trading financial firms, in-memory database systems are becoming more widely used, thanks to the falling cost of server memory and the demands of customers who've come to expect speedy Internet services, such as Amazon's.

"Customers can transform their business by taking advantage of this technology," said Eron Kelly, Microsoft general Manager of SQL server marketing.

Microsoft has released SQL Server 2014 to manufacturing, with in-memory capabilities built in. Big-data software company Pivotal has also released the first full commercial version of its in-memory database for Hadoop, GemFire XD.

Microsoft's and Pivotal's new offerings join a growing number of databases with in-memory capabilities, including IBM's BLU Acceleration for DB2, SAP HANA, VoltDB's eponymous database and Oracle TimesTen, among others.

Add to this list a growing number of caching tools, such as Redis and Memcached, that let organizations keep much of their relational database content in memory. This approach is favored by Facebook, for instance, which uses MySQL to store user data but relies on Memcached to get material to users quickly.
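This arrangement is usually called the cache-aside pattern: check the fast in-memory store first, and fall back to the relational database only on a miss. A minimal sketch using the redis-py client, in which `fetch_user_from_mysql` and the connection details are hypothetical stand-ins:

```python
import json

import redis

r = redis.Redis(host="localhost", port=6379)  # hypothetical cache server


def fetch_user_from_mysql(user_id):
    # Hypothetical stand-in for a query against the backing MySQL store.
    return {"id": user_id, "name": "example"}


def get_user(user_id, ttl_seconds=300):
    key = f"user:{user_id}"
    cached = r.get(key)  # 1. Try RAM first.
    if cached is not None:
        return json.loads(cached)
    user = fetch_user_from_mysql(user_id)  # 2. Cache miss: hit the database.
    r.setex(key, ttl_seconds, json.dumps(user))  # 3. Repopulate, with a TTL.
    return user
```

On a hit, the request never touches the database at all, which is what keeps hot reads fast at Facebook's scale.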

Traditionally, an enterprise database would be stored on disk, because it would be far too large to fit in memory. Also, storing data on nonvolatile disk helps ensure that the material is captured for posterity, even if the power to the storage is cut off. With volatile RAM, if power is interrupted, all of its contents are lost.

These assumptions are being challenged, however. Online transactional databases, in particular, are being moved to main memory.
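Part of what makes the move practical is that in-memory systems typically work around RAM's volatility by writing snapshots or logs to nonvolatile storage in the background. A minimal sketch of the snapshot idea, using Python's sqlite3 backup API to copy an in-memory database out to a file; this is a generic illustration, not any particular vendor's recovery mechanism:

```python
import sqlite3

mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
mem.execute("INSERT INTO orders (total) VALUES (42.50)")

# Periodically copy the volatile in-memory database to nonvolatile disk,
# so its contents can be restored after a power loss.
disk = sqlite3.connect("snapshot.db")
mem.backup(disk)
disk.close()
```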

"If your data does not fit in main memory now, wait a year or two and it probably will," said Michael Stonebreaker, a pioneer in database development who is chief technology officer at VoltDB, the company overseeing his latest database of the same name. "An increasing fraction of the general database market will be main-memory deployable over time."

Stonebraker's opinion is echoed by others in the business.

"If you have a bit of a performance pain, and you have the budget to pay for a market leading, general purpose, relational database management system anyway, then it makes sense to manage the data in RAM," said Monash Research head analyst Curt Monash.

Stonebraker admits the approach wouldn't work for all databases. But at today's memory prices, keeping a 100GB or even a 1TB database in main memory is not prohibitively expensive, depending on the level of responsiveness required and the amount of money an organization is willing to spend.

"It is still too early to call it commodity, but in-memory is becoming commodity in a way," said Nicola Morini Bianzino, managing director for the SAP Platform Solutions Group at Accenture.

Bianzino said he has seen a shift in the questions he gets from clients and potential clients about in-memory over the past six months. "The questions are shifting from 'What is it?' to 'How do I do it?'" Bianzino said.

"The message has gone to the market and has been assimilated by clients," Bianzino said. "This doesn't mean they will move everything tomorrow, but they are taking it for granted that they will have to move in that direction."

With SQL Server 2014, Microsoft's approach to in-memory is to bundle it into its standard relational database platform. "We're not talking about an expensive add-on, or buying new hardware or a high-end appliance," Kelly said.

The SQL Server in-memory component can be used for online transaction processing, business intelligence and data warehousing.
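In SQL Server 2014, that component surfaces as memory-optimized tables declared in ordinary T-SQL. Below is a sketch of what such a declaration looks like, issued here through the pyodbc client; the connection string and table are hypothetical, and it assumes a database that already has a MEMORY_OPTIMIZED_DATA filegroup:

```python
import pyodbc

# Hypothetical connection string; adjust driver, server and credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 11 for SQL Server};SERVER=localhost;"
    "DATABASE=Sales;Trusted_Connection=yes;",
    autocommit=True,
)

# MEMORY_OPTIMIZED = ON keeps the table entirely in RAM, while
# DURABILITY = SCHEMA_AND_DATA logs changes so the rows survive a restart.
conn.cursor().execute("""
    CREATE TABLE dbo.SessionState (
        SessionId INT NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        LastSeen  DATETIME2 NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA)
""")
```

The alternative setting, DURABILITY = SCHEMA_ONLY, keeps the table definition but not its rows across restarts, trading persistence for speed on transient data.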