Data Protection Management (DPM) has limits. What are they? Where are they heading? We have talked with Bocada and WysDM about this. Now we have the benefit of APTARE CEO Rick Clark's views on the questions we asked the other suppliers.
What is your view of the data protection reporting/management needs of your customers now and going forward?
In the past there were few or no regulations, and little accountability, around protecting the digital assets of a corporation. Backup and recovery was viewed as the “plumbing” and little attention was paid to the true effectiveness of the systems implemented to protect the crown jewels of the enterprise. Now corporate governance, regulations, and data audits are among the key market drivers that force IT organizations to be accountable and to quantify how well they protect their corporation's digital assets.
Over the past thirteen years, APTARE has met with hundreds of IT professionals and storage management specialists and discussed their key problems around Data Protection Management. The continuous theme from these meetings has been the ever-increasing need for enterprise software products that address the four components of what we call the Data Protection Management Lifecycle, namely: Exposure Analysis, Capacity Planning and Forecasting, Performance Optimization, and Compliance. Most importantly, with the centralization of storage they need a centralized portal where every storage team member, from the administrator to the CIO, can log in to see what’s happening in their environment.
We often refer to this lifecycle as the “bubble in the carpet” because of the fluid and dynamic changes that are always happening within the IT environment. Our customers use APTARE StorageConsole to easily quantify their exposure analysis and stop the bleeding. They then launch our dashboards to provide predictive analysis and forecasting so they can better plan capacity for growth in the midst of continuous changes within their storage environment. The next stage in the lifecycle is to identify bottlenecks and drive to optimal throughput for every component of the data protection environment. The last phase of the lifecycle is compliance: minimizing the pain and related expense associated with internal and external storage audits, driven both by regulations such as Sarbanes-Oxley (SOX) and by internal SLAs. This is a significant pain point with real and large costs, particularly for large public companies.
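To make the capacity planning and forecasting stage concrete, here is a minimal sketch with hypothetical numbers (this is not APTARE's actual algorithm): it fits a least-squares trend to daily backup volumes and estimates how many days of headroom remain before a capacity limit is hit.

```python
# Minimal capacity-forecasting sketch: fit a linear trend to daily
# backup volumes and project when a capacity limit will be exceeded.
# The volumes and the capacity figure below are hypothetical.

def forecast_days_until_full(daily_volumes_tb, capacity_tb):
    """Estimate days until cumulative volume exceeds capacity,
    using a simple least-squares trend on the daily volumes."""
    n = len(daily_volumes_tb)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(daily_volumes_tb) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, daily_volumes_tb)) / \
            sum((x - mean_x) ** 2 for x in xs)
    # Project the daily volume forward and accumulate until full.
    used = sum(daily_volumes_tb)
    day = n
    while used < capacity_tb:
        used += mean_y + slope * (day - mean_x)
        day += 1
    return day - n

growth = [1.0, 1.1, 1.2, 1.3, 1.4]           # TB written per day, trending up
print(forecast_days_until_full(growth, 20))  # → 8 days of headroom
```

A real DPM product would of course use richer models and live data from the backup catalog; the point is that forecasting is a projection over collected metadata, not a snapshot.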
Should a DPM reporter cover all data protection operations? E.g.
- replication (async and sync)
- fixed content reference storage (like EMC Centera, HP RISS)
- backup to virtual tape libraries
- backup (all backup software products)
- backup to tape autoloaders
- backup to tape libraries
- off-site electronic vaulting
- archive to optical media and devices - e.g. UDO
- file archiving
- e-mail archiving
- database archiving
- encrypting data
- fixed content data to a content-addressed store or similar.
Ultimately, Data Protection Management products should strive to cover all management and reporting on the meta data of digital assets (associated with core business applications) as they are moved from disk to disk to tape, or any combination thereof. At the simplest level, all of the technologies and storage segments mentioned boil down to data replication, archiving, security, and content-aware storage. And just as data moves from one domain to another, DPM products should track the meta data through exposure analysis, capacity planning, performance optimization, and compliance.
Our focus has always been on backup and recovery systems, where the APIs and underlying vendor products are very mature. Our goal is to provide rich functionality for every product we support. APTARE StorageConsole provides the greatest depth and breadth of feature functionality by focusing on our customers’ pain points, and seeing them through the eyes of the administrator all the way up to the CIO. It’s very important that we create products that people can use out-of-the-box with no training or professional services required.
As standards emerge around replication, VTLs and so on, similar to SNIA’s SMI-S, APTARE will embrace them and continue our philosophy of standards-based, customer-deployable portal solutions for complex storage infrastructures. If you try to do too much, you’ll end up covering many different storage domains, but with no substance or expertise in any one. The problem with the lack of standards and trying to cover all of the technologies you mentioned is that you can never do a thorough job in any one area. You’re trying to boil the ocean.
Does your software product do this?
APTARE StorageConsole provides the greatest depth of functionality for the top three enterprise backup and recovery products, namely: VERITAS NetBackup, EMC Legato NetWorker, and IBM Tivoli Storage Manager. Our software provides visibility for virtual tape libraries, backup software products from Symantec, EMC, and IBM, backup to tape autoloaders, backup to tape libraries, off-site electronic vaulting, archive to optical media and devices.
APTARE has taken a very different approach from many of the other DPM vendors. Leveraging our background in deploying massively scalable database transaction systems and mission-critical web portals, APTARE has applied fundamental principles of web portals to storage management. A great example of this is our context-sensitive navigation in APTARE StorageConsole.
It’s a web-based interface that provides a global view of your environment that is agnostic to the underlying backup products. As users drill-down through the web-based dashboards, the report parameters displayed are unique to the underlying backup software data that is being presented. This style of user interface presents data and nomenclature consistent with what the users are familiar with. The net result is drillable dashboards with enormous amounts of actionable data at your fingertips with no additional training or new terminology to learn.
What is the product development strategy for your DPM product?
APTARE will continue to expand our coverage of enterprise backup and recovery products with support for CommVault Galaxy and HP Data Protector early in 2007. In addition, APTARE will continue to enhance and deliver dashboards and notification around predictive monitoring and early warning of imminent events that, if no action is taken, would otherwise threaten the storage utility’s ability to recover data.
Will it evolve to cover more of these categories?
APTARE StorageConsole is a platform that layers on top of backup and recovery software including VERITAS NetBackup, Tivoli Storage Manager, or EMC Legato NetWorker. There is a great deal of convergence occurring in the market around replication, backup, and disk-to-disk based technologies. Core technologies in the StorageConsole platform, such as the web-based portal, predictive and root-cause analysis, and our patented methods for agentless capture and transmission of storage meta data via HTTP(S), will be applied to new products to be launched in 2007 around replication and primary storage reporting. As standards start to emerge in the VTL and replication space, APTARE will embrace them and release product offerings focused on key customer pain points unique to replication and managing VTLs.
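The agentless-capture idea described above can be sketched in a few lines: a collector on (or near) the backup server packages job metadata, not the data itself, and pushes it to a central portal over HTTP(S). The endpoint URL and payload fields below are hypothetical; this is not APTARE's actual wire format.

```python
import json
import urllib.request

# Sketch of an agentless collector pushing backup-job metadata to a
# central reporting portal over HTTPS. Field names and the endpoint
# are illustrative assumptions, not a real product's protocol.

def build_payload(host, jobs):
    """Package job metadata (status, sizes - never the backed-up
    data itself) as a JSON document for transmission."""
    return json.dumps({
        "collector_host": host,
        "jobs": [
            {"job_id": j["id"], "status": j["status"], "bytes": j["bytes"]}
            for j in jobs
        ],
    }).encode("utf-8")

def push(portal_url, payload):
    """POST the metadata over HTTP(S); analysis happens portal-side."""
    req = urllib.request.Request(
        portal_url, data=payload,
        headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)

payload = build_payload("backup01", [{"id": 1, "status": "OK", "bytes": 123}])
# push("https://portal.example.com/api/metadata", payload)  # hypothetical URL
```

Because only outbound HTTP(S) is used, no agent software or inbound firewall holes are needed on the backup hosts, which is the practical appeal of this style of collection.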
Does it have a roadmap to evolve to cover all of them?
APTARE has roadmap plans to cover most of the technologies listed above. Our focus is on managing the meta data as it moves across data domains, from primary storage to secondary, backup, and replication.
Are there specific problems in these various data protection operations that hinder your DPM product reporting on them?
The biggest challenge is the lack of standards and the lack of adoption of the standards that do exist. When we port to a new backup product, our engineering discovery process will always look first for well-defined APIs, then CLIs, and fall back on log scraping as a last resort. The challenge is that these interfaces change with every new release. Most of the backup vendors support CLIs, but this requires string parsing that is always prone to error, especially when dealing with different international locales.
The next greatest challenge is not over-using the interface. For example, many products support polling of the CLIs to extract information, but too much polling can impact the performance and stability of the underlying backup system. It’s a very fine line between gathering the detail that the customer needs and not overtaxing the API/CLI.
Does DPM have to evolve into uDPM (universal DPM) so as to cover all the electronic data protection activities of an enterprise?
That’s a simple question to a complex problem. I think everyone would love that one magic pill that would solve all of their problems across their storage infrastructure. Storage architectures are already very complex with companies solving different problems at different levels using multiple technologies across multiple vendors. Trying to create a broad-based solution that solves everyone’s problem is an impossible task.
What is possible is creating a solution that addresses the problems within specific areas of the storage environment and doing it well. The reason for APTARE’s success is that we have been very deliberate about focusing on one aspect of the storage environment (backup and recovery) and delivering rich functionality to solve our customers’ pain points.
Does it need an enterprise-wide data protection policy?
I would view this by looking at your fundamental principles of business and classifying your application data into tiers. Then you measure your data protection policies against each tier. What starts to get complex with this approach is that tier classification of data is fluid and changes over the lifespan of your data. So the key is illumination and visualization of your data protection policies using software that simplifies and provides the “big picture”. APTARE StorageConsole has always focused on visualizing the storage environment with intuitive and simple-to-use dashboards for all levels of the storage organization.
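The tier-and-measure approach described above can be sketched simply: assign each application a tier, give each tier a recovery-point target, and flag the gaps. The tier names and RPO thresholds below are illustrative assumptions, not an industry standard.

```python
# Sketch: measure data-protection policy compliance per tier.
# Tier definitions and RPO thresholds are hypothetical examples.

TIER_RPO_HOURS = {"tier1": 1, "tier2": 24, "tier3": 168}

def policy_gaps(apps):
    """Flag applications whose backup interval misses the RPO target
    for their current tier. 'apps' maps app name to a dict with
    'tier' and 'backup_interval_hours' keys."""
    return [name for name, a in apps.items()
            if a["backup_interval_hours"] > TIER_RPO_HOURS[a["tier"]]]

apps = {
    "billing":  {"tier": "tier1", "backup_interval_hours": 4},   # misses 1h RPO
    "intranet": {"tier": "tier3", "backup_interval_hours": 24},  # within 168h
}
print(policy_gaps(apps))  # → ['billing']
```

The fluidity the answer mentions shows up here as the `tier` field changing over an application's lifespan: the same backup interval that passes in tier3 becomes a gap the day the data is reclassified into tier1.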
Does it need a standard interface (API) to all data protection hardware and software products rather than product-specific interfaces?
A single unified and well-published API adopted and embraced by all data protection hardware and software vendors is Nirvana, but realistically it will not materialize in the short to medium term. Look at what happened in the SRM space, where storage hardware vendors, via SNIA, set out to define a unified standard API called SMI-S. Many storage analysts will tell you that SMI-S is dead. Four years after its introduction, it is still not embraced by many of the major SAN hardware vendors. And those vendors that do embrace the standard are holding close to their chests (via their proprietary APIs) the core functionality required to manage and report on the device. I would say we can learn a great deal from the adoption of SMI-S in the DPM space. The standards will eventually develop and be embraced and adopted by the major vendors, but this process will be extremely slow.
If it does should this interface be the SNIA's SMI-S? Why? If not then what other interface could be used?
As a member of SNIA we think it’s absolutely the right place to help drive a grass-roots standards initiative. SMI-S grew out of the requirement for hardware vendors to interoperate, largely in a SAN environment. The initial design was focused on hardware; software products and storage-related applications were not a key focus. If SMI-S could adapt to an application, software, and hardware focus, then it has potential. DPM solutions should embrace standards in data collection and at the same time open their interfaces to integrate into the enterprise management systems framework of the IT data center.
What’s interesting in the DPM space is that many vendors are focusing on data collection and data analytics. The problem is not what you collect but how you present it. The key is not to lose sight of your customers and to give them solutions that help them intuitively manage increasingly complex storage environments with less manual intervention.