Ameritrade completed its acquisition of TD Waterhouse in January to become TD Ameritrade Holding Corp. Just prior to the completion of that acquisition, Ameritrade finished rolling out technology that encrypts all data as it moves from servers to tape backup devices. The encryption effort was a reaction to the company’s loss of a data tape with the names of 200,000 clients in April 2005. Jerry Bartlett, CIO at TD Ameritrade, spoke with Computerworld recently about data security and storage management. Extracts from the interview follow:

LM: When did you complete the tape encryption technology?
JB: We completed it in the November and December time frame for the legacy Ameritrade facilities. And we’re completing it for the combined TD Ameritrade this month.

LM: Was it very difficult?
JB: The difficulty was around deciding what we were going to do and how we were going to do it - not around the implementation itself. In fact, the technology difficulty was really around coordination of the network teams and storage teams. Once we realized that we needed to execute like it’s any other infrastructure project, we assigned a project manager with a plan coordinating our infrastructure teams. It was all about execution, and we’re good at execution.

LM: How many encryption appliances from Decru did you deploy?
JB: About a dozen.

LM: Do you have any concerns about decrypting data for restoration as new tape rev cycles come out?
JB: Not really. We’re comfortable with the backward-compatibility commitments. We would be concerned if the encryption algorithm were changed from the current AES 256-bit algorithm.
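Decru's appliances perform AES-256 in hardware, transparently to the backup stream. As a rough illustration of the encrypt-to-tape/decrypt-to-restore round trip Bartlett describes, here is AES-256 in GCM mode using the third-party Python `cryptography` package; the library, mode, and key handling are illustrative choices, not TD Ameritrade's actual setup:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Generate a 256-bit key (a real appliance manages keys for you;
# this is only to make the sketch self-contained).
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

backup_block = b"client record: name, account, balance"
nonce = os.urandom(12)  # standard 96-bit GCM nonce

ciphertext = aesgcm.encrypt(nonce, backup_block, None)  # what lands on tape
restored = aesgcm.decrypt(nonce, ciphertext, None)      # restoration path

assert restored == backup_block
```

The backward-compatibility concern Bartlett raises maps to the fact that as long as the algorithm (AES-256) is unchanged, any future appliance or software that knows the key and nonce can decrypt older tapes.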

LM: How long did it take to deploy?
JB: It took us less than six months to do the legacy Ameritrade side. Based on that experience, it took us less than three months to do the TD Waterhouse side.

LM: How much data do you actually encrypt?
JB: In the neighborhood of 30TB per week, including full and incremental backups.

LM: How have the regulators reacted to the decision to encrypt your data?
JB: The feedback we’ve received from [them] is that they’re thrilled about it. So we’re thrilled about that.

LM: What other types of challenges are you facing?
JB: In the storage world in particular, it’s this whole idea of a formal, automated approach to information life cycle management. We have very well understood retention rules, but it’s too manual. As we acquire companies and the obligations of those firms become our obligations - client data, client e-mails - that’s probably one of the biggest hurdles we have to address. We’re just starting to put together a strategy to address it.

I think we have a good approach to rationalizing storage around our applications, which is important. It’s a big spend. But now it’s really around the overall data management [and] retention because of the industry we’re in. I’d like to reduce the amount of manual effort associated with that.

LM: What do you mean by manual effort?
JB: I’m talking in terms of lacking an automated way to determine that this set of data has these retention properties and when you reach five years, for example, [tape archiving] just happens. I’m thinking about making our staff more efficient around the decision-making around when and what moves, not so much about "You’ve got to go move this to that." We can have operations staff do that. But I’m sure we have data sitting there in near- or real-time accessibility that doesn’t need to be. It could be moved off to tape.

LM: So it’s more about classifying data?
JB: Yes, in a more automated way - whether it’s ticklers that say, "This is the kind of stuff that’s coming up in the next few months to be addressed" - we just haven’t explored it yet.
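The automated retention policy Bartlett is describing - data that reaches a retention threshold moves off near-line storage to tape without a human deciding case by case - can be sketched in a few lines. The five-year threshold comes from his example; the tier names and backup-set names are hypothetical:

```python
from datetime import date, timedelta

FIVE_YEARS = timedelta(days=5 * 365)

def storage_tier(created: date, today: date) -> str:
    """Hypothetical retention rule: data older than five years no longer
    needs near- or real-time accessibility and moves to tape archive."""
    return "tape-archive" if today - created > FIVE_YEARS else "near-line"

today = date(2006, 3, 1)
backups = {
    "client-emails-1999": date(1999, 6, 1),   # past retention threshold
    "client-data-2005": date(2005, 4, 1),     # still current
}
for name, created in backups.items():
    print(f"{name} -> {storage_tier(created, today)}")
```

The "ticklers" he mentions would be the same check run ahead of time: flag anything whose `created` date crosses the threshold within the next few months, so operations staff can plan the move.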

LM: Are you thinking about an in-house solution or something off the shelf?
JB: We’re reluctant to do something in-house. Our typical strategy across technology is to build in-house what’s a core competency and a differentiator. This, in my mind, is not a differentiator. There have got to be folks that create these types of products and that’s their core competency. My strong bias is that now that we’ve recognized the need, as we have the cycles and bandwidth to address it, we begin looking at potential partners.

LM: The SNIA is working on a standard as part of SMI-S, which would allow migration of data across tiers of storage. How important is that to you?
JB: My team does work with SNIA to some extent. My fundamental view is we are, and ought to be, vendor-agnostic. My team’s a big believer in standards. In this case, standard interfaces and the ability for a heterogeneous group of vendors to be able to be utilized across the whole data life cycle, I think, is the right direction.

LM: Aren’t you mainly an EMC shop? Do you try to standardize on one vendor?
JB: We do. But again, in the end, we’re vendor-agnostic. We’re looking for the best combination of price, quality and availability. Right now, we’re an EMC shop, so as we do mergers and acquisitions, we stick with EMC. It doesn’t mean we won’t continue to look at vendors whose offerings become potentially higher in quality, availability and resiliency at competitive cost points. A fundamental tenet is [that] we’re vendor-agnostic.