My notebook computer has a measly 80GB hard drive and I have a 320GB external drive for backups. Call me Oliver Twist, but I want more! De-duping could give it to me.

My external drive essentially stores backed-up data - thousands of press releases, white papers, images, PDF files and so forth that I'll need to refer to at some point. I just keep adding to it, and soon its 320GB will be full.

Suppose, just suppose, that I had a de-dupe application that trawled through these files in the background and de-duped them into a separate partition on the drive. Let's say I got a 10:1 de-dupe ratio and set up a 100GB partition on that external drive. It could then hold 1000GB of de-duped data.
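None of this needs exotic technology. Here is a minimal sketch of the idea in Python - my own illustration, assuming fixed-size 64KB chunks and SHA-256 fingerprints, not anything FalconStor or Data Domain actually ship (real products use cleverer, variable-size chunking):

```python
import hashlib
import json
import os

CHUNK_SIZE = 64 * 1024  # assumed fixed 64KB chunks for illustration


def dedupe_file(path, store_dir, index_path):
    """Split a file into chunks, store each unique chunk once, record the recipe."""
    os.makedirs(store_dir, exist_ok=True)
    recipe = []  # ordered list of chunk hashes needed to rebuild this file
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest = hashlib.sha256(chunk).hexdigest()
            chunk_path = os.path.join(store_dir, digest)
            if not os.path.exists(chunk_path):  # only new, unique chunks consume space
                with open(chunk_path, "wb") as out:
                    out.write(chunk)
            recipe.append(digest)
    with open(index_path, "w") as idx:
        json.dump({"file": path, "chunks": recipe}, idx)


def restore_file(index_path, store_dir, out_path):
    """Rebuild the original file from its chunk recipe - the reconstruction step."""
    with open(index_path) as idx:
        recipe = json.load(idx)["chunks"]
    with open(out_path, "wb") as out:
        for digest in recipe:
            with open(os.path.join(store_dir, digest), "rb") as chunk:
                out.write(chunk.read())
```

Run something like that over the backup files in the background and the de-duped partition only ever holds one copy of each unique chunk, plus the small recipes needed to put files back together.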

That means I have a 320GB external drive, 100GB of which holds a virtual 1TB of data. Reading those files back into RAM would take longer than before because of the file reconstruction needed, but I'm using USB 2.0, which isn't instantaneous anyway, so another second or two isn't much to bear - not when my 320GB external drive is effectively functioning as a 1.22TB drive (the remaining 220GB of plain storage plus the 1TB of de-duped data) for the same price plus the cost of the de-dupe software.
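The 1.22TB figure is just arithmetic: the 220GB left as ordinary storage, plus the 1TB of logical data the 100GB partition can represent at 10:1. A quick back-of-the-envelope check:

```python
drive_gb = 320             # raw capacity of the external drive
dedupe_partition_gb = 100  # partition holding de-duped data
ratio = 10                 # assumed 10:1 de-dupe ratio

plain_gb = drive_gb - dedupe_partition_gb  # 220GB left as ordinary storage
logical_gb = dedupe_partition_gb * ratio   # 1000GB of data the partition can represent
print(f"effective capacity: {plain_gb + logical_gb}GB")  # 1220GB, roughly 1.22TB
```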

Let's suppose I have a 2.5-inch 100GB external drive for my laptop and all the data on it is de-duped. Its capacity could effectively become 1TB at a 10:1 de-dupe ratio, or 2TB at 20:1.
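For a drive that is de-duped end to end, the sum is even simpler - raw capacity multiplied by the ratio - although the ratio you actually get depends on how repetitive the data is, and this little helper of mine ignores the space the chunk index itself would take:

```python
def effective_gb(raw_gb: int, ratio: int) -> int:
    """Effective capacity of a fully de-duped drive, ignoring index/metadata overhead."""
    return raw_gb * ratio

for ratio in (10, 20):
    print(f"100GB drive at {ratio}:1 -> {effective_gb(100, ratio)}GB")  # 1000GB and 2000GB
```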

De-dupe software is just software. What I'm asking for is really just a matter of packaging it up as a desktop application.

Dear FalconStor and Data Domain,
To really, really establish de-duping as a widely accepted technology, please, for Christmas 2007, give me my very own de-dupe application to turn my GB-class external drive into a terabyte-class one. Please, pretty please....

CC: Avamar, Diligent, Sepaton ...