Hopefully anyone reading this article has done at least one backup of their PC’s contents. Backups are a crucial part of any system or network management strategy, and the first time you ever have to restore from your backup tape you thank yourself for being so diligent all those weeks, months or years ago. Even servers with loads of resilience (multiple power supplies, RAID5 disks) can suffer data losses, and so backing up data is essential.
Important = hard to back up
By its nature, the most important data in an organisation is often required to be online 24 hours a day. This can make backing up such data interesting – if you just try to make copies of files that are being updated all the time (such as your Exchange mailbox collection or the database that back-ends the website) you’ll simply receive ‘File is already in use’ messages and the backup will fail. The obvious solution is to pick a quiet time of day when the database or mail server can be stopped for a few minutes – but with the 24x7 nature of Web sites, this is often unacceptable to the business.
What is a hot backup?
So you need to be able to do ‘hot backups’ – data dumps that work without interrupting the service. When you consider how complex the file structures of packages such as email hubs and database servers are, this sounds rather like black magic. The way hot backup software usually works, though, is to take a ‘snapshot’ of the database, either at regular intervals or at the time the backup is initiated. This produces a collection of files that can be restored to the server en masse, so that it can carry on as if nothing had ever gone amiss.
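The snapshot idea is easy to demonstrate in miniature with Python’s built-in sqlite3 module, whose `Connection.backup()` method performs an online copy of an open database – the table name and row counts below are made up purely for the demo:

```python
import sqlite3

# A 'live' database with a table that stays in use.
live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
live.executemany("INSERT INTO orders (total) VALUES (?)",
                 [(i * 1.5,) for i in range(1000)])
live.commit()

# Take a point-in-time snapshot into a second database,
# without closing or locking out the live connection.
snapshot = sqlite3.connect(":memory:")
live.backup(snapshot)

# Changes made after the snapshot do not appear in it.
live.execute("DELETE FROM orders")
live.commit()

rows_live = live.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
rows_snap = snapshot.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(rows_live, rows_snap)  # 0 1000
```

The key point is that the snapshot is a self-consistent copy frozen at the moment it was taken, which is exactly what makes an en-masse restore possible later.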
Hot backups for 24x7 applications such as database servers are usually handled by add-on ‘agent’ modules for the various commercial backup tools. The basic backup package allows you to handle normal files, and then you bolt in the extra bits as they’re needed.
In real life…
To show that this hot backup lark really does work, we installed a Windows 2000 Server machine with SQL Server 2000 and Veritas Backup Exec. To ensure we had a real non-stop service, we wrote a short program in VB.NET that continually inserted, updated and deleted 100,000 table rows in the SQL Server database. Making Backup Exec understand our SQL Server 2000 installation was as simple as telling the wizard to add the SQL Server ‘agent’ – after a reboot and a quick tweak of the open file and SQL Server options we were ready to rock.
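The VB.NET load generator itself isn’t reproduced here, but the idea is simple enough to sketch with Python’s built-in sqlite3 module as a stand-in – the table name, values and row cap are all illustrative assumptions, not the original program:

```python
import random
import sqlite3

def churn(conn, iterations, cap=100_000):
    """Keep a table permanently 'in use' by continually inserting,
    updating and deleting rows, as the article's test program does."""
    conn.execute("CREATE TABLE IF NOT EXISTS t (id INTEGER PRIMARY KEY, v INTEGER)")
    for _ in range(iterations):
        op = random.choice(("insert", "update", "delete"))
        if op == "insert":
            # Only insert while under the row cap.
            if conn.execute("SELECT COUNT(*) FROM t").fetchone()[0] < cap:
                conn.execute("INSERT INTO t (v) VALUES (?)",
                             (random.randrange(1000),))
        elif op == "update":
            conn.execute("UPDATE t SET v = v + 1 "
                         "WHERE id = (SELECT id FROM t ORDER BY RANDOM() LIMIT 1)")
        else:
            conn.execute("DELETE FROM t "
                         "WHERE id = (SELECT id FROM t ORDER BY RANDOM() LIMIT 1)")
    conn.commit()

conn = sqlite3.connect(":memory:")
churn(conn, iterations=3000)
count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)
```

Because rows come and go at random, the row count at any instant is unpredictable – which is precisely what made it a fair test of a hot backup.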
We chose to back up just the SQL Server databases on our server (i.e. not the main directory structure). We started our test program, set up a Veritas backup job, and hit ‘Go’. The backup process itself took about five minutes, after which we killed off the database test program and blew away the table the test program had used. A quick query in SQL Query Analyzer confirmed that the table was no longer there. We then ran the restoration process in Backup Exec, selecting just our test database to restore, and a few moments later it reported ‘success’. SQL Query Analyzer confirmed that 10,000 records were indeed back – with no need to shut down any of the other databases running on the system. Note that although 10,000 rows were restored rather than 100,000, this isn’t surprising: our test program adds to and deletes from the table all the time, so the 10,000 we got back are simply what happened to be in the table when the backup started.
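The same backup–disaster–restore round trip can be sketched end to end with Python’s sqlite3 online backup API standing in for Backup Exec; the table and row count are illustrative, not the figures from our test:

```python
import sqlite3

live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v INTEGER)")
live.executemany("INSERT INTO t (v) VALUES (?)", [(i,) for i in range(500)])
live.commit()

# Hot backup: the live connection stays open throughout.
backup_copy = sqlite3.connect(":memory:")
live.backup(backup_copy)

# The 'disaster': blow away the table on the live database.
live.execute("DROP TABLE t")
live.commit()

# Restore by copying the backup back over the live database.
backup_copy.backup(live)

restored = live.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(restored)  # 500
```

As in our test, what comes back is the state of the data at the moment the backup was taken, not the state at the moment of the disaster.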
Keeping your data safe can be very straightforward, even if you’re not allowed to stop services in order to back up their data. Most commercial packages have ‘agent’ modules specifically written for systems like Exchange or SQL Server that can perform ‘hot’ backups, and once you’ve followed the (usually very basic) setup process, they run just like the normal cold backup tools most people are used to.