Out of the box, a default Unix installation, in its many variants, has configuration settings that make managing your servers and network easier, but that also open up security holes attackers are all too likely to try to exploit.

Here we run through some tweaks to help protect your environment.

Account Protection
Don’t stop at the basic step of ensuring that all your users have passwords. Take it further and have a password policy that every user signs, to formalise the process. Use something like anlpasswd to proactively check that passwords aren’t too easy to crack before the system accepts them, and make sure users change them regularly.
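
As a hedged sketch of proactive checking on a PAM-based Linux system (anlpasswd itself predates PAM, but the principle is identical; the minlen and retry values here are only illustrative):

    # /etc/pam.d/passwd -- reject weak choices before they reach the shadow file
    password  requisite  pam_cracklib.so retry=3 minlen=10 difok=3
    password  required   pam_unix.so use_authtok shadow

Regular changes can then be enforced with password ageing, for example chage -M 90 -W 7 jsmith to require a new password every 90 days with a week’s warning (jsmith being a placeholder username).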

On your servers, you may have several people who need to log in as root. Utilities such as sudo (see Sudo here) can add greater levels of control. For greater accountability, it’s better to set a policy that admin staff always log in as themselves first and then su to root if they need to. And make sure that root’s login files and cron jobs don’t call up other processes or files that aren’t also owned by root.
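
A sketch of such a policy in sudoers terms (the admins group name is hypothetical, and the file should only ever be edited via visudo):

    # /etc/sudoers
    # Members of an admins group may run anything, logged under their own names
    %admins   ALL=(ALL) ALL
    # Or grant a single command to a single user
    operator  ALL = /usr/sbin/shutdown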

Disable the guest logins that often come preconfigured by default. And if a supplier has installed your system for you, make sure they haven’t left any accounts of their own set up. Some don’t even have passwords, and there are well-publicised advisories on these that hackers can read as easily as you can.
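
On a typical Linux system, locking such an account looks something like this (paths and flags vary between Unix variants):

    passwd -l guest                       # lock the account's password
    usermod -s /usr/sbin/nologin guest    # and refuse it an interactive shell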

And obviously, monitor frequently for failed login attempts, dormant accounts, and any accounts with null passwords.
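
Two quick checks, assuming a shadow file and Linux-style tools:

    # Accounts whose password field is empty
    awk -F: '($2 == "") { print $1 }' /etc/shadow
    # Recent failed login attempts, where a btmp log is kept
    lastb | head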

File system safety
Many Unix systems have a default umask value of 022. That is probably fine as the mask for system programs that create files, since it allows only the owner to write the files, while everyone else can read them.

It is safer, however, to change the umask value for users to 027 or 077, so that newly created files are not readable by everyone by default. Otherwise users will have to use chmod to set permissions explicitly, which they will either forget or not know how to do. The easiest place to set umask is probably /etc/default/login, since it will affect everyone who logs in.
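
For example (the login file shown is the Solaris location; other systems use a system-wide shell profile):

    # /etc/default/login
    UMASK=027

    # or in /etc/profile
    umask 027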

Ensure that file permissions are set correctly, ie as rigidly as possible, specifically for the likes of /etc/utmp, /var/adm/wtmp and syslog.pid, to make sure they are read-only for non-owners. If files don’t need to be read by most users, disable all read access to them. Pay particular attention to the '/' (root) and '/etc' directories and all system and network configuration files, and recheck file and directory protections before and after installing software or running verification utilities, as these procedures can change them.
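
As an illustration, with file locations that vary between variants:

    chmod 644 /etc/utmp /var/adm/wtmp    # read-only for non-owners
    chmod 644 /etc/syslog.pid            # path may also be /var/run/syslog.pid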

Watch where you save logs. They can give you invaluable information in the event that someone does break in, but only if you can be sure that the attacker didn’t go in and edit them too, either to delete a whole time period or to insert erroneous information to sow confusion. Append-only media and write protection are called for here.
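
On Linux ext2/ext3 file systems, for instance, the append-only attribute gives a degree of this protection, though an attacker who gains root can remove it, so offline copies still matter:

    chattr +a /var/log/messages    # entries can be added but not altered or deleted
    lsattr /var/log/messages       # verify that the 'a' attribute is listed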

Network services tuning
Disable fingerd and the ‘r’ commands (rlogin, rsh etc) where possible. If you have to have them, filter their TCP ports at your firewall to prevent outside users from running them, and configure your hosts.allow and hosts.deny files to permit their use only from specific hosts within your network. If you can, use SSH rather than telnet or rsh.
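
A sketch of the TCP wrappers side: deny everything by default, then allow named services from known internal addresses (the 192.168.1 network is a placeholder):

    # /etc/hosts.deny
    ALL: ALL

    # /etc/hosts.allow
    sshd: 192.168.1.
    in.rlogind, in.rshd: 192.168.1.10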

If you can, disable cron for normal users. If you can live without UUCP, disable it; if you must have it, create a different login for each site that needs to access you, and restrict the commands that can be issued. Again, make sure that there are no manufacturer- or installer-related entries.
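
Many cron implementations honour an allow file: if it exists, only the users listed in it may use crontab at all (the path varies, eg /var/adm/cron/cron.allow on some systems):

    echo root > /etc/cron.allow
    chmod 600 /etc/cron.allow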

TFTP is dangerous but very useful. If you cannot remove it, then ensure the files served by tftp are stored in a separate partition, and limit the tftp daemon to the directory where that partition is mounted.
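
With a typical inetd-launched in.tftpd, the -s flag confines the daemon to a single directory; /tftpboot here is an assumed mount point:

    # /etc/inetd.conf
    tftp  dgram  udp  wait  nobody  /usr/sbin/in.tftpd  in.tftpd -s /tftpboot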

Similarly for anonymous FTP: at its default settings it has too much flexibility to be secure. You need a configurable FTP daemon so that you can prevent all delete, overwrite, rename and chmod operations for guest and anonymous users.
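
With wu-ftpd, for example, the ftpaccess file can switch these operations off for the guest and anonymous classes:

    # /etc/ftpaccess (wu-ftpd)
    delete     no  anonymous,guest
    overwrite  no  anonymous,guest
    rename     no  anonymous,guest
    chmod      no  anonymous,guest
    umask      no  anonymous,guest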

The FTP root directory and its subdirectories should not be owned by the ftp account, nor belong to its group. If they are, and are not write-protected, an intruder will be able to add files (such as a .rhosts file) or modify others. Ideally you wouldn’t give write access to that directory at all, but since many FTP sites have to serve as drop-off points, you should at least limit the amount of data that can be uploaded (to prevent over-utilisation of disk space) and restrict what users can see in that area.
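
The classic layout, sketched here for an ftp home directory of ~ftp, keeps the tree owned by root and unwritable by the ftp account:

    chown root ~ftp
    chmod 555 ~ftp                  # readable and searchable, never writable
    chown root ~ftp/bin ~ftp/etc
    chmod 111 ~ftp/bin              # execute-only binaries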

There are many /etc files that you should check are owned by root and have permissions of either 644 or 600. Various hardening documents list them and can advise on the details.
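
Two quick sweeps that catch the most common problems:

    find /etc -type f ! -user root -ls    # files in /etc not owned by root
    find /etc -type f -perm -002 -ls      # world-writable files in /etc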

Monitor!
This has covered some of the ways you can harden your Unix system to stop casual attackers, at the very least, from gaining too much knowledge about your environment and doing harm. It is essential that you constantly monitor your file systems to catch any changes. Host intrusion detection software such as Tiger or Tripwire, or the commercial offerings from Cisco, Enterasys or Symantec, to name but a few, can greatly assist here; but you must still configure your system as tightly as possible, and be aware of the implications of user and program rights and interactions, to make sure you don’t make a potential hacker’s life too easy.
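
With Open Source Tripwire, for instance, the routine is to snapshot a known-good system and then compare against that baseline on a schedule (site keys and a policy file must be set up first):

    tripwire --init     # build the baseline database from the policy
    tripwire --check    # later, report any files that have changed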

In Has your Unix server been hacked? we’ll show how you can tell whether outsiders have managed to get through your defences.