There's a tremendous buzz today about cloud computing, but before outsourcing your critical business systems to the cloud, let's review some security concerns.
The most critical business applications deal with corporate HR, finance, credit card, and other sensitive data. If any of this information is compromised, lawsuits may ensue and your corporate brand will be tarnished. This is a nightmare that could drive customers away from your products or services. How can cloud computing effectively protect sensitive data?
There are three areas that need to be addressed to effectively push your applications into the cloud:
- Creating a second layer of firewall protection (defence in depth);
- Analysing application documentation to determine the new firewall rules needed; and
- Collecting system and application metadata to enable a smooth transition.
Let's start with defence in depth.
First, put sensitive data in a second tier of firewall segments behind the main corporate firewalls. This second-tier firewall and its corresponding network shield sensitive applications and their data from being easily accessed if the Web-facing firewalls are breached. For example, consider a grocery store chain. It would be wise to deploy at least four firewall/network segments: one for HR data, one for financial data, one for credit card PCI (Payment Card Industry) data, and one for services that the other segments share. The shared-services segment could contain common support services such as network and systems management, encryption and PKI functions, access control services, and security event management functions.
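The four-segment layout above can be sketched as data, as in this minimal Python model. The segment and service names (`iam`, `siem`, `pki`, `netmgmt`) are illustrative assumptions, not part of any real configuration; the check simply confirms each data segment only relies on services the shared segment actually hosts.

```python
# Illustrative sketch of the four-segment layout; names are assumptions.
SHARED_SERVICES = {"netmgmt", "pki", "iam", "siem"}  # hosted in the shared segment

segments = {
    "hr":      {"data": "HR records",        "uses": {"iam", "siem", "pki"}},
    "finance": {"data": "financial records", "uses": {"iam", "siem", "pki"}},
    "pci":     {"data": "cardholder data",   "uses": {"iam", "siem", "pki"}},
}

def undeclared_dependencies(segments, shared):
    """Flag any segment relying on a service the shared segment doesn't host."""
    return {name: seg["uses"] - shared
            for name, seg in segments.items() if seg["uses"] - shared}
```

A check like this is useful during migration planning: any non-empty result means a segment depends on a service that was never moved into the shared segment.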
Another architectural implementation that protects corporations from internal data theft is a Tunnel Access Protocol: an access control function that forces all administrators to log information before they perform administration on segment systems. Hence, all administrative access is tracked, discouraging internal theft of information.
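The core idea, that the audit record is written before any administrative session is granted, can be sketched as follows. This is a hypothetical illustration of the pattern, not an implementation of any specific product; the class and field names are assumptions.

```python
import datetime

class AdminAccessGateway:
    """Hypothetical sketch: every administrative session must be
    preceded by a logged justification (log-before-access)."""

    def __init__(self):
        self.audit_log = []

    def open_session(self, admin, segment, reason):
        # Refuse access unless the administrator supplies a reason to log.
        if not reason.strip():
            raise PermissionError("administration requires a logged reason")
        self.audit_log.append({
            "admin": admin,
            "segment": segment,
            "reason": reason,
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })  # entry is written BEFORE the session handle is returned
        return f"session:{admin}@{segment}"
```

Because the log entry is a precondition of the session, every administrative action on segment systems leaves a trace, which is what discourages internal theft.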
The second area that needs addressing is the analysis required to migrate the application successfully behind the cloud's second-tier firewalls. I recommend starting with the application design document. It gives you a big-picture understanding of which business need the application serves, what middleware and databases it uses, and what protocols it speaks. It also often contains the logical architecture.
It is important to focus on all the systems the application interacts with. Your security team will have a variety of information collected about the application: what data is sensitive, how and with what tools the data is encrypted, and penetration testing results if it is a web-facing application. I also recommend creating a protocol diagram showing all servers and their IP addresses, along with the protocols (TCP or UDP) and ports being used. This network view shows specifically which servers need to talk to each other and which protocols and ports they will use to do it. It is not necessary to include switches, routers and other network infrastructure components because the protocols/ports just ride over them. If the protocol diagram is thorough, it should be a simple step to create the firewall rules. Firewall rules are made up of source and destination IP (Internet Protocol) addresses, the protocol used, and the ports that ride on top of those protocols.
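The step from protocol diagram to firewall rules can be made mechanical. The sketch below assumes each diagram entry is a `(source IP, destination IP, protocol, port)` tuple; the addresses and rule syntax are hypothetical examples, not any particular vendor's format.

```python
# Hypothetical protocol-diagram entries: (source IP, dest IP, protocol, port).
flows = [
    ("10.1.0.10", "10.1.0.20", "tcp", 1433),  # app server -> database
    ("10.1.0.10", "10.0.0.5",  "tcp", 636),   # app server -> LDAPS (shared segment)
    ("10.1.0.10", "10.0.0.9",  "udp", 514),   # app server -> syslog/SIEM
]

def flows_to_rules(flows):
    """Turn each documented flow into an allow rule; everything else is denied."""
    rules = [
        f"allow {proto}/{port} from {src} to {dst}"
        for src, dst, proto, port in flows
    ]
    rules.append("deny all")  # default-deny catch-all
    return rules
```

Note the final default-deny rule: a thorough protocol diagram means anything not in the diagram is, by construction, traffic the second-tier firewall should block.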
Lastly, I recommend a thorough collection of system and application metadata. Porting your application well requires this work, and if you suffer a disaster or business interruption, or want to pull your application back from the cloud, you will need this data. System metadata exists per firewall/network segment: all applications in a segment share the same firewall, routers, switches, encryption algorithm (if one is used for all applications in the segment), and storage subsystem. System metadata includes vendor, model, software release and version, and other system-wide configuration data. Application metadata is similar, but it covers load balancers, encryption method, middleware, database, server hardware and operating system, and the services, protocols, and ports that ride on top of those systems. It likewise includes vendor, model, software release and version, and other application configuration data.
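One way to keep the two kinds of records consistent is to give each a fixed schema. The field names below follow the attributes listed above; the record types themselves are an illustrative sketch, not a standard.

```python
from dataclasses import dataclass, field

@dataclass
class SystemMetadata:
    """One record per firewall/network segment (shared by all its apps)."""
    vendor: str
    model: str
    software_release: str
    version: str
    extra: dict = field(default_factory=dict)  # other system-wide config

@dataclass
class ApplicationMetadata:
    """One record per application within a segment."""
    vendor: str
    model: str
    software_release: str
    version: str
    middleware: str = ""
    database: str = ""
    ports: list = field(default_factory=list)  # protocols/ports the app uses
```

Keeping both record types under one schema makes the later step, loading everything into a directory, a straightforward export.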
The next question is where this metadata should be stored. I recommend keeping it in a hierarchy in an LDAP repository. I would create two tiers in the directory: one called Segment System, with an entry for each of the four segments in the example above, and one called Application, for all applications within a given segment. This ordering enables a systematic collection of all metadata and, most importantly, a quick deployment of an application and/or segment into a cloud.
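The two-tier hierarchy maps naturally onto LDAP distinguished names. The base DN, organisational-unit names, and entry names below are illustrative assumptions chosen to mirror the Segment System and Application tiers described above.

```python
# Hypothetical DN builders for the two-tier metadata hierarchy.
def segment_dn(segment, base="dc=example,dc=com"):
    """DN of a Segment System entry, e.g. the 'hr' segment."""
    return f"ou={segment},ou=SegmentSystem,{base}"

def application_dn(app, segment, base="dc=example,dc=com"):
    """DN of an Application entry nested under its segment."""
    return f"cn={app},ou=Application,{segment_dn(segment, base)}"
```

Because each application entry's DN contains its segment's DN, retrieving everything needed to redeploy one segment, or one application, is a single subtree search.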
In summary, migrating critical applications to the cloud involves putting data behind a second tier of firewalls. Common services live in one segment that all the others can share, and applications are placed in separate segments based on the type of data being protected, such as credit card data, finance data, and HR data. A variety of documentation should be created and/or reviewed to make sure the porting of applications behind the second-tier defence-in-depth firewalls goes well. The collected metadata forms a two-layer hierarchy: common system data per segment, and the different applications within each segment. I recommend saving the metadata in a directory where it can be easily retrieved.