Thoughts on British ICT, energy & environment, cloud computing and security from Memset's MD
As cloud computing and other technologies change the world of data management, keeping your data private and secure is an ongoing concern for everyone. As the MD of a cloud computing Infrastructure as a Service (IaaS) provider, I’m sharing an insider’s perspective on what you should be doing to keep your data safe.
As you move data to the cloud there are many different challenges. Applications have to be designed differently. Security gets pushed further and further away from perimeter-based approaches. Security threats change when data moves to the cloud, with threats from the network or from the provider’s personnel being more pertinent than concerns over physical attack.
However, it need not be a big concern: you just need to apply the same common sense you would to sourcing any other service. Ask questions about your prospective cloud supplier. Are they financially sound? Do they have good security procedures in place? Will the infrastructure your data sits on be shared with lots of other users, or will it be segregated by virtualisation or even by physically separate dedicated environments?
Until the advent of cloud computing, the norm was to trust the internal IT department. Now that the IT department is outsourced, people are asking the right questions about IT security. The focus must be on the security processes and procedures rather than the physical perimeter around the data storage devices. In many ways using the cloud can be much safer than hosting data on your own systems in your own building, since a putative attacker no longer knows where to look. Even if, somehow, an individual were able to breach the heavy physical security of our data centres, they would be faced with thousands of identical-looking machines and no way of identifying their target.
The most likely source of data theft has always been from within an organisation (the people), so when your data is not on your own systems, its management comes down to trust. Just as you need to trust everyone with access to a computer hosted in your office, when outsourcing to the cloud you need to trust the organisation that has access to the underlying infrastructure. Look for companies with appropriate certifications such as ISO 27001 (as a minimum), and ask them how they regulate and monitor their systems administrators’ access to servers holding client data.
The other increasingly common source of attacks on cloud-based services is via the network itself. This can be greatly mitigated with good firewall systems: if your services only need to be accessed from a small number of office locations, the firewall should restrict access to just those IP addresses. That does sacrifice the convenience of universal access, so it may not always be practical, but firewalling remains important even then. Talk to the provider and they should be able to advise you.
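The allow-list logic such a firewall rule implements can be sketched in a few lines. This is an illustrative model only, not any particular firewall product; the network ranges are documentation-only example addresses, not real office networks.

```python
import ipaddress

# Hypothetical office networks (RFC 5737 example ranges) -- substitute
# the real public IP ranges your offices connect from.
OFFICE_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_allowed(source_ip: str) -> bool:
    """Return True if the source address falls within an office network."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in OFFICE_NETWORKS)
```

In practice the same rule would be expressed in the firewall's own configuration language; the point is simply that anything outside the listed ranges is dropped before it ever reaches your service.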
For public-facing services there is also the danger of Distributed Denial of Service (DDoS) attacks, where servers are flooded with millions of bogus requests from hacked computers (a “botnet”). Most providers should have a system for automatically detecting and blocking the source of such attacks, so ask them. In cases where the attack is massively distributed, however, the only defence is to have more bandwidth than the attackers, which means you need to be using an operator with large scale.
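One simple detection approach providers use is rate-based: count requests per source address over a window and flag sources that exceed a threshold. A minimal sketch of that idea (real systems are far more sophisticated, using traffic baselines and upstream scrubbing):

```python
from collections import Counter

def flag_flooders(request_ips, threshold):
    """Return the set of source IPs whose request count in this window
    exceeds the threshold -- candidates for automatic blocking."""
    counts = Counter(request_ips)
    return {ip for ip, n in counts.items() if n > threshold}
```

A flagged address would then be fed into a firewall block list; the hard part, as noted above, is that a large botnet spreads the load so thinly that no single source crosses the threshold.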
Confidentiality is a major area to probe with your cloud hosting provider: having the right tools and processes in place to ensure it is maintained is critical, so ask them directly what those are.
When entrusting a cloud provider to look after your data it is essential to ensure that there is adequate resilience in their storage systems. At a minimum they should be using RAID (Redundant Array of Independent Disks) systems, but most cloud storage providers will store multiple copies of your data across many independent machines. Memset’s cloud storage solution (currently in beta testing) stores all data in triplicate, for example.
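The resilience that triplicate storage buys can be illustrated with a toy model: every write goes to three independent nodes, and a read succeeds as long as any one copy survives. This is a sketch of the general replication principle, not a description of Memset's actual implementation.

```python
class TriplicatedStore:
    """Toy model of replicated object storage: writes go to three
    independent 'nodes'; reads succeed while any copy survives."""

    def __init__(self):
        self.nodes = [{}, {}, {}]

    def put(self, key, value):
        # Write the object to every node.
        for node in self.nodes:
            node[key] = value

    def get(self, key):
        # Read from the first node that still holds a copy.
        for node in self.nodes:
            if key in node:
                return node[key]
        raise KeyError(key)

    def fail_node(self, i):
        self.nodes[i].clear()  # simulate losing one machine entirely
```

With three copies, two whole machines can be lost before any data becomes unreadable, which is why multi-node replication is generally stronger than RAID within a single chassis.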
Most providers will offer additional backup services, and these should certainly be considered when operating cloud based applications so that in the event of a serious hardware failure you can roll back to an earlier state. Also ask the provider what their normal restore times are.
Finally, as we have seen with the recent failure of Amazon’s Simple Storage Service, which included irrecoverable loss of some customer data, sometimes it is not enough to trust one provider. To help overcome this problem we will soon be rolling out a service to back up clients’ cloud storage accounts held with other providers onto our own storage cloud.
Although pushing data into the cloud is proving increasingly attractive for many organisations, there’s a growing realisation that geographic considerations remain important.
While the overriding concept of cloud involves the decoupling of data and applications from the underlying hardware on which they reside, knowing where that hardware is located can be vitally important.
For reasons of security, legal jurisdiction and privacy, many organisations are obliged to know where sensitive data is stored. For British companies, data may need to be stored within UK borders for data protection purposes. For the majority of UK public sector IT requirements the data absolutely must remain within national boundaries.
Any data housed, stored or processed by a US-based company, or by a company wholly owned by a US parent, is vulnerable to interception and inspection by US authorities.
Microsoft has recently admitted that any data held in its EU data centres is subject to the US Patriot Act, because Microsoft is a US-headquartered company.
If you don’t want your data subject to the Patriot Act, then you have to use a non-US company, in addition to a non-US data centre, for storing your data.
One risk with Software as a Service (SaaS) is that all your eggs are effectively in one basket, and if something goes wrong with that one provider you could face serious challenges. Memset’s approach is to disintegrate the stack, enabling you to move your software from one place to another. A typical example is using third-party open source solutions to deliver hosted software services on our infrastructure. That way, if the software provider fails you can still get to the data, and if the hosting company fails (assuming you have good backups) the software company can help you transfer to a new host.
Many SaaS providers are essentially running one application for thousands (or many more) client organisations, with their data commingling on the same infrastructure and in the same databases separated only by the software itself. This presents a potential security risk, since if there is a flaw in the provider’s code it could be exploited to allow access to other customers’ data. For some services this may not be a problem, but for critical company or personal data it may be advisable to obtain additional segregation.
Memset’s stack disintegration approach also solves this problem. By using open source solutions (e.g. Zimbra for Web email or Trac for integrated project management and wiki), each hosted on virtual or dedicated servers for just that one client, there are additional layers of segregation between the software instances, providing greater security. While many SaaS solutions’ code bases are not heavily tested, network and virtual machine segregation are very robust.
Once you’re clear on who has your data, where that data is held, what they are doing with it and how they are protecting it, you also need to establish what procedures are in place to allow you to migrate your data out. Key characteristics to look for include:
For SaaS providers, look for an API or tools to download your data in a meaningful format. This could be as simple as a widget to download a CSV file (as with Google Contacts), or it might be a fully fledged XML API. Failing that, and if taking the stack disintegration approach, ensure that the database in which the information is stored is transparent and well documented. It is frequently not in a SaaS provider’s interest to make data portability easy, though, so this can be a difficult item.
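Whatever export route the provider offers, it is worth verifying that the downloaded data is actually re-importable. A minimal sketch of checking a CSV export (the field names here are illustrative, not any particular provider's schema):

```python
import csv
import io

def parse_contacts_csv(text):
    """Parse a CSV export (e.g. a contacts download) into a list of
    dicts keyed by column header, ready for migration or re-import."""
    return list(csv.DictReader(io.StringIO(text)))
```

If the export round-trips cleanly through a parser like this, you have a workable escape route; if the columns are ambiguous or undocumented, raise it with the provider before you are locked in.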
As with any service provider contract, you should negotiate clear SLAs for your cloud provider. These should include, but not be limited to, clear metrics around performance (both networking and computing), provisioning, change management, patching and vulnerability remediation.
To ensure your data is safe in the cloud at all times, make sure you have thought through each of the areas above before you commit.
In summary, the cloud is, and will continue to be, a critical part of many companies’ IT strategy, and must therefore be considered in their security policies. This role is likely to grow as a raft of new services are developed and commercialised and as users’ familiarity and comfort with this approach to service delivery grows. But it is also likely that the most effective network security strategies will be a hybrid model that takes the best of what the cloud has to offer and combines it with the skills and focus of experts working on the ground.