There is no question that businesses can benefit from moving data to the cloud. The cloud is elastic and efficient. It can improve user productivity and unburden IT staff, saving time and money. It can accommodate anything from simple file sharing to mission-critical data backup. The question is, just how secure is your cloud? And how do you know?
There are major differences among cloud providers in their approach to security and their use of security technologies, processes, and personnel. These differences can have a major impact on the availability, integrity, accessibility, privacy, and compliance of your data — and can directly impact your business.
This slideshow provides a list of questions, developed by Syncplicity, that you should ask any prospective cloud provider, whether that is your internal IT department or a third-party cloud service provider.
Click through for 10 questions you should ask any potential cloud provider to ensure your data is secure, as identified by Syncplicity.
The overall approach is crucial. If the vendor is of the opinion that password protection for a file or laptop is sufficient to prevent unauthorized access to content, or that data encryption is needed only for data that is in transit and not at rest, you may want to consider other cloud providers. Encryption of all data, in transit, at rest, and on mobile devices, should be the basis of any holistic security solution.
Failure to encrypt all content can have serious consequences, most notably in the area of regulatory compliance. The data-breach laws mentioned previously are only the tip of the iceberg. In North America alone, laws, regulations, and industry standards such as the Gramm-Leach-Bliley Act (GLBA), the Payment Card Industry Data Security Standard (PCI DSS), the rules of the Financial Industry Regulatory Authority (FINRA), the Health Insurance Portability and Accountability Act (HIPAA), and Canada's Personal Information Protection and Electronic Documents Act (PIPEDA) could all be violated by a loss of unencrypted data. On the other hand, proper encryption not only defends against such violations but also creates new business opportunities and competitive advantages, such as the ability to transact securely any time, from anywhere, and the ability to serve new customer segments or geographies.
Regarding the actual encryption of the data, make sure all data is transferred and stored using the highest levels of encryption: SSL/TLS connections using 256-bit Advanced Encryption Standard (AES) cipher suites for data in transit, and 256-bit AES for data at rest. AES, the standard published by the National Institute of Standards and Technology (NIST), is the only publicly available, open encryption technology approved by the National Security Agency (NSA) for Top Secret information. There is simply no excuse for using any lower-grade encryption technology.
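As a concrete illustration, AES-256 in an authenticated mode takes only a few lines of Python. This is a minimal sketch using the third-party cryptography package; the key handling and file contents here are illustrative assumptions, not any particular provider's implementation.

```python
# Sketch of AES-256 authenticated encryption (AES-GCM) as a provider
# might apply it to data at rest. Illustrative only: real systems keep
# keys in a hardware security module or key-management service.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit AES key
aesgcm = AESGCM(key)

plaintext = b"quarterly-report.xlsx contents"
nonce = os.urandom(12)                      # must be unique per encryption
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Decryption requires the same key and nonce; any tampering with the
# ciphertext raises an InvalidTag exception instead of returning data.
recovered = aesgcm.decrypt(nonce, ciphertext, None)
```

Note that GCM also authenticates the data, so a modified ciphertext is rejected outright rather than silently decrypting to garbage.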
In many cases, the difficulty with encryption lies not in the encryption process itself but in the management of the encryption keys. Make sure the prospective vendor provides both physical and logical separation between the encryption keys and the encrypted data. Separate data centers would be optimal, so that there can be no single point of failure or compromise. You will also want to ensure that the vendor has segmented access to their systems so that in general employees only have access to one data center or the other, further protecting access to your data. In addition, ensure that the encrypted file data and the proper file version encryption key are brought together only on an as-needed basis, and in a way that can be audited.
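The separation described above is commonly implemented as envelope encryption: each file gets its own key, that key is wrapped by a master key held in a separate store, and the two halves are combined only at access time, with the pairing logged. The sketch below is a hypothetical illustration using the cryptography package, with in-memory dictionaries standing in for the two separate data centers.

```python
# Illustrative envelope-encryption sketch: wrapped file keys and
# ciphertext live in separate stores, joined only on demand and audited.
# Names and structure are hypothetical, not any specific vendor's design.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

master_key = AESGCM.generate_key(bit_length=256)

key_store = {}    # "data center A": wrapped per-file keys only
blob_store = {}   # "data center B": ciphertext only
audit_log = []    # every key/data pairing is recorded

def store(file_id: str, data: bytes) -> None:
    file_key = AESGCM.generate_key(bit_length=256)
    n1, n2 = os.urandom(12), os.urandom(12)
    blob_store[file_id] = (n1, AESGCM(file_key).encrypt(n1, data, None))
    # Wrap the file key with the master key before storing it separately.
    key_store[file_id] = (n2, AESGCM(master_key).encrypt(n2, file_key, None))

def fetch(file_id: str, actor: str) -> bytes:
    # Key and ciphertext are brought together only here, and audited.
    audit_log.append((actor, file_id))
    n2, wrapped = key_store[file_id]
    file_key = AESGCM(master_key).decrypt(n2, wrapped, None)
    n1, ciphertext = blob_store[file_id]
    return AESGCM(file_key).decrypt(n1, ciphertext, None)

store("doc-1", b"confidential")
assert fetch("doc-1", "alice") == b"confidential"
```

The point of the design: an attacker who compromises either store alone gets nothing usable, and every legitimate pairing leaves an audit record.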
It is important to ensure that an encrypted file cannot be decrypted by anyone who is not authorized to see it. The absolute highest level of security is to own and manage the keys yourself, ensuring actual control. However, this is generally so burdensome, particularly where users share and collaborate with one another, that there is a significant risk users will fall back on simpler methods such as emailing files from their private email accounts, defeating the purpose of the system. A practical compromise is to have the vendor manage the keys on your behalf. In this case, the vendor should be able to explain how they ensure that the keys are properly managed and, optionally, provide you with the ability to control a key escrow so you can own the keys.
The gold standard is a dual-responsibility model where two authorized employees must combine their authority before access can be granted, such as in the case of a two-data-center security architecture.
Certifications are issued today for virtually every aspect of information handling — from the data center itself to information protection practices.
Ideally, the vendor’s data centers will have successfully completed a SOC 1 audit under SSAE-16 guidelines (formerly SAS70 Type II), as well as testing from independent auditors. An SSAE-16 audit verifies that the cloud provider’s data centers have met rigorous requirements around physical security, physical access, and internal controls. It also allows cloud providers to disclose their control activities and processes to their customers and their customers’ auditors in a uniform reporting format.
In addition, ask prospective cloud providers whether they are FISMA-certified (indicating a high level of commitment to data security), and whether they are certified for compliance with PCI DSS, ISO 27001, HIPAA, and FIPS 140-2.
Finally, while you may want your provider to ensure they can reliably store your data forever, you will also want to ensure that they properly handle the cases where data must be reliably destroyed. Compliance with Department of Defense 5220.22-M or NIST 800-88 ensures your provider properly handles media sanitization: when a server holding customer information is retired, for example, the information on it is permanently and irrecoverably destroyed so that third parties cannot access it.
Users expect and require that data remain available and uncorrupted, absolutely without fail. For years, data center managers have pursued "five-nines" availability (meaning 99.999%) as the Holy Grail for service-level availability. When it comes to data durability, however, there had better be a lot more than five nines (look for 10 or 11). Think of it this way: standard RAID mirroring (data stored on two hard drives) provides about "four nines" of durability, meaning each file has a one in 10,000 chance of being lost in a given year. Given that the average user has 10,000 files in storage, that user can expect to lose a file every year. At 11 nines, this same user will lose a single file every 10 million years.
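The arithmetic behind those figures is easy to check. This back-of-envelope sketch assumes each file is lost independently with the stated annual probability, so the expected number of files lost per year is simply the file count times that probability.

```python
# Back-of-envelope check of the durability figures above.
n_files = 10_000                    # files held by an average user

p_four_nines = 1 - 0.9999           # "four nines" durability (RAID mirror)
p_eleven_nines = 1 - 0.99999999999  # "eleven nines" durability

# Expected files lost per year = number of files * per-file loss probability.
losses_per_year_4 = n_files * p_four_nines      # about 1 file per year
losses_per_year_11 = n_files * p_eleven_nines   # about 0.0000001 per year

# Invert to get the expected years between losses at 11 nines.
years_per_loss_11 = 1 / losses_per_year_11      # about 10 million years
```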
With this in mind, expect your cloud vendor to store all files at least in triplicate at each of several geographically dispersed data centers, and expect those copies to be synchronized automatically and immediately. These measures ensure that even if a data center goes down for any reason, or connectivity to it is lost, operation continues normally.
You should expect to retain end-to-end, lifecycle control over where, when and how your data flows and how it is physically stored. When data is created, there should be a customer-controlled system for capturing the content (files, documents, or messages), policies for uploading the content, and centralized control over which users and which devices can access or make changes to the content. During the midlife of the content, controls are needed to capture the edits and changes made by various authorized users. And at the end of the lifecycle, controls are needed to ensure that the content is properly archived or wiped (destroyed).
Make sure your cloud provider can easily enforce the data retention policies you set, so that shared files and folders can be automatically and permanently deleted from user devices when required. Also, look for the ability to remotely wipe any user’s account — including all of the computers and mobile devices they use — in the event that a device is lost or stolen.
You should also receive a detailed plan that defines the course of action in the event that data is in the wrong places, due to misconfiguration, maliciousness or error. Make sure that your prospective vendor has the capability to provide the level of control you expect.
In the cloud model, data is transmitted between and among connected data centers and a diverse array of clients: mobile phones, desktops, laptops, tablets, etc. While the cloud service provider has no control over the security mechanisms put in place by the vendors of these devices, the cloud provider can ensure that no client ever opens a hole in your firewall with any externally accessible port, communicates with any non-authenticated source, or stores cached credential information in an unencrypted format. This will close three of the most common attack vectors.
In addition, it is possible to protect user data on mobile devices by using AES-256 encryption for data during transmission and while stored on the mobile device, and to provide mobile apps that use app-specific PINs in addition to any phone password.
Often the weakest points of any system are user accounts with passwords that are easily guessed, or accounts accidentally left active when they should have been disabled. In fact, according to a recent article in Information Week, the combination of poor passwords and automated attacks means a hacker typically needs just 110 attempts to gain access to an account; at one new account per second, that is a mere 17 minutes to break into 1,000 accounts.
A common way to reduce the risk is to ensure any system that you adopt can leverage your users’ existing accounts that may be in Active Directory or even Google Apps rather than create yet another username and password for users and IT to manage. This integration should work with pre-existing password policies and advanced configurations such as two-factor authentication. It is also important that when access to an account, folder, or file is disabled or removed, the action takes effect immediately rather than taking a day or more.
In the past, IT departments segregated the files and data of various constituencies by putting them on separate physical servers. There were multiple problems with this approach, including “server sprawl,” underutilized resources, administrative complexity, excessive cost, and downtime or even data loss due to single points of failure.
Virtualization technology makes it possible to encapsulate multiple types of data, applications, and content within the same physical server and to distribute copies of those assets quickly and easily among multiple servers. The content itself is electronically isolated or “partitioned” from all other content on the servers. The result is a more secure and more flexible access model that lowers operating costs and simplifies desktop administration and management.
So, when evaluating prospective vendors, be sure to get a detailed description of their use of virtualization and, if you’re not conversant in the technology yourself, consider having it appraised by an expert.
Your cloud provider should be able to provide an audit trail with full change tracking for every account, with previous versions retained, so that you know who is making changes and what those changes are.
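In miniature, the capability to look for amounts to this: every change records who made it and when, and no prior version is ever discarded. The class below is a hypothetical illustration, not any vendor's API.

```python
# Hypothetical sketch of a versioned file with a built-in audit trail:
# each save records author and timestamp, and all versions are retained.
import datetime

class AuditedFile:
    def __init__(self, name: str):
        self.name = name
        self.versions = []  # (version number, author, timestamp, content)

    def save(self, author: str, content: str) -> None:
        stamp = datetime.datetime.now(datetime.timezone.utc)
        self.versions.append((len(self.versions) + 1, author, stamp, content))

    @property
    def current(self) -> str:
        return self.versions[-1][3]

    def history(self):
        # Who changed what, and when; every prior version stays recoverable.
        return [(v, who, when) for v, who, when, _ in self.versions]

f = AuditedFile("budget.xlsx")
f.save("alice", "draft")
f.save("bob", "final")
```

A real service layers access control and tamper-resistance on top, but the shape of the audit question stays the same: who, what, when, with nothing overwritten.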
The elasticity of the cloud is one of its key advantages — but make sure your prospective cloud provider can accommodate the volume of growth you anticipate, as well as unexpected spikes in demand for service, with the level of performance your users demand. Also, be sure you’re not a guinea pig for an untested cloud architecture. Explain your needs to a prospective cloud provider and find out whether other customers with your same profile are currently deployed on the system. A cloud provider with real customers should be able to explain best practices for your needs and have references from other businesses. And don’t assume that only a large, established cloud vendor will be able to meet your requirements; there are many small, up-and-coming providers that can deliver a higher level of service than the big players, with greater scalability, at a comparable price point.