By Elad Yoran
May 19, 2013 05:00 PM EDT
Cloud service providers store data all over the globe, and are constantly moving that data from one datacenter to the next for reasons as wide-ranging as cost considerations and redundancy requirements. Does this mean that the requirements outlined in varying data residency laws and privacy regulations are directly at odds with how cloud computing works?
The question is an especially delicate one when the cloud service provider stores and processes data in a jurisdiction that is perceived to have far less stringent privacy and data protection requirements - or may allow government agencies far broader data subpoena powers. Since the cloud computing model relies on distributed infrastructure to generate cost and flexibility benefits for customers, building a datacenter in each data residency jurisdiction quickly becomes cost-prohibitive. And applying a set of constraints to the movement of data introduces an additional layer of complexity that further erodes the value proposition of cloud computing for customers.
Just as cloud computing represents a novel way of delivering IT computing and functionality, a new model for maintaining ownership and direct control of data in the cloud is increasingly required. This new model requires that the encryption mechanism be maintained externally and independently of the cloud service provider's environment, and that data be encrypted before it is sent to the cloud.
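As a minimal sketch of this model - assuming the third-party Python `cryptography` package, purely for illustration - the data owner generates and retains the key locally, and only ciphertext ever reaches the provider:

```python
from cryptography.fernet import Fernet

# The key is generated and retained by the data owner; it never
# enters the cloud service provider's environment.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt before the data crosses the network boundary; only the
# resulting ciphertext is handed to the cloud provider for storage.
plaintext = b"personally identifiable information"
ciphertext = cipher.encrypt(plaintext)

# The provider holds ciphertext it cannot read; the owner can
# recover the data at any time with the locally held key.
assert cipher.decrypt(ciphertext) == plaintext
```

The essential property is the separation: the provider's infrastructure can replicate the ciphertext across any number of datacenters without the key ever leaving the owner's control.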
The Issues Surrounding Information Security and Data Protection Laws
Over the past 18 months, concerns about the feasibility of enforcing data residency laws and regulations in the cloud have increasingly come to the forefront. Multiple countries, including India, Switzerland, Germany, Australia, South Africa and Canada, have enacted laws restricting corporations from storing data outside their physical borders. Additionally, under the EU's Safe Harbor Principles, companies operating within the European Union are forbidden from sending personally identifiable information (PII) outside the European Economic Area unless the data is guaranteed an equivalent level of protection.
This is partly a result of a broader understanding of cloud computing architecture and processes, but also a consequence of the ambiguity of safeguards for the privacy of cloud data. For example, national security concerns have driven US legislation such as the Foreign Intelligence Surveillance Act (FISA) Amendments Act and the USA PATRIOT Act, which extend the ability of the federal government and law enforcement agencies to subpoena communications and emails stored in the cloud. The concern is now as much about what the privacy laws hold where the data lands as it is about whether data is leaving the jurisdiction at all. Inconsistent approaches to privacy further complicate the picture.
The current response to this challenge is either to forgo the cloud altogether or to require cloud service providers to store data within each jurisdiction. For cloud service providers, this presents a business challenge: delivering flexibility, competitive cost and effective service while altering their delivery and management models to satisfy data residency and privacy requirements. To address the mandates set forth by these laws, a cloud provider would ostensibly have to build datacenters in each jurisdiction, resulting in significant cost and overhead that would erode the overall gain of cloud storage.
Cloud Encryption and Cloud Data Residency Regulations
The interaction between the evolution of information security and the definition of data protection mandates by legislative bodies or industry groups is a dynamic one. At the heart of the concern is how organizations can continue to maintain ownership and control of data to protect personal information, even when the information resides with a third-party service that relies on a distributed infrastructure in order to deliver resiliency, availability and flexibility to customers.
By way of illustration, compliance requirements and data breach laws have been regularly updated as new information security alternatives have been developed. In the US, more than 40 states currently have breach notification laws mandating that if a company is aware of lost or stolen personally identifiable information, it must directly notify the affected consumers. When these laws were initially enacted (starting with the State of California in 2002), they generally required notification regardless of the circumstances. However, the laws have been gradually amended, and more than 25 states have now enacted an exemption for encrypted personal data. In other words, where lost or stolen data is encrypted, the company is no longer required by law to notify the consumer.
The underlying argument for differentiating between unencrypted data and encrypted data in the context of breach notification is that in the instance where data is encrypted, the attacker has gained access to useless "gibberish" if they do not hold the encryption keys.
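The "gibberish" argument can be illustrated with a short sketch, again assuming the Python `cryptography` package: an attacker who obtains the ciphertext but holds a different key cannot decrypt it.

```python
from cryptography.fernet import Fernet, InvalidToken

# The data owner encrypts with a key that never leaves its control.
owner_key = Fernet.generate_key()
ciphertext = Fernet(owner_key).encrypt(b"confidential clinical results")

# An attacker who steals the ciphertext but not the key sees only an
# opaque token; decrypting with any other key is rejected outright.
attacker_key = Fernet.generate_key()
try:
    Fernet(attacker_key).decrypt(ciphertext)
except InvalidToken:
    print("decryption rejected without the owner's key")
```

Because the scheme authenticates as well as encrypts, a wrong key does not even yield scrambled output - the attempt simply fails.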
However, cloud computing is an evolving paradigm where both the obligations of the data owner and acceptable forms of data protection are still in the process of initial definition. As the technology gains popularity and becomes a well-established method of data storage and processing, the laws pertaining to cloud computing will also continue to evolve in the same way that data breach laws have.
Regulations are now also moving towards excluding encrypted data from data residency legislation. The State of Nevada, for example, recognizes encryption as a means of securing data outside of geographic boundaries: "A data collector doing business in this State shall not: (a) Transfer any personal information through an electronic, non-voice transmission other than a facsimile to a person outside of the secure system of the data collector unless the data collector uses encryption to ensure the security of electronic transmission; or (b) Move any data storage device containing personal information beyond the logical or physical controls of the data collector or its data storage contractor unless the data collector uses encryption to ensure the security of the information."
While data residency regulations can be narrowly defined, in many jurisdictions laws can be interpreted as not applying to data that has been encrypted before being sent to the cloud. Dr. Thilo Weichert, head of the Independent Center for Privacy Protection for the German state of Schleswig-Holstein, argues in his Cloud Computing & Data Privacy paper that if data is anonymized or sufficiently aliased to the extent that the identity of individuals is indecipherable, then data residency law does not apply. Encryption takes anonymizing and aliasing a step further, where the data is completely indecipherable. Similarly, under the European Union's Data Protection Directive (EU DPD), as long as the data is encrypted, where it resides should not present a legal obstacle.
Likewise, under Canadian privacy law, both federal bodies and commercial organizations domiciled within Canadian borders are responsible for the privacy and protection of personal information in their custody. This requirement applies regardless of where the data resides. While significant concerns have been articulated with regards to the probability of disclosure to law enforcement agencies for data that resides within US datacenters, the requirements pertain directly to the safeguards in place to maintain control.
Ann Cavoukian, Information and Privacy Commissioner for the Province of Ontario, addressed compliance with the Freedom of Information and Protection of Privacy Act in the case of personal information collected by the Ministry of Natural Resources and stored in the US. In her formal response, she noted that "to the extent that the data owner retains the encryption keys, the location of the encrypted data is a secondary issue."
In other words, if the encrypted data leaves the jurisdiction, but the keys remain under the data owner's direct control, the level of protection can be sufficient in terms of data residency requirements.
However, this model also implies that the data encryption scheme is maintained externally and independently of the cloud service provider's environment, and that data is encrypted before it is sent to the cloud.
Persistent Encryption and Data Residency
The most effective method of addressing the jurisdictional and residency requirements of data processed by third-party services is control of the encryption keys combined with the application of persistent encryption. With persistent encryption, data encrypted at the boundary of the network remains encrypted even when processed and stored within a cloud service provider's environment. As a result, data is never decrypted in a third party's environment, and the ability to access usable data remains solely with the organization that holds the encryption key.
Businesses can therefore comply with jurisdictional and residency requirements by keeping the encryption keys within the jurisdiction, regardless of the physical location of the data. Laws relating to data residency are undergoing a historic transition from the old paradigm, in which the physical location of the data mattered, to a new one in which only the location of the encryption keys matters.
With the application of persistent encryption, control of the keys in combination with encryption across the data lifecycle - in transit, at rest and in use - provides the foundation for satisfying requirements for control and adequate safeguards for the privacy of personal information. Although the encrypted data may leave the physical borders of a specific country, it is always fully encrypted while outside the defined jurisdiction. As the keys are retained within the business's legal jurisdiction, the data cannot be accessed or read until it returns within the physical borders in which the organization resides.
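This key-location model can be sketched as follows - a simplified illustration assuming the Python `cryptography` package, where the two dictionaries are hypothetical stand-ins for an in-jurisdiction key store and an out-of-jurisdiction cloud store:

```python
from cryptography.fernet import Fernet

# Hypothetical stand-ins: the key store lives inside the legal
# jurisdiction; the cloud store may be replicated anywhere on earth.
in_jurisdiction_key_store: dict[str, bytes] = {}
cloud_store: dict[str, bytes] = {}

def store(record_id: str, data: bytes) -> None:
    """Encrypt locally, keep the key at home, ship only ciphertext."""
    key = Fernet.generate_key()
    in_jurisdiction_key_store[record_id] = key
    cloud_store[record_id] = Fernet(key).encrypt(data)

def retrieve(record_id: str) -> bytes:
    """The ciphertext is only readable where the key resides."""
    key = in_jurisdiction_key_store[record_id]
    return Fernet(key).decrypt(cloud_store[record_id])

store("record-1", b"personal information")
assert retrieve("record-1") == b"personal information"
```

The point of the separation is exactly the paradigm shift described above: what `cloud_store` holds can cross any border, because the material that makes it readable never does.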
Global Pharmaceutical Company Case Study: Cloud Data Ownership and Control Concerns
The following example depicts a privately held multinational pharmaceutical company that engages in research, development, production, and marketing of prescription and over-the-counter medicines and healthcare products. The company has thousands of employees across the globe, as well as multiple subsidiaries and entities.
The company's IT procurement and deployment approach follows a decentralized model in which each subsidiary hosts its own servers and datacenters. Three functional pillars are maintained within its technology and IT services division: Technology Planning; Enterprise Architecture and Data Services; and Production Services. The divisions are staffed by IT engineers, with managed services providing support for thousands of clients across a multitude of sites. Existing infrastructure includes hardware, software, services and virtualization from multiple top vendors, including Microsoft, VCE, Dell, Oracle, EMC and VMware.
The pharmaceutical company had adopted several cloud-based services for applications that do not process or store critical or regulated business information, such as Web conferencing, spam filtering, compliance training and tracking, and travel and expense management. It was seeking to expand its cloud computing usage to business-critical applications by moving low-value servers to cloud providers, as well as moving commodity applications such as email to the cloud.
Concerns about the loss of control and ownership of corporate data, however, stood in the way of realizing the increased efficiencies and operational benefits possible through broader adoption of cloud-based services. These concerns were related to:
- Compliance with international data residency requirements that preclude data leaving a jurisdiction in the clear
- Compliance with regulations governing the security, privacy and confidentiality of healthcare data
- Safeguards to limit exposure of its intellectual property when it is stored and processed in the cloud
- Lack of visibility into service provider responses to information subpoenas that can result in a breach of confidentiality or loss of data
Addressing Residency and Unauthorized Disclosure
While the cloud service provider could attest to the security of its environment based on a framework like the Cloud Security Alliance's Cloud Controls Matrix, the global pharmaceutical company required an independent mechanism to protect its intellectual property while resident in the cloud. A common obstacle to cloud migration in the pharmaceutical and healthcare industry is confidentiality, particularly sensitivity to a service provider's compliance with government subpoenas, since these companies maintain sensitive information related to research, clinical study results, and personal medical histories. It is therefore critical that sensitive information remain under the company's control, without any forfeiture of attorney-client privilege.
In a typical scenario, if a company stores sensitive data in the cloud and the cloud service provider is served with a subpoena or other government request, the provider must comply and disclose the company's data to the requesting government body. The provider may notify the company after the fact or, in cases of blind subpoenas, not at all.
The pharmaceutical company decided to use persistent encryption technology specifically to address the migration of its email infrastructure to the cloud. Deployed as an on-premises gateway, the technology enabled the company to satisfy the jurisdictional and residency requirements of email data hosted in the cloud: the company maintains control of the encryption keys, and business data is encrypted as it passes through the gateway's proxy at the boundary of the network, remaining encrypted even when processed by and stored within the cloud service provider's environment.
The persistent encryption technology ensures that data is never decrypted in a third party's environment, and the ability to access usable data remains solely with the organization that holds the encryption key. The company is therefore able to comply with jurisdictional and residency requirements by keeping the encryption keys within the jurisdiction regardless of the physical location of the data, while retaining complete ownership and control of that data if faced with a subpoena.
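The gateway pattern described above might be sketched as follows - a simplified illustration assuming the Python `cryptography` package, where the hypothetical `outbound` and `inbound` functions stand in for the proxy hooks at the network boundary:

```python
from cryptography.fernet import Fernet

# Key material lives only in the on-premises gateway, inside the
# organization's legal jurisdiction.
GATEWAY_KEY = Fernet.generate_key()
_cipher = Fernet(GATEWAY_KEY)

def outbound(message_body: str) -> bytes:
    """Encrypt a message body as it leaves the network boundary."""
    return _cipher.encrypt(message_body.encode("utf-8"))

def inbound(stored_ciphertext: bytes) -> str:
    """Decrypt a message body returning from the cloud provider."""
    return _cipher.decrypt(stored_ciphertext).decode("utf-8")

# The cloud email service only ever stores the opaque token, so a
# subpoena served on the provider yields ciphertext, not content.
token = outbound("Q3 trial results attached.")
assert inbound(token) == "Q3 trial results attached."
```

In a real deployment the gateway would sit in the mail flow as a transparent proxy; the sketch only shows the encrypt-on-egress, decrypt-on-ingress property that keeps the provider's copy unreadable.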