By Elad Yoran
May 19, 2013 05:00 PM EDT
Cloud service providers store data all over the globe, and are constantly moving that data from one datacenter to the next for reasons as wide-ranging as cost considerations and redundancy requirements. Does this mean that the requirements outlined in varying data residency laws and privacy regulations are directly at odds with how cloud computing works?
The question is an especially delicate one when the cloud service provider stores and processes data in a jurisdiction that is perceived to have far less stringent privacy and data protection requirements, or that grants government agencies far broader data subpoena powers. Because the cloud computing model relies on distributed infrastructure to generate cost and flexibility benefits for customers, building a datacenter in each data residency jurisdiction quickly becomes cost-prohibitive. Applying constraints to the movement of data introduces an additional layer of complexity that further erodes the value proposition of cloud computing for customers.
Just as cloud computing represents a novel way of delivering IT infrastructure and functionality, a new model for maintaining ownership and direct control of data in the cloud is increasingly required. This new model requires that the encryption mechanism be maintained externally and independently of the cloud service provider's environment, and that data be encrypted before it is sent to the cloud.
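As a concrete illustration of that pattern (a minimal sketch, not a description of any particular product), the following Python example uses the widely available cryptography package: the key is generated and kept entirely on the customer's side, and only ciphertext is handed to the provider. The upload_to_cloud function is a hypothetical placeholder, not a real provider API.

```python
# Minimal sketch of the "encrypt before upload" model.
# Assumes the third-party `cryptography` package is installed;
# upload_to_cloud() is a hypothetical stand-in for a cloud storage call.
from cryptography.fernet import Fernet


def upload_to_cloud(blob: bytes) -> None:
    """Hypothetical placeholder for a provider upload call."""
    print(f"uploading {len(blob)} bytes of ciphertext")


# The key is generated and stored entirely within the customer's environment;
# it is never transmitted to the cloud service provider.
local_key = Fernet.generate_key()
cipher = Fernet(local_key)

record = b"customer record: personally identifiable information"
ciphertext = cipher.encrypt(record)   # encrypted before leaving the network
upload_to_cloud(ciphertext)           # the provider only ever sees ciphertext

# Decryption is only possible where the key resides.
assert cipher.decrypt(ciphertext) == record
```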
The Issues Surrounding Information Security and Data Protection Laws
Over the past 18 months, concerns about the feasibility of enforcing data residency laws and regulations in the cloud have increasingly come to the forefront. Multiple countries, including India, Switzerland, Germany, Australia, South Africa and Canada, have enacted laws restricting corporations from storing data outside their physical borders. Additionally, under the EU Data Protection Directive and the associated Safe Harbor framework, companies operating within the European Union may not send personally identifiable information (PII) outside the European Economic Area unless it is guaranteed that the data will receive equivalent levels of protection.
This is partly the result of a broader understanding of cloud computing architecture and processes, but also of the ambiguity of safeguards for the privacy of cloud data. For example, national security concerns have driven US legislation such as the Foreign Intelligence Surveillance Act (FISA) Amendments Act and the USA PATRIOT Act, which extend the ability of the federal government and law enforcement agencies to subpoena communications and emails stored in the cloud. The concern is now as much about whether data leaves the jurisdiction as about what privacy protections apply where the data lands. Inconsistent approaches to privacy further complicate the picture.
The current response to this challenge is either not to move to the cloud or to require cloud service providers to store data within each jurisdiction. For cloud service providers, this presents a business challenge: delivering flexibility, competitive cost and effective service while altering their delivery and management models to satisfy data residency and privacy requirements. To address the mandates set forth by these laws, a cloud provider would ostensibly have to build datacenters in each jurisdiction, incurring significant cost and overhead that would erode the overall benefit of cloud storage.
Cloud Encryption and Cloud Data Residency Regulations
The interaction between the evolution of information security and the definition of data protection mandates by legislative bodies or industry groups is a dynamic one. At the heart of the concern is how organizations can continue to maintain ownership and control of data to protect personal information, even when the information resides with a third-party service that relies on a distributed infrastructure in order to deliver resiliency, availability and flexibility to customers.
By way of illustration, compliance requirements and data breach laws have been regularly updated as new information security alternatives have been developed. In the US, more than 40 states currently have breach notification laws mandating that a company aware of lost or stolen personally identifiable information notify the affected consumers directly. When these laws were initially enacted (starting with the State of California in 2002), they generally stated that the company was required to notify the consumer regardless of the circumstances. The laws have been gradually amended, however, and more than 25 states have now enacted an exemption for encrypted personal data. In other words, where lost or stolen data is encrypted, the company is no longer required under law to notify the consumer.
The underlying argument for differentiating between unencrypted and encrypted data in the context of breach notification is that when the data is encrypted, an attacker who does not hold the encryption keys has gained access only to useless "gibberish."
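A short, hedged illustration of that argument in Python: with a symmetric scheme such as Fernet (from the cryptography package), a party holding only the ciphertext and some other key cannot recover anything; the decrypt call fails outright.

```python
# Illustration: ciphertext without the correct key is unusable "gibberish".
# Assumes the third-party `cryptography` package is installed.
from cryptography.fernet import Fernet, InvalidToken

owner_key = Fernet.generate_key()
ciphertext = Fernet(owner_key).encrypt(b"personally identifiable information")

# An attacker (or a subpoenaed provider) holding only the ciphertext and
# a different key cannot decrypt it.
attacker_key = Fernet.generate_key()
try:
    Fernet(attacker_key).decrypt(ciphertext)
except InvalidToken:
    print("decryption failed: ciphertext is unreadable without the owner's key")
```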
However, cloud computing is an evolving paradigm where both the obligations of the data owner and acceptable forms of data protection are still in the process of initial definition. As the technology gains popularity and becomes a well-established method of data storage and processing, the laws pertaining to cloud computing will also continue to evolve in the same way that data breach laws have.
For example, regulations are also now moving towards excluding encrypted data from data residency legislation. Encryption is recognized in the State of Nevada as a means of securing data outside of geographic boundaries: "A data collector doing business in this State shall not: (a) Transfer any personal information through an electronic, non-voice transmission other than a facsimile to a person outside of the secure system of the data collector unless the data collector uses encryption to ensure the security of electronic transmission; or (b) Move any data storage device containing personal information beyond the logical or physical controls of the data collector or its data storage contractor unless the data collector uses encryption to ensure the security of the information."
While data residency regulations can be narrowly defined, in many jurisdictions laws can be interpreted as not applying to data that has been encrypted before being sent to the cloud. Dr. Thilo Weichert, head of the Independent Center for Privacy Protection for the German state of Schleswig-Holstein, argues in his Cloud Computing & Data Privacy paper that if data is anonymized or sufficiently aliased that the identity of individuals is indecipherable, then data residency law does not apply. Encryption takes anonymization and aliasing a step further, rendering the data completely indecipherable. Similarly, under the European Union's Data Protection Directive (EU DPD), as long as the data is encrypted, where it resides should not present a legal obstacle.
Likewise, under Canadian privacy law, both federal bodies and commercial organizations domiciled within Canadian borders are responsible for the privacy and protection of personal information in their custody. This requirement applies regardless of where the data resides. While significant concerns have been articulated with regards to the probability of disclosure to law enforcement agencies for data that resides within US datacenters, the requirements pertain directly to the safeguards in place to maintain control.
Ann Cavoukian, Information and Privacy Commissioner for the Province of Ontario, addressed a question under the Freedom of Information and Protection of Privacy Act concerning the privacy and security of personal information collected by the Ministry of Natural Resources and stored in the US. In her formal response she noted that "to the extent that the data owner retains the encryption keys, the location of the encrypted data is a secondary issue."
In other words, if the encrypted data leaves the jurisdiction, but the keys remain under the data owner's direct control, the level of protection can be sufficient in terms of data residency requirements.
However, this model also implies that the data encryption scheme is maintained externally and independently of the cloud service provider's environment, and that data is encrypted before it is sent to the cloud.
Persistent Encryption and Data Residency
The most effective method to address the jurisdictional and residency requirements of data processed by third-party services is control of encryption keys combined with the application of persistent encryption. With persistent encryption, data that is encrypted at the boundary of the network remains encrypted even when processed and stored within a cloud service provider environment. As a result, data is never decrypted while in a third party's environment, and the ability to access usable data remains solely with the organization that holds the encryption key.
Therefore, businesses can comply with jurisdictional and residency requirements by virtue of keeping the encryption keys within the jurisdiction regardless of the actual physical location of the data. Laws relating to data residency are now undergoing a historic transition from the old paradigm where it mattered where the data was physically located to the new paradigm where it only matters where the encryption keys are located.
With the application of persistent encryption, control of the keys in combination with encryption across the data lifecycle - in transit, at rest and in use - provides the foundation to satisfy requirements for control and adequate safeguards for the privacy of personal information. Although the encrypted data may leave the physical borders of a specific country, the data is always fully encrypted while outside of the defined jurisdiction. Because the keys are retained within the business's legal jurisdiction, the data cannot be accessed or read until it returns to the physical borders in which the organization resides.
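A simplified sketch of that model, again purely hypothetical rather than a description of any particular product: a small wrapper holds the key inside the organization's jurisdiction, encrypts every value before it crosses the network boundary, and decrypts only when data comes back.

```python
# Simplified, hypothetical sketch of the "keys stay home" model: values are
# encrypted at the network boundary and decrypted only on their return.
from cryptography.fernet import Fernet


class BoundaryEncryptor:
    """Holds the encryption key inside the organization's own jurisdiction."""

    def __init__(self) -> None:
        self._key = Fernet.generate_key()   # never leaves the premises
        self._cipher = Fernet(self._key)

    def outbound(self, plaintext: bytes) -> bytes:
        """Called as data crosses the boundary toward the cloud."""
        return self._cipher.encrypt(plaintext)

    def inbound(self, ciphertext: bytes) -> bytes:
        """Called only when data returns to the organization."""
        return self._cipher.decrypt(ciphertext)


boundary = BoundaryEncryptor()
stored_in_cloud = boundary.outbound(b"customer record")   # ciphertext only
readable_again = boundary.inbound(stored_in_cloud)        # back in jurisdiction
```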
Global Pharmaceutical Company Case Study: Cloud Data Ownership and Control Concerns
The following example depicts a privately held multinational pharmaceutical company that engages in research, development, production, and marketing of prescription and over-the-counter medicines and healthcare products. The company has thousands of employees across the globe, as well as multiple subsidiaries and entities.
The company's IT procurement and deployment approach follows a decentralized model in which each subsidiary or entity hosts its own servers and datacenters. Three functional pillars are maintained within its technology and IT services division: Technology Planning; Enterprise Architecture and Data Services; and Production Services. The divisions are staffed by IT engineers, with managed services providing support for thousands of clients across a multitude of sites. Existing infrastructure includes hardware, software, services, and virtualization from multiple top vendors, including Microsoft, VCE, Dell, Oracle, EMC and VMware.
The pharmaceutical company had adopted several cloud-based services for applications that do not process or store critical or regulated business information, such as Web conferencing, spam filtering, compliance training and tracking, and travel and expense management. It was seeking to expand its cloud computing usage to business-critical applications by moving low-value servers to cloud providers, as well as moving commodity applications such as email to the cloud.
Concerns about the loss of control and ownership of corporate data, however, stood in the way of realizing the increased efficiencies and operational benefits possible through broader adoption of cloud-based services. These concerns were related to:
- Compliance with international data residency requirements that preclude data leaving a jurisdiction in the clear
- Compliance with regulations governing the security, privacy and confidentiality of healthcare data
- Safeguards to limit exposure of its intellectual property when it is stored and processed in the cloud
- Lack of visibility into service provider responses to information subpoenas that can result in a breach of confidentiality or loss of data
Addressing Residency and Unauthorized Disclosure
While the cloud service provider could attest to the security of the environment based on a framework like the Cloud Security Alliance's Cloud Controls Matrix, the global pharmaceutical company required an independent mechanism to protect its intellectual property while resident in the cloud. A common challenge to cloud migration in the pharmaceutical and healthcare industry is confidentiality, and in particular sensitivity to a service provider's compliance with government subpoenas, since pharmaceutical and healthcare companies maintain sensitive information related to research, clinical study results, and personal medical histories. It is therefore critical that sensitive information remain under the company's control, without any forfeiture of attorney-client privilege.
In a typical scenario, if a company stores sensitive data in the cloud and the cloud service provider is faced with a subpoena or other government request, the provider must comply and disclose the company's data to the requesting government body. The provider may notify the company after the fact, or in cases of blind subpoenas, not at all.
The pharmaceutical company decided to use persistent encryption technology to address the migration of its email infrastructure to the cloud. Deployed as an on-premises gateway, the technology enabled the company to address the jurisdictional and residency requirements of email data hosted in the cloud: the company maintains control of the encryption keys, and business data is encrypted as it passes through the gateway's proxy at the boundary of the network, remaining encrypted even when processed by and stored within a cloud service provider environment.
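A rough sketch of how such a gateway might be wired, purely illustrative and not a description of the vendor's actual product: an on-premises proxy intercepts outbound messages, encrypts the sensitive fields with a locally held key, and forwards only ciphertext to the hosted email service. The forward_to_cloud_mail function is a hypothetical placeholder.

```python
# Illustrative sketch of an on-premises encryption gateway for email:
# sensitive fields are encrypted with a locally held key before the message
# is forwarded to the hosted (cloud) mail service.
from cryptography.fernet import Fernet

gateway_key = Fernet.generate_key()   # retained on premises
gateway = Fernet(gateway_key)


def forward_to_cloud_mail(message: dict) -> None:
    """Hypothetical stand-in for handing the message to the cloud provider."""
    print("forwarding:", {field: value[:16] for field, value in message.items()})


def outbound_proxy(message: dict) -> None:
    """Encrypt sensitive fields at the network boundary, then forward."""
    protected = {
        field: gateway.encrypt(value)
        for field, value in message.items()
        if field in ("subject", "body")
    }
    forward_to_cloud_mail({**message, **protected})


outbound_proxy({"to": b"partner@example.com",
                "subject": b"Clinical study results",
                "body": b"Confidential findings..."})
```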
The persistent encryption technology ensures that data is never decrypted while in a third party's environment, and the ability to access usable data remains solely with the organization that holds the encryption key. The company is therefore able to comply with jurisdictional and residency requirements by keeping the encryption keys within the jurisdiction, regardless of the actual physical location of the data, and it retains complete ownership and control of that data if faced with a subpoena.