
Toward a More Confident Cloud Security Strategy

Confidence in cloud encryption depends on understanding where data needs protection

The cloud has hit the mainstream. Businesses in the United States currently spend more than $13 billion on cloud computing and managed hosting services, and Gartner projects that by 2015, end-user spending on cloud services could exceed $180 billion worldwide. It is estimated that 50 percent of organizations will require employees to use their own devices by 2017, a shift that will depend on shared cloud storage. All of this data requires encryption.

Organizational deployment of encryption has increased significantly in recent years. Its use spans everything from encrypting data in databases and file systems, in storage networks, and on back-up tapes, to protecting it in transit over public and internal networks. Although it might seem that we are moving in the right direction when it comes to enterprise data protection, there's a real risk of creating fragmentation and inconsistency - referred to as encryption sprawl - as organizations deploy diverse technologies in different places to secure different types of data. Adding fuel to the fire, the cloud poses its own unique threats and challenges. With an undeniable value proposition, it seems clear that the cloud is inevitable and that protecting data within it will be a top priority.

The 2014 Encryption in the Cloud report reveals that more than 50 percent of businesses surveyed have sent confidential or sensitive data to the cloud. Only 11 percent of respondents say that their organization has no plans to use the cloud for sensitive operations, down from 19 percent just two years ago. It is heartening to see that use of encryption to protect that sensitive data in the cloud is also increasing, but it's disturbing that over half of the respondents who store sensitive data in the cloud report that their data is "cleartext" and therefore readable by anyone who can access it.

Cloud Confidence Through Key Management
Cloud usage may be ubiquitous, but opinions on securing data in it are not unanimous. Viewpoints abound when it comes to deciding where and how to apply encryption in the cloud. The report shows an almost equal split between those who encrypt data before it is sent to the cloud and those who choose to apply encryption directly within the cloud. Regardless of approach, key management remains a pain point, as businesses negotiate the balance of trust and control between their own organization and the cloud provider.
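To make the "encrypt before it is sent" option concrete, here is a minimal sketch of client-side encryption in which the key never leaves the customer and only ciphertext reaches the provider. This is a toy illustration, not the approach of any particular vendor: a real system would use a vetted AEAD cipher such as AES-GCM, but an HMAC-SHA256 counter-mode keystream stands in here so the example stays standard-library only.

```python
# Toy client-side encryption: the key stays with the customer; the cloud
# provider only ever receives ciphertext. NOT production crypto.
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from key+nonce via HMAC-SHA256 in counter mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Prepend a fresh random nonce; XOR plaintext with the derived keystream."""
    nonce = secrets.token_bytes(16)
    return nonce + bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce and XOR again to recover the plaintext."""
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

key = secrets.token_bytes(32)            # generated and held on-premises
record = b"customer-id=4711;status=sensitive"
uploaded = encrypt(key, record)          # this is all the provider stores
assert decrypt(key, uploaded) == record  # only the key holder can read it
```

The design point the article is making falls out of the last line: whoever holds `key` decides who can read `record`, regardless of where the ciphertext physically lives.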

In fact, key management is foundational to an effective encryption strategy. Although many regard encryption itself as being black and white - data is either encrypted or not - the reality is that there is such a thing as good or bad encryption. Much of the variance comes down to implementation and key management - a point that became crystal clear with the recent "Heartbleed" vulnerability in OpenSSL. With this in mind, we were pleased to see that 34 percent of respondents report that their own organization is in control of encryption keys when data is encrypted in the cloud. Only 18 percent of respondents report that the cloud provider has full control over keys.

Letting the cloud provider hold the reins is a dicey proposition. If the provider holds the encryption keys, how do you know they're safe? If someone shows up with a lawsuit or subpoena, will the cloud provider release these keys without your knowledge? From a criminal's perspective, stealing keys is far more interesting than stealing data. Stealing data is the modern equivalent of stealing money, yet stealing keys is like stealing the machine that makes the money - an attack that keeps on giving, or to be more accurate, an attack that keeps on taking!
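The risk described above is why envelope encryption with a customer-held key-encryption key (KEK) is a common pattern: the provider stores the wrapped data-encryption key (DEK) alongside the ciphertext, but without the KEK neither is readable. The sketch below is illustrative only (a toy XOR-with-PRF cipher in place of real AES key wrap), and the variable names are our own, not from the article.

```python
# Envelope encryption sketch: customer keeps the KEK; provider stores only
# (wrapped_dek, ciphertext). Toy PRF-XOR cipher for illustration only.
import hashlib
import hmac
import secrets

def prf_xor(key: bytes, label: bytes, data: bytes) -> bytes:
    """XOR data with an HMAC-SHA256 keystream; applying it twice inverts it."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hmac.new(key, label + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

kek = secrets.token_bytes(32)   # key-encryption key: never leaves the customer
dek = secrets.token_bytes(32)   # per-object data-encryption key

plaintext = b"quarterly revenue forecast"
ciphertext = prf_xor(dek, b"data", plaintext)   # data encrypted under the DEK
wrapped_dek = prf_xor(kek, b"wrap", dek)        # DEK wrapped under the KEK

# The provider holds (wrapped_dek, ciphertext) but never the KEK, so a
# subpoena or breach at the provider yields nothing readable. Only the
# KEK holder can unwrap the DEK and decrypt:
recovered_dek = prf_xor(kek, b"wrap", wrapped_dek)
assert prf_xor(recovered_dek, b"data", ciphertext) == plaintext
```

This is also why "stealing keys" is the attack that keeps on giving: a stolen KEK unwraps every DEK it ever protected, which is the machine-that-makes-the-money asymmetry the article describes.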

As demand for cloud services continues to rise, security threats to data stored in the cloud will rise as well. Confidence in cloud encryption depends on understanding where data needs protection, what the consequences are of it being compromised, and what level of protection is required. Best practices dictate a cloud encryption strategy that protects critical data while maintaining control of keys.

More Stories By Richard Moulds

Richard Moulds is VP of product strategy at Thales e-Security. Previously he was nCipher's vice president of marketing. He has a bachelor's degree in electrical engineering from Birmingham University and an MBA from Warwick University in the UK.


