
Protecting Data in the Cloud

Today's cloud-driven, always-connected world enables organizations to be very agile, but it also puts data integrity at risk

The cloud plays an integral role in enabling the agility required to take advantage of new business models, and it does so in a convenient, cost-effective way. However, this also means that more personal information and business data will exist in the cloud and be passed back and forth. Maintaining data integrity is paramount.

Today's approach to security in the cloud may not be sufficient: it doesn't put controls close to data, which is now more fluid, and it doesn't distinguish one set of data from another. All data is not created equal and should not be treated in the same manner; a one-size-fits-all model doesn't work.

In this always-connected world, protection measures in the cloud need to focus on what really matters - the type of data, how it is used, and where it goes.

Data Classification
To adequately protect data in the cloud, organizations need to start by considering how to classify it. One approach is a three-tier data protection model that caters to data of different sensitivities and relevance across industries. This model would include:

Tier 1, Regulated: Data subject to regulation, or data that carries with it proprietary, ethical, or privacy considerations such as personally identifiable information (PII). Unauthorized disclosure of regulated data may have serious adverse effects on an organization's reputation, resources, services, or individuals and requires the most stringent level of control.

Tier 2, Commercial: Industry-related, e-commerce or transactional, and intellectual property data whose unauthorized disclosure may have moderately adverse effects on an organization's reputation, resources, services, or individuals. Commercial data requires a moderate level of security.

Tier 3, Collaborative: Collaborative and DevOps-type data that is typically publicly accessible, requires minimal security controls, and poses little or no risk to the consuming organization's reputation, resources, or services.

Using this model, security teams can strategically partner with business users to understand requirements and determine the right approach for their organization. Small to mid-sized organizations, enterprises, and service providers can apply this model to begin classifying their data based on contextual attributes such as how the data will be accessed, stored, and transmitted. Once the data is classified, they can then apply appropriate data protection measures focused on protecting work streams and transactions that continue to evolve to enable business agility. Given that most of today's data breaches are a result of user-access issues, security considerations such as Identity and Access Management, Authorization, and Authentication are critical.
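
To make the classification step concrete, here is a minimal sketch of a routine that maps contextual attributes to the three tiers. The attribute names (contains_pii, is_regulated, and so on) are hypothetical placeholders for whatever metadata an organization actually captures about how its data is accessed, stored, and transmitted; this is an illustration of the idea, not a prescribed implementation.

```python
from enum import Enum

class DataTier(Enum):
    REGULATED = 1      # PII or regulated data - most stringent controls
    COMMERCIAL = 2     # transactional / intellectual property - moderate controls
    COLLABORATIVE = 3  # publicly accessible data - minimal controls

def classify(record_attrs: dict) -> DataTier:
    """Map contextual attributes of a record to a protection tier.

    The attribute keys used here are illustrative assumptions, standing in
    for an organization's real classification metadata.
    """
    if record_attrs.get("contains_pii") or record_attrs.get("is_regulated"):
        return DataTier.REGULATED
    if record_attrs.get("is_intellectual_property") or record_attrs.get("is_transactional"):
        return DataTier.COMMERCIAL
    return DataTier.COLLABORATIVE

# Example: a customer record containing PII lands in Tier 1
print(classify({"contains_pii": True}))  # DataTier.REGULATED
```

Once records carry a tier like this, the access, encryption, and monitoring controls discussed below can key off it rather than treating all data uniformly.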

The Data Integrity Challenge
Understanding and classifying data is just a first step, albeit an important one. Organizations also need to determine how to ensure data integrity when the perimeter is amorphous and control of the endpoints and the data is diminished by mobility and cloud services.

Business departments are increasingly encouraged to find efficient and innovative ways to generate new business. This requires identifying new applications and ways to support the business anywhere, anytime. Business users often decide to use the cloud before involving IT, since they can get up and running in a fraction of the time and cost it would take to provision in-house.

With this unprecedented change in operations and infrastructure comes an unprecedented need to ensure data integrity - ultimately working through the life cycle of data that can, at any point, be within the confines of a company, out in a network of partners and suppliers, or floating in a cloud. The challenge in this fractured landscape is that the perimeter is amorphous, but legacy security solutions are not; they were designed for a time when the perimeter was well defined. The result is that attackers now use various techniques to bypass traditional perimeter-based defenses and compromise data - be it by tampering with, stealing, or leaking it. Point-in-time defenses are no longer sufficient.

To effectively protect data wherever it may be, defenses must go beyond simple blocking and detection to include capabilities such as data correlation, continuous data analysis, and retrospective action when data is found to have been corrupted, tampered with, or exfiltrated.
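
One way to picture retrospective action: continuously record which files are observed where, so that when threat intelligence later flags a file as malicious, every place it was seen can be traced and remediated. The sketch below is a toy illustration under that assumption; the data structures and names are hypothetical, not any particular product's API.

```python
from collections import defaultdict
from datetime import datetime, timezone

# Hypothetical ledger: file hash -> list of (host, timestamp) sightings
sightings = defaultdict(list)

def record_file_seen(sha256: str, host: str) -> None:
    """Continuously log every file observed crossing the environment,
    not just the ones blocked at the perimeter."""
    sightings[sha256].append((host, datetime.now(timezone.utc)))

def retrospective_alert(newly_flagged: set) -> list:
    """When new intelligence arrives, look back over past observations to
    find hosts that handled a file before it was known to be malicious."""
    affected = []
    for h in newly_flagged:
        affected.extend((h, host, ts) for host, ts in sightings.get(h, []))
    return affected

# Example: a file is seen on two hosts, then flagged by intelligence later
record_file_seen("abc123", "host-1")
record_file_seen("abc123", "host-7")
print(retrospective_alert({"abc123"}))  # both past sightings surface for remediation
```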

A New Approach to Applying Controls
To protect the classes of data described earlier - regulated, commercial, and collaborative - security teams need a mix of policy, process, and technology controls. These controls should be applied based on user and location context and according to a security model that is open, integrated, continuous, and pervasive (a minimal sketch of context-based control selection follows the list below):

  • Open, to provide access to global intelligence and context to detect and remediate breaches and to support new standards for data protection.
  • Integrated, so that solutions automate policy, minimize manual processes, close gaps in security, and support centralized management and control according to data classifications.
  • Continuous, pairing point-in-time solutions with always-on capabilities to identify new threats to data.
  • Pervasive, delivering protection across the full attack continuum - before, during, and after an attack.
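
As a concrete illustration of applying controls by data tier and context, the sketch below keys a policy table on (tier, location). The tier names reuse the classification model above; the specific control names and context fields are illustrative assumptions, not a prescribed policy.

```python
from dataclasses import dataclass

@dataclass
class AccessContext:
    user_role: str   # e.g., "employee", "partner", "public"
    location: str    # e.g., "corporate", "remote"

# Illustrative policy: controls required per (tier, location) combination.
# Control names here are hypothetical placeholders.
POLICY = {
    ("regulated", "corporate"):  {"mfa", "encrypt_at_rest", "audit_log"},
    ("regulated", "remote"):     {"mfa", "encrypt_at_rest", "encrypt_in_transit",
                                  "audit_log", "dlp_inspection"},
    ("commercial", "corporate"): {"sso", "encrypt_at_rest"},
    ("commercial", "remote"):    {"sso", "encrypt_in_transit"},
    ("collaborative", "corporate"): set(),
    ("collaborative", "remote"):    set(),
}

def required_controls(tier: str, ctx: AccessContext) -> set:
    """Return the controls to enforce for a data tier in a given context.
    Unknown combinations fall back to the most stringent entry."""
    return POLICY.get((tier, ctx.location), POLICY[("regulated", "remote")])

# Example: a remote user touching regulated data triggers the full control set
print(required_controls("regulated", AccessContext("employee", "remote")))
```

The design point is that the policy lives in one centrally managed table keyed by classification and context, rather than being scattered across individual systems.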

Let's take a closer look at the advantages of applying controls to protect data based on this model.

Openness provides:

  • The opportunity to participate in an open community of users and standards bodies to ensure consistent data classification and standards of policy and process.
  • Easy integration with other layers of security defenses to continue to uphold data protection best practices as IT environments and business requirements change.
  • The ability to access global intelligence with the right context to identify new threats and take immediate action.

Integrated enables:

  • Technology controls that map to data tiers and also track data through different usage contexts and locations to support the fundamental first step of data classification.
  • Identity and access controls, authorization, and authentication that work in unison to map data protection to data classifications.
  • Encryption controls applied based on deemed data sensitivity to further strengthen protection, including strong encryption standards (AES-256 at minimum) and encryption keys retained by data owners (see the sketch after this list).
  • Security solutions and technologies that seamlessly work together to protect data across its entire lifecycle.
  • Centralized policy management, monitoring, and distributed policy enforcement to ensure compliance with regulatory and corporate policies.
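
A minimal sketch of the encryption point above, assuming the widely used third-party `cryptography` package: Tier 1 data is encrypted with AES-256-GCM, and the key is generated and held by the data owner rather than the cloud provider. Function names and the example payload are illustrative only.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_tier1(plaintext: bytes, key: bytes) -> tuple:
    """Encrypt regulated (Tier 1) data with AES-256-GCM.

    The 32-byte key is generated and retained by the data owner,
    per the key-ownership point above.
    """
    assert len(key) == 32        # AES-256 requires a 256-bit key
    nonce = os.urandom(12)       # 96-bit nonce, unique per message
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce, ciphertext

def decrypt_tier1(nonce: bytes, ciphertext: bytes, key: bytes) -> bytes:
    return AESGCM(key).decrypt(nonce, ciphertext, None)

# Usage: the data owner generates and keeps the key
owner_key = AESGCM.generate_key(bit_length=256)
nonce, ct = encrypt_tier1(b"customer PII record", owner_key)
assert decrypt_tier1(nonce, ct, owner_key) == b"customer PII record"
```

Because GCM is an authenticated mode, any tampering with the ciphertext causes decryption to fail, which supports the data integrity goal as well as confidentiality.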

Continuous supports:

  • Technologies and services to constantly aggregate and correlate data from across the connected environment with historical patterns and global attack intelligence to maintain real-time contextual information, track data movement, and detect data exfiltration (a toy detection example follows this list).
  • The ability to leverage insights into emerging new threats, take action (automatically or manually) to stop these threats, and use that intelligence to protect against future data breaches.
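
To make the correlation idea concrete, even a simple baseline-versus-current check on outbound volume per user can flag potential exfiltration. This is a toy sketch with hypothetical thresholds; real systems correlate many more signals than egress volume alone.

```python
import statistics

def exfiltration_suspects(history: dict, today: dict, sigma: float = 3.0) -> list:
    """Flag users whose outbound bytes today exceed their historical mean
    by `sigma` standard deviations.

    `history` maps user -> list of past daily egress totals (MB);
    `today` maps user -> today's total. Thresholds are illustrative only.
    """
    suspects = []
    for user, volume in today.items():
        past = history.get(user, [])
        if len(past) < 2:
            continue  # not enough baseline to judge
        mean = statistics.mean(past)
        stdev = statistics.stdev(past) or 1.0  # guard against zero variance
        if volume > mean + sigma * stdev:
            suspects.append(user)
    return suspects

# Example: a user who normally moves ~10 MB/day suddenly moves 500 MB
print(exfiltration_suspects({"alice": [9, 11, 10, 12]}, {"alice": 500.0}))
```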

Pervasive translates into:

  • Defenses (including technologies and best practices) that address the full attack continuum - before, during, and after an attack. Before an attack, total, actionable visibility is required to see who is accessing what data from where and how, and to correlate that information against emerging threat vectors. During an attack, continuous visibility and control to analyze and take action in real time to protect data is necessary. After an attack, the key is to mitigate the damage, remediate, quickly recover, and prevent similar, future data breaches, data tampering, or data corruption activities.
  • The ability to address all attack vectors - including network, endpoints, virtual, the cloud, email and Web - to mitigate risk associated with various communications channels that could be used by an attacker to compromise data.

Today's cloud-driven, always-connected world enables organizations to be very agile, but it also puts data integrity at risk. IT teams need to adapt quickly to this new way of doing business despite having less control of the endpoints and the data. Traditional data protection models fail because they cannot distinguish one set of data from another. By putting in place protection measures based on the type of data, how it is used, and where it goes - backed by a security model that is open, integrated, continuous, and pervasive - organizations can take advantage of the new business opportunities the cloud affords without sacrificing data integrity.

More Stories By Raja Patel

Raja Patel is a Senior Director, Cloud Security Product Management, at Cisco, where he is responsible for portfolio strategy and the development of security solutions for Cisco's Security Business. His responsibilities include building solutions and managing operations associated with Cloud, Threat Intelligence, Web and Email Security. Raja has been at Cisco for 13 years; during this tenure he has managed a broad portfolio of products within Cisco's Enterprise Networking Business Group, developed and accelerated new consumption and business models such as Enterprise Licensing, and led strategic initiatives to develop more agile business practices across Cisco.

Mr. Patel holds a BS in Aerospace Engineering with a minor in Mathematics from Embry-Riddle Aeronautical University, and an MBA in Global Business Management.


