Protecting Data in the Cloud

Today's cloud-driven, always-connected world enables organizations to be highly agile, but it also puts data integrity at risk

The cloud plays an integral role in enabling the agility required to take advantage of new business models and to do so in a very convenient and cost-effective way. However, this also means that more personal information and business data will exist in the cloud and be passed back and forth. Maintaining data integrity is paramount.

Today's approach to security in the cloud may not be sufficient: it doesn't focus on putting controls close to the data, which is now more fluid, and it doesn't distinguish one set of data from another. All data is not created equal and should not be treated in the same manner; a one-size-fits-all model doesn't work.

In this always-connected world, protection measures in the cloud need to focus on what really matters - the type of data, how it is used, and where it goes.

Data Classification
In order to adequately protect data in the cloud, organizations need to start considering how to classify data. One approach is to use a three-tier data protection model that caters to data of different sensitivities and relevance across industries (a minimal code sketch of this model follows the tier descriptions below). This model would include:

Tier 1, Regulated: Data subject to regulation, or data that carries with it proprietary, ethical, or privacy considerations such as personally identifiable information (PII). Unauthorized disclosure of regulated data may have serious adverse effects on an organization's reputation, resources, services, or individuals and requires the most stringent level of control.

Tier 2, Commercial: Industry-related, ecommerce or transactional and intellectual property data whose unauthorized disclosure may have moderately adverse effects on an organization's reputation, resources, services, or individuals. Commercial data requires a moderate level of security.

Tier 3, Collaborative: Collaborative and DevOps-type data that typically is publicly accessible, requires minimal security controls and poses little or no risk to the consuming organization's reputation, resources, or services.
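
To make the tiers concrete, here is a minimal sketch in Python of how classification might be automated. The tier names come from the model above, but the asset attributes and the classification rules are illustrative assumptions, not part of the model itself.

    from dataclasses import dataclass
    from enum import Enum

    class Tier(Enum):
        REGULATED = 1      # PII and regulated data: most stringent controls
        COMMERCIAL = 2     # transactional and IP data: moderate controls
        COLLABORATIVE = 3  # public, DevOps-type data: minimal controls

    @dataclass
    class DataAsset:
        name: str
        contains_pii: bool              # personally identifiable information
        is_intellectual_property: bool
        publicly_accessible: bool

    def classify(asset: DataAsset) -> Tier:
        """Map an asset's contextual attributes to a protection tier."""
        if asset.contains_pii:
            return Tier.REGULATED
        if asset.is_intellectual_property and not asset.publicly_accessible:
            return Tier.COMMERCIAL
        return Tier.COLLABORATIVE

    print(classify(DataAsset("customer_records", True, False, False)))  # Tier.REGULATED

In practice the attribute set would be richer (how the data is accessed, stored, and transmitted), but the principle is the same: classification is a deterministic function of contextual attributes, so it can be applied consistently and audited.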

Using this model, security teams can strategically partner with business users to understand requirements and determine the right approach for their organization. Small to mid-sized organizations, enterprises, and service providers can apply this model to begin classifying their data based on contextual attributes such as how the data will be accessed, stored, and transmitted. Once the data is classified, they can apply protection measures appropriate to each tier, focused on the work streams and transactions that continue to evolve to enable business agility. Given that most of today's data breaches result from user-access issues, security considerations such as identity and access management, authorization, and authentication are critical; a sketch of how such controls might map to the tiers follows.
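
As a hedged illustration, the sketch below denies access unless a user's session satisfies every control a tier demands. The specific control names (mfa, authorized_role) are assumptions for illustration, not prescribed by the model.

    REQUIRED_CONTROLS = {
        "regulated": {"authenticated", "mfa", "authorized_role"},
        "commercial": {"authenticated", "authorized_role"},
        "collaborative": {"authenticated"},
    }

    def may_access(user_controls: set, tier: str) -> bool:
        """Grant access only if the user satisfies every control the tier requires."""
        return REQUIRED_CONTROLS[tier].issubset(user_controls)

    # A password-only session cannot touch Tier 1 (regulated) data:
    print(may_access({"authenticated"}, "regulated"))      # False
    print(may_access({"authenticated"}, "collaborative"))  # True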

The Data Integrity Challenge
Understanding and classifying data is just a first step, albeit an important one. Organizations also need to determine how to ensure data integrity when the perimeter is amorphous and control of the endpoints and the data is diminished by mobility and cloud services.

Business departments are increasingly encouraged to find efficient and innovative ways to generate new business. This requires identifying new applications and ways to support the business anywhere, anytime. Business users often decide to use the cloud before involving IT, since they can get up and running in a fraction of the time, and at a fraction of the cost, of provisioning in house.

With this unprecedented change in operations and infrastructure comes an unprecedented need to ensure data integrity - ultimately working through the life cycle of data that can, at any point, be within the confines of a company, out with a network of partners and suppliers, or floating in a cloud. The challenge in this fractured landscape is that the perimeter is amorphous, but legacy security solutions are not; they were designed for a time when the perimeter was far better defined. The result is that attackers now use various techniques to bypass traditional perimeter-based defenses and compromise data - whether by tampering with it, stealing it, or leaking it. Point-in-time defenses are no longer sufficient.

To effectively protect data wherever it may be, defenses must go beyond simple blocking and detection to include capabilities such as data correlation, continuous data analysis, and retrospective action when data is found to have been corrupted, tampered with, or exfiltrated.
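
One simple way to support retrospective action is to record cryptographic hashes of data at a known-good point in time and re-verify them later; files whose hashes no longer match have been modified since the manifest was written. The sketch below shows the idea, with the file paths and manifest format as assumptions.

    import hashlib, json
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def build_manifest(paths, manifest="manifest.json"):
        """Record a hash for each file at a known-good point in time."""
        Path(manifest).write_text(json.dumps({str(p): sha256_of(Path(p)) for p in paths}))

    def find_tampered(manifest="manifest.json"):
        """Return files whose current hash no longer matches the recorded one."""
        recorded = json.loads(Path(manifest).read_text())
        return [p for p, digest in recorded.items() if sha256_of(Path(p)) != digest]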

A New Approach to Applying Controls
In order to protect the classes of data described earlier - regulated, commercial, and collaborative - security teams need a mix of policy, process, and technology controls. These controls should be applied based on user and location context and according to a security model that is open, integrated, continuous, and pervasive:

  • Open, to provide access to global intelligence and context to detect and remediate breaches and to support new standards for data protection.
  • Integrated, so that solutions automate policy, minimize manual processes, close gaps in security, and support centralized management and control according to data classifications.
  • Continuous, complementing point-in-time solutions with ongoing capabilities that identify new threats to data.
  • Pervasive, delivering protection across the full attack continuum - before, during, and after an attack.

Let's take a closer look at the advantages of applying controls to protect data based on this model.

Openness provides:

  • The opportunity to participate in an open community of users and standards bodies to ensure consistent data classification and standards of policy and process.
  • Easy integration with other layers of security defenses to continue to uphold data protection best practices as IT environments and business requirements change.
  • The ability to access global intelligence with the right context to identify new threats and take immediate action.

Integrated enables:

  • Technology controls that map to data tiers and also track data through different usage contexts and locations to support the fundamental first step of data classification.
  • Identity and access controls, authorization, and authentication that work in unison to map data protection to data classifications.
  • Encryption controls applied according to the data's deemed sensitivity to further strengthen protection, including strong encryption standards (minimum AES-256) and encryption keys retained by data owners (see the encryption sketch after this list).
  • Security solutions and technologies that seamlessly work together to protect data across its entire lifecycle.
  • Centralized policy management, monitoring, and distributed policy enforcement to ensure compliance with regulatory and corporate policies.
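
As a sketch of the encryption bullet above, the following uses AES-256-GCM via the widely used Python cryptography package. Key storage, rotation, and distribution are out of scope here and would fall to the data owner's key-management system; the sample plaintext and associated data are assumptions for illustration.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # data owner generates and retains the key
    aesgcm = AESGCM(key)

    nonce = os.urandom(12)          # must be unique for every encryption under this key
    plaintext = b"tier-1 customer record"
    associated = b"tier=regulated"  # binds the ciphertext to its data classification
    ciphertext = aesgcm.encrypt(nonce, plaintext, associated)

    assert aesgcm.decrypt(nonce, ciphertext, associated) == plaintext

Using the classification tier as authenticated associated data means a ciphertext cannot be silently replayed under a different, weaker classification without the decryption failing.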

Continuous supports:

  • Technologies and services that constantly aggregate and correlate data from across the connected environment with historical patterns and global attack intelligence to maintain real-time contextual information, track data movement, and detect data exfiltration (a toy monitoring sketch follows this list).
  • The ability to leverage insights into emerging new threats, take action (automatically or manually) to stop these threats, and use that intelligence to protect against future data breaches.
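
A toy example of correlating current activity against historical patterns: flag users whose outbound data volume today deviates far from their own baseline. The z-score approach, threshold, and input format are illustrative assumptions, not a production exfiltration detector.

    from statistics import mean, stdev

    def exfiltration_suspects(history, today, z_threshold=3.0):
        """history: {user: [daily outbound bytes]}; today: {user: bytes}."""
        suspects = []
        for user, volumes in history.items():
            if len(volumes) < 2:
                continue  # not enough baseline to judge
            mu, sigma = mean(volumes), stdev(volumes)
            observed = today.get(user, 0)
            if sigma > 0 and (observed - mu) / sigma > z_threshold:
                suspects.append(user)
        return suspects

    history = {"alice": [120, 130, 110, 125], "bob": [90, 95, 100, 92]}
    print(exfiltration_suspects(history, {"alice": 10_000, "bob": 96}))  # ['alice']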

Pervasive translates into:

  • Defenses (including technologies and best practices) that address the full attack continuum - before, during, and after an attack. Before an attack, total, actionable visibility is required to see who is accessing what data, from where, and how, and to correlate that information against emerging threat vectors. During an attack, continuous visibility and control are necessary to analyze and take action in real time to protect data. After an attack, the key is to mitigate the damage, remediate, recover quickly, and prevent similar future data breaches, data tampering, or data corruption.
  • The ability to address all attack vectors - including the network, endpoints, virtual environments, the cloud, email, and the web - to mitigate the risk that any of these communications channels could be used by an attacker to compromise data.

Today's cloud-driven, always-connected world enables organizations to be highly agile, but it also puts data integrity at risk. IT teams need to adapt quickly to this new way of doing business despite having less control of the endpoints and the data. Traditional data protection models fail because they cannot distinguish one set of data from another. By putting in place protection measures based on the type of data, how it is used, and where it goes, backed by a security model that is open, integrated, continuous, and pervasive, organizations can take advantage of the new business opportunities the cloud affords without sacrificing data integrity.

About the Author

Raja Patel is a Senior Director, Cloud Security Product Management, at Cisco, where he is responsible for the portfolio strategy and development of security solutions for Cisco's Security Business. His responsibilities include building solutions and managing operations associated with Cloud, Threat Intelligence, Web and Email Security. Raja has been at Cisco for 13 years, during which he has product-managed a broad portfolio of products within Cisco's Enterprise Networking Business Group, developed and accelerated new consumption and business models such as Enterprise Licensing, and led strategic initiatives to develop more agile business practices across Cisco.

Mr. Patel holds a BS in Aerospace Engineering with a minor in Mathematics from Embry-Riddle Aeronautical University, and an MBA in Global Business Management.


