Before the Breach: Cloud Breach Response Best Practices

What are the technical steps you need to take to safeguard against, or respond to, a breach?

A breach is one of the most difficult and damaging events that can occur in any business’ infrastructure. And breaches tend to do their worst damage when proper planning hasn’t gone into an infrastructure contingency plan.

There are several areas of consideration that need to be fully planned for before IT strategy and data objectives can be balanced against a potential breach: whether technical, HR, or compliance, having a response plan for each area in place before any problem ever arises is essential.

In this post we will explore the technical considerations that go into breach planning. Check back for our posts on HR responses and compliance/legal responses next week.

So where do you start in planning your cloud breach response?

Know where the data lives

Understanding what your data is, where it resides in your systems, and how it flows is of the utmost importance when beginning your technical planning for breach protection. It’s surprising that, given the importance of data, many organizations don’t have a complete handle on how their data flows through the various levels of the application, especially in situations where it might be exposed. These are things you have to know when you’re responding to a breach and trying to drill down into where things could have been exposed that shouldn’t have been. You don’t want to be sorting that out after the fact.
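What does “knowing where the data lives” look like in practice? Here is a minimal sketch of a machine-readable data inventory: every system name, path, and classification below is hypothetical, but the shape (data class, where it’s stored, how it flows) is the information you need on hand before a breach, not after.

```python
# Hypothetical sketch of a data inventory. All system names, paths,
# and classifications are illustrative, not from any real environment.
DATA_INVENTORY = {
    "customer_pii": {
        "stores": ["prod-db/users", "backup-store/users"],
        "flows": ["web-app -> prod-db", "prod-db -> nightly-backup"],
        "classification": "restricted",
    },
    "payment_tokens": {
        "stores": ["vault/payments"],
        "flows": ["checkout -> payment-gateway"],
        "classification": "pci",
    },
}

def systems_touching(classification: str) -> set:
    """List every system that stores or carries data of a given
    classification -- the first scoping question in a breach."""
    systems = set()
    for record in DATA_INVENTORY.values():
        if record["classification"] == classification:
            # First path segment names the storing system.
            systems.update(s.split("/")[0] for s in record["stores"])
            # Both endpoints of each flow could have been exposed.
            for flow in record["flows"]:
                systems.update(part.strip() for part in flow.split("->"))
    return systems
```

With an inventory like this, “which systems could have exposed PCI data?” becomes a one-line query instead of a week of interviews.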

Logging for success

It’s important to make sure in advance that you have all the right reporting, management, and logging protocols in place, not just your regular server logs. Your application logs need to be centralized somewhere easily accessible, where it’s easy to correlate what happened against the server logs, firewall logs, and everything else at the same time. Depending on where things are stored and at what level information becomes accessible to an attacker, businesses have to think very carefully about what they log in the application. They should log specific user access. But you don’t want to log anything that gets displayed to users, because you don’t want ePHI to end up in the logs (which does happen). One of the things you have to check for in PCI compliant hosting, for instance, is anything that could actually be a credit card number.
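That last point, keeping card numbers out of the logs, can be enforced in code rather than by policy alone. Here is one possible sketch using Python’s standard logging module: a filter that masks anything resembling a card number (confirmed with a Luhn check) before it reaches the centralized log. The pattern and the names are illustrative, not a complete PCI control.

```python
import logging
import re

# Illustrative pattern: 13-16 digits, optionally separated by spaces
# or hyphens. Real deployments would tune this for their data.
CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")

def luhn_ok(digits: str) -> bool:
    """Standard Luhn checksum; true if the digit string could be a PAN."""
    total, parity = 0, len(digits) % 2
    for i, d in enumerate(digits):
        n = int(d)
        if i % 2 == parity:  # double every second digit from the right
            n *= 2
            if n > 9:
                n -= 9
        total += n
    return total % 10 == 0

class RedactCardFilter(logging.Filter):
    """Masks Luhn-valid card-like numbers before a record is emitted."""
    def filter(self, record):
        def mask(match):
            digits = re.sub(r"[ -]", "", match.group())
            return "[REDACTED-PAN]" if luhn_ok(digits) else match.group()
        record.msg = CARD_RE.sub(mask, str(record.msg))
        return True

logger = logging.getLogger("app")
handler = logging.StreamHandler()
handler.addFilter(RedactCardFilter())
logger.addHandler(handler)
```

The same filter shape works for ePHI or any other pattern you never want persisted: scrub at the logging chokepoint, not in a hundred call sites.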

Separation of church and state

You definitely want to know what the separate roles are for people who have access to the systems, and, based on that, whether it’s likely someone had the wrong type of permission and got to things they shouldn’t have. Then there’s always the question of how, once you have drilled down and figured out the vector through which a breach may have happened and what might have been compromised, you actually get to the root cause. The answer frames the steps that need to be taken, depending on whether the cause was a person or something in the software.
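Checking whether “someone had the wrong type of permission” is much faster if you keep a role-to-permission matrix you can diff against reality. A minimal sketch, with hypothetical roles and permission names:

```python
# Illustrative role matrix; roles and permission strings are hypothetical.
ROLE_PERMISSIONS = {
    "developer": {"read:app-logs", "deploy:staging"},
    "dba": {"read:prod-db", "write:prod-db"},
    "support": {"read:app-logs"},
}

def excess_permissions(users: dict) -> dict:
    """Return {user: permissions beyond what their role allows}.

    `users` maps a username to {"role": ..., "permissions": set(...)}.
    Anything in the result is a candidate breach vector to investigate.
    """
    findings = {}
    for name, info in users.items():
        allowed = ROLE_PERMISSIONS.get(info["role"], set())
        extra = set(info["permissions"]) - allowed
        if extra:
            findings[name] = extra
    return findings
```

Run a check like this on a schedule and the post-breach question “who could have reached that system?” becomes an audit-log lookup instead of guesswork.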

Where do MSPs fit?

MSPs can’t handle all breach planning and response protocols on their own, because they operate primarily on the OS and infrastructure side. So they have to be able to collaborate with clients: by providing insight into the layers of the stack they have access to, or by requesting more information from within the application to compare against the logs they do hold. Why share? You might realize there’s an ongoing problem, but without these sharing protocols in place, you might not be able to tell when it started.
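The mechanics of that collaboration often come down to merging logs from different layers into one timeline. A rough sketch of the idea, assuming each side can export sorted, timestamped events (the formats and messages here are made up):

```python
import heapq
from datetime import datetime

def merge_timelines(*sources):
    """Merge several pre-sorted log streams into one chronological list.

    Each source is a list of (iso_timestamp, origin, message) tuples --
    e.g. one stream from the MSP's infrastructure logs, one from your
    application logs. Interleaving them is how you reconstruct the
    breach window across layers neither party can see alone.
    """
    keyed = [
        [(datetime.fromisoformat(ts), origin, msg) for ts, origin, msg in src]
        for src in sources
    ]
    return list(heapq.merge(*keyed))
```

Even this much, agreed on before an incident, means the first hour of a response is spent reading the timeline rather than negotiating export formats.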

Bottom line: Setting up protocols and communication channels between your organization and your MSP ahead of time is critical. However, the technical response is only part of the planning. Check back for our next segment, which explores the legal/compliance and HR planning necessary to avoid or deal with a breach.

What are your thoughts on breach planning? Let us know on Twitter @CloudGathering.

By Jake Gardner

More Stories By Gathering Clouds

Cloud computing news, information, and insights. Powered by Logicworks.
