Digital Forensic Challenges within Cloud Computing

Will cloud forensics be effective in managing boundaries of responsibility and access?

Proponents of the cloud ecosystem tout its "vastness, flexibility and scalability" as advantages for the implementation of cloud services. However, from a digital forensics point of view, this can be a veritable challenge when we view the cloud in terms of its scope and diversity.

According to Dr. Stephen Wolthusen [1], "Digital forensics (also referred to at times as computer forensics) encompasses approaches and techniques for gathering and analyzing traces of human and computer-generated activity in such a way that it is suitable in a court of law."

A key challenge for a digital investigator called to pursue an investigation that includes cloud resources will be to establish and map the computational and storage structures that fall within the scope of the investigation. Bear in mind that for any system (cloud or otherwise), security incidents will cross boundaries of responsibility and access. [2]

Although there is no one-size-fits-all solution for a digital forensic examination, all forensic evidence must follow the forensic process of:

Collection - Examination - Analysis - Reporting. Also, no matter its environment, forensic evidence must:

• Be relevant to the issue at hand
• Be authentic
• Not be unfairly prejudicial
• Not be hearsay or, if hearsay, be able to meet the requirements for an exception
• Be the original or a duplicate of the evidence, or be able to meet an exception to that rule.

Within the cloud computing ecosystem, I believe there may be a dilemma in terms of timestamps. A question for cloud vendors would be: with a distributed and "vast" infrastructure, how will they ensure synchronized clocks across all their systems? Synchronized clocks across a distributed global system may not be possible, and if this supposition holds true, what other solution will a cloud vendor provide in such an instance?
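
One pragmatic mitigation, if perfect synchronization is off the table, is to record each host's measured clock offset at collection time and normalize all timestamps onto a single UTC timeline during analysis. A minimal sketch in Python; the hostnames, offsets and field layout below are hypothetical illustrations, not any vendor's actual mechanism:

    from datetime import datetime, timedelta, timezone

    # Hypothetical per-host clock offsets in seconds (positive = clock
    # runs fast), assumed to have been measured (e.g. against an NTP
    # reference) and recorded when the evidence was collected.
    CLOCK_OFFSETS = {
        "node-eu-1": 2.4,
        "node-us-3": -11.0,
    }

    def normalize(host: str, local_ts: str) -> datetime:
        """Map a host-local log timestamp onto a common UTC timeline."""
        raw = datetime.fromisoformat(local_ts).replace(tzinfo=timezone.utc)
        return raw - timedelta(seconds=CLOCK_OFFSETS.get(host, 0.0))

    # Two events that only order correctly once the offsets are applied.
    print(normalize("node-eu-1", "2012-06-01T12:00:04"))
    print(normalize("node-us-3", "2012-06-01T11:59:56"))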

Another challenge is that of reciprocity. Digital forensics within the cloud computing environment can have legal implications across international jurisdictions, which will require cooperation through established relationships with legal entities in foreign countries and/or, where possible, the establishment of new ones.

As with any live forensic examination, another challenge will be establishing snapshots of the system in operation. But in this case one can question whether this is good enough for such a "vast" and possibly globally distributed ecosystem.

Take the instance of malware injected into the kernel space of a system; it may be programmed to modify data or functionality, or both, in a variety of ways upon detecting a probe, or simply set to shut down, obfuscate evidence, or delete pertinent data residues within a set time frame. Can a forensic examiner be notified of this change, or, more pertinently, can a cloud service provider implement protocols, tools or processes to ensure that such an event can be mitigated in real time? Most likely not, at least for now.
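
Even if real-time mitigation is out of reach, hashing the same artifacts across repeated acquisitions can at least surface such tampering after the fact. A minimal sketch; the inventories and helper names here are hypothetical, not an established tool's API:

    import hashlib

    def sha256_of(path: str) -> str:
        """Stream a file through SHA-256 so large images hash safely."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def changed_between(baseline: dict, current: dict) -> list:
        """Names whose digests differ between two acquisitions."""
        return [name for name, digest in current.items()
                if baseline.get(name) != digest]

    # Hypothetical usage with two snapshot inventories of kernel modules:
    # hashes_t0 = {name: sha256_of(path) for name, path in modules_t0.items()}
    # hashes_t1 = {name: sha256_of(path) for name, path in modules_t1.items()}
    # suspects = changed_between(hashes_t0, hashes_t1)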

However, a solution of sorts to this dilemma can be gleaned from a thesis suggested in a paper by Wolthusen [1], which states: "Data may be present or available in a given configuration for a limited time or be staged through different levels of storage hierarchies; it is hence important to place bounds on events in question so as to be able to capture events of interest completely."
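
Read practically, that advice amounts to fixing explicit lower and upper bounds for the incident and capturing everything inside them before staged or transient data moves on. A minimal sketch, assuming event timestamps have already been normalized to UTC as above (the field names are hypothetical):

    from datetime import datetime, timezone

    def bounded_timeline(events, start: datetime, end: datetime) -> list:
        """Select and order only the events inside the investigation's
        bounding window, so transient data of interest is captured
        completely rather than sampled."""
        window = [e for e in events if start <= e["utc_time"] <= end]
        return sorted(window, key=lambda e: e["utc_time"])

    start = datetime(2012, 6, 1, 11, 50, tzinfo=timezone.utc)
    end = datetime(2012, 6, 1, 12, 10, tzinfo=timezone.utc)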

In terms of the "vast" distributed environment that can comprise a cloud ecosystem under investigation, we have to be aware that within such an ecosystem any forensic investigation can cause parallel or unrelated services to be interrupted or completely halted, infringe on third-party rights, cross jurisdictional boundaries, and, in the case of duplication, require infeasible storage volumes. [1]

Aspects of Control within Cloud Computing Service Models:

SaaS: Here the cloud user, depending on their contracted services with the cloud vendor, will only control certain configuration parameters, whilst the cloud vendor maintains control over the applications and underlying infrastructure.

PaaS: Here the cloud vendor controls the cloud infrastructure and runtime environments, while the cloud user controls the application.

IaaS: Although with this cloud offering a cloud user will have control over their servers, installed operating systems and applications, the cloud vendor still controls the virtualization infrastructure and at least parts of the network infrastructure.

These aspects will affect how a digital forensic examination is conducted, as every cloud computing environment will have variations. Therefore the methods, tools, protocols, etc. implemented in identifying relevant events that support the detection and analysis of attacks have to be crafted accordingly.
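
One way to make that crafting concrete is to encode, per service model, which party controls which layer, and derive from that the evidence sources that must come from the vendor rather than the customer. A minimal illustration; the layer names and helper below are hypothetical:

    # Hypothetical mapping of who controls which layer under each
    # service model, following the control split described above.
    CONTROL = {
        "SaaS": {"app config": "user", "application": "vendor",
                 "runtime": "vendor", "infrastructure": "vendor"},
        "PaaS": {"application": "user", "runtime": "vendor",
                 "infrastructure": "vendor"},
        "IaaS": {"application": "user", "operating system": "user",
                 "virtualization": "vendor", "network": "vendor"},
    }

    def vendor_held_sources(model: str) -> list:
        """Layers whose evidence must be requested from the vendor."""
        return [layer for layer, party in CONTROL[model].items()
                if party == "vendor"]

    print(vendor_held_sources("PaaS"))  # ['runtime', 'infrastructure']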

Four Forensic Challenges within the Cloud Ecosystem

Grobauer and Schreck [2] identified the following forensic challenges within the cloud computing environment:

1. Separation of customers' data sources during evidence collection
2. Adapting forensic analysis methods to the cloud
3. Improving live analysis techniques
4. Improving log generation & analysis techniques (see the sketch after this list)
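
On points 1 and 4, tagging every record with a tenant identifier at generation time is one way to make per-customer separation of evidence tractable later. A minimal sketch; the record fields and helpers are hypothetical:

    import json
    from datetime import datetime, timezone

    def log_event(tenant_id: str, action: str, detail: str) -> str:
        """Emit a structured record carrying the tenant it belongs to."""
        return json.dumps({
            "utc_time": datetime.now(timezone.utc).isoformat(),
            "tenant": tenant_id,
            "action": action,
            "detail": detail,
        })

    def for_tenant(log_lines, tenant_id: str) -> list:
        """Pull only one customer's records out of a shared log stream."""
        records = [json.loads(line) for line in log_lines]
        return [r for r in records if r["tenant"] == tenant_id]
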
Another major challenge is the need to establish a complete understanding of processes, their dependencies, and their distribution across different systems within the cloud ecosystem. [1]

Wolthusen[1] also states that, "if semantic dependencies must be captured, this must not only capture the immediate data required to reconstruct a view or document or to recreate and reconstruct a process, but also sufficient information to ascertain the semantics of the event at the point in time of the event."
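
One reading of that requirement: an event record should embed, not merely reference, the state needed to interpret it, since a bare reference may resolve to different data by the time of analysis. A small hypothetical illustration:

    # Hypothetical: copy the referenced object's state into the event
    # record at capture time, so the event's semantics at that moment
    # survive later changes to the underlying store.
    def record_event(action: str, doc_id: str, document_store: dict) -> dict:
        return {
            "action": action,
            "document_id": doc_id,
            "document_snapshot": dict(document_store[doc_id]),
        }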

However, would not the establishment of such a process potentially impact customers not involved in an investigation who share the cloud space that is part of a cloud forensic examination?

Despite the semantics and challenges of the cloud computing environment, it is my opinion that:

Cloud computing users must open a dialogue with their vendor regarding processes and protocols for successfully handling and managing incidents. These need to be clearly established within the requirements portion of their service level agreement (SLA) when it is drafted.

References

1. Wolthusen, Stephen D. "Overcast: Forensic Discovery in Cloud Environments."
2. Grobauer, Bernd, and Thomas Schreck. "Towards Incident Handling in the Cloud: Challenges and Approaches."
