Defense through Offense, and how APT fits there

I suppose that including “APT” in anything published for public consumption is practically mandatory these days, but this post actually has a good reason to do so. If you look back just one post, we were discussing the new initiative to define “Penetration Testing”. That post, and the proposed standard itself, take a hard look at what organizations actually need, and at how to address those needs from a practical point of view rather than from a compliance or “check-box ticking” perspective.

For me, this is one of the areas where the security industry has done itself a great disservice. It is exactly why companies announce, every time they get breached, that it was an advanced attack: one so sophisticated that it managed to stay persistent in their network and exfiltrate large amounts of sensitive information, and that no reasonable control could have prevented or detected it. The all-dreaded “APT”.

However, if you take a look at how organizations prepare themselves for such attacks, you may find yourself staring at a blank page. Regulatory compliance dictates a very basic “box checking” methodology for a narrow and specific slice of information security, while product vendors, on the other hand, provide solutions that are “compliance oriented”, so organizations are left with very weak defense mechanisms. And that is without even mentioning the biggest security gap in most organizations: the employees.

The lack of self-testing, of a real-world simulation of what an attack would look like and how the organization would cope with it, keeps most organizations from putting reasonable defenses in place. The lack of proper training, awareness campaigns, and exercises that stress the human factor as well leads us to a situation where even simple attacks that utilize off-the-shelf (and even FREE) attack tools go through an organization’s control mechanisms with aggravating ease.
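As a toy illustration of exercising the human factor, here is a minimal sketch of summarizing a simulated phishing-awareness campaign. The recipient names and outcomes are entirely hypothetical; a real exercise would feed in actual campaign results:

```python
# Toy sketch: summarize a simulated phishing-awareness exercise.
# The recipients and outcomes below are made-up illustration data.

def click_rate(outcomes):
    """Fraction of recipients who clicked the simulated lure."""
    if not outcomes:
        return 0.0
    return sum(1 for clicked in outcomes.values() if clicked) / len(outcomes)

# Outcome per simulated recipient: True means they clicked the test link.
outcomes = {
    "alice": True,
    "bob": False,
    "carol": True,
    "dave": False,
    "erin": False,
}

rate = click_rate(outcomes)
print(f"click rate: {rate:.0%}")  # 2 of 5 clicked -> 40%
```

Tracking a number like this across repeated exercises is what turns “awareness training” from a checkbox into a measurable control.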

Looking back at what the Penetration Testing Execution Standard defines for its basic testing methodology, I can clearly see how every element of the recent “APT” attacks would have been simulated, probably in an even more rigorous scenario. Such a test would have left the tested organization with a roadmap to a much higher security standard. And that is the power of testing: understanding the adversary’s techniques and strategies, and running exercises that reflect them in order to identify security gaps and close them as efficiently as possible. And yes, that also (and perhaps mainly) applies to human-related processes and policies, not just to technology.
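The idea of turning adversary techniques into exercises that yield a remediation roadmap can be sketched in a few lines. Everything below is illustrative: the phase names and findings are hypothetical examples, not part of any standard:

```python
# Illustrative sketch: record the result of simulating each adversary
# phase (recon, exploitation, persistence, exfiltration) and derive a
# remediation roadmap from the phases the defenses failed to catch.
# All phase names and findings here are hypothetical.

from dataclasses import dataclass

@dataclass
class ExerciseResult:
    phase: str       # adversary phase being simulated
    detected: bool   # did the organization's controls catch it?
    finding: str     # what the exercise revealed

def build_roadmap(results):
    """Return the gaps (undetected phases) as a remediation list."""
    return [f"{r.phase}: {r.finding}" for r in results if not r.detected]

results = [
    ExerciseResult("reconnaissance", True,  "external scan triggered IDS alert"),
    ExerciseResult("exploitation",   False, "client-side payload ran unblocked"),
    ExerciseResult("persistence",    False, "scheduled task survived reboot unnoticed"),
    ExerciseResult("exfiltration",   True,  "DLP flagged outbound archive"),
]

for item in build_roadmap(results):
    print(item)
```

The point of the sketch is the output shape: the undetected phases fall out directly as a prioritized to-do list, which is exactly the roadmap a proper test should leave behind.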

So to sum things up: you may be compliant, but do not think for a moment that this compliance has anything to do with the security of your information. Until regulatory compliance mandates proper security testing to protect the data in question, such compliance is only going to hinder your “security vision”. Get proper testing, set up an internal team responsible for understanding the threat communities you are dealing with (or hire an external one <hint hint>), and set yourself the goal of an unbiased understanding of what your gaps are and how well you can face a standard attack (yes, the same standard attack that you are going to call an “APT” if it hits you unprepared).

Related posts:

  1. Information Security Intelligence Report for 2010 and Predictions for 2011
  2. Practical vs. Regulatory – the votes are in!
  3. Defining Penetration Testing


More Stories By Iftach Ian Amit

With more than 10 years of experience in the information security industry, Ian (Iftach) Amit brings a mixture of software development, OS, network, and Web security expertise as Managing Partner of the top-tier security consulting and research firm Security & Innovation. Prior to Security & Innovation, Ian was the Director of Security Research at Aladdin and Finjan, leading their security research while positioning them as leaders in the Web security market. Amit has also held leadership roles as founder and CTO of a security startup in the IDS/IPS arena, developing new techniques for attack interception, and as a director at Datavantage, responsible for software development and information security, as well as designing and building a financial datacenter. Prior to Datavantage, he managed the Internet application and UNIX worldwide. Amit holds a Bachelor's degree in Computer Science and Business Administration from the Interdisciplinary Center at Herzliya.
