
The Intersection of Security and SDN

If you cross log analysis with infrastructure integration you get some interesting security capabilities

A lot of security-minded folks immediately pack up their bags and go home when you start talking about automating anything in the security infrastructure. Automating changes to data center firewalls, for example, seems to elicit a reaction not unlike the one you'd get by suggesting that an unpatched Windows machine be put directly on the public Internet.

At RSA yesterday I happened to see a variety of booths with a focus on... logs. That isn't surprising, as log analysis is used across the data center and across domains for a variety of reasons. It's one of the ways databases are replicated, it's part of compiling access audit reports, and it's absolutely one of the ways in which intrusion attempts can be detected.

And that's cool. Log analysis for intrusion detection is a good thing. But what if it could be better?

What if we started considering operationalizing the process of acting on events raised by log analysis?

One of the promises of SDN is agility through programmability. The idea is that because the data path is "programmable," it can be modified at any time by the control plane using an API. In this way, SDN-enabled architectures can respond in real time to conditions on the network impacting applications. Usually this focuses on performance, but there's no reason it couldn't be applied to security as well.
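
To make the "programmable" part concrete, here's a minimal sketch of a control plane pushing a flow rule to an SDN controller over a REST-style API. The controller URL, payload schema, and action name are hypothetical stand-ins, not any particular controller's API.

```python
# A minimal sketch of control-plane programmability: ask a (hypothetical)
# SDN controller to reprogram the data path for matching traffic.
import json
import urllib.request

CONTROLLER = "https://sdn-controller.example.com/api/flows"  # hypothetical endpoint

def push_flow_rule(match: dict, action: str) -> None:
    """POST a flow rule; the controller updates the data path at runtime."""
    rule = {"match": match, "action": action, "priority": 100}
    req = urllib.request.Request(
        CONTROLLER,
        data=json.dumps(rule).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print("controller response:", resp.status)

# Example: mirror traffic from a suspicious source to a capture point.
push_flow_rule({"src_ip": "203.0.113.7"}, action="mirror-to-capture")
```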

If you're using a log analysis tool capable of performing said analysis in near real-time, and that analysis turns up suspicious activity, there's no reason it couldn't inform a controller of some kind on the network, which in turn could decide to enable infrastructure capabilities across the network: perhaps to start capturing the flow, or to inject a more advanced inspection service (malware detection, perhaps) into the service chain for the application.
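
A rough sketch of that pipeline, assuming a common-log-format access log and a toy failed-auth heuristic; the threshold, field positions, and controller hookup are illustrative assumptions, not a production detection rule.

```python
# Watch an access log in near real-time, flag a source IP after repeated
# auth failures, and hand the event to the controller for deeper inspection.
import collections
import time

FAILED = collections.Counter()   # failed-auth (401) count per source IP
THRESHOLD = 5                    # toy heuristic: 5 failures => suspicious

def notify_controller(src_ip: str) -> None:
    # In practice, an authenticated API call (e.g. the push_flow_rule()
    # sketch above) asking the controller to inject a malware-inspection
    # service into the chain for this source.
    print(f"steering {src_ip} through advanced inspection")

def watch(logfile: str) -> None:
    """Tail the log like `tail -f` and analyze each new line."""
    with open(logfile) as f:
        f.seek(0, 2)                      # start at the end of the file
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)           # wait for new entries
                continue
            parts = line.split()
            # Common log format: host ident authuser [date] "request" status ...
            if len(parts) > 8 and parts[8] == "401":
                FAILED[parts[0]] += 1
                if FAILED[parts[0]] >= THRESHOLD:
                    notify_controller(parts[0])
                    del FAILED[parts[0]]  # reset after acting
```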

In the service provider world, it's well understood that the requirement in traditional architectures to force flows through all services is inefficient. It increases the cost of the service and requires scaling every single service along with subscriber growth. Service providers are turning to service chaining and traffic steering as a means to more efficiently use only those services that are applicable, rather than the entire chain.
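
As a sketch, traffic steering can reduce to mapping a traffic class to only the subset of services it actually needs; the classes and service names below are invented for illustration.

```python
# Select a per-class service chain instead of forcing every flow through
# the full chain. All class and service names are illustrative.
FULL_CHAIN = ["firewall", "ids", "dpi", "malware-scan", "parental-controls"]

CHAINS = {
    "web":   ["firewall", "ids"],
    "video": ["firewall"],        # latency-sensitive, light touch
}

def chain_for(traffic_class: str) -> list[str]:
    # Unclassified traffic gets no shortcut: run the whole chain.
    return CHAINS.get(traffic_class, FULL_CHAIN)

print(chain_for("video"))   # ['firewall'] -- scale only what this flow needs
```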

While enterprise organizations for the most part aren't going to adopt service provider architectures, they can learn from them the value inherent in more dynamic network and service topologies. Does every request and response need to go through every security service? Or are some only truly needed for deep inspection?

It's about intelligence and integration. Real-time analysis of what is traditionally data at rest (logs) can net actionable data if the infrastructure is API-enabled. It takes the notion of scalability domains to a more dynamic level: not only scaling services individually to reduce costs, but improving performance and efficiency by consuming resources only when necessary instead of all the time. The key is being able to determine when it's necessary and when it isn't.

In a service provider world, that determination is based on subscriber and traffic type. In the enterprise it's more about behavioral analysis: what someone is trying to do, and with what application or data.

But in the end, both environments need to be dynamic with policy enforcement and service invocation based on the unique combination of devices, networks and applications and enabled by the increasing prevalence of API-enabled infrastructure.
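
A minimal sketch of what that kind of policy-driven service invocation could look like; the device, network, and application values are purely illustrative.

```python
# Decide which services a flow traverses based on the combination of
# device, network, and application. All values are illustrative.
def services_for(device: str, network: str, app: str) -> list[str]:
    services = ["firewall"]                 # baseline for every flow
    if network == "public-wifi":
        services.append("vpn-termination")
    if device == "byod":
        services.append("posture-check")    # unmanaged devices get vetted
    if app in {"payroll", "crm"}:
        services.append("dlp")              # deep inspection for sensitive apps
    return services

print(services_for("byod", "public-wifi", "payroll"))
# ['firewall', 'vpn-termination', 'posture-check', 'dlp']
```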

SDN is going to matter not just as a cost-savings vehicle for operational networks, but as part of the technology that ultimately unlocks the software-defined data center. And that includes security.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
