SIEM NetFlow Support: Don’t Sell Yourself Short

This is a conversation I find myself having more and more lately, so I thought it would make sense to discuss in detail exactly how security information and event management (SIEM) systems and NetFlow are related, and why SIEMs are a poor choice for NetFlow collection.

Customer: Adam, we have a SIEM from Vendor X. They say they already support NetFlow. Why would I need a Scrutinizer NetFlow Analyzer?

Adam: The SIEM community has confused things a bit for the customer. In response to the recent surge in NetFlow popularity, many SIEMs have added bare-minimum “check box” NetFlow support just to stay competitive. Comparing SIEM NetFlow support to an advanced flow collector such as Scrutinizer is like comparing MS Paint to Photoshop. Sure, MS Paint works, but it doesn’t do much.

Customer: Okay then, explain to me the difference between my SIEM’s NetFlow support and Scrutinizer’s NetFlow support.

Adam: Sure. First of all, let’s make one thing clear: Scrutinizer is not a SIEM and doesn’t directly compete with a SIEM. It’s a NetFlow and IPFIX analysis technology that complements a SIEM. Scrutinizer exports syslog events just like any other IDS, firewall, or network device. It’s better to think of Scrutinizer as a flow-based threat detection technology that feeds alerts into a SIEM, rather than something that would compete with one. Just as a traditional IDS uses packets to fuel its analysis engine, Scrutinizer uses flows to detect network-based threats and traffic bottlenecks.
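To make the “feeds alerts into a SIEM” point concrete, here is a minimal Python sketch of how any flow-based detector could hand an alert to a SIEM as a syslog message over UDP. The SIEM address, syslog facility, and message layout here are illustrative assumptions, not Scrutinizer’s actual export format.

```python
import socket

def send_syslog_alert(message, siem_host="192.0.2.10", siem_port=514):
    """Send an RFC 3164-style syslog message to a SIEM over UDP.

    The SIEM address and message layout are illustrative only;
    they are not Scrutinizer's actual export format.
    """
    # PRI = facility * 8 + severity; 134 = local0 (16), informational (6)
    payload = f"<134>flow-detector: {message}"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload.encode("utf-8"), (siem_host, siem_port))

# Example: a flow-based detection raising an alert into the SIEM pipeline
send_syslog_alert("SYN scan suspected: 10.0.0.5 touched 250 hosts on tcp/445 in 60s")
```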

Let’s talk through the various challenges a SIEM faces when collecting NetFlow:

Capturing and storing NetFlow isn’t enough.

You’ve got to analyze the incoming flows, not just grab them off the wire and dump them to disk. Flows are really only useful when you apply some intelligence to them; otherwise they’re just simple audit trails. Most SIEMs shove NetFlow records into a database or flat file, check the “NetFlow Support?” box, and call it a day. Scrutinizer not only stores the flows to disk but also generates dozens of charts, top talker reports, trending views, security alerts, and customizable thresholds.
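As a rough illustration of the difference between storing flows and analyzing them, the sketch below reduces a handful of flow records to a top-talkers report, about the most basic analysis a collector should offer beyond raw storage. The record fields mirror common NetFlow v5 keys, and the data is invented for the example.

```python
from collections import defaultdict

# Each dict mirrors a few common NetFlow v5 fields; the values are invented.
flows = [
    {"src": "10.0.0.5",  "dst": "8.8.8.8",    "bytes": 1_200_000},
    {"src": "10.0.0.7",  "dst": "10.0.0.5",   "bytes": 450_000},
    {"src": "10.0.0.5",  "dst": "172.16.1.9", "bytes": 300_000},
]

def top_talkers(flows, n=10):
    """Rank source addresses by total bytes sent."""
    totals = defaultdict(int)
    for flow in flows:
        totals[flow["src"]] += flow["bytes"]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

for src, total in top_talkers(flows):
    print(f"{src}: {total / 1_000_000:.2f} MB")
```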

Flow analysis is a full-time job.

SIEM vendors have their hands full developing connectors for a large variety of inputs such as syslog, SNMP, event logs, and proprietary log formats. To get the most out of NetFlow, a vendor must really understand the dozens of NetFlow fields and capabilities that are available. While flows are much more powerful than syslog, they’re also a great deal more involved to implement correctly. Ask your SIEM vendor how many developers they have working on their NetFlow capabilities. Plixer’s entire engineering staff is focused exclusively on NetFlow and IPFIX development. It’s our full-time job.

NetFlow is rapidly evolving.

In just the last two years we’ve seen NetFlow innovations such as MediaNet, NBAR, Palo Alto’s application-aware flows, and Cisco’s ASA NAT tables make their way into NetFlow exports. The NetFlow community, and Plixer especially, is in constant communication with the various infrastructure vendors, helping to test and develop new fields that the collector can use to report on an increasingly deep set of network statistics. Most SIEM vendors barely support NetFlow v5, much less the latest and greatest from Cisco or the IPFIX community.

Performance is a major issue.

For most vendors NetFlow is an afterthought, a recent bolt-on that is almost always inadequately implemented. Most SIEM vendors don’t realize that one of NetFlow’s core values is its ability to provide visibility into many different places within the network simultaneously. When you turn on NetFlow from the edge of your network all the way down to the access layer, you get an amazingly deep view of what’s going on, but you’ll also generate a *lot* of flows. Ask your SIEM vendor how many flows per second their collector can handle while also doing its normal “SIEM thing”. A single Scrutinizer instance can process up to 100,000 flows per second. The unfortunate part is that the SIEM probably doesn’t even support missed flow sequence number (MFSN) detection, so it can’t even tell if it is missing flows. They’re blind to their own shortcomings.
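MFSN detection is straightforward for a collector that understands the NetFlow v5 header, which carries a running flow_sequence counter: compare each arriving packet against the sequence number you expected, and the gap is the number of flows you never received. A minimal sketch of the idea, assuming raw v5 datagrams are already in hand (32-bit counter wraparound and per-engine tracking are omitted for brevity):

```python
import struct

class MfsnTracker:
    """Detect missed flows per exporter using the NetFlow v5 header.

    The v5 flow_sequence field counts all flows the exporter has sent,
    so (observed sequence - expected sequence) = flows lost in transit.
    """
    def __init__(self):
        self.expected = {}  # exporter address -> next expected sequence

    def check(self, exporter, datagram):
        # v5 header layout: version, count (flows in this packet),
        # sys_uptime, unix_secs, unix_nsecs, flow_sequence, ...
        version, count = struct.unpack("!HH", datagram[0:4])
        if version != 5:
            return 0
        (flow_sequence,) = struct.unpack("!I", datagram[16:20])
        missed = 0
        if exporter in self.expected:
            missed = flow_sequence - self.expected[exporter]
        self.expected[exporter] = flow_sequence + count
        if missed > 0:
            print(f"{exporter}: missed {missed} flows")
        return max(missed, 0)
```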

Flows are not events; they are flows.

Most SIEM vendors force NetFlow fields into their own event-oriented data structures. This makes sense for them, given that SIEMs are designed around the notion of a point-in-time event. NetFlow records, however, are ongoing and can last anywhere from one second to many hours. This makes reporting on NetFlow rather awkward for most SIEMs: they are trying to force a sustained flow into a point-in-time event, and it simply doesn’t work well.
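To see why the point-in-time model fits poorly, consider how a flow-native collector might apportion a long-lived flow across fixed reporting intervals instead of pinning all of its bytes to a single timestamp. This is a simplified sketch; the even byte-rate assumption is ours, and real collectors can be more sophisticated.

```python
def bucketize(flow_start, flow_end, flow_bytes, interval=60):
    """Spread a flow's bytes evenly across the fixed reporting
    intervals it overlaps. Timestamps are epoch seconds."""
    duration = max(flow_end - flow_start, 1)
    rate = flow_bytes / duration  # bytes per second, assumed constant
    buckets = {}
    t = flow_start
    while t < flow_end:
        bucket = t - (t % interval)               # interval containing t
        next_edge = min(bucket + interval, flow_end)
        buckets[bucket] = buckets.get(bucket, 0) + rate * (next_edge - t)
        t = next_edge
    return buckets

# A 3-minute, 18 MB flow lands in three 60-second buckets of ~6 MB each,
# instead of one 18 MB "event" at a single point in time.
print(bucketize(flow_start=1_000_020, flow_end=1_000_200, flow_bytes=18_000_000))
```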

SIEMs are oriented entirely toward security.

SIEMs rarely provide basic NetFlow reporting features such as top talkers, interface utilization, capacity planning, QoS reporting, MediaNet reporting, or topological representations of the NetFlow exporters. This absence is a result of the SIEM’s core buyer: the security analyst. Yet one of NetFlow’s core strengths is that it provides visibility into the network for both security and operational purposes. Security people who see the reports that can be built from NetFlow often wonder how they ever got along without them.

NetFlow is often a “check box” feature.

Most SIEM vendors only recently added NetFlow, and they did so because a customer said something like, “If you don’t have NetFlow support, we’ll have to go with someone who does.” They then added the bare minimum needed to check the “NetFlow Supported?” box on the RFP. Check-box features are unfortunately commonplace in the vendor community. Watch out for them. Don’t sell your team short on the value NetFlow and IPFIX can provide.

One closing point: it might still make sense to send some of your NetFlow, especially from the Internet gateways or other critical resources, to the SIEM. Sending the same flows to multiple collectors is common practice and easy to accomplish using the Scrutinizer UDP Replicator.
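For a sense of how simple flow replication is at the protocol level, here is a toy UDP fan-out loop in Python. It illustrates the concept only and is not Plixer’s UDP Replicator; the listen port and collector addresses are assumptions. One caveat of naive re-sending is that the collectors see the replicator’s address, not the router’s, as the packet source, which production replicators typically solve by spoofing the original exporter address.

```python
import socket

LISTEN_PORT = 2055  # common NetFlow export port (assumed)
# Destination collectors are illustrative addresses.
COLLECTORS = [("192.0.2.10", 2055), ("192.0.2.11", 2055)]

def replicate():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", LISTEN_PORT))
    while True:
        datagram, exporter = sock.recvfrom(65535)
        # Forward the flow packet byte-for-byte to every collector.
        # Note: collectors will see this host, not the exporter, as
        # the source; production replicators spoof the original address.
        for collector in COLLECTORS:
            sock.sendto(datagram, collector)

if __name__ == "__main__":
    replicate()
```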

To learn more about Scrutinizer, visit the product page here, download a 30 day trial, or contact Plixer for a live demonstration and discussion.

If you’re new to NetFlow and are interested in hands-on training be sure to check out our Advanced NetFlow Training courses coming to a city near you.


More Stories By Michael Patterson

Michael Patterson is the founder & CEO of Plixer and the product manager for Scrutinizer NetFlow and sFlow Analyzer. Prior to starting Somix and Plixer, Mike worked in a technical support role at Cabletron Systems, acquired his Novell CNE, and then moved to the training department for a few years. While in training he finished his Master’s in Computer Information Systems from Southern New Hampshire University, then left technical training to pursue a new skill set in Professional Services. In 1998 he left the 'Tron' to start Somix and Plixer.
