How to configure Palo Alto Networks NetFlow

Palo Alto Networks NetFlow support is now available, and with the latest version of our NetFlow monitoring solution you can get both NAT and application reporting for this firewall.

Today I’ll provide step-by-step instructions on how to configure NetFlow for this device, and then show an example of the extended NetFlow reporting that becomes available.

How to configure Palo Alto Networks NetFlow

There are two basic steps for configuring the Palo Alto Networks firewall to export NetFlow:

1.  Define a NetFlow server profile – this specifies the frequency of the export along with the NetFlow servers that will receive the exported data.

2.  Assign the profile to a firewall interface – all traffic flowing over this interface is exported to the specified server(s).

 

Step 1

To define a NetFlow server profile, navigate to Device > Server Profiles > NetFlow in the GUI. Here you will see the following settings:

Name: Enter a name for the NetFlow settings.

Template Refresh Rate: Specify how often the NetFlow template is refreshed, in minutes or packets (we recommend 1 minute; the packet range is 1-600, default 20).

Active Timeout: Specify the frequency at which data records are exported for each session (we recommend 1 minute).

Export PAN-OS Specific Field Types: Export PAN-OS specific fields such as App-ID and User-ID in NetFlow records.

Server Name: Specify a name to identify the server.

Server: Specify the host name or IP address of the server.

Port: Specify the port number for server access (default 9996).

 

Palo Alto NetFlow Servers
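If you would rather script this step than click through the GUI, the same profile can be pushed with the PAN-OS XML API. The sketch below is a minimal Python example using the requests library; the firewall address, API key, profile name, and collector IP are placeholders, and the xpath and element layout are assumptions for illustration, so check them against the running configuration on your PAN-OS version.

```python
# Minimal sketch: create the NetFlow server profile via the PAN-OS XML API.
# ASSUMPTION: the xpath and element layout below are illustrative only; verify
# them against your firewall's running configuration, as they can differ by
# PAN-OS version. Addresses, names, and the API key are placeholders.
import requests

FIREWALL = "https://192.0.2.1"   # placeholder firewall management address
API_KEY = "YOUR-API-KEY"         # obtainable via a type=keygen API request

# Assumed location of NetFlow server profiles in the candidate configuration.
xpath = "/config/shared/server-profile/netflow/entry[@name='Scrutinizer-NetFlow']"

# Assumed element layout mirroring the GUI fields above: refresh the template
# every 1 minute / 20 packets, 1 minute active timeout, PAN-OS specific fields
# enabled, one collector listening on UDP 9996.
element = (
    "<template-refresh-rate><minutes>1</minutes><packets>20</packets></template-refresh-rate>"
    "<active-timeout>1</active-timeout>"
    "<option><pan-fields>yes</pan-fields></option>"
    "<server><entry name='collector1'><host>198.51.100.10</host><port>9996</port></entry></server>"
)

resp = requests.get(
    f"{FIREWALL}/api/",
    params={"type": "config", "action": "set", "xpath": xpath,
            "element": element, "key": API_KEY},
    verify=False,  # only acceptable for a lab firewall with a self-signed cert
)
print(resp.text)  # a successful call returns <response status="success" ...>
```

As with a GUI change, the candidate configuration still has to be committed (for example with a type=commit API call) before the profile takes effect.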

 

Step 2

Once the NetFlow profile is configured, the next step is to assign it to a firewall interface. To do this, navigate to Network > Interfaces > Ethernet and click the link for the interface on the Ethernet tab -

 

Palo Alto NetFlow interface configuration

Then specify the NetFlow Profile -

Palo Alto interface NetFlow profile configuration
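After the profile is assigned and the change is committed, a quick way to confirm that exports are reaching the collector is to listen on the export port and decode the NetFlow v9 packet header (Palo Alto Networks exports NetFlow version 9). This is a minimal sketch that assumes the default port 9996 and a test host where nothing else is bound to that port; a full collector such as Scrutinizer handles the template and data record decoding.

```python
# Minimal sketch: confirm NetFlow v9 export packets are arriving on UDP 9996.
# Only the fixed 20-byte v9 packet header is decoded here; a real collector
# goes on to parse the template and data FlowSets that follow it.
import socket
import struct

LISTEN_PORT = 9996  # the export port configured in the server profile

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", LISTEN_PORT))
print(f"Listening for NetFlow exports on UDP {LISTEN_PORT} ...")

while True:
    data, (src_ip, src_port) = sock.recvfrom(65535)
    if len(data) < 20:
        continue  # too short to contain a NetFlow v9 header
    version, count, sys_uptime, unix_secs, sequence, source_id = struct.unpack(
        "!HHIIII", data[:20]
    )
    print(f"from {src_ip}:{src_port} version={version} records={count} "
          f"uptime_ms={sys_uptime} seq={sequence} source_id={source_id}")
```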

 

With our advanced NetFlow reporting solution, you can get extended Palo Alto Networks NetFlow reporting such as application reports, which give you visibility into named applications rather than showing the traffic simply as http (80 TCP); NAT (Network Address Translation) reports; and user reports.

Palo Alto Networks Application reports

 

In addition to the advanced NetFlow reporting, standard NetFlow reports such as conversations and TopN, along with threat detection capabilities, are available from Palo Alto Networks NetFlow exports.
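To illustrate what TopN reporting means at the flow level, here is a small, purely illustrative sketch that aggregates already-decoded flow records into a top-talkers list. The flows list is hypothetical sample data standing in for records a collector has decoded from the NetFlow export; it is not tied to any particular product's API.

```python
# Illustrative sketch: a simple "top talkers by bytes" (TopN) view built from
# already-decoded flow records. The flows list is hypothetical sample data
# standing in for what a collector extracts from the NetFlow v9 exports.
from collections import Counter

flows = [
    {"src": "10.0.0.5", "dst": "198.51.100.20", "bytes": 1250000},
    {"src": "10.0.0.9", "dst": "203.0.113.7", "bytes": 480000},
    {"src": "10.0.0.5", "dst": "203.0.113.7", "bytes": 96000},
]

bytes_per_source = Counter()
for flow in flows:
    bytes_per_source[flow["src"]] += flow["bytes"]

TOP_N = 10
for src, total in bytes_per_source.most_common(TOP_N):
    print(f"{src}: {total} bytes")
```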

 

For more information on configuring NetFlow on this firewall, see the Palo Alto NetFlow Configuration Guide.

And if you need further assistance with configuring NetFlow on this firewall, or with accessing the advanced NetFlow reports, please do not hesitate to contact us at 207-324-8805.

 


Joanne Ghidoni
Sr. Solutions Engineer

For a free 30-day trial of Scrutinizer, Download Now

Sign up for Advanced NetFlow Training coming to a city near you!
