NetFlow, a network protocol developed by Cisco Systems for its routers, is widely used to collect metadata about the IP traffic flowing across network devices such as routers, switches and hosts. It monitors traffic and provides insight into the performance of your applications and network.
Traffic flow data informs a company’s IT professionals as to how much traffic there is, where it’s coming from and going to, and the paths being used. These network flow statistics are recorded and used to monitor usage over time, watch for problems and plan upgrades.
NetFlow version 9 is documented by the Internet Engineering Task Force (IETF) in RFC 3954, and "netflow" has also become a general term for other network flow monitoring protocols. Cisco Systems NetFlow version 9 is template-based and lets you choose which statistics to collect. In contrast, previous versions such as the widely deployed NetFlow v5 used a fixed set of fields, which makes version 9 considerably more flexible.
IP Flow Information Export (IPFIX) is another netflow protocol that uses a flexible, template-based approach. While NetFlow v9 is widely used, IPFIX has become the industry standard. In fact, IPFIX is based on NetFlow v9. The two protocols are so similar that IPFIX is sometimes informally called NetFlow v10, although it is an IETF standard rather than a Cisco product.
Another popular IP network flow monitoring protocol and data record standard is sFlow, short for "sampled flow." Introduced by InMon Corp, it doesn't track every packet crossing your network as NetFlow does; instead, it captures a random sample of network traffic. This random sampling means there's less traffic flow data to process and analyze, but it minimizes the impact on performance. Since NetFlow tracks everything, it can cause network slowdowns.
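The sampling tradeoff can be sketched in a few lines of Python. This is an illustrative 1-in-N random sampler, not the actual sFlow wire protocol; the function name and rate are our own:

```python
import random

def sample_packets(packets, rate_n, rng=None):
    """Keep roughly 1 in every rate_n packets, the way sFlow-style random
    sampling does. Illustrative only: this is not the sFlow wire format."""
    rng = rng or random.Random(0)   # fixed seed so the sketch is reproducible
    return [p for p in packets if rng.randrange(rate_n) == 0]

packets = list(range(10_000))
sampled = sample_packets(packets, rate_n=4)   # 1-in-4 sampling
print(len(sampled) / len(packets))            # roughly 0.25
```

A quarter of the data survives, so the collector and analyzer do a quarter of the work, at the cost of possibly missing short-lived flows.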
Other netflow monitoring protocols include J-Flow from Juniper Networks, NetStream from 3Com/Huawei, Cflow from Alcatel-Lucent, and Rflow from Ericsson. A generic term used to refer to these tools is xFlow.
NetFlow solutions generally have three main components:
1. NetFlow exporter. A NetFlow-enabled device, usually a router or firewall, operates as a flow exporter and collects flow information. It aggregates data packets into flows and periodically exports NetFlow records via User Datagram Protocol (UDP) to one or more NetFlow collectors.
The exporter identifies a flow as a unidirectional packet stream whose packets share a common set of key fields: input interface, source and destination IP addresses, source and destination port numbers, Layer-3 protocol and type of service. A flow is ready for NetFlow export when it has been inactive for a set period of time, or when a TCP flag such as FIN or RST indicates that the flow has ended.
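The aggregation and expiry logic described above can be sketched as follows. The key mirrors the classic NetFlow key fields, but the class, field names and timeout value are illustrative, not Cisco's implementation:

```python
# Sketch of the flow aggregation and expiry an exporter performs.
class FlowTable:
    def __init__(self, inactive_timeout=15.0):
        self.inactive_timeout = inactive_timeout
        self.flows = {}   # flow key -> accumulated counters

    def add_packet(self, key, length, now):
        flow = self.flows.setdefault(key, {"packets": 0, "bytes": 0, "last_seen": now})
        flow["packets"] += 1
        flow["bytes"] += length
        flow["last_seen"] = now

    def expire(self, now):
        """Return flows idle past the inactive timeout, ready for export."""
        ready = {k: v for k, v in self.flows.items()
                 if now - v["last_seen"] >= self.inactive_timeout}
        for k in ready:
            del self.flows[k]
        return ready

# Key: (src IP, dst IP, src port, dst port, protocol, ToS, input interface)
table = FlowTable(inactive_timeout=15.0)
key = ("10.0.0.1", "10.0.0.2", 40000, 443, 6, 0, 1)
table.add_packet(key, 1500, now=0.0)
table.add_packet(key, 600, now=1.0)
exported = table.expire(now=30.0)   # idle for 29s, past the 15s timeout
print(exported[key]["packets"], exported[key]["bytes"])  # 2 2100
```

A real exporter would also apply an active timeout so that long-running flows are exported periodically rather than held indefinitely.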
2. NetFlow collector. A NetFlow collector can be hardware- or software-based, although software-based tools are more commonly used. NetFlow collectors receive the aggregated flow record data from flow exporters, and then preprocess and store it.
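A minimal software collector can be sketched in Python. The NetFlow v5 datagram layout used here (24-byte header followed by fixed 48-byte records) is publicly documented; the field selection and dictionary names are our own, and a real collector would sit in a loop on a UDP socket:

```python
import socket
import struct

HEADER_FMT = "!HHIIIIBBH"    # version, count, sys_uptime, unix_secs, unix_nsecs,
                             # flow_sequence, engine_type, engine_id, sampling
HEADER_LEN = struct.calcsize(HEADER_FMT)   # 24 bytes
RECORD_FMT = "!4s4s4sHHIIIIHHBBBBHHBBH"    # src, dst, nexthop, ifaces, counters...
RECORD_LEN = struct.calcsize(RECORD_FMT)   # 48 bytes

def parse_v5(datagram):
    """Parse one NetFlow v5 datagram into a list of flow-record dicts."""
    version, count = struct.unpack(HEADER_FMT, datagram[:HEADER_LEN])[:2]
    if version != 5:
        raise ValueError(f"not a NetFlow v5 datagram (version={version})")
    records = []
    for i in range(count):
        off = HEADER_LEN + i * RECORD_LEN
        f = struct.unpack(RECORD_FMT, datagram[off:off + RECORD_LEN])
        records.append({
            "src": socket.inet_ntoa(f[0]), "dst": socket.inet_ntoa(f[1]),
            "packets": f[5], "bytes": f[6],
            "src_port": f[9], "dst_port": f[10], "protocol": f[13],
        })
    return records

# A collector would typically receive these datagrams over UDP:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.bind(("0.0.0.0", 2055))          # 2055 is the conventional port
#   data, addr = sock.recvfrom(65535)
#   for record in parse_v5(data): ...
```

Template-based formats like v9 and IPFIX replace the fixed record layout with templates that the exporter sends first, so the collector learns the field layout at runtime.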
3. NetFlow analyzer. A NetFlow analyzer is a tool that processes and analyzes NetFlow records received and stored by a flow collector. It turns data into reports and alerts that provide insight on bandwidth usage, bandwidth hogs, traffic patterns, application usage and other performance metrics that may identify security threats and performance problems. This traffic flow analysis lets you create a picture of your network traffic and volume.
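The kind of summary an analyzer produces can be illustrated with a few lines of Python that rank source addresses ("top talkers") by bytes sent. The record layout is a hypothetical dictionary, not any specific product's schema:

```python
from collections import Counter

def top_talkers(records, n=5):
    """Rank source addresses by total bytes sent across all their flows."""
    usage = Counter()
    for r in records:
        usage[r["src"]] += r["bytes"]
    return usage.most_common(n)

records = [
    {"src": "10.0.0.1", "bytes": 5_000},
    {"src": "10.0.0.2", "bytes": 20_000},
    {"src": "10.0.0.1", "bytes": 7_000},
]
print(top_talkers(records))  # [('10.0.0.2', 20000), ('10.0.0.1', 12000)]
```

Real analyzers apply the same grouping idea across many dimensions at once, such as application port, protocol and interface, and attach thresholds and alerts to the results.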
NetFlow data provides deep visibility into your network, which helps you optimize performance for better user experiences.
Understand traffic flow to maximize throughput. There is great value in seeing IP traffic patterns across your entire network, such as tracking traffic coming into the corporate network and identifying your top users. Security and network ops teams can use this flow information to monitor network application use, detect bottlenecks and reduce the risk of downtime. NetFlow data is also helpful for usage-based billing, allowing network costs to be charged to specific business units or end users.
Plan for growth with accuracy. NetFlow data helps you track your network traffic to ensure adequate bandwidth capacity and best planning for network growth. This information makes planning upgrades easier and more efficient in terms of the number of ports, routing devices and other needs.
Heighten cybersecurity protection. Visualizing changes in network behavior helps your SecOps team identify anomalies that can indicate a cybersecurity breach. This data can also be helpful after a security incident to replay history and better understand what happened and how to avoid that scenario in the future.
Improve your user experience. Collecting and analyzing traffic flow data is a great help in diagnosing and troubleshooting slowdowns, network traffic spikes, bandwidth overuse and other network problems.
Gain insights within minutes. Many network devices already support NetFlow or IPFIX, making it simple to activate and point the resulting data to a NetFlow collector. If your devices don't already support it, adding NetFlow is relatively inexpensive and generally requires no extra hardware. Once collector software is installed, you can configure a few network nodes to add IP flow analysis to your network in just a few minutes, without downtime.
Because it accounts for every packet, NetFlow can consume significant bandwidth and device resources, which can impact the performance of the devices it monitors. Because of this limitation, some teams choose to sample IP packets instead, using sFlow, for instance, which uses much less bandwidth. However, the downside is that sampling can keep IT teams from spotting critical network security or performance problems.
Also, a NetFlow exporter can typically forward records to only a limited number of collectors or monitoring tools, often fewer than needed to adequately manage and troubleshoot a network. This limitation means that too few people see too little information about how the network is performing to stay on top of problems and threats.
Another limitation is that while NetFlow identifies a device that sends or receives traffic, it doesn’t look at login information, so it cannot provide user identities.
Before NetFlow was available, IT professionals used Simple Network Management Protocol (SNMP) to analyze and monitor network traffic. SNMP is still widely used for network monitoring.
Unlike NetFlow, SNMP monitors device-level metrics such as memory, CPU and storage usage, and device temperature, collecting information for standard network monitoring and capacity planning. It is also commonly used for real-time network management. However, SNMP does not provide detailed information about bandwidth usage, such as what a network is used for and who is using it.
NetFlow uses push technology, so you see information as soon as it’s available, whereas SNMP generally uses pull technology at set intervals.
Because NetFlow provides more information than SNMP, it’s better for deep network traffic analysis and debugging. NetFlow is more appropriate for complex, high-traffic networks that use IP traffic and for detecting anomalies. It provides more detailed information about applications and network traffic sources and is more scalable for performance analysis and network traffic management.
While both SNMP and NetFlow can be useful, it’s important to consider the differences to select the best option for your network.