While most enterprises are making the jump from 1G to 10G, some early adopters are leaping ahead to 40G and even 100G. These rapid advances in networking, along with new technologies such as virtual networks and software-defined networking (SDN), are driving an equally large change in network monitoring technologies.
Monitoring, analyzing, and troubleshooting seamlessly at multi-gigabit speeds, especially in real time, is becoming a significant challenge. There are, however, ways to make the process more manageable and ensure that you are not missing the data you need for detailed network analysis.
Prioritize Monitoring Needs
Let’s say your 10G link (full duplex) is 50% utilized. That’s 10Gbps of traffic that requires analysis, or roughly 75GBytes of data that must be analyzed per minute. Outside of supercomputing applications such as weather prediction, very few workloads demand this level of data processing. And since supercomputing platforms are not within the budget of most network analysis solutions, all of this analysis must be packed into more traditional, affordable computing platforms.
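The arithmetic above can be sketched in a few lines; this is a back-of-the-envelope helper, not part of any monitoring product, and it assumes decimal units (1 GByte = 10^9 bytes) and counts both directions of a full-duplex link:

```python
def monitored_gbytes_per_minute(link_speed_gbps: float,
                                utilization: float,
                                full_duplex: bool = True) -> float:
    """Estimate GBytes of traffic a monitoring tool must process per minute.

    link_speed_gbps: nominal speed of one direction of the link (e.g. 10).
    utilization:     fraction of total capacity in use (0.0 to 1.0).
    full_duplex:     if True, both directions contribute capacity.
    """
    directions = 2 if full_duplex else 1
    gbits_per_second = link_speed_gbps * directions * utilization
    gbytes_per_second = gbits_per_second / 8  # 8 bits per byte
    return gbytes_per_second * 60             # seconds per minute

# A 10G full-duplex link at 50% utilization:
print(monitored_gbytes_per_minute(10, 0.5))  # 75.0 GBytes per minute
```

Note that this counts raw payload at the nominal link rate; capture overhead (timestamps, headers) and line-encoding overhead would push the real figure somewhat higher.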