Network Monitoring as Big Data

Network monitoring has been around in some form ever since networks were established, largely focused on fixing failures and improving performance. Monitoring the network continues to be essential today, and the data collected from these efforts -- especially historical data -- falls into the category of big data.

New Technologies Change the Scope
Most networks today are static in structure but still need to support dynamically changing traffic. When traffic changes drastically enough, the network is redesigned: routing tables are modified and bandwidth is reassigned or increased. These alterations resolve performance issues on the assumption that traffic will not change drastically again in the near term. A static network cannot restructure itself to respond to substantial new traffic levels.

The advent of virtualization across the data center and its servers, software-defined networking (SDN), and network functions virtualization (NFV) means that much of the resource configuration will be under software control, charged with dynamically responding to any changes or problems.

These are dynamic networks supporting dynamic traffic. All software responses will be automated, and human network operators will be informed of the changes after the fact. But the enterprise needs to be able to analyze the automated actions to ensure that they deliver what was expected and to modify automated behavior to satisfy the enterprise's goals.
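One way to make that after-the-fact analysis tractable is to have the automation record every action it takes as a structured event. The Python sketch below is a minimal illustration under that assumption; the event fields and the record_change helper are hypothetical, not any particular SDN product's API.

```python
import json
import time

# Hypothetical audit trail for automated network changes.
# Field names (action, resource, reason) are illustrative only.
AUDIT_LOG = "network_change_audit.jsonl"

def record_change(action, resource, reason, details=None):
    """Append one automated-change event for after-the-fact analysis."""
    event = {
        "timestamp": time.time(),  # when the automation acted
        "action": action,          # e.g. "reroute", "scale-bandwidth"
        "resource": resource,      # e.g. "edge-router-7"
        "reason": reason,          # what triggered the automation
        "details": details or {},
    }
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(event) + "\n")

# Example: the automation reroutes traffic and logs why, so operators
# can later verify the action delivered what was expected.
record_change("reroute", "edge-router-7",
              reason="link utilization > 90%",
              details={"old_path": "A-B-C", "new_path": "A-D-C"})
```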

Network Data = Big Data
As Sandy Borthick, an analyst with Frost & Sullivan's Stratecast unit, recently told us in the No Jitter post Big Data: Tools and a Test, big data refers to data sets that are too large and complex, or are growing and changing too quickly, for traditional databases and applications. Either general-purpose statistical packages or special-purpose software applications can analyze big data for a variety of purposes within an organization.

Network and data center monitoring data definitely qualifies as big data, and the volume of data produced from monitoring efforts continues to increase. The monitored data also changes rapidly -- in milliseconds and even microseconds -- and can change faster than humans can analyze and respond.

Visibility into the data center network connecting virtualized servers is poor, yet there is a need to correlate the traffic within the data center with the traffic from the WANs and LANs. One affects the other, but most data center monitoring is separate and disconnected from WAN/LAN monitoring.
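Bridging that disconnect is, at its simplest, a correlation problem: samples from the two monitoring systems must be paired up in time before one can be said to affect the other. The following Python sketch illustrates the idea with invented sample records; real collectors emit far richer data, and the ten-second matching window is an arbitrary assumption.

```python
from datetime import datetime, timedelta

# Invented samples from two disconnected monitoring systems.
dc_samples = [
    {"time": datetime(2014, 6, 1, 10, 0, 0), "gbps": 8.2},
    {"time": datetime(2014, 6, 1, 10, 0, 30), "gbps": 9.7},
]
wan_samples = [
    {"time": datetime(2014, 6, 1, 10, 0, 5), "gbps": 1.1},
    {"time": datetime(2014, 6, 1, 10, 0, 33), "gbps": 1.9},
]

def correlate(a, b, window=timedelta(seconds=10)):
    """Pair samples from two systems whose timestamps fall within `window`."""
    pairs = []
    for x in a:
        for y in b:
            if abs(x["time"] - y["time"]) <= window:
                pairs.append((x, y))
    return pairs

for dc, wan in correlate(dc_samples, wan_samples):
    print(f"{dc['time']}: DC {dc['gbps']} Gbps <-> WAN {wan['gbps']} Gbps")
```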

Big Data as a Tool
In Big Data: A Tool, Not an Answer, I described five factors of big data that produce challenges for anyone who wants to process it and generate useful, timely results:

  • Quantity--The amount of data produced by a wide range of network resources keeps increasing, arriving in structured and unstructured forms depending on the resource's vendor.
  • Delivery Speed--The rate of data delivery demands rapid processing. Traffic and resource changes within a virtualized, automated network can happen in milliseconds. If left unanalyzed, the data's value decays until it is merely historical, no longer usable for dealing with real-time situations or for making predictions.
  • Variability--Data creation does not follow a smooth pattern. Large bursts of data can occur due to unexpected events (traffic bursts, resource failures, and security attacks) as well as periodic events.
  • Many Formats--It would be nice if all data were in a common format, but this is rarely the case. Vendors add proprietary data extensions to make their products more attractive by providing more collectable information. The variety of formats and extensions is already significant, making analysis that much more complicated.
  • Many Sources--Connecting, linking, matching, and transforming data from many sources is quite a task. If correlations cannot be made, data relationships will be fragmented and the resulting analyses will not offer useful, actionable results. (The sketch after this list shows how records from two vendors might be normalized into a common schema.)
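To make the Many Formats and Many Sources problems concrete, here is a minimal Python sketch assuming two invented vendor payloads that report the same interface-utilization measurement with different field names, units, and timestamp conventions. Nothing here reflects any real vendor's format.

```python
from datetime import datetime, timezone

# Two invented vendor payloads for the same kind of measurement --
# different field names, units, and timestamp conventions.
vendor_a = {"ts": 1401616800, "ifc": "ge-0/0/1", "util_pct": 87}
vendor_b = {"sampledAt": "2014-06-01T10:00:02Z", "port": "Gi1/0/1", "load": 0.91}

def normalize_a(rec):
    """Map vendor A's epoch-seconds/percent style onto the common schema."""
    return {"time": datetime.fromtimestamp(rec["ts"], tz=timezone.utc),
            "interface": rec["ifc"],
            "utilization": rec["util_pct"] / 100.0,
            "source": "vendor_a"}

def normalize_b(rec):
    """Map vendor B's ISO-8601/fraction style onto the common schema."""
    return {"time": datetime.fromisoformat(rec["sampledAt"].replace("Z", "+00:00")),
            "interface": rec["port"],
            "utilization": rec["load"],
            "source": "vendor_b"}

# Once normalized, records from both vendors can be analyzed together.
for rec in sorted([normalize_a(vendor_a), normalize_b(vendor_b)],
                  key=lambda r: r["time"]):
    print(rec)
```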

What Do We Need?
Software-defined networking separates the resource that decides where traffic is sent (the control plane) from the resources that forward traffic to the selected destination (the forwarding plane). We do not have one SDN approach on the market but several, so we need some agreement on the data generated by different vendors and standardization of format and data content.

One group working to solve this multivendor issue is the Network Functions Virtualization Working Group of the European Telecommunications Standards Institute (ETSI). It states that NFV "aims to address... problems by leveraging standard IT virtualization technology to consolidate many network equipment types onto industry standard high volume servers, switches and storage, which could be located in data centers, network nodes and in the end user premises."

If ETSI is able to standardize NFV and its monitoring data, then we may avoid the multivendor fragmentation we see with SDN. If not, we will need standards here as well.

A third entry into the confusion is the service provider community. Nearly all enterprises use some service providers (SPs) in their networks. SPs may be reluctant to share their monitoring data, thereby limiting what is available for big data analysis. They may also add their own formats and content information.

What we need is a holistic approach to big data network monitoring analysis. Today's fragmented condition means that the enterprise will have to focus on a single resource vendor to reduce the variety of data elements with varied formats and content. Even then, this will not resolve the differences in the monitoring data coming from the providers.

Finally, many big data tools are focused on human behavior and consumer data. Vendors of big data tools will have to expand their products to handle the real-time response requirements of the enterprise network staff.

Conclusion
As I wrote in Big Data: A Tool, Not an Answer, big data is nothing unless it's properly digested, correlated, matched, and transformed across systems. Networking staff in charge of big data face a problem in that most of the collecting systems are neither connected nor designed to be. Ensuring that the right data is collected and the right questions asked is going to take a lot of training, experience, new system connections, and feedback.

Unless steps are taken to standardize the data, the end results of analyzing it will be useless and possibly dangerous to the health of the network -- and therefore detrimental to the enterprise.
