With more data than ever traversing corporate networks, it’s only a matter of time before network administrators start to encounter Big Data problems of their own. Whether as part of an effort to optimize the performance of certain applications or simply to adhere to compliance mandates, capturing massive amounts of data as it travels across the network represents a major challenge.
Looking to address that specific issue, WildPackets has reengineered WatchPoint, its network monitoring tool; WatchPoint 3.0 allows network administrators to capture all network flow and packet analysis data with one-minute granularity for up to one year.
Based on Hewlett-Packard’s Vertica columnar database platform for storing Big Data, WatchPoint 3.0 comes in the form of an appliance configured with up to 8TB of disk space. According to Jay Botelho, director of product management at WildPackets, that dedicated appliance can then be used to, for example, capture all the OmniFlow, NetFlow and sFlow data traveling across as many as 20 different network segments.
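To give a sense of what one-minute flow granularity means in practice, here is a minimal Python sketch that rolls NetFlow-style records up into per-minute, per-segment byte counts. The record fields and segment names are hypothetical illustrations, not WildPackets’ actual schema or implementation.

```python
from collections import defaultdict
from datetime import datetime, timezone

# Hypothetical NetFlow-style records: (epoch timestamp, segment, src, dst, bytes).
# Field names and values are illustrative only, not the WatchPoint schema.
flows = [
    (1672531205, "segment-01", "10.0.0.5", "10.0.1.9", 48_000),
    (1672531242, "segment-01", "10.0.0.7", "10.0.1.9", 1_200_000),
    (1672531290, "segment-02", "10.0.2.3", "10.0.0.5", 640_000),
]

# Roll each flow record into a (segment, minute) bucket -- the kind of
# one-minute rollup a long-retention flow store would maintain for a year.
buckets = defaultdict(int)
for ts, segment, src, dst, nbytes in flows:
    minute = ts - (ts % 60)  # truncate the timestamp to the minute boundary
    buckets[(segment, minute)] += nbytes

for (segment, minute), total in sorted(buckets.items()):
    stamp = datetime.fromtimestamp(minute, tz=timezone.utc)
    print(f"{segment} {stamp:%Y-%m-%d %H:%M} {total} bytes")
```

A columnar store such as Vertica suits this workload because queries typically scan a few columns (segment, minute, bytes) across enormous numbers of rows.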
Because WatchPoint 3.0 is integrated with existing troubleshooting and analytics tools from WildPackets, network administrators can then analyze all that data down to the packet level in order to determine how to prioritize the flows of applications moving massive amounts of Big Data across the network.
There’s a natural tendency to think of Big Data mainly as a storage issue. In reality, most Big Data applications also have velocity characteristics that can easily overwhelm available bandwidth. Network administrators are going to have to come to terms with the nature of Big Data traffic on their networks, which starts, of course, with figuring out where waves of that traffic are coming from and where they are ultimately headed in order to minimize the impact on every other application that has to share the same network.
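As a rough illustration of that first step, the sketch below ranks source hosts by bytes sent, surfacing the “top talkers” behind those waves of traffic. It reuses the same hypothetical flow records as above; a production tool does this at vastly larger scale, and nothing here reflects WatchPoint’s internals.

```python
from collections import Counter

# Hypothetical flow records: (epoch timestamp, segment, src, dst, bytes).
flows = [
    (1672531205, "segment-01", "10.0.0.5", "10.0.1.9", 48_000),
    (1672531242, "segment-01", "10.0.0.7", "10.0.1.9", 1_200_000),
    (1672531290, "segment-02", "10.0.0.7", "10.0.0.5", 640_000),
]

# Tally bytes sent per source host to identify where the heaviest
# traffic originates -- the starting point for minimizing its impact
# on other applications sharing the network.
bytes_by_source = Counter()
for ts, segment, src, dst, nbytes in flows:
    bytes_by_source[src] += nbytes

for src, total in bytes_by_source.most_common(5):
    print(f"{src}: {total} bytes sent")
```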