Exabytes and Zettabytes -- in Other Words, a Whole Lot -- of Data


I've posted several times on the issue of Comcast and other cable operators' network monitoring initiatives. The ISPs say the activities are legitimate efforts to manage bandwidth, while critics see an effort to exert control for competitive reasons.


In any case, the tools will have plenty of content to look after. Just how busy the Internet will be is coming into focus. This week, Cisco released a report that shows the alarming rate of growth in Internet traffic. The Cisco Visual Networking Index, which is linked to from this InformationWeek report, says that Internet usage grew and will keep growing at a compound annual rate of 46 percent from 2007 to 2012.
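As a rough sanity check, here is what that 46 percent figure implies if it compounds annually over the report's five-year window (this back-of-the-envelope math is mine, not Cisco's):

```python
# Compound 46 percent annual growth over the 2007-2012 window:
# traffic at the end is roughly 6.6 times traffic at the start.
cagr = 0.46
years = 5  # 2007 through 2012
growth_factor = (1 + cagr) ** years
print(f"about {growth_factor:.1f}x over five years")  # about 6.6x
```

In other words, the report is projecting that traffic more than sextuples over the period.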


Indeed, we are getting into numbers few people have ever heard of. The new units of measure, the vendor says, are exabytes (one of which is the equivalent of 250 million DVDs, or 1 billion gigabytes) and zettabytes (1,000 exabytes, equal to 250 billion DVDs or 1 trillion GB). Jim Cicconi, vice president of legislative affairs for AT&T, put it another way: In just three years, he was quoted by CNET as saying, 20 typical households will create more content than the entire Internet now does. The InformationWeek story didn't suggest how many exabytes or zettabytes will be used by 2012, but just the nature of the numbers is a bit chilling. The story said that 90 percent of the traffic will be video-on-demand, peer-to-peer (P2P) and Internet video by 2012.
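The unit math is easy to verify. The sketch below assumes decimal (SI) prefixes, so an exabyte is 10^9 gigabytes, and a 4.7 GB single-layer DVD; at that capacity the DVD count comes out a bit lower than the report's round 250 million, which evidently assumes roughly 4 GB per disc:

```python
# Back-of-the-envelope unit conversions for exabytes and zettabytes,
# assuming SI prefixes (1 EB = 10**9 GB) and a 4.7 GB single-layer DVD.
GB = 1
EB = 10**9 * GB      # one exabyte, in gigabytes
ZB = 1000 * EB       # one zettabyte, in gigabytes
DVD = 4.7 * GB       # capacity of a single-layer DVD

print(f"1 EB ~ {EB / DVD:,.0f} DVDs")   # roughly 213 million
print(f"1 ZB ~ {ZB / DVD:,.0f} DVDs")   # roughly 213 billion
```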


This Government Computer News piece provides an example of the kind of growth that is expected and the steps necessary to meet the demand. The bottom line is that the steady improvement in technology is leading to rapid proliferation in the amount of data created and subsequently networked. The story says that traffic generated by the Energy Sciences Network (ESnet), which is affiliated with the Energy Department, has increased by a factor of 10 every 47 months since 1990. If anything, the rate of increase will climb when the switch is flipped on the Large Hadron Collider particle accelerator this summer in Switzerland. The story says a 10 gigabit-per-second (Gbps) link is necessary now -- and will be provided by Internet2 -- and that a connection running at 10 times that speed will be needed by 2010.
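To put the ESnet figure in more familiar terms, a tenfold increase every 47 months works out to roughly 80 percent growth per year (my arithmetic, not the story's):

```python
# Convert "10x every 47 months" into an equivalent annual growth rate:
# solve (1 + r)**(47/12) = 10 for r.
annual_rate = 10 ** (12 / 47) - 1
print(f"about {annual_rate:.0%} growth per year")  # about 80%
```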


Ike Elliott takes a crack at determining precisely where the weak points are in this fearsome network tapestry. His post starts by defining the three sectors of the Internet: Tier 1 backbones are the huge networks that connect major cities, "middle mile" infrastructure connects these to ISPs (and, though he doesn't say it, enterprises), and the last mile connects ISPs to end users. The post says that the heaviest pressure is in the middle mile. Elliott offers six data points suggesting how quickly traffic is increasing while contending that the current infrastructure is just barely covering needs. Finally, Elliott outlines what the immediate future holds as vendors and service providers struggle to keep up with demand.


There is a storm brewing, and quite a storm it will be. The key to surviving and even thriving will be to find technologies to cut bandwidth requirements for a given amount of data while simultaneously expanding network infrastructure.