AT&T said today that it will launch networks capable of providing 1 gigabit per second (Gbps) connectivity in Columbia, SC; Jackson, MS; Knoxville, TN; Milwaukee, WI; and Shreveport, LA. By the end of February, it will provide all-fiber networks in 51 metro areas across the country.
AT&T claims that its all-fiber networks now pass almost 4 million homes and businesses. By mid-2019, the carrier aims to offer such services to 12.5 million subscribers in 67 metro areas. That, according to Light Reading, will exceed the promise it made to the government when it acquired DirecTV in July 2015.
Such an ambitious rollout will drive innovation. Any telecommunications provider wants to use wireless technology or wired infrastructure that is already in the ground or on poles. One approach, which has long been a dream of network engineers and the financial people to whom they report, is to use electrical lines for distribution. The rationale is simple: those networks are already just about everywhere.
The challenge is that telecommunications signals are sensitive and fragile, while electrical power is robust and noisy. It is like having a hockey player and a figure skater share the same rink: things will be fine most of the time, but once in a while the skater will run into a significant (and, in this case, painful) obstacle.
AT&T is apparently making real progress on this front. Ars Technica reported in late January that the AirGig project, announced last September, may be field-tested later this year. The carrier is in “advanced discussions” with power companies about using their lines for gigabit services. Two trials may be mounted this fall, and one of them may be outside the United States.
The difference between AirGig and earlier attempts to deliver broadband over power lines is that the signals are not actually carried on the wires themselves:
Antennas that are placed on utility poles send wireless signals to each other; AT&T says the power lines “serve as a guide for the signals,” ensuring they reach their destination. AT&T says the wireless signals could be used to deliver multi-gigabit Internet speeds for either smartphone data or home Internet service.
Two key elements in upgrading such a huge network to gigabit status will be software-defined networking (SDN) and network functions virtualization (NFV). Transitioning a network from one dominated by hardware to one dominated by software is a complex effort. For AT&T, a vital element of that effort is the Enhanced Control, Orchestration, Management & Policy (ECOMP) platform, which can be thought of as the network’s operating system.
In essence, ECOMP is an attempt to make it possible to systematically develop, onboard, manage and retire services and functions in an SDN/NFV environment.
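To make that lifecycle concrete, here is a minimal Python sketch of the pattern an ECOMP-style orchestrator automates; the class and method names are illustrative assumptions, not ECOMP’s actual interfaces:

```python
# Hypothetical sketch of the service lifecycle an ECOMP-style platform
# automates: onboard a software-defined function, run it, retire it.
# Names here are illustrative, not AT&T's actual ECOMP APIs.
from dataclasses import dataclass
from enum import Enum


class State(Enum):
    ONBOARDED = "onboarded"
    RUNNING = "running"
    RETIRED = "retired"


@dataclass
class VirtualNetworkFunction:
    """A network function (e.g., a firewall) implemented in software."""
    name: str
    vcpus: int
    state: State = State.ONBOARDED


class Orchestrator:
    """Central controller that manages functions across their lifecycle,
    playing the 'network operating system' role described above."""

    def __init__(self) -> None:
        self.catalog: dict[str, VirtualNetworkFunction] = {}

    def onboard(self, vnf: VirtualNetworkFunction) -> None:
        # Register the function's descriptor in the service catalog.
        self.catalog[vnf.name] = vnf

    def instantiate(self, name: str) -> None:
        # A real platform would schedule the function onto cloud
        # infrastructure; here we only track the state change.
        self.catalog[name].state = State.RUNNING

    def retire(self, name: str) -> None:
        # Decommission the function without touching any hardware.
        self.catalog[name].state = State.RETIRED


if __name__ == "__main__":
    orch = Orchestrator()
    orch.onboard(VirtualNetworkFunction("virtual-firewall", vcpus=4))
    orch.instantiate("virtual-firewall")
    orch.retire("virtual-firewall")
    print(orch.catalog["virtual-firewall"].state)  # State.RETIRED
```

The point of the sketch is the design choice it mirrors: when every function is software, the whole develop-to-retire cycle becomes an API call rather than a hardware swap in the field.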
AT&T wants its ECOMP approach to become the industry standard, and a key step toward that goal has been taken: according to CIO, ECOMP, which other carriers are testing, has been open sourced and is now a Linux Foundation Collaborative Project.
Carl Weinschenk covers telecom for IT Business Edge. He writes about wireless technology, disaster recovery/business continuity, cellular services, the Internet of Things, machine-to-machine communications and other emerging technologies and platforms. He also covers net neutrality and related regulatory issues. Weinschenk has written about the phone companies, cable operators and related companies for decades and is senior editor of Broadband Technology Report. He can be reached at cweinsch@optonline.net and on Twitter at @DailyMusicBrk.