    Make Way for the Edge Data Center

    Cloud computing is leading to a massive centralization of IT resources. If current trends progress, the vast majority of data infrastructure will be housed in giant regional cloud facilities, with only highly converged systems remaining in corporate settings around the world.

    This will undoubtedly be more efficient and less costly, but it presents a problem: Centralized resources are not great for time-sensitive applications, since the data center is now many miles away from the user. This is why future data architectures will rely on both massive centralization and legions of automated mini data centers on the edge.

    These facilities will be crucial for both the content-streaming services that populate the web and emerging Big Data/IoT workloads that need to gather data and produce analytical results at a moment’s notice in order to capitalize on fast-moving market opportunities. As IHS Markit analyst Lucas Beran noted in a recent series of blogs on Data Center Journal, the typical edge facility will process loads drawing between 10 and 100 kW and will provide services like data aggregation and content caching to reduce latency and network congestion across wide-area infrastructure. At the same time, organizations can use these facilities to provide targeted, regional services to give a more local feel to national and even international product offerings.
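
    To make that caching role concrete, here is a minimal sketch in Python. It is purely illustrative rather than drawn from Beran's analysis: an edge node holds recently requested content locally and reaches back to the distant core facility only on a miss, which is where the latency and wide-area congestion savings come from.

```python
from collections import OrderedDict

# Hypothetical sketch: an edge node keeps a small LRU cache of content
# objects so that repeat requests never traverse the wide-area link.
class EdgeCache:
    def __init__(self, capacity, fetch_from_core):
        self.capacity = capacity                 # objects the edge node can hold
        self.fetch_from_core = fetch_from_core   # slow call back to the central facility
        self.store = OrderedDict()               # recency-ordered local store

    def get(self, key):
        if key in self.store:
            self.store.move_to_end(key)          # hit: served locally, no WAN trip
            return self.store[key]
        content = self.fetch_from_core(key)      # miss: one trip across the WAN
        self.store[key] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)       # evict the least recently used object
        return content

# Usage: only the first request for "video-42" crosses the wide-area network.
cache = EdgeCache(capacity=1000, fetch_from_core=lambda k: f"payload for {k}")
cache.get("video-42")   # miss -> fetched from the core
cache.get("video-42")   # hit  -> served from the edge
```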

    Naturally, edge devices will have to be small and largely self-sufficient, which means they will have to be built around hyperconverged infrastructure (HCI) and outfitted with a fair degree of automation, says IT analyst Zeus Kerravala. By adopting the same hyperconverged architectures that populate the data centers of Google and Facebook, the edge data center gains high levels of agility, scale and fault tolerance. And by layering services atop a virtual architecture, edge resources can be integrated into a distributed resource pool, which makes it easier to manage data environments remotely and to scale physical footprints when necessary.
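
    A rough picture of what that remote automation might look like: the sketch below models a central controller polling a pool of unattended edge sites and deciding where to auto-join another hyperconverged node or trigger failover. The site names, metrics and thresholds are hypothetical, not taken from any vendor's management stack.

```python
from dataclasses import dataclass

# Hypothetical model of a remotely managed edge site in a distributed pool.
@dataclass
class EdgeSite:
    name: str
    cpu_util: float      # fraction of compute in use (0.0 - 1.0)
    storage_util: float  # fraction of local storage in use
    healthy: bool        # result of the last automated health probe

def plan_actions(pool, scale_up_at=0.8):
    """Central-controller logic: decide, per site, whether to add a
    hyperconverged node or dispatch remediation -- no on-site staff needed."""
    actions = []
    for site in pool:
        if not site.healthy:
            actions.append((site.name, "fail over and open a remediation ticket"))
        elif max(site.cpu_util, site.storage_util) >= scale_up_at:
            actions.append((site.name, "ship and auto-join one more HCI node"))
        else:
            actions.append((site.name, "no action"))
    return actions

# Usage with invented sites standing in for unattended mini data centers.
pool = [EdgeSite("portland-01", 0.85, 0.60, True),
        EdgeSite("boise-01",    0.40, 0.35, True),
        EdgeSite("spokane-01",  0.55, 0.50, False)]
for name, action in plan_actions(pool):
    print(name, "->", action)
```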

    A glimpse of this future can be seen in the newest EdgeConneX facility near Portland, Oregon, says Light Reading’s Carol Wilson. The company has partnered with service providers Electric Lightwave and Megaport, along with tech vendors Ciena Corp., Opus Interactive and ScienceLogic, to create the Portland Edge Data Center Cloud Ecosystem. The idea is to provide direct, private cloud connections that are currently available only at top Internet peering sites. In this way, it can deliver direct access to AWS and Azure clouds using high-speed fiber and flexible SDN architectures, allowing the enterprise to extend its data infrastructure without having to develop in-house expertise in telecom-layer networking.

    The rise of edge data centers affects more than just the enterprise, according to Data Center Knowledge’s Yevgeniy Sverdlik – it will change the very geography of the Internet. By decentralizing the traditional hubs that have serviced the modern Internet for more than 20 years, these facilities will have a major impact on data traffic patterns across the globe and will bring much-needed bandwidth to second-tier markets that are becoming just as data-dependent as their major-market brethren. Not only will this make more data available to more people, it will also lower the cost of content delivery. According to a recent study by EdgeConneX, top content providers like Netflix and Google can cut their transport costs in half by caching data locally.
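
    The arithmetic behind that claim is straightforward. Using purely illustrative figures (none of them from the EdgeConneX study), serving half of all requests from a local cache means only the other half pays for long-haul transport, which is exactly the halving the study describes.

```python
# Illustrative back-of-the-envelope numbers; not drawn from the EdgeConneX study.
monthly_traffic_gb = 500_000        # content delivered to a second-tier market
transit_cost_per_gb = 0.02          # long-haul transport cost, dollars per GB
edge_hit_rate = 0.5                 # share of requests served from the local cache

cost_without_edge = monthly_traffic_gb * transit_cost_per_gb
cost_with_edge = monthly_traffic_gb * (1 - edge_hit_rate) * transit_cost_per_gb

print(f"Without edge caching: ${cost_without_edge:,.0f}/month")
print(f"With edge caching:    ${cost_with_edge:,.0f}/month")
# A 50% local hit rate cuts the long-haul transport bill in half.
```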

    Edge data centers represent the culmination of multiple trends taking place in the enterprise data center; namely, miniaturization, consolidation, commoditization and the deployment of software-defined data architectures. As data environments become less dependent on fixed hardware, it is somewhat ironic that location still influences the speed and agility that users require for their digital interactions.

    It will still be possible to access files from halfway around the world, but with greater capacity and more processing on the edge, you won’t have to unless it is absolutely necessary.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

