Ready or not, the Internet of Things (IoT) is bearing down on the enterprise at a rapid clip, which is forcing the industry to shift its IT capital budgets from the central data center to the edge.
But what form will this new edge take? And how is the enterprise supposed to manage such broadly distributed infrastructure when it can barely keep tabs on what is going on at home?
According to Peter Levine of venture capital firm Andreessen Horowitz, it is important for the enterprise to understand that the edge represents not only a new layer of infrastructure, but a new way of computing. In a recent video post, he notes that the real-time, sensor-driven data on the edge is vastly different from traditional enterprise workloads in both form and function. Whether it is managing the multiple gigabytes from a connected car or the feedback from a robotic assembly line, edge computing will have to be smarter, more self-sufficient and far more reliable than anything developed in the past. It also means that technology giants will have to give up on dreams of controlling or dominating the IoT and instead focus on broad cooperation across multiple layers of the data infrastructure stack.
Judging by the steady stream of IoT platforms hitting the channel, however, it’s difficult to tell if the vendor community has come to this realization just yet. In recent weeks, we’ve seen partnerships emerge from Cisco and SAS, HPE and Rittal, plus various solutions from start-ups like Benu Networks and long-time providers like Dell-EMC. At the moment, it doesn’t appear that providers are attempting to lock organizations into proprietary solutions, but neither is it clear how deeply a given set of solutions needs to integrate in order to provide an appropriate level of IoT reliability.
Along with serverless computing and machine learning, the new edge will redefine cloud computing as we know it, says IT architect Janakiram MSV. Posting on Forbes, MSV notes that the need to drive high-speed analytics as close to the data endpoint as possible will require a new generation of microservices built on serverless platforms (or Functions as a Service, FaaS) like AWS Lambda, Azure Functions, Google Cloud Functions and OpenWhisk. Meanwhile, machine learning will provide the adaptability that IoT services will need to respond to changing data, enabling continuous improvements in accuracy and troubleshooting and allowing the overall ecosystem to get better as it engages with real-world user patterns.
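To make the FaaS idea concrete, here is a minimal sketch of what an edge analytics function might look like in the Lambda-style handler pattern. The payload shape, device names and anomaly threshold are all illustrative assumptions, not part of any vendor's actual platform; the point is simply that a small, stateless function can filter sensor data close to the endpoint before anything travels upstream.

```python
import json
import statistics

# Illustrative threshold (in standard deviations) -- an assumption for
# this sketch, not a value from any real deployment.
ANOMALY_THRESHOLD = 2.0

def handler(event, context=None):
    """Lambda-style entry point: flag sensor readings that sit far
    from the batch mean, returning only the outliers upstream.

    Assumed (hypothetical) payload shape:
        {"device_id": "...", "readings": [20.0, 20.1, ...]}
    """
    readings = event["readings"]
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    anomalies = [
        r for r in readings
        if stdev > 0 and abs(r - mean) / stdev > ANOMALY_THRESHOLD
    ]
    return {
        "statusCode": 200,
        "body": json.dumps({
            "device_id": event["device_id"],
            "mean": round(mean, 2),
            "anomalies": anomalies,
        }),
    }
```

Because the function holds no state between invocations, the platform can spin up as many copies as the sensor fleet demands, which is precisely the elasticity argument for pairing FaaS with edge workloads.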
Yet another key edge solution is likely to be the micro data center (MDC), says Dell-EMC’s Jyeh Gan. Although the concept has been around for a while, the IoT is leading to a wealth of diverse MDC solutions intended to meet a range of environmental conditions, while new open management platforms like Redfish are allowing for broad control of heterogeneous deployments. At the same time, increased modularity is enabling rapid installation and remote monitoring capabilities.
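The appeal of Redfish for heterogeneous MDC fleets is that it exposes hardware management as a plain HTTPS/JSON API rather than a vendor-specific protocol. As a rough sketch, the snippet below builds an authenticated request against the standard Redfish service root (`/redfish/v1`) and its Systems collection; the host address and credentials are hypothetical placeholders, and a real controller would require TLS certificate handling appropriate to the site.

```python
import base64
import json
import urllib.request

# Every conforming Redfish service is rooted at this path.
REDFISH_ROOT = "/redfish/v1"

def redfish_request(host, path, username, password):
    """Build an authenticated GET request for a Redfish endpoint.

    HTTP Basic auth is one of the schemes the Redfish spec permits;
    host, username and password here are placeholders.
    """
    url = f"https://{host}{REDFISH_ROOT}{path}"
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(url, headers={
        "Authorization": f"Basic {token}",
        "Accept": "application/json",
    })

def get_systems(host, username, password):
    """Fetch the Systems collection -- the inventory of managed servers."""
    req = redfish_request(host, "/Systems", username, password)
    with urllib.request.urlopen(req) as resp:  # live network call
        return json.load(resp)

if __name__ == "__main__":
    # Requires a reachable Redfish service, e.g. a BMC at 10.0.0.42:
    # print(get_systems("10.0.0.42", "admin", "password"))
    pass
```

Because every vendor that implements the standard answers the same paths with the same JSON schemas, one monitoring script can walk a mixed fleet of micro data centers without per-vendor plugins.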
In many ways, the IoT edge will incorporate some of the most challenging data infrastructure and architecture to date. Not only will it have to be immensely scalable, but incredibly rapid in its response times and capable of analyzing and correcting itself as the size and complexity of its data loads expand.
When the data environment is literally all around us, data infrastructure needs to become more prevalent and more responsive to our evolving needs.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.