Internet of Things (IoT) management platforms are hitting the channel at a rapid clip this year, all promising to address the enterprise’s data collection and device connectivity challenges without requiring overly burdensome upfront investments in infrastructure.
But organizations looking to manage their IoT footprints should be aware that most of these solutions are aimed at the enterprise’s own, relatively limited set of IoT devices and data flows. Even for the largest of companies, this is a finite pool of IoT resources, and relying on it alone risks locking out potentially critical external data sets that could vastly alter analytics results and the versions of truth they provide to decision-makers.
Of course, no enterprise should have the right to control the endpoints of another, but without a mechanism for universal data collection and sharing, the IoT runs the risk of evolving around the same data silo architectures that plague the data center.
Google recently announced its Cloud IoT Core service that allows businesses to connect their own IoT sensors and devices to the company’s cloud platform where analytics services like Dataflow and BigQuery can collect, process and visualize data in real time. As eWeek’s Jaikumar Vijayan points out, this offers broad capability to enforce security and connectivity management across a fleet of devices, and it does provide a means for publicly available data to be integrated into IoT workflows, but the crux of the platform is intended to provide insight into what is happening within the enterprise’s own IoT ecosystem, not trends and events taking place in the broader economy.
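To make the device-to-cloud flow concrete, the sketch below shows how a device identifies itself to Cloud IoT Core's MQTT bridge and where it publishes telemetry for downstream services like Dataflow and BigQuery to pick up. The client ID and topic formats follow Cloud IoT Core's documented conventions; the project, registry, and device names are hypothetical placeholders, and the actual network transport (an MQTT client with JWT authentication) is only noted in comments.

```python
import json

# Cloud IoT Core's MQTT bridge identifies each device by a fully qualified
# client ID and routes telemetry on a per-device "events" topic.
# All names here (acme-iot, plant-sensors, temp-001) are illustrative.

def mqtt_client_id(project, region, registry, device):
    """Build the client ID format the Cloud IoT Core MQTT bridge expects."""
    return (f"projects/{project}/locations/{region}"
            f"/registries/{registry}/devices/{device}")

def telemetry_topic(device):
    """Telemetry is published to a per-device 'events' topic."""
    return f"/devices/{device}/events"

def telemetry_payload(sensor, value):
    """Serialize one reading as JSON for downstream analytics services."""
    return json.dumps({"sensor": sensor, "value": value})

client_id = mqtt_client_id("acme-iot", "us-central1", "plant-sensors", "temp-001")
topic = telemetry_topic("temp-001")
payload = telemetry_payload("temperature", 21.5)

# A real device would now connect an MQTT client (e.g. paho-mqtt) to
# mqtt.googleapis.com:8883 using client_id and a signed JWT as the password,
# then call publish(topic, payload).
```

Note that everything in this flow stays inside one project's registry, which is the article's point: the platform gives the enterprise visibility into its own device fleet, not into data produced outside it.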
By the same token, Dell EMC recently announced a suite of new products designed to help the enterprise establish IoT infrastructure quickly and easily. These include the integration of VMware’s Pulse IoT Center management stack into the Dell Edge gateway and partnerships with Bosch and Atos to incorporate rapid deployment frameworks and various integration and management tools into the IoT edge. Perhaps more significantly, however, the company says it is contributing code and microservices to the EdgeX Foundry project, which seeks to build an open IoT framework for edge devices. Still unclear, however, is whether this is intended to incorporate data from sources outside the enterprise’s direct control or to allow the enterprise to deploy multiple vendor solutions within its own IoT infrastructure.
To some IoT experts, a truly universal IoT ecosystem will not emerge until it incorporates a high level of artificial intelligence. Jaxenter’s Rick Delgado argued recently that under today’s deployment models, the IoT suffers from too much vertical segmentation – that is, organizations tend to hoard their pools of information rather than share them. Part of this is due to competitive pressure, but part of it is due to the complexity surrounding data and resource integration across disparate platforms. With AI, much of this work can be automated and the results will become progressively more meaningful as the technology learns what information is and is not to be shared.
Two other crucial elements in this vision are the database service and infrastructure layers. According to a recent survey by Orbis Research, the IoT represents the next big opportunity for a wide range of DB services, including identity management, device management, authentication and plain old accounting. Much of this can be supported by an IoT data registry and an IoT DB transaction services layer, which can be used by network providers, enterprises and government organizations to provide a uniform framework for the open exchange of information. Initially, these services are likely to emerge within multiple IoT silos, but through cooperation we could see the emergence of universal APIs, plus signaling, communications, access and other technologies.
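The "IoT data registry" idea above can be illustrated with a minimal sketch: a hypothetical in-memory registry that records which data sets a device publishes, who owns them, and whether the owner has marked them shareable, so that other organizations can discover shared data without touching endpoints they do not control. The class and field names are assumptions for illustration, not any vendor's API.

```python
from dataclasses import dataclass

# Hypothetical registry entry: which org owns a data set, which device
# produces it, where it is published, and the owner's sharing policy.
@dataclass
class DataSetEntry:
    owner: str       # organization that controls the device
    device_id: str   # device producing the data
    topic: str       # where the data is published
    shareable: bool  # owner's policy: open to other organizations?

class IoTDataRegistry:
    """Minimal sketch of a shared registry layer for IoT data sets."""

    def __init__(self):
        self._entries = {}

    def register(self, name, entry):
        self._entries[name] = entry

    def discover(self, requester):
        """Data sets visible to a requester: its own plus any marked shareable."""
        return [name for name, e in self._entries.items()
                if e.shareable or e.owner == requester]

registry = IoTDataRegistry()
registry.register("plant-temps",
                  DataSetEntry("acme", "temp-001", "/devices/temp-001/events", True))
registry.register("line-speeds",
                  DataSetEntry("acme", "spd-007", "/devices/spd-007/events", False))

print(registry.discover("globex"))  # → ['plant-temps']
```

A production version of this layer would sit behind the authentication, identity management, and transaction services the survey describes, but the core design choice is the same: sharing is opt-in per data set, so each enterprise keeps control of its own endpoints while still contributing to a universal pool.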
The dream of universal data interactivity has existed since the first mainframes were deployed in government and commercial settings in the 1960s. The internet is about as close as we’ve come to that goal.
But the IoT has a different underlying rationale than previous data constructs. Analytic results are only as good as the data fed into the engine. If the enterprise limits itself to its own data ecosystem, it should get a pretty good idea of how well it is performing and what opportunities lie within its own user base. But the wider universe of information will remain mostly hidden, and this is where the real opportunities, and the real threats, reside.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.