The data volumes expected from the Internet of Things (IoT) are certain to be large – too large, in fact, for even an army of trained analysts to turn into useful information in a reasonable amount of time.
This is why every solution aimed at the IoT relies heavily on automation, simply to manage the flow of information between devices and to centralized storage and analytics systems. But even this is not likely to be enough. To fully leverage the IoT, it’s becoming obvious that the enterprise will have to utilize new forms of artificial intelligence and machine learning to basically allow the environment to make its own use of available data and tell human operators what needs to be done.
Already, this is emerging on leading IoT platforms. Software developer C3 recently updated its IoT platform by pushing artificial intelligence to the edge where it can function on an application level for an improved user experience. The system uses Amazon Web Services for infrastructure and device management and provides AI and machine learning tools to supplement both out-of-the-box and homegrown applications. As well, the platform enables deep learning technology on the analytical side to improve its predictive capabilities, plus tools like object and facial recognition, natural language processing, and text analysis capable of interpreting even hand-written notes.
Combining AI and the IoT is the best way to unleash the enormous potential that both technologies bring to the enterprise, says jaxenter.com’s Rick Delgado. Without a high degree of system autonomy, the IoT is merely a massive data collection service, with no way to interpret or leverage that data for any meaningful purpose. Although the IoT brings many new technologies into play, it still suffers from the same vertical segmentation of any other silo-based infrastructure. Implemented properly, AI can break down the silos of pooled information to further the basic rationale behind the IoT: to find the connections between data sets that would otherwise remain hidden.
A key application for AI in the IoT is the data catalog, according to Informatica’s Amit Walia. Speaking to Silicon Angle at the Big Data SV conference in San Jose recently, Walia noted that it’s not just the amount of data that poses a problem for analysts, but the diversity of data types, streams and sources. With AI providing the metadata that gives meaning to all of this raw, unstructured data, human analysts can provide the higher-value service of actually analyzing information, rather than the rote processing and categorization. It’s the difference between forming IoT infrastructure around a data lake or a data swamp.
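The data-catalog idea above can be made concrete with a small sketch. This is an illustrative, rule-based stand-in for the kind of ML-driven metadata inference Walia describes, not Informatica's actual implementation: given raw records, it guesses a type label for each field so that downstream analysts work with annotated data rather than an undifferentiated pool. The field names and patterns are hypothetical.

```python
import re

def infer_field_type(values):
    """Guess a simple type label for a field from its sample string values.
    (A real catalog would use learned classifiers; these regexes are a toy proxy.)"""
    if all(re.fullmatch(r"-?\d+(\.\d+)?", v) for v in values):
        return "numeric"
    if all(re.fullmatch(r"\d{4}-\d{2}-\d{2}.*", v) for v in values):
        return "timestamp"
    if all(re.fullmatch(r"[\w.+-]+@[\w-]+\.[\w.]+", v) for v in values):
        return "email"
    return "text"

def build_catalog(records):
    """Build a tiny metadata catalog mapping each field name to an inferred type."""
    fields = records[0].keys()
    return {f: infer_field_type([r[f] for r in records]) for f in fields}

# Hypothetical sensor readings arriving as untyped strings:
readings = [
    {"ts": "2024-01-01T00:00:00", "temp": "21.5", "note": "door open"},
    {"ts": "2024-01-01T00:05:00", "temp": "21.7", "note": "nominal"},
]
print(build_catalog(readings))  # {'ts': 'timestamp', 'temp': 'numeric', 'note': 'text'}
```

Even this crude tagging is the difference the article draws between a data lake and a data swamp: the catalog tells an analyst what each stream contains before any analysis begins.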
But AI has the potential to increase the IoT’s value even further by extending its reach beyond highly trained data analysts and scientists to the average business user, or even the consumer. The best way to do that is by simplifying the user interface, which AI is uniquely suited for through its ability to recognize and emulate human speech. As TechRepublic’s Alison DeNisco points out, IBM recently brought the word error rate of its speech recognition system down to 5.5 percent, which is close to human parity. This will have enormous implications for the enterprise because it opens the possibility of speaking to digital systems as easily as we speak to co-workers. So instead of clicking, typing and texting our way through menus to get what we want, we can simply ask for it and the artificially intelligent system will be able to figure out on its own how to provide it. IBM says it is not “popping the champagne yet” over its speech recognition but is instead focusing on error rates of 5.1 percent or lower and tuning the system toward more complex contextualization.
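The word error rate (WER) figures quoted above have a standard definition: the minimum number of word substitutions, insertions, and deletions needed to turn the system's transcript into the reference transcript, divided by the reference length. A minimal sketch (the example sentences are made up, not from IBM's benchmark):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference word count,
    computed as a word-level Levenshtein distance via dynamic programming."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between first i reference words and first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution and one dropped word out of seven reference words:
wer = word_error_rate("turn the lights off in the kitchen",
                      "turn the light off in kitchen")
print(f"{wer:.3f}")  # 0.286, i.e. roughly 28.6 percent
```

A 5.5 percent rate therefore means roughly one word in eighteen is wrong, which is why the gap down to 5.1 percent and better contextual handling still matters for conversational interfaces.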
At the moment, the enterprise is focused on putting IoT infrastructure in place and getting it to function in a reasonably cohesive fashion. But it won’t be long before data volumes start to mount and the pressure will be on to find the value that justifies the expense of all this massive collection and analytic infrastructure.
It’s a safe bet that the first enterprises to cross this line will have done it through artificial intelligence.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.