It seems that the elephant in the enterprise heading into the new year is artificial intelligence (AI). Regardless of the specific initiatives currently envisioned – from virtualized, hyperconverged infrastructure to the Internet of Things – AI is expected to play the lead role in managing and optimizing increasingly complicated workloads.
But AI is also an extremely complex, and somewhat unpredictable, technology, so it isn’t entirely clear how it will help, or hurt, enterprise operations in the digital era. Of course, this doesn’t impede the healthy flow of year-end predictions, which this year have gravitated toward AI like ants to a picnic.
Forbes’ Gil Press offered a pretty good roundup of AI prognostications for 2019. Deloitte, for example, predicts that smart speakers led by Amazon Echo and Google Home will become the fastest-growing connected device category in history, topping 164 million units. Meanwhile, McKinsey claims that nearly half of all companies have already embedded at least one AI capability into their processes, with another 30 percent in the pilot stage. Even more telling is data from PwC showing that 20 percent of business executives plan to deploy AI throughout their businesses in 2019, with nearly three quarters planning to do so through the cloud.
Nagging AI Doubts
Still, some headwinds remain. McKinsey also reports that more than 40 percent of executives worry about the lack of a complete AI strategy and a distinct shortage of professional talent to develop and manage these tools. PwC highlights a still-stubborn trust deficit when it comes to AI, with many executives still on the hunt for a transparent, explainable and provable AI model, as well as a legal and ethical framework to guide its actions.
Nevertheless, it seems that AI will be upon us in short order, ready or not. Technavio reports that demand for AI-infused silicon is at a fever pitch, fueling a 39 percent growth rate through at least 2023. This means everything from consumer devices to advanced scientific platforms will employ AI in one form or another during the next refresh cycle, with data centers emerging as a particularly active growth area.
Somewhat ironically, however, the mere presence of AI in the data center is also fueling the need for AI in the data center. Technavio notes that many facilities are implementing neural networking (a form of AI) to handle increasingly large and sophisticated workloads. These technologies require an inordinate amount of power, cooling and other resources, which in turn can be delivered more efficiently under an AI-driven management system that is constantly on the lookout for over- or under-provisioning, data bottlenecks and other elements that drive up the cost of computing. Using specialized chips, organizations will be able to implement and operate AI simply through the normal course of hardware upgrades and infrastructure expansion.
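The over- and under-provisioning watchdog described above can be illustrated with a minimal sketch. The function, host names and utilization thresholds here are hypothetical placeholders for what a real AI-driven management system would learn from its own telemetry; this just shows the basic shape of the decision.

```python
from statistics import mean

def flag_provisioning(samples, low=0.20, high=0.85):
    """Flag hosts whose average CPU utilization suggests over- or
    under-provisioning. Thresholds are illustrative, not prescriptive;
    a production system would learn them per workload."""
    flags = {}
    for host, util in samples.items():
        avg = mean(util)
        if avg < low:
            flags[host] = "over-provisioned"   # paid-for capacity sitting idle
        elif avg > high:
            flags[host] = "under-provisioned"  # risk of saturation and bottlenecks
        else:
            flags[host] = "ok"
    return flags

# Hypothetical utilization readings (fraction of capacity) per host
readings = {
    "db-01":  [0.10, 0.12, 0.08, 0.15],   # mostly idle
    "web-03": [0.91, 0.88, 0.95, 0.90],   # running hot
    "app-02": [0.55, 0.60, 0.48, 0.52],   # healthy middle
}
print(flag_provisioning(readings))
```

A real management layer would replace the fixed thresholds with learned baselines and act on the flags automatically, but the loop — observe, compare, classify — is the same.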
One area where AI growth is expected to be particularly acute is the IoT edge. Much of this infrastructure is being built from scratch anyway, so there is no reason not to employ the latest intelligence to handle what is certain to be an extremely heavy and chaotic data load. As Datanami notes, companies like Dell are convinced that AI will be crucial on the edge, since this is where the data streams from all those devices first encounter the enterprise data ecosystem. Decisions over what to do with that data must be made instantly and at scale, without direct human oversight – a job tailor-made for AI.
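The instant, unattended triage described above might look something like the following sketch. The field names and rules are hypothetical stand-ins for a trained model; the point is simply that each reading gets a disposition at the edge, before it ever reaches the central data ecosystem.

```python
def triage(reading):
    """Decide at the edge what to do with a single sensor reading.
    These rules are placeholders for whatever a trained edge model
    would actually decide."""
    if reading.get("value") is None:
        return "drop"        # malformed or empty; discard locally
    if abs(reading["value"] - reading["expected"]) > reading["tolerance"]:
        return "forward"     # anomaly; send upstream immediately
    return "aggregate"       # routine; batch and summarize locally

# A hypothetical reading that deviates well beyond tolerance
print(triage({"value": 7.2, "expected": 5.0, "tolerance": 1.0}))
```

Only the anomalies travel upstream; everything routine is summarized at the edge, which is what keeps the "extremely heavy and chaotic data load" manageable.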
The Real AI
As more businesses become familiar with AI, however, we can expect the current anticipation of its capabilities to give way to a more pragmatic realism. Forrester’s Michele Goetz notes that the key difference between 2018 and 2019 when it comes to AI is that we will finally get a sense of what AI cannot do, which should put to rest the fanciful notions of massive job losses and computer systems running amok. Instead, we can expect AI to weave itself into existing processes and systems almost imperceptibly, with the primary task for humans being to understand why AI does what it does and to implement the proper mechanisms to control it before entrusting it with critical infrastructure.
Two key aspects of this process, Goetz says, are to identify the inevitable cases of “AI-washing” that are likely to arise, and to recognize that mistakes will be made, probably many times, during this transition. Just because a technology uses an algorithm does not mean it is intelligent, nor are all forms of AI appropriate for any given use case. To protect yourself, start by establishing a roadmap with both short- and long-term goals and don’t fall prey to over-planning or the notion that failure is the end of an experiment rather than an opportunity for learning.
2019 is also likely to be the year the enterprise, and perhaps the world at large, will finally come to realize the many ways in which AI will manifest itself in daily life. While the popular impression is that intelligence will work quietly behind the scenes doing who knows what for what purposes, the reality is that AI will allow humans to engage with computers, and vice versa, in novel ways. Advanced image recognition and natural language processing, for instance, will allow digital entities to recognize and converse with people, which in turn will usher in a new era of computing in which highly advanced functions can be carried out simply by asking for a certain result – no programming skills required.
At the same time, risk and uncertainty in all manner of operations will be driven to a minimum once predictive analytics and forecasting become commonplace. And perhaps best of all, the idea that software cannot and will not be improved until someone updates it with new code will fade into history. Software will improve itself, constantly learning from its user and its environment, and then tailoring itself to anticipate ever-changing expectations.
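The kind of everyday forecasting described above doesn't have to be exotic. A minimal sketch, using classic single exponential smoothing on a hypothetical demand series (the numbers and variable names are illustrative, not drawn from any real deployment), shows the basic mechanism:

```python
def ses_forecast(history, alpha=0.5):
    """Single exponential smoothing: each step blends the newest
    observation with the running level, and the final level serves
    as the one-step-ahead forecast. alpha controls how fast the
    model reacts to recent data."""
    level = history[0]
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

demand = [100, 104, 101, 110, 108]  # hypothetical weekly demand
print(round(ses_forecast(demand), 1))  # → 106.9
```

Even this tiny model "learns from its environment" in the limited sense the paragraph describes: every new observation updates the forecast without anyone shipping new code.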
These developments are not likely to fully emerge in 2019, of course, but over decades. For most enterprises, 2019 will be another year of experimentation, albeit on more functional levels in non-critical production environments. And perhaps, we will start to see AI take on the roles of both teacher and student in its own development.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.