Natural Language Processing (NLP) has made great strides since the advent of Siri, to the point where it is now garnering serious attention as an enterprise productivity tool.
In addition to helping IT provision and manage data infrastructure, NLP is expected to democratize the data ecosystem throughout the knowledge workforce. Instead of forcing users to craft requests in language the machine understands, the interface will employ the language of the user, opening up advanced computer processes to non-technical staff.
The market for NLP is expected to top $22.3 billion by 2025, according to research house Tractica. Software alone is expected to jump from a piddling $136 million last year to more than $5.4 billion in 2025, which will in turn drive sales of hardware and professional services throughout the supply chain. Interestingly, this market growth is not driven by new algorithms or other advances to NLP itself. Rather, says analyst Mark Beccue, it is the result of increasingly scalable and affordable computational power and the digitization of virtually all information.
Business activity is already swirling around NLP as vendors large and small position themselves for what could be “the next big thing.” Data visualization firm Tableau recently paid an undisclosed sum for ClearGraph, a start-up that specializes in voice-activated database queries. Tableau is expected to integrate the platform into its own portfolio to simplify the visualization process, perhaps to the point where users will no longer need SQL or other database experience to parse data. Giants like Microsoft are offering similar services in their business intelligence suites and other products, but Tableau says it can provide a vendor- and cloud-neutral solution.
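At its core, this kind of voice- or text-driven querying is a translation problem: map a user’s plain-language question onto a structured query. A minimal sketch of the idea follows, using simple pattern matching rather than the trained language models a production system like ClearGraph’s would use; the table and column names are hypothetical.

```python
import re

# Toy illustration only: a real NL-to-SQL system uses statistical language
# models, not a single regex. The "sales" table and its columns are
# hypothetical placeholders.
PATTERN = re.compile(
    r"show (?:me )?(?P<column>\w+) for (?P<entity>\w+) in (?P<year>\d{4})",
    re.IGNORECASE,
)

def to_sql(question: str) -> str:
    """Translate one narrow class of English questions into a SQL query."""
    match = PATTERN.match(question.strip())
    if not match:
        raise ValueError(f"Unsupported question: {question!r}")
    # A production system would also use parameterized queries rather than
    # interpolating values directly into the SQL string.
    return (
        f"SELECT {match.group('column')} FROM sales "
        f"WHERE region = '{match.group('entity')}' "
        f"AND year = {match.group('year')};"
    )

print(to_sql("Show me revenue for Europe in 2016"))
# SELECT revenue FROM sales WHERE region = 'Europe' AND year = 2016;
```

The hard part, and the reason this remains an active research area, is handling the open-ended phrasing real users produce rather than one fixed sentence template.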
NLP is likely to be the driving force behind the upcoming “Insight Engine” that will replace enterprise search in a few short years, says Kamran Khan, CEO of Search Technologies. Gartner, in fact, has already swapped the two terms in its Magic Quadrant program, highlighting the Insight Engine’s more natural access to information for knowledge workers and other users. NLP is already in high demand given the volumes of unstructured and semi-structured content that most enterprises maintain, and given time, the artificial intelligence and machine learning components that drive language processing will evolve to produce highly customized digital agents capable of providing a wide range of services.
NLP will also influence product design and service delivery, according to a recent report from Strategy Analytics. The technology gives developers additional flexibility when crafting the Human-Machine Interface (HMI) for key platforms and devices, which in turn will provide a competitive advantage over products that rely on traditional one-size-fits-all interfaces. And coupled with facial recognition, gesture control and emotion detection, NLP stands a good chance of providing near human-like responsiveness.
In science fiction, machines that act and think like humans are either the key to a new nirvana or the end of mankind forever. In reality, the results will probably be a mixed bag. New technologies have a way of solving old problems but creating new ones of their own.
A talking data ecosystem will be a benefit to the enterprise, but don’t expect it to be an infallible, all-knowing font of wisdom. Like any digital system, it will only be as good as the data it receives, and it will be up to humans to use their intuition, which no algorithm can duplicate, to discern between fact and fiction.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.