Natural language processing (NLP) has so far been used sparingly in business intelligence (BI) applications, but that may soon change as speech recognition interfaces infused with artificial intelligence (AI) capabilities become more advanced.
A recent report published by the market research firm Dresner Advisory Services notes that out of 41 business intelligence capabilities ranked in a survey of end users, natural language analytics (NLA) came in 32nd in relative importance.
However, 70% of respondents said NLA capabilities are important; it is simply that BI fundamentals such as self-service dashboards still take precedence over comparatively emerging technologies such as NLA. Only 13% of respondents said NLA is not important.
In fact, while the survey finds only 27% of respondents currently make use of NLA capabilities in their BI applications, another 41% plan to make use of NLA. Just under a third (31%) said they have no plans.
Providers of BI applications are betting that speech interfaces will increase usage of NLP engines that many have already embedded within their applications. In some instances, that means embedding support for speech recognition interfaces directly within BI applications.
In other cases, it means making use of application programming interfaces (APIs) to enable end users to use, for example, Apple Siri to query BI data. The Dresner survey finds a third of respondents (33%) are leaning toward third-party technologies, while 25% said they would prefer NLA capabilities that are embedded into their BI application.
Speech Interfaces vs. Data Analysts
Business executives who don’t typically launch complex queries or don’t know how to use SQL are naturally more likely to employ a speech interface than an analyst who works with more complicated data sets. Most data analysts will continue to launch SQL queries via a graphical tool to interactively interrogate large amounts of data residing both inside and outside a BI application.
NLP engines coupled with speech interfaces will make it easier for other types of end users to query the same data without requiring any intervention from a data analyst. That means many of the requests analysts receive today from end users asking them to configure a dashboard around a specific set of key performance indicators (KPIs) are likely to be sharply reduced.
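The core job of such an NLP engine is translating an English question into a query the data platform can execute. Real engines rely on parsing and a semantic model of the schema; the following deliberately simplified Python sketch (the `sales` table, question pattern, and `question_to_sql` helper are all hypothetical, not any vendor's API) only illustrates that translation step against an in-memory SQLite table.

```python
import re
import sqlite3

# Toy sales table standing in for a BI data source (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("west", 250.0), ("east", 50.0)],
)

def question_to_sql(question: str) -> str:
    """Map a narrow class of English questions to SQL via pattern matching.

    A production NLP engine would handle far more phrasings and bind
    recognized entities to a schema model; this rule-based stand-in
    only demonstrates the natural-language-to-query translation.
    """
    m = re.match(r"what is total revenue in (\w+)\??$", question.lower())
    if not m:
        raise ValueError("question not understood")
    return f"SELECT SUM(revenue) FROM sales WHERE region = '{m.group(1)}'"

sql = question_to_sql("What is total revenue in east?")
total = conn.execute(sql).fetchone()[0]
print(total)  # 150.0
```

In practice the same translation would sit behind a speech-to-text front end, so the spoken question arrives as exactly this kind of string.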
As more data heads into the cloud, end users will soon use speech interfaces to launch queries directly against massive data lakes. Amazon Web Services (AWS), for example, has unfurled an Amazon QuickSight BI service based on a serverless computing framework infused with machine learning algorithms that can be queried via a speech interface dubbed Amazon QuickSight Q.
BI applications are already being integrated with data lakes running on public clouds, and those cloud data warehouses are becoming just another source of data for a BI application to query using a speech recognition engine. Those AI-infused BI applications will also recommend queries that end users might want to make, based on what machine learning algorithms have learned about the data the application can access. In addition, BI applications might be configured to automatically run queries anytime a specific event occurs.
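That event-triggered pattern can be sketched with a minimal observer mechanism. Everything here is illustrative, not a real BI product API: the `EventBus` class and the "quarter_closed" event name are assumptions made for the example.

```python
from typing import Callable, Dict, List

class EventBus:
    """Minimal publish/subscribe hub: handlers registered per event name."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[], None]]] = {}

    def on(self, event: str, handler: Callable[[], None]) -> None:
        # Register a handler to run whenever `event` fires.
        self._handlers.setdefault(event, []).append(handler)

    def fire(self, event: str) -> None:
        # Invoke every handler registered for `event`.
        for handler in self._handlers.get(event, []):
            handler()

executed_queries: List[str] = []
bus = EventBus()

# Configure a query to run automatically when the (hypothetical)
# "quarter_closed" event occurs; here we just record the SQL it would run.
bus.on("quarter_closed",
       lambda: executed_queries.append("SELECT SUM(revenue) FROM sales"))

bus.fire("quarter_closed")
print(executed_queries)  # ['SELECT SUM(revenue) FROM sales']
```

A real deployment would wire the event source to warehouse triggers or a streaming platform, but the configuration idea is the same: bind a saved query to an event name once, and the BI application runs it on every occurrence.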
NLP Adoption Alongside Machine Learning
As AI capabilities advance, the average end user will become a lot more informed about the relationships between different data sets, notes Tableau CTO Andrew Beers. “A core benefit will be increased data literacy,” he said.
In general, NLP is emerging as one of the primary use cases for AI. A recent survey published by Modzy, a provider of a platform for building AI models, notes that in terms of adoption of AI technologies, NLP is tied for second at 81% with computer vision. The top slot, not surprisingly, is held by machine learning. Going forward, however, respondents ranked computer vision, NLP, and deep learning/neural networks as likely to have the biggest impact in the years ahead.
Adoption of NLP, of course, also depends on the depth of understanding an NLP engine has of the lexicon employed within a particular vertical industry. A recent survey published by John Snow Labs, a provider of an AI platform for the healthcare sector, suggests a lot of NLP progress is being made in a field well known for its arcane terminology. The survey finds more than a third of respondents (36%) either already have access to NLP or expect to by the end of this year. A third of the survey respondents (33%) also noted they employ BI applications or expect to by the end of the year as well. That level of adoption suggests that NLP engines are finally moving beyond only understanding basic terminology.
NLP and other AI advances are not likely to replace the need for a human to analyze data any time soon, but the ability of any end user to generate insights from BI applications is about to be dramatically enhanced. It will be up to each end user to determine when they prefer to employ speech recognition versus continuing to rely on a graphical user interface to launch queries. Chances are, however, that most end users will use a mix of both for the foreseeable future, depending on the specific task. Regardless of the interface employed, the insights being surfaced by AI technologies will be a lot more actionable than ever before.