AI Is Here, But How to Deploy It?

    Enterprise executives are rightly intrigued by the operational advantages that artificial intelligence (AI) brings to the modern data environment, but questions remain as to how this technology will be deployed into legacy infrastructure and the ways it can best be applied to existing and future workflows.

    AI will most certainly make its way into the data center via the normal systems refresh cycle. It is increasingly difficult to find a platform or application suite that does not incorporate some form of intelligence, although the degree to which this can be attributed to simple “AI-washing” is unclear. What does seem likely is that everyday IT management functions will fall under the purview of an increasingly intelligent automation stack within the next few years.

    But this still leaves human operators with the difficult task of determining which tasks to surrender to software and which to keep under their own control. Without careful coordination, the enterprise could very well wind up with multiple intelligent platforms all tripping over themselves trying to harness the resources needed to execute their programming.

    According to Forbes’ Janakiram MSV, implementing AI at the application layer is a crucial element in this transition, which means it will most likely emerge in database operations first. The least disruptive way to do this is to start consuming AI-driven APIs, which give apps key capabilities like natural language processing, image pattern recognition and speech-to-text/text-to-speech. Following that, tools like Machine Learning as a Service (MLaaS) provide the means to expose data as an API endpoint, which in turn supports model training and testing and helps workers navigate the new operational paradigm.
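    The pattern for consuming an AI-driven API is broadly the same across providers: POST some content to an HTTP endpoint and get structured predictions back, with no model running inside the app itself. A minimal sketch in Python, assuming a hypothetical sentiment-analysis service (the URL, field names and response shape here are illustrative, not any specific vendor's interface):

```python
import json
import urllib.request

# Hypothetical endpoint -- real providers publish their own URLs and schemas.
API_URL = "https://api.example.com/v1/sentiment"

def build_request(text: str, api_key: str) -> urllib.request.Request:
    """Package a document for the (hypothetical) sentiment-analysis API."""
    payload = json.dumps({"document": text}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

def parse_response(body: bytes) -> str:
    """Extract the predicted label from the service's JSON response."""
    result = json.loads(body)
    return result["sentiment"]["label"]

# The app only handles JSON over HTTP; the intelligence stays behind the API:
# label = parse_response(urllib.request.urlopen(build_request(text, key)).read())
```

    This is what makes the approach the least disruptive: the application gains the capability without any change to the underlying infrastructure.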

    The third step is to deploy AI infrastructure in-house, most likely under an open-source, private cloud architecture. This will likely be heavy on solid-state storage and GPUs that can handle scale-out parallel processing functions.
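    The scale-out parallel processing those GPU clusters perform follows a simple shape: split one large job into independent shards, fan the shards out to many workers, then merge the partial results. A minimal sketch of that pattern in plain Python, using a thread pool as a simple stand-in for GPU workers (the sum-of-squares workload is purely illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum_of_squares(chunk):
    """One worker's share of the job -- stands in for a GPU kernel."""
    return sum(x * x for x in chunk)

def scale_out_sum_of_squares(data, workers=4):
    """Shard the input, fan the shards out to parallel workers, merge results."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum_of_squares, chunks))
```

    At data-center scale the same shard-and-merge logic runs across GPU cards or cluster nodes rather than threads, which is why the hardware profile skews toward parallel accelerators and fast solid-state storage to keep them fed.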

    Already, a number of vendor platforms have hit the channel promising to streamline the AI deployment process. Datameer’s SmartAI solution is said to democratize data science throughout the enterprise by providing the operational foundation for deep learning and other capabilities. The system leverages Google’s TensorFlow library to build intelligent data models that can be incorporated into analytics processes via a single mouse click. In this way, data scientists can move new models from the lab to production environments at a rapid pace while maintaining governance, security and management standards across the data ecosystem.
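    Stripped of the mouse click, promoting a model from lab to production amounts to wrapping the trained artifact behind a stable interface the analytics pipeline can call, with governance hooks such as audit logging around it. A minimal sketch of that pattern in plain Python (the class, field names and churn example are illustrative, not Datameer's actual API):

```python
import logging
from typing import Callable, Iterable

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("model-registry")

class PromotedModel:
    """A trained model wrapped for production use, with an audit trail."""

    def __init__(self, name: str, version: str, predict_fn: Callable):
        self.name = name
        self.version = version
        self._predict_fn = predict_fn

    def predict(self, rows: Iterable) -> list:
        """Score a batch of rows, logging each call for governance/auditing."""
        rows = list(rows)
        log.info("model=%s version=%s scored %d rows",
                 self.name, self.version, len(rows))
        return [self._predict_fn(r) for r in rows]

# The lab hands over predict_fn (e.g. a trained TensorFlow model's inference
# call); the analytics pipeline only ever sees the versioned .predict() method.
churn_v2 = PromotedModel("churn", "2.0", lambda row: row["visits"] < 3)
```

    Versioning the wrapper rather than the pipeline is what lets new models move to production quickly without breaking downstream analytics.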

    Meanwhile, a company called Alteryx is out with a new AI deployment, management and integration platform that the company says allows for widespread use of intricate analytics products, even by non-technical users on the business side of the enterprise. The Alteryx Promote solution is based on technology acquired from data management firm Yhat. Its main function is to combine model deployment with various REST APIs, providing for high degrees of customization and broad scalability. The platform also supports multiple analytics languages, including R, Python, PySpark and TensorFlow.
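    Combining model deployment with a REST API generally means putting a JSON-in/JSON-out handler in front of the model so any client, technical or not, can score data over HTTP. A minimal sketch of that request/response layer using only the standard library (the endpoint schema and the stand-in model are assumptions for illustration, not Alteryx Promote's actual interface):

```python
import json

def make_predict_handler(model):
    """Wrap any object with a .predict(list) method as a JSON request handler."""
    def handle(request_body: bytes) -> bytes:
        payload = json.loads(request_body)            # {"instances": [...]}
        predictions = model.predict(payload["instances"])
        return json.dumps({"predictions": predictions}).encode("utf-8")
    return handle

# A stand-in model; in practice this slot holds an R, Python, PySpark or
# TensorFlow model, which is how one API front can serve multiple languages.
class DoubleModel:
    def predict(self, instances):
        return [2 * x for x in instances]

handler = make_predict_handler(DoubleModel())
# Plug `handler` into any HTTP server's POST route (http.server, Flask, etc.).
```

    Because the handler only depends on a generic predict() contract, swapping the model or its language does not change the API the business users consume.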

    Beyond mere deployment and integration, AI will also have to work its way into the culture of the enterprise, which is dramatically more challenging. Tiger Tyagarajan, CEO of professional services firm Genpact, says the enterprise needs to employ three crucial tactics in order to successfully launch an AI-driven work environment. First, employees must be prepared for the change. Simply unleashing bots into workflows without prepping users on how they function and what their own roles will be is a recipe for failure. Second, be ready for some resistance, and not just from low-level workers whose tasks are being automated. Top executives often have strong vested interests in keeping things as they are, as well. Finally, recognize that AI will produce changes to the core business model, not simply provide a more efficient means of achieving today’s outcomes.

    It should be noted, however, that a more intelligent enterprise is not necessarily a more successful one. Even the most advanced AI platform can only produce results that are as good as the data it is given and the parameters by which that data is queried.

    In the end, it takes smart people to make the most of smart technology, no matter how thoroughly it has infiltrated the IT environment.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
