
    NVIDIA Extends Scope of AI Software Ambitions

    This week, during the online Computex trade show, NVIDIA unveiled a platform designed to make it simpler for enterprise IT teams to build and then deploy AI models in a production environment.

    Developed in collaboration with NetApp, the NVIDIA Base Command Platform can be deployed in an on-premises environment or on a public cloud, with Google Cloud planning to make the platform available on its cloud later this year. Available under an early access program, the NVIDIA Base Command Platform is priced starting at $90,000 per month for a subscription that provides continuous access to updates made to the platform. The alliance with NetApp is an extension of an existing partnership that now includes data management tools for the NVIDIA Base Command Platform provided by NetApp.

    The NVIDIA Base Command Platform provides a single pane of glass through which organizations can manage the development and deployment of AI models on NVIDIA graphics processing units (GPUs), a process that requires the cooperation of data scientists, data engineers, application developers, and IT operations teams to succeed. Via that platform, NVIDIA is also making it simpler for those teams to employ a range of AI tools it now offers via the NVIDIA GPU Cloud (NGC), including application programming interfaces (APIs), Jupyter notebooks, and other tools.


    NVIDIA also expanded the number of servers it has certified to run NVIDIA AI Enterprise software, which now includes 50 platforms from vendors such as Advantech, Altos, ASRock Rack, ASUS, Dell Technologies, GIGABYTE, Hewlett Packard Enterprise, Lenovo, QCT and Supermicro. The company also revealed that ASUS, Dell Technologies, GIGABYTE, QCT and Supermicro are making NVIDIA BlueField-2 data processing units (DPUs) available as options to offload networking, security and storage tasks from NVIDIA GPUs.

    In general, NVIDIA is trying to advance adoption of AI by making it simpler for all the stakeholders involved to collaborate, says Manuvir Das, head of Enterprise Computing at NVIDIA. “The next era of democratization of AI is coming,” he says.

    Also read: Google Makes Case for Managing AI Models

    Uniting Ops Teams

    Achieving that goal requires bridging the divide that currently exists between data scientists and data engineers on one side and application developers and IT operations teams on the other. Data scientists typically require months to build an AI model that will be embedded within an application. Meanwhile, many application development teams now create applications using DevOps practices at a pace that makes it challenging to align the deployment of an AI model with an application that is ready to be deployed.

    Once an AI model is deployed, it also needs to be monitored to ensure that tasks are being automated in a way that doesn’t drift away from the original intent as more data is made accessible to the underlying machine learning algorithms.

    Also read: Rush to AI Exposes Need for More Robust DataOps Processes

    NVIDIA’s Unique Move

    The NVIDIA Base Command Platform is designed to provide a venue through which all the stakeholders involved can coordinate their respective tasks and responsibilities. There is no shortage of rival platforms that promise similar benefits. The difference is that the platform from NVIDIA comes from the provider of the GPUs most commonly employed to train AI models.

    In fact, if successful, it would mark the first time a provider of a family of processors has also delivered the platform on which applications for those processors are widely built.

    Not surprisingly, IT organizations and rival vendors alike will be keeping closer tabs on NVIDIA’s software ambitions from here on out.

    Read next: AIOps Trends & Benefits for 2021

    Mike Vizard
    Michael Vizard is a seasoned IT journalist, with nearly 30 years of experience writing and editing about enterprise IT issues. He is a contributor to publications including Programmableweb, IT Business Edge, CIOinsight and UBM Tech. He formerly was editorial director for Ziff-Davis Enterprise, where he launched the company’s custom content division, and has also served as editor in chief for CRN and InfoWorld. He also has held editorial positions at PC Week, Computerworld and Digital Review.
