The popular perception is that analyzing petabytes of data requires a large IT staff and at least one data scientist. In reality, however, analytics applications are scaling to the point where analysts can now work with that data without much help from anyone.
With the release of version 3.0 of its namesake predictive analytics application, Alpine Data Labs is making it possible for analysts to use a Web-based interface to directly access Hadoop data without requiring IT to move that data into a data warehouse.
According to Bruno Aziza, chief marketing officer for Alpine Data Labs, the simple fact of the matter is that data scientists don’t scale. Organizations need a predictive analytics application that allows them to work with massive amounts of Big Data without requiring a huge investment in people to set up and configure the environment.
Thanks to its drag-and-drop user interface, Aziza says, the Alpine Data software can be accessed from multiple devices by multiple analysts, which makes collaboration within a Hadoop environment a much simpler endeavor.
Ultimately, Aziza says, the goal of any organization should be to move the application to the data rather than setting up yet another silo of data to manage. As Hadoop becomes a more mainstream part of enterprise IT, organizations should be able to set up a single “data lake” that multiple applications can directly invoke without having to set up data marts or entire data warehouses. In fact, beyond configuring the Hadoop cluster itself, it’s even conceivable that IT may one day be out of the data warehouse business altogether. Predictive analytics applications that can work directly against Hadoop data are a first step in that direction.