Looking Ahead to Fast Data Analytics

    Big Data is all the rage in the IT trade press and at leading technology confabs, but some experts are already turning their attention to the next goal for enterprise infrastructure: Fast Data.

    In part, this is a reflection of the times. Traditional enterprises dealing with large data volumes need a way to process and analyze that data to mine nuggets of actionable information. But as the world becomes more interconnected, IT will very quickly have to deal not with a few extremely large data sets, but with untold millions of small packets, each offering a tiny slice of information that, when correlated with related sets, forms an accurate picture of complex environments.

    Take the large, web-facing enterprises like Google and Facebook as examples. As they deploy increasingly sophisticated management stacks to keep tabs on their diverse infrastructure, numerous monitoring systems stream continuous updates on the health of this or that component. A traditional Big Data analytics platform would be of little use here, because such platforms are built to crunch fully compiled information. What’s needed is a new type of analytics geared to real-time, streaming information. Enter Fast Data.
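
    A minimal sketch of the difference: instead of compiling the monitoring feed into a batch store and querying it after the fact, a streaming layer answers questions over a rolling window as updates arrive. The toy Python example below (class name, window size and metric values are illustrative, not any vendor's API) keeps a sliding window of health samples and reports a live average:

```python
from collections import deque
from time import time

class StreamingMonitor:
    """Hold a rolling window of recent metric samples and answer
    queries over the live stream, rather than batch-processing a
    fully compiled data set after the fact."""

    def __init__(self, window_seconds=60):
        self.window = window_seconds
        self.samples = deque()  # (timestamp, value) pairs, oldest first
        self.total = 0.0

    def ingest(self, value, now=None):
        now = time() if now is None else now
        self.samples.append((now, value))
        self.total += value
        # Evict samples that have fallen out of the window.
        while self.samples and self.samples[0][0] < now - self.window:
            _, old = self.samples.popleft()
            self.total -= old

    def average(self):
        return self.total / len(self.samples) if self.samples else 0.0

# Feed simulated health updates: latency samples arriving one second apart.
monitor = StreamingMonitor(window_seconds=5)
for t, latency in enumerate([10, 12, 11, 50, 13, 12, 11, 10]):
    monitor.ingest(latency, now=float(t))
print(round(monitor.average(), 2))  # average over the last 5 seconds only
```

    The point of the sketch is the eviction step: old samples leave the window as new ones arrive, so a query is always answered against current data without ever materializing the full history.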

    Fast Data, after all, is what spurred Paul Maritz to give up his post at VMware in favor of a new EMC venture called Pivotal. The goal of the new company is to devise nothing less than the new operating system for the cloud, much the same as VMware supplanted Windows and other operating systems in the data center. The company is focusing heavily on Fast Data as a means to enable organizations across multiple industries, not just the enterprise, to keep tabs on highly complex, highly dynamic systems and processes.

    One way to do this is through a new generation of in-memory analytics. A company called Aerospike recently devised a new hybrid approach to real-time data processing that combines DRAM and Flash in such a way that it can respond to queries and other commands in mere milliseconds. The system scales up to hundreds of thousands of transactions per second and can accommodate terabytes of stored data and billions of objects. The company also recently hired in-memory specialist Marie-Ann Neimat, who’s helped develop systems for Oracle and TimesTen.
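
    Conceptually, a hybrid design like this pairs a small, fast in-memory tier with a larger flash-backed tier holding the full data set. The following toy Python sketch illustrates the tiering idea only; it is not Aerospike's actual design, and all names are hypothetical (a dict stands in for DRAM, a disk-backed `shelve` for flash):

```python
import os
import shelve
import tempfile

class HybridStore:
    """Toy two-tier key-value store: a bounded in-memory dict serves
    hot records ("DRAM"), while the full data set lives in a
    disk-backed shelf (standing in for the flash tier)."""

    def __init__(self, path, cache_size=2):
        self.cache_size = cache_size
        self.cache = {}                # "DRAM" tier: hot records only
        self.disk = shelve.open(path)  # "Flash" tier: everything

    def put(self, key, value):
        self.disk[key] = value
        self._cache(key, value)

    def get(self, key):
        if key in self.cache:          # in-memory hit: fastest path
            return self.cache[key]
        value = self.disk[key]         # disk read: slower path
        self._cache(key, value)        # promote back into memory
        return value

    def _cache(self, key, value):
        # Evict the oldest entry when the in-memory tier is full.
        if len(self.cache) >= self.cache_size and key not in self.cache:
            self.cache.pop(next(iter(self.cache)))
        self.cache[key] = value

    def close(self):
        self.disk.close()

path = os.path.join(tempfile.mkdtemp(), "objects")
store = HybridStore(path)
store.put("sensor-1", {"temp": 21.5})
store.put("sensor-2", {"temp": 19.0})
print(store.get("sensor-1"))  # served from the in-memory tier
store.close()
```

    A real system adds the parts that make the milliseconds claim possible, such as keeping only indexes in DRAM, writing data in large blocks suited to flash, and replicating across nodes for scale.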

    In-memory technology is also making its way to the service model. SAP, for example, recently introduced a cloud version of its HANA analytics platform, which provides analytics and transactional services aboard SAP’s own physical infrastructure for firms that are either unwilling or unable to optimize their own systems for Fast Data. The company is aligning the service along key vertical markets, enabling tailored offerings for financial applications, travel services and other functions.

    Some experts are already saying that Fast Data is likely to eclipse Big Data as a top enterprise priority in relatively short order. According to XtremeData’s Jay Desai, many companies are already struggling with the influx of small-packet data coming from mobile infrastructure and web-based transactions. As infrastructure devices start to adopt increasing intelligence and machine-to-machine (M2M) communications – what some are starting to call the Internet of Things – Fast Data technologies will play a crucial role in maintaining and optimizing critical data sets within the enterprise.

    This doesn’t mean that the enterprise is poised to move beyond Big Data, however. In fact, it is highly likely that both Big and Fast Data systems will work side-by-side to give the enterprise a clear view of an increasingly diverse but still largely opaque data environment.

    As I’ve pointed out in the past, cloud infrastructure lessens the need for massive storage and processing infrastructure because those resources will always be available somewhere. The crucial piece nowadays is networking, since productivity is largely a function of the ability to move data from place to place. And if that data can be captured, compiled and analyzed as it comes in, the enterprise will have a much easier time keeping up with a rapidly changing data universe.

    Arthur Cole
    With more than 20 years of experience in technology journalism, Arthur has written on the rise of everything from the first digital video editing platforms to virtualization, advanced cloud architectures and the Internet of Things. He is a regular contributor to IT Business Edge and Enterprise Networking Planet and provides blog posts and other web content to numerous company web sites in the high-tech and data communications industries.
