More to Big Data Infrastructure than Just Size

    The enterprise is rightly concerned about the impact that Big Data and the Internet of Things will have on infrastructure, but capacity is only the first of many challenges on the path to basic functionality, let alone full-blown optimization of large, sensor-driven data sets.

    Not only does the enterprise need to build or lease new server, storage and networking capacity for large workloads, it will also have to deal with data handling, integration, and the monumental task of making sure constantly updated and manipulated data is made available across the application and user bases.

    According to Gartner, 75 percent of enterprises are either utilizing or laying the groundwork for advanced analytics. While the rate of growth has leveled off in the past few months, it is clear that Big Data and analytics are moving from the test bed to the enterprise mainstream in record time. Drivers for all this investment range from improving and streamlining existing processes to ramping up sales and marketing techniques and even development of new lines of business. One interesting side note is that interest is so high even though nearly half of IT executives who are working toward data analytics are not sure if the ROI will be positive or negative.

    True to form, the software industry is hitting the market with a plethora of solutions that aim to solve many of the pain points that arise in Big Data infrastructure. A Hitachi subsidiary, Pentaho, is out with a new version of its same-named integration and analytics platform featuring an all-new server that automates the entire data pipeline. The system streamlines critical functions like data blending and integration, data lineage tracking and storage, discovery and collaboration for functions like in-line modeling, and SNMP-based monitoring and visibility.

    Meanwhile, a company called Birst is building global governance capabilities into its Business Intelligence and analytics portfolio, allowing enterprises to leverage powerful, disparate resource sets for high-volume workloads. The “Networked BI” approach essentially virtualizes the BI ecosystem so it can be distributed across decentralized infrastructure without having to physically replicate data and metadata across multiple clusters. The system relies on Birst’s own multitenant cloud architecture that supports a common analytical fabric for high-speed, enterprise-class scalability, as well as self-service data preparation and transparent governance.
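    The core idea behind this kind of virtualization can be sketched in a few lines. The sketch below is purely illustrative (the class names and structure are invented for this article, not Birst's actual API): each tenant holds a lightweight view that points at one shared physical copy of the data and metadata, rather than replicating it per cluster, so updates to the shared copy are visible everywhere at query time.

    ```python
    # Illustrative sketch only -- hypothetical names, not Birst's API.

    class SharedDataset:
        """Central store: one physical copy of data and metadata."""
        def __init__(self, rows, metadata):
            self.rows = rows
            self.metadata = metadata

    class VirtualDataset:
        """Tenant-side view: no replica, just a reference plus a local filter."""
        def __init__(self, source, predicate=None):
            self.source = source                      # pointer, not a copy
            self.predicate = predicate or (lambda row: True)

        def query(self):
            # Evaluated against the single shared copy at read time, so
            # changes propagate without re-replicating data or metadata.
            return [r for r in self.source.rows if self.predicate(r)]

    # Two decentralized "regions" share one physical dataset:
    central = SharedDataset(
        rows=[{"region": "EMEA", "sales": 120}, {"region": "APAC", "sales": 95}],
        metadata={"owner": "finance"},
    )
    emea_view = VirtualDataset(central, lambda r: r["region"] == "EMEA")
    apac_view = VirtualDataset(central, lambda r: r["region"] == "APAC")

    print(emea_view.query())  # [{'region': 'EMEA', 'sales': 120}]
    ```

    The design choice to store a reference instead of a replica is what eliminates the synchronization burden: there is nothing to keep consistent across clusters because there is only one copy.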

    At the same time, stalwarts like HP are tailoring their management platforms for Big Data, and breaking down many of the silo-based architectures that they helped create in the process. As part of its HP Haven system, the company will update its HP Propel and HP Service Anywhere solutions to better accommodate the complexity of hybrid clouds and application catalogs. Part of the plan is to leverage data analytics to help the enterprise overcome the isolation that arises in Big Data warehousing solutions, allowing data to be pushed directly to the applications that can make the best use of it. The Haven platform itself is designed to harness various elements in the Big Data arsenal, such as Hadoop, the Autonomy IDOL system, the Vertica analytics platform and others, to create a cohesive approach to the gathering, analysis and utilization of large data sets.

    Effectively leveraging Big Data – that is, turning bits and bytes into actual knowledge, and then getting that knowledge to those who can use it – will require careful coordination between hardware, software and human resources. Simply focusing on the “big” aspects of Big Data misses the point entirely. In order to fully leverage the capabilities at hand, the enterprise will need to ensure that supporting platforms are big, fast, intelligent and, most of all, flexible enough for the highly dynamic data environments to come.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
