    Intel Shapes Enterprise Future for Hadoop

    At the O’Reilly Strata conference today, Intel took another step toward reinventing the data center from the ground up with the launch of the Intel Data Platform, an extension of the Intel distribution of Hadoop that provides enhanced security, and with the release of an analytics toolkit that embeds graph analytics and predictive modeling environments within that platform.

    Intel has embraced Hadoop in particular and Big Data in general as part of a broader effort to increase the utilization of multi-core processors. According to Jason Fedder, general manager of channels, marketing and business operations for Intel’s Datacenter Software Division, the short-term goal is to make it easier for developers to create Big Data applications without having to invoke the skills of a data scientist.

    Longer term, however, the goal is nothing less than making in-memory streaming analytics a fundamental element of every application. The implications of that effort are profound for enterprise IT organizations. Instead of building entirely separate transaction processing and data warehousing environments, the data center of the future will be defined by applications that leverage analytics to inform and optimize transaction processing in real time.
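    To make that pattern concrete, the sketch below shows, in broad strokes, what analytics-informed transaction processing can look like: an in-memory score, refreshed by a separate analytics pipeline, is consulted inline before a transaction commits. The class name, method names and thresholds here are illustrative assumptions, not part of the Intel Data Platform.

    // Hypothetical sketch: a transaction path that consults an in-memory
    // analytics score before committing, rather than reconciling decisions
    // later in a separate data warehouse.
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class AnalyticsInformedCheckout {

        // Stand-in for an in-memory analytics store: risk scores per account,
        // continuously refreshed by a separate streaming analytics job.
        private final Map<String, Double> riskScores = new ConcurrentHashMap<>();

        // Called whenever the (hypothetical) analytics pipeline computes a new score.
        public void updateRiskScore(String accountId, double score) {
            riskScores.put(accountId, score);
        }

        // Transaction path: the analytics result gates the commit decision inline.
        public boolean processOrder(String accountId, double amount) {
            double risk = riskScores.getOrDefault(accountId, 0.0);
            if (risk > 0.9 && amount > 1000.0) {
                System.out.println("Order held for review: " + accountId);
                return false;
            }
            System.out.println("Order committed: " + accountId + " ($" + amount + ")");
            return true;
        }

        public static void main(String[] args) {
            AnalyticsInformedCheckout checkout = new AnalyticsInformedCheckout();
            checkout.updateRiskScore("acct-42", 0.95); // pushed from the analytics side
            checkout.processOrder("acct-42", 2500.0);  // held for review
            checkout.processOrder("acct-7", 120.0);    // committed
        }
    }

    In a production system the score would arrive from a streaming engine rather than a direct method call, but the shape of the decision is the same: analytics feeds the transaction path as it runs, not after the fact.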

    In effect, Fedder says Intel is in the process of reinventing the data center from the silicon up by exposing application programming interfaces to various processor components in a way that will create truly software-defined infrastructure. Once that’s in place, Fedder says, IT organizations will be able to compose application services on top of multi-core processors that will be optimized to provide different classes of capabilities.

    Key elements of that strategy include not only Hadoop, but also other open source software technologies such as the Lustre distributed file system for managing storage across a high-performance computing (HPC) environment.

    Clearly, there’s a lot of investment in Hadoop these days. But if Intel has its way, Hadoop will not just be an element of Big Data applications, but rather a foundational element of just about every enterprise application yet to be built.

    Mike Vizard
    Michael Vizard is a seasoned IT journalist, with nearly 30 years of experience writing and editing about enterprise IT issues. He is a contributor to publications including Programmableweb, IT Business Edge, CIOinsight and UBM Tech. He formerly was editorial director for Ziff-Davis Enterprise, where he launched the company’s custom content division, and has also served as editor in chief for CRN and InfoWorld. He also has held editorial positions at PC Week, Computerworld and Digital Review.
