Intel Shapes Enterprise Future for Hadoop

Mike Vizard

At the O’Reilly Strata conference today, Intel took another step toward reinventing the data center from the ground up. The company launched the Intel Data Platform, an extension of the Intel distribution of Hadoop that provides enhanced security, and released an analytics toolkit that embeds graph analytics and predictive modeling environments within the Intel Data Platform.

Intel has embraced Hadoop in particular and Big Data in general as part of a broader effort to increase the utilization of multicore processors. According to Jason Fedder, general manager of channels, marketing and business operations for Intel’s Datacenter Software Division, the short-term goal is to make it easier for developers to create Big Data applications without having to invoke the skills of a data scientist.

Longer term, however, the goal is nothing less than making in-memory streaming analytics a fundamental element of every application. The implications of that effort are profound for enterprise IT organizations. Instead of building entirely separate transaction processing and data warehousing environments, the data center of the future will be defined by applications that leverage analytics to inform and optimize transaction processing in real time.

In effect, Fedder says Intel is in the process of reinventing the data center from the silicon up by exposing application programming interfaces to various processor components in a way that will create a truly software-defined infrastructure. Once that’s in place, Fedder says, IT organizations will be able to compose application services on top of multicore processors that are optimized to provide different classes of capabilities.

Key elements of that strategy include not only Hadoop, but also other open source software technologies such as the Lustre distributed file system for managing storage across a high-performance computing (HPC) environment.

Clearly, there’s a lot of investment in Hadoop these days. But if Intel has its way, Hadoop won’t just be an element of Big Data applications; it will be a foundational element of just about every enterprise application built from here on.


