One of the reasons so many Big Data projects are starting to proliferate across the enterprise is that Hadoop made it a lot more affordable to collect massive amounts of data. But collecting data and managing it are not one and the same.

This week, Hortonworks and Pitney Bowes announced an accord under which Hortonworks will help market Big Data management tools for Hadoop environments developed by Pitney Bowes.
Roger Pilc, executive vice president and chief innovation officer for Pitney Bowes, says the company has developed both data quality tools and a geospatial analysis application. In addition, Pitney Bowes makes available data sets it has created as part of its billion-dollar shipping and mailing business.
Pilc says that while most of the interest surrounding Big Data continues to center on the platform itself, the everyday IT reality is that attention needs to shift to the tools used to manage the Hadoop environment.
“It’s the tools that are the key to unlocking the value,” says Pilc.
Pitney Bowes makes its software available both as a cloud service and as software that customers can deploy themselves. Given the sensitive nature of the data involved, Pilc says, most Pitney Bowes customers have opted to run their Hadoop instances in their own data centers.
Going into 2017, it’s clear that Hadoop has crossed the proverbial chasm in terms of enterprise adoption. The next big IT challenge will be finding a way to make sure the organization doesn’t drown in all the data being collected.