In recent years, more IT organizations have shifted away from hand-coded integrations toward dedicated data integration tools.
In theory, that should eliminate some of the spaghetti code going forward. The problem, though, is that times have changed and the data integration tools of yesterday may no longer be enough, contends integration and data consultant David Linthicum.
In particular, Big Data and cloud computing are creating new challenges that may require organizations to re-think their current data integration strategy, Linthicum writes in a guest blog for Actian, a Big Data solution provider that offers an integration platform for the cloud.
Specifically, Linthicum points to Big Data, NoSQL, the cloud, and the growing complexity of data as reasons why CIOs need to re-evaluate how the IT division currently handles data integration.
“The bottom line is that our data environments are becoming more complex and distributed, and this trend will continue for at least the next 10 years,” Linthicum writes. “Enterprises will continue to see the need and importance of data integration solutions, and thus it continues to be a priority in most IT shops that think proactively.”
Assess your existing data integration solutions’ readiness for this new distributed, data-rich environment by asking the following:
“So, yes, it’s great that you can pull in, for example, social data about what your customers are saying about your products on Twitter and Facebook, but if that data is antisocial (i.e., not integrated) with the data about your customers and products in MDM and the data warehouse, then what use is it?” he asks.
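Linthicum's point about "antisocial" data can be pictured as a missing join: social mentions are only useful once they can be linked back to master customer records. The sketch below is purely illustrative and is not from Linthicum's blog; every handle, ID, and field name is made up.

```python
# Hypothetical illustration: social data becomes useful only when it is
# joined ("integrated") with master customer records. All handles, IDs,
# and fields below are invented for this sketch.

# Master data (e.g., from an MDM hub or the data warehouse)
customers = {
    "C001": {"name": "Acme Corp", "product": "WidgetPro"},
    "C002": {"name": "Globex", "product": "WidgetLite"},
}

# Mapping from social handle to customer ID -- maintaining this link
# is exactly the job of the data integration layer.
handle_to_customer = {"@acme": "C001", "@globex": "C002"}

# Raw social mentions, as pulled in by a (hypothetical) Twitter connector
mentions = [
    {"handle": "@acme", "text": "WidgetPro keeps crashing"},
    {"handle": "@unknown", "text": "nice weather"},
    {"handle": "@globex", "text": "love WidgetLite"},
]

def integrate(mentions, handle_to_customer, customers):
    """Attach each mention to a known customer; drop 'antisocial'
    rows that cannot be linked to master data."""
    joined = []
    for m in mentions:
        cust_id = handle_to_customer.get(m["handle"])
        if cust_id is None:
            continue  # no link to MDM -> unusable for analysis
        joined.append({**m,
                       "customer_id": cust_id,
                       "customer": customers[cust_id]["name"]})
    return joined

print(integrate(mentions, handle_to_customer, customers))
```

The unmatched "@unknown" mention is dropped: without the mapping back to master data, the social record answers Linthicum's question ("then what use is it?") with "none."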
Many data integration tools now ship with connectors for Hadoop and other Big Data platforms, but you still need to verify that your data integration solution can actually move data out of your Big Data/cloud store of choice.
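The "move data out of the store" step those connectors perform can be sketched in miniature with standard-library stand-ins: a CSV export playing the role of the Big Data store's output, and an in-memory SQLite database standing in for the warehouse target. Real connector APIs and store names vary by tool; everything here is an assumption for illustration.

```python
import csv
import io
import sqlite3

# Hypothetical sketch: a connector has already exported rows from a
# Big Data / cloud store (faked here as an in-memory CSV); the
# integration step loads them into a relational "warehouse"
# (an in-memory SQLite database standing in for the real target).

exported = io.StringIO(
    "customer_id,event,count\n"
    "C001,page_view,42\n"
    "C002,purchase,3\n"
)

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (customer_id TEXT, event TEXT, count INTEGER)"
)

# Parse the export and load it into the warehouse table
rows = [(r["customer_id"], r["event"], int(r["count"]))
        for r in csv.DictReader(exported)]
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
conn.commit()

# Once landed, the data is queryable alongside other warehouse tables
total = conn.execute("SELECT SUM(count) FROM events").fetchone()[0]
print(total)  # -> 45
```

The point of the sketch is the shape of the work, not the tools: whatever connector you evaluate, it must be able to extract from the store and land the data somewhere your existing warehouse queries can reach it.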