Re-envisioning Integration

Loraine Lawson

For the most part, integration has always been something outside the systems and applications we use on a day-to-day basis. Want to know how many widgets sold across the region today? You're going to need to use an ETL tool to run the numbers, and that's going to be a batch process done overnight. Want to really analyze a situation? Wait while IT builds a data mart and loads the data from across the enterprise.


Let's face it, traditionally, integration's slogan could've been "Separate and Slow."


Despite all the talk of "real-time" and "near-time" integration, I suspect there are plenty of organizations "living on Tulsa time" when it comes to data. (No offense to Tulsa, which is actually a beautiful and hip city, IMHO - after all, it has the nation's only inland seaport.)


But that's just not going to work in today's high-volume world of mobile Internet access, social networking, and unstructured and semi-structured data, contends Brian Babineau, vice president of research and analyst services for the Enterprise Strategy Group (ESG).


In a recent Information Management article, Babineau argues for re-envisioning integration as "perpetual IT projects as part of continuous improvement goals" rather than as a one-off effort. That, he adds, is the only way integration will be able to support the "creativity of an organization and its commitment to using information as an asset."


Frankly, these types of articles can be a bit abstract and meandering, but I think Babineau does a great job of making his argument and providing a realistic evaluation of what it means - which is why I'm recommending it to you now.


The article does an excellent job of explaining the scope of the challenge, which will sound familiar to most techies and all data heads: social networking, more mobile devices, etc. One issue he points out that I hadn't heard of or considered is the proliferation of "less-structured application modules," like freeform fields that let you add links, audio or text.


"In some instances, these fields are stored in a database, but many times they are external files linked to a central database," he writes. "This information provides important context to standard database records."


Babineau outlines seven characteristics for his re-envisioned integration platform:

  1. Support "evolving requirements," such as less-structured data sources and real-time access to data.
  2. Connect to data regardless of where it is stored - whether that's an on-premises database, an enterprise content management system, a SaaS application, a portal or a website.
  3. Integrate multiple sources in real time without copying the data to a separate system; he adds this can "usually be accomplished by indexing the supporting information sources." (There's a rough sketch of this idea after the list.)
  4. Perform data analysis before presenting it to end users. This may read like he's suggesting IT must handle the analysis, but I believe he means the integration layer can perform the analysis before it goes to BI or some other software.
  5. Support security through a role-based view of the data, which he says could be handled either by the integration platform itself or by an existing solution.
  6. "Optimize workflows" by, for example, sending end users updated information when something changes or a critical threshold is reached.
  7. Integrate with other integration-based investments, including ETL and BI.
