The Next Evolution of Integration Tools

    When I first started to cover integration, it took me a long time to figure out the real difference between application and data integration. After all, they’re both really about the same thing: sharing data.

    Actually, I still have trouble explaining it. That’s why I was thrilled to learn the differences between the two appear to be blurring.

    Stewart Bond, a senior research analyst at Info-Tech Research Group, contends the lines between these two styles of integration are blurring.

    Now, I admit, on a scale of one to 10 on a list of what CIOs and business leaders need to know, this ranks pretty darn low. It’s a technologist’s issue, in many ways — except one: costs.

    You see, because there are two different sets of tools for accomplishing the same goal, companies end up paying twice for the privilege of moving data.

    As if that weren’t troubling enough, application integration has always been more complicated. Rather than just extracting, transforming and loading the data into a data warehouse, application integration requires a host of functions such as publish/subscribe, canonical message models, routing, brokering and orchestration, he points out.

    That complexity makes application integration more expensive, too, and frequently requires IT to hire consultants and maintain custom code.

    Bond does the best job I've seen of explaining the difference between application integration (AI) and data integration (DI):

    “The primary functional difference between AI and DI is the interface layer. AI interfaces at the API level whereas DI interfaces at the database level. The primary non-functional difference is the way that data volume is realized. An easy illustration is considering 1000 records of data being sent between applications. An AI scenario would represent those records as individual messages, sent 1000 times. In a DI scenario, those 1000 records would be sent in one message.”

    That’s why data integration tends to rely on ETL tools, which batch-process data and store it in a data warehouse. Conversely, that bulk approach does not work so well with ESBs, which serve application services and treat each instance of data as a message.
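    Bond's 1,000-record illustration can be sketched in a few lines of Python. Nothing here calls a real ESB or ETL product; the functions are invented stand-ins that simply count how many transfers each style requires:

```python
# Toy contrast between the two integration styles Bond describes.
# The "send" functions below are hypothetical; they only count transfers.

records = [{"id": i, "amount": i * 10} for i in range(1000)]

def send_via_esb(records):
    """Application integration style: one message per record."""
    transfers = 0
    for record in records:
        transfers += 1  # each record goes out as its own message
    return transfers

def send_via_etl(records):
    """Data integration style: all records travel in one bulk payload."""
    payload = {"rows": records}  # one batch, hence one transfer
    return 1

print(send_via_esb(records))  # 1000 transfers
print(send_via_etl(records))  # 1 transfer
```

    The non-functional cost is the point: a thousand message envelopes, routed and brokered individually, versus one bulk load.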

    Change Data Capture has introduced real-time messaging to the data integration world. Now Bond is seeing “classic application integration functionalities” in the data integration (DI) tools.
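    To make the Change Data Capture idea concrete, here is a minimal sketch that captures inserts, updates and deletes by comparing two snapshots of a table. Real CDC tools typically read the database's transaction log rather than diffing snapshots, so treat this purely as an illustration of the output such tools emit:

```python
# Snapshot-diff sketch of change data capture. Production CDC reads the
# transaction log; this toy version just compares before/after states.

def capture_changes(before, after):
    """Return (operation, key, value) events describing how `before`
    must change to become `after`."""
    changes = []
    for key, value in after.items():
        if key not in before:
            changes.append(("insert", key, value))
        elif before[key] != value:
            changes.append(("update", key, value))
    for key, value in before.items():
        if key not in after:
            changes.append(("delete", key, value))
    return changes

old = {1: "alice@old.example", 2: "bob@x.example"}
new = {1: "alice@new.example", 2: "bob@x.example", 3: "carol@y.example"}

for event in capture_changes(old, new):
    print(event)  # each change becomes a message, as in app integration
```

    Each captured change can be published as an individual message, which is exactly the "classic application integration functionality" showing up inside DI tools.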

    The cloud seems to play a role here, since he writes that data integration products need to call REST and SOAP APIs and application integration tools need to be able to interact directly with databases (think APIs calling databases from the cloud for, say, smartphones as one example).
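    When a DI tool consumes an API instead of reading a database directly, it has to reshape application-style nested JSON into the flat rows a warehouse expects. The field names below are invented for illustration:

```python
import json

# Hypothetical API payload: nested, message-shaped JSON of the kind an
# application integration tool would normally handle.
api_response = json.loads("""
{"orders": [
  {"id": 1, "customer": {"name": "Acme"},   "total": 99.5},
  {"id": 2, "customer": {"name": "Globex"}, "total": 12.0}
]}
""")

def to_rows(payload):
    """Flatten nested API objects into warehouse-friendly columns."""
    return [
        {
            "order_id": order["id"],
            "customer_name": order["customer"]["name"],
            "total": order["total"],
        }
        for order in payload["orders"]
    ]

rows = to_rows(api_response)
print(rows[0])  # flat row, ready for a bulk warehouse load
```

    That flattening step is the kind of transformation Bond suggests could be reused across both styles of tooling.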

    Bond foresees the data integration hub evolving into a data services bus that works much like an enterprise service bus, but for data.

    It sounds promising, though it does make you wonder whether it will skew integration costs toward data integration (cheaper) or application integration (not so cheap). I would hope it reduces costs, since he writes that data transformations could be reused across different tools.

    But then again, this might mean new investments, and it would definitely mean moving away from hand-coding point-to-point integration.

    Loraine Lawson
    Loraine Lawson is a freelance writer specializing in technology and business issues, including integration, health care IT, cloud and Big Data.