Judy Ko of Informatica has a money-saving tip for those of you under a cash crunch out there: Stop thinking project-to-project on data integration and standardize your data-integration approach across the organization.
Ko, the VP of product management and marketing at Informatica, is an expert on enterprise data integration and integration competency centers, according to her bio. Before Informatica, she was director of Financial Services Solutions Marketing at BEA Systems. Despite the marketing-heavy resume, her bachelor's degree is actually in engineering and management systems from Princeton University, and she holds an MBA from Harvard.
In a recent blog post, Ko said companies need to shift their thinking beyond the project at hand and start building an integration competency center, complete with a standardized approach to data integration. If that strikes you as overly bureaucratic, Ko cites this Gartner statistic: An integration competency center can save large enterprises an average of 30 percent in integration application and data interface development time and costs, plus an additional 20 percent savings in maintenance costs.
Ko even offers advice on how to shift away from the project mentality. Instead of looking at your immediate needs, she suggests you think in terms of the four realms of data integration and buy a tool that addresses all four:
- Enterprise data integration, which is the behind-the-firewall integration we all know and love.
- B2B data exchange, integration with partners outside the firewall.
- Cloud integration, a newbie.
- Data quality, which obviously touches the other three as a sub-discipline, but which she felt was important enough to stand as its own realm.
Of course, the danger in sourcing vendor blogs is that the discussion may be shaped to fit the company's product line. But her realms are general, and the piece is good fodder for establishing your own criteria for evaluating a data-integration solution.
In a similar vein, you might want to check out her related post, "Checklist for Data Integration Platform Capabilities," which offers a ton of possible evaluation criteria, including a look at the five data-integration steps any product should address:
- Access the data.
- Discover the data.
- Cleanse the data.
- Integrate the data.
- Deliver the data.
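Those five steps can be sketched as a toy pipeline. Everything below — the function names, the sample CSV, the lookup table — is an illustrative assumption of mine, not any vendor's API; it's just one way to see the steps as distinct stages rather than one monolithic script.

```python
# A minimal sketch of the five data-integration steps over a toy CSV source.
# All names and data here are hypothetical, for illustration only.
import csv
import io

RAW_SOURCE = """id,email,amount
1, Alice@Example.com ,100
2,bob@example.com,
3,carol@example.com,250
"""

def access(source: str) -> list:
    """Step 1: access the data -- read raw rows from a source system."""
    return list(csv.DictReader(io.StringIO(source)))

def discover(rows: list) -> dict:
    """Step 2: discover the data -- profile it to learn its shape and gaps."""
    return {
        "row_count": len(rows),
        "missing_amount": sum(1 for r in rows if not r["amount"].strip()),
    }

def cleanse(rows: list) -> list:
    """Step 3: cleanse the data -- normalize values and fill defaults."""
    return [
        {
            "id": int(r["id"]),
            "email": r["email"].strip().lower(),
            "amount": float(r["amount"]) if r["amount"].strip() else 0.0,
        }
        for r in rows
    ]

def integrate(rows: list, region_by_id: dict) -> list:
    """Step 4: integrate the data -- join in attributes from another system."""
    return [{**r, "region": region_by_id.get(r["id"], "unknown")} for r in rows]

def deliver(rows: list) -> str:
    """Step 5: deliver the data -- emit a consumable result for the target."""
    return "\n".join(
        f'{r["id"]},{r["email"]},{r["amount"]},{r["region"]}' for r in rows
    )

profile = discover(access(RAW_SOURCE))
result = deliver(integrate(cleanse(access(RAW_SOURCE)), {1: "us", 2: "eu"}))
print(profile)
print(result)
```

The point of the shape, not the code, is what matters: each step takes the previous step's output, so swapping the toy CSV for a real connector, or the dict join for a real lookup service, doesn't disturb the rest of the pipeline.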
And, of course, the recommendation to move from the project mindset to a more strategic mindset is always smart advice.
In other integration news:
Update on Batch Processing with SOA. In January, I pointed out a ZapThink piece on service-enabling batch processing. The article was long on the pros and strategy, but short on cons and how-to. For those of you who'd like to read more, I recently found a series of technical blog posts on the topic by Akiva Marks, a senior SOA architect and integration specialist based in Israel. In part one, he looks at whether ESBs can provide the environment needed for batch processing. In part two, he looks at potential problems and how the data access model gets in the way. He's promised a third article, but it hasn't posted yet.
Balfour Beatty Construction Integration Case Study. TMCnet.com published a sort of mini-case study on Balfour Beatty Construction's ERP integration. The company used Jitterbit, an open source solution. The piece focuses a bit too much on how wonderful Jitterbit is and is light on details about the actual implementation, but if you're considering an open source data-integration solution, it might be a good read.