When Gartner issued a report and related press release suggesting companies could save $500,000 a year by consolidating their data-integration tools - aka, data tool rationalization - and forming shared-services teams, it seemed like a useful, albeit simple, bit of advice.
I mean, of course you'd save on licensing and maintenance costs if you eliminate tools, right? Of course an integration best-practices team is a good idea. Why not run with it?
In fact, Gartner even issued a similar cost-cutting press release just a year before. So, who knew there'd be a backlash?
But, as I shared last week, there was a backlash. Seth Grimes cried foul on the projected cost savings in an Intelligent Enterprise column, and David Linthicum thought Gartner's advice was pretty tired. 'Nuff said, right?
Uh...not exactly. Since then, two more pieces have emerged debating the merits of Gartner's advice - the actual advice, not just the claimed cost savings. So, I think you'll want to read what they have to say, too.
Normally, these things devolve into a tit-for-tat where people are basically repeating the same points, but that hasn't happened yet with this discussion. In fact, one of my favorite IT commentators, Vincent McBurney, a Deloitte manager based in Australia, made some excellent new points about the report in a recent blog post, "ETL Consolidation is no Walk in the Park and no $500,000 Windfall."
I like McBurney because he's a straight talker with a quick wit. You can tell his advice is anchored in the real working-stiff world, rather than the unfathomably ginormous world of global conglomerates with all the IT money in the world.
True to form, McBurney sees the pros and cons of the Gartner report. His favorite recommendation is actually the shared-services team, which is basically a data-integration center of excellence, or, as McBurney puts it, "A group dedicated to fighting for truth, justice and better data integration." Once you've established a shared-services team, it can handle all the data integration you can toss its way ... but it can also be bureaucratic and difficult to keep staffed, since ETL (extract, transform and load) skills are in demand and a stint in a "center of excellence" looks great on a resume, he notes.
But the real beef is his analysis of rationalizing and consolidating data integration tools. He sees plenty of headaches with trying to follow these Gartner recommendations. He writes it's "good in theory, but tough in practice," arguing that ETL proliferation is just reality.
He lists four reasons "off the top of his head" why rationalizing to one tool won't work. My favorite:
Putting different data integration tools onto the same hardware is like putting alpha personalities into a share house. When a good ETL tool takes a run at very high data volumes it wants to use as much of the hardware, RAM and disk I/O as it can get its hands on and every other piece of software that is running on that hardware can go suck eggs. Put two ETL tools on the same hardware or put two projects on the same hardware and the odds are they have the same overnight batch processing window and fight to the death to get it done in time.
Those of you considering open source data-integration tools will be particularly interested in his assessment of how open source tools actually fare in the wilds of the enterprise.
While McBurney is skeptical about the realities of consolidating on a single ETL tool, Stephen Swoyer over at Enterprise Systems reaches a different conclusion.
Swoyer sets aside the question about cost savings to ask whether data-integration rationalization is a worthwhile endeavor in and of itself. His article cites a range of experts, including Grimes, and determines that it is worthwhile - if you have a data-integration architecture in place to guide the rationalization. The goal should be to make your data operations more effective, rather than cut costs, according to this piece.