Hand-coding data integration hurts families. That’s what I learned yesterday, when my rather disgruntled spouse came home late from a day of data entry.
My toddler skipped his nap, then inflicted his outrage on the household. The grass was about four inches high in the backyard and it was finally dry enough to mow. I was sick with asthma, while the older child was making increasingly whiny demands about dinner. It was not a good day for anyone.
But the problem was payroll, so it couldn’t wait — even if it meant devoting half the development team to entering time sheet data by hand.
PLEASE keep in mind, my husband neither built nor owns this process. He only provides backup support. And the people responsible (well, most of them) have already retired, or I wouldn’t be sharing this.
As near as we can figure out, here’s what happened: About three years ago, someone hand-coded an integration flow — or maybe just a data migration — between the time sheet system and SAP’s CATS. That person then turned that process over to someone in finance, who retired.
And that’s how a payroll clerk came to be managing a business-critical task that involved pages of instructions every week.
The time sheet system seems to send data in micro-batches to a text file that’s then imported into SAP. That file is then overwritten. That sounds fine, but it’s not automated, and as far as the current staff can tell, there’s no backup or historical data retained.
The data did not make it to SAP, and by the time she reported the problem to IT — two days later — the data was long gone. Need I even say I had serious questions about how, exactly, a payroll clerk could delete days’ worth of the raw, original time sheet data?
Now. This setup screamed “fail” for many reasons: the basic lack of a backup system, a time clock system that was three versions behind, the obviously hand-coded integration and, oh yes, a business user with little technical expertise following multiple pages of written instructions to run the process.
Let me be clear about one thing: I am no data integration expert, even though I’ve been writing about it for years. This is complex stuff.
So if I know it’s wrong, then geez, you’d think someone on site would’ve figured it out long ago.
I was still steaming about the incident — won’t somebody please think of the children! — when I stumbled onto a conversation about why companies need to change how they approach data integration, and provide more self-serve integration to business users.
Joe McKendrick, who writes for ZDNet, participated in an Informatica-sponsored webcast about the next generation of data integration. Dr. Claudia Imhoff and John Schmitt, Informatica’s VP of global integration services, were also part of the event.
Parson asked what has changed to make data integration suddenly more urgent than it was in previous decades.
In short, business has changed and data integration must as well, McKendrick writes.
“Today’s organizations are ‘loosely coupled’ businesses, structured as sets of services either created and maintained in-house or brokered through outside service providers,” he writes. “With this fluid state of existence, applications and data need to be constantly re-assigned, re-allocated, or repurposed to ever-changing processes and workflows.”
To which I say, “Preach it, Brother Joe.”
But then I looked at the event slides, which are quite good, and they list the 12 building blocks for architecture-based data integration.
Architecture-based DI is a big term for what I’m affectionately calling business-friendly DI. It’s the final step in an evolutionary data integration chain that starts with a project-based approach, shifts to an application-centric approach, then to a process-centric DI approach, and finally arrives at architecture-based DI.
And here’s the first of the 12 building blocks: “Put the business in charge of data.”
Normally, I’d be all nodding and “yes, yes, okay.” But this time, I thought, “Whoa there, buddy.”
That sounds great in theory, but as this recent encounter shows, there’s a lot of stuff that needs to happen in between, like maybe a serious talk about what “in charge” means.
The rest are great, but many strike me as absolutely IT’s job. I particularly liked number four — position data as a service — which seems like a good way to automate the integration (building block #7) while safeguarding it from business user error.
Sure enough, future iterations of that time clock software support sending data as a service, and, of course, SAP is service-enabled.
But once again, I had to pause at #9: Move to self-service. That’s another “sounds great,” but only if you’ve planned for the ID-10T error.
It’s a good list, but I would put “put the business in charge of the data” after most of the other blocks. Again, I’m no expert, but it seems like some sort of historical archive needs to be added or turned on.
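For what it’s worth, that archive step could be tiny. Here’s a minimal sketch of the idea — copy the export file to a timestamped archive before the importer overwrites it. The file paths and function name are my own invention for illustration; I have no idea what this shop’s actual file layout looks like.

```python
import shutil
from datetime import datetime
from pathlib import Path

# Hypothetical paths -- the real file names in this shop are unknown.
EXPORT_FILE = Path("/data/timeclock/cats_batch.txt")
ARCHIVE_DIR = Path("/data/timeclock/archive")

def archive_batch(export_file: Path = EXPORT_FILE,
                  archive_dir: Path = ARCHIVE_DIR) -> Path:
    """Copy the current export to a timestamped file before it's overwritten."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = archive_dir / f"{export_file.stem}-{stamp}{export_file.suffix}"
    shutil.copy2(export_file, dest)  # copy2 preserves timestamps, handy for audits
    return dest
```

Run that right before the overwrite and the “data was long gone” scenario becomes a restore from the archive folder instead of a lost week of payroll.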
Because here’s a hard fact: The business may “own” the data, but you can bet your britches IT will own the problems, whether they’re bad architecture design, a brittle integration or simply PEBKAC.