I've written a couple of times about the federal government's efforts to improve its poor technology track record. Much of the focus appears to be on project management, with initiatives like the Department of Veterans Affairs' Program Management Accountability System (PMAS) vetting process, which it used to identify and either rework or cancel IT projects that were behind schedule and/or over budget. It sounds like a similar approach is being extended to other agencies, based on a Wall Street Journal article that says the Obama administration is considering overhauling 26 troubled federal IT projects that are over budget, haven't met expectations or both.
Among the targeted projects:
Federal CIO Vivek Kundra stresses the importance of splitting such projects into more manageable chunks. He's right, of course, but I can't help but think there's a pretty big problem even before these projects make it to the scoping stage. Part of the problem is a convoluted and arcane procurement process that favors incumbent IT vendors and contractors experienced in jumping through all of the necessary hoops, mostly giants like AT&T, Lockheed Martin, Accenture, IBM and Oracle, all of them companies involved with problematic projects, according to the article.
I am not saying there is anything wrong with any of those companies. But by making the process so complicated, the government is discouraging many other companies from even trying, which means agencies aren't getting a full picture of the possibilities.
Government procurement appears to be a broken process, one in which agencies hamstring even their usual vendors from suggesting different ways of solving problems or achieving goals. Writing on CIO.com UK, Gary Bettis of Compass Management Consulting opines that government procurement focuses too much on the technical aspects of service delivery and not enough on desired business outcomes. He writes:
... Instead of simply defining the need for desktop clerical tools such as word processing and e-mail, a procurer might insist on a certain PC specification, operating system and versions of the word processing tools rather than letting the vendor come up with the most cost-effective solution to deliver the same outputs.
The result? Bettis says he and his Compass colleagues estimate the public sector pays at least (emphasis mine) 40 percent above the market rate for outsourced IT services. (The figure is based on an analysis of central government contracts in the UK over the last five years. I suspect the figure would be pretty similar here in the United States.)
To be fair, Bettis also mentions government agencies' frequent demands for customized technology, an issue also touched on in the Journal article, which says, "technology companies and federal contractors have privately raised concerns about the Obama administration's broader review of technology spending, arguing that delays and cost overruns often occur because federal officials change their requirements for the projects."
If they really want to cut costs and boost efficiencies, government agencies will use standardized platforms, Bettis says. Standardization "allows service providers to do what they have done so successfully in the private sector -- provide utility IT services to a range of clients at a competitive price, having achieved economies of scale through use of the same delivery infrastructure."
There are certainly hints that at least some agency officials in the United States realize this. For instance, Kundra has been pushing government agencies to consolidate data centers and consider cloud computing. Problem is, just as with tweaking the procurement process, a move to the cloud will entail some major changes to government culture.