Nobody likes to test applications, but we all know that if it’s not done properly some testing-related issue is eventually going to come back to haunt not just the developer, but the entire business.
Most organizations are still discovering application defects fairly late in the application development lifecycle, according to a new survey conducted by Osterman Research on behalf of Electric Cloud, a provider of software production management tools. The survey polled 140 development professionals working in companies with 1,000 or more employees and at least 50 developers on staff. Worse yet, many organizations are encountering the same bugs over and over again, which means application development costs are being driven up unnecessarily.
Despite these issues, most development organizations take an inconsistent approach to application testing. Some testing activities are automated, while others are done by hand. And surprisingly, many development organizations report they have trouble convincing senior managers about the value of automated testing tools.
However much anyone values their job, nobody enjoys tediously hunting for the same bugs again and again when a tool can do the same work in a matter of minutes. That doesn't mean manual application testing is going away any time soon. But it does mean that IT organizations as a whole could be a lot more efficient about application testing.
According to Electric Cloud CEO Mike Maciag, the biggest issue his company sees is an over-reliance on flawed testing scripts that repeatedly fail to catch the same issues. As security becomes a bigger application testing concern and applications grow more complex, automating certain testing functions is no longer optional. This is especially true as IT organizations adopt more agile and iterative development methodologies that stress testing applications in parallel with their actual development.
When you get right down to it, the biggest factor holding back the adoption of automated testing in most IT organizations is inertia. What little structure these organizations have around testing is so ingrained that no one really takes the time to consider whether there is a better way to handle a set of tasks most developers rank alongside having their taxes done or visiting the dentist.
So the real question becomes, when all is said and done, when will the development team finally get out of its own way when it comes to application testing?