When it comes to managing technology products on an ongoing basis, the biggest issue that most chief technologists wind up struggling with is human error. Whether it’s simply poor performance or a major security breach, chances are the initial problem stems from a configuration issue.
Now it’s debatable whether most IT configuration issues stem from products that are overly complex to configure or from people who lack the proper training to configure them. But the Network Barometer Report for 2010 from the IT services firm Dimension Data, which is based on actual customer data regularly collected during network assessments, shows that small IT organizations with fewer than 100 users and large organizations with 500 to 2,500 users have the highest number of vulnerability and configuration issues. In contrast, enterprise-class organizations with 2,500 users or more have the fewest, closely followed by medium-sized organizations with 100 to 500 users.
Rich Schofield, business development manager for the Network Integration unit of Dimension Data, surmises that enterprise-class customers have enough people and technology to throw at the problem. Large-scale customers, in contrast, don’t have as many people or tools, but tend to have more opportunities to run into these issues than medium-sized customers. Smaller customers, in the meantime, tend to have limited access to both personnel and the tools needed to take on the challenge.
Of course, a big part of the fundamental issue when it comes to configuration is the tendency for IT organizations across the spectrum to keep IT infrastructure in place too long. Much of the technology in service today is comparatively obsolete and difficult to manage. So while many IT organizations try to save money by deferring upgrades, they actually incur higher ongoing management costs, because that legacy IT infrastructure is more often than not incompatible with modern systems management tools. For that reason, Schofield says it’s critical for IT organizations to understand just how much of their IT portfolio depends on products and technologies that cost more to operate than they would to replace.
In fact, Schofield estimates that any time an IT organization has an issue with a product, the source of the problem is a configuration issue “well north of 50 percent of the time.”
At a time when many IT organizations are struggling to master the basic concepts of the IT Infrastructure Library (ITIL) as they aspire to run IT more like a business, it’s worth taking a hard look at exactly which products they are managing today.
In the final analysis, the majority of the products being managed by IT organizations today are simply more trouble than they're really worth given the plethora of modern alternatives that typically support a greater array of self-managing and self-healing capabilities. So the real question IT organizations need to ask themselves is how much of their time is being spent on glorified digital maintenance activities versus truly helping the company achieve its business goals?
Click through to see key findings of a report from Dimension Data.
Large and small organizations seem to have the highest number of vulnerabilities.
The business services sector is now at the top of the heap.
The smaller the business, the more configuration issues there tend to be.
The energy sector seems to be the most challenged.
Authentication issues top the list.
Despite a tough economy, the number of obsolete systems still in service declined.
The automotive and manufacturing sector barely edges out travel and transportation.
On average, about 35 percent of an organization's assets are obsolete.