As the lines of COBOL code have continued to grow over the years, the number of skilled programmers has steadily declined. It's difficult to quantify the number of active COBOL programmers currently employed. In 2004, the last time Gartner tried to count COBOL programmers, the consultancy estimated that there were approximately two million worldwide and that the number was declining at 5 percent annually.
The greatest impact of COBOL programmers leaving the workforce will be felt from 2015 through 2029. This makes sense when you realize that the oldest of the baby boomer generation are those born between 1946 and 1950 and that this generation will approach the traditional retirement age of 65 between the years 2011 and 2015.
As these workers prepare for retirement, and as part of a generation that doesn't stay at one company for an entire career, companies face challenging last-minute scrambles to capture a career's worth of programming expertise and how it has been applied to company-specific applications.
However, efforts are under way to address this issue. They include IBM's Academic Initiative and the Academic Connection (ACTION) program from Micro Focus International plc, a software company that helps modernize COBOL applications. Both programs are actively involved in teaching COBOL skills at colleges and universities worldwide. However, very little was done to document the knowledge of these skilled workers in the real world while they were on the job.
The COBOL Skills Debate
It's debatable whether a COBOL skills shortage actually exists. To some degree, the impact of the dearth of skilled workers will depend on how many of a company's applications rely on COBOL.
For companies that will be affected, industry analysts at Gartner published a report in 2010 titled 'Ensuring You Have Mainframe Skills Through 2020.' In the report, Gartner analysts advise companies that depend on mainframes how to prepare for the impending skills shortage. The report says companies should work closely with human resources professionals to put a comprehensive plan in place that will help guide them through the next decade, when the mainframe skills shortage will be more prominent.
Mike Chuba, a vice president at Gartner and author of the report, wrote, 'Many organizations have specialists with years of deep knowledge-a formal plan to capture and pass on that knowledge via cross-training or mentoring of existing personnel is critical.'
According to Kim Kazmaier, a senior IT architect with over 30 years of industry experience, this skills challenge is the result of a combination of factors: 'The demographics have changed. You are unlikely to find many people who remain in one company, industry, or technology focus for long periods of time, unlike previous generations of IT professionals. Also, the sheer volume and complexity of technology make it virtually impossible for any individual to master the information about all the technology that's in use within a large IT organization. It used to be that an IT professional would literally study manuals cover to cover, but those days have been replaced by just-in-time learning.'
'To be fair, the information that once filled a bookcase would now fill entire rooms. We simply don't have the luxury to master that much information individually, so we rely on information mash-ups provided by collaboration, search engines, metadata repositories, and often overstretched subject-matter experts.'
How to Prepare for Any Pending Skills Drought
All of this adds up to fewer skilled IT workers to handle the increasing demands of developing and managing software, which results in the proliferation of glitches.
How can organizations address this issue in a logical and realistic way? The most practical approach-and one that can be applied to nearly any challenge of this nature-is to first fully understand the business issue, and then figure out how your people can help address it through technology.
The first step is to conduct an IT audit by taking a comprehensive inventory of all the technology in the infrastructure. Along with tracking all the COBOL-specific applications, you should understand which non-COBOL applications intersect with COBOL applications. Given that we are more connected every day, applications are no longer relegated to specific departments or companies. Because COBOL is behind a significant number of business transactions, there is a strong likelihood that the non-COBOL applications being created today will also pass through a mainframe. The inventory process is actually not as arduous as it may initially seem, given the number of available technology resources that can accelerate this step.
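To make the inventory step concrete, here is a minimal sketch of how the first pass of such an audit might classify source files by language. The function name, the extension list, and the sample paths are all illustrative assumptions, not part of any particular audit tool; a real audit would walk actual repositories and also trace interfaces between COBOL and non-COBOL systems.

```python
import os

# Extensions commonly associated with COBOL programs and copybooks
# (illustrative; extend the set to match your shop's conventions).
COBOL_EXTENSIONS = {".cbl", ".cob", ".cpy"}

def inventory_sources(paths):
    """Split a list of source-file paths into COBOL and non-COBOL buckets.

    In a real audit the paths would come from walking the repositories
    (e.g., with os.walk); here the input is a plain list so the logic
    stays easy to follow.
    """
    result = {"cobol": [], "other": []}
    for path in paths:
        ext = os.path.splitext(path)[1].lower()
        bucket = "cobol" if ext in COBOL_EXTENSIONS else "other"
        result[bucket].append(path)
    return result

# Example: a tiny mixed codebase
inventory = inventory_sources(
    ["payroll.CBL", "billing.cob", "copybooks/cust.cpy", "web/app.js"]
)
```

A full inventory would layer more onto this skeleton: job schedules, database bindings, and the call graph that reveals where modern front ends ultimately depend on mainframe code.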
The next step is to get an update on the percentage of COBOL expertise in your company versus other technologies, as well as tenure and retirement dates. Based on an IT audit and staff evaluation, you can get a clear picture of just how much of a risk the COBOL situation is to your organization.
If you determine that your company is at risk due to a lack of COBOL expertise, consider the following recommendations:
Be realistic about knowledge transfer
Automate as much as possible
Be Realistic About Knowledge Transfer
A logical course of action would be to suggest knowledge transfer, but that won't completely resolve the situation because of two significant issues. The first is that the applications that were put in place decades ago have been consistently tweaked, updated, and enhanced through the years. It would be impossible to review every change made along the way to pick up exactly where a COBOL expert with 30 years of experience left off. This won't be a showstopper, but it can cause a glitch.
This leads to the second issue-experience. It's simply not possible to do a Vulcan mind meld with the retiring workforce and expect a new team of developers to be as conversant in COBOL as people who have dedicated their careers to it. Although knowledge transfer is important, companies should be realistic in their expectations.
There's no reason that the IT job of the future couldn't or shouldn't be a hybrid of various kinds of technology expertise.
For example, you could offer positions that mix different skill sets such as Flash programming and COBOL and offer additional salary and benefits to the cross-trained developers. This would result in greater expertise across the company and would help you avoid creating knowledge silos.
Automate Where Possible
If your company is facing a skills shortage, consider using technology to automate as many tasks as possible. You can never fully replace intellectual knowledge, but this step can help alleviate the time-consuming and less-strategic functions that still need to happen on a regular basis.
Computer Science Is Cool Again
Whether or not you believe that a pending COBOL skills drought is imminent, you can't deny that there is a demand for IT skills across the board. The U.S. Bureau of Labor Statistics (BLS) estimates that by 2018, the information sector will create more than one million jobs. The BLS includes the following areas in this category: data processing, web and application hosting, streaming services, Internet publishing, broadcasting, and software publishing. Although this represents great opportunities for the next generation, we will face a supply-and-demand issue when it comes to building and maintaining the technology that runs our businesses.
This is because the number of students studying computer science and related disciplines at the college and university level is just now on the upswing after steadily declining from 2000 to 2007, largely as a result of the dotcom collapse. In March 2009, The New York Times reported on the Computing Research Association Taulbee Survey. It found that as of 2008, enrollment in computer science programs had increased for the first time in six years, by 6.2 percent. But the gap will still exist from nearly a decade of students who opted out of studying the fundamentals associated with software design, development, and programming. Adding to this is the evidence that students studying computer science today are more interested in working with 'cooler' front-end application technologies-the more visible and lucrative aspects in the industry. They're not as interested in the seemingly less-exciting opportunities associated with the mainframe. The Taulbee Survey found that part of the resurgence in studying computer science is due to the excitement surrounding social media and mobile technologies.
The next factor that is creating this perfect storm is mergers and consolidation. Although market consolidation is part of the natural ebb and flow of any industry, a shift has occurred in the business model of mergers and consolidation. It is driven by larger deals and the need to integrate more technology.
From an IT perspective, when any two companies merge, the integration process is often much lengthier and more time-consuming than originally anticipated. This holds true regardless of the merger's size. Aside from the integration of teams and best practices, there is the very real and potentially very costly process of making the two different IT infrastructures work together.
One of the biggest 'hurry up' components of mergers is the immediacy of combining the back-office systems of the new collective entity. At a minimum, similar if not duplicate applications will be strewn throughout the infrastructure of both companies. With all these variations on the same type of application, such as customer accounts, sales databases, and human resources files, there will undoubtedly be inconsistencies in how the files were created. These inconsistencies become very apparent during the integration process. For example, John Q. Customer may be listed as both J.Q. Customer and Mr. John Customer and might have duplicate entries associated with different addresses and/or accounts, yet all of those accounts represent one customer.
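The duplicate-records problem described above is at heart a record-matching task. Here is a minimal sketch of one common approach: normalize each name to a crude matching key and group records whose keys collide. The normalization rules (drop honorifics, keep first initial plus surname) are illustrative assumptions; production systems use far more robust matching that also weighs addresses and account data.

```python
import re
from collections import defaultdict

# Titles to discard when building a matching key (illustrative list).
HONORIFICS = {"mr", "mrs", "ms", "dr"}

def normalize_name(name):
    """Reduce a customer name to a crude matching key:
    lowercase, strip punctuation and honorifics, then keep
    the first initial plus the surname."""
    words = re.sub(r"[.,]", " ", name).lower().split()
    words = [w for w in words if w not in HONORIFICS]
    if not words:
        return ""
    return words[0][0] + " " + words[-1]

def group_duplicates(records):
    """Group account records whose names normalize to the same key;
    return only the groups containing more than one record."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalize_name(rec["name"])].append(rec)
    return {key: recs for key, recs in groups.items() if len(recs) > 1}
```

Under these rules, 'John Q. Customer', 'J.Q. Customer', and 'Mr. John Customer' all normalize to the same key, so the three accounts surface as one candidate group for a human to reconcile during the integration.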
Along with trying to streamline the number of applications is the challenge of integrating the various technologies that may or may not adhere to industry standards. Getting all the parts of the orchestra to play the right notes at the right time presents a significant challenge for even the most talented IT professionals.
From a more mainstream point of view, spotty efforts to merge infrastructures can have a very real impact on consumers. For example, according to the Boston, Massachusetts television station and NBC affiliate WHDH, as well as the local Boston CW news affiliate, when tuxedo retailers Mr. Tux and Men's Wearhouse merged in 2007, a computer glitch didn't properly track inventory and customer orders. This resulted in wedding parties and others on their way to formal events without their preordered tuxedos. Since you can't change the date of such events, many customers had to incur additional expenses by going to another vendor to ensure they were properly attired for their big day.
Before you start chuckling at the thought of groomsmen wearing tuxedo T-shirts instead of formal wear, keep in mind that the U.S. wedding industry represents $86 billion annually and that men's formal wear represents $250 million annually. Talk about a captive audience-you can well imagine the word-of-mouth influence of an entire wedding reception.
Hallmarks of Successful Mergers
Every company that's been through a merger can share its own tales of what went right, what went wrong, and what never to do again. Yet I believe that successful mergers have consistent hallmarks:
A cross-functional team: This group is dedicated to the success of the merger. It needs to represent all the different functions of the newly formed organization and should work on the integration full time.
A realistic road map: When millions of dollars are at stake, especially if one or both of the companies are publicly traded, there may be a tendency to set aggressive deadlines to accelerate the integration. Don't sacrifice the quality of the efforts and the customer experience for short-term financial gains. For example, if your senior-level IT staff tells you the deadlines are unrealistic, listen carefully and follow their lead.
Humility: Don't assume that the acquiring organization has the better infrastructure and staff. Part of the responsibility of the integration team is to take a closer look at all the resources that are now available to create the strongest company possible.
Technology overlap: Whether a successful integration takes three months or three years, do not shut off any systems until the merger is complete from an IT perspective. You may need to spend more resources to temporarily keep simultaneous systems running, but this is well worth the investment to avoid any disruptions in service to customers.
An extensive integration process: The biggest mistake an acquiring company can make is to assume that the integration will be complete in 90 days. Although it may be complete from a legal and technical standpoint, don't overlook the commitment required from a cultural perspective or you may risk degrading the intellectual value of the acquisition.
The Ubiquity of Technology
The third force that is contributing to the impending IT storm is the sheer volume and ubiquity of technology that exists among both businesses and consumers. It's difficult to overstate the scale at which the IT industry has transformed productivity, stimulated economic growth, and forever changed how people work and live.
It's hard to overlook the contributions of the information technology sector to the gross domestic product (GDP). Even with the natural expansion and contractions in the economy, the IT sector continues to be a growth engine. In 2007, U.S. businesses spent $264.2 billion on information and communication technology (ICT) equipment and computer software, representing a 4.4 percent increase over the year 2006.
The next decade shows no signs of slowing. A February 2010 Internet and mobile growth forecast jointly developed by IT networking vendor Cisco and independent analysts projected that by 2013, global Internet traffic would reach 56 exabytes (EB) per month, with the various forms of video, including TV, video on demand (VOD), Internet video, and peer-to-peer (P2P), exceeding 90 percent of global consumer traffic. The same report forecast that mobile data traffic would grow at a compound annual rate of 108 percent through 2014, climbing from terabytes (TB) into exabytes.
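To see what a 108 percent compound annual growth rate actually implies, the arithmetic can be sketched in a few lines. The function and the sample figures are illustrative, not taken from the Cisco report; the point is simply that growing 108 percent per year means traffic more than doubles annually.

```python
def project_growth(initial, annual_rate, years):
    """Project a value forward under compound annual growth.

    annual_rate is a fraction: 1.08 means 108 percent growth per year,
    i.e., the value multiplies by 2.08 annually.
    """
    return initial * (1 + annual_rate) ** years

# At 108 percent annual growth, one unit of traffic becomes
# 2.08 ** 4, roughly 18.7 units, after four years.
four_year_multiple = project_growth(1.0, 1.08, 4)
```

That compounding is why a forecast can credibly move from terabytes to exabytes within a few years: four doublings-and-then-some multiply the starting volume nearly twentyfold.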
I often equate technology with fire: It can warm your hands on a cold night, or it can burn down your house. The innovations of the past few decades illustrate this point. They've resulted in a massive amount of software and devices that, if not properly developed and managed, can bring down a network and shut out those who rely on it.
The irony is that the technology industry has created and perpetuated the ubiquity of its products, which is leading to this potential perfect storm of sorts.
Does this mean that a technology Armageddon is under way? Well, that's a bit dramatic, but it's not too far from what could potentially happen if we continue to allow these technology glitches to fester in our infrastructures.