I've long said that the line between technology used in the office and at home is blurring. It's nice to see that the folks at Wharton Business School agree with me.
Self-aggrandizing asides aside, it's difficult to miss the fact that there no longer is a clear divide between technology used at work and at home. This thoughtful piece in the school's magazine lays out the reasons modern communications cross the home/work divide with impunity. The bottom line: young workers entering the workforce have technology in their DNA, today's devices and networks are better able to support business-oriented tasks and, in general, people flow more easily between work and play than they did in the past.
These kids -- as well as the older folks -- are likely to use their consumer gear to read a report or send a business e-mail, and their work devices to watch a television show or surf the Net. The bottom line is that myriad tools and applications -- from IM to mashups to smart phones -- are useful for both. Indeed, the very reason unified communications is a hot topic is that it promises to find a worker wherever she is. This, by definition, includes home, a theater or the dentist's office.
There is a big problem with all this. Corporate history is full of examples of what people do running far ahead of corporate systems' ability to handle it. Folks conducting business on their personal devices, over any old network, raise big security and regulatory issues. SOX and HIPAA inspectors have wide latitude to demand documentation detailing how a certain piece of information has been handled and stored. A company won't get high marks if that record spent time on a laptop that also carried the eDonkey file-sharing program, or if a sensitive IM exchange took place over a service prone to malware.
It seems that there are two parallel universes: The one that is drawn up by security and management folks and the one in which real people live their real lives. The two must overlap or trouble will ensue. Says the writer of the Wharton piece:
Indeed, Gartner predicts that by 2011, 10% of all information technology spending will reside with employees. In other words, employees will pay for and bring their own technology -- laptops, iPhones and the like -- to work as their primary tools.
This is part and parcel of the growth in Web 2.0, which will be seen equally as a corporate and consumer area -- and a big one. Forrester Research says that spending on the technology family will increase 43 percent annually during the next five years, reaching $4.6 billion by 2013.
There is an obvious answer here: Vendors, in the long term, must find a way to empower IT to upgrade consumer technology to provide an adequate level of security. This could be done through firmware and software downloads, circuitry that is latently present in the device, some combination of the two -- or by some other method.
Employees must agree to this security upgrade if they are to use the device for work purposes -- and there must be a management tool to monitor compliance. Likewise, back-end systems must be broad enough to support the wider variety of gadgets and connectivity options.
There is no way to stop employees from using their personal devices and the public networks to which they attach. That is a bad thing only if enterprises refuse to adopt a model that deals with this reality.