One of the concepts being floated around the RSA Conference 2010 this week is the need to develop an intelligent approach to whitelisting. The basic concept behind a lot of IT security is that we blacklist sites that are known to be malicious and we whitelist sites that are known to be safe. The 90 percent of all sites that fall between these two definitions would be somewhere in a gray area between black and white.
Because no one would know what to make of an idea called 'graylisting,' Lumension Security is pushing a concept called intelligent whitelisting. The basic idea is that security software should learn and keep track of not just which sites are safe, but specifically which pages within a given Web site are safe. A great many public-facing Web sites wind up serving malware at one time or another, but not every page on every site is infected.
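To make the per-page idea concrete, here is a minimal sketch of how a lookup at page granularity differs from one at site granularity. The lists and URLs are hypothetical, and this is not Lumension's actual data model; it only illustrates the three-way split between known-good, known-bad, and the gray area in between.

```python
from urllib.parse import urlparse

# Hypothetical per-page lists -- illustrative only.
WHITELIST = {("example.com", "/about"), ("example.com", "/products")}
BLACKLIST = {("example.com", "/downloads/installer.exe")}

def classify(url: str) -> str:
    """Classify a specific page, not just its host, as allow/block/gray."""
    parsed = urlparse(url)
    page = (parsed.netloc, parsed.path)
    if page in BLACKLIST:
        return "block"
    if page in WHITELIST:
        return "allow"
    # The roughly 90 percent in between: neither known-good nor known-bad.
    return "gray"

print(classify("http://example.com/about"))        # allow
print(classify("http://example.com/blog/post-1"))  # gray
```

Note that a site-level whitelist would have returned "allow" for both URLs above; tracking pages individually is what lets the second one fall into the gray area.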
According to Ed Brice, senior vice president for global marketing at Lumension, whitelisting has been scorned by end users and security professionals alike because it's a dumb, brute-force approach to security. Lumension is advocating a new approach that combines frequent updating of whitelists with blacklisting inside the Lumension Endpoint Management and Security Suite.
At the core of this offering is a new rules-based engine that allows security managers to define more elastic policies, specifying at a granular level what types of changes can be introduced into the overall environment.
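A rules-based engine of this sort can be sketched as an ordered list of predicates, each paired with a verdict, where the first matching rule wins and a default applies otherwise. Everything below is a hypothetical illustration of that pattern; the rule conditions, field names, and publisher are invented and do not reflect Lumension's product.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Change:
    """A proposed change to an endpoint (hypothetical fields)."""
    publisher: str
    path: str
    signed: bool

# Each rule pairs a predicate with a verdict; first match wins.
Rule = Tuple[Callable[[Change], bool], str]

RULES: List[Rule] = [
    # Elastic policy: signed software from a trusted publisher may install.
    (lambda c: c.signed and c.publisher == "Trusted Corp", "allow"),
    # Rigid backstop: nothing else touches system directories.
    (lambda c: c.path.startswith("C:\\Windows\\"), "deny"),
]

def evaluate(change: Change, default: str = "deny") -> str:
    """Walk the rules in order and return the first matching verdict."""
    for predicate, verdict in RULES:
        if predicate(change):
            return verdict
    return default

print(evaluate(Change("Trusted Corp", "C:\\Program Files\\app.exe", True)))  # allow
print(evaluate(Change("Unknown", "C:\\Windows\\evil.dll", False)))           # deny
```

The appeal of the approach is that policy becomes data rather than a single binary switch: an administrator can loosen one rule without abandoning the deny-by-default posture.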
Only time will tell if this approach will work, especially given Microsoft's misadventures with whitelisting in Windows Vista. But it's pretty clear that rigid security policies not only get in the way of productivity, they also wind up being circumvented by end users. Maybe the time has come to try another, more flexible approach that can still get the job done without driving everybody crazy in the process.