8 Web App Security Best Practices to Fight Off Bot Intrusions


Scrub All Inputs

Any time an application receives data, from any source, assume that the data is unclean and needs to be sanitized. Scrub incoming data to eliminate anything that looks like program logic or executable content, even if execution would occur elsewhere. The cleaning process is complex: it requires searching for and removing character sequences that could enable an attack. When an application "scrubs" its inputs this way, exposure to attacks like cross-site scripting (XSS) and SQL injection (SQLi) drops considerably. Taking it one step further and defining a character set (the collection of characters a valid input may contain) reduces that exposure to nearly zero.
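The allowlist approach described above can be sketched as follows. This is a minimal illustration, not a production sanitizer; the field names and character sets are hypothetical examples of what an application might define per input.

```python
import re

# Hypothetical per-field character sets: each pattern matches any
# character that is NOT in the field's allowlist, so it can be stripped.
DISALLOWED = {
    "username": re.compile(r"[^A-Za-z0-9_.-]"),
    "comment": re.compile(r"[^A-Za-z0-9 .,!?'-]"),
}

def scrub(field: str, value: str) -> str:
    """Remove every character outside the field's defined character set."""
    return DISALLOWED[field].sub("", value)

# A script tag embedded in a comment loses its angle brackets, slashes
# and parentheses, so it can no longer execute if echoed into a page.
print(scrub("comment", "Nice post! <script>alert(1)</script>"))
# → Nice post! scriptalert1script
```

Stripping is the lenient option; for fields with a rigid format, rejecting invalid input outright (as in the WAF example below) is usually safer than silently cleaning it.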

A strong web application firewall (WAF) input policy specifies exactly which characters your application expects for each of its inputs. If your application expects a product ID consisting of 12 digits, a weak WAF input policy would at the very least remove control characters and punctuation. A strong one would constrain the product ID to exactly 12 characters, all of them numerals; anything else should throw an error. You should be scrubbing data any time you accept it from end users or external services.

Web applications have become the mainstay of the business world. Whether it's the backend of a mobile app that connects users to your product or your public-facing website, one thing remains the same. Web apps have become just as important in doing business as brick-and-mortar operations. Yet we sometimes overlook the need to secure our online applications.

To complicate matters, we've seen a huge increase in bots, which now make up 61 percent of all website traffic. Cheap cloud computing resources and open source software have enabled attackers to launch bot attacks faster and at a lower cost than ever before. Hackers use bots to uncover website security vulnerabilities – at scale – then spread their attack origins across hundreds of IPs. Bad bots are now the key culprits behind web scraping, online fraud, reconnaissance attacks, man-in-the-browser attacks, brute force attacks and application denial of service.

Securing web apps from the millions of bad bots that attempt to penetrate them each year can seem like a daunting task. John Stauffacher, a world-renowned expert in web application security, and the author of Web Application Firewalls: A Practical Approach, recently sat down with Rami Essaid, CEO of Distil Networks, to brainstorm actionable ways organizations can defend their web applications from malicious bots. The good news is that you can quickly shore up your defenses by following a few simple rules, as well as implementing controls within your application development lifecycle.

