8 Web App Security Best Practices to Fight Off Bot Intrusions


Web applications have become a mainstay of the business world. Whether it's the backend of a mobile app that connects users to your product or your public-facing website, one thing remains the same: web apps are now as important to doing business as brick-and-mortar operations. Yet we sometimes overlook the need to secure them.

To complicate matters, we've seen a huge increase in bots, which now make up 61 percent of all website traffic. Cheap cloud computing resources and open source software have enabled attackers to launch bot attacks faster and at lower cost than ever before. Hackers use bots to uncover website security vulnerabilities – at scale – then spread their attack origins across hundreds of IPs. Bad bots are now the key culprits behind web scraping, online fraud, reconnaissance attacks, man-in-the-browser attacks, brute force attacks and application denial of service.

Securing web apps from the millions of bad bots that attempt to penetrate them each year can seem like a daunting task. John Stauffacher, a world-renowned expert in web application security and the author of Web Application Firewalls: A Practical Approach, recently sat down with Rami Essaid, CEO of Distil Networks, to brainstorm actionable ways organizations can defend their web applications from malicious bots. The good news is that you can quickly shore up your defenses by following a few simple rules and implementing controls within your application development lifecycle.

Scrub All Inputs

Any time an application receives data, from any source, assume that the data is unclean and needs to be sanitized. Scrub incoming data to eliminate anything that looks like program logic or executable content, even if execution would occur elsewhere. The cleaning process is complex: it means searching for and removing character sequences that could enable vulnerabilities. When an application scrubs its inputs this way, exposure to attacks like cross-site scripting (XSS) and SQL injection (SQLi) drops considerably. Going one step further and defining a character set – the collection of characters a valid input may contain – takes that exposure to almost zero.

A strong web application firewall (WAF) input policy specifies exactly which characters your application expects for each of its inputs. If your application expects a product ID consisting of 12 digits, then a basic WAF input policy would at the very least remove control characters and punctuation. A strong policy would constrain the product ID to exactly 12 characters, all of them digits – anything else should throw an error. Scrub data any time you accept it from end users or external services.
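The scrub-then-whitelist approach described for the 12-digit product ID can be sketched in a few lines. This is a minimal illustration only, not production code or a real WAF rule; the helper names (`scrub`, `validate_product_id`) are hypothetical, and the 12-digit rule comes from the example in the text.

```python
import re

# Control characters (C0 range plus DEL) are stripped first, as a
# basic WAF input policy would do.
CONTROL_CHARS = re.compile(r"[\x00-\x1f\x7f]")

# Whitelist: a valid product ID is exactly 12 digits, nothing else.
PRODUCT_ID = re.compile(r"[0-9]{12}")

def scrub(value: str) -> str:
    """Remove control characters and surrounding whitespace."""
    return CONTROL_CHARS.sub("", value).strip()

def validate_product_id(raw: str) -> str:
    """Scrub the input, then enforce the 12-digit whitelist.

    Raises ValueError for anything outside the defined character set,
    mirroring the 'anything else should throw an error' rule.
    """
    cleaned = scrub(raw)
    if not PRODUCT_ID.fullmatch(cleaned):
        raise ValueError("invalid product ID: expected exactly 12 digits")
    return cleaned

validate_product_id("123456789012")          # accepted
# validate_product_id("12345'; DROP--")      # would raise ValueError
```

Note the design choice: rather than trying to enumerate every dangerous sequence (a blacklist), the whitelist defines the only acceptable input and rejects everything else, which is what takes exposure "to almost zero."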
