Automated security scanning for websites is a process that periodically scans a website for security flaws and uploads HTML reports of any vulnerabilities found to a site where the development team can review them. Even if the site was originally built with security in mind, security scanning should be done regularly to make sure that information remains secure as the website evolves from early deployment to maturity. An automated web application security scanning scheme that runs the relevant security checks daily and performs a deeper discovery/training scan at the weekend fits most security departments' requirements.
How to implement automated web application security scanning using Arachni?
As part of a highly effective DevOps process, Syloé works with the development and security teams to define and implement the security policy, of which automated security scanning is a key component.
Our approach to setting up automated web application security scanning for a new client
The basic approach we use to set up automated security scanning for a new client is simple:
1) Identify the checks to run
Include only checks that are relevant to the underlying technology, such as "SQL Injection" checks against PostgreSQL. Running irrelevant checks wastes computing resources, and because we are the administrators of the site, there is no need for a discovery phase.
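As a sketch, selecting checks for a PostgreSQL-backed site could look like the following; the URL is a placeholder, and the check names come from Arachni's built-in check list:

```shell
# Show the available checks matching a pattern before deciding what to run.
arachni --checks-list sql

# Run only the SQL injection family of checks against a placeholder URL;
# the '*' glob selects all variants (blind, timing, differential).
arachni https://www.example.com --checks=sql_injection*
```

Narrowing the check list like this keeps scan times short enough for a daily run.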
2) Identify what URLs should be scanned
Identify the URLs for both the public site and the pages that require login, and run separate scans for the logged-in and anonymous scenarios.
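Arachni's scope options can express this split; a sketch with placeholder URLs and path patterns:

```shell
# Anonymous scan: crawl the public site, skip the authenticated area.
arachni https://www.example.com \
    --scope-exclude-pattern='/account/'

# Authenticated scan: restrict the crawl to the logged-in area, and keep
# Arachni from following the logout link, which would end the session mid-scan.
arachni https://www.example.com/account/ \
    --scope-include-pattern='/account/' \
    --scope-exclude-pattern='logout'
```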
3) Script the login procedure
We need a user account that is dedicated to Arachni, configured with the rights of a normal/standard user, never an administrative account. The login procedure itself is a simple Ruby script that can be called from the command line and integrated into the Arachni scan.
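A minimal sketch of such a script, assuming Arachni's login_script plugin, which exposes a Watir `browser` object; the URL, form field names and the dedicated 'arachni' account are placeholders:

```ruby
# login.rb — run by Arachni via: --plugin=login_script:script=login.rb
# The `browser` object (a Watir::Browser) is provided by Arachni.
# URL, field names and the 'arachni' account are placeholders.
browser.goto 'https://www.example.com/login'
browser.text_field(name: 'username').set 'arachni'
browser.text_field(name: 'password').set ENV['ARACHNI_PASSWORD']
browser.button(type: 'submit').click
```

Pairing this with `--session-check-url` and `--session-check-pattern` lets Arachni detect when the session is lost and log in again.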
4) Finalize the scans in a test environment
Arachni offers many command-line parameters to customize the scanning procedure, so it takes some time to identify exactly what the final scans will look like on the command line. Do not run the scans against the production servers while still developing them: this could seriously impact performance and generate unwanted results. Arachni might, for example, auto-fill some forms a few hundred times, skewing site usage statistics.
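Putting the previous steps together, a candidate final scan against a test host might look like the following sketch; all hostnames, patterns and the session-check values are placeholders:

```shell
# Authenticated scan of the test environment, combining check selection,
# scope, the scripted login, and a saved report for later conversion.
arachni https://staging.example.com/account/ \
    --checks=sql_injection*,xss* \
    --scope-include-pattern='/account/' \
    --scope-exclude-pattern='logout' \
    --plugin=login_script:script=login.rb \
    --session-check-url=https://staging.example.com/account/profile \
    --session-check-pattern='Logged in as' \
    --report-save-path=scan-test.afr
```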
5) Script the scans
Using crontab and bash, it is easy enough to automate the entire process: the Arachni reports (in HTML format) are automatically transferred to a web server, so the customer can always review the latest security scan report.
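A sketch of such a wrapper; the hosts, paths and schedule are placeholders, not a definitive setup:

```shell
#!/usr/bin/env bash
# run_scan.sh — nightly scan wrapper (sketch; hosts and paths are examples).
# Installed via crontab, e.g.:  30 2 * * *  /usr/local/bin/run_scan.sh
set -euo pipefail

STAMP=$(date +%F)
AFR="/var/lib/arachni/scan-$STAMP.afr"

# Run the scan and save Arachni's native report format.
arachni https://www.example.com --checks=sql_injection*,xss* \
    --report-save-path="$AFR"

# Convert the native report to HTML and publish it for the customer.
arachni_reporter "$AFR" --reporter="html:outfile=/tmp/report-$STAMP.html.zip"
scp "/tmp/report-$STAMP.html.zip" reports@web.example.com:/var/www/reports/
```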
6) Establish monitoring
An automated process needs to be monitored. At Syloé, we use Zabbix to generate alerts when no report has been produced for a certain time, or when report generation failed. We also track scan run times to understand the impact on resources, with alerts if certain scans take longer than expected.
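Independent of the monitoring system, the freshness condition itself is simple to express; a minimal sketch (the function name and the 26-hour window are our own illustrative choices):

```shell
# report_is_fresh DIR MAX_AGE_MIN
# Succeeds if DIR contains an *.html report modified within the last
# MAX_AGE_MIN minutes; a monitoring item can alert on the exit status.
report_is_fresh() {
  local dir="$1" max_min="$2" fresh
  fresh=$(find "$dir" -name '*.html' -mmin -"$max_min" 2>/dev/null | wc -l)
  [ "$fresh" -gt 0 ]
}

# Example: alert if no report in the last 26 hours (daily scan plus slack).
# report_is_fresh /var/www/reports 1560 || echo "STALE: no recent report"
```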
7) Establish a procedure to deal with new issues
Relevant security issues identified by the scan need to be dealt with; for example, one could target one week to resolve critical issues and four to six weeks to resolve medium-severity issues. Consider temporarily disabling new functionality until the problem is resolved. If a vulnerability shows up in the daily Arachni scan, it is safe to assume it will also show up in attackers' scans.
Contact us to set up automated web application security scanning
Arachni is a very powerful tool that can support your website security framework. At Syloé, we believe in strong monitoring and in using automated security scanning to identify new vulnerabilities. We have experience in hardening against and resolving many types of security issues at the network, systems and application layers, and we customize solutions for customers' web applications with tools such as Apache ModSecurity. Please contact us for details.