According to the Acunetix Web Application Vulnerability Report 2019, 46% of websites had high-severity vulnerabilities.

The top five causes were:

  • Vulnerable web servers
  • CMS Vulnerabilities
  • XSS
  • Vulnerable JS libraries
  • CSRF attacks

In this guide, we’ll cover 13 tips to perform a website security audit on your website and harden your web application.

If you are looking for professional help with a website security audit, click here.

Implement site-wide strong-password use policy

86% of passwords are terrible, according to Troy Hunt's analysis of Have I Been Pwned data. Poor passwords remain one of the major reasons why website and user data are breached every year.

Every business owner should implement a strong-password policy across both the web application and the business. User data, not just company information, is a major asset that a company is responsible for protecting.

Below are a few helpful tips for implementing a strong-password policy:

  1. Use a strong password hashing scheme such as bcrypt (slows down brute-force attacks)
  2. Check new passwords against the Pwned Passwords API from HIBP
  3. Use multi-factor/two-factor authentication on login forms.
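As a sketch of tip 2, the Pwned Passwords API supports k-anonymity: you hash the candidate password with SHA-1 and send only the first five hex characters, so the full hash never leaves your server. A minimal shell check (requires curl and network access; the sample password is for the demo only):

```shell
# Check a password against the Pwned Passwords API using k-anonymity:
# only the first 5 hex chars of the SHA-1 hash are sent to the service.
pw='password'                                   # demo password, known to be breached
hash=$(printf '%s' "$pw" | sha1sum | awk '{print toupper($1)}')
prefix=${hash:0:5}
suffix=${hash:5}
# The API returns matching hash suffixes with breach counts; grep finds ours.
curl -s "https://api.pwnedpasswords.com/range/$prefix" | grep "$suffix" \
  && echo "password found in breaches" || echo "not found (or offline)"
```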

“Web host Hostinger’s data breach affected 14 million users; the outdated SHA-1 hashing algorithm was in use.” – TechCrunch

Update web server

According to the Acunetix 2019 web vulnerability report, web servers took the top position on the high-severity risk graph.


It’s a good security practice to keep all your software updated, and to check regularly for web server updates in particular. To update your web server software, run:

sudo apt update && sudo apt install package_name

Replace package_name with nginx, apache2, caddy, etc.

Pro tip: also update all other software, such as your CMS, database, PHP, and plugins, to the latest versions.

Fix file permissions (WordPress, Magento, and Joomla)

Not all files are created the same. Make sure to double-check the file permission for all the sensitive files on your server.

A simple ls -l will show which user and group get the rwx (read, write, execute) permissions on each file.

For sensitive files, give read/write permissions to the owning user only and no privileges to other users. For example:

Restrict a sensitive file to owner read/write only (no access for group or other users):

chmod 600 filename

To give read-only permission to group and other users:

chmod 644 filename

Appropriate file permissions will prevent public exposure of sensitive files. Just make sure your web server and other applications run as non-root users.
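For a typical WordPress install, the scheme above can be sketched as follows; the scratch directory here stands in for your real web root (commonly /var/www/html):

```shell
# Demo on a scratch directory; point WEBROOT at your real web root instead.
WEBROOT=$(mktemp -d)                              # stands in for /var/www/html
mkdir -p "$WEBROOT/wp-content/uploads"
touch "$WEBROOT/index.php" "$WEBROOT/wp-config.php"

find "$WEBROOT" -type d -exec chmod 755 {} \;     # directories: rwxr-xr-x
find "$WEBROOT" -type f -exec chmod 644 {} \;     # files: rw-r--r--
chmod 600 "$WEBROOT/wp-config.php"                # DB credentials: owner-only
```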

Disable directory listing

With version 5, WordPress disabled directory listing at the application level for the wp-content/uploads/ directory by default.

If you are using an earlier version of WordPress or another CMS (even self-developed), we advise you to disable directory listing:


Apache: add the following line to .htaccess or your Apache configuration file: Options -Indexes

Nginx: in your site config (e.g. site.conf), set autoindex off; inside the server{} or location{} block.
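For Nginx, a minimal server block sketch (the domain and web root are placeholders for your own values):

```
server {
    listen 80;
    server_name example.com;     # placeholder domain
    root /var/www/html;

    location / {
        autoindex off;           # disable directory listing
    }
}
```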

Run a regular health check scan

Regular health checks are a must, since bots are always at work probing your application. You can do this in two ways:

Either hire a professional penetration tester to keep an eye on your application, or run a health check right now using our free website security scanner.


Astra’s Security Scanner runs checks for:

  • Content Security Policy
  • Header Security
  • Cookie Security
  • CORS Tests
  • HTTPS Security
  • And 135+ more security tests

Run regular backups

100% security is a myth; you may fall victim to a data breach at any time. We advise you to keep regular backups of your database.

For MongoDB: mongodump --out /path/to/backup_directory

For MySQL: mysqldump -u root -p database_name > backup.sql

You can also schedule this as a cron job so backups run automatically every 12 hours; use crontab -e to customize the schedule.
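For example, a system cron entry in /etc/cron.d (the user, database name, and paths are placeholders; credentials are assumed to live in root’s ~/.my.cnf so no password appears in the command):

```
# /etc/cron.d/db-backup: dump MySQL at 00:00 and 12:00 every day
# min hour dom mon dow  user  command
0 */12 * * * root mysqldump mydb > /var/backups/mydb-backup.sql
```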

Enable SSL and redirect

Install a free SSL certificate from Let’s Encrypt and enable auto-redirect to HTTPS using Certbot, Let’s Encrypt’s companion tool:

apt-get install software-properties-common python-software-properties
add-apt-repository ppa:certbot/certbot
apt-get update && apt-get install python-certbot-apache

To install LetsEncrypt’s SSL certificate on your domain, run the following command:

certbot --apache -d yourdomain.com

Enable Redirect to Secure HTTPS Access

When you see the prompt for HTTPS redirection, select 2 and hit Enter:

Please choose whether or not to redirect HTTP traffic to HTTPS, removing HTTP access.
1: No redirect - Make no further changes to the webserver configuration.
2: Redirect - Make all requests redirect to secure HTTPS access. 
Choose this for new sites, or if you're confident your site works on HTTPS. 
You can undo this change by editing your web server's configuration.
Select the appropriate number [1-2] then [enter] (press 'c' to cancel): 2

Congratulations! Your SSL certificate is deployed and active.

If you already have SSL implemented on your site, make sure to keep OpenSSL updated to the latest patch release.

Outdated OpenSSL versions carry well-known vulnerabilities (for example, the Heartbleed bug affected OpenSSL 1.0.1 through 1.0.1f) and can put your website security at high risk!

Update OpenSSL

sudo apt update && sudo apt install openssl
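After updating, confirm the installed release:

```shell
# Print the installed OpenSSL version string
openssl version
```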

Use SSH for remote file management

If you are responsible for moving files to and from the server, do it securely with strong encryption to protect your data in transit.

Use the SSH File Transfer Protocol (SFTP) or scp; both run over SSH. On Debian/Ubuntu, they ship with the OpenSSH client:

apt update && apt install openssh-client -y
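Prefer key-based authentication over passwords for these SSH logins. A minimal sketch (the comment string and file locations are placeholders; the empty passphrase is for the demo only):

```shell
# Generate an Ed25519 key pair for SFTP/scp logins.
# -N '' sets an empty passphrase for this demo; use a real passphrase in practice.
KEYDIR=$(mktemp -d)
ssh-keygen -q -t ed25519 -N '' -f "$KEYDIR/id_ed25519" -C 'deploy@example.com'
cat "$KEYDIR/id_ed25519.pub"   # add this line to ~/.ssh/authorized_keys on the server
```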

Prevent bot crawling

Search engine bots are at work 24/7, and you may not want your sensitive directories listed in Google results or your sitemap. As a countermeasure, you can disallow bots from indexing all your content or specific sensitive directories. Keep in mind that robots.txt is publicly readable and only a request, not an access control, so sensitive paths still need proper authentication.

Just open robots.txt in your web application’s root directory and use the following syntax to disallow robots from listing the content of specific directories:

File syntax:

Disallow a specific bot from a specific directory (/sensitive/url/):

User-agent: Googlebot
Disallow: /sensitive/url/

Disallow all bots from /sensitive/url/:

User-agent: *
Disallow: /sensitive/url/

Here’s a beginner’s guide by Moz at http://moz.com/learn/seo/robotstxt that gives a broad overview of robots.txt and its security uses.

Remove unused plugins

Apart from updating existing plugins, we recommend taking time out to clean up your website’s unused plugins. This will make your website more secure while speeding up overall loading time.


Monitor DDoS

Install a web application firewall like Astra on your WordPress, Drupal, Joomla or any other CMS store.

Astra’s web application firewall provides security features like:

  1. Real-time security scans on WP, Drupal, Joomla, etc.
  2. Application firewall (SQLi, XSS, RCE prevention)
  3. Threat analytics
  4. Blacklist monitoring
  5. File upload scanning
  6. Bad bot and spam protection

Hide server version for Nginx/Apache

To hide web server version number, server operating system details, installed Apache modules and more, open your Apache web server configuration file using your favorite editor:

sudo vi /etc/apache2/apache2.conf         #Debian/Ubuntu systems
sudo vi /etc/httpd/conf/httpd.conf        #RHEL/CentOS systems

Then add or modify the lines below:

ServerTokens Prod
ServerSignature Off

Save the file, exit and restart your Apache webserver like so:

sudo service apache2 restart
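Nginx exposes its version number the same way; the equivalent fix there is the server_tokens directive inside the http {} block of /etc/nginx/nginx.conf:

```
server_tokens off;
```

Reload the configuration afterwards with sudo nginx -s reload.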

Enable the HttpOnly cookie flag

There’s a good reason to enable the HttpOnly flag on your cookies: it prevents client-side scripts from reading or modifying the user’s cookie data.

To help mitigate cross-site scripting (XSS) attacks, HttpOnly cookies are inaccessible to JavaScript’s Document.cookie API; they are only sent to the server. For example, cookies that persist server-side sessions don’t need to be available to JavaScript, and the HttpOnly flag should be set.

Set-Cookie: id=a3fWa; Expires=Wed, 21 Nov 2019 07:28:00 GMT; Secure; HttpOnly

A secure cookie is only sent to the server with an encrypted request over the HTTPS protocol. Even with Secure, sensitive information should never be stored in cookies, as they are inherently insecure and this flag can’t offer real protection. Starting with Chrome 52 and Firefox 52, insecure sites (HTTP:) can’t set cookies with the Secure directive.
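How you set these flags depends on your stack. In PHP, for instance, session cookies can be flagged globally via php.ini:

```
session.cookie_httponly = 1
session.cookie_secure = 1
```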

Quick Links for Security Checklist


Weak passwords – https://www.getastra.com/blog/knowledge-base/create-safe-and-secure-passwords/

86% of passwords are terrible (Troy Hunt) – https://www.troyhunt.com/86-of-passwords-are-terrible-and-other-statistics/

Pwned Passwords v2 launch (Troy Hunt) – https://www.troyhunt.com/ive-just-launched-pwned-passwords-version-2/

Hashing Basics – https://www.wired.com/2016/06/hacker-lexicon-password-hashing/

Directory Listing Vuln – “Nine percent of sampled Targets were found to be vulnerable to Directory Listing misconfigurations”

Coding Horror – Protecting Your Cookies: HttpOnly – https://blog.codinghorror.com/protecting-your-cookies-httponly/

Acunetix website vulnerability report 2019 – https://cdn2.hubspot.net/hubfs/4595665/Acunetix_web_application_vulnerability_report_2019.pdf