Once upon a time you could buy a domain name, hook it up to some cheap shared hosting, and that was all you had to do. You could build your website or install a WordPress blog and no further configuration was really required. These days you can still do this, but you are leaving yourself open to security, speed, and privacy issues. Surfers are becoming more aware of which websites are safe and which aren’t through warnings from their browsers and anti-virus programs. Not only that, search engines are also starting to penalize websites which do not protect the surfer’s privacy, are slow, or are insecure.
People have been doing SEO work to improve their search engine rankings for a long time. Now, in 2018, SEO is different to how it was 15 years ago, and its importance is joined by security, speed, and privacy as the four things everyone should be looking into. I have labeled each header to show which of the four areas the technology is used for.
These are just my thoughts at the moment, and much of it is my opinion. There are people who know much more about everything here, so my advice would be to do more research before making any changes to your websites.
These best practices are the current general buzzwords for all websites that I think people may be slow to adopt. They should be added to the specific best practices for whichever kind of website you have concerning permissions, ownership, coding, code injection, etc.
This guide is just a quick look at all the topics, and I may have missed some out. For each there is a short discussion and generally a link (or links) to more information or tutorials to follow.
If you use a CDN, like CloudFlare, some of these may already be done without you having to think about it but they are good to know about, especially if you do not use a CDN. Also, if you use managed or shared hosting you may not be able to change some of these, but they may already be done for you by your hosting company.
Here are some best practices for websites in 2018. Some of these used to be nice-to-haves but are fast becoming must-haves, if they are not already.
Google Audit in the Chrome browser (powered by Lighthouse) has replaced Google PageSpeed and offers a lot more detail than before as to how Google views your website.
Much of what Google Audit looks at is the speed of your website, especially over mobile networks. It wants the content that first appears in the browser to load very quickly, with content from further down the page loading afterward. The Audit mainly covers the content of the website and how quickly it loads. The harshest test is to run it in mobile mode with 3G and CPU throttling switched on. Google wants the above-the-fold content to be displayed quickly even on slow 3G.
Every time I do an Audit I have to be prepared for it to be painful reading. The good thing is that it highlights issues you might not have seen, or that you thought were already fixed.
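The same engine is also available on the command line as the lighthouse npm package, if you want scriptable reports. A minimal sketch, assuming Node.js is installed (mobile emulation and network throttling are applied by default)…
npm install -g lighthouse
lighthouse https://example.com --view   # run the audit and open the HTML report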
Enabling SSL encryption and forcing your website to use HTTPS has hit the headlines, mainly because of changes to browsers which mean that HTTP-only websites are starting to look like bad places to visit.
The docs for Apache SSL are here… link
The disadvantage of just using SSL encryption by itself is that the website can often be much slower than it would have been without the encryption, due to the extra handshakes that HTTPS needs. But there are ways to further tweak HTTPS that will improve both the security and the speed of HTTPS websites. Most of these changes can be made in the SSL config file on Debian; if you add them to another file (e.g. apache2.conf or a virtualhost file), make sure that there are no conflicts.
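Forcing HTTPS itself can be as simple as a permanent redirect in the port 80 virtualhost. A minimal sketch, assuming a site at example.com…
<VirtualHost *:80>
    ServerName example.com
    # Send every plain-HTTP request to the HTTPS version of the site
    Redirect permanent / https://example.com/
</VirtualHost>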
To check how your own site ranks for security, this website gives a good overview and even gives you a grade to show exactly how secure it thinks your site is.
TLS Session Resumption is configured in the SSL config file on Apache web servers. By default, it should be enabled. Check whether this is enabled for your website at SSL Labs.
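On Debian the relevant directives in ssl.conf look something like this (the shmcb cache and these values are the Debian defaults)…
# Keep a shared-memory cache of SSL sessions so that returning clients
# can resume a session without repeating the full handshake
SSLSessionCache shmcb:${APACHE_RUN_DIR}/ssl_scache(512000)
SSLSessionCacheTimeout 300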
TLS Session Resumption is the default with Cloudflare Flexible SSL… link.
If you use a CDN, HTTP/2 may already be set up, or it may be an option that you can select. If you do not use a CDN you should check that your server is compatible with HTTP/2 like I did in this post.
Enabling HTTP/2 before HTTP/1.1 looks like this…
Protocols h2 http/1.1
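On Debian you would enable the module and restart Apache like this (note that mod_http2 needs a threaded MPM such as mpm_event; it will not run over mpm_prefork)…
sudo a2enmod http2
sudo systemctl restart apache2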
HTTP/2 wiki… link
HSTS wiki… link
How to use HSTS… link
On Debian you first have to enable the headers module…
a2enmod headers
then add this to the virtualhost file or the apache2.conf file…
15552000 seconds is 6 months.
# Use HTTP Strict Transport Security to force clients to use secure connections only
Header always set Strict-Transport-Security "max-age=15552000; includeSubDomains"
Header always set X-Frame-Options "DENY"
Header always set X-Content-Type-Options "nosniff"
Then restart Apache and test with SSL Labs.
Enabling Perfect Forward Secrecy (PFS)… link also link.
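In Apache terms, PFS means preferring ephemeral (ECDHE/DHE) key exchange, so that a leaked server key cannot be used to decrypt recorded traffic later. A minimal sketch; the cipher list here is illustrative, not a recommendation…
# Make the server's cipher preference win over the client's
SSLHonorCipherOrder on
# Prefer ephemeral key exchange; exclude known-weak ciphers
SSLCipherSuite ECDHE+AESGCM:DHE+AESGCM:ECDHE+AES256:!aNULL:!MD5:!RC4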
Some cryptographic protocols are deprecated because they can be hacked and are thus insecure. Very old browsers may not support TLS 1.1 or TLS 1.2, so you have to strike a compromise between security and accessibility. If you think a lot of your viewers may have older browsers you can keep SSL 2.0, SSL 3.0 and TLS 1.0 enabled; however, these are all insecure. Allowing only TLS 1.0 and above is better. Allowing only TLS 1.1 and above is much more secure. The risk of forcing too high a cryptographic protocol is that some visitors may be using browsers which do not support it. It’s a balancing act which comes down to your own decision about what is more important: security or accessibility.
If you just wanted to allow TLS 1.1 and TLS 1.2 you would add this to your ssl.conf or apache2.conf (in Debian). Be careful that there are no conflicts between these two files and the individual virtualhost files…
SSLProtocol TLSv1.2 TLSv1.1
You can check which browsers use which cryptographic protocols at this link.
Specifying which certificate authorities (CAs) are allowed to issue certificates for your domain(s), using CAA records, also makes your website more secure. You set these through your domain registrar or CDN, where available.
On CloudFlare using their Flexible SSL you would need the following…
example.com. IN CAA 0 issue "comodoca.com"
example.com. IN CAA 0 issue "digicert.com"
example.com. IN CAA 0 issue "globalsign.com"
example.com. IN CAA 0 issuewild "comodoca.com"
example.com. IN CAA 0 issuewild "digicert.com"
example.com. IN CAA 0 issuewild "globalsign.com"
Taken from the CloudFlare blog
DNSSEC was designed to protect applications (and caching resolvers serving those applications) from using forged or manipulated DNS data, such as that created by DNS cache poisoning. All answers from DNSSEC protected zones are digitally signed. By checking the digital signature, a DNS resolver is able to check if the information is identical (i.e. unmodified and complete) to the information published by the zone owner and served on an authoritative DNS server. While protecting IP addresses is the immediate concern for many users, DNSSEC can protect any data published in the DNS, including text records (TXT) and mail exchange records (MX), and can be used to bootstrap other security systems that publish references to cryptographic certificates stored in the DNS such as Certificate Records (CERT records, RFC 4398), SSH fingerprints (SSHFP, RFC 4255), IPSec public keys (IPSECKEY, RFC 4025), and TLS Trust Anchors (TLSA, RFC 6698).
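DNSSEC is usually a switch you flip at your registrar or CDN rather than something you configure on the web server. Once it is on, you can check it from any machine with dig…
# Ask for DNSSEC records; look for RRSIG entries and the "ad"
# (authenticated data) flag in the response
dig +dnssec example.com A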
A CDN (content delivery network) serves cached copies of your site from servers around the world, which speeds up load times and takes some of the load off your own server.
A service worker is a piece of JavaScript that creates a cache of the website on the viewer’s machine so that they can still view your website if they lose their internet connection.
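A minimal sketch of such a service worker, with hypothetical file paths…
// sw.js: cache a few core files at install time, then fall back to the
// cache whenever a network request fails (e.g. the visitor is offline)
const CACHE = 'site-cache-v1';

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE).then((cache) =>
      cache.addAll(['/', '/css/style.css', '/js/main.js']) // hypothetical paths
    )
  );
});

self.addEventListener('fetch', (event) => {
  event.respondWith(
    fetch(event.request).catch(() => caches.match(event.request))
  );
});
The page registers it with navigator.serviceWorker.register('/sw.js'). Service workers only run over HTTPS (or localhost), which is another reason to enable SSL.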
Having pages that load quickly and not having duplicate content are big parts of SEO. A 301 redirect tells search engines and browsers that they should be using a certain URL. For example, you should be redirecting from HTTP to HTTPS (as in the virtualhost sketch above), and you can redirect from non-www to www or vice versa.
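A sketch of the non-www to www case in Apache, assuming certificates exist for both names…
<VirtualHost *:443>
    ServerName example.com
    # 301 (permanent) redirect so search engines settle on one URL
    Redirect permanent / https://www.example.com/
</VirtualHost>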
With a canonical URL you are telling the search engine the exact URL it should be using. This is another method of ensuring that pages are not listed several times and do not appear to be duplicate content to search engines.
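It is a single line in the page’s head; for example (an illustrative URL)…
<link rel="canonical" href="https://www.example.com/my-post/">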
You add schema to your HTML markup. This is mainly for search engines, as it is not visible on the page… link
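One common format is a JSON-LD block in the page; a small sketch with placeholder values…
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Best Practices for Websites in 2018",
  "author": { "@type": "Person", "name": "Your Name" },
  "datePublished": "2018-06-01"
}
</script>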
GDPR came into force in the EU in 2018. Data protection has been around for a long time, but the addition of GDPR means that websites with European visitors should definitely consider having a privacy policy. This is all to do with collecting data on individuals, and how that data is used. It’s probably safest not to collect any data at all, or as little as possible. I know useful stats-based websites that have closed as a direct result of GDPR, which is a shame. On the plus side, it gives Europeans more control over their data, which is probably a good thing.
Do an analysis of your website on SSL Labs and do a Google Audit. Both sets of results will give you a list of things that are good and things that are bad. You can then seek to improve the things that are bad, some of which are covered in this article. It is probably not possible for mere mortals to get 100% perfect, but a lot of these steps are both free and easy to implement, so it’s worth trying to get as high a score as possible.
I have focussed on privacy, security, speed and SEO in this guide. There are considerations that have always been around, or are not especially new for 2018, such as accessibility and having a mobile-friendly website; these should also be looked at if you have not already.
Some of this is primarily aimed at mobile users. Google Audit and service workers, in particular, are very concerned with how the website behaves on mobile connections, which may be intermittent. The benefit of working on these, along with having a mobile-friendly website, is that you may well get more mobile visitors. Google wants to send mobile visitors to websites they’ll enjoy using, so it is gradually increasing the importance of these factors in its mobile rankings.