
.htaccess redirects, rewrites and robots.txt optimisation – Updated 2015

By Rivmedia | August 26, 2015 | Featured, General, Hosting

Robots.txt optimisation

When we first wrote this post back in 2011 we outlined best practice for the robots.txt file: blocking irrelevant parts of the file structure to stop search engines indexing certain areas of a site. That was considered best practice until late 2014, when Google released the Fetch and Render tool. From that point on, blocking parts of your site such as the theme and plugin folders can have a negative effect, and Google now gives guidelines to allow bots to crawl any parts of the site containing CSS, scripts and anything else that can alter how a website appears to a user.

As a result, we generally keep robots.txt down to the bare basics and adjust it per situation, but for reference a base file like the one below blocks some resource hogs, scanners and potentially bad bots.
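Something along these lines works as a starting point; the bots listed are only common examples of heavy crawlers, so swap in whichever user agents are actually causing you problems, and change example.com to your own domain:

User-agent: *
Disallow:

# Block a few crawlers that tend to be heavy on server resources (examples only)
User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

Sitemap: http://www.example.com/sitemap.xml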

Redirect non-www to www:
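A typical mod_rewrite rule for this looks like the following; example.com is a placeholder, so substitute your own domain:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]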

Redirect www to non-www:
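The same idea in reverse, again with example.com as a placeholder:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [L,R=301]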

Protecting your .htaccess file

You really don't want people looking at your .htaccess file, so use this to block access to it.
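A simple FilesMatch block does the job; this variant covers any file beginning with .ht, so .htpasswd is protected as well:

<FilesMatch "^\.ht">
# On Apache 2.4 you can use "Require all denied" instead of the two lines below
Order Allow,Deny
Deny from all
</FilesMatch>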

Redirect dedicated IP to domain (duplicate content fix)
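If your site also responds on its dedicated IP address, search engines can index the same content twice. A rule along these lines, with 123.123.123.123 and example.com as placeholders, sends IP-based requests over to the domain:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^123\.123\.123\.123$
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]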

Redirect HTTP to HTTPS/SSL
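Assuming mod_rewrite is available, something like this forces every request over to HTTPS:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]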

Single page 301 .htaccess redirect
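For a one-off page move, a plain mod_alias redirect is the easiest option; the paths and domain here are placeholders:

Redirect 301 /old-page/ http://www.example.com/new-page/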

Redirect entire directory and child URLs to one page or domain
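Both of the following are illustrative; the first collapses a directory and everything beneath it onto a single page, the second maps the directory and its child URLs across to another domain while keeping the paths:

# Send a whole directory and everything under it to one page
RedirectMatch 301 ^/old-directory/ http://www.example.com/new-page/

# Or move the directory and its child URLs to another domain, preserving paths
Redirect 301 /old-directory http://www.newdomain.com/old-directory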

Enable Gzip Compression
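With mod_deflate enabled, a block like this compresses the common text-based content types before they are sent to the browser:

<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css
AddOutputFilterByType DEFLATE application/javascript application/x-javascript
AddOutputFilterByType DEFLATE application/rss+xml application/json
</IfModule>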

Expires Header caching (Leverage Browser Caching)
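With mod_expires enabled, something along these lines tells browsers how long they may cache each content type; the times shown are only sensible starting points, so tune them to how often your assets change:

<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
ExpiresByType image/gif "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
ExpiresDefault "access plus 2 days"
</IfModule>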

Prevent directory listing / Browsing
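A single line stops Apache from listing the contents of folders that have no index file:

Options -Indexes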

WordPress Hardening

Below are a couple of .htaccess additions that will help harden a WordPress-based website.

Protect wp-config.php
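wp-config.php holds your database credentials and secret keys, so deny web access to it entirely with a Files block like this:

<Files wp-config.php>
Order Allow,Deny
Deny from all
</Files>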

Secure / harden the WordPress includes folder
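A mod_rewrite block along these lines, based on the approach the WordPress Codex suggests, blocks direct web access to the PHP files inside wp-includes; place it above the standard WordPress rules in your .htaccess:

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
# Block direct access to PHP files in wp-admin/includes and wp-includes
RewriteRule ^wp-admin/includes/ - [F,L]
RewriteRule !^wp-includes/ - [S=3]
RewriteRule ^wp-includes/[^/]+\.php$ - [F,L]
RewriteRule ^wp-includes/js/tinymce/langs/.+\.php - [F,L]
RewriteRule ^wp-includes/theme-compat/ - [F,L]
</IfModule>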
