Destroying Your SEO from the Inside

  • Technical Issues That Could be Destroying Your SEO

  • Basics that no webmaster, web designer or marketer should ever forget about.

  • From Robots to Redirects

From link building to incredible content, on-page optimisation to easy navigation, most of us know the basic techniques that improve SEO. However, sometimes using good SEO practices isn’t enough. If your site has a fundamental technical issue, the search engines may struggle to understand what it’s actually about. In some cases, they may even be prevented from crawling your URLs altogether. To ensure this doesn’t happen to you, take a quick look at these common technical issues that could be destroying your SEO.

Fundamental basics of technical SEO

Robots

One of the most important technical aspects of SEO, your robots.txt file – also known as the Robots Exclusion Protocol – gives the search engines instructions on how to crawl your site. If the file is incomplete, incorrect or missing, some search engines may struggle to index your site accurately.

Common mistakes include disallowing the search engines from crawling the entire domain, failing to tell them where the sitemap.xml resides, not updating the sitemap URL when it moves, and forgetting to exclude pages that you don’t want indexed. To ensure this doesn’t happen to your site, make sure your website provider or webmaster has taken the time to implement the file properly and to check that it’s in good standing. A basic SEO audit will pick up issues like this; such a small thing can often lead to significant problems.
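
To make this concrete, here’s a rough sketch of what a healthy robots.txt might look like (the paths and domain are purely illustrative placeholders, and the right rules depend entirely on your own site):

    # Apply to all crawlers; keep private areas out of the crawl
    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/

    # Tell crawlers where the sitemap lives
    Sitemap: https://www.example.com/sitemap.xml

Be careful with that Disallow directive: a single “Disallow: /” would block crawlers from the entire domain, which is exactly the first mistake mentioned above.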

Redirects and Canonicalization

If you’ve migrated your site to a new URL structure, you need to make sure users – and search engines – are automatically redirected to the new pages. Before you launch your new website, set up a series of 301 redirects (most commonly done via .htaccess). These will ensure anyone who clicks on an old link is taken to the new website, allowing you to retain the external links you’ve built up over the years and ensuring your site is easy to find. However, stacking multiple 301 redirects on top of each other is bad practice, so it’s important you go through your site first and update your internal linking structure and any previously redirected URLs to avoid creating chains of 301s.
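
As a rough sketch (the old and new paths here are just placeholders), a simple 301 in .htaccess can look like this, the important point being that every old URL points straight at its final destination rather than at another redirect:

    # Send an old URL straight to its final home in one hop
    Redirect 301 /old-services-page/ https://www.example.com/services/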

Another thing which is often forgotten or neglected by web designers working for small business clients is redirecting either the non-www to the www version of the website, or vice versa. Such a simple tweak can make the world of difference to SEO, user experience and overall functionality. Once the redirect is in place, you should also take the additional step of telling the likes of Google what your “preferred domain” actually is via Webmaster Tools (Search Console). To do this you will be required to verify all versions of your site, which is generally easiest through DNS verification (in my opinion the best option for long-term validation).
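
For the redirect itself, a typical .htaccess rule (assuming Apache with mod_rewrite enabled, and example.com standing in for your own domain) looks something like this:

    RewriteEngine On
    # Send any request for the bare domain to the www version, keeping the path
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]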

We wrote about some of these redirects and robots.txt entries a couple of years ago, most of which is still valid today: htaccess redirects and Robots.txt examples.

Audits, Technical and Fundamental SEO

Broken Links

Broken links are bad for SEO. Even if you haven’t migrated your site, it’s a good idea to check for broken links regularly, especially if you often link out to external sources that may move, shut down or simply change the address of the page you were linking to. If you don’t have a web expert you can call on, there are a number of programmes you can download (such as Screaming Frog) that will check for broken links for you. Google Search Console should also flag up any broken links, giving you the chance to fix these errors as quickly as possible.

Duplicate Content

Search engines dislike duplicate content. The issue is widely misunderstood, but it’s fairly simple to at least give search engines an indication of which variant of a page is the one that matters: using rel=“canonical” tags and being careful (and creative) about the structure of your site will work in your favour. Although features like dynamic URLs and category, tag and product listings can cause duplicate content to be displayed, in general most sites should be able to tweak text, images and links to ensure every single page is unique. This is particularly important for e-commerce sites and content blogs, which have the potential to generate category or tag pages of products/posts that are ultimately the same page under different URLs.
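
As a simple illustration (the URL is a placeholder), a tag or category page that duplicates a product page can point search engines at the version that matters with one line in its <head>:

    <!-- Declares the preferred URL for this content -->
    <link rel="canonical" href="https://www.example.com/product/blue-widget/" />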

Technical issues are crucial to SEO success. To ensure your site has its robots.txt, redirects and other fundamental elements in order, or to find out how we can help boost your SEO, get in touch with a member of our team today.