
Bristol SEO Meetup: Technical SEO Audit For Beginners

At the November Bristol SEO Meetup, our digital marketing executive Faye Watt shared a presentation on technical SEO audits for beginners, and you can check out the slides here.

In this talk, Faye covered:

  • SEO Tools
  • Robots.txt
  • Index
  • Sitemaps
  • HTTP & HTTPS
  • 404 Pages
  • Redirect Errors
  • Page Speed
  • Structured data
  • Mobile-Friendliness

SEO Tools

Some of the tools that are referenced throughout this presentation, and that we use in our day-to-day activities, include:

ScreamingFrog: ScreamingFrog is one of our favourite SEO tools. We use it every day to crawl websites and identify SEO issues, such as 404 errors, incorrect canonical URLs, and redirect chains, and to extract key on-page SEO elements, such as title tags, meta descriptions, and structured data.

Ahrefs: We mainly use Ahrefs to get an in-depth look into a website’s backlink profile. We are able to monitor and identify new or lost backlinks, and research competitors’ backlink profiles to get an idea of the quality of their links and to identify opportunities for new links.

SEMrush: SEMrush is our go-to tool for tracking keyword movements, competitor performance, and for running daily, weekly, or monthly site audits that will quickly flag any new SEO issues so we can act immediately to minimise impact.

Redirect Path: Redirect Path is a super useful Chrome plugin that shows the HTTP status code of a URL, so we can immediately see whether a page is returning a 200, 301, 302, 404, or 500 status code, as well as client-side redirects, such as JavaScript redirects.

GTMetrix: We use GTMetrix to test a website’s page speed. There are so many tools that you can use to review page speed, such as Google’s PageSpeed Insights, but we like to use GTMetrix as it gives a much more in-depth breakdown of which elements are slowing down your webpage and why.

TinyPNG: TinyPNG is a tool you can use to optimise your images for page speed. By uploading your image files to TinyPNG, you can reduce the file size of your images which helps reduce the page load time.

CSS Minifier: CSS Minifier is also related to page speed and is used to minify CSS files to reduce their file size, but make sure you speak to your development team before attempting to implement this!
 

Robots.txt

So, let’s start at the very beginning, the robots.txt file. Your robots.txt file can usually be found by adding /robots.txt at the end of any root domain, e.g. https://example.com/robots.txt.

The robots.txt file tells web robots, usually search engine robots such as Googlebot, which parts of your website they can and cannot crawl. When a search bot reaches a website, it will first look for the robots.txt file and read it before continuing to crawl the site. By using the disallow directive, you can block search engines from crawling specific pages or sections of your website, such as your staging environment.

Common mistakes:

  • Blocking any pages or sections of your website that you want to be crawled
  • Blocking your entire website from being crawled
  • Using robots.txt to prevent search engines from indexing your content. Blocked pages can still be indexed if they are linked to from another page, so use a noindex meta tag instead

Examples of robots.txt:

Blocking all web crawlers from crawling your entire website:

User-agent: *
Disallow: /

Blocking Googlebot from crawling a specific sub-folder and all URLs within it:

User-agent: Googlebot
Disallow: /example-sub-folder/
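
If you want to spot-check how your robots.txt treats a specific URL and user agent, Python’s built-in robotparser module can do this for you. A minimal sketch, using example.com and a made-up sub-folder as placeholders:

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt file
parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

# Check whether Googlebot is allowed to crawl each URL
for url in ["https://example.com/", "https://example.com/example-sub-folder/page"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked'} for Googlebot")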

Index

Once your website has been crawled, it can then be indexed in the search results. 

It’s important to review how your website appears in the search index, and to identify:

  • How many pages are being indexed
  • Any missing pages
  • Whether your important pages are being indexed
  • Any pages that are incorrectly indexed
  • Any pages incorrectly excluded from the index by the noindex meta tag

To find out which of your website’s pages are included in the index, you can perform a site: search in Google’s search bar, e.g. site:edicomedia.com, which returns a list of pages that are in Google’s index.

There is also a new tool in Google Search Console, the Index Coverage report, which shows which URLs are included in the index, which are excluded, and why they are being excluded.

If your webpages are being excluded from the index:

  • Check for a noindex meta tag and remove it (see the sketch after this list for a quick way to spot one)
  • Inspect the URL using Search Console’s URL inspection tool, which tells you why the page is excluded
  • Once you have solved any issues that might be preventing your URL from being indexed, you should then request indexing in Search Console
  • If no issues with the page are found, the page will be queued for indexing
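
To quickly check a page for the two most common noindex signals, a robots meta tag in the HTML and an X-Robots-Tag response header, you can use a short script like the sketch below. It assumes the third-party requests library is installed and the URL is a placeholder; a crawler such as ScreamingFrog will report the same thing at scale.

import re
import requests

url = "https://example.com/some-page"  # placeholder URL
response = requests.get(url, timeout=10)

# noindex can be set in an HTTP response header...
header_noindex = "noindex" in response.headers.get("X-Robots-Tag", "").lower()

# ...or in a robots/googlebot meta tag (naive check; attribute order can vary)
meta_noindex = bool(re.search(
    r'<meta[^>]+name=["\'](?:robots|googlebot)["\'][^>]*noindex',
    response.text,
    re.IGNORECASE,
))

print(f"X-Robots-Tag noindex: {header_noindex}")
print(f"Meta robots noindex:  {meta_noindex}")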
     

Sitemap

Your website’s sitemap can usually be found by adding /sitemap.xml or /sitemap at the end of your root domain e.g. example.com/sitemap.xml. Your sitemap helps search engines better understand your website so that they crawl it more intelligently.  

Using an XML to CSV converter, you can export your website's sitemap to a spreadsheet, upload the URLs to ScreamingFrog, and then do the following (a script-based alternative is sketched after the list):

  • Remove URLs that return a 404 error
  • Remove URLs that redirect
  • Remove non-canonical URLs
  • Identify any URLs that are missing from the sitemap
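
As an alternative to the spreadsheet route, you can also read the sitemap directly with a short script. A minimal sketch, assuming the requests library is installed and that the sitemap is a single file rather than a sitemap index; the URL is a placeholder:

import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch and parse the sitemap
sitemap = requests.get("https://example.com/sitemap.xml", timeout=10)
root = ET.fromstring(sitemap.content)

# Flag any sitemap URL that does not return a 200 (i.e. 404s and redirects)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status}  {url}")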
     

URL Status Codes

Put together a list of all of your website's URLs and identify any that do not return a 200 status code. We use several tools to collect URLs and check their HTTP status, including ScreamingFrog, Redirect Path, Search Console, and Ahrefs (a script-based sketch of this check follows the list of common status codes below).

Common HTTP Status Codes:

HTTP Status Code 3xx – Redirection

  • HTTP 301 Moved Permanently
  • HTTP 302 Found (temporary redirect)

HTTP Status Code 4xx – Client Error

  • HTTP 404 Not Found
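
If you want to check a list of URLs without a crawler, a short script can report each URL's final status code and, much like the Redirect Path plugin, show any redirect hops along the way. A minimal sketch, assuming the requests library is installed; the URLs are placeholders:

import requests

urls = [
    "https://example.com/",
    "http://example.com/old-page",
]

for url in urls:
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:  # each 3xx hop before the final response
        print(f"{hop.status_code}  {hop.url} -> {hop.headers.get('Location')}")
    print(f"{response.status_code}  {response.url}")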

Internal & External Links

Perform a crawl of your website to identify any internal or external links that return a 3xx or 4xx status code. If an internally linked URL returns a 3xx status code, update the link to point directly at the destination URL. If it returns a 4xx error, try to find a relevant page that it can be redirected to; if there is no relevant destination, remove the internal link. Do the same for any external links on your website.
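
Here is a minimal sketch of this check for a single page, using the requests library and the standard library's HTML parser; a real audit would crawl every page, and the URL is a placeholder:

from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith(("#", "mailto:", "tel:")):
                self.links.append(href)

page_url = "https://example.com/"  # placeholder page to check
collector = LinkCollector()
collector.feed(requests.get(page_url, timeout=10).text)

for href in collector.links:
    target = urljoin(page_url, href)
    status = requests.head(target, allow_redirects=False, timeout=10).status_code
    if 300 <= status < 400:
        print(f"{status}  {target}  -> update the link to the destination URL")
    elif 400 <= status < 500:
        print(f"{status}  {target}  -> redirect to a relevant page or remove the link")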

Inbound Links

Export and crawl your backlink profile to identify any inbound links that point to URLs returning a 4xx status code (a minimal sketch of steps 2 to 4 follows the list):

  1. Export your historical backlink profile from Ahrefs or Google Search Console
  2. Upload the list of destination URLs to ScreamingFrog
  3. Identify URLs that return a 404 error
  4. Redirect these URLs to a live URL or contact the site owner to update the link
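
A minimal sketch of steps 2 to 4, assuming the requests library is installed, that the exported CSV has the destination URL in a column named "Target URL" (column names vary between tools), and that the file name is a placeholder:

import csv
import requests

# Read the destination URLs out of the backlink export
with open("backlinks-export.csv", newline="", encoding="utf-8") as f:
    targets = {row["Target URL"] for row in csv.DictReader(f)}

# Flag any destination URL that returns a 404
for url in sorted(targets):
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status == 404:
        print(f"404  {url}  -> redirect to a live URL or ask the linking site to update")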

404 Page Not Found

When a user lands on a URL that returns a 404 error, they should see a page that is designed to help them easily navigate your website so that they don’t immediately leave. On your 404 page, you should:

  • Link back to your homepage
  • Provide useful links to your most commonly visited pages
  • Consider adding a search bar to allow the user to search for what they were originally looking for
  • Provide contact details, such as customer support

Common mistakes

One of the most common mistakes we see with 404 handling is that broken URLs actually redirect to a /page-not-found URL which returns a 200 status code. A broken URL should always return a 404 status code.
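
You can spot-check this by requesting a URL that should not exist and confirming it returns a 404 rather than redirecting to a page that returns 200. A minimal sketch, assuming the requests library is installed; the domain and made-up path are placeholders:

import requests

response = requests.get(
    "https://example.com/this-page-should-not-exist-12345",
    allow_redirects=True,
    timeout=10,
)

# Any redirect hops mean the broken URL is being redirected instead of 404ing
for hop in response.history:
    print(f"Redirect: {hop.status_code}  {hop.url}")

print(f"Final status: {response.status_code} (should be 404)")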
 

HTTPS

By now, your website should really be on HTTPS. Google has been using HTTPS as a ranking signal since 2014, and over 70% of the results on page one of Google are now HTTPS.

Google Chrome also now marks all non-HTTPS pages as "Not secure", and since October 2018 this warning turns red when a user enters data on the page.


You should ensure that all of your URLs are HTTPS and that your HTTP URLs correctly 301 redirect to their HTTPS equivalents to avoid creating duplicate URLs and content.
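
A minimal sketch of this check for a single URL, assuming the requests library is installed; the domain is a placeholder:

import requests

response = requests.head("http://example.com/", allow_redirects=False, timeout=10)
location = response.headers.get("Location", "")

# The HTTP version should 301 straight to the HTTPS version
ok = response.status_code == 301 and location.startswith("https://")
print(f"{response.status_code} -> {location}  ({'OK' if ok else 'check your redirect'})")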

Redirect Errors

There are several ways to write a URL and while no style is officially preferred by Google from an SEO perspective, you should always be consistent in your choice and ensure that one version correctly 301 redirects to your preferred version.

Examples:

  • Non-www and www
  • Trailing slash and non-trailing slash
  • Upper and lowercase URLs

So, if you don’t use www in your URLs, make sure that the www version correctly redirects to the non-www version; if you add a trailing slash to the end of your URLs, make sure that the version without a trailing slash redirects to the trailing-slash version, and vice versa. Otherwise, you will be creating two completely different URLs that serve the same content.
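
A minimal sketch of this check, assuming the requests library is installed and a preferred style of non-www, lowercase URLs with a trailing slash; swap the placeholder URLs and variants around to match your own convention:

import requests

preferred = "https://example.com/blog/"
variants = [
    "https://www.example.com/blog/",  # www vs non-www
    "https://example.com/blog",       # missing trailing slash
    "https://example.com/Blog/",      # uppercase vs lowercase
]

# Each non-preferred variant should 301 directly to the preferred URL
for variant in variants:
    response = requests.head(variant, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    ok = response.status_code == 301 and location == preferred
    print(f"{variant} -> {response.status_code} {location}  ({'OK' if ok else 'check'})")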
 

Page Speed

There are multiple tools you can use to review your page speed, including:

  • PageSpeed Insights
  • Pingdom
  • GTMetrix

We prefer to use GTMetrix as it provides a more in-depth analysis of your page speed and what can be done to improve it.

Some of the most common fixes we come across when analysing page speed are:

  • Optimise images
  • Minify and combine JavaScript and CSS files
  • Minimise the use of unnecessary images 
  • JavaScript placement
  • Leverage browser caching

Optimise images

The file size of your images can have a significant impact on the page load time of your webpage, and there are several tools that you can use to compress your images and reduce their file size, such as TinyPNG.
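
If you would rather compress images locally or in bulk instead of uploading them one by one, an image library can do something similar. A minimal sketch using the Pillow library (an assumption, not part of the original talk); the file names are placeholders and the quality setting is just a starting point to tune:

from PIL import Image

image = Image.open("hero-banner.png")

# Save a compressed JPEG copy (convert first in case the PNG has transparency)
image.convert("RGB").save("hero-banner.jpg", quality=80, optimize=True)

# Or save a WebP copy, which is usually smaller still
image.convert("RGBA").save("hero-banner.webp", format="WEBP", quality=80)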

Minify JavaScript & CSS Files

You can visually check whether your JavaScript and CSS files have been minified: a minified file has its comments, line breaks, and unnecessary whitespace stripped out, so the code runs together on one long line.

If your CSS file still contains comments, line breaks, and generous spacing, it needs minifying. You can do this by using a tool such as CSS Minifier, but make sure you speak to your development team before implementing this!
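
To illustrate the difference, here is a deliberately naive sketch that strips comments and collapses whitespace from a small block of CSS. Real minifiers handle far more cases than this, so treat it purely as an illustration of what "before" and "after" look like:

import re

unminified = """
/* Button styles */
.button {
    color: #ffffff;
    background-color: #0066cc;
    padding: 10px 20px;
}
"""

minified = re.sub(r"/\*.*?\*/", "", unminified, flags=re.DOTALL)  # strip comments
minified = re.sub(r"\s+", " ", minified)                          # collapse whitespace
minified = re.sub(r"\s*([{}:;,])\s*", r"\1", minified).strip()    # tighten punctuation

print(minified)  # .button{color:#ffffff;background-color:#0066cc;padding:10px 20px;}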

Minimise use of unnecessary images

Don’t make the mistake of using images for small icons such as:

  • Magnifying glass
  • Arrows
  • Mobile menu icon
  • Social media logos

Use an icon font instead to avoid loading additional, often unnecessary, images. Font Awesome is a great resource for these, and they’re usually super easy to implement.

Structured Data

Structured data is becoming increasingly common, and more options and opportunities are being created all the time. It helps search engines better understand the content on a webpage and can enhance how your content or listing appears in the search results.

Structured data can be added to several elements of a website, such as:

  • Logo
  • Social profile
  • Breadcrumbs
  • Articles
  • Events
  • Job Postings
  • Reviews
  • Recipes
  • Contact
  • Course
  • Local business
  • Livestream

You should review all the content types on your website and identify any structured data opportunities.
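
Structured data is usually added as a JSON-LD script in the page’s <head>. As a hedged illustration, the sketch below generates an Organization snippet with a logo and social profiles; the company details are placeholders, and you should validate the output with Google’s Rich Results Test before deploying it:

import json

structured_data = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Company",
    "url": "https://example.com/",
    "logo": "https://example.com/images/logo.png",
    "sameAs": [
        "https://twitter.com/example",
        "https://www.facebook.com/example",
    ],
}

# Wrap the JSON-LD in the script tag that goes in the page's <head>
snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(structured_data, indent=2)
)
print(snippet)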

Mobile-Friendly

In April 2015, Google released the mobile-friendly algorithm update, which was designed to give mobile-friendly pages a boost in mobile search results, and this was updated again in 2016. In 2018, Google rolled out their mobile-first index, which means your website is now ranked based on the mobile version of your site, even for search results shown to desktop users. So, it’s super important that your website is fully responsive. Read our blog post on the Mobile-First Index to find out more about what you can do to make sure your website is ready.

Get Involved 

If you are interested in attending similar talks, the Bristol SEO Meetup is a monthly event that is free for all to attend. You can sign up to the event to get email reminders for upcoming events.
