
SEO Audit Part 1: Case Study – Initial Technical Checks (Indexation, Mobile Friendliness, Robots.txt)

SEO Audit: Case Study

College wasn’t a bed of roses for me. As much as I knew the importance of going to school, I absolutely hated it.

Most of my time would be spent in the library, typing away at something or learning about something new; digital marketing was one of those things.

Most of the time, I was either in the library or not in school at all, skipping 7 out of 8 of my classes.

My interest in school was digital marketing, and I was eager, eager to get down and dirty with real work. To get out of school and find out what REAL digital marketers do in their jobs, instead of sitting in class learning academic things. I did play around with affiliate marketing for a while, but well, yeah… it didn’t really stick with me.

But okay, I should stop my rant right here and jump into the SEO audit.

Different Types Of SEO Audits

I’ve always thought that SEO audits were quite one-dimensional and basically encompassed everything web.

But after trawling around the internet for a bit, I found out that SEO audits can actually be broken down into smaller, bite-sized categories.

  1. General Health Audit
  2. Competitive Review
  3. Forensic
  4. Content Quality Review
  5. Backlink Audit
  6. Local SEO

I’m sure there are many others out there, but I think these categories cover the essentials.

Goals Of An SEO Audit

Before I even begin an SEO audit, I thought it would be a good idea to have some goals in mind. Why am I conducting an audit in the first place?

  1. To make sure that everything is ‘OK’ with my site. Kind of like how you have to send your car in for servicing every 6 months to make sure everything’s good and nothing breaks down.
  2. To reassess the website’s SEO/content marketing approach and make sure that I/they are doing things in compliance with search engines’ (Google, Bing, Yahoo) guidelines.
  3. To figure out why organic traffic has been going down or isn’t very high.
  4. Other than searching for problems or loopholes, I would also like to see if there are any opportunities for growing my organic traffic.

Basically, this initial audit is to diagnose the health of my website.

Only after the audit is done will I move on to optimisation: maintenance, fixing problems and pursuing opportunities.

Checklist For A General Health Audit

So when conducting a ‘General Health Audit’ for the website, there are a few items I have to check off the list. They are:

  1. Initial Technical Checks
    • Indexation
    • HTTP Status Codes & Redirect
    • Canonical Redirects For Top-Level Domains (TLDs)
    • Robots.txt
    • Mobile Friendliness
  2. What Search Engine Sees (Screaming Frog)
    • Redirects & Broken Links
    • Title Tags/Page Titles
    • Meta Descriptions
    • H1 & H2 Tags
    • Alt Tags
    • Site Structure (Link Depth)
    • Response Times
    • Manual Individual Page Review (Browseo.net)
    • HTML Validity (W3 Validator)
  3. Traffic & Site Speed
    • Google Analytics Review
    • Site Speed Test
  4. Google Search Console
    • Search Appearance & Structured Data
    • HTML Improvements
    • Search Analytics
    • Search Traffic
    • Index Status
    • Crawl Reviews
  5. Authority & Content Checks
    • Domain Authority
    • Trust Flows
    • Backlinks
    • QDF & Scrape Checks
    • Plagiarism
    • Social Media

In part 1 of this series, I will be focusing on the bullet points under No. 1: the initial technical checks (Indexation, HTTP Status Codes & Redirects, Robots.txt and Mobile Friendliness).

The live website I will be conducting the audit on is www.childyouthdevelopment.com.

Alright! Jumping straight into the SEO audit itself in the next section. First up: Indexation, HTTP Status Codes & Redirects.

General Health Audit: Indexation, HTTP Status Codes & Redirects

So the first thing I’m going to do is hop over to Google and use the ‘site:www.example-site.com’ operator to check whether Google has indexed my webpages.

The first query I’m going to type in is ‘site:childyouthdevelopment.com -www.childyouthdevelopment.com’, to see if there are any unwanted subdomains (created by the previous webmaster) indexed in Google.

This was what I found:

Screen Grab 1

Screen Grab 2

Screen Grab 3

Okay, great! So here are my findings.

  • There aren’t any unwanted subdomains. Just ‘blog.childyouthdevelopment.com’, which I created recently for them to post about themselves and their work, as well as for content marketing purposes.
  • I also found out that dummy pages for blog posts I created previously have been indexed by Google! These are basically pages I set up in WordPress that I don’t want indexed. Note to self: get rid of them ASAP. (Highlighted in red in the 3rd picture)

The second thing I’m going to do is run the plain ‘site:’ search for both the bare domain and the www version of the site.

This was what I found:

Screen Grab 4

Screen Grab 5

Screen Grab 6

Other findings:

  • I realise that I may have some 301 redirect work to do for the top-level domain variants, as both ‘childyouthdevelopment.com’ and ‘www.childyouthdevelopment.com’ are being indexed. I’ll confirm that later on when I check the URLs’ HTTP status codes.
  • Old webpages that I previously deleted (files whose URLs end with .php, marked out in red in the second picture) are still indexed and now return a 404 error.

After checking the indexation in Google, I went on to check the URLs’ HTTP status codes. This was what I got.

HTTP Status Codes Screen Grab

Looking at the ‘https://childyouthdevelopment.com’ URLs, most of them are either 200s or 301s, which is great. That means I do not have to set up any additional canonical redirects between the domain variants. All I have to do is take care of the 404s on the older webpages.
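The status-code check can also be scripted. Here is a minimal sketch using only Python’s standard library; the helper names are my own, and the triage wording reflects my own read of the audit, not any official tool. It fetches a URL without following redirects, so 301s are reported rather than silently followed.

```python
import urllib.error
import urllib.request


class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Report 3xx responses instead of silently following them."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def first_hop(url, timeout=10):
    """Return (status code, Location header or None) for the first response."""
    opener = urllib.request.build_opener(_NoRedirect())
    try:
        with opener.open(url, timeout=timeout) as resp:
            return resp.status, None
    except urllib.error.HTTPError as e:
        # With redirects disabled, 3xx/4xx responses surface as HTTPError.
        return e.code, e.headers.get("Location")


def triage(status, location=None):
    """Map an HTTP status code to the follow-up action this audit implies."""
    if 200 <= status < 300:
        return "OK"
    if status in (301, 308):
        return "Permanent redirect -> " + (location or "unknown target")
    if status in (302, 307):
        return "Temporary redirect (consider a 301)"
    if status == 404:
        return "Broken: fix, redirect, or remove"
    return "Investigate"
```

For example, `triage(*first_hop('http://childyouthdevelopment.com'))` should report the permanent redirect over to the www version if the canonical redirect is in place.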

General Health Audit: Robots.txt

Since I’m using the WordPress CMS, the first thing I’ll do to ensure Googlebot is allowed to crawl the site is head to Settings -> Reading and make sure this box is unchecked.

WordPress Reading Settings

Great! Mine’s not checked. Amazing.

Google Search Console

WordPress will create a robots.txt file that prevents crawling of my pages if that box is checked. Using robots.txt to manage indexation is the wrong way to go about it anyway; I will be using meta robots noindex tags later on to properly manage and control indexation.
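Since I’ll be relying on meta robots noindex tags, here is a naive sketch of how to spot them on a page. The regex is my own simplification for illustration: it assumes the `name` attribute appears before `content` inside the tag, which is how WordPress and its SEO plugins typically emit it.

```python
import re

# Matches <meta name="robots" ... content="...noindex..."> tags.
# Simplification: assumes name= appears before content= in the tag.
_NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)


def has_noindex(html):
    """True if the page's HTML carries a meta robots noindex directive."""
    return bool(_NOINDEX_RE.search(html))
```

Pages I want out of the index (like those dummy test posts) should return True here once the tag is in place; normal pages should return False.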

General Health Audit: Mobile Friendliness

Mobile Friendliness Test Screen Grab 1

Mobile Friendliness Test Screen Grab 2

Mobile Friendliness Test Screen Grab 3

Alright, so I know Google has moved on to mobile-first indexing, which means it is crucial that webpages are mobile friendly right now. To test this, I went to Google’s Mobile-Friendly Test. This test also helps find out which page resources are being blocked, stopping Googlebot from crawling them.

Okay so, note to self: there is 1 page resource (the Google Maps embed) being blocked by robots.txt. Weird. I’ll take note and fix it later on.
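A robots.txt block like this can be reproduced locally with Python’s standard-library robots.txt parser. A minimal sketch follows; the rules and URLs in the test data are made up for illustration, not copied from the live site.

```python
from urllib.robotparser import RobotFileParser


def is_blocked(robots_txt_lines, user_agent, url):
    """True if the given robots.txt rules stop `user_agent` fetching `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt_lines)  # accepts the file as a list of lines
    return not parser.can_fetch(user_agent, url)
```

One caveat worth noting: embedded third-party resources (like a Google Maps embed) are often blocked by the third party’s own robots.txt, so a warning like mine may not be fixable from the site owner’s side at all.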

General Health Audit: Recap

So here are the results of the checks I conducted in part 1.

Indexation, HTTP Status Codes & Redirects

  1. No unwanted subdomains. Just another blog subdomain that I’d set up previously.
  2. Dummy pages (‘test posts’) that I’d created previously were indexed. Note to self: get rid of them.
  3. No major issues with the top-level domain variants. URLs are either 200s or 301 redirects. Great!
  4. Have to take care of the old 404 webpages/URLs from the previous developer.

Robots.txt

  1. Robots.txt is not blocking webpages.
  2. The WordPress Settings -> Reading option is unchecked.

Mobile Friendliness

  1. Passed the Google Mobile-Friendly check, using the Googlebot Smartphone user agent, with flying colours! Nicely done.
  2. Had one page resource that was blocked.

Alright! Guess that’s it for now!

In part 2 of the General Health Audit, we will be using a tool (Screaming Frog) to see what a search engine picks up when looking at our site. Essentially, I will be covering these:

What Search Engine Sees (Screaming Frog)

  • Redirects & Broken Links
  • Title Tags/Page Titles
  • Meta Descriptions
  • H1 & H2 Tags
  • Alt Tags
  • Site Structure (Link Depth)
  • Response Times
  • Manual Individual Page Review (Browseo.net)
  • HTML Validity (W3 Validator)

Till next time!
