SEO – Gouging Eyes out with Rusty Spoons

A very short piece on my current, off-the-wall opinions about my little SEO journey so far and the shenanigans we’ve faced up to this point.

Ok, so it hasn’t been that bad! Plus, it’s interesting that I’ve opted for ‘spoons’ and not just a singular spoon; based on the post title, it looks like I plan to gouge out both eyes at once!

I have certainly had moments of scratching my head as to what the hell is going on. SEO often throws up questions like ‘is option a, b, c, z, or some combination of them all, the best way to approach this?’, which quickly leads to questioning every blooming decision you plan to make to the nth degree!

With the website ‘fundamentally’ finished (ok, I’m completely and utterly lying, I want to rewrite massive chunks in the usual ‘unhappy with the code as soon as you’ve finished writing it’, developer condition, or sickness if you will!), we have desperately been trying to hike our way up the rankings. This has, after a painstaking process, happened; to some degree at least.

I picked this book up for starters, as all good things start with a good read; don’t be put off by the whopping great name:

SEO 2016: Learn search engine optimization with smart internet marketing strategies

After reading this and (thanks for this by the way) picking a few brains of SEO boffins and asking friends to pick additional boffin brains for me, I was ready to start making some of the larger, more critical changes.

So, in no particular order, here were the ideas and concepts that I feel have made the most impact in the short time we have been climbing the SEO mountain…

Page Titles and Page Descriptions

This, without a shadow of a doubt, had the largest impact. We, quite literally, just added unique metadata to each page and ensured our titles and descriptions were informative, without skimping on the keyword combinations we were targeting.
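To make that concrete, here is a minimal sketch of what ‘unique metadata’ means in practice; the page, title and description below are entirely made up for illustration:

```html
<head>
  <!-- Each page gets its own title and description; no two pages share these -->
  <title>Handmade Oak Dining Tables | Example Furniture Co.</title>
  <meta name="description"
        content="Handmade oak dining tables, built to order. Browse sizes, finishes and delivery options.">
</head>
```

The title carries the target keyword combination, and the description reads as a natural sentence rather than a keyword dump.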

Up until this point, the web applications I have produced out in the wild have been linked to from other sites (local authorities, for example); those sites took on the burden of SEO and, in most cases, were sought out for the services they provide (i.e. it was non-commercial in a sense, and the traffic was not being fought over).

Now that I find myself competing for these precious nuggets of traffic, this takes the biggest bite of the biscuit for me. We went from completely impossible to find, for a small subset of the keywords we were targeting, to being on page 3 of the search results (with a little variance across Google and Bing, but that’s a rough approximation). Not perfect, but a very good start!

Keyword Optimisation

This took a couple of days to do in the end. It boiled down, in essence, to looking for keyword combinations with good levels of traffic but lower degrees of difficulty when it came to competition. We ended up spreading our eggs amongst a few baskets here and did target a few competitive keywords, if only in an effort to make our content as natural and accurate as possible (clarity/quality of content is also a ranking factor, after all).

We used the Moz tools for this job, checking the keyword densities of comparable sites and then planning some alternatives. These can be found here (you get a month’s free trial, by the way):

Moz Keyword Explorer

Moz Open Site Explorer

This has had some impact, once our site was re-crawled and re-indexed. Just from a content quality perspective, it was well worth running everything through Grammarly. I have the Chrome plugin but, in this instance, I used the Word plugin to get the job done (you can get a free account, with additional paid-for options available):

Grammarly for Word

Grammarly for Chrome

Keywords in Content, Image Titles/Alt Attributes and Anchor Text

A simple one, but one that, after a few hours of looking around the site and building up a new keyword strategy using the Moz tools, we knew we could improve on.

We now have a much better spread of keywords across our pages, including LSI keywords (Latent Semantic Indexing); essentially just related phrases.

As for image title/alt attributes, we went on a massive, night-long rampage to improve these based on our new-found keyword knowledge (using hyphens to separate words in resource names).
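The pattern for the image attributes was roughly the following; the file name and wording are invented for illustration:

```html
<!-- Hyphen-separated resource name, plus descriptive title/alt text -->
<img src="/images/handmade-oak-dining-table.jpg"
     title="Handmade oak dining table"
     alt="Handmade oak dining table with seating for six">
```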

Lastly, a great deal of effort was placed upon using natural and informative text within anchor tags which, to be honest, we were a little weak on. This appears to have had a positive impact.

Sitemaps.xml

This one got me! I generated a sitemap that, on the face of it, looked prim and proper. However, after checking it in the Google Search Console, I found I had been sucker-punched by dodgy encoding (a UTF-8 BOM, to be exact). There, at the start of my file, sat a Byte Order Mark (invisible in Notepad/Visual Studio, as these things often are). This equalled an instant red flag and is something not worth getting caught out by.
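If you’d rather check for the offending bytes programmatically, here’s a minimal sketch in Python (the sitemap path is just an example):

```python
# A UTF-8 BOM is the three bytes 0xEF 0xBB 0xBF at the very start of a file.
BOM = b"\xef\xbb\xbf"

def strip_bom(data: bytes) -> bytes:
    """Return the bytes with any leading UTF-8 BOM removed."""
    return data[len(BOM):] if data.startswith(BOM) else data

# Example usage (path is hypothetical):
# with open("sitemap.xml", "rb") as f:
#     raw = f.read()
# if raw.startswith(BOM):
#     with open("sitemap.xml", "wb") as f:
#         f.write(strip_bom(raw))  # rewrite the file without the BOM
```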

I found a reasonable looking Sitemaps.xml validator which flagged the issue:

Robots.txt Checker

Recreating the file in Notepad (sans BOM), checking it in and resubmitting it via the Google Search Console fixed the issue.

One last thing (luckily I didn’t do this!): make sure to double-check that your robots.txt file isn’t disallowing legitimate bots from accessing your entire site. Thankfully, I was dead careful in this regard!
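For the avoidance of doubt, the difference is a single character; the first form blocks compliant bots from everything, while the second disallows nothing:

```
# Blocks the entire site - the mistake to avoid:
User-agent: *
Disallow: /

# Allows everything (an empty Disallow rule disallows nothing):
User-agent: *
Disallow:
```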

So, what has raised eyebrows (in the ‘why does this matter’ stakes)? One thing that has really got my goat is the Flesch-Kincaid Reading Ease score. I have spent a fair bit of time wracking my brains as to whether content really needs to be targeted at a thirteen-to-fifteen-year-old student (when it comes to reading ability, that is). In honesty, I’m not sure on this…

SEO 2016 references Searchmetrics findings showing that sites appearing in the top ten results have, on average, a Flesch reading score of 76.00, which puts a stake in the ground at the aforementioned thirteen-to-fifteen-year-old reading level.
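For reference, the Flesch Reading Ease score is 206.835 − 1.015 × (words ÷ sentences) − 84.6 × (syllables ÷ words); higher scores mean easier reading. A rough sketch of the calculation in Python follows; note the syllable counter is a crude vowel-group heuristic, so treat the output as approximate:

```python
import re

def flesch_reading_ease(text: str) -> float:
    """Approximate Flesch Reading Ease; higher scores mean easier reading."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    word_count = max(1, len(words))
    # Crude syllable estimate: count groups of consecutive vowels per word.
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    return 206.835 - 1.015 * (word_count / sentences) - 84.6 * (syllables / word_count)
```

Short, monosyllabic sentences score very high; long words and long sentences drag the score down rapidly, which is exactly the penalty I felt was being applied too harshly.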

This doesn’t sound all that bad to me; but, after some initial testing, I was finding that our content scored below this bar on every occasion. Not by too much, but enough to concern me. Upon testing some other comparable, higher-ranking sites, I wasn’t able to determine that any of them had much better scores, to be honest.

We reread the content, multiple times, before coming to the conclusion that it was easy enough to read. I personally felt that certain sentences and words were being penalised too harshly for being ‘too complex’ or for having ‘too many syllables’, when I’m sure they could be easily read by children far below the thirteen-to-fifteen-year-old mark.

It felt as though we had to dumb down our content to a point where it felt very unnatural; as if we couldn’t compose a sentence of more than five words. Possibly borderline insulting to most people; so we’ll be sticking to our guns for the time being.

What’s Next?

Claire and I have a few other strategies in play, such as getting some strong backlinks going; this is the next port of call. We also need to focus more on the all-important social media. I have additional tasks to perform, too: adjusting how JavaScript is loaded (looking at async/defer options for all of my resources, without busting pages!), plus resource bundling and caching, including resource optimisation (i.e. a little more image compression work). I’ll be a busy bee for a fair while yet!
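On the script-loading front, the distinction between the two attributes is worth noting (the script names below are hypothetical): defer downloads in parallel and executes scripts in document order once parsing finishes, while async executes each script the moment it arrives, in no guaranteed order:

```html
<!-- defer: the safe default for scripts that touch the DOM or depend on each other -->
<script src="/js/site.js" defer></script>

<!-- async: fine for independent scripts, such as analytics -->
<script src="/js/analytics.js" async></script>
```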

We’ll see how the first batch of alterations plays out first; it is incredibly difficult to shake the feeling that a good chunk of this is highly experimental!

If any SEO experts come across this and can offer any further advice, or would like to comment, please do! I’d love to hear from you.

Until the next time, toodles!