Reference guide: on-site SEO Top 10

A baseline checklist for SEO projects

Introduction

After managing multiple digital projects and pitching many agencies along the way, I’ve learned that you can’t assume best practices will be implemented: you must specify exactly what should be done and how it should work. SEO is one of the best examples of this; on multiple occasions we were delivered websites with a robots.txt blocking search engines, non-SEO-friendly URLs or pages taking 25 seconds to load.

To help solve this, I put together this list, which organises the top 10 on-site SEO tips in one place so you can attach it to your pitch and at least get the basic technicals covered. It also includes a few extra tips and a testing section you can use when evaluating the agency’s work.

Search Engine Optimisation Top 10

  • The website is accessible and crawlable, i.e. robots.txt is not blocking search engines and there are no stray noindex tags (robots.txt sketch below)
  • The website can be easily discovered, e.g. it uses optimised, human-readable URLs and a sitemap.xml (sitemap sketch below)
  • The website uses HTTPS and forces its usage by redirecting all HTTP traffic (redirect sketch below)
  • The website is fast: as a rule of thumb, a page should not weigh more than 1 MB, excluding images
  • All pages have unique titles and meta descriptions (head sketch below)
  • Structured data is implemented (JSON-LD sketch below)
  • Language tags are properly configured, e.g. hreflang (covered in the head sketch below)
  • All images on the website have descriptive alt attributes (markup sketch below)
  • Heading hierarchy is respected, i.e. the most important title is an H1, the next level an H2, and so on (markup sketch below)
  • The website is shareable and includes the proper Open Graph (og:) tags (sketch below)
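
On the crawlability point, a minimal robots.txt that lets search engines in while keeping non-public paths out could look like this (the /admin/ path and the domain are placeholders):

    User-agent: *
    Disallow: /admin/
    Sitemap: https://www.example.com/sitemap.xml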
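
On discoverability, a bare-bones sitemap.xml listing one optimised URL might look like the following (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/products/blue-widget</loc>
        <lastmod>2019-01-15</lastmod>
      </url>
    </urlset>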
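
To force HTTPS, one common approach (assuming an nginx server; Apache and most CDNs have equivalents) is a permanent redirect of everything on port 80:

    # Redirect all plain-HTTP traffic to HTTPS
    server {
        listen 80;
        server_name example.com www.example.com;
        return 301 https://$host$request_uri;
    }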
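
Unique titles, meta descriptions and hreflang tags all live in the page <head>. A sketch for the English version of a page that also exists in Spanish (all names and URLs are placeholders):

    <head>
      <!-- Unique per page -->
      <title>Blue Widget | Example Store</title>
      <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
      <!-- One hreflang entry per language version, plus a default -->
      <link rel="alternate" hreflang="en" href="https://www.example.com/en/blue-widget">
      <link rel="alternate" hreflang="es" href="https://www.example.com/es/blue-widget">
      <link rel="alternate" hreflang="x-default" href="https://www.example.com/blue-widget">
    </head>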
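
For structured data, JSON-LD is the format Google recommends. A minimal schema.org sketch for an organisation (name and URLs are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Store",
      "url": "https://www.example.com",
      "logo": "https://www.example.com/logo.png"
    }
    </script>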
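
Alt attributes and a correct heading hierarchy are plain HTML. A short sketch of a page body that respects both:

    <body>
      <h1>Blue widgets</h1>
      <img src="/img/blue-widget.jpg" alt="A hand-made blue widget on a wooden table">
      <h2>Why our widgets are different</h2>
      <!-- h3s only under an h2, and so on -->
    </body>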
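
Finally, for shareability, the basic set of Open Graph tags, also placed in the <head> (all values are placeholders):

    <meta property="og:title" content="Blue Widget | Example Store">
    <meta property="og:description" content="Hand-made blue widgets, shipped worldwide.">
    <meta property="og:image" content="https://www.example.com/img/blue-widget.jpg">
    <meta property="og:url" content="https://www.example.com/en/blue-widget">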

Testing

Websites must be regularly tested for performance, best practices and accessibility. I recommend the following tools:

  • Lighthouse: a score of at least 80 in all categories (see the CLI sketch below)
  • GTmetrix: a PageSpeed Score of at least B
  • Screaming Frog: check titles, descriptions, 404s, redirects, etc.
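
As a quick way to automate the Lighthouse check, its CLI can be run against any URL (a sketch; the URL is a placeholder):

    # One-off install, then audit a page and save an HTML report
    npm install -g lighthouse
    lighthouse https://www.example.com --output html --output-path ./report.html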

Extra tips

  • If the structure of your website is going to change (e.g. a new CMS), a solid redirection strategy is prepared (see the sketch after this list)
  • A process to monitor and fix errors is in place (e.g. Google Search Console is configured)
  • The sitemap has been submitted to and validated in Google Search Console
  • A process to regularly assess website speed and performance is in place
  • The website should not make more than ~30 requests per page load
  • Development or testing environments are not public and/or not crawlable by search engines (see the sketch after this list)
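
For the redirection strategy, the standard approach is a one-to-one map of old URLs to new ones, served as 301 (permanent) redirects so search engines transfer the old pages’ authority. A sketch, again assuming an nginx server and placeholder paths:

    # Old CMS URL permanently redirected to its new equivalent
    location = /old-page.php {
        return 301 /new-page;
    }

Avoid redirecting everything to the homepage; each old URL should point to its closest new equivalent.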
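
And to keep a staging environment out of search results, a catch-all robots.txt combined with an X-Robots-Tag header is a common belt-and-braces setup (an nginx sketch; the hostname is a placeholder). Password-protecting the environment is safer still:

    # staging robots.txt: block all crawlers
    User-agent: *
    Disallow: /

    # plus, in the nginx config for staging.example.com, a noindex header
    add_header X-Robots-Tag "noindex, nofollow";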