What is Technical SEO?

So you’ve optimized your content, researched your keywords, and configured your meta descriptions. What more can you do for your site’s SEO?

SEO can be thought of as three main concepts that work together to make your site more appealing to a search engine. These three concepts are:

  • On-Page – the content your website provides. This is where optimizing your content and keywords comes into play.
  • Off-Page – the inbound links coming from other relevant sites that give your site a sense of authority. You can think of this as the promotion side.
  • Technical – everything that the content itself does not cover. This is the side of SEO we will be discussing in this post.

What is Technical SEO?

Technical SEO is the optimization of everything on your website outside the content itself so that it can rank better. Many facets of site optimization affect how your website is crawled and indexed by a search engine, and the better your site performs technically, the better it can rank.

What can I do to better optimize my site?

First things first: your site needs to load quickly and efficiently. You can use tools such as Google's developer tools or the Pingdom speed test to see how your site is currently performing. These tools will not only give you metrics but also suggestions on how to improve, such as compressing images, removing render-blocking JavaScript, and leveraging browser caching. Addressing these suggestions can improve your page performance and therefore your rankings.
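
If you want to pull these metrics programmatically, the PageSpeed Insights API exposes the same data the web tool shows. Below is a minimal Python sketch, assuming the public v5 endpoint, a placeholder page URL, and an optional API key; treat the response fields as an approximation and check the current API documentation before relying on them.

```python
"""Minimal sketch: query the PageSpeed Insights API (v5) for a page's
Lighthouse performance score. The page URL and API key are placeholders."""
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_performance_score(page_url: str, api_key: str = "") -> float:
    """Return the Lighthouse performance score (0.0 to 1.0) for page_url."""
    params = {"url": page_url, "strategy": "mobile"}
    if api_key:
        params["key"] = api_key
    response = requests.get(PSI_ENDPOINT, params=params, timeout=60)
    response.raise_for_status()
    data = response.json()
    # In the v5 response, Lighthouse results live under "lighthouseResult".
    return data["lighthouseResult"]["categories"]["performance"]["score"]

if __name__ == "__main__":
    score = fetch_performance_score("https://example.com")
    print(f"Performance score: {score:.2f}")
```

Running this on a schedule lets you spot performance regressions before they show up in your rankings.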

Next, you want to make sure that your site is mobile friendly. In 2018, mobile devices made up 58% of site visits, and mobile bounce rates hovered around 50%. You can test this using Google's mobile-friendly testing tool (https://search.google.com/test/mobile-friendly).
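
Google's test is the authoritative check, but as a quick local sanity check you can at least confirm that a page declares a responsive viewport meta tag, one of the basic signals of mobile-friendliness. The Python sketch below assumes the requests and BeautifulSoup libraries and a placeholder URL; it is a rough heuristic, not a replacement for the official tool.

```python
"""Rough sketch: check a page for a responsive viewport meta tag.
Not a substitute for Google's Mobile-Friendly Test; the URL is a placeholder."""
import requests
from bs4 import BeautifulSoup

def has_responsive_viewport(page_url: str) -> bool:
    html = requests.get(page_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    # A mobile-friendly page typically declares:
    # <meta name="viewport" content="width=device-width, initial-scale=1">
    tag = soup.find("meta", attrs={"name": "viewport"})
    return tag is not None and "width=device-width" in tag.get("content", "")

if __name__ == "__main__":
    print(has_responsive_viewport("https://example.com"))
```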

You then need to optimize your site architecture. You can create HTML and XML sitemaps that list the URLs present on your site; as robots crawl your site, these sitemaps direct them where to go. The simpler your sitemaps are to follow, the better your ranking. This gets harder as a site grows, becomes bloated, and gets more difficult to navigate. Essentially, you want any piece of content to be reachable in no more than four clicks.
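
If your platform does not generate an XML sitemap for you, producing one is straightforward. The Python sketch below builds a minimal sitemap in the standard sitemaps.org format from a hard-coded list of placeholder URLs; a real site would pull that list from its CMS or a crawl.

```python
"""Minimal sketch: build an XML sitemap using the sitemaps.org schema.
The URLs are placeholders; real sites generate this list from a CMS or crawl."""
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, path="sitemap.xml"):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_sitemap([
        "https://example.com/",
        "https://example.com/blog/",
        "https://example.com/blog/technical-seo/",
    ])
```

Submitting the resulting sitemap.xml through Search Console tells the robots exactly which URLs you want crawled.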

Finally, fix any errors that occur during the crawling phase. You don’t want a confused robot navigating a tangled web of redirects and non-existent pages, so limit the number of redirects and 404 errors. You want the robot to move through your site easily and efficiently, and if a robot hits these errors, chances are a visitor will too. You can see where these errors occur in your site’s Search Console by navigating to “Crawl > Crawl Errors”. This report shows where the errors occur as well as the inbound links that point to them.
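
You can also catch many of these problems yourself before a robot does. The Python sketch below walks a hand-picked list of placeholder URLs and reports 404s, other error responses, and redirect chains; it is a rough stand-in for what a crawl report surfaces, not a full crawler.

```python
"""Rough sketch: report broken pages and redirect chains for a list of URLs.
The URLs are placeholders; a real audit would use your sitemap or a crawl."""
import requests

def audit_urls(urls):
    for url in urls:
        response = requests.get(url, allow_redirects=True, timeout=30)
        # response.history holds every redirect hop that was followed.
        if response.history:
            hops = " -> ".join(r.url for r in response.history)
            print(f"REDIRECT ({len(response.history)} hop(s)): {hops} -> {response.url}")
        if response.status_code == 404:
            print(f"404 NOT FOUND: {url}")
        elif response.status_code >= 400:
            print(f"ERROR {response.status_code}: {url}")

if __name__ == "__main__":
    audit_urls([
        "https://example.com/",
        "https://example.com/old-page",
    ])
```

Feeding it the URLs from your sitemap is a simple way to spot broken pages and long redirect chains before the search engine, or a visitor, finds them.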