Technical SEO Consulting

Technical SEO is a discipline within Search Engine Optimization that focuses on ensuring your website meets the technical requirements of modern search engines, which are constantly evolving.

It focuses on both making sure content is optimized to rank well and understanding how search engine bots (e.g. Googlebot) crawl and index websites. A large part of Technical SEO has to do with site speed and performance: ensuring that pages load quickly and render correctly on any device or browser.

Common Technical SEO Issues We Fix

An SEO audit is typically required to properly diagnose these issues. However, to a well-trained eye, some are fairly obvious.

1. Pages Loading via standard HTTP (Non-HTTPS)

Mixed content occurs when the initial HTML is loaded over a secure HTTPS connection, but other resources (such as images, videos, stylesheets, or scripts) are loaded over an insecure HTTP connection. It is called mixed content because both secure (HTTPS) and insecure (HTTP) connections are used to load the same page. Pages can also end up being served over plain HTTP due to other SSL/TLS issues, such as an expired or misconfigured certificate.
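
For illustration, here is a minimal, hypothetical example (example.com is a placeholder domain): a page served over HTTPS that references an image over plain HTTP triggers a mixed content warning, while referencing the same image over HTTPS does not.

    <!-- Page served from https://example.com/ -->
    <img src="http://example.com/images/logo.png" alt="Logo">   <!-- mixed content: insecure resource -->
    <img src="https://example.com/images/logo.png" alt="Logo">  <!-- fixed: resource loaded over HTTPS -->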

2. Website crawls show multiple versions of the homepage

This is typically caused by the www and non-www versions, or the HTTP and HTTPS versions, of the homepage all returning 200 OK instead of redirecting (301) to a single canonical version.
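
A quick way to check this yourself (example.com is a placeholder): request each homepage variant and compare status codes. Ideally, one variant returns 200 and the other three return 301 redirects pointing to it.

    for url in http://example.com http://www.example.com https://example.com https://www.example.com; do
      curl -s -o /dev/null -w "%{http_code}  $url\n" "$url"
    done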

3. Indexation Problems

Most of the time, if your site has indexation issues, it's either an easy fix or a massive headache. A good scenario would be finding meta robots noindex tags that were set during development and never removed - that's a simple fix. For large sites, we may be looking at index bloat: the site has a ton of pages, but many are either not valuable or are deemed duplicates. It could also be that the site's robots.txt is accidentally blocking certain pages from being crawled. Ultimately, there are a lot of factors potentially at play here, and most technical SEO consultants will want to see an audit before issuing a diagnosis.
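
As a first pass (the URLs below are placeholders), a consultant might fetch a few key pages and look for a robots meta tag or X-Robots-Tag header that blocks indexing:

    for url in https://example.com/ https://example.com/products/; do
      echo "== $url"
      curl -s -D - "$url" | grep -i -E 'x-robots-tag|name="robots"'
    done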

4. XML Sitemap either doesn't exist or has problems

It's very easy to tell if a website doesn't have a sitemap - they are typically located at /sitemap.xml or /sitemap_index.xml. An XML sitemap is a file that provides search engines (and sometimes people) with information about the pages, videos, and other files on your site.

Sitemaps are read by search engines like Google to crawl websites efficiently and pick up important links, new content, or content they may have missed during previous crawls. Put simply, an XML sitemap tells search engines which webpages on a site are the most important. With this information, those pages can be crawled faster and search engine bots don't waste crawl budget on less relevant or lower-priority webpages.
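
For reference, a minimal XML sitemap might look like this (the URLs and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
      <url>
        <loc>https://example.com/services/technical-seo/</loc>
        <lastmod>2024-01-10</lastmod>
      </url>
    </urlset>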

5. Content gets mistakenly blocked in a robots.txt file

When a robots.txt configuration is the culprit of SEO problems, it is typically because a business user mistakenly blocked important content from being crawled (and ultimately from appearing in search results) while attempting to alleviate what is called index bloat.

A robots.txt file tells search engine crawlers which URLs they can access on your site. Large websites (such as e-commerce stores) typically disallow specific pages or irrelevant resources (such as scripts) to avoid overloading their server with requests when a bot crawls their pages. There is a common misconception that robots.txt is a reliable way to keep certain pages out of search results; in reality, a blocked URL can still be indexed if other pages link to it, and a noindex directive is the appropriate tool for that job.
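
As a sketch, a typical e-commerce robots.txt might look like the following (the disallowed paths are placeholders): crawlers are kept out of cart and checkout pages, everything else remains crawlable, and the sitemap location is declared.

    User-agent: *
    Disallow: /cart/
    Disallow: /checkout/

    Sitemap: https://example.com/sitemap.xml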

6. HTML meta robots tag is mistakenly set to noindex

Like many other Technical SEO issues, this one often happens because someone forgot to reverse a setting after deploying a website from a staging environment to production. The tag is usually set via a WordPress SEO plugin such as Yoast SEO, All-In-One SEO, or RankMath. These plugins have global configuration options that apply to page types or taxonomies, so we first go through those settings and make sure everything is configured appropriately.
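
For reference, the tag itself looks like this: the first line blocks indexing, while the second (or simply omitting the tag) allows it.

    <meta name="robots" content="noindex, nofollow">  <!-- page will not be indexed -->
    <meta name="robots" content="index, follow">      <!-- default behavior; the tag can also be omitted -->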

When a site is in development, you don't want it getting indexed, because users might find it and have a poor experience. New versions of a website that are in development also typically have similar, if not exactly duplicate, content to the current site in production. If a development site is indexed, search engines might flag it as duplicate content, which can hurt the live (in this case also canonical) website's organic visibility.

7. Website loads too slowly

The page speed of your website is an important factor when it comes to user experience (UX) and SEO. Google's Page Experience update made Core Web Vitals - metrics that measure loading, interactivity, and visual stability - a ranking factor. If your website's Core Web Vitals (CWV) are poor, pages that previously ranked high for various search terms may lose that ranking to pages with similar content that are better optimized.

For large websites (~10,000+ pages), slow speeds also cause fewer pages to be indexed, because search engine bots have a budget for the number of webpages they can crawl each day. Slow load times can also result in a higher bounce rate, meaning more people leave the site without engaging with any content or clicking on links.
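
One quick way to check Core Web Vitals from the command line is Google's PageSpeed Insights API (example.com is a placeholder; an API key is recommended for regular use):

    curl -s "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://example.com&strategy=mobile"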

Other Common Issues

  • Incorrect rel=canonical attributes (see the example after this list)
    • These indicate to search engines that the specified URL is the authoritative version (master copy) of a page
  • Duplicate content exists (very common)
  • Images missing alt text (insanely common)
  • Broken internal links
  • Pages contain invalid structured data (JSON-LD), or none at all
  • Poor mobile-friendliness (lack of responsive design) and/or poor performance on 3G networks
  • Improper technical internationalization (users are served an unoptimized version of a multilingual site)
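
As referenced above, a rel=canonical tag is a single line in the page's <head> pointing at the preferred URL (the URL below is a placeholder); every duplicate or parameterized variant of the page should carry the same tag.

    <link rel="canonical" href="https://example.com/services/technical-seo/">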

Common Situations That Require a Technical SEO Consultant

Website Migrations

A bad website migration can cost a business massive amounts of traffic due to pages not being redirected (or not being redirected properly). Any time the URL of an existing important page is changed, the old URL must be 301-redirected to the new one.

If this is not done before a new site is pushed live, requests for the old URL will return 404 Not Found. The only time a 404 HTTP status code should be returned is if a removed page has no replacement or logical equivalent.
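
As a sketch, a migration redirect map in an Apache .htaccess file might look like this (the paths are placeholders); each old URL gets a permanent (301) redirect to its new equivalent:

    Redirect 301 /old-services.html /services/
    Redirect 301 /blog/old-post/ /blog/new-post/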

Sudden Loss of Keyword Rankings

When a site loses its rankings out of nowhere, it is typically due to an algorithm update, technical issues like soft 404s (pages that redirect users to another page, such as the homepage, or return a 200 status instead of a proper 404 “not found”), or the removal or replacement of an XML sitemap.
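
A quick way to test for soft 404s (the URL is a placeholder): request a page that clearly should not exist and inspect the status code. A healthy site returns 404 (or 410); a 200, or a 301/302 to the homepage, suggests soft 404 handling.

    curl -s -o /dev/null -w "%{http_code}\n" https://example.com/this-page-should-not-exist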

It would be very unusual for on-page ranking factors such as title tags and H1s to cause such a drastic shift.

Whatever the cause, if your rankings have taken a significant hit out of nowhere, you should have an SEO audit conducted as soon as possible.

Internationalization

The hreflang HTML attribute is used to specify the language of a page on a multilingual site and, optionally, the geographic region it targets.

Used correctly, the hreflang attribute helps search engines such as Google serve users the appropriate version of a globally relevant website.
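
As a sketch, hreflang annotations for an English/German site might look like this in the <head> of each version (the URLs are placeholders); every language version lists itself and its alternates, with x-default as the fallback:

    <link rel="alternate" hreflang="en-us" href="https://example.com/en/">
    <link rel="alternate" hreflang="de-de" href="https://example.com/de/">
    <link rel="alternate" hreflang="x-default" href="https://example.com/">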

It is important to note that while Google and Yandex both use hreflang this way, Bing and Baidu do not; instead, they use the content-language HTML meta tag.

How to Identify a Qualified Technical SEO Consultant

A sure-fire way to spot a legitimate Technical SEO is whether they know how to work in a shell/terminal/command line. Most use Bash or Zsh (macOS/Linux) or PowerShell (Windows). However, there are absolutely exceptions.

Most qualified Technical SEOs will also identify as web developers. They might not be full-stack engineers, but they'll know the basics of front-end development (HTML, CSS/SCSS, and JavaScript) and will at least be able to read and understand server-side code (PHP/Node.js).

There’s no “I” in “Team”. An SEO that knows their stuff could absolutely work with a developer to successfully get the job done. If you work with an agency, you might not know who does your work. That’s just the reality, unfortunately. In those cases, we absolutely recommend educating yourself on the basics.

Don’t Wait Until It’s Too Late

Technical SEO should not be overlooked. While its importance has grown in recent years due to changes in how Google ranks pages, such as mobile-first indexing and Page Experience/Core Web Vitals, it often gets less attention because it is difficult to correlate with deal-breaker KPIs like traffic and revenue.

In 2014, Google released the Panda 4.0 update; as a result, eBay lost 80% of its organic rankings.

Have SEO Issues?

Register for Campaign Advisor and get Mike's opinion for free