SEO Consulting

When it comes to SEO, businesses have three options:

  1. Hire an SEO Consultant
  2. Hire an SEO agency
  3. Hire an in-house SEO team

Throughout my career, I have been a part of all three of these arrangements. I got my start at a digital marketing agency, which is where I learned that I had a knack for SEO and fell in love with it.

Later, I was hired by a startup to manage their SEO in-house and ultimately structured and led the SEO team. As the company grew I was promoted to be a product manager on the growth team, where I continued to steer SEO while developing areas of the product that would fuel continued organic traffic growth.

I left that role in 2020 and went out on my own as a freelance SEO consultant, conducting white-label SEO audits for a handful of SEO agencies and taking jobs on sites like Upwork. Before I knew it I had 10 clients, and with that I launched Ciffone Digital. I have been consulting full-time for the past four years.

Benefits of Working with an Independent SEO Consultant

The choice between hiring an SEO consultant and hiring an SEO agency is a big decision that largely depends on the stage of your business’s growth and your goals. However, over the past five years, there has been a shift in the market toward the consultant model. There are several reasons for this:

1. Specialized Expertise and Experience

SEO Consultants are usually highly talented in specific areas of SEO and are sought after for their knowledge in particular niches. For example, as a Technical SEO Specialist, I am highly sought after for my SEO Audits and other technically involved disciplines such as International SEO and Page Speed Optimization. Businesses come to me with specific goals and challenges that are best suited to be addressed by an expert.

2. Personalized Attention

A consultant is more likely to provide businesses with personalized service, develop a deeper understanding of an organization’s unique needs and goals, and be more flexible when it comes to strategy and execution. For example, I go out of my way to learn as much about a company as possible before landing on a specific SEO strategy so I can see the big picture: where the business is now and where it wants to be in five years. Businesses also get to work directly with me and/or the small team of specialists that I assemble for a project.

3. Cost-effectiveness and Flexibility

In my experience, the way you get ROI from SEO is by not wasting your budget. SEO is inherently iterative; we need to change course and focus until we find the right mix. Some months we’ll be executing, and in others there will be downtime while we monitor the effects of recent actions. SEO usually fails when businesses are paying for things they don’t need. Hourly or project-based pricing to address specific problems is much more efficient than paying a monthly retainer for a variety of services.

4. Direct Communication and Responsiveness

Direct communication is a significant benefit of working with a consultant. We can reach decisions fast without being hindered by bureaucracy and chain of command. The current SEO landscape is as volatile as ever, and being able to enact changes quickly is critical to success.

5. Approach to SEO

Consultants and agency owners or directors usually approach SEO in very different ways. On the agency side, they’re interested in tried-and-true methods and strategies that they can rinse and repeat over and over again; they depend on consistency at scale. Consultants, on the other hand, often adopt more innovative strategies because we have more time to experiment and test, and we are constantly trying to stay ahead of the curve because our work depends on it. In short, with a consultant, you’re more likely to get service that aligns with the forefront of SEO trends and techniques.

Best-in-class SEO Audits, Technical SEO, and Comprehensive Growth Strategy

For most businesses, the goal of Search Engine Optimization is for your website to rank on Google and be visible to potential clients and customers. Success, however, is not just about how well your website ranks on Google, but about how profitable your business is. ROI is key.

In short, SEO is not quick or easy, and there is no “one size fits all” solution. That’s why it’s important to have trust in the SEO consultant or agency that you choose. My top priority is not only to build a strong relationship and trust, but to also work with you to build the right SEO campaign to fit your business’s needs. The foundation for good SEO is always built on:

  • The Right Keywords
  • The Right Pages
  • The Right Content

Organic search is referred to as “organic” because organic results are meant to reflect the internet’s natural ebb and flow. Trends and algorithm updates come and go, but as long as you’re providing the best content that you can for your users, you’ll generally be safe.

A lot of SEO agencies rely on expensive software and tools to do their job for them; I do not. My approach involves fully understanding your business, determining your users’ true wants and needs, and recommending the ideal forms of content to satisfy them. I believe in doing SEO the right way, which means not taking shortcuts. When done properly, SEO can be one of the best ROI-generating investments a company can make.

Throughout my career I’ve built a reputation within the SEO community for my SEO auditing skills and knowledge of Technical SEO.

Mike is able to dive into tech SEO and uncover things often missed by a less experienced digital marketer. He goes further and sees the big picture which can make or break your campaign and results.

Yosef Silver

Digital Marketing Speaker, CEO of Fusion Inbound

On-Page and Technical SEO Services

SEO Audits

My approach to auditing a website is very straightforward: search for the weird and keep digging until the problem reveals itself. Ultimately, everything ends up connected. I conduct audits before each campaign by default, but I also offer them as standalone projects.

Technical SEO

Technical SEO can mean a lot of things these days, and it includes much of what we look at in a full SEO audit. The primary goal here is to uncover technical anomalies under the hood of your website that may be contributing to drops in rankings or traffic.

Keyword Research

Keyword research is the very core of SEO. I leverage a combination of intuition, experience, search volume data, and other relevant metrics when conducting my keyword research. These days it's more important to cover a topic holistically than to target exact-match keywords, and the right terms aren't always the most searched ones.

On-Page Optimization & UX

I blend tried-and-true on-page SEO best practices with a great overall user experience to make each page on your site as relevant as possible to the user. Spammy tactics like repeating exact-match keywords in your title tag or H1 will only hurt you; it's important to use natural language and fit in keywords where it's logical. My primary focus is providing genuine value to your users.
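For illustration, the difference looks something like this (the business, keywords, and titles below are entirely hypothetical):

```html
<!-- Keyword-stuffed title tag: repetitive exact-match keywords (what to avoid) -->
<title>Plumber Chicago | Chicago Plumber | Best Chicago Plumbers | Cheap Plumber</title>

<!-- Natural-language title that still targets the topic (hypothetical example) -->
<title>Chicago Plumbing Repair &amp; Installation | Example Plumbing Co.</title>
```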

Local SEO

Local SEO is a staple for restaurants, banks, coffee shops, and other brick-and-mortar businesses that need to be found on maps and in local organic results. There's more to local SEO than just setting up Google My Business and creating citations. You need to get involved in your community and make an impact.

JavaScript SEO

I help teams ensure that their JavaScript-powered experiences built on frameworks such as React and Angular are easy for search engines to crawl, render, and index. The way that I get involved is entirely up to your team and the way you work; I have professional experience as a Product Manager in an agile/scrum environment.
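As a rough illustration of what's at stake (a hypothetical client-side-rendered app; file paths are placeholders):

```html
<!-- Hypothetical initial HTML from a purely client-side React app: until
     JavaScript downloads and executes, this empty shell is all a crawler receives. -->
<div id="root"></div>
<script src="/static/js/main.js"></script>

<!-- With server-side rendering or pre-rendering, the same URL would deliver the
     actual headings, copy, and links in that first HTML response instead. -->
```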

Our Approach to Search Engine Optimization

I’m committed to doing SEO the right way, which means no shortcuts and no trying to game search engines. Real SEO requires getting the fundamentals right: patience, discipline, extreme curiosity, and constant testing and experimentation.

I focus on white hat tactics and providing value to your users through high quality, unique content and exceptional user experiences. My general philosophy is not to build links, but instead to earn them. I still want to build links, but we’ll mostly leverage Digital PR when we need to.

There was a time, roughly between 2008 and 2015, when it was relatively easy to rank a website on the first page for specific keywords by acquiring a lot of backlinks with its target keywords in the anchor text.

On Google, this led to many of the top results for a search earning their positions not on the merit of quality, but because they had been “optimized” by someone who understood how to “optimize” the site.

Over the past 5-6 years, Google has released a number of algorithm updates to ensure the quality of the results it ranks on the first page.

Nowadays, it takes a lot of hard work and effort to maintain good rankings. For a given Google search, the top results are generally the pages that provide the best and most useful content for what the searcher is looking for.

Sites must also be mobile-friendly, provide a good user experience (UX), and be optimized for page load performance and speed.

To understand where I see SEO heading in the future, check out my article: How to Think About Ranking on Google Search in the Future.

Technical SEO

Technical SEO is a discipline within Search Engine Optimization that focuses on ensuring that your website meets the technical requirements of modern search engines, which are constantly evolving and changing with time.

It focuses both on making sure your content is able to rank well and on understanding how search engine bots (e.g. Googlebot) crawl, render, and index websites; rendering is the process by which a browser or bot executes a page's code to produce the final content. A large part of Technical SEO also has to do with site speed and performance, ensuring that pages load quickly when accessed from any device or browser.

Common Technical SEO Issues I Fix

An SEO audit is typically required to properly diagnose these issues. However, to a well-trained eye some are fairly obvious.

1. Pages Loading via standard HTTP (Non-HTTPS)

Mixed content occurs when the initial HTML is loaded over a secure HTTPS connection, but other resources (such as images, videos, stylesheets, and scripts) are loaded over an insecure HTTP connection. It's called mixed content because both secure (HTTPS) and unsecured (HTTP) connections were used to load the page. It can also occur due to other SSL-related issues.
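For example (example.com and the file paths are placeholders), the insecure reference and its fix look like this, along with one possible stopgap:

```html
<!-- On a page served over HTTPS: -->
<img src="http://example.com/images/hero.jpg" alt="Hero image">  <!-- insecure: loaded over HTTP -->
<img src="https://example.com/images/hero.jpg" alt="Hero image"> <!-- fixed: loaded over HTTPS -->

<!-- One possible stopgap while templates are cleaned up: ask browsers to upgrade
     insecure requests automatically (a sketch, not a substitute for fixing the URLs) -->
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
```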

2. Website crawls show multiple versions of the homepage

This is typically caused by both the www and non-www versions, or both the http and https versions, returning 200 OK instead of one permanently redirecting to the other.
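A typical fix is a single set of 301 redirects at the server level so only one version answers with 200 OK. A rough nginx sketch, assuming https://example.com is the canonical version (adapt to your own stack):

```nginx
# Redirect all HTTP traffic, and the www hostname, to the single canonical origin
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}

server {
    listen 443 ssl;
    server_name www.example.com;
    # ssl_certificate / ssl_certificate_key directives omitted for brevity
    return 301 https://example.com$request_uri;
}
```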

3. Search Engines are not indexing content

Most of the time, if your site has indexation issues, it's either an easy fix or a massive headache. A really good scenario is that we're looking at meta robots noindex tags that were set during development and never removed; that's a simple fix. For large sites, the problem may be index bloat: the site has a ton of pages, but many are either not valuable or are deemed duplicates. It could also be that the site's robots.txt is accidentally blocking certain pages from being crawled. Ultimately, there are a lot of factors potentially at play here, and most technical SEO consultants will want to see an audit before issuing a diagnosis.

4. XML Sitemap either doesn't exist or has problems

It's very easy to tell if a website doesn't have a sitemap; they are typically located at /sitemap.xml or /sitemap_index.xml. An XML sitemap is a file that gives search engines (and sometimes people) information about the pages, videos, and other files on your site.

Sitemaps are read by search engines like Google to efficiently crawl websites for important links, new content, or content they may have missed during a previous crawl. Put simply, an XML sitemap tells search engines which webpages on a site are the most important. With this information, those pages can be crawled faster, and search engine bots don't waste crawl budget on less relevant or lower-priority pages.
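In its simplest form, a sitemap is just a short XML file listing URLs (the URLs and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/technical-seo/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```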

5. Content gets mistakenly blocked in a robots.txt file

When a robots.txt configuration is the culprit behind SEO problems, it is typically because a business user mistakenly blocked important content from being crawled while attempting to alleviate what is called index bloat.

A robots.txt file tells search engine crawlers which URLs they can access on your site. Large websites (such as e-commerce sites) typically disallow specific pages or irrelevant resources (such as scripts) to avoid overloading their server with requests when a bot is crawling. There is a common misconception that robots.txt is a valid method for keeping certain pages out of search results; in reality, a blocked URL can still be indexed if other pages link to it, and a noindex directive is the correct tool for that job.
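Here's a sketch of the kind of mistake I mean, with a safer alternative (the paths and parameters are illustrative):

```text
# The mistake: this one line blocks /products/ and everything beneath it,
# not just the filtered URLs it was meant to catch.
#
#   User-agent: *
#   Disallow: /products
#
# A safer version scoped to the faceted/filtered URLs causing index bloat:

User-agent: *
Disallow: /*?filter=
Disallow: /*?sort=

Sitemap: https://example.com/sitemap.xml
```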

6. HTML Meta Robots tag is mistakenly set to noindex

Like many other Technical SEO issues, this one often happens because someone forgot to reverse a setting after deploying a website from a staging environment to production. The tag is usually set via a WordPress SEO plugin such as Yoast SEO, All-In-One SEO, or RankMath. These plugins have global configuration options that apply to page types or taxonomies, so we first go through those settings and make sure everything is configured appropriately.

When a site is in development, you don't want it getting indexed, because users might find it and have a poor experience. New versions of a website that are in development also typically have similar, if not identical, content to the current production site. If a development site is indexed, search engines might flag it as duplicate content, which can hurt the live (and in this case also canonical) website's organic visibility.
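The tag itself is a single line in the page's head (the same directive can also be sent as an X-Robots-Tag HTTP header); the problem is simply forgetting to remove it at launch:

```html
<!-- Left over from staging: this single line keeps the page out of search results entirely -->
<meta name="robots" content="noindex, nofollow">

<!-- What a production page should typically carry (or no robots meta tag at all) -->
<meta name="robots" content="index, follow">
```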

7. Website loads too slowly and/or has poor Web Vitals

The page speed of your website is an important factor when it comes to user experience (UX) and SEO. Core Web Vitals are a set of metrics used to benchmark a site's real-world performance, and Google's Page Experience update made passing scores a tie-breaking ranking factor. If your scores are poor, pages that previously ranked well for various search terms may lose ground to pages with similar content that are more likely to provide users with a good experience.

For large websites (~10,000+ pages), slow speeds also cause fewer pages to be indexed, because search engine bots have a budget for the number of webpages they can crawl each day. Slow load times generally result in a poor user experience as well: a slow website tends to have a higher bounce rate, meaning more people leave the site without engaging with any content or clicking on links.
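A few of the low-effort, template-level improvements I commonly recommend look like this (file names are placeholders; the right fixes depend on what an audit actually finds):

```html
<!-- Reserve space for images so the layout doesn't shift as they load (helps CLS) -->
<img src="/images/hero.jpg" alt="Hero" width="1200" height="630">

<!-- Lazy-load below-the-fold images; preload the one image that is the LCP element -->
<img src="/images/footer-banner.jpg" alt="Banner" loading="lazy" width="800" height="200">
<link rel="preload" as="image" href="/images/hero.jpg">

<!-- Defer non-critical JavaScript so it doesn't block rendering -->
<script src="/js/analytics.js" defer></script>
```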

8. Incorrect rel=canonical attributes

rel=canonical indicates to search engines that the specified URL is the authoritative (master) copy of a page. We often use it to prevent duplicate content issues or in complex internationalization (hreflang) configurations. Cross-domain canonicals are often used to attribute syndicated content or as an annotation on widely distributed press releases.
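In practice it's one line in the head of each duplicate or syndicated version (URLs are placeholders):

```html
<!-- On a filtered or otherwise duplicate URL, pointing at the authoritative version -->
<link rel="canonical" href="https://example.com/widgets/">

<!-- Cross-domain: a syndicated article crediting the original publisher -->
<link rel="canonical" href="https://original-publisher.example/article-title/">
```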

Other Common Issues

  • Duplicate content exists (very common)
  • Images missing alt text (insanely common)
  • Broken internal links
  • Pages contain invalid structured data (JSON-LD) or lack it entirely (see the example after this list)
  • Lack of mobile-friendly optimization (responsive design) and/or poor performance on 3G networks
  • Improper technical internationalization (users are served an unoptimized version of a multilingual site)
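As an example of the structured data item above, a minimal JSON-LD block for a hypothetical local business might look like this (every name, number, and address below is a placeholder):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Roasters",
  "url": "https://example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Chicago",
    "addressRegion": "IL",
    "postalCode": "60601"
  }
}
</script>
```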

Common Situations That Require a Technical SEO Consultant

Website Migrations

A bad website migration can cost a business massive amounts of traffic due to pages not being redirected (or not being redirected properly). Any time the URL of an existing important page is changed, the old URL must be redirected to the new one.

If this is not done before a new site is pushed live, requests for the old URL will return 404 Not Found. The only time a 404 HTTP status code should be returned is if a removed page has no replacement or logical equivalent.
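The redirect map itself is usually simple to implement; a sketch for an Apache server might look like this (URLs are placeholders, and nginx or your CMS will have equivalents):

```apache
# One permanent (301) redirect per changed URL, mapped old -> new before launch
Redirect 301 /old-services-page/        https://example.com/services/
Redirect 301 /blog/2019/old-post-slug/  https://example.com/blog/new-post-slug/

# Never blanket-redirect everything to the homepage; removed pages with no
# logical equivalent should simply return 404 (or 410).
```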

Random Loss of Keyword Rankings

When a site loses its rankings out of nowhere, it is typically due to an algorithm update, technical issues like soft 404s (for example, redirecting users to another page, such as the homepage, instead of returning a 404 “not found” response), or the removal or replacement of an XML sitemap.

It would be very unusual for on-page ranking factors such as title tags and H1s to cause such a drastic shift.

In any case, if your rankings have taken a significant hit out of nowhere, you should have an SEO audit conducted as soon as possible.

Internationalization

The hreflang HTML attribute is used to specify the language of a page on a multilingual site, and it can also target specific geographic locations.

Used this way, the hreflang attribute helps search engines, such as Google, serve users the appropriate version of a globally relevant website.

It's important to note that while Google and Yandex both use hreflang this way, Bing and Baidu do not; instead, they use the content-language HTML meta tag.
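For example, assuming a page published in US English, UK English, and German (the URLs are placeholders), each version would cross-reference the full set in its head, with a content-language meta tag as the fallback signal for engines that rely on it:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/pricing/">
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/pricing/">
<link rel="alternate" hreflang="de" href="https://example.com/de/pricing/">
<link rel="alternate" hreflang="x-default" href="https://example.com/pricing/">

<!-- Fallback signal for engines that read the content-language meta tag instead -->
<meta http-equiv="content-language" content="en-us">
```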

Improving Website Performance (Page Speed)

There are a lot of ways to improve the performance of a website, but not all of them are reliable. Improving performance is often a headache for businesses because fixing many of the biggest problems requires paying a developer to reconfigure much of how the website behaves.

In turn, many in-house SEOs and agencies turn to caching plugins like NitroPack or WP Rocket. These primarily provide forms of web caching, along with automation of other performance optimizations.

In our opinion, it’s not a good idea to rely on WordPress plugins to be responsible for core functionality of a website. Why? There are a handful of reasons:

  • Plugin conflicts can cause other parts of a website to break
  • WordPress updates are often neglected, and outdated plugins are common attack vectors in successful security breaches
  • Most plugins that do this much are paid, or require a premium version to unlock the key features; an outdated payment method and/or an expired license can lead to unexpected downtime and lost income

So while the aforementioned plugins can work out, it is more reliable to have web caching configured at the server level by a DevOps Engineer, and performance optimizations implemented by a professional Web Developer or Technical SEO. At Ciffone Digital, we prefer Varnish Cache – it comes installed by default on our custom web hosting.

Check out my WordPress Varnish Cache guide.
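As a rough sketch of what server-level caching involves (the backend port and cookie rule below are assumptions; the guide above goes deeper), a minimal Varnish VCL for a WordPress site might start like this:

```vcl
vcl 4.1;

# The web server Varnish sits in front of (assumed to be on the same machine)
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Never serve cached pages for the WordPress admin or to logged-in users
    if (req.url ~ "^/wp-(admin|login)" || req.http.Cookie ~ "wordpress_logged_in") {
        return (pass);
    }
}
```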

Don’t Wait Until It’s Too Late

Technical SEO should not be overlooked. While its importance has grown in recent years due to changes in how Google ranks pages, such as mobile-first indexing and Page Experience/Core Web Vitals, it often gets less attention because it is difficult to correlate with deal-breaker KPIs like traffic and revenue. In 2014, Google released the Panda 4.0 update; as a result, eBay lost roughly 80% of its organic rankings.

Have SEO Issues?

Subscribe to Campaign Advisor and get Mike's opinion for free