For most businesses, the goal of Search Engine Optimization is for your website to rank on Google and be visible to potential clients and customers. Success, however, is not just about how well your website ranks on Google, but how profitable your business is.
In short, SEO is not quick or easy, and there is no “one size fits all” solution. That’s why it’s important to have trust in the SEO agency you choose. At Ciffone Digital, our top priority is not only to build a strong relationship and trust, but to also work with you to build the right SEO campaign to fit your business’s needs. The foundation for good SEO is always built on:
The Right Keywords
The Right Pages
The Right Content
Organic search is referred to as “organic” because organic results are meant to reflect the internet’s natural ebb and flow. Trends and algorithm updates come and go, but with our team on your side you’ll always be safe. Our SEO team will ensure that you’re providing the best content that you can for your users.
A lot of SEO agencies rely on expensive software and tools to do their job for them; we do not. Our approach involves fully understanding your business, determining your users' true wants and needs, and recommending the ideal forms of content to satisfy them. We believe in doing SEO the right way, which is not by taking shortcuts. When done properly, SEO can be one of the best ROI-generating investments a company can make.
Our founder, Mike Ciffone, is known within the SEO community for his SEO auditing ability and knowledge of Technical SEO.
Mike is able to dive into tech SEO and uncover things often missed by less experienced digital marketers. He goes further and sees the big picture, which can make or break your campaign and results.
Digital Marketing Speaker, CEO of Fusion Inbound
On-Page and Technical SEO Services
Our approach to auditing a website is very straightforward – search for the weird and keep digging until the problem reveals itself. Ultimately, everything ends up connected. We conduct audits before each campaign by default, but also on an individual basis.
Technical SEO can mean a lot of things these days. It includes much of what we look at in our full SEO audits. The primary goal here is to uncover technical anomalies that exist under the hood of your website that may be contributing factors to the cause of drops in rankings or traffic.
Keyword research is the very core of SEO. It sets the stage for the show. We leverage a combination of intuition, experience, search volume data, and other relevant metrics when conducting our keyword research. Our data comes from Mangools. It's important to understand that the right keywords aren't always the most searched ones.
On-Page Optimization & UX
Our team blends tried-and-true on-page SEO best practices with a great overall user experience to make each page on your site as relevant as possible to the user. Spammy tactics like repeating exact-match keywords in your title tag or H1 will only hurt you – use natural language and fit in keywords where it's logical. Our primary focus is providing genuine value to your users.
Local SEO is a staple for restaurants, banks, coffee shops, and other brick and mortar businesses that need to be found on maps and local organic results. There's more to Local SEO than just setting up Google My Business and creating citations. Our team will help you get involved in your community and make an impact.
Our Approach to Search Engine Optimization
At Ciffone Digital, we’re committed to doing SEO the proper way, which is not by taking shortcuts or trying to game search engines. Real SEO requires patience, discipline, extreme curiosity and constant testing and experimentation.
We condemn any and all violations of Google’s Guidelines (aka black hat SEO) and spammy tactics. Our SEO strategy is white hat and focused on providing value to your users through high-quality, unique content and exceptional user experiences. Our philosophy is not to build links, but to earn them. If your content is truly great, you shouldn’t have to build any links; people will simply link to it.
There was a time, roughly between 2008 and 2015, when it was relatively easy to rank a website on the first page for specific keywords by acquiring a lot of backlinks with the target keywords in the anchor text.
On Google, this led to many of the top results for a search earning their position not on the merit of their quality, but because someone who understood the algorithm had “optimized” them.
Over the past 5-6 years, Google has released a number of algorithm updates to ensure the quality of the results it ranks on the first page.
Nowadays, it requires a lot of hard work and effort to maintain good rankings. For a given Google search, the top results are always the pages that provide the best and most useful content relevant to what the searcher is looking for.
Sites must also be mobile friendly, provide a good User Experience (UX) and be optimized for page load performance and speed.
Mike talks about the current state of SEO and more in his blog post, How to Think About Ranking on Google Search in 2022 and Beyond.
Technical SEO is a discipline within Search Engine Optimization that focuses on ensuring that your website meets the technical requirements of modern search engines, which are constantly evolving and changing with time.
It focuses both on making sure content is optimized to rank well and on understanding how search engine bots (e.g. Googlebot) crawl and index websites. A large part of Technical SEO has to do with site speed and performance: ensuring that pages load quickly and render correctly when accessed from any device or browser.
Pages Loading via standard HTTP (Non-HTTPS)
Mixed content occurs when the initial HTML is loaded over a secure HTTPS connection, but other resources (such as images, videos, stylesheets, or scripts) are loaded over an insecure HTTP connection. It is called mixed content because both secure (HTTPS) and insecure (HTTP) connections were used to load the page. It can also occur due to other SSL-related issues.
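As a simplified illustration (using a hypothetical example.com), here is how mixed content can appear in a page's HTML:

```html
<!-- The page itself is served over HTTPS -->

<!-- Secure: loaded over the same protocol as the page -->
<img src="https://example.com/images/logo.png" alt="Company logo">

<!-- Mixed content: an insecure HTTP resource on a secure page -->
<img src="http://example.com/images/banner.jpg" alt="Promo banner">

<!-- A root-relative URL inherits the page's protocol and avoids the issue -->
<img src="/images/banner.jpg" alt="Promo banner">
```

Modern browsers either warn about or outright block mixed content, so insecure resources may simply fail to load for users.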
Website crawls show multiple versions of the homepage
This is typically caused by the www and non-www versions, or the http and https versions, of the homepage all returning 200 OK instead of redirecting to a single canonical URL.
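As a sketch (assuming an nginx server and a hypothetical example.com), consolidating those variants usually means 301-redirecting everything to one canonical origin:

```nginx
# Redirect all http:// traffic (www and non-www) to the canonical HTTPS host
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}

# Redirect the https:// www variant as well
server {
    listen 443 ssl;
    server_name www.example.com;
    # ssl_certificate / ssl_certificate_key directives omitted for brevity
    return 301 https://example.com$request_uri;
}
```

After this, only one version of the homepage returns 200 OK; every other variant answers with a 301 pointing at it.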
Search Engines are not indexing content
Most of the time, if your site has indexation issues, it's either an easy fix or a massive headache. A good scenario is finding meta robots noindex tags that were set during development and never removed – a simple fix. On large sites we may be looking at index bloat: the site has a ton of pages, but many are either low-value or deemed duplicate. It could also be that the site's robots.txt is accidentally blocking certain pages from being crawled. Ultimately, there are a lot of factors potentially at play here, and most technical SEO consultants will want to conduct an audit before issuing a diagnosis.
XML Sitemap either doesn't exist or has problems
It's very easy to tell if a website doesn't have a sitemap - they are typically located at /sitemap.xml or /sitemap_index.xml. An XML sitemap is a file that provides search engines and sometimes people information about the pages, videos, and other files on your site.
Sitemaps are then read by search engines like Google to efficiently crawl through websites for any important links, new content, or content they may have missed during their previous indexing process. Put simply, an XML sitemap informs search engines about which webpages on a site are the most important. With this information webpages can be crawled faster and search engine bots are not wasting crawl budget on less relevant or lower priority webpages.
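For reference, a minimal XML sitemap (the URLs below are hypothetical) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/seo/</loc>
    <lastmod>2022-05-15</lastmod>
  </url>
</urlset>
```

The file is then referenced in robots.txt or submitted directly through Google Search Console.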
Content gets mistakenly blocked in a robots.txt file
When a robots.txt configuration is the culprit of SEO problems, it is typically because a business user mistakenly blocked important content from appearing in search results while attempting to alleviate what is called index bloat.
A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. Large websites (such as e-commerce sites) typically disallow specific pages or irrelevant resources (such as scripts) to avoid overloading their server with requests when a bot is crawling their pages. There is a common misconception that the robots.txt file is a valid method to keep certain pages out of search results – it is not; a disallowed URL can still end up indexed if other pages link to it.
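You can check whether a given robots.txt rule blocks a URL using Python's standard library. A quick sketch, with a hypothetical robots.txt and domain:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks the blog section
robots_txt = """
User-agent: *
Disallow: /blog/
Disallow: /cart/
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# An important content page is now uncrawlable:
print(parser.can_fetch("*", "https://example.com/blog/seo-guide/"))  # False

# Pages outside the disallowed paths are still crawlable:
print(parser.can_fetch("*", "https://example.com/about/"))  # True
```

Running a script like this against a list of important URLs is a fast way to confirm whether robots.txt is the culprit before digging deeper.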
HTML Meta Robots tag is mistakenly set to no-index
Like many other Technical SEO issues, this one often happens because someone forgot to reverse a setting after deploying a website from a staging environment to production. The tag is usually set via a WordPress SEO plugin such as Yoast SEO, All-In-One SEO, or RankMath. These plugins have global configuration options that apply to entire page types or taxonomies, so we first go through those settings and make sure everything is configured appropriately.
When a site is in development you don't want it getting indexed, because users might find it and have a poor experience. New versions of a website that are in development also typically have similar, if not identical, content to the current production site. If a development site is indexed, search engines might flag it as duplicate content, which can hurt the live (in this case also canonical) website's organic visibility.
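The tag itself lives in the page's head element. A typical staging-versus-production mistake looks like this (illustrative markup only):

```html
<!-- Staging: keeps the in-development site out of the index -->
<meta name="robots" content="noindex, nofollow">

<!-- Production: the tag should be removed entirely, or set to the default -->
<meta name="robots" content="index, follow">
```

If the staging tag survives the deployment, search engines will quietly drop the live pages from their indexes.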
Website loads too slowly and/or has poor Web Vitals
The page speed of your website is an important factor for both user experience (UX) and SEO. Google's Page Experience update made Core Web Vitals – a set of metrics used to benchmark a site's real-world performance – a tie-breaking ranking factor. If your scores are poor, pages that previously ranked well for various search terms may lose ground to pages with similar content that are more likely to provide users with a good experience.
For large websites (~10,000+ pages), slow speeds also cause fewer pages to be indexed, because search engine bots have a budget for the number of webpages they can crawl each day. Slow load times also generally result in poor user experience: a slow website tends to see a higher bounce rate, meaning more people leave the site without engaging with any content or clicking any links.
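Two of the simplest front-end improvements (illustrative markup only) are reserving image dimensions to avoid layout shift, which affects the CLS metric, and lazy-loading below-the-fold images:

```html
<!-- Explicit width/height reserve space and prevent layout shift (CLS) -->
<img src="/images/hero.jpg" width="1200" height="600" alt="Hero image">

<!-- Below-the-fold images can be lazy-loaded natively by the browser -->
<img src="/images/footer-map.png" width="600" height="400"
     alt="Office location map" loading="lazy">
```

Fixes like these cost nothing at the server level, though larger gains usually require caching and asset optimization.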
Incorrect rel=canonical attributes
rel=canonical indicates to search engines that the specified URL is the authoritative (master) copy of a page. We often use it to prevent duplicate content issues, or in complex internationalization (hreflang) configurations. Cross-domain canonicals are often used to attribute syndicated content, or as an annotation on widely distributed press releases.
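For example (hypothetical URLs), a filtered product listing can point back to its master copy like so:

```html
<!-- Placed in the <head> of https://example.com/shoes/?color=red&sort=price -->
<link rel="canonical" href="https://example.com/shoes/" />
```

Every parameterized variant of the listing then consolidates its ranking signals onto the one canonical URL.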
Other Common Issues
- Duplicate content exists (very common)
- Images missing alt attributes (insanely common)
- Broken internal links
- Pages contain invalid structured data (JSON-LD), or lack it entirely
- Poor mobile-friendliness (lack of responsive design) and/or poor performance on 3G networks
- Improper Technical Internationalization (users experience unoptimized version of a multilingual site)
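To illustrate the structured data item above, a minimal JSON-LD block for a hypothetical local business might look like:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Shop",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Chicago",
    "addressRegion": "IL"
  }
}
</script>
```

Valid structured data makes pages eligible for rich results; invalid markup is simply ignored, or in some cases flagged in Google Search Console.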
Common Situations That Require a Technical SEO Consultant
A bad website migration can cost a business massive amounts of traffic due to pages not being redirected, or not being redirected properly. Any time the URL of an existing important page is changed, the old URL must be redirected to the new one.
If this is not done before a new site is pushed live, requests for the old URL will return 404 Not Found. The only time a 404 HTTP status code should be returned is if a removed page has no replacement or logical equivalent.
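In practice (nginx shown here; the equivalent can be done in Apache via .htaccess), migration redirects are simple one-to-one 301 rules – the URLs below are hypothetical:

```nginx
# 301-redirect old URLs to their replacements before the new site goes live
location = /our-seo-services/ {
    return 301 /services/seo/;
}
location = /blog/old-post-slug/ {
    return 301 /blog/new-post-slug/;
}
```

Building this redirect map from a full crawl of the old site, before launch, is what prevents the post-migration traffic cliff.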
Random Loss of Keyword Rankings
When a site loses its rankings out of nowhere, it is typically due to an algorithm update, a technical issue like soft 404s (redirecting users to another page, such as the homepage, instead of returning a 404 “not found” status), or the removal or replacement of an XML sitemap.
It would be very unusual for on-page rankings factors such as Title Tags and H1s to cause such a drastic shift.
In any case, if your rankings have taken a significant hit out of nowhere, you should have an SEO audit conducted as soon as possible.
The hreflang HTML attribute is used to specify the language of a page on a multilingual site, and also target specific geographic locations.
Using the hreflang attribute can help search engines, such as Google, serve users the appropriate version of a globally relevant website.
It's important to note that while Google and Yandex both use hreflang this way, Bing and Baidu do not; they rely on the content-language HTML tag instead.
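As an illustration with hypothetical URLs, hreflang annotations are typically placed in the head of every language version of a page:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
<link rel="alternate" hreflang="de-de" href="https://example.com/de-de/" />
<!-- Fallback for users whose language/region isn't covered -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />

<!-- Bing and Baidu look at the content-language tag instead -->
<meta http-equiv="content-language" content="en-us">
```

Each language version must list every alternate, including itself, or the annotations may be ignored.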
Improving Website Performance (Page Speed)
There are a lot of ways to improve the performance of a website, but not all of them are reliable. Improving performance is often a headache for businesses because fixing many of the biggest problems requires paying a developer to re-engineer much of how the website behaves.
In our opinion, it’s not a good idea to rely on WordPress plugins to be responsible for core functionality of a website. Why? There are a handful of reasons:
- Plugin conflicts can cause other parts of a website to break
- WordPress updates are often neglected, and outdated plugins are common attack vectors in successful security breaches
- Most plugins that do a lot are paid, or require a premium version to unlock the key features. An outdated payment method and/or an expired license can lead to unexpected downtime and lost income.
So while the aforementioned plugins can work out, it is more reliable to have web caching configured at the server level by a DevOps Engineer, and performance optimizations implemented by a professional Web Developer or Technical SEO. At Ciffone Digital, we prefer Varnish Cache – it comes installed by default on our custom web hosting.
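To give a flavor of what server-level caching looks like (a minimal sketch, not our production configuration), a Varnish VCL file that caches static assets might start like this:

```vcl
vcl 4.1;

# Varnish sits in front of the web server and caches its responses
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_backend_response {
    # Cache static assets for a day; everything else keeps its default TTL
    if (bereq.url ~ "\.(css|js|png|jpg|woff2)$") {
        set beresp.ttl = 1d;
    }
}
```

Because the cache answers repeat requests directly, the origin server (and WordPress itself) is hit far less often.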
If you’re an aspiring Tech SEO, we highly recommend checking out our Varnish WordPress guide.
How to Identify a Qualified Technical SEO Consultant
A sure-fire way to spot a legitimate Technical SEO is whether they know how to navigate and interact with a shell/terminal/command line. Most probably use Bash or Zsh (macOS/Linux) or PowerShell (Windows). However, there are absolutely exceptions.
There’s no “I” in “Team”. An SEO that knows their stuff could absolutely work with a developer to successfully get the job done. If you work with an agency, you might not know who does your work. That’s just the reality, unfortunately. In those cases, we absolutely recommend educating yourself on the basics.
Don’t Wait Until It’s Too Late
Technical SEO should not be overlooked. While its importance has grown in recent years due to changes in how Google ranks pages, such as mobile-first indexing and Page Experience/Core Web Vitals, it often gets less attention because it is difficult to correlate with deal-breaker KPIs like traffic and revenue. The stakes are real: in 2014 Google released the Panda 4.0 update, and as a result eBay lost 80% of its organic rankings.
Mike’s Latest Appearances
Listen to Mike’s Episode on Spotify