Brodie Cornett

Brodie is an SEO Associate and account lead at Sagapixel. He specializes in on-page SEO and content marketing.

Why Is My SEO Not Working: 3 Areas for SEO Improvements


Why Is My SEO Not Working?

So, you’ve started working SEO into your business marketing.

You may have hired an SEO company or you may be trying to DIY it, but you’re not seeing the needle move on your organic traffic. Now you’re asking yourself, “why isn’t my SEO working?”

Well, to be honest, the simple answer is that either (a) you haven’t been working on your SEO long enough or (b) you don’t have someone good working on it.

The more complex answer is that there are loads of factors to consider when trying to see why a website simply isn’t getting organic traffic. From your technical SEO to your content to your backlinks, tons of things can affect the success of your SEO strategy.

So, let’s look at a few things that might explain why your SEO isn’t working and what you can do to get it on the right track.

This guide covers a number of common scenarios that prevent websites from ranking well on Google. If you’d like to speak with an SEO associate at Sagapixel and have them take a look at your website, click here.

1. Improve Your Content

Let’s start with the most common reason why your SEO isn’t working: bad content.

Not every article, landing page, or post is gonna be a home run; that’s just the facts. But if all your pages suffer from poorly planned, poorly written, or unoptimized content, your website is gonna have a hard time ranking in Google.

Here are 5 reasons why your content may be preventing your SEO from working.

Your Website Has Thin Content

This is a big one because it’s the only mistake that could result in a manual penalty.

Basically, if you’re writing content that doesn’t fully address a query or only touches on the topic at the most superficial level, then you might have a thin content problem.

But, what is thin content?

Well, that depends on your niche. And you need to factor in how complicated the query you’re targeting is.

For example, you don’t need to write a 3,000-word article for the search “how far from New York to Los Angeles.” That has an empirical answer. But “what can you do on a drive from New York to Los Angeles” should probably be longer than a few hundred words.

The main point here is to be as thorough as the topic requires. Answer all the questions about the topic (if there are any) and write as much as you need to cover the topic in its entirety.

Your Content Is Keyword-Stuffed

With all that said about thin content, also make sure your content isn’t keyword-stuffed.

While this doesn’t carry the weight of a potential manual penalty, it still harms your SEO efforts. So, you need to avoid it when creating content for your chosen topic.

What is keyword stuffing?

Basically, keyword stuffing is an unnatural method of writing where you cram your keyword in wherever you want with reckless abandon. This content does not read well and Google’s bots devalue content that suffers from over-optimization. Additionally, keyword-stuffed content reads oddly to your potential customers, often resulting in them getting a bad impression of you and your business and leaving your website.

For example, a piece of keyword-stuffed content for a glassware store might read something like:

    “We have the glassware you need to fill out your glassware cabinet. If you’re looking for custom glassware, we’re your number one place for personalized glassware online. Get your unique glassware here!”

Now, this is awful and I wish it were an exaggeration. But there are plenty of websites out there with content written just like this.

Yes, you need to work keywords into your content. As much as Google says their web crawlers can usually pick them up through the context alone, it’s still a good practice to explicitly state your target keyword in your content.

Just don’t go overboard.

Sprinkle it in. And, if you’re unsure how your content sounds, read it out loud. I can’t tell you how many times I’ve written something that sounded absurd when I actually said it.

Fixing keyword-stuffed content is pretty easy. Just go through your content and remove awkward uses of your target word or phrase, or swap in a variant so you still address the topic without sounding repetitive. The glassware copy above, for example, could be trimmed down to something like “We carry custom glassware to fill out your cabinet, and every piece can be personalized online.”

Your Website Contains Duplicate Content

Duplicate content is content that you duplicate across your site.

In all seriousness, this happens when you write two pages that cover the same, or nearly the same, content. And, in the most extreme cases, you’ll have word-for-word duplicates of sentences and paragraphs.

Now, duplicate content on your own site only hurts so much. The main issue here is that Google might not know which article to rank. In the end, neither one might rank because Google can’t make a decision between them.

This often occurs with service-area businesses that put up dozens of the same page for each of the cities they wish to rank in and simply swap out the name of the town. While this is unlikely to result in any sort of penalty, it will often result in Google simply ignoring these pages, as happened with one of our new clients and their previous SEO agency.

The best thing to do in this case is to either have 100% unique content for each of these pages or consolidate all of your content into a single, awesome article or page.

On the other hand, if you have identical content to a page you don’t own, well, that’s plagiarism and is a bit more serious.

Plagiarism can bring consequences ranging from a lawsuit to a Digital Millennium Copyright Act (DMCA) takedown request, which could result in the removal of your content from both Google and your hosting provider.

So, make sure you’re not duplicating content from another source without proper citation.

Your Content Doesn’t Match User Intent

The next aspect of creating great content is matching user intent.

You need to research what Google is already ranking for a given query so you can write content that aligns with what searchers are looking for.

Your SEO might not be working because you’re not matching the search intent for your content.

This gets a bit nebulous. After all, you can’t guarantee you know what any given person’s search intent is. But you can get an idea by looking up your target topic.

What kinds of articles and pages rank?

Are they listicles, landing pages, comparison pages, or educational pieces? Do they come from retail, wholesale, or consumer sites? Will you be competing with .gov or .edu sites?

For example, if you Google “best SEO companies in Philadelphia,” you’ll find that almost all of the top results are review websites comparing different companies.

Trying to rank a single service page will be exceedingly difficult, and even if you are successful, most of the people searching won’t click on your result anyway, since they want to read reviews and compare companies.

Identifying the intent for any given topic guides your content strategy on whether that topic is worth writing about.

E-A-T

The final aspect to consider when creating content (and why your SEO might fall flat) is what Google’s Quality Rater Guidelines calls E-A-T: expertise, authority, and trustworthiness.

You want your content to demonstrate:

  • Expertise: You are an expert in your niche with great knowledge or experience
  • Authority: You have a good reputation with professionals in your industry
  • Trustworthiness: You are a legitimate business or service with accurate information

If you own or work for a Your Money or Your Life (YMYL) type site like anything in the medical or legal field, E-A-T is a major aspect to consider in your content strategy.

Now, E-A-T isn’t a direct ranking factor. But it’s a useful framework for improving your content. There’s some debate about how actionable it is, though.

On the one hand, E-A-T is nebulous: it doesn’t really tell you how to improve your site’s expertise, authority, or trustworthiness. Without any sort of guidance or action plan, you don’t have much to work with.

On the other hand, E-A-T pushes sites to be better. This means ensuring factual accuracy, building trust with searchers through secure connections and transparent business practices, and demonstrating your knowledge in your niche.

Measuring E-A-T is hard because, well, there is no score tied to it. But, some best practices include:

  • Building links from legitimate, in-niche sites
  • Hiring experts to write and review your content. Google is getting really good at determining the level of knowledge of an author; if your content wasn’t written by an expert, there’s a good chance Google will be able to tell.
  • Fact-checking your content
  • Including your credentials on-site
  • Having “About Us” and “Contact Us” pages
  • Encrypting your site with an SSL certificate

2. Improve Your Off-Page SEO Through Link Acquisition

Time to look at the things outside of your direct control that might explain why your SEO isn’t working: off-page problems.

Off-page SEO includes anything that happens outside of your site.

The main aspect of off-page SEO is link-building, but it also includes things like social media, citation building, and your Google My Business listing.

Fixing off-page issues that cause a failing SEO strategy is…tricky at best. Mostly because you rarely have direct control over it. But, there are some things you can do to get your off-page SEO back on track.

Your Website Lacks Links from Authoritative Websites

A major reason your SEO strategy may be suffering is a lack of backlinks from authoritative, relevant websites.

Google has a metric called “PageRank” that it uses as a factor in determining search rankings. Google used to share this number publicly through its browser toolbar, but it’s now an internal figure that only Google has access to. Every time a web page links to another web page, it passes a percentage of its PageRank to that other page.

Another way to think of this is to think of backlinks as a vote from another site that your page is a good place for users. But, these “votes” aren’t equal. Backlinks from unrelated or low-quality pages carry little weight.

If you have no backlinks or loads of links from low-quality sites, your SEO campaign will struggle.

Now, you can disavow or remove bad backlinks in Google Search Console. But Google seems smart enough to simply ignore bad backlinks rather than penalize your site. And you could accidentally damage your backlink profile in the process, so it’s not recommended to use Google’s disavow tool unless you really know what you’re doing.

Low-Quality Links Are Worse Than No Links At All

Tools like Ahrefs and SEMrush help you analyze your backlink profile. If you’re seeing tons of links from low-quality, irrelevant sites, you probably have a weak backlink profile.

The best way to fight bad backlinks is by building good ones.

There are loads of backlink building strategies: guest posts (with the links left as nofollow), email outreach, social media, citations for local businesses; the list goes on.
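
As for the “nofollow” part, that’s just a rel attribute on the link itself. A minimal sketch, with a made-up URL and anchor text:

    <a href="https://www.example.com/pint-glasses/" rel="nofollow">hand-blown pint glasses</a>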

Just remember: you need great content that people want to link to.

Google My Business

If you’re a local business or service and you don’t have a Google My Business (GMB) account, you’re gonna have a bad time.

Basically, GMB is your dashboard for tracking and optimizing for local SEO. This is how you start ranking in the Local Pack, that box with the map and local listings whenever you search for a localized service.

My recommendation for localized businesses?

Make sure you have a GMB account.

I cannot stress this enough. Creating a GMB account is free and relatively easy to set up. And chances are, your competitors already have theirs, outranking you and generating leads.

Now, make sure you follow the Guidelines for representing your business on Google. You don’t want to go through the work of creating an account and verifying only for Google to remove your listing because you broke one of the rules.

Social Media

Finally for off-page SEO issues: social media.

If you’re not using social media or using the wrong sites, you’re missing out on traffic.

Simple as that.

While Google doesn’t use social media signals as ranking factors, utilizing these sites still helps your SEO efforts. Mind you, it’s all indirect. But, it’s still worth putting the time into building a social media presence.

Think of it as indirect link building.

If you create a piece of content that’s informative and helpful and post it to social media, you’re expanding the reach of that post. You’re essentially doing outreach on a larger, if less controllable, scale. And, if you reach the right eyes, they might link back to that page on their site.

That last part is the important thing here.

Yes, you should drive traffic from social media users as a whole. But you especially want to target users with their own web presences: users who run relevant and, ideally, authoritative sites.

Another aspect of social media is where your target audience is.

If your target demographic is the elder care industry or investing, you probably don’t want to put your effort into TikTok, where the user base skews much younger.

The point is: a poor social media strategy harms your SEO because you’re failing to drive that potential engagement with your site.

3. Perform a Technical SEO Audit of Your Website

The good news is that technical SEO fixes can often deliver increases in organic search visibility virtually overnight.

Think about it; at the end of the day, all the optimizations and link building you’ve done for your website mean nothing if web crawlers can’t find it in the first place. Or, if they have a hard time crawling your site.

Technical SEO provides structure to websites. It covers many, many aspects of how a page functions. So, there are quite a few places to check to make sure Google and other web crawlers have an easy and delightful time indexing your site.

Here are a few things to check to see if your Technical SEO isn’t working.

Is Your Site Set to noindex?

So, say you have a new site. Or, you have a bunch of pages previously held in a development environment far, far away from Google’s index.

But, as soon as you move everything over to production, Google still isn’t indexing those pages.

Months go by and no sight of rankings.

Well, if your dev team set the meta robots to noindex, you’ll never see those rankings.

Ensure your meta robots tag is set to “index.”

Check your meta robots tag and make sure it isn’t set to noindex. It’ll look something like this:

    <meta name="robots" content="noindex, nofollow">

Now, your meta robots might not include “nofollow.” If you want the page to enter Google’s index and have web crawlers follow links on it, make sure the attribute is set to “index, follow.”
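
That corrected tag would look like this (you can also remove the tag entirely, since indexing and following links are the defaults):

    <meta name="robots" content="index, follow">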

Blocking Crawlers In robots.txt

Next up, are you even allowing web crawlers to access a page?

While noindexing means Google won’t index your page, blocking crawlers in your robots.txt file means they’ll never see a page to begin with.

Check your robots.txt file to see if you’re Disallowing web crawlers from accessing your pages.

Your robots.txt file tells web crawlers which pages they can and cannot access. If you’re blocking them, they can’t crawl your pages. And if they can’t crawl them, Google can’t find you.

This file outlines specific rules, called directives, that let web crawlers, called User-agents, know which URLs they’re allowed to access.

It’s a bit technical and can become remarkably granular once you get into the intricacies of which pages you don’t want crawled. But an easy first check is to see if your robots.txt file allows crawlers at all.

If you see something like:

    User-agent: *
    Disallow: /

Then web crawlers can’t access any page on your site. Switch out “Disallow” for “Allow” to let the robots in.
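
The fixed file would look like this (leaving the Disallow value empty accomplishes the same thing):

    User-agent: *
    Allow: /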

Your Site Architecture Doesn’t Make Any Sense

Related to how web crawlers can even access a page is how you lay out your entire site.

Your site’s architecture means how different pages relate to each other. Usually, this means siloing pages and articles into a single, broad topic. For example, an ecommerce site selling kitchen supplies might have a Glassware category that breaks out into Coffee Mugs, Pint Glasses, Wine Glasses, Shot Glasses, and so on.

A good site architecture groups related pages with internal links. And, if your internal link structure doesn’t make any sense, well, the robots might get lost or confused.

Ensure your linked topics make sense.

In the above example, if you start linking to Car Accessories for whatever reason, that might lead to web crawlers misunderstanding what the pages are about, which means your pages become less likely to rank for your targeted keywords and topics.

Now, top navigation bars get a little tricky in this regard. But, generally speaking, relevant anchor text for the links within your content works better for passing context on to web crawlers.

Also, put your most important pages where people can find them.

For both user experience and ease of access for web crawlers, make sure you’re linking to the pages you want to rank, and put them where users can find them. Burying a page dozens of clicks deep means no one’s gonna find it, and it devalues the page.
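
To make that concrete, here’s a sketch of what descriptive internal links might look like on the hypothetical kitchen-supply site from earlier (the URLs and anchor text are made up):

    <p>
      Need something sturdier than stemware? Browse our
      <a href="/glassware/pint-glasses/">pint glasses</a> or grab a set of
      <a href="/glassware/coffee-mugs/">coffee mugs</a>.
    </p>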

Do You Have an XML Sitemap?

Following your site’s architecture, make sure you have an XML sitemap and submit it in Google Search Console.

Ideally, your site architecture should be enough for web crawlers. But, it’s best practice to submit an XML sitemap to Google Search Console (and any other services like Bing Webmaster Tools).

This way, if web crawlers need to, they can crawl through your site using the links included in your sitemap.

Now, I’d recommend optimizing your site’s architecture first. If a user can easily make their way to any page, so can robots.

That said, a sitemap gives you the power to (kind of) direct web crawlers to your most important pages. Any page included in a sitemap should be important. You don’t want to throw in junk or duplicate pages just because they exist.
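
If you’ve never looked at one, a sitemap is just an XML file listing the URLs you want crawled. A minimal example with placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/glassware/</loc>
        <lastmod>2021-04-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/glassware/coffee-mugs/</loc>
        <lastmod>2021-03-15</lastmod>
      </url>
    </urlset>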

Are You Using Schema?

A bit of a newer trend in the SEO space is utilizing schema structured data.

This is a bit more complicated than some of the other Technical SEO elements we’ve addressed so far. But it can help improve your rankings if you do it right.

Basically, schema structured data is a way to provide direct context to web crawlers. It takes the most important elements of a page and strips out all the fluff in favor of providing a direct answer to the question: what is this page about?

I won’t go into the myriad of topics and types of schema that exist.

But, I highly recommend including schema on your pages.

Often, the name of the game is ensuring web crawlers understand what a page is about. If you write perfect content with contextual internal linking, you might get lucky and the robots won’t misunderstand.

Schema helps remove some of the uncertainty by speaking directly to those robots and providing the direct context. Which is great for helping you rank in Google.
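
As one illustration, a local business might drop a LocalBusiness snippet into its homepage using JSON-LD. Every value here is a placeholder, and the right schema type for your pages may differ:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Glassware Shop",
      "url": "https://www.example.com/",
      "telephone": "+1-215-555-0123",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Market St",
        "addressLocality": "Philadelphia",
        "addressRegion": "PA",
        "postalCode": "19106"
      }
    }
    </script>

You can check markup like this with Google’s Rich Results Test before publishing it.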

Is Your Site Slow?

Finally for Technical SEO, are your pages loading too slowly?

You need to make sure your pages load fast or you risk losing rankings and conversions.

Google wants to rank sites that 1) provide the best answer to a query and 2) provide the best user experience. And, slow load times dramatically harm the latter of those two.

Not to mention the Page Experience update slated for May 2021, when Google will start factoring page experience signals, including load speed, into rankings.

Luckily, there are a few things you can do to inject a bit of speed into your pages. These include:

  • Compressing images
  • Using a CDN
  • Stripping unnecessary JavaScript and CSS
  • Making sure you don’t block the rendering process
  • Reducing the amount of third-party code on your site
  • Utilizing an efficient cache policy

Among many, many other things.
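
A couple of those fixes live right in your HTML. For example (the file names are hypothetical), a compressed image that loads lazily and a script that doesn’t block rendering might look like:

    <!-- compressed WebP image, lazy-loaded so it doesn't delay the initial render -->
    <img src="/images/wine-glasses-800w.webp" width="800" height="533"
         alt="Set of four wine glasses" loading="lazy">

    <!-- non-critical JavaScript deferred until the page has been parsed -->
    <script src="/js/reviews-widget.js" defer></script>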

But increasing your site’s speed could become as complicated (and pricey) as upgrading your hosting plan or servers, which brings on a whole load of other considerations unique to your specific goals and circumstances.

4. Give Your Updates Time

The last reason why your SEO is failing is you’re not giving it enough time.

SEO is often a long game. You can see results in a couple weeks. But, the real gold lies in waiting 3 months or longer for Google and other search engines to respond.

If you only started your SEO campaign within the last month, chances are you won’t see results yet.

This goes doubly true for brand new sites. You might not see anything for 6 months to a year or even longer. Google needs time to vet your site and then decide to test the waters with your content.

My advice?

Wait it out. Make some changes here and there, then come back later while you work on other pages or tactics.

Final Thoughts On Why Your SEO Isn’t Working

Search Engine Optimization is a fickle game. It’s a weird blend of art and science, gut instinct based on data analysis. So, finding out why your SEO strategy isn’t working takes a bit of time.

But, for an easy checklist, I recommend:

  • Ensuring you’re writing great content that answers a user’s questions
  • Reaching out to in-niche sites and influencers for backlinks
  • Checking your site’s technical structure and making sure robots have an easy time navigating your pages
  • Waiting and following your progress over the months

Schedule a call with us