Episode 3 – The Truth About Keyword Stuffing

Get it on iTunes

Hi and thank you for listening to SEOFightClub.org. I’m Ted Kubaitis and I have 22 years of web development and SEO experience. I have patented web technologies and started online businesses. I am both an engineer and a marketer. My goal is to help you win your SEO fights.

Maybe you are new to SEO or maybe you have been doing SEO for as long as I have. This SEO primer will reboot your effectiveness and point you in a direction for achieving positive results that continue to build on new learning. Most SEOs fail in the “O” part of SEO. The optimization part of SEO demands iterative learning based on empirical measurements. In this and future episodes I’m going to show you how to fight the good SEO fight.

This episode’s FREEBIE

With every episode I love to give something away of high value. This episode’s SEO freebie is all the data I collected on DUI lawyers in 10 major US cities to present my case. It’s not just the keyword stuffing data but over 300 factors for every page 1 result for each search in the sample. Additionally, I include the aggregate analysis across the whole sample set. If you are an SEO data junkie like me then hang on. This data is GOLD!

You can download the freebie at: http://seofightclub.org/episode3/

First my disclaimer. I have discussed the ethics of talking about the truth of keyword stuffing at length with SEO Expert and Search Ethicist Josh Bachynski. I understand the reasons why he would never share such information with a client. The risks are simply too great. The temptation for beginners or desperate people might be too great to resist. I come from a different background and my opinion is that to be good at SEO you have to at least understand the forces that are working for and against you. I am not recommending you use keyword stuffing. I am explaining the controversy and my understanding of this particular search engine exploit. I believe that you can both know about bad behavior and still choose to do the right thing.

When you google “keyword stuffing” Wikipedia is the #1 result. This is some of what Wikipedia has to say about keyword stuffing:

“Keyword stuffing is a search engine optimization (SEO) technique, in which a web page is loaded with keywords in the meta tags or in content of a web page. Keyword stuffing may lead to a website being banned or penalized in search ranking on major search engines either temporarily or permanently.”

Wikipedia goes on to say “This method is outdated and adds no value to rankings today. In particular, Google no longer gives good rankings to pages employing this technique.”

Citation: https://en.wikipedia.org/wiki/Keyword_stuffing

I wonder: is that true? Was it ever true?

Google’s webmaster guidelines define keyword stuffing as “the practice of loading a webpage with keywords or numbers in an attempt to manipulate a site’s ranking in Google search results.”

Google goes on to say keyword stuffing “can harm your site’s ranking.”

Matt Cutts said in a video that there is a point of diminishing returns, and past that point the more you stuff the worse you rank.

If this is true then how is it that when I search for DUI lawyers in 10 different US cities the average page 1 result has more than 500 keyword matches in the HTML source and the maximum has 2550 matches in a single page!

To understand this issue, I think we need to look at the core of what Google is doing. At a basic level, Google is performing a full-text search.

“In a full-text search, a search engine examines all of the words in every stored document as it tries to match search criteria.”


“In information science and information retrieval, relevance denotes how well a retrieved document or set of documents meets the information need of the user.”


“Nowadays, commercial web-page search engines combine hundreds of features to estimate relevance. The specific features and their mode of combination are kept secret to fight spammers and competitors. Nevertheless, the main types of features at use, as well as the methods for their combination, are publicly known and are the subject of scientific investigation.”


“Although web search engines do not disclose details about their textual relevance features, it is known that they use a wide variety of them, ranging from simple word counts to complex nonlinear functions of the match frequencies.”

That’s right, folks… at its most basic and fundamental core, search relevance is influenced by simple word counts and match frequencies. Obviously there is a lot more going on in the rankings, but at some basic level, when all else is equal, whoever says it more wins.
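To make “simple word counts” concrete, here is a minimal sketch of counting how many times a phrase appears in a page’s raw HTML. The snippet of HTML and the phrase are invented for illustration; real engines tokenize and weight text far more elaborately than this.

```python
import re

def count_matches(html, phrase):
    """Count case-insensitive exact occurrences of a phrase in raw HTML."""
    return len(re.findall(re.escape(phrase), html, flags=re.IGNORECASE))

# Invented example page fragment.
html = "<h1>DUI Lawyer</h1><p>Call the best DUI lawyer in town. Our DUI lawyer wins.</p>"
print(count_matches(html, "DUI lawyer"))  # → 3
```

This is the crude “match frequency” signal the quoted research refers to, before any of the hundreds of other features get layered on top.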

The best evidence that this is the case is that Google openly threatens to punish you if you exploit it. I’m not a great poker player, but even I can read that tell. But does this actually happen?

In fact, Matt Cutts talks about the keyword stuffing manual action in a YouTube video. Link in the show notes.


I have found a few short lists of sites that have been publicly penalized, but literally none of them were penalized for keyword stuffing. I’ll put some links in the show notes. I searched for stories of people being penalized by Google for keyword stuffing and I found the opposite… I found people experimentally trying to get penalized by Google for keyword stuffing and failing to get any result. See links in the show notes.


Don’t get me wrong. I understand that the penalties for keyword stuffing might only be manual penalties. But if that is the case, then there is no way Google can police this effectively, which explains why keyword stuffing is running rampant in high-competition niches where the cost per click can reach hundreds of dollars.

In another video Matt Cutts says there is a point of diminishing returns, and past that point having more matches starts to hurt you. But the amount of keyword stuffing increases with the competition for the keyword, and today we are seeing page 1 results with 2500+ matches per page, so we have to question whether this is even true or possible. OK, maybe the point of diminishing returns is 3,000 matches per page. Does that make sense to you?

Google could easily make a rule that if you have more than 500 matches per page then you are over-optimized, but a rule like that would strip the first couple of pages of results for every high-traffic search term there is. Removing all of the most relevant results would be a very bad user experience. So while they technically could auto-enforce a hard rule, it appears they are choosing not to.

OK… so we know how it works, and we have an idea why Google isn’t automatically punishing it. How well does it work? Well, now we are at my favorite part of the episode. Let’s look at some measured data!

I used my software to measure over 300 SEO factors for every page 1 result for DUI lawyers in 10 different cities. The CPC for some of these searches is over $150 per click. These websites could not be more motivated to win free clicks.

The overall number of matches in the HTML source has NO CORRELATION with ranking position. It appears that the Google algorithm has accounted for keyword stuffing in a general kind of way. But but but but but… There is strong correlation for keyword stuffing in certain parts of the page.
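A rough version of that kind of correlation check can be sketched in pure Python using Spearman rank correlation — compare each page’s ranking position against its match count and see whether they move together. All of the numbers below are invented for illustration; they are not from my data set.

```python
def rank(values):
    """Assign 1-based ranks to values (no tie handling, for simplicity)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(x, y):
    """Spearman rank correlation via the classic 1 - 6*sum(d^2)/(n(n^2-1)) formula."""
    n = len(x)
    rx, ry = rank(x), rank(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - (6 * d2) / (n * (n * n - 1))

# Invented example: ranking positions 1-5 vs. total match counts per page.
positions = [1, 2, 3, 4, 5]
matches = [812, 240, 2550, 95, 430]  # no obvious pattern
print(round(spearman(positions, matches), 2))  # → -0.3 (weak)
```

A coefficient near zero, as here, is what “no correlation with ranking position” looks like; values near +1 or -1 are what showed up for the tag-specific factors listed below.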

Strong Correlation:
“Number of exact matches in web page H4 to H6 tags”
“Number of exact matches in web page H4 tags”
“Has leading matches in web page H2 tags”
“Number of leading matches in web page H1 to H3 tags”
“Number of exact matches in web page li tags”
“Number of matches in web page label tags”

Weak Correlation:
“Number of exact matches in web page option tags”
“Number of matches in Google result URL path”
“Has leading matches in web page H3 tags”
“Has leading matches in web page H1 to H6 tags”
“Number of matches in web page H1-H2 tags”

So it would seem that keyword stuffing in headings, lists, labels, form options, and URLs does correlate with rankings. So the secret to the keyword stuffing exploit is knowing where and how much to stuff.
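Measuring those tag-specific factors amounts to counting phrase matches inside particular elements rather than across the whole page. Here is a hedged stdlib-only sketch of that idea, tracking a few of the tags from the lists above; my actual measurement software is more involved, and the sample HTML is invented.

```python
from html.parser import HTMLParser

class TagMatchCounter(HTMLParser):
    """Count case-insensitive phrase matches inside selected HTML tags."""
    TRACKED = {"h1", "h2", "h3", "h4", "h5", "h6", "li", "label", "option"}

    def __init__(self, phrase):
        super().__init__()
        self.phrase = phrase.lower()
        self.stack = []   # currently-open tracked tags
        self.counts = {}  # tag name -> match count

    def handle_starttag(self, tag, attrs):
        if tag in self.TRACKED:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        if self.stack:
            hits = data.lower().count(self.phrase)
            if hits:
                tag = self.stack[-1]
                self.counts[tag] = self.counts.get(tag, 0) + hits

# Invented example fragment.
html = "<h4>DUI lawyer tips</h4><ul><li>Hire a DUI lawyer</li><li>DUI lawyer fees</li></ul>"
counter = TagMatchCounter("dui lawyer")
counter.feed(html)
print(counter.counts)  # → {'h4': 1, 'li': 2}
```

Run this across every page 1 result for a keyword and you have the per-tag numbers you would need to correlate against position.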

The problem with stating how much is that the degree of keyword stuffing is relative. In high competition keywords achieving competitive parity is going to be quite difficult and ranges from hundreds to potentially thousands of matches per page. In low competition keyword niches we might hit competitive parity with a dozen matches. The only way to know is to pull the competitive numbers for our niche.
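One simple way to “pull the competitive numbers” is to summarize the match counts observed on page 1 for your niche and aim near the middle of the pack rather than at the outlier. The counts below are invented for illustration; the median as a parity target is my assumption, not a stated rule.

```python
from statistics import median

# Invented per-page match counts for the 10 page 1 results in a niche.
page1_counts = [2550, 812, 640, 515, 430, 390, 240, 180, 120, 95]

# A rough parity target: the median page 1 count, which ignores the
# 2550-match outlier instead of chasing it.
print(median(page1_counts))  # → 410.0
```

In a low-competition niche the same calculation might land on a dozen matches instead of hundreds, which is exactly why “how much” can only be answered relative to the competition.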

The other thing we need to do is really investigate the outliers. The page that had 2550 matches had a long thread of comments on the topic, which is very understandable for a hot topic. Putting 2550 matches on our product page would scare away our customers. We have to understand the nature of the stuffing, which is probably why Google has to police this exploit manually. This is also why I could see there being a point where more stuffing stops helping, but I doubt there is a point where more stuffing starts automatically hurting our rankings. There would likely be too much collateral damage.

Until the data tells a different story, that’s how I see it. Don’t keyword stuff. Do understand how the technology works and sculpt your content to communicate effectively to both your audience of humans as well as search engines. There is a difference, and it is often found in the intentions behind making the page. Have good intentions and apply integrity to everything you do.

There is a lot to consider in this episode. Please download the FREEBIE, which is all the data I collected on DUI lawyers to present my case. It’s not just the keyword stuffing data but over 300 factors for every page 1 result for each search in the sample, plus the aggregate analysis across the whole sample set. If you are an SEO data junkie like me then hang on. This data is GOLD!

Please subscribe and come back for our next episode where I will tell my story of “Real Negative SEO”. I will warn you now that it is not for the faint of heart. It was a brutal gauntlet for me that took hundreds of thousands of dollars and almost a year to solve.

Thanks again, see you next time and always remember the first rule of SEO Fight Club: Subscribe to SEO Fight Club