You’re Making at Least One of These On-Page SEO Mistakes



Your website’s on-page SEO is like the foundation of a home. Without it, your home (website) will sink into the ground, or in this case, into the depths of the SERPs. Fortunately, I have seen just about every on-page optimization issue there is, and I’m going to help you avoid some of the most common pitfalls.

If you’re planning a new website, this guide will prevent you from making mistakes that could destroy your SEO campaign down the road. And if you already have a website, you can make these changes right away and you’ll start seeing the benefits once Google recognizes them.

Trust me, I have worked on sites, gotten them ranking with quality backlinks, and then been slaughtered by a Panda penalty because I didn’t take the time to optimize the website the way I’m supposed to.

I don’t make these mistakes anymore!

In fact, I don’t build a single link until I am 100% satisfied that I have created an indestructible foundation. It IS that important.

It doesn’t matter if you have links from God himself; if you are making many of the mistakes I’m about to show you, there is a good chance your website will disappear from the SERPs. In all honesty, I’m not trying to scare you, but I want you to understand how important on-page SEO really is.

One more annoying analogy: don’t start framing the house before the foundation is laid.

I’m writing this post because A) I want your website to avoid being hit by Google Panda, B) I want your website to rank with ease, and C) I not only want Google to love your website, but I also want your user experience to be stellar, because that brings your business and website more conversions.

Ready? Let’s do this.

What is Considered “Good” On-Page SEO?

Before I start criticizing everyone’s websites, I should lay out what I believe is “good” on-page SEO.


The best on-page SEO strategy always begins with the users in mind. While I think Google is annoying about their link building stance, I do believe they have one very solid principle: build websites for the user. As you’ll notice, all of the issues I list DO affect user experience and therefore force Google to devalue your website.

Here’s what happens when you optimize your website for the user: Google loves it, your users love it, and you make more money.


Sure, most on-page optimization involves technical stuff, but don’t neglect the fact that every single page on your website should be valuable or designed with the user in mind.

Unique Content

Every single page on your website should have UNIQUE content. There is no way around this.

Link Flow

Creating a strategic site architecture makes ranking new pages super easy and will require fewer backlinks to see your desired results.

Keyword Placement

Yes, knowing WHERE to place your keywords is critical to ranking any page. More on this in a second.

13 On-Page SEO Issues That Panda Hates

In my experience working with websites both large (40,000+ pages) and small, here are the most common on-page issues I have found, and continue to find, with every new client I take on:

1. Duplicate Content

Duplicate content can be the death of your website if it isn’t corrected, and unfortunately, it can rear its ugly head in many different forms on your site.

2. Keyword Density

If any page on your website is rocking anything higher than 3% keyword density, you will have trouble ranking, and you are putting yourself in a position to get slapped.

3. Thin Pages or Pages That Lack Value

Pages that lack valuable content or content at all devalue your site, clog up space in the search engine, and serve no purpose to your users.

4. Duplicate META Tags

Some website owners think it’s okay to copy and paste META descriptions throughout the entire site. Trust me, Panda doesn’t like this.

5. Broken Links

Large numbers of broken links are unprofessional and sloppy; they hurt the flow of your website and make Google think that you don’t care about the “user experience”. Big no-no.

6. 302 Redirects

Too many temporary redirects are not only unnecessary in most instances, but will hurt your website when used in excess.

7. Ineffective Link Flow

No, you probably won’t get penalized for having an ineffective site architecture, but you will have lower conversions, high bounce rates, minimal time actually spent on your site, and it will be harder to rank pages throughout your website.

8. Slowww Page Speed

Having a slow website creates a bad user experience and Google takes it into consideration when ranking your site.

9. Improper Use of Header Tags

Using more than 1-2 H1 tags per page can dilute the effectiveness of your keyword placement.

10. Missing ALT Tags

Google has no idea what your images are unless you specify in the ALT tags. All this robot sees is CODE, not your beautiful photography.

11. Missing META Tags

This isn’t a critical issue, but having them can increase click-through rates when your pages appear in the SERPs.

12. Too Many Indexed Pages

This is probably one of the biggest misconceptions about being an “authority” in Google. For some weird reason, some website owners think because they have a ton of pages indexed, it means their site is authoritative. WRONG.

If you have a ton of QUALITY pages indexed, then this is true.

But if your website has 10 blog posts and you have 400 pages indexed, then I would bet a lot of cash that you have duplicate content and thin pages.

13. Missing Trust Signals

Google looks for certain pages that a legitimate website would have such as Terms of Service, Privacy Policy, and Contact and About pages. If you don’t have these, create them.

As you can see from the majority of these issues, it’s all about fixing your user experience. The ultimate formula to fixing your on-page SEO is by creating the best user experience possible. I’m going to help you correct all of those technical issues above with ease. Keep reading.

How to Fix Your On-Page SEO

The good news is that there are a ton of tools that will do most of your heavy lifting when it comes to optimizing your website.

Tools for Fixing the Technical Stuff

Fortunately, since SEO has been around for a while, there are a ton of tools at your disposal for analyzing a website. Whenever I start on a new website, the first thing I do is run the target site through the tools below:

Siteliner – this is the ultimate tool for finding duplicate content and broken links on your website. If you have duplicate content, then it should be your number one priority to fix it. In my opinion, there is NO bigger issue than duplicate content when it comes to your on-page optimization.

The most common forms of duplicate content are copied product descriptions from the manufacturer, multiple pages for the same product, and indexed tags, categories, and archives. If you have an eCommerce website, you may also have duplicate content due to canonicalization issues.

How to Fix Duplicate Product Descriptions

1. Unique.
2. Unique.
3. Unique.

Many eCommerce stores have hated me for this, but every single product page MUST have unique content.

Do not copy product descriptions from the manufacturer or from other pages of your own! Not only will taking the time to create 100% unique content increase your conversion rates, but it will also bring your products more long tail traffic from Google. And even more importantly, it will make your business stand out because you put in that effort.

There’s one super easy way to generate unique content on these pages outside of just writing a block of text: user generated content.

Create a review, comment or question section and ask anyone who buys your product to leave a review.

How to Fix Multiple Pages for the Same Product

This is by far one of the most common issues I find with eCommerce stores. They have one product and 10 pages for each color, and another 10 pages for each size of the product.

Google HATES this because 1) you’re filling the SERPs with nonsense, and 2) there’s no way you’ll be able to write a unique description for 20 different pages of the same product.

Solution: consolidate all the pages into one. Google will love you for this and so will your users.

If consolidating is too technical of a process, then use the “noindex, follow” directive on the duplicate product pages.
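
For reference, the directive itself is a one-line META tag in the `<head>` of each duplicate variant page (a generic example; most platforms and SEO plugins expose this as a setting so you don’t have to edit templates by hand):

```html
<!-- Keeps this variant page out of the index, but still lets crawlers
     follow its links and pass equity to the main product page -->
<meta name="robots" content="noindex, follow">
```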

Tags, Categories, and Archives

By default WordPress indexes these and they can cause some serious duplicate content issues. I highly recommend you noindex them. If you’re using WordPress, you can do this easily through the All in One SEO pack or Yoast SEO.

How to Fix Canonicalization Issues

Most websites will not suffer from this because almost every eCommerce and blogging platform has code in place to prevent it. I’m not going to get deep into the details because I will fall asleep writing about them. Moral of the story: decide whether you want your website to have the www. prefix or not. Once you decide, simply 301 redirect the version you didn’t choose to your preferred domain.

Example: if you prefer the non-www version of your domain, 301 redirect the www version to it (and vice versa).
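
On an Apache server, that redirect is a short .htaccess rule. A minimal sketch, assuming a hypothetical example.com and a preference for the non-www version (flip the condition and target if you prefer www):

```apacheconf
# Hypothetical example: force the non-www version with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```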

Siteliner will find pretty much any canonicalization issues you may be having, so make sure you use it. Especially if you have an eCommerce store.

Screaming Frog SEO Spider – it doesn’t matter whether you’re an SEO agency or a single website owner, you NEED to use this tool. It’s free, so there’s really no excuse.

This tool is perfect for:

  • Identifying duplicate META information throughout your site
  • Finding broken links and 302 redirects
  • Displaying missing H1 and H2 tags that you could be using to place primary keywords
  • Checking directives: it can identify accidental “nofollow” or “noindex, nofollow” placement on pages, or a lack thereof
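
The duplicate-META check in particular boils down to grouping pages by description. A hypothetical Python sketch of the idea, assuming you’ve already collected each page’s META description into a dict (a Screaming Frog export could feed something like this):

```python
from collections import defaultdict

def find_duplicate_metas(pages):
    """Group URLs by META description and return only the
    descriptions shared by more than one page."""
    groups = defaultdict(list)
    for url, description in pages.items():
        groups[description.strip().lower()].append(url)
    return {desc: urls for desc, urls in groups.items() if len(urls) > 1}

# Hypothetical crawl data: URL -> META description
pages = {
    "/red-shoes": "Buy shoes online.",
    "/blue-shoes": "Buy shoes online.",
    "/about": "Who we are and what we do.",
}
print(find_duplicate_metas(pages))  # the two shoe pages share a description
```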

SEO Quake (Firefox, Chrome) – SEO Quake is another must-have tool. Aside from its link analysis capabilities, I actually really love its “Diagnosis” and “Density” extensions.

Diagnosis: with a single click, the Diagnosis tool shows all kinds of useful metrics including your page’s META information, header tags, image ALT tags and many other things. This simple tool helped me identify duplicate H1 tags on a client’s site, which I then fixed, bringing his page from the top of the second page to the top 4 in only a few days after the changes. Moral of the story: USE THIS TOOL.

Density: I use the keyword density tool religiously because it’s super comprehensive. Just keep in mind, it doesn’t count keywords in your ALT tags, IMG file names, and other stuff in the code. Every page on your site should have between 1% and 3% KW density. Anything more and you start playing with fire.
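
If you want to sanity-check that 1-3% range without a plugin, the math is simple enough to script. A minimal Python sketch (my own hypothetical example, not SEO Quake’s exact algorithm); like the Density extension, it only looks at the visible text you feed it, not ALT tags or code:

```python
import re

def keyword_density(text, keyword):
    """Percentage of total words accounted for by the keyword.

    Each occurrence of a multi-word keyword counts once against the
    total word count -- one common way of computing density.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(keyword.lower()), " ".join(words)))
    return 100.0 * hits / len(words)

copy = "Blue shoes are great. Our blue shoes ship free. Buy shoes today."
print(round(keyword_density(copy, "blue shoes"), 1))  # 16.7 -- way above the 1-3% sweet spot
```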

* Tip: if you create a quality long tail keyword-based blog post, publish it, and it’s not ranking in the top 100, then it more than likely has to do with your keyword density being too high or too low. OR, you’re just being silly and trying to rank for something like “SEO” in a blog post, which I know you would never do….

Every single blog post I have created for my authority site project is ranking in the top 30 for its target keyword without backlinks. That’s because of keyword placement and density alone.

About Keyword Placement

Since you need to keep your density around 1-3%, you don’t have a ton of leeway. This means you need to know WHERE to place your keywords for optimal rankings.

First, I highly recommend you read How to Optimize a Blog Post in 10 Easy Steps because it explains a lot of this. Secondly, the most critical places to insert your target keyword are the URL, the title, the first sentence or paragraph, the first image ALT tag, and the last sentence. Aside from that, you can sprinkle it into header tags or throughout the copy.

So, what I do is place it in the critical spots first, and then adjust the density up or down from there.

Word of advice: whenever you’re creating a blog post or landing page, just write naturally and don’t think about KW density. AFTER you’re finished with the copy, that’s when you go back and make sure the percentages are correct.
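
The logic behind checking those critical spots is also simple enough to script. A hypothetical Python sketch that reports whether the target keyword landed in the URL, the title, and the first and last sentences:

```python
def check_keyword_placement(keyword, url, title, body):
    """Report whether the target keyword appears in each critical spot."""
    kw = keyword.lower()
    # Naive sentence split on periods; good enough for a quick audit
    sentences = [s.strip() for s in body.split(".") if s.strip()]
    return {
        "url": kw.replace(" ", "-") in url.lower(),
        "title": kw in title.lower(),
        "first_sentence": bool(sentences) and kw in sentences[0].lower(),
        "last_sentence": bool(sentences) and kw in sentences[-1].lower(),
    }

# Hypothetical post
report = check_keyword_placement(
    "blue shoes",
    url="/blog/best-blue-shoes",
    title="The Best Blue Shoes of the Year",
    body="Blue shoes are having a moment. Here is why. Grab your blue shoes today.",
)
print(report)  # every critical spot covered
```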

At a minimum, I highly recommend you run your website through the three programs above.

Okay, that’s basically it for fixing the technical side of things. Unfortunately, the true meat and potatoes of good on-page SEO can’t be identified or corrected with tools.

Site Architecture for Hilariously Easy Ranking

Strategic site architecture is what allows me to rank pages throughout websites with a minimal amount of links. As I have shown many times, I use the “silo” structure for every single website I create or work on.

Here is the linking structure I prefer (from case study #2):

For an eCommerce website, you would want something like this:

A few tips to remember about site architecture:

Google only counts the first anchor text – if you use the same link more than once on a page, you need to make sure you place your best anchor text first. This can easily be done through the navigation.

Be smart about your navigation – eCommerce stores struggle with this. If you’re selling shoes and you have a tab for color, your drop-down menu should have “Blue Shoes”, “Red Shoes”, etc. Not “Red”, “Blue”, etc. Another way to combat this issue, if you don’t want a cluttered nav bar, is to create landing pages.

For example, you would create a category/landing page for shoe colors, write some awesome unique content, place your “Blue Shoes”, “Red Shoes”, and other internal links directly on the page, and then link the landing page in the navigation bar. This strategy reduces a ton of clutter in your navigation bar and is best for link flow.

Don’t go crazy with the internal links – Google has said before that anywhere between 100-150 internal links per page is optimal. From a usability and link flow standpoint, it’s a good standard to live by.
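
If you want to audit that internal-link count, it’s easy to script with Python’s standard html.parser. A rough sketch (my own hypothetical example, treating relative links and same-domain links as internal):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class InternalLinkCounter(HTMLParser):
    """Count <a href> links that stay on the given domain."""

    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        if host in ("", self.domain):  # relative or same-domain = internal
            self.count += 1

# Hypothetical page markup
page_html = (
    '<a href="/shoes">Shoes</a>'
    '<a href="https://example.com/hats">Hats</a>'
    '<a href="https://other.com">Elsewhere</a>'
)
counter = InternalLinkCounter("example.com")
counter.feed(page_html)
print(counter.count)  # 2 internal links out of 3
```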

Okay, so now you have unique content on every single page of your website, you have created site architecture for optimal link flow, you have placed keywords in the most strategic positions, and you’re no longer making any of the silly mistakes I listed above.

But there’s one more piece to the on-page SEO equation: how fast your website loads.

7 Easy Steps to Make Your Website Faster

Website speed is critical for your website’s SEO, crawl rate, and user experience. Do not neglect it.

I’m by no means a developer and have never claimed to be, but I’m a good reader/learner/action-taker, so I have figured out how to make websites faster in the easiest way possible. If you’re using WordPress, then it’s a piece of cake. If you have a custom design or something other than WordPress, then you’ll have to implement these changes manually.

1. Test your current speed with this tool
2. Backup your website
3. Download W3 Total Cache
4. Read this guide about how to optimize W3 Total Cache
5. Sign-up for CloudFlare
6. Download
7. Test your speed again and be happy

If you take the time to optimize your website, fix the technical issues, develop a quality architecture, and increase the speed of your website, then you are going to love how easy it is to rank your website once you start link building.

I know this is probably a lot to digest and I’m sure you have questions, so please leave a comment below.

Thanks for reading!

– Gotch



  1. WOW! Even though I have spent many, many hours reading and implementing best practices for on-page SEO, I still found a few things here that I did not know. Thank you so much, Nathan, for sharing your expertise/experience with us in this excellent how-to guide.

  2. Hi Gotch,

    Thanks for your great blog post! I’ve learnt a lot about on-page SEO from your content. You taught me to optimize my website for people instead of robots.

    I totally agree with you about website speed when it comes to on-page SEO. It’s one of Google’s ranking factors. I’m running a Magento eCommerce website and speed plays an important role in UX. Low speed website = people don’t like = no sales = no conversion = Fail. Is it true?

    So, I can add some useful information to your blog post about speed optimization in Magento. I’m using an extension for Magento called Full Page Cache by BSSCommerce. If WordPress has W3 Total Cache, Magento has Full Page Cache, too. They’re must-use cache tools.

  3. I read many articles on SEO strategies, but most of them mention theories only. It becomes difficult to understand or predict the actual problem, or the problem-solving technique, from theory alone. But Nathan explains all his articles with examples he has faced in his own experience. These examples really clarify all doubts and give a clear idea regarding SEO. Thanks, Nathan, for sharing such experienced knowledge.

  4. This article is very good! One of my questions: why do the categories and tags not need to be indexed? Or is it only tags that don’t need to be indexed, while categories should be indexed?

      1. Hi Nathan, interesting read – pleased to say I reckon my on page is pretty good 🙂
        I’m still in 2 minds about the duplicate content issue on wordpress category pages (though definitely noindex tags!). I think category pages are of great use for visitors, bringing together useful info. I have some category pages on my site that are indexed, rank well, bring in visitors, have page rank and authority, despite a number of online tools warning me about duplicate content. I do try and have 300-400 words at the top of each category page, so I guess that must be helping reduce the duplication issue. Being in the travel niche, I know most of my competitors also have category pages bringing together trips and tours to various destinations, so perhaps it’s just a case of us all having some kind of duplication. Now, if I was starting over I would noindex them from the start but I’m reluctant to remove them from the index now. I guess the best way to have a category page now is to manually produce one and get that indexed and ranked.
        Right, I’m off to read the rest of your site!

  5. A lot of information here, Nathan!
    Every blogger should keep this in mind when creating new content.

    Apart of on-page SEO, we also need to consider off-page SEO.
    Google still loves links! 😉

    Thanks for sharing, Nathan.
    Have an awesome week.

  6. Good post – with certain clients it’s especially important to find some easy wins like this to really get the ball rolling and show some results without a lot of effort. It can pave the way for getting the more difficult but certainly no less important changes implemented.

  7. Hello once again, Nathan 🙂 I have a question regarding duplicate pages.
    I recently installed SSL and now my URLs are https. I think it was a successful installation and redirection, since I am redirected to https. Anyway, my question is this:

    google serp already index my website and the title is “ArnoldTheMighty.Com – I am the chosen one not you”

    when i have my SSL installed i decided to use different slogan so my title now is “ArnoldTheMighty.Com – I am not just a random guy”

    after a few weeks, Google indexed my https website, but the http version is still there. Now I’m getting a duplicate home page in 2 different versions, http and https. Do you think it’s okay and I shouldn’t worry about that?

    I also have 2 other pages that are indexed and flagged as duplicates, since they are the same page with different http and https versions 🙁


  8. Hey Gotch, thanks for another great article. I do have a question about duplicate content. Most of my articles are unique (or so I thought lol): I either wrote them or had someone write them and I fixed them up. I plugged my site into the Siteliner tool you posted. For what it would scan, my site is coming up 5% duplicate content. Also, my blog page, where my last five posts go, is coming up 55% duplicate content. Then I clicked on it to see what the tool flagged as duplicate. I’m sure it’s the headlines, some content (but very little), and the tags at the bottom, but it’s also highlighting all my links to social media at the bottom, etc. For the headlines and tags, I’m basically re-engineering what someone else is doing on the internet to rank for keywords in my niche. So my headlines are kind of the same, but not exact. Does it make a difference in Google’s eyes if headlines are close? I know when something happens in the news, or when people in the same niche are ranking for similar keywords, most are posting the same type of headlines. My articles are getting indexed and some are on pages 2 and 3, and I haven’t really started building links to them yet, so it’s basically on-page SEO that’s ranking them a bit. But after reading this article and plugging my site into your tool, I don’t want to do all this work and then have problems down the road. Do you have any suggestions?

    Also, on H1 and H2: I see you only use H1 one or two times. How many H2s and H3s do you suggest? I usually use 1 or 2 H1s and 4 or 5 H2s.

    Thanks, and I am really looking forward to hearing back.

    1. Hey Jimmy,

      Thanks for the comment. I wouldn’t stress too much about title similarities. Duplicate content only becomes an issue when entire pages are duplicates, not just a few words.

      1. Thanks for the quick reply. I was stressing a bit about that because of my last ten articles posted. Do you pay much attention to the Google sandbox? I have never been there, but I plugged in my site today and a tool is saying it’s possible I’m in the sandbox. I checked Webmaster Tools and a few other things, and it doesn’t seem to be a problem anyplace else. Thanks.

  9. Completely agree with you on this. On-page SEO is not limited to the old school stuff. It’s about how good and valuable the content you give to your readers is. Today, readers are more intelligent and look for content that is new, good, and more detailed. The power of social media has thus enhanced the role of valuable content as a good SEO signal.

  10. Hey Nathan,
    Great article. Full of amazing advice I’m implementing on my own website. I have a blog on my website. On the blog page it shows snippets of the blog articles. When I put my website into Siteliner it tells me that this is duplicate content. Should I be telling the Meta Robots that this is a No follow page to avoid duplicate content?


  11. Great article.
    You mention IMG extensions. Are keywords in the IMG URL considered part of the density (i.e. should they be included in the 1-3%)?

  12. Hello.
    You say that all content should be unique, but what about Terms and Privacy pages? I see many sites in the top 10 that don’t have unique content there. I have also heard that Google allows non-unique content on these pages. I’m not sure that’s true, but it looks like a controversial point. What’s your opinion on unique content for Terms and Privacy pages?

    1. Hey Den,

      Thanks for the comment. Terms of service and other necessary pages don’t need unique content. In fact, noindexing those types of pages is a good idea.

  13. Hi Nathan, thanks for the great tips! I’m new to online marketing, and this is really helpful! Since getting started, I’ve been bombarded by “spin writers” and such promising to create a TON of content quickly, but you seem to say that search engines have become sophisticated enough to determine when your content is crap. Am I understanding that right?
