SEO audits are the single best way to figure out why you’re not getting SEO results.
It is the first activity my agency does when bringing on a new client.
In this article, I’m going to show you how you can perform a complete SEO audit in 2018.
Want access to the step-by-step SEO audit template and SOP in the video above? Go here to learn more.
Remember two things before you begin:
- The time investment for any audit is dependent on the size of your website.
- A good SEO audit is all about asking the right questions.
Here’s what this SEO audit checklist will be covering:
- Step 1: What Are Your Strategic Objectives?
- Step 2: Keyword Analysis
- Step 3: Competitor Analysis
- Step 4: Technical Analysis
- Step 5: Page Level Analysis
- Step 6: Content Analysis
- Step 7: User Experience Analysis
- Step 8: Link Analysis
- Step 9: Citation Analysis
Let’s jump in.
What is an SEO Audit?
An SEO audit uncovers ways for you to improve your SEO campaign.
The goal is to identify weak points in your campaign that are hurting your performance.
This process will give you a list of action items that you need to fix.
If you take action on this list, you should see improvements in your SEO performance.
When Should You Do An Audit?
As I mentioned, we always perform an audit when we bring on a new client.
But, we will also audit a current campaign every quarter.
This is to ensure that we didn’t miss anything and to identify any new problems.
An audit is always a good way to evaluate our performance.
There are two times we perform audits:
1. at the beginning of every new campaign
2. once a quarter
Now that you understand the basics, let’s jump into the first step of the SEO audit.
The Complete 9 Step SEO Audit
Follow these 9 steps and you will leave no stone unturned. Remember, a successful SEO campaign is the product of hundreds of positive ranking factors. That’s why it’s critical that you examine every detail of your campaign. You don’t have to be 100% perfect, but that should be the goal.
Make sure you download my SEO audit template and SOP because it will make this entire process more efficient (and save you a massive amount of time).
Step 1: What Are Your Strategic Objectives?
Goal: to determine what your long-term goals are for your SEO campaign and business.
I have said this before and I will say this again:
SEO is a means to an end.
It’s nothing more than a marketing channel to grow your business.
That’s why your Strategic Objectives should be what your business is trying to achieve through SEO.
Clear Strategic Objectives keep your campaign focused and help you achieve your goals.
If you already have a Strategic Objective, then this is the time to review it.
Are your objectives Specific, Measurable, Attainable, Relevant, and Timely (S.M.A.R.T.)?
You need to refine them if they’re not.
If you do not have Strategic Objectives for your SEO campaign, then now is time to create them.
Here are some examples of Strategic Objectives for an SEO campaign using the S.M.A.R.T. principle:
- “Blue Widget Inc. will easily increase its organic search visibility by 50% within the next 6 months.”
- “Blue Widget Inc. will easily grow from 20 linking root domains to over 100 linking root domains within the next 6 months.”
- “Blue Widget Inc. will easily grow its lead volume from organic search by 20% within the next 12 months.”
Your Strategic Objective should be a mix of SEO KPIs and business KPIs.
Now let’s go into keyword analysis.
Step 2: Keyword Analysis
Goal: to determine whether the current keyword targeting strategy is worth it. And, to find untapped keywords that could result in “easy” wins.
You need to reexamine your current set of keywords before jumping head first into your audit.
The first thing you want to ask is:
Are you targeting the right keywords?
Oftentimes, the keywords that businesses go after are way out of their league.
They think they can win on “homerun” keywords… But they will more than likely end up failing.
A good audit will help you determine the quality of your keywords.
More often than not, I will have the client target less competitive long tail keywords.
My team and I refer to these keywords as “easy wins”.
It is a good practice to review your current set of keywords.
You should do this on a quarterly basis.
It’s always better to focus your resources on keywords that are performing well.
Do not spread your resources across many keywords.
Isolate your winners and go after those.
But, now you are likely wondering:
How do I know if I’m targeting the “right” keywords?
Think of your keywords as goals.
Every keyword that you decide to target is a goal you want to achieve for your SEO campaign.
That means you need to use the S.M.A.R.T. principle.
You need to choose a specific set of keywords to target.
A list of a thousand keywords is not specific.
Choose 10, 20, or 100 keywords depending on your budget and resources.
You must measure the performance of your keywords.
There are some SEOs that say you shouldn’t track keywords anymore.
I agree that tracking keywords without tracking other important KPIs isn’t effective.
But, tracking your core keywords is an excellent way to see how Google is valuing your website.
It’s also a way to measure the impact of your link acquisition.
To measure the performance of your keywords, I use Ahrefs.
Are you targeting keywords that are beyond what your website is capable of?
The truth is:
New websites struggle to rank for competitive keywords.
- The websites that rank for competitive keywords are aged and trusted.
- These same websites will be more authoritative than yours because they have been acquiring backlinks for years.
- Since they are ranking for competitive keywords, that means they will also have a much larger budget than you. This will allow them to buy authoritative link placements to maintain their position.
You have to be realistic.
If your site is new, then you should target long-tail keywords.
Don’t let your ego determine what keywords you want to go after.
I’m not saying you are egotistical.
I’m saying that because I have let my ego determine my keyword selection process in the past.
It went something like this:
“Dude, I’m so good at SEO and I can literally rank for anything.”
That’s how I used to be.
Moral of the story: don’t let your ego dictate your campaign.
Be realistic and use the data to determine your path.
This should be obvious, but your keywords should be relevant to what your business does.
How long do you think it will take you to rank for your current set of keywords?
You need to set a deadline.
Remember, improving your site’s performance for a keyword is a goal. You should try to achieve that goal as fast as possible.
The S.M.A.R.T. principle is only the first step to validating your current keyword set.
You now need to analyze the competition for those keywords.
Step 3: Competitor Analysis
Goal: to validate your keywords and find missed link opportunities.
There are a few objectives for analyzing your competitors:
- To see whether a keyword is too competitive.
- To find new keyword opportunities.
- To see what types of content are performing well (so you can skyscraper them).
- To find link opportunities.
You need to analyze your competitors to validate your keyword selections.
You should be asking yourself:
- “Are my keyword selections too ambitious?”
- Or, “are my keyword selections too conservative?”
We split our competitor analysis into two segments.
The first is just a quick analysis of PA and DA in the SERPS.
You will need the Moz toolbar for this.
Let’s say we wanted to rank for the keyword “marketing automation”.
Enter “marketing automation” into Google and scan the results.
We look for websites that have a DA less than 50. In this case, there is one site ranking for the keyword “marketing automation” with a DA less than 50.
DA is a decent gauge for determining whether a keyword is worth going after or not.
At scale, this process is the quickest way to eliminate keywords from your list.
Keep in mind:
Competition is all relative.
For example, it would be foolish to target “marketing automation” if your website is new. But, if you have an established website with authority, then it may be something to consider.
The second analysis is more in-depth because we are trying to find link opportunities.
I won’t go too deep into this, but use Ahrefs or Majestic to analyze the link profiles of your competitors.
Are There Any Low Hanging Fruits / “Easy” Wins?
Now let me show you how you can find low hanging fruits.
We will use SEM Rush and Google Search Console for this.
- Go into Google Search Console and click on “Search Traffic” and “Search Analytics”.
- Select “Impressions” and “Position”.
- Then sort the results by “Position” with the lowest ranking position at the top.
These are low hanging fruits that you can target.
If your website isn’t ranking for any keywords, then you will need to use SEM Rush to find low hanging fruits.
- Go to SEM Rush
- Enter a competitor URL
- Go to “Organic Research” and “Positions”
- Sort the keyword list to show lowest search volume keywords
I prefer to start with the lower volume keywords because they are the easiest to rank for. Here are some low hanging fruits I found digging through BodyBuilding.com’s traffic data:
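If you would rather triage these exports in bulk, the filtering can be sketched in a few lines of Python. The sample rows, the position window, and the volume sort are all illustrative assumptions, not real export data:

```python
# Hypothetical rows exported from Google Search Console or SEM Rush.
# "position" is the average ranking position, "volume" the monthly searches.
rows = [
    {"keyword": "best protein powder", "position": 4.2, "volume": 22000},
    {"keyword": "creatine loading phase length", "position": 18.7, "volume": 390},
    {"keyword": "whey vs casein before bed", "position": 24.1, "volume": 210},
    {"keyword": "protein", "position": 45.0, "volume": 165000},
]

def low_hanging_fruits(rows, worst=30, best=10):
    """Keep keywords ranking just off page one (the 11-30 window is
    an arbitrary "striking distance" assumption), then sort the
    lowest-volume keywords first, mirroring the steps above."""
    candidates = [r for r in rows if best < r["position"] <= worst]
    return sorted(candidates, key=lambda r: r["volume"])

for r in low_hanging_fruits(rows):
    print(r["keyword"], r["position"])
```

Adjust the window and the sort to match your own budget and resources.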
Now let me show you how to perform a technical analysis.
Step 4: Technical Analysis
Goal: to identify technical issues that are hurting user experience and hurting your search engine performance.
Technical issues can plague your website’s SEO performance.
The good news is that you have tools like Screaming Frog SEO Spider at your side.
These tools will help you identify many of the prevalent issues.
How Fast Does Your Website Load?
How fast your website loads impacts user experience in either a positive or negative way.
That’s why it is at the top of the Technical Analysis checklist.
Any website that takes longer than 3 seconds to load has room for improvement. It is ideal if you can get your site to load in under 1 second, but this is challenging.
Here are some resources that will speed up your website:
- How to Improve Your Page Load Speed by 70.39% in 45 Minutes
- How To Speed Up Your WordPress Site – And Increase Organic Traffic By 39.1%
- 11 Low-Hanging Fruits for Increasing Website Speed (and Conversions)
Is the Website Mobile Friendly?
This is a no brainer, but you need to check whether your site is mobile friendly or not.
Google considers this to be a strong ranking factor, so do not take it lightly.
Use Google’s mobile friendly check for the analysis.
The solution is pretty simple here:
If your site isn’t mobile friendly, then make it mobile friendly.
Is There Keyword Cannibalization?
One of the most important factors to look for in an audit is keyword cannibalization.
“Keyword cannibalization” is when two pages are competing for the same keyword.
This can confuse Google and force it to make a decision on what page is “best” for the search query.
It’s always better to guide Google instead of letting it make decisions.
You must get rid of keyword cannibalization to achieve this goal.
There is one form of keyword cannibalization that is most common:
When you optimize the homepage and a subpage for the same keyword.
This is most common on the local level.
Let’s say it’s a local personal injury lawyer from Chicago.
The homepage title would look like this:
- “Chicago Personal Injury Lawyer | Awesome Law Firm”
At the same time the client will also have a subpage optimized like:
- “Best Chicago Personal Injury Lawyer | Awesome Law Firm”
This needs to be avoided.
Choose one page to optimize for “Chicago Personal Injury Lawyer” and unoptimize the competing page.
There is one other cannibalization issue you need to look for and it involves your blog.
There is nothing wrong with writing about the same topics more than once.
But in excess, it can cause some confusion.
Google will struggle to identify what page is most authoritative for that keyword.
Avoid short, thin articles that do not fully explain a topic.
There are exceptions to the rule, but thin content should be avoided for most businesses.
Remember that powerful and well-developed SEO content performs better in the search engines and will produce better user engagement.
On the contrary, publishing thin, underdeveloped content will likely lead to keyword cannibalization and Google may interpret your activity as long-tail keyword manipulation.
If that happens, the Panda algorithm will kick your website to the curb.
With that said, let me show you how you can quickly identify keyword cannibalization issues:
Open up Screaming Frog SEO Spider.
Enter your website and start the scan:
Go to “Page Titles”:
Enter one of your main keywords into the search bar (this will show you all pages competing for that keyword).
Look through your page titles and identify pages that might be competing for the same keywords.
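That title scan can also be automated. Here is a minimal sketch, assuming you have exported (URL, title) pairs like those Screaming Frog lists under the “Page Titles” tab (the sample pages are hypothetical):

```python
# Hypothetical crawl export: (URL, page title) pairs.
pages = [
    ("/", "Chicago Personal Injury Lawyer | Awesome Law Firm"),
    ("/personal-injury/", "Best Chicago Personal Injury Lawyer | Awesome Law Firm"),
    ("/blog/what-to-do-after-a-crash/", "What to Do After a Car Crash"),
]

def cannibalizing_pages(pages, keyword):
    """Return every URL whose title targets the given keyword.
    More than one hit means two pages are competing for it."""
    kw = keyword.lower()
    return [url for url, title in pages if kw in title.lower()]

hits = cannibalizing_pages(pages, "chicago personal injury lawyer")
if len(hits) > 1:
    print("Cannibalization:", hits)
```

Run it once per core keyword, then pick one winner and unoptimize the rest.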
Are There Redirect Issues?
There are five types of redirects that can hurt a website’s SEO performance:
- 302 redirects
- redirect chains
- non-preferred version of domain not 301ing to preferred
- non-secured version of domain not 301ing to secured version
- unnecessary 301s
Let’s start with 302 redirects.
To see if you have any 302s, open up Screaming Frog SEO Spider.
- Enter your target URL and start the scan
- Go to the “Response Codes” tab
- Click on the “Filter” dropdown and select “Redirection 3xx”
- Click on “Export” to export all 302 redirects
A redirect chain is a string of redirects connected together.
Breaking the chain will send all authority to the final destination page (instead of partial authority).
Here’s what it will look like when you fix a redirect chain:
Here’s how you find redirect chains with Screaming Frog SEO Spider:
- Go to “Configuration” and click on “Spider”
- Click on the “Advanced”, select “Always Follow Redirects”, and click “Ok”
- Enter your target URL and start the scan
- After the scan is complete go to “Reports” and click on “Redirect Chains”
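Resolving a chain is mechanical enough to sketch in code. This example assumes you have already built a redirect map (source URL to immediate target) from a report like the one above; the URLs are made up:

```python
# Hypothetical redirect map: source URL -> immediate target.
redirects = {
    "/old-page": "/old-page-v2",
    "/old-page-v2": "/new-page",
    "/legacy": "/new-page",
}

def resolve(url, redirects, limit=10):
    """Follow redirects hop by hop and return (final URL, hop count).
    The limit guards against redirect loops."""
    hops = 0
    while url in redirects and hops < limit:
        url = redirects[url]
        hops += 1
    return url, hops

final, hops = resolve("/old-page", redirects)
if hops > 1:
    # Fixing the chain: point the source straight at the destination.
    print(f"Chain of {hops} hops; redirect /old-page directly to {final}")
```

Any source with more than one hop should be repointed straight at its final destination.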
Is the non-preferred version of the domain 301 redirecting to the preferred version?
Every website owner must decide what version of their website they want to show to their users.
Some people prefer the “www” while others prefer non-www. domains. Understand that whichever one you pick will not have an effect on your SEO performance.
Google treats them the same way, so it is a matter of preference.
Problems arise if you don’t redirect the non-preferred domain to the preferred.
For example, let’s say you decide to go with “www.awesomewebsite.com”.
By doing so, www. becomes your preferred domain.
The non-www version then becomes your non-preferred domain (and vice versa if you choose non-www).
You must 301 redirect your non-preferred domain to the preferred. Otherwise, you will end up with two duplicate websites AND you will leak authority.
I have found that websites built on custom platforms will suffer from this issue.
The developers underestimate the repercussions of keeping two versions of the site live.
They often won’t 301 redirect the non-preferred version of a domain to the preferred.
In essence, if you do not redirect, you have two duplicate websites.
I use this tool to see if the proper redirection has been done.
Is the non-secure version of the website 301 redirecting to the secure version?
Let’s just say that the transition to SSL hasn’t been pretty.
Many websites have made a great decision to secure their sites with a certificate.
But, many are struggling with the implementation of the certificate.
Many clients forget to 301 redirect the non-secure (http) site to the secure (https). This has a similar effect to not redirecting a non-preferred domain to the preferred.
Identifying this issue is simple:
- Go to your target URL: https://www.gotchseo.com/.
- On the address bar in your browser, remove the “s” from https and hit enter.
It should redirect back to the secure version.
If it doesn’t, then you need to get it fixed!
You can also use the tool above to check as well.
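If you want to check all four domain variants in one pass, the logic looks like this. The lookup below is a stand-in for live HEAD requests (each variant mapped to the URL it ultimately resolves to), and every URL is hypothetical:

```python
# Stand-in for live HEAD requests: each domain variant mapped to the
# URL it ultimately resolves to. All values here are hypothetical.
resolved = {
    "http://awesomewebsite.com/":      "https://www.awesomewebsite.com/",
    "http://www.awesomewebsite.com/":  "https://www.awesomewebsite.com/",
    "https://awesomewebsite.com/":     "https://awesomewebsite.com/",  # missing 301!
    "https://www.awesomewebsite.com/": "https://www.awesomewebsite.com/",
}

PREFERRED = "https://www.awesomewebsite.com/"

def misconfigured_variants(resolved, preferred):
    """Return every variant that does not end up at the preferred URL."""
    return [v for v, final in resolved.items() if final != preferred]

print(misconfigured_variants(resolved, PREFERRED))
```

Every variant that shows up in the output needs a 301 pointed at the preferred version.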
Is the Site Being Indexed Well?
Your website can only get traffic if your pages are indexed in Google. That’s why it’s always a good idea to make sure your ENTIRE website is being indexed well.
A good place to start is with your robots.txt file.
Sometimes by accident, website owners will block the search engines from crawling their site.
That’s why you must audit your robots.txt file to ensure that your site is being crawled well.
The command you need to look for in your robots.txt file is “disallow”.
If you use this incorrectly, you could stop search engines from crawling your site.
The specific command you want to look for is “Disallow: /” – this instructs search engine spiders not to crawl your website.
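You can test what a given robots.txt actually blocks with Python’s standard-library parser. This sketch feeds it the accidental site-wide block described above:

```python
from urllib import robotparser

# A robots.txt that accidentally blocks the whole site.
bad_rules = ["User-agent: *", "Disallow: /"]

rp = robotparser.RobotFileParser()
rp.parse(bad_rules)

# Any page on the site is now off-limits to every crawler.
print(rp.can_fetch("Googlebot", "https://example.com/some-page"))
```

Swap in your real robots.txt lines (or point `set_url()` at the live file) to confirm nothing important is disallowed.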
Your website should have a sitemap because it helps with indexation.
If you are on WordPress, Yoast will automatically create one for you.
If you aren’t using Yoast then install the XML Sitemap plugin.
For those on custom builds or non-WordPress websites, you will have to take the traditional route.
Go to Google and search “site:yourwebsite.com”.
This will show you how well your site is indexed.
If your site isn’t showing as the first result, then you likely have a penalty.
Or, you are blocking the search engine from crawling your website.
Is There Duplicate Content?
Duplicate content can plague your website and could land your website a Panda penalty.
Ecommerce stores are most susceptible to duplicate content issues because they will copy manufacturer product descriptions.
To top it off, they will also use cookie-cutter META information for those pages.
This creates a duplicate content tsunami.
Let me show you the issues with duplicate META data first:
Duplicate META Data
Duplicate META data is most prevalent on Ecommerce websites.
This is because many Ecommerce websites have many pages with similar products.
As a result, they will get lazy and paste similar META descriptions on pages.
This isn’t a good practice.
If your Ecommerce site has many similar pages, then you should consider consolidating them. There is no reason to have several pages for different colors or sizes of the same product.
Once you have taken care of this issue, then you need to write unique descriptions for every single page.
Yes, that’s right. Every single page.
You should strive to have unique META data and unique content on every single page on your website.
This will take a ton of effort and resources, but it’s worth it in the end.
Remember: you don’t have to complete it in one day.
If you improve only 10 pages a day, you will have 3,650 optimized pages within a year.
To find duplicate META data you can use Screaming Frog SEO Spider and Google Search Console.
Let’s start with Screaming Frog:
Enter your URL and start the scan
Go to “Meta Description”, on “Filter” dropdown select “Duplicate”, and “Export”.
The next place to look for duplicate META descriptions is in Google Search Console.
Go into Google Search Console and go to “Search Appearance” and “HTML Improvements”:
In this section you’ll find duplicate META descriptions and title tags.
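Grouping exported META descriptions by their text makes duplicates obvious. A minimal sketch, with a hypothetical export:

```python
from collections import defaultdict

# Hypothetical export: URL -> META description.
metas = {
    "/widgets/blue-small": "Shop our great widgets at low prices.",
    "/widgets/blue-large": "Shop our great widgets at low prices.",
    "/widgets/red":        "Red widgets, hand-built and guaranteed.",
}

def duplicate_descriptions(metas):
    """Group URLs by description and keep groups used more than once."""
    groups = defaultdict(list)
    for url, desc in metas.items():
        groups[desc].append(url)
    return {d: urls for d, urls in groups.items() if len(urls) > 1}

print(duplicate_descriptions(metas))
```

Each group in the output is a set of pages that needs unique descriptions written.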
Page-Level Duplicate Content
Now that you have identified all duplicate META data, you need to find page-level duplicate content.
To perform this task you will need to use Siteliner.
This tool will show you what pages share the same or very similar content.
Go to Siteliner.com and enter your target website. Click on “Duplicate Content” and see what pages are suffering from it.
Keep in mind that this tool isn’t always accurate. For example, it may not know that you have “noindexed” your category pages. So, it will likely classify those pages as duplicate content. Use your best judgement.
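If you want a rough second opinion on a handful of pages, you can compare their body text yourself with Python’s `difflib`. The pages and the 0.85 similarity threshold are illustrative assumptions:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical body text for three pages.
pages = {
    "/widgets/blue-small": "Our blue widget is hand-built, tested, and shipped free.",
    "/widgets/blue-large": "Our blue widget is hand-built, tested, and shipped free.",
    "/about":              "Founded in 2005, Awesome Widgets serves customers worldwide.",
}

def near_duplicates(pages, threshold=0.85):
    """Compare every pair of pages; flag pairs whose text similarity
    meets the threshold (0.85 is an arbitrary starting point)."""
    flagged = []
    for (a, ta), (b, tb) in combinations(pages.items(), 2):
        if SequenceMatcher(None, ta, tb).ratio() >= threshold:
            flagged.append((a, b))
    return flagged

print(near_duplicates(pages))
```

As with Siteliner, use your best judgement on what the flagged pairs actually mean.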
Are There 404 Errors (With Link Equity)?
Not all 404 errors are equal.
First, let me dispel a common myth that “all 404 errors are bad for SEO”.
This isn’t true.
404s are an effective tool for telling search engines that a page no longer exists.
When a search engine like Google finds a 404, it will remove that page from the index.
For intentional 404 errors, this is exactly what you want.
Think about it: would you want someone to find this dead 404 page through a Google search?
Of course not.
Google removes them because they aren’t helpful for the user.
With that myth dispelled, there ARE 404 errors that can actually hurt your site’s performance:
404 pages that have backlinks.
These types of 404s are leaking authority on your site.
What you want to do is reclaim these backlinks by 301 redirecting the 404 page to a relevant page on your site.
If there isn’t a relevant page, then redirect it to the homepage.
To find 404 errors, I recommend you use Google Search Console:
Go to “Crawl” and “Crawl Errors”. Click on the “Not Found” tab to see your site’s 404 errors:
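Once you have the 404 list, prioritizing by backlinks and building the redirect plan is simple. This sketch assumes you have joined the “Not Found” export with backlink counts and drafted a manual mapping of dead URLs to their closest live pages (all names are hypothetical):

```python
# Hypothetical "Not Found" export joined with backlink counts,
# plus a manual mapping of each dead URL to its closest live page.
not_found = [
    {"url": "/old-guide",    "backlinks": 14},
    {"url": "/retired-tool", "backlinks": 0},
    {"url": "/2014-promo",   "backlinks": 3},
]
closest_live = {"/old-guide": "/new-guide"}  # no match -> homepage

def redirect_plan(not_found, closest_live, homepage="/"):
    """Build 301 targets for 404s that have backlinks; backlink-free
    404s can simply stay 404."""
    plan = {}
    for row in sorted(not_found, key=lambda r: -r["backlinks"]):
        if row["backlinks"] > 0:
            plan[row["url"]] = closest_live.get(row["url"], homepage)
    return plan

print(redirect_plan(not_found, closest_live))
```

The output is the list of 301s to implement, most-linked pages first.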
Is Your Site Architecture Efficient for SEO?
Many audits skip right past site architecture, but this is a big mistake.
Most websites are not designed with SEO in mind.
Weirdly, this isn’t always a bad thing. That’s because many businesses create their website based on what they believe the user wants.
You should always be user-centric with your SEO strategy.
But, you still need to guide and please the search engine at the same time.
A strong site architecture makes both the users and the search engines happy.
When examining site architecture ask the following questions:
- Is the navigation clean or is it cluttered?
- Are the internal links using effective anchor text?
- Can you improve the navigation to make it easier for users and the search engines?
Are the URL Structures SEO Optimized?
We always analyze the URL structures during the audit to make sure they are SEO friendly.
But, we are also careful at this stage.
You do not want to change URL structures if the client’s site is performing well.
The reason is because you have to 301 redirect the old URL to the new URL.
301 redirects are imperfect and won’t always pass the full trust and authority of the old URL.
This means you could end up losing rankings for an extended period of time.
Changing your URL to a more optimized and clean version will likely help your site in the long run.
You just have to be willing to lose some organic traffic upfront. Or, you can just avoid changing the URL at all.
Now, if the client isn’t ranking for anything, we will always suggest to change the URL structure (if it’s bad).
You have to use your discretion and remember that “if it ain’t broke, don’t fix it”.
In an attempt to game the search engine, some clients will keyword stuff their URLs. Keyword stuffing anything on your site is never a good practice. In fact, it will likely hurt your performance more than help it.
Here is an example of a keyword stuffed URL that we run into a lot:
You will notice that “cool widgets” is in the URL three times. Whether intentional or not, it will hurt a page’s performance.
I recommend removing the subfolder “cool-widgets” so the URL looks like this:
Are Internal Links Injected the Right Way?
Ineffective, non-strategic internal linking can confuse the search engines. Internal links should be clear and should use exact match anchor text.
If you have a page about “blue widgets”, then “blue widgets” should be your internal anchor text.
In my eyes, this seems like a pretty simple concept.
Unfortunately, I see this problem repeated over-and-over again when we audit sites.
Finding ineffective internal links isn’t easy…
You have to go page-by-page to identify them and fix them.
This is one of the most time-consuming on-site SEO changes you will encounter.
To prevent this from happening, just make sure you always follow good practices.
The majority of your internal link anchor text should use exact or partial match anchor text.
Step 5: Page Level Analysis
Goal: to ensure that each keyword-targeted landing page is optimized effectively.
Every audit must examine the quality of content and the optimization of each page.
Strong content without effective optimization won’t perform. Weak content with strong optimization also won’t perform.
You need both strong content and effective optimization to drive search engine traffic.
The first page-level optimization question you have to ask is:
Does this page satisfy search intent?
Satisfying search intent is critical for ranking well in Google. It doesn’t matter how long your content is. What matters is how well you satisfy search intent.
Read my SEO content guide to learn more about satisfying search intent (it’s critical).
The next step is to run the page through Copyscape.
Has the content been copied?
I don’t run the target page through Copyscape because I think my client is a liar.
It’s because there are some scummy people on the Internet who will steal content.
All you need to do is file a DMCA report to Google and they will remove the content from the index.
After we run each target page through Copyscape, we then examine the basics.
Is the keyword in the title?
Your target keyword for the page needs to be in the title. And, the keyword only needs to appear once.
Is the keyword in the META description?
Make sure the target keyword is in the META description. Do not stuff it in there more than once.
Is the target keyword within the first few sentences?
Your main keyword should appear once at the beginning of the content. This is to strengthen the relevancy of the page.
Is the URL SEO-optimized and clean?
The landing page should include the target keyword in the URL and the URL should be short and clean.
Does the ALT tag on the first image of the page contain the target keyword?
All of your ALT tags should be filled out, but your main keyword for the page should appear in the first image ALT tag.
Does the last sentence of the content include the target keyword?
The last sentence or conclusion is your chance to solidify the relevancy of the page. Make sure you include your keyword.
Are there internal links? If so, are they placed the right way?
As I mentioned before, if you have internal links, make sure they are using exact match anchor text.
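The checklist above is easy to automate once you have each page’s fields from a crawl. Every field name below is illustrative, not a real API, and simple substring matching is a deliberate simplification:

```python
# Hypothetical page fields pulled from a crawl; the field names
# are illustrative, not a real tool's output.
page = {
    "url": "https://example.com/blue-widgets/",
    "title": "Blue Widgets | Awesome Widget Co.",
    "meta_description": "Shop durable blue widgets with free shipping.",
    "first_paragraph": "Our blue widgets are built to last a lifetime.",
    "first_image_alt": "stack of blue widgets",
    "last_sentence": "Order your blue widgets today.",
}

def page_level_checks(page, keyword):
    """Run the page-level questions above as simple substring checks."""
    kw = keyword.lower()
    return {
        "keyword_in_title": kw in page["title"].lower(),
        "keyword_in_meta": kw in page["meta_description"].lower(),
        "keyword_in_intro": kw in page["first_paragraph"].lower(),
        "keyword_in_url": kw.replace(" ", "-") in page["url"].lower(),
        "keyword_in_first_alt": kw in page["first_image_alt"].lower(),
        "keyword_in_last_sentence": kw in page["last_sentence"].lower(),
    }

results = page_level_checks(page, "blue widgets")
print(results)
```

Any False in the output points to a specific on-page fix.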
This is all you need to analyze for page-level optimization. Now let me show you how you need to examine your content.
Step 6: Content Analysis
Goal: to determine whether or not the current content strategy is working. And, what needs to be improved to get more out of the content.
Your content analysis must explore both your keyword-targeted landing pages and any blog content that’s been published.
Analyzing content is the most time-consuming part of an SEO audit.
That’s because it is the most important part of the entire audit.
You can get all of the other parts of an SEO campaign right, but if your content is slacking, your results will not last.
You Need an Outside Perspective
It is critical that you bring in a third party to analyze your content strategy because you need an outside viewpoint.
It’s hard to examine and critique your own content because you will be biased.
You need an outside party to tell you the truth.
Most businesses do not have effective content strategies.
In fact, most don’t have a “strategy” at all.
Here are the questions you need to ask during your content analysis:
Is Your Content Unique and Original?
This should be a no-brainer, but the content on your site needs to be unique and original.
That means using your creative mind to come up with awesome ideas!
No regurgitated garbage. Taking the extra effort to create something original is worth it.
Is Your Content Useful and Informative?
In addition to your content being original, you also need to make sure it’s useful and informative.
That means it should inform, instruct, or solve a problem that your ideal customer has.
You must always consider your ideal customer when creating content.
The content on your site isn’t there to impress your co-workers.
Your content is there to serve and help your prospective customers.
Is Your Content Better Than Your Competitors’?
There is no point in creating content unless you believe it will be better than what’s currently ranking in the search engine.
Every single piece of content must have the intention to beat your competitors.
Otherwise, you are wasting your time.
Is Your Content Engaging?
Your users need to feel like you are speaking directly to them. “You” and “your” need to become your favorite words.
Is Your Information Accurate?
Don’t make up facts or statistics or falsify information.
Is Your Content Long Enough?
Longer content performs better in Google and this has been proven here.
You can also do your own research and see this demonstrated in the SERPS.
Are There Grammar and/or Spelling Errors?
I’ve said this many times but dnt rite lik dis. Use the Hemingway Editor if your writing is less than stellar.
Are There Broken Links?
Google hates when there are broken links in your content because it hurts user experience. Make sure you audit your pages to make sure your links are working correctly. Use this free broken link checker to find broken links on your site.
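A basic broken-link pass can be sketched with the standard library. The status lookup here is a stand-in for live HTTP requests, and the HTML snippet is made up:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href from anchor tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<p>See <a href="/guide">the guide</a> and <a href="/gone">this</a>.</p>'
# Stand-in for live HTTP checks: URL -> status code.
statuses = {"/guide": 200, "/gone": 404}

collector = LinkCollector()
collector.feed(html)
broken = [u for u in collector.links if statuses.get(u, 0) >= 400]
print(broken)
```

In practice you would replace the `statuses` dict with real requests (and respect rate limits).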
Do You Have Excessive Ads?
Excessive ads take away from your content, are distracting, and will make users hate your website.
When users hate your website, Google will hate it as well.
If you use ads, do not let them overwhelm your content or Panda will be paying your website a visit.
Are You Moderating Your Blog Comments?
Spammers love to inject nasty links in blog comments.
That’s why you need to make sure yours are properly moderated.
You don’t want to be guilty by association, so make sure you keep your comment section clean.
These questions are the first step to determining whether your content strategy is working or not.
The ultimate indicator of your content’s performance will come from real user experience data.
Step 7: User Experience Analysis
Goal: to see how well users are interacting with your content and website as a whole.
It is impossible to know what every user thinks about your website.
Fortunately, you can get a general picture of user experience based on the data inside Google Analytics.
There are a few data points you want to examine in your user experience analysis.
Bounce Rate
You are likely wondering: “what is a good bounce rate?”
Unfortunately, there isn’t a clear answer.
Bounce rate is all relative and depends on what type of website it is.
For example, a “funny cat pictures” website will likely have a high bounce rate.
That’s because people go to the page, get their laughs in, and leave.
Sites like mine will have lower bounce rates because people will want to read and learn more.
With that all said, a bounce rate between 60% – 80% is solid.
80% – 90% is enough to warrant looking into the issue further.
If it is above 90%, then it needs to hit the top of the priority list.
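Those thresholds translate into a trivial triage helper. A minimal sketch (the "solid" default for rates under 60% is a simplification):

```python
def bounce_rate_triage(rate):
    """Apply the thresholds above: 60-80% is solid, 80-90% is worth
    investigating, and above 90% goes to the top of the priority list.
    Rates below 60% are simply treated as solid here."""
    if rate > 90:
        return "top priority"
    if rate > 80:
        return "investigate"
    return "solid"

for r in (72, 85, 94):
    print(r, bounce_rate_triage(r))
```

Run it over per-page bounce rates from Google Analytics to build a quick worklist.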
Average Time Spent on Site
The longer users stay on your site, the more chances you get to convert them.
Like bounce rate, average time spent on site is relative.
If the average time spent on site is less than 1 minute, then it’s definitely something you will want to look into.
As a general rule of thumb, users will spend more time on your website if there is a lot of content to consume.
For example, my readers spend an average of 2:52 minutes on Gotch SEO.
If this was less than 1 minute, I would have to start questioning my content strategy and my site in general.
There is one thing that will quickly repel users:
A lack of quality content.
Low average time spent on site often plagues local businesses for this exact reason.
That’s because anyone looking for a “plumber in St. Louis” is likely price shopping.
They will jump from business-to-business looking for the best deal.
The best way to combat this problem on the local level is to produce more helpful content.
You should focus on educating your prospective local clients.
Education and transparency lead to trust.
Focus on giving more value than your competitors.
This will improve bounce rates and force users to stay on your site for longer.
Think about it this way: if someone wanted to get to know you, would they learn more in 30 seconds or in 3 minutes?
Yes, I am captain obvious, but it’s necessary.
The longer users stay on your site and digest your content, the more they will feel like they “know you”.
Goal completions are the most important metric to track in Google Analytics.
The only reason your business should even have a website is to get conversions/goal completions.
It doesn’t matter if your bounce rate is low or people are staying on your website for hours. If those visitors aren’t converting into leads, sales, or email subscribers, then you are wasting your time.
The goal of improving the other metrics is to make you more money!
Remember, SEO is just a means to an end. SEO by itself doesn’t make money.
YOU make money by selling.
You can have the best SEO on the planet, but if you can’t sell, it won’t matter.
The word “sell” will have a different meaning for everyone.
But there is one thing that every online business has in common:
You must sell through copywriting or through video. If you skip this step, then no one will buy your products and no one will become a lead.
With that said, whenever goal completions are abysmal we immediately look at the client’s on-site sales strategy.
- Is it easy for leads to contact you?
- Is there enough information about your service?
- Are you showing enough social proof?
Identifying the pages users leave from the most is the first step to fixing the issue. It should be obvious, but start by analyzing the most frequently exited pages.
You have to ask the simple question “why are they leaving this specific page more than others?”
Believe it or not, it’s not always a bad thing to have a high exit rate on a page.
Sometimes the content does its job for the reader and forces them to go out and take action.
Don’t always think that users are leaving a specific page because they hate it.
If the content solves the user’s problem well and they leave the page, you have done your job.
There is one very important thing to consider when examining Exit Rate inside of Google Analytics.
Do not look at the total number of “Exits”.
The total number of exits will always be higher on pages that get more traffic.
The number you want to look at is the “% Exit”.
Sort your data from the highest percentage to the lowest.
A “high” exit percentage would be anything over 80%. A “normal” exit percentage is around 50-65%.
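As a rough sketch, here is how you might sort an exported exit-page report and surface the pages worth reviewing. The row format and column names (`Page`, `% Exit`) are assumptions for illustration, not the exact Google Analytics export headers:

```python
def pages_to_review(rows, high_exit=80.0):
    """Sort pages by exit percentage (highest first) and return those
    above the 'high' threshold from this section. `rows` is a list of
    dicts as you might load from an analytics CSV export."""
    ranked = sorted(rows, key=lambda r: float(r["% Exit"]), reverse=True)
    return [(r["Page"], float(r["% Exit"])) for r in ranked
            if float(r["% Exit"]) > high_exit]

report = [
    {"Page": "/pricing", "% Exit": "85.2"},
    {"Page": "/blog/guide", "% Exit": "55.0"},
    {"Page": "/contact", "% Exit": "91.7"},
]
print(pages_to_review(report))
# [('/contact', 91.7), ('/pricing', 85.2)]
```

Sorting by the percentage rather than total exits keeps high-traffic pages from dominating the list for the wrong reason.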
The #1 issue that will force people to leave a page at a high frequency is that your content did not solve their problem or answer the questions they had.
There are other factors that may force people to leave a page like design, but content is almost always the culprit.
Go to the page with the highest exit rate and ask:
- Does this page solve a problem or answer a question to the fullest extent?
- Are there still some questions left unanswered?
- How is the readability of the content?
- Are there too many big blocks of text?
- Too few images?
- Broken images?
- Does the page load slowly?
- Are there distracting elements such as advertisements that would send a user off your site?
- Are you setting external links to “open in a new window” (if not, you should)?
These questions should be more than enough to get to the bottom of the issue. Go through this process for every page with a high exit rate.
The quantity of visitors who return to your website is a strong positive user signal.
It means that your website or content is worth seeing again.
Return visitors are also good from a conversion standpoint because each return visit gives you another opportunity to convert them into a lead or email subscriber.
If you do not have a high percentage of Return Visitors then this may be a sign that your content is lacking. Or, your website has one or many of the technical or content issues that I described above that are repelling your users.
Like Return Visitors, branded searches are a strong indicator that people are interested in your website and brand.
If you are producing great content and your website is built with users in mind, then people will want to return. That means they will go into Google and search for your brand.
To see how well you are currently doing, you will need to use Google Search Console.
Go to “Search Traffic” and click on “Search Analytics”. Filter by “Clicks” so that the search query with the most clicks is on top.
Your brand name should be one of the top queries.
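One way to quantify this is to estimate what share of your clicks come from branded queries. This is a sketch assuming you have assembled a query-to-clicks mapping from a Search Console export; the data shape and the brand terms are illustrative:

```python
def branded_click_share(queries, brand_terms):
    """Estimate the share of Search Console clicks that come from
    branded queries. `queries` maps query -> clicks; `brand_terms`
    are lowercase variations of your brand name."""
    total = sum(queries.values())
    branded = sum(clicks for q, clicks in queries.items()
                  if any(term in q.lower() for term in brand_terms))
    return branded / total if total else 0.0

data = {"gotch seo": 400, "seo audit": 250,
        "gotch seo academy": 150, "link building": 200}
print(round(branded_click_share(data, ["gotch"]), 2))  # 0.55
```

A healthy, growing branded share over time suggests your content is building brand recall.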
Social signals by themselves are not powerful.
BUT, if you combine them with all of the other positive user metrics, then your website will get a whirlwind of positive ranking signals.
Getting REAL social signals should be a priority for your business. The only way to get them is through creating great content and pleasing your users. You can also consider using social locker plugins if you are really struggling.
Now it’s time to take a look at your link profile.
Here we go:
Step 8: Link Analysis
Goal: to identify strengths and weaknesses in your link profile.
As you know, backlinks can make or break an SEO campaign. This is why a large portion of our audit is spent analyzing the client’s link profile. We use Ahrefs, Majestic, Open Site Explorer, and Google Search Console to analyze the links.
Now you are probably wondering: what are we looking for?
We are looking at a few different factors:
Link relevancy is king when it comes to link building.
That’s almost always where I begin a link audit.
Are the backlinks hitting their site relevant?
Not all of your backlinks have to be relevant, but the majority should be.
To quickly identify the relevancy of a client’s link profile, we export their links from Ahrefs and use the bulk check on Majestic.
When you export from Ahrefs, make sure you export the referring domains like so:
Now you are going to take those referring domains and use Majestic’s bulk check to see Topical Trust Flow Topics.
Although the Topical Trust Flow Topic metric isn’t perfect, it is the only scalable relevancy metric there is.
Manually checking the relevancy of each linking site would be a horrible waste of time.
The goal of this exercise is to get a general relevancy picture of the DOMAINS that are linking to the client’s site.
Go to “Tools” > “Link Map Tools” > “Bulk Backlinks”.
Place the referring domains into the bulk checker and export the results. Sort your CSV file based on Topical Trust Flow Topics.
Identify what link sources are completely off the wall.
If you are a lawyer and you have a backlink from a domain with a Topical Trust Flow Topic of “Pets”, then you should be concerned.
Mark all backlinks that are irrelevant. This doesn’t mean you are going to get them removed.
It’s just a way for you to know that they exist. That way, you could go back to them if your site was ever hit with a penalty.
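The flagging step can be sketched as a simple filter over the bulk-check results. The domain-to-topic mapping below is a hypothetical stand-in for what you would build from Majestic’s Bulk Backlinks CSV export:

```python
def flag_irrelevant_domains(domains, relevant_topics):
    """Return referring domains whose Topical Trust Flow Topic falls
    outside the set of topics relevant to your niche. `domains` maps
    referring domain -> topic."""
    return sorted(d for d, topic in domains.items()
                  if topic not in relevant_topics)

referring = {
    "lawblog.example.com": "Society/Law",
    "catpics.example.com": "Recreation/Pets",
    "bizjournal.example.com": "Business",
}
print(flag_irrelevant_domains(referring, {"Society/Law", "Business"}))
# ['catpics.example.com']
```

Keep the flagged list on file; as noted above, you aren’t removing these links, just documenting them in case of a penalty.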
After link relevancy, link authority comes in a close second.
In fact, pure authority can sometimes mask a lack of link relevancy.
I prefer relevancy before authority because I believe it keeps your site safer from algorithm updates.
But to each their own!
There are several ways to find how “authoritative” your backlinks are.
You can run a bulk check on both Majestic and Ahrefs.
Ahrefs “Domain Rating” (DR) is an accurate gauge of site authority.
It is much more accurate than PA and DA because it updates on a frequent basis.
The data from Open Site Explorer updates at a snail’s pace and is inaccurate most of the time.
Don’t believe me?
Open Site Explorer gives GotchSEO.com a DA of 25 and claims the site only has 30 linking root domains…
Ahrefs is showing 562 linking root domains and it’s only showing about 80% of the backlinks GotchSEO.com actually has.
With that said, you can use Open Site Explorer to crosscheck, but don’t rely on its metrics alone.
Another metric that is nearly impossible to “game” is the SEM Rush traffic score.
That’s because it is based on real organic search engine rankings.
SEM Rush uses its own algorithm to determine how much your organic traffic is “worth”.
It’s not perfect, but it’s a metric I rely on daily to determine the quality of link opportunities.
Use all of the metrics at your disposal to gauge the quality of your current backlinks or opportunities.
Diversifying your backlinks makes your profile more “natural”.
Different “types” of backlinks include:
- contextual links
- site-wide footer/sidebar links
- directory links
- resource page links
- niche profile links
- forum links
- relevant blog comment links
In addition to the “type” of backlink, you also want to have diversity with Follow and NoFollow links.
At this part of the analysis, just ask the simple question:
“Is my link profile diversified enough?”
Another important link factor you need to examine is the ratio of homepage links compared to deep links.
If you are using a content-focused SEO approach, then the majority of your backlinks should be going to deep pages.
Regardless of what approach you are using, it is always a good practice to distribute backlinks across your entire website.
This will build the overall authority of the site and improve your chances of seeing SEO results.
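To check your own ratio, you can split backlink target URLs into homepage links and deep links. The list of target URLs is an illustrative assumption for what a backlink-tool export would give you:

```python
from urllib.parse import urlparse

def deep_link_share(backlink_targets):
    """Return the share of backlinks pointing at deep pages rather
    than the homepage. A URL with an empty or '/' path counts as a
    homepage link."""
    if not backlink_targets:
        return 0.0
    deep = sum(1 for url in backlink_targets
               if urlparse(url).path not in ("", "/"))
    return deep / len(backlink_targets)

targets = ["https://example.com/", "https://example.com/blog/post",
           "https://example.com", "https://example.com/services"]
print(deep_link_share(targets))  # 0.5
```

For a content-focused campaign, you would want this share to lean well toward deep pages.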
Anchor Text Diversification
Anchor text abuse is rampant and that’s why we always check the ratios.
The first ratio we care about the most is the client’s percentage of exact match anchor text.
After that, we want to see their percentage of branded anchor text.
If the exact match anchors outweigh branded anchors, then there needs to be a change of strategy.
As you may know, the bulk of your anchor text profile should be branded anchors.
Exact match anchors should be used few and far between because they are a strong spam signal to Google.
If the client is suffering from over-optimized anchor text, there are a few solutions:
- Build new backlinks with branded anchor text to offset the over-optimization
- Consider getting some of the EMA changed to branded anchor text
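The ratio check itself can be sketched like this. The anchor-to-count mapping and the term lists are hypothetical examples, not a specific tool’s export format:

```python
def anchor_ratios(anchors, exact_match_terms, brand_terms):
    """Compute the shares of exact match and branded anchor text in a
    link profile. `anchors` maps anchor text -> number of backlinks."""
    total = sum(anchors.values())
    exact = sum(n for a, n in anchors.items()
                if a.lower() in exact_match_terms)
    branded = sum(n for a, n in anchors.items()
                  if any(b in a.lower() for b in brand_terms))
    return exact / total, branded / total

profile = {"Gotch SEO": 60, "gotchseo.com": 25,
           "seo audit": 10, "click here": 5}
exact, branded = anchor_ratios(profile, {"seo audit"}, {"gotch"})
print(round(exact, 2), round(branded, 2))  # 0.1 0.85
```

A profile like this one, with branded anchors dominating, is what you want to see; the reverse is a red flag.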
Total Referring Domains
The more unique referring domains a site has linking to it, the better.
The analysis we do here is nothing more than a comparison against their top ranking competitors.
For example, how many referring domains do they have linking to them compared to their competitors?
The solution is simple here:
Get more relevant, high quality backlinks from unique domains.
Historical Link Velocity
Has their link velocity stayed steady throughout the life of their website? Or has it been erratic?
Massive dips caused by link loss are suspect.
Backlinks from real websites rarely fall off.
Backlinks from artificial websites fall off when the link provider stops paying for hosting or doesn’t renew a domain.
Your goal should be to achieve steady link growth over time.
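A quick way to spot erratic growth is to look at month-over-month changes in referring domains. The cumulative counts below are illustrative; you would pull the real history from your backlink tool:

```python
def monthly_link_velocity(monthly_ref_domains):
    """Given cumulative referring-domain counts per month (oldest
    first), return the month-over-month changes so big dips and
    spikes stand out."""
    return [b - a for a, b in
            zip(monthly_ref_domains, monthly_ref_domains[1:])]

history = [120, 135, 150, 90, 160]
print(monthly_link_velocity(history))  # [15, 15, -60, 70]
```

In this example, the -60 dip followed by a +70 spike is exactly the kind of erratic pattern that warrants a closer look.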
Now that you know how to analyze your link profile, let me show you how to analyze your citations.
Step 9: Citation Analysis
Goal: to see whether or not the client has consistent NAP-W information across all listings. And, to identify business directories that the client is not listed on.
The citation analysis is used for local clients.
However, it can be used for any business who is looking to maintain consistency across all online properties.
I recommend a citation audit for every business, even if you aren’t engaging in local SEO.
The good news is that citation cleanup is a one-and-done activity.
Let me show you what we look for in a citation analysis:
Consistent NAP-W (name, address, phone, website) information is one of the most important ranking factors in Google Local.
There are countless tools for auditing your citations. With hundreds of business directories to submit your site to, it’s best to use a tool. Once again, we use Bright Local’s Citation Tracker, Whitespark, Moz Local, and Yext to find these untapped citations.
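At its core, a NAP-W consistency check is a field-by-field comparison across your listings. Here is a minimal sketch with hypothetical listing data:

```python
def nap_inconsistencies(listings):
    """Compare NAP-W fields across directory listings and report any
    field whose value differs between directories. `listings` maps
    directory name -> {field: value}."""
    issues = {}
    for field in ("name", "address", "phone", "website"):
        values = {d: info.get(field) for d, info in listings.items()}
        if len(set(values.values())) > 1:
            issues[field] = values
    return issues

listings = {
    "Yelp":   {"name": "Acme Plumbing", "address": "1 Main St",
               "phone": "314-555-0100", "website": "acme.example.com"},
    "Google": {"name": "Acme Plumbing", "address": "1 Main Street",
               "phone": "314-555-0100", "website": "acme.example.com"},
}
print(sorted(nap_inconsistencies(listings)))  # ['address']
```

Note that a real audit tool also normalizes formatting (e.g. “St” vs “Street”) before comparing; an exact string match like this one is intentionally strict.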
Wow, that was super long, but I really didn’t want to leave anything out! You are now equipped to perform a comprehensive SEO audit whenever you want.
Don’t want to do this SEO audit yourself? Check out our new SEO audit service because we handle the entire process.