By now you probably know that SEO involves strategically using keywords and links to rank your site higher in the search engine results pages (SERPs) for relevant searches.
That’s called on-page optimization, and it’s only half of the SEO puzzle. You also need to make sure your entire website has key features and characteristics to help search engines understand how important and safe your website is.
After all, search engines bank their reputation on their ability to quickly display exactly what users need. If unsafe and irrelevant websites made it to the top of the SERPs, users would lose faith and trust in the search engine.
In short, you need to run this 45-point SEO site audit to make sure your site says, “I’m worthy of the top spot!!” to Google, Yahoo, and Bing.
45-Point Checklist for Running an SEO Site Audit on WordPress
First Things First
1. Run a crawl of the site: To get the data you need in the first place, we recommend using Screaming Frog or DeepCrawl for this first step. It’s important to know what errors occur when search engine crawlers retrieve pages from your site for indexing. Get ahead of the game by running a crawl yourself and addressing any errors with redirects, meta tags, indexes, headers, content, etc. based on the results.
2. HTTP vs. HTTPS: HTTPS signals to search engines that data sent through your website is encrypted, which keeps user info safe. Remember, Google wants to display secure websites over non-secure sites. If your site is HTTP, you’re already at a disadvantage. You can force your WordPress site to HTTPS, or use an HTTPS plugin. Then, confirm that only the HTTPS version of your site is indexed by running a site:domain.com search in Google.
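On an Apache-hosted WordPress site (the most common setup), forcing HTTPS can be done with a rule along these lines in your .htaccess file. This is a sketch, not a drop-in: back up .htaccess first, and note that Nginx hosts and HTTPS plugins handle this differently.

```apache
# Redirect all HTTP requests to their HTTPS equivalent (301 = permanent)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

A plugin that enables SSL site-wide will typically write an equivalent rule (or hook the redirect in PHP) for you.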
3. WWW vs. non-WWW: Back in the day, typing WWW. before the domain name was an important step for getting websites to load correctly. Now, most Internet users take it for granted that a site will load without the WWW. Does yours? Use the site:domain.com command in Google to check, or visit your site, delete WWW from the URL, and hit enter. Both versions should work, with one 301-redirecting to the other as the canonical version. If not, there’s a potential for both users and crawlers to get confused.
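Assuming Apache again, a sketch of a rule that consolidates the www version onto the non-www version might look like this (invert the condition if you prefer www as your canonical host):

```apache
# Send www.example.com/anything to example.com/anything with a permanent redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [L,R=301]
```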
4. Max Load Time: Unfortunately, the longer a page takes to load, the more likely users are to abandon the page or “bounce.” When you run the crawl of your site, highlight pages that took more than five seconds to load. Even three seconds is considered too slow for some users. Figure out what’s dragging down that page to improve the page speed and user experience, both of which search engines will take into account.
5. Potential Site Speed Issues. Besides looking at individual pages that run slow, you also need to look at your site in general. Remember, the general rule of thumb is that your site should load in less than three seconds. Use tools like GTmetrix or Google’s PageSpeed Insights.
6. Mobile-Friendly Test Pass. Your site needs to be easy to read and navigate on small touch screens. You can run it through Google’s mobile-friendly test tool to check if any errors pop up.
Navigation and Sitemaps
7. Pagination: The easier you can make navigation for both users and crawlers, the better your SEO. Crawlers need to be able to find new content easily, and pagination (displaying a limited number of posts or products on a page and listing additional page numbers at the bottom) is preferable over infinite scrolling or a simple “next” button.
8. Faceted Navigation: If you have an eCommerce store with a lot of products (especially variable products) for sale, you want it to be easy for people to find exactly what they’re looking for. That’s where filters, or faceted navigation, come into play. The problem is that all of these filters can create duplicate content for search engine crawlers. There are multiple ways to address faceted navigation to improve SEO.
9. Navigation. You’ve already looked at a couple of aspects of navigation, including pagination and faceted navigation. But how’s the site’s navigation structure overall? The fewer steps (aka links) crawlers need to follow to find a page, the better. The organization of your menus and your internal linking structure are crucial factors here.
10. XML Sitemap. Upload your XML sitemap directly to the Google Search Console to make sure the pages are all indexed. You should also make sure it updates dynamically any time new content is added. You can also upload a new XML sitemap any time you finish a large renovation to help ensure all the new content is indexed faster.
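For reference, an XML sitemap is just a plain list of URLs in the sitemaps.org format. The entry below is a minimal illustration with placeholder URLs and dates; in practice, a WordPress SEO plugin generates and updates this file for you.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/sample-page/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
</urlset>
```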
11. HTML Sitemap. Another way to help search engine crawlers do their job is to include an HTML sitemap in the footer of your website. This document is easy for the crawlers to scan and helps them index your entire site.
12. Robots.txt. The Robots.txt file is another important document for site crawlers. Make sure it does not block mobile landing pages, and also check that it contains the URL for the sitemap.
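As a point of comparison while you audit, a typical WordPress robots.txt looks something like the sketch below (example.com is a placeholder). Note the Sitemap line, and that nothing here blocks content pages:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```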
300s, 400s, and 500s
13. 302 Redirects: 302 redirects tell search engines that the redirect is meant to only be temporary. This is useful if you’re renovating your site. But if you actually have no intention of removing those redirects, ideally you will change them to 301s, which are permanent. Why does this matter? Search engine crawlers use redirect information to replace the links they display in search engines. The last thing we want to do is confuse crawlers to the point where they don’t display our site at all. So, use Screaming Frog again to highlight any 302 redirects from your site crawl. Change them to 301 redirects.
14. 301 Redirects: Now that you have all your 302s changed to 301s, you’re all set, right? Well, not quite. You need to make sure your 301 redirects are not in a chain. In other words, you shouldn’t have a redirect leading to a redirect, going from A to B to C. Redirects should be A to B, period.
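To make the “A to B, period” rule concrete: given a list of redirect pairs exported from a crawl (hypothetical URLs below), a small script can flatten every chain so each source points straight at its final destination. This is a sketch of the bookkeeping, not a crawler.

```python
def flatten_redirects(redirects):
    """Collapse redirect chains so every source URL points at its final target.

    `redirects` maps source URL -> destination URL (one hop each),
    e.g. as exported from a Screaming Frog redirect report.
    """
    flattened = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Follow the chain until we reach a URL that is not itself redirected.
        while target in redirects:
            if target in seen:  # guard against redirect loops
                raise ValueError(f"Redirect loop involving {target}")
            seen.add(target)
            target = redirects[target]
        flattened[source] = target
    return flattened

# Hypothetical chain: /old-page -> /new-page -> /final-page
chains = {"/old-page": "/new-page", "/new-page": "/final-page"}
print(flatten_redirects(chains))
# Both sources now point directly at /final-page
```

Update your redirect rules to match the flattened map, so no visitor or crawler ever hops twice.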
15. 500 Errors. Take note of any pages that return a 500-level error, such as a 503. These errors are typically caused by a temporary problem with the server, but you should investigate anyway to make sure the issue is resolved quickly.
16. 400 Errors. The other set of errors to look for after your scan is the 400 range. A 404 error, for example, means that the URL points to a page that doesn’t exist. If you know that the pages do indeed exist, you can check out our post about how to fix the 404 error on WordPress.
Canonical Tags, Indexing, and On-Page Content
17. Canonical tags: If you have URLs that display very similar or even identical content, search engine crawlers don’t have a clear idea of which one to display for search results. Adding a canonical tag says, “Display this one!” On Screaming Frog, use the directives tab to check that canonical tags are in place.
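A canonical tag is a single line in the page’s head. The snippet below uses a placeholder URL; WordPress SEO plugins usually emit this tag automatically, so the audit is about verifying it points where you intend.

```html
<!-- In the <head> of every variant of the page, point at the preferred URL -->
<link rel="canonical" href="https://example.com/red-widgets/" />
```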
18. Canonicalized Pages. Next, highlight pages whose URLs are different from the canonical URL specified in the canonical tag in either the HTML or the HTTP header. Make sure these end up matching.
19. Pages Without Meta Canonical Tags. Highlight all pages without a meta canonical tag.
20. Mobile vs. Desktop URLs. Make sure that mobile and desktop URLs have the rel=”alternate” and rel=”canonical” tags respectively in order to show crawlers the relationship between the two.
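For sites that serve separate mobile URLs, the annotation pattern looks like the sketch below (placeholder URLs). The desktop page declares its mobile alternate, and the mobile page points its canonical back at the desktop page. Sites that use responsive design on a single URL can skip this entirely.

```html
<!-- On the desktop page (www.example.com/page/): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page/">

<!-- On the corresponding mobile page (m.example.com/page/): -->
<link rel="canonical" href="https://www.example.com/page/">
```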
21. Noindex Pages. If you don’t want crawlers to index certain pages, that’s fine. It makes perfect sense for faceted navigation, for example. But check that the pages indicated as Noindex are actually meant to be Noindex.
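For reference, a page is kept out of the index with a robots meta tag like the one below (most SEO plugins expose this as a per-page toggle rather than raw HTML). During the audit, confirm this tag only appears where you put it deliberately.

```html
<!-- Keep this page out of the index, but still follow its links -->
<meta name="robots" content="noindex, follow">
```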
22. Duplicate Content. Duplicate content can be very problematic for indexing and ranking. Your content should be as unique as possible, even when you need to explain the same services or concepts. To check, paste a snippet of the content you want to test into a Google search and wrap it in quotation marks so only exact matches are returned.
23. Hidden Content and Links. If you’ve done all your own SEO work, you may feel confident that this is not a problem on your website. However, some SEO companies use black hat techniques to stuff keywords and links onto pages without letting users notice. The same can happen if your site has been hacked. Look for white text on a white background, or a link that only uses a single letter as anchor text. Other techniques that you have to watch out for are positioning text behind an image or even offscreen using CSS, or setting the font size to 0. Search engines will penalize these techniques, so don’t let them slide.
24. Missing H1 Tags. H1 tags are header tags that apply to the page or post titles on your site. These tags are important for SEO because they represent yet another signal for crawlers to understand what the most important information is on the page. Highlight pages in your crawl that have missing H1 tags, and fix them.
25. Multiple H1 Tags. Each page or post should only have one H1 tag. H1 tags lose their power if there are multiples for the same page/post. Highlight pages with multiple H1 tags and evaluate the most important one to keep for each. The additional H1 headers on each page should be downgraded to H2 or H3.
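If you’d rather verify H1 counts yourself than trust a crawl export, a few lines of Python with the standard library’s HTML parser will do it. This is a sketch that checks one HTML string at a time; fetching each page is up to you.

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> opening tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def count_h1(html):
    parser = H1Counter()
    parser.feed(html)
    return parser.h1_count

page = "<html><body><h1>Title</h1><h1>Second title</h1></body></html>"
print(count_h1(page))  # 2 -> this page needs one H1 downgraded to H2
```

A count of 0 flags a missing H1 (item 24); anything above 1 flags this item.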
Meta Titles and Descriptions
26. Short Titles. You have 65 characters in which to post a page title, which helps both people and crawlers understand what the page is about when it shows up in the SERPs. Make sure you use all that space to your advantage. Use specific terminology, and don’t cut yourself short.
27. Max Title Length. If your page titles are too long, on the other hand, you’re not efficiently conveying what the page is about. Highlight titles over 65 characters in your Screaming Frog scan and edit them to make sure they are effective.
28. Duplicate Titles. Just like content, you want page titles to be totally unique so that search engines know which pages to display for which keywords. If you have the same titles for every page, you won’t send a clear signal to the search engines and will show up less often in the SERPs.
29. Missing Titles. Page titles are an important SEO signal for search engines. They shouldn’t be missing. Look at your Screaming Frog crawl to find pages with missing page titles, and then make sure you give them titles that are unique and an appropriate length.
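The four title checks above (too short, too long, duplicate, missing) reduce to a simple pass over URL-to-title pairs. The sketch below uses the article’s 65-character ceiling; the 30-character floor for “too short” is our own assumed threshold, so adjust it to taste.

```python
def audit_titles(pages, max_length=65, min_length=30):
    """Flag missing, over-length, under-length, and duplicate page titles.

    `pages` maps URL -> title string (empty string = missing title).
    The 65-character max follows the guideline above; the 30-character
    minimum is an assumption standing in for "too short to be useful".
    """
    issues = {"missing": [], "too_long": [], "too_short": [], "duplicate": []}
    seen = {}  # title -> list of URLs using it
    for url, title in pages.items():
        if not title:
            issues["missing"].append(url)
            continue
        if len(title) > max_length:
            issues["too_long"].append(url)
        elif len(title) < min_length:
            issues["too_short"].append(url)
        seen.setdefault(title, []).append(url)
    # Any title shared by more than one URL is a duplicate group.
    issues["duplicate"] = [urls for urls in seen.values() if len(urls) > 1]
    return issues

report = audit_titles({
    "/about": "About Us",  # under the assumed 30-character floor
    "/home": "",           # missing entirely
})
print(report["too_short"], report["missing"])
```

The same structure works for meta descriptions in items 30–32; only the length thresholds change.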
30. Short Descriptions. Meta descriptions appear on the SERPs underneath the page title. They provide additional information about the page that can’t be conveyed in the title alone. The goal is to present keywords in the meta description for the crawlers, but also to persuade people that clicking on your site will be worthwhile. In other words, they’ll find the answers they are looking for on your page. Highlight pages with meta descriptions shorter than 50 characters, and use all of the space available to you to be persuasive.
31. Missing Descriptions. If you don’t write meta descriptions, search engines will pull sample text from the page to display in the SERPs. Do you really want to leave it to chance whether compelling copy appears in front of potential customers? Metas can sometimes be the difference between ranking and not ranking, or a viewer clicking or not clicking. Write those descriptions.
32. Duplicate Descriptions. You don’t want to dedicate two different pages to the same topic because search engines can only choose one from your site to display. Likewise, each page needs a unique meta description to make it crystal-clear to search engines that they should display those pages for different search terms.
Google Search Console SEO Audit
33. Set Up Google Search Console. This useful tool is provided by Google and helps you understand what’s going on with your site. Before it can be of any assistance, you need to set up your account.
34. Search Console Penalties. On Google Search Console, look for manual actions against the domain. This information is located under Search Traffic > Manual Actions. Google relies on more than just robotic crawlers; real people also evaluate websites and determine whether you meet Google’s guidelines. This is where manual action penalties come from.
35. Search Console Structured Data. Also called “rich results,” structured data can include videos, logos, recipes, etc. Using the Google Search Console, you can determine if the crawlers have any trouble reading the structured data on your site.
36. Search Console International Targeting. If you have multiple versions of your site in different languages or for different geographic areas, check that the hreflang tag is functional.
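As a reference while you check, hreflang annotations look like the sketch below (placeholder URLs). Each language version should list all of its alternates, including itself, and x-default marks the fallback page.

```html
<!-- In the <head> of each language version of the page -->
<link rel="alternate" hreflang="en-us" href="https://example.com/en/">
<link rel="alternate" hreflang="de" href="https://example.com/de/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```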
37. Search Console HTML Improvements. If you are using the old version of the search console, you can still pull up a report that will tell you which pages should have their titles and descriptions rewritten. This report was discontinued with the new version of the search console.
38. Search Console Mobile Usability (the Same URLs). Check out this tab on the Google Search Console for any recommendations about improving mobile responsiveness.
39. Search Console Index Status. See exactly which pages on your site have been indexed by Google and which have not. This tab also provides some suggestions for how to make sure all your pages get indexed.
40. Search Console Content Keywords. Which keywords does Google associate with your pages? You have a keyword plan in mind for your SEO efforts. Did it translate well to Google? This is where you find out if you’re on track with keyword placement.
41. Search Console Remove URLs. Make sure important URLs haven’t been mistakenly removed from the index.
42. Search Console Robots.txt Test. Run this test on the search console to check for any errors in your Robots.txt file.
43. Search Console Sitemaps. Determine which sitemaps are active and how quickly they are being indexed.
44. Search Console URL Parameters. Use this tab to check whether URL parameters discovered by Google or indicated manually are correct.
45. Search Console Security Issues. This tab will tell you whether Google has detected any security issues on your site.
And that wraps up the SEO site audit checklist! Once you check each of these items off, you can be sure that your site has the best chance possible for ranking well. Of course, there’s more to SEO than just following these 45 steps. You still need to have useful content, a healthy network of links pointing to your site, and optimized images.
Feeling overwhelmed? We get it. Let us help! We can perform a thorough site audit for you, fix any problems we encounter, and report back with your site’s progress. Contact us today for a free consultation.