<p>Most SEO strategies focus on developing unique content, increasing inbound links and boosting rankings for target keywords and queries.</p>
<p>Add technical SEO to the list. A technical SEO audit of your website can shed light on how search engines view your site and interact with your content.</p>
<h2>What is Technical SEO?</h2>
<p>Technical SEO aims to make your site as accessible as possible to search engines, that is, to make your content easily crawled and indexed. This type of SEO does not focus on inbound strategies and tactics, such as writing the perfect blog post, or off-page strategies, such as link-building.</p>
<p>Technical SEO has grown in importance over the last two years, due to the increasing power of the AI behind search engine algorithms. If you’ve never conducted a technical SEO audit, now is the time.</p>
<hr />
<p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="" frameborder="0" height="215" src="https://www.youtube.com/embed/UC4nkNMM1mo?rel=0" title="What is Technical SEO?" width="360"></iframe></p>
<hr />
<h2>How to Conduct a Technical SEO Audit</h2>
<ol>
<li><strong>Start with Google Search Console (GSC), </strong>especially if you’re conducting your first technical SEO audit. It provides a wealth of data you can use to determine how Google crawls and indexes your site.
<ol>
<li><strong>The Performance Report:</strong> This report highlights the number of clicks your site receives from Google. It is a good “gut check” for determining how much traffic the search giant sends your way. Compare clicks against the size of your site: if you have 10,000 pages and attract only 600 clicks, you might have a problem. The report also highlights your average clickthrough rate (CTR) and the average position of your listings on search engine results pages (SERPs). These metrics, as well as your organic queries, can raise red flags you ought to see. For example, if you’ve been hacked, you might find that one of your top queries is unrelated to your products or services.</li>
<li><strong>The Coverage Report: </strong>This report highlights errors, warnings, valid pages and pages currently excluded from the search index. Many site owners worry when they see numerous pages excluded from Google’s index. Don’t panic. Instead, figure out why Google isn’t crawling and indexing those pages.
<ol>
<li><strong>Submitted URL has a Crawl Issue:</strong> This occurs when Googlebot tries to crawl your page but runs into an unspecified crawling error. The best way to handle these pages? Use the URL Inspection tool to crawl them individually. This will provide more insight into why each page isn’t being crawled.</li>
<li><strong>Submitted URL Not Found (404): </strong>As you may have guessed, this indicates the URL submitted doesn’t exist. Before setting up appropriate redirects, decide whether the error merits correction. If so, set up the redirects. <a href="https://support.google.com/webmasters/answer/7440203#fixing_404_errors" linktype="3" target="_blank">(Learn more about 404 guidelines here.)</a></li>
<li><strong>Submitted URL is a soft 404: </strong>Soft 404 errors are fairly common. They indicate that the page returned looks like a 404 page but doesn’t actually return a 404 error (i.e., page not found) to the search engine. To fix these errors, either redirect the URL to an appropriate page or return a “hard 404” error because the page no longer exists. <a href="https://support.google.com/webmasters/answer/181708?hl=en" linktype="3" target="_blank">(Read more about soft 404 errors here.)</a></li>
<li><strong>Submitted URL Blocked by Robots.txt:</strong> This occurs when your robots.txt file is blocking the resource. Again, inspect the specific URL for more information. A robots.txt file tells web robots and spiders which parts of a website they may crawl, so pages disallowed in your robots.txt file generally will not show up in search results. You can <a href="https://en.wikipedia.org/wiki/Robots_exclusion_standard" linktype="3" target="_blank">learn more about robots.txt files here</a> and in Step 4, below. If a page is disallowed in your robots.txt file but should be indexed, adjust the file accordingly.</li>
<li><strong>Submitted URL Marked No-Index: </strong>Like pages blocked by a robots.txt file, this indicates that the page should not be indexed. In these instances, however, the page is likely marked “noindex” by a meta tag or an HTTP header. Removing this directive will allow search engines to index the page and its content.
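<p>For reference, the directive typically takes one of two forms. Here’s a minimal illustration (the exact markup depends on your CMS):</p>
<pre><code>&lt;!-- In the page’s &lt;head&gt;: tells crawlers not to index this page --&gt;
&lt;meta name="robots" content="noindex"&gt;

&lt;!-- Or, for non-HTML files such as PDFs, sent as an HTTP response header --&gt;
X-Robots-Tag: noindex</code></pre>
</li>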
</ol>
</li>
<li><strong>Site Maps:</strong> The sitemap report section gives you an opportunity to submit your sitemap to Google. If you’ve already submitted one or more sitemaps, the report shows the last time each was crawled and whether the crawl succeeded. (If you haven’t submitted your sitemap, place this task atop your to-do list.)
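<p>If you’ve never seen one, a bare-bones XML sitemap looks like this (example.com is a placeholder; most content management systems generate the file for you):</p>
<pre><code>&lt;?xml version="1.0" encoding="UTF-8"?&gt;
&lt;urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"&gt;
  &lt;url&gt;
    &lt;loc&gt;https://www.example.com/&lt;/loc&gt;
    &lt;lastmod&gt;2019-03-01&lt;/lastmod&gt;
  &lt;/url&gt;
&lt;/urlset&gt;</code></pre>
</li>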
<li><strong>Mobile Usability:</strong> Mobile usability is pretty straightforward. This report highlights your site’s mobile friendliness. (With mobile-first indexing, this is extremely important for overall website SEO health.) Even if you have a responsive website, review this data to ensure that search engines can easily crawl all elements of your website.</li>
<li><strong>Links: </strong>This report highlights the external websites linking to your site, your top linked pages and the anchor text used to drive visitors to your site. In most cases, you don’t have to spend a lot of time in this section. Simply review the information to spot any spammy links you need to disavow.</li>
</ol>
</li>
<li><strong>Use Screaming Frog:</strong> Like GSC, <a href="https://www.screamingfrog.co.uk/seo-spider/" linktype="3" target="_blank">Screaming Frog</a> is a powerful, free tool for running an SEO audit. (Screaming Frog has a paid version, but the free version is fine for a baseline technical SEO audit.) The crawl report should focus on:
<ol>
<li><strong>Page Titles:</strong> The perfect page title will not boost you from position #52 to position #3 overnight. However, neglecting page titles entirely won’t help your SEO. Screaming Frog’s crawl report will identify pages that lack titles, reveal duplicate titles and highlight titles that are too long. As you address this report, ensure your most important page titles are optimized.
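<p>A healthy title tag is unique, descriptive and roughly 60 characters or fewer. A hypothetical example:</p>
<pre><code>&lt;title&gt;Women’s Trail Running Shoes | Example Outfitters&lt;/title&gt;</code></pre>
</li>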
<li><strong>H1s:</strong> Like a page title, the perfect H1 won’t make or break your SEO. However, it’s important that each page have only one H1, and that it accurately reflect the target keyword, phrase, or query for the page. (Bonus tip: Optimized H1s also help with accessibility.) Again, as you review the report and address issues, prioritize your top pages.
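<p>In practice, that means exactly one <code>&lt;h1&gt;</code> per page, with subtopics demoted to <code>&lt;h2&gt;</code> and below. A hypothetical example:</p>
<pre><code>&lt;h1&gt;Women’s Trail Running Shoes&lt;/h1&gt;
...
&lt;h2&gt;Waterproof Trail Shoes&lt;/h2&gt;</code></pre>
</li>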
<li><strong>Alt Text:</strong> This report sheds light on any images missing alt text. By exporting and reviewing the data, you can ensure you have a thorough alternative text description for every image. (Bonus tip: This also <a href="/Blogs/ADA-Compliance-Website-Checklist" linktype="8" target="_self">helps with accessibility</a> and improves the chance that your image will appear as a result in an image search.)
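<p>Alt text lives on the image tag itself; a concise, specific description beats a vague label or a keyword dump. A hypothetical example:</p>
<pre><code>&lt;!-- Vague --&gt;
&lt;img src="shoe.jpg" alt="shoe"&gt;

&lt;!-- Better --&gt;
&lt;img src="shoe.jpg" alt="Blue waterproof trail running shoe, side view"&gt;</code></pre>
</li>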
<li><strong>Pagination Report: </strong>If you have many products or you simply make use of previous and next directives on your site, the pagination report will highlight issues that may prevent search engines from accurately crawling this content. If you run a store, for example, and search engines can’t crawl all your products, your SEO suffers. This report can get pretty detailed, so if you’re worried about this type of error, <a href="https://www.screamingfrog.co.uk/how-to-audit-pagination/" linktype="3" target="_blank">check out Screaming Frog’s detailed documentation</a>.
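<p>The previous and next directives the report checks are link elements in each paginated page’s head, along these lines (the URLs are placeholders):</p>
<pre><code>&lt;link rel="prev" href="https://www.example.com/products?page=2"&gt;
&lt;link rel="next" href="https://www.example.com/products?page=4"&gt;</code></pre>
</li>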
<li><strong>Meta Descriptions:</strong> <a href="/Blogs/2019-SEO-Strategy-and-Meta-Descriptions" linktype="8" target="_self">Meta descriptions are becoming less and less relevant in SEO.</a> Still, give them at least some attention. Screaming Frog will let you know which pages are missing meta descriptions and which have duplicates. Because Google often rewrites meta descriptions anyway, you can move “optimizing meta descriptions” to the bottom of your SEO to-do list.</li>
<li><strong>Redirects: </strong>Monitoring your redirects is especially important if you recently launched a new site. Review redirects and URL response types. For example, if you removed pages as part of a redesign, use the response codes report to verify that your 301 redirects are set up correctly.</li>
</ol>
</li>
<li><strong>Check Your Site Speed:</strong> Site speed is an increasingly important SEO ranking factor. Check speed in any of several ways; two of the most popular are the page speed report in Google Analytics and the <a href="https://developers.google.com/speed/pagespeed/insights/" linktype="3" target="_blank">PageSpeed Insights tool</a>. The report within Google Analytics highlights slow load times on a browser-by-browser and page-by-page basis. PageSpeed Insights shows why your site loads more slowly than expected. Be warned: Unoptimized images are usually the biggest cause of slow load times. Be sure to <a href="/Blogs/Optimizing-Image-Sizes-On-Your-Website" linktype="3" target="_blank">optimize and compress your images</a> to improve your load time. This will make both your users and search engines happy.
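<p>Beyond compressing files, you can let the browser choose an appropriately sized image with <code>srcset</code>. A minimal sketch with placeholder file names:</p>
<pre><code>&lt;img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Hiker looking out over mountains"&gt;</code></pre>
</li>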
<li><strong>Review Your Robots.txt File:</strong> The robots.txt file tells bots, such as Googlebot and Bingbot, how to crawl your site and which sections they should ignore. View your robots.txt file by going to yourdomain.com/robots.txt in your browser. The file breaks down into a few parts (a sample file follows the list below):
<ol>
<li><strong>User-agent: </strong>This indicates which robots the instructions address. In most cases, you’ll see User-agent: *, which means the instructions apply to all robots. Be aware, though, that robots, especially malware bots, can ignore these instructions.</li>
<li><strong>Disallow: </strong>This section indicates which parts of your site you don’t want crawled. From an SEO perspective, these are pages you do not want to appear on a search engine results page. Unlike malware bots, search engines usually respect your robots.txt. If you want to make absolutely sure parts of your website never appear on a SERP, require users to log in to reach those pages. This is the only foolproof way to shield pages and files from crawlers.</li>
<li><strong>Sitemaps:</strong> Include your XML sitemaps in your robots.txt file. Although most search engines can find your sitemaps anyway, best practice calls for including them in the file. Most content management systems automatically create a robots.txt file for you or allow you to use a third-party tool to create one. For instructions on creating a robots.txt file, <a href="http://www.robotstxt.org/robotstxt.html" linktype="3" target="_blank">follow this link.</a></li>
</ol>
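<p>Putting the three parts together, a simple robots.txt file might look like this (the disallowed paths and sitemap URL are placeholders):</p>
<pre><code># Applies to all crawlers
User-agent: *

# Keep private or low-value sections out of the crawl
Disallow: /admin/
Disallow: /cart/

# Point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml</code></pre>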
</li>
<li><strong>Check for Duplicate Content:</strong> Everyone knows duplicate content is bad for SEO. However, many websites still have duplicate content issues. Address them. To identify duplicate content, I recommend SEMrush’s SEO audit tool. It will highlight the duplicate pages so you can start fixing them. (You can also use SEMrush’s content ideas to improve the content on those pages.) If you don’t have an SEMrush subscription, you can use Screaming Frog to identify some duplicate content errors, although its report is not as robust as the one SEMrush creates.
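<p>When two URLs must serve the same content, one common remedy (the right fix depends on the cause, so treat this as a sketch) is a canonical tag on the duplicate page pointing to the preferred version; the URL here is hypothetical:</p>
<pre><code>&lt;!-- In the duplicate page’s &lt;head&gt;: tells search engines which URL to index --&gt;
&lt;link rel="canonical" href="https://www.example.com/products/blue-widget"&gt;</code></pre>
</li>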
<li><strong>Identify (and Fix) Broken Links:</strong> As discussed above, GSC reports well on the websites linking to yours and on your most popular internal links, but it does not call out broken links as clearly. Many free tools do identify broken links, their locations and error types. Two of my favorites are <a href="https://www.deadlinkchecker.com/" linktype="3" target="_blank">Deadlink Checker</a> and <a href="https://www.brokenlinkcheck.com/" linktype="3" target="_blank">Broken Link Check.</a> You can also get a broken link report from Screaming Frog. Once you identify the broken links and their locations, update them, starting with your most important and most-visited pages.</li>
<li><strong>Review URL Structure: </strong>A review of all URLs adds great value to an SEO audit, yet URLs are often overlooked during SEO content development and optimization. This is especially true on larger sites with lots of products. Use Google Analytics or Search Console to export your URLs. Once you have the data, consider each URL. Does it include the page’s target keyword? Can you tell where the page sits in your site’s hierarchy just by looking at the URL? Does it use only product numbers instead of descriptions? (That’s bad.) Optimize your URLs with search in mind; descriptive URLs go a long way toward improved overall SEO.
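<p>To make the contrast concrete, compare two hypothetical product URLs:</p>
<pre><code>Hard to read:  https://www.example.com/catalog/item?id=83729
Descriptive:   https://www.example.com/mens-boots/waterproof-hiking-boot</code></pre>
</li>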
<li><strong>Check Mixed Content: </strong>We hope that by now your site is secured with an SSL certificate. Even so, mixed content issues can persist. They occur when a page served over HTTPS loads unsecured (HTTP) elements. For example, if you embed a resource through an iframe and the site you’re referencing is not secure, you have a mixed content issue. As browsers continue to clamp down on unsecured sites, this can hurt your credibility and SEO. Use <a href="https://www.missingpadlock.com/" linktype="3" target="_blank">MissingPadlock</a> to identify mixed content so you can fix these issues right away.
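<p>The fix is often as simple as referencing the secure version of the resource (illustrative URLs):</p>
<pre><code>&lt;!-- Mixed content: insecure resource on an HTTPS page --&gt;
&lt;iframe src="http://example.com/embedded-map"&gt;&lt;/iframe&gt;

&lt;!-- Fixed: load the same resource over HTTPS --&gt;
&lt;iframe src="https://example.com/embedded-map"&gt;&lt;/iframe&gt;</code></pre>
</li>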
<li><strong>Don’t Forget About Schema Markup:</strong> Schema markup is microdata added to a website to describe its content. <a href="/Blogs/What-is-schema" linktype="8" target="_self">(Learn more about Schema markup here.)</a> Many marketers are unfamiliar with Schema code and implement it incorrectly. If you’re using Schema but don’t see the rich snippets or enhanced SERP descriptions you expect, use <a href="https://search.google.com/structured-data/testing-tool/u/0/" linktype="3" target="_blank">Google’s Structured Data Testing Tool</a> or Screaming Frog to validate your code. The structured data report within GSC highlights all structured data across your domain.
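<p>Schema markup is most often added as a JSON-LD block in the page’s head. A minimal, hypothetical example for a local business:</p>
<pre><code>&lt;script type="application/ld+json"&gt;
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Outfitters",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St.",
    "addressLocality": "Milwaukee",
    "addressRegion": "WI"
  }
}
&lt;/script&gt;</code></pre>
</li>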
<li><strong>Consider Your Server Logs:</strong> Although more in-depth than steps 1-9, a review of server logs can identify major technical SEO issues. This is especially valuable if you’re conducting an SEO audit after a redesign. The logs show how bots, especially those from search engines, crawl your site: which pages they attempt to access and which response codes they receive. For example, this review can reveal that a search engine is trying to access an old URL, so you can set up a redirect. SEMrush recently launched a tool that allows you to upload and review your server logs within its interface. Free tools, such as AWStats, also let you review this information. But you’ll likely have to pay for a (good) log analyzer tool, and you’ll need a developer to pull the logs.
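<p>For instance, a line like this in an Apache-style access log (a fabricated example) shows Googlebot requesting a retired URL and receiving a 404, a clear signal to add a redirect:</p>
<pre><code>66.249.66.1 - - [14/Mar/2019:09:21:07 -0500] "GET /old-products/widget HTTP/1.1" 404 1042 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"</code></pre>
</li>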
</ol>
<h2>Summary</h2>
<p>Unique content and on-page optimization will always be important parts of SEO. However, technical SEO is just as important, and, unfortunately, often neglected. If you’re just developing your technical SEO strategy or if you’re unsure of how your site is being crawled, conducting a (mostly free) technical SEO audit can help identify areas of opportunity to boost rankings, increase traffic and, most importantly, grow sales and conversions.</p>
<p><a href="/Northwoods-2020/Services" linktype="2" target="_self">Learn more about our services</a>.</p>Northwoods Team/Northwoods-2020/People/NWS-Bug-Grey.pngAdd-In Type - NWS Data ModulesCategory - NWS Data ModulesCommittee - NWS Data ModulesDivision - NWS Data ModulesEvent Audience - NWS Data ModulesEvent Service - NWS Data ModulesEvent Type - NWS Data ModulesLocality - NWS Data ModulesModule - NWS Data ModulesPackage Type - NWS Data ModulesPersonID - NWS Data Modules
- Northwoods Team
ProductVersion - NWS Data ModulesRecorded Webinar TopicsRegion - NWS Data ModulesSite Display - NWS Data ModulesSkillLevel - NWS Data ModulesTopic - NWS Data ModulesVideoAudience - NWS Data ModulesVideoClassification - NWS Data ModulesVideoStatus - NWS Data ModulesTeamAdd-In Type - NWS Data ModulesCategory - NWS Data ModulesCommittee - NWS Data ModulesDivision - NWS Data ModulesEvent Audience - NWS Data ModulesEvent Service - NWS Data ModulesEvent Type - NWS Data ModulesLocality - NWS Data ModulesModule - NWS Data ModulesPackage Type - NWS Data ModulesPersonID - NWS Data Modules- Northwoods Team
ProductVersion - NWS Data ModulesRecorded Webinar TopicsRegion - NWS Data ModulesSite Display - NWS Data Modules- NWS Digital
SkillLevel - NWS Data ModulesTopic - NWS Data Modules- Digital Marketing
- SEO & Content Marketing
VideoAudience - NWS Data ModulesVideoClassification - NWS Data ModulesVideoStatus - NWS Data Modules02024-03-15T13:56:49.10000