Technical SEO

How to Do a Technical SEO Audit: Step-by-Step

Paul Donnelly · 8 min read

A technical SEO audit is a systematic evaluation of your website's technical foundations — the elements that determine whether search engines and AI platforms can crawl, understand, and index your content effectively.

This guide walks through a complete audit process, covering the six core areas that every technical audit should address.

Before You Start: Tools You'll Need

A thorough technical audit requires a combination of tools:

  • Screaming Frog SEO Spider — the industry standard for crawl analysis (free up to 500 URLs, paid for larger sites)
  • Google Search Console — indexation, coverage, Core Web Vitals, and search performance data
  • PageSpeed Insights / Lighthouse — Core Web Vitals and performance analysis
  • Ahrefs or Semrush — backlink analysis, redirect auditing, keyword data
  • Browser DevTools — for diagnosing specific page issues

Export data from each tool and work through the audit areas systematically.

Step 1: Crawlability

The first question: can search engines and AI platforms access your site?

Check robots.txt

Navigate to yourdomain.com/robots.txt. Verify:

  • No important sections are accidentally blocked (check for overbroad Disallow: / rules)
  • Your XML sitemap is referenced (Sitemap: https://yourdomain.com/sitemap.xml)
  • AI crawler access is configured intentionally:
    • OAI-SearchBot — allow for ChatGPT Search citation
    • PerplexityBot — allow for Perplexity citation
    • Google-Extended — allow for Google AI Overviews
    • GPTBot — your choice (training data only; blocking this doesn't affect ChatGPT Search citations)
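Taken together, an intentional configuration might look like the sketch below. Every URL is a placeholder, and whether to block GPTBot is a policy decision, not a requirement:

```txt
# AI search crawlers that drive citations: allow
User-agent: OAI-SearchBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

# Training-only crawler: allow or block per your content policy
User-agent: GPTBot
Disallow: /

Sitemap: https://yourdomain.com/sitemap.xml
```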

Run a full site crawl

In Screaming Frog, configure the spider to match Googlebot's user agent and run a crawl of your entire site. Export the results.

Focus on:

  • Response codes — identify all 4xx (client errors) and 5xx (server errors) pages
  • Redirect chains — find A → B → C redirect chains and flag them for consolidation
  • Crawl depth — pages more than 4–5 clicks from the homepage may be crawled infrequently
  • Orphaned pages — pages in the sitemap but not reachable via internal links
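A quick way to triage response codes from a crawl export, assuming a CSV with "Address" and "Status Code" columns (Screaming Frog's internal export uses these headers, but verify against your version):

```python
import csv
import io
from collections import Counter

# Sample rows standing in for a Screaming Frog "Internal" export.
sample_export = """Address,Status Code
https://example.com/,200
https://example.com/old-page,301
https://example.com/missing,404
https://example.com/broken,500
"""

def summarise_status_codes(csv_text):
    """Tally response codes by class (2xx/3xx/4xx/5xx) and collect error URLs."""
    classes = Counter()
    errors = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        code = int(row["Status Code"])
        classes[f"{code // 100}xx"] += 1
        if code >= 400:
            errors.append((row["Address"], code))
    return classes, errors

classes, errors = summarise_status_codes(sample_export)
print(classes)  # counts per status class
print(errors)   # 4xx/5xx URLs needing attention
```

The same loop scales to exports of any size; point `csv.DictReader` at the file instead of the inline sample.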

Review crawl stats in Search Console

Under Settings → Crawl Stats in Search Console, review how frequently Googlebot is crawling your site and whether crawl errors are reported. Sudden drops in crawl rate or spikes in crawl errors warrant investigation.

Step 2: Indexation

Crawlability and indexation are separate concerns. A page can be crawled without being indexed.

Check for noindex issues

In Screaming Frog, filter for pages with noindex directives. Export the list and verify:

  • Are all noindex pages genuinely intended to be excluded from the index?
  • Are any important pages accidentally carrying a noindex tag?

Common noindex problems: staging site settings pushed to production, CMS template settings applied too broadly, developer-added tags left in after testing.
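For spot-checking a rendered template, a minimal meta-robots sketch like the one below will do; note it only inspects HTML, so a noindex delivered via the X-Robots-Tag HTTP header would need a separate check:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.directives.append((a.get("content") or "").lower())

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
```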

Audit Search Console Coverage report

In Search Console, the Pages report (formerly Coverage) shows:

  • Valid pages — pages Google has indexed
  • Valid with warnings — indexed, but with potential issues (e.g., indexed though blocked by robots.txt)
  • Excluded — pages Google chose not to index (noindex tags, duplicates, not submitted, etc.)
  • Error — pages that couldn't be indexed

Review each category. "Valid" pages you expected to see excluded, or "Excluded" pages you expected to be indexed, both warrant investigation.

Check canonical tags

Filter your Screaming Frog crawl for canonical URLs. Verify:

  • All indexable pages have a self-referencing canonical
  • Canonical tags on paginated pages don't incorrectly point to page 1
  • Canonical href attributes use the correct protocol (https://, not http://)
  • No pages combine a noindex directive with a canonical tag pointing elsewhere (conflicting signals)
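These checks can be scripted over crawl export rows; the (address, canonical, meta robots) tuple layout below is a simplifying assumption, so map it to your export's actual columns:

```python
def audit_canonicals(rows):
    """rows: (address, canonical, meta_robots) tuples from a crawl export.
    Returns {address: [issues]} applying basic canonical sanity checks."""
    findings = {}
    for address, canonical, robots in rows:
        issues = []
        if not canonical:
            issues.append("missing canonical")
        else:
            if canonical.startswith("http://"):
                issues.append("canonical uses http://")
            if canonical != address:
                issues.append(f"canonicalised to {canonical}")
            if "noindex" in robots.lower():
                issues.append("noindex + canonical: conflicting signals")
        if issues:
            findings[address] = issues
    return findings

rows = [
    ("https://example.com/", "https://example.com/", ""),
    ("https://example.com/a", "http://example.com/a", ""),
    ("https://example.com/b", "https://example.com/b", "noindex, follow"),
]
findings = audit_canonicals(rows)
print(findings)  # only pages with at least one issue
```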

Identify duplicate content

Screaming Frog's duplicate content reports (under the Content tab: Exact Duplicates and Near Duplicates) identify pages with identical or near-identical content. For each duplicate cluster, verify a canonical relationship is correctly set or that consolidation is appropriate.
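As a rough stand-in for a crawler's near-duplicate report, difflib's similarity ratio can surface candidate clusters; the threshold and sample text here are purely illustrative:

```python
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicates(pages, threshold=0.9):
    """pages: {url: body_text}. Return URL pairs whose text similarity
    ratio meets the threshold."""
    pairs = []
    for (u1, t1), (u2, t2) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, t1, t2).ratio()
        if ratio >= threshold:
            pairs.append((u1, u2, round(ratio, 2)))
    return pairs

pages = {
    "https://example.com/red-widgets": "Buy red widgets online. Free UK delivery on all widgets.",
    "https://example.com/blue-widgets": "Buy blue widgets online. Free UK delivery on all widgets.",
    "https://example.com/about": "We are a widget retailer with a long trading history.",
}
dups = near_duplicates(pages)
print(dups)  # one near-duplicate pair: the two widget pages
```

Pairwise comparison is O(n²), so for large sites shingling or hashing approaches scale better; for auditing a few hundred templated pages this is fine.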

Step 3: Site Architecture

URL structure review

Evaluate the URL structure against best practice:

  • Lowercase URLs
  • Hyphens as word separators (not underscores)
  • No unnecessary parameters in indexable URLs
  • Logical hierarchy reflecting site structure
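These conventions are easy to lint in bulk. The rules below mirror the list above and should be adapted to your own URL standards:

```python
from urllib.parse import urlparse

def lint_url(url):
    """Flag departures from common URL conventions. Illustrative only."""
    issues = []
    parsed = urlparse(url)
    if parsed.path != parsed.path.lower():
        issues.append("contains uppercase characters")
    if "_" in parsed.path:
        issues.append("uses underscores instead of hyphens")
    if parsed.query:
        issues.append(f"carries parameters: ?{parsed.query}")
    return issues

issues = lint_url("https://example.com/Blog_Posts/seo?ref=nav")
print(issues)  # all three rules fire for this URL
```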

Internal linking audit

In Screaming Frog, review the inlinks and outlinks for key pages. Check:

  • Homepage links to all key category/service pages
  • All important pages have multiple internal links (not just breadcrumbs)
  • No important pages have very few inlinks (potential orphan risk)
  • Anchor text is descriptive (not "click here" or "read more")
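Inlink counts can be tallied from a crawl's link export; the (source, target) edge format here is a simplified assumption about what your export provides:

```python
from collections import Counter

def inlink_counts(edges, pages):
    """edges: (source, target) internal links. Returns the inlink count
    for every page, including pages with zero inlinks."""
    counts = Counter(target for _, target in edges)
    return {page: counts.get(page, 0) for page in pages}

pages = ["/", "/services", "/services/seo", "/old-case-study"]
edges = [("/", "/services"), ("/", "/services/seo"),
         ("/services", "/services/seo")]
report = inlink_counts(edges, pages)
print(report)
# Pages with no inlinks at all (homepage excluded) are orphan candidates.
orphans = [p for p, n in report.items() if n == 0 and p != "/"]
print(orphans)
```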

Redirect map

Export all 301/302 redirects from the crawl. Review for:

  • Chains longer than one hop (A → B → C should be A → C)
  • Loops (A → B → A)
  • Redirects pointing to pages that have since moved again
  • Legacy redirects from migrations that may now be redirecting to incorrect destinations
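Chains and loops can be resolved programmatically from a source-to-target redirect map. A simplified sketch (real exports carry status codes too):

```python
def flatten_redirects(redirects):
    """redirects: {source: target}. Resolve each source to its final
    destination, flagging chains (more than one hop) and loops."""
    resolved, chains, loops = {}, [], []
    for start in redirects:
        seen = [start]
        current = start
        while current in redirects:
            current = redirects[current]
            if current in seen:
                loops.append(seen + [current])
                break
            seen.append(current)
        else:
            resolved[start] = current
            if len(seen) > 2:
                chains.append(seen)
    return resolved, chains, loops

redirects = {"/a": "/b", "/b": "/c", "/x": "/y", "/y": "/x"}
resolved, chains, loops = flatten_redirects(redirects)
print(resolved)  # {'/a': '/c', '/b': '/c'}
print(chains)    # [['/a', '/b', '/c']]
print(loops)     # the /x <-> /y loop, reported from both entry points
```

The `resolved` map is effectively the corrected redirect file: every source pointing straight to its final destination.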

Step 4: Core Web Vitals

Pull field data from Search Console

The Core Web Vitals report in Search Console shows LCP, CLS, and INP based on real Chrome user data. This is the data Google uses for ranking. Note which URL groups are failing and the proportion in "Poor" vs "Needs Improvement" vs "Good."

Run lab tests with PageSpeed Insights

Test representative pages (homepage, category page, product/service page, blog post) in PageSpeed Insights. Review:

  • LCP — identify the LCP element and what's causing delays
  • CLS — identify which elements are shifting and why
  • INP — identify long tasks blocking user interaction responsiveness
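Google's published thresholds make the Good / Needs Improvement / Poor classification mechanical, so bucketing field or lab values is a one-liner per metric:

```python
# Published Core Web Vitals thresholds: (good upper bound, poor lower bound).
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
    "INP": (200, 500),    # milliseconds
}

def classify(metric, value):
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs Improvement"
    return "Poor"

page = {"LCP": 3.1, "CLS": 0.05, "INP": 620}
results = {m: classify(m, v) for m, v in page.items()}
print(results)
# {'LCP': 'Needs Improvement', 'CLS': 'Good', 'INP': 'Poor'}
```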

Prioritise fixes by impact

Not all pages are equal. Prioritise Core Web Vitals fixes on:

  1. High-traffic pages (most impact on users and engagement metrics)
  2. Pages in "Poor" before "Needs Improvement"
  3. Mobile before desktop (Google uses mobile-first indexing)

Step 5: Structured Data

Crawl for schema coverage

In Screaming Frog, enable structured data extraction (Config → Spider → Extraction), plus JavaScript rendering if schema is injected client-side, and check the structured data report. Verify:

  • Key page types have the expected schema (Organisation on homepage, Article on blog posts, etc.)
  • No schema validation errors are reported

Validate with Google's tools

Test high-priority pages in Google's Rich Results Test. Check:

  • Which rich result types each page is eligible for
  • Any missing required properties that would prevent rich result display
  • Warning-level issues that reduce schema effectiveness

Check for missing schema

Cross-reference which pages have schema and which don't. Flag:

  • Blog posts missing Article/BlogPosting schema
  • FAQ sections missing FAQPage schema
  • Product pages missing Product schema (ecommerce)
  • Location pages missing LocalBusiness schema
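For reference, Article markup on a blog post like this one might look like the following JSON-LD; every value is a placeholder to adapt, and required properties vary by rich result type:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Do a Technical SEO Audit: Step-by-Step",
  "author": {
    "@type": "Person",
    "name": "Paul Donnelly"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Dynamically"
  },
  "datePublished": "2024-01-01",
  "mainEntityOfPage": "https://yourdomain.com/insights/technical-seo-audit"
}
```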

Step 6: AI Crawler Access

This is a newer but increasingly important audit area.

robots.txt AI crawler review

As covered in Step 1, verify which AI crawlers can access your site and that your configuration is intentional.

llms.txt file

Check whether your site has an llms.txt file at the root domain. This file (llmstxt.org spec) provides AI systems with a structured overview of your site's content and most important pages. If absent, create one.
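A minimal llms.txt following the llmstxt.org layout (an H1 site name, a blockquote summary, then H2 sections of annotated links) might look like this; every URL and description below is a placeholder:

```markdown
# Dynamically

> Liverpool-based digital agency covering technical SEO, paid media,
> and AI search visibility.

## Services

- [Technical SEO audits](https://yourdomain.com/services/technical-seo): crawl, indexation, and Core Web Vitals reviews

## Insights

- [How to do a technical SEO audit](https://yourdomain.com/insights/technical-seo-audit): step-by-step audit guide
```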

AI visibility testing

Manually test your target queries in ChatGPT Search, Perplexity, and Google AI Overviews. Note:

  • Whether your pages appear as citations
  • Whether competitors appear when you don't
  • Whether your brand is described accurately when cited

This qualitative data informs where GEO optimisation effort is needed.

Compiling and Prioritising Findings

A technical audit will surface more issues than you can address immediately. Categorise findings by:

Critical — Issues that are actively preventing important pages from being crawled or indexed (e.g., accidental sitewide noindex, robots.txt blocking key sections, mass 4xx errors)

High — Issues causing significant visibility loss (e.g., canonical errors on major templates, Core Web Vitals failures on high-traffic pages, missing JobPosting schema for a recruitment site)

Medium — Issues affecting performance but not causing immediate visibility loss (e.g., redirect chains, orphaned pages, missing FAQ schema)

Low — Best-practice improvements with smaller impact (e.g., minor duplicate content, missing alt text on decorative images, trailing slash inconsistencies)

Address critical and high-priority issues first. Assign medium and low items to an ongoing maintenance queue.

FAQs

How often should a technical SEO audit be done? A comprehensive audit annually is a reasonable baseline. Quarterly reviews of key metrics (indexation, Core Web Vitals, crawl errors) catch issues before they compound. After major site changes (migrations, redesigns, CMS updates), run a full audit immediately.

Can I do a technical SEO audit myself? Yes, with the right tools and this guide. Screaming Frog is accessible to non-specialists for basic crawl analysis. Google Search Console is free and provides excellent indexation diagnostics. The complexity increases significantly for large sites (10,000+ pages) or sites with complex technical architectures.

What's the most common finding in a technical SEO audit? Canonicalisation issues — incorrect self-referencing canonicals, canonicals pointing to the wrong version of a URL, or missing canonicals on paginated pages — appear in virtually every audit.

How long does a technical SEO audit take? For a small-to-medium site (up to 5,000 pages), a thorough audit takes 8–16 hours with experienced analysts. Enterprise sites with hundreds of thousands of pages can take several days or longer.

For a professional technical SEO audit with actionable recommendations and implementation support, get in touch or start with a free audit.


Written by

Paul Donnelly

Backend Developer

Paul is a backend developer at Dynamically, leading technical SEO audits, site migrations, and structured data implementation.


Work with Dynamically

Ready to put these insights into practice?

Our Liverpool-based team works with UK businesses to grow organic search, improve paid media performance and build visibility in AI-powered search. Get a free audit to see exactly where your opportunities are.