Google Search Console is the single most important free tool in any SEO professional's toolkit. It is the only platform that shows you exactly how Google sees your website — which pages are indexed, which queries drive impressions and clicks, what technical issues exist, and how your Core Web Vitals perform in the real world.
Despite this, many businesses either do not use Search Console at all, use it superficially, or misinterpret its data. This guide covers everything you need to know to get the most out of GSC in 2026, from initial setup to advanced analysis techniques.
Setting Up Google Search Console
If you have not already verified your site, here is how to get started.
Property Types
GSC offers two property types:
- Domain property — covers all subdomains, protocols (HTTP and HTTPS), and URL variations. This is the recommended option because it gives you a complete view of your entire domain. Verification requires DNS record access.
- URL prefix property — covers a specific URL prefix (e.g., https://www.example.com). Simpler to set up, but you may need multiple properties to cover all variations of your site.
For most businesses, a domain property is the right choice. It ensures you capture data for every version of your URLs without worrying about www versus non-www or HTTP versus HTTPS discrepancies.
Verification Methods
- DNS record (required for domain properties) — add a TXT record to your domain's DNS configuration.
- HTML file upload — upload a verification file to your site's root directory.
- HTML tag — add a meta tag to your homepage's <head>.
- Google Analytics — if GA4 is already installed, GSC can verify through it.
- Google Tag Manager — similar to the GA4 method.
Once verified, data typically starts appearing within 48–72 hours, though it can take longer for new sites.
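For DNS verification, the TXT record you add at your provider looks roughly like this (the token value below is a placeholder; use the exact string GSC gives you):

```
example.com.   3600   IN   TXT   "google-site-verification=YOUR_TOKEN_HERE"
```

You can confirm the record has propagated by running `dig TXT example.com +short` before clicking Verify in GSC; the token string should appear in the output.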
The Performance Report: Your Most Valuable Data
The Performance report is where most SEOs spend the majority of their time in GSC, and for good reason. It shows the queries that drive impressions and clicks to your site, broken down by page, device, country, and date range.
Key Metrics
- Clicks — how many times users clicked through to your site from search results.
- Impressions — how many times your pages appeared in search results, regardless of whether they were clicked.
- Click-Through Rate (CTR) — clicks divided by impressions. Indicates how compelling your listings are.
- Average Position — your average ranking position for a given query or page.
How to Use Performance Data Effectively
Find Quick Wins
Filter for queries where your average position is between 5 and 15 and impressions are high. These are keywords where you are visible but not yet in the top positions. Small improvements in content quality, internal linking, or on-page optimisation can push these into the top three, where CTR increases dramatically.
This kind of analysis is central to effective keyword research and prioritisation.
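The quick-wins filter can be sketched in a few lines of Python. The rows mimic a GSC query export, and the impression threshold is an assumption to tune for your own traffic levels:

```python
# Query rows mimicking a GSC Performance export; figures are invented.
rows = [
    {"query": "seo audit checklist", "clicks": 40, "impressions": 2200, "position": 7.4},
    {"query": "what is gsc", "clicks": 5, "impressions": 90, "position": 12.1},
    {"query": "seo services", "clicks": 300, "impressions": 5000, "position": 2.1},
]

MIN_IMPRESSIONS = 500  # assumed threshold; tune for your site's volume

# Visible (positions 5-15) but not yet top-ranking, with meaningful demand.
quick_wins = [
    r for r in rows
    if 5 <= r["position"] <= 15 and r["impressions"] >= MIN_IMPRESSIONS
]
quick_wins.sort(key=lambda r: r["impressions"], reverse=True)
for r in quick_wins:
    print(r["query"], r["position"], r["impressions"])
```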
Identify CTR Opportunities
Filter for queries with high impressions but low CTR. This often indicates that your title tag and meta description are not compelling enough, or that SERP features (AI Overviews, featured snippets, People Also Ask) are pushing your listing below the fold.
Review the SERPs for these queries manually. If AI Overviews are present, read our guide on adapting your strategy for AI Overviews. If the issue is your listing itself, rewrite your title and description to be more specific, benefit-driven, and click-worthy.
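One way to surface these queries programmatically is to compare actual CTR against a rough expected CTR for the ranking position. The expected values below are illustrative placeholders, since real CTR curves vary by query type and SERP features:

```python
# Expected CTR by position bucket; these are illustrative placeholders,
# not real benchmark data.
def expected_ctr(position: float) -> float:
    if position <= 3:
        return 0.15
    if position <= 10:
        return 0.05
    return 0.01

# Query rows mimicking a GSC export; figures are invented.
rows = [
    {"query": "gsc guide", "clicks": 10, "impressions": 3000, "position": 2.5},
    {"query": "search console tips", "clicks": 200, "impressions": 1500, "position": 2.8},
]

flagged = []
for r in rows:
    actual = r["clicks"] / r["impressions"]
    target = expected_ctr(r["position"])
    if actual < target / 2:  # well below expectation: a rewrite candidate
        flagged.append(r["query"])
        print(f"{r['query']}: CTR {actual:.1%} vs ~{target:.0%} expected at position {r['position']}")
```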
Track Content Performance Over Time
Compare performance across date ranges to identify trending and declining pages. Content that is losing traffic over time may need refreshing — updated statistics, new sections, improved internal links, or a refreshed publish date.
Segment by Device
Mobile and desktop performance often differs significantly. A page might rank well on desktop but poorly on mobile (or vice versa). If you see major discrepancies, investigate whether mobile usability issues, page speed differences, or content rendering problems are to blame.
Index Management
The Pages report (formerly Index Coverage) tells you which of your pages Google has indexed and which it has not — and why.
Understanding Page Statuses
- Indexed — the page is in Google's index and can appear in search results.
- Not indexed — the page was discovered but not added to the index. This can happen for many reasons: the page is low quality, it is a duplicate, it has a noindex tag, it was crawled but not selected for indexing, or it returned an error.
Note that the current Pages report no longer shows "Excluded" as a separate status: pages excluded by your own directives (a noindex tag, a canonical pointing elsewhere) and pages Google excluded by its own assessment all appear under "Not indexed", each with a stated reason.
Common Issues and Fixes
"Discovered – Currently Not Indexed"
Google knows the page exists but has not crawled it yet. This is common on large sites where crawl demand exceeds the crawl budget Google allocates to the site. Improve the page's internal linking, ensure it is in your XML sitemap, and build authority to encourage Google to prioritise it.
"Crawled – Currently Not Indexed"
Google crawled the page but decided not to index it. This is often a quality signal. The page may be too thin, too similar to other indexed pages, or lacking in unique value. Review the content, ensure it offers something distinct, and consider whether it should be consolidated with another page.
"Duplicate Without User-Selected Canonical"
Google found multiple versions of a page and chose a canonical that differs from the one you specified. Check your canonical tags, ensure HTTPS/www consistency, and verify that URL parameters are not creating duplicate URLs.
A thorough SEO audit will systematically identify and resolve all indexation issues.
Sitemaps
The Sitemaps section lets you submit your XML sitemaps and monitor their processing status.
Best Practices
- Submit your sitemap index file if you use multiple sitemaps (most medium to large sites need more than one, since a single sitemap is capped at 50,000 URLs).
- Check for errors regularly — invalid URLs, server errors, and format issues can prevent Google from processing your sitemap.
- Ensure your sitemap only includes indexable, canonical URLs. Do not include pages with noindex tags, redirected URLs, or non-canonical duplicates.
- Monitor the "Discovered URLs" count to ensure it aligns with the number of URLs you expect Google to know about.
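To illustrate the "indexable, canonical URLs only" rule, here is a minimal Python sketch that builds a sitemap from a page list with an indexability flag (the URLs and flags are hypothetical):

```python
from xml.sax.saxutils import escape

# Hypothetical page inventory with an indexability flag per URL.
pages = [
    {"url": "https://www.example.com/", "indexable": True},
    {"url": "https://www.example.com/services", "indexable": True},
    {"url": "https://www.example.com/thank-you", "indexable": False},  # noindex page
]

# Only indexable, canonical URLs make it into the sitemap.
entries = "\n".join(
    f"  <url><loc>{escape(p['url'])}</loc></url>"
    for p in pages if p["indexable"]
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)
print(sitemap)
```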
Core Web Vitals Report
GSC's Core Web Vitals report shows real-world performance data for your pages, based on the Chrome User Experience Report (CrUX). Unlike lab tools like Lighthouse, this data reflects how actual users experience your site.
The report groups your URLs into three categories — Good, Needs Improvement, and Poor — for each metric (LCP, INP, CLS). It also groups URLs by similar structure, so a fix applied to one template can improve the status of many pages at once.
For a detailed breakdown of what these metrics mean and how to improve them, read our Core Web Vitals guide.
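The Good / Needs Improvement / Poor buckets follow published per-metric thresholds. A small sketch of the classification, using the commonly documented cut-offs (LCP 2.5s/4s, INP 200ms/500ms, CLS 0.1/0.25):

```python
# Published Core Web Vitals thresholds: (good_max, poor_min) per metric.
# LCP in seconds, INP in milliseconds, CLS unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "INP": (200, 500),
    "CLS": (0.1, 0.25),
}

def rate(metric: str, value: float) -> str:
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs Improvement"
    return "Poor"

print(rate("LCP", 2.1))  # Good
print(rate("INP", 350))  # Needs Improvement
print(rate("CLS", 0.3))  # Poor
```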
How to Use the CWV Report
- Identify which metric has the most "Poor" URLs.
- Click through to see specific URL groups and example pages.
- Test those pages in PageSpeed Insights for detailed diagnostic information.
- Implement fixes and use the "Validate Fix" button in GSC to trigger a reassessment.
Links Report
The Links section shows your site's backlink profile and internal linking structure as Google sees them.
External Links
- Top linked pages — which of your pages have the most backlinks.
- Top linking sites — which domains link to you most frequently.
- Top linking text — the anchor text used in links pointing to your site.
This data is limited compared to dedicated backlink tools like Ahrefs or Majestic, but it comes directly from Google, making it uniquely authoritative. Use it to verify that your link building efforts are being recognised and to identify any suspicious linking patterns that might indicate negative SEO or spammy links.
Internal Links
The internal links report shows how many internal links point to each page on your site. Pages with few internal links are harder for Google to discover and may be undervalued in terms of ranking potential.
Cross-reference this with your site's information architecture. Key pages — service pages, top-performing blog posts, conversion pages — should have the most internal links. If they do not, your internal linking strategy needs attention.
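If you have crawl data, counting inbound internal links per page is straightforward. This sketch uses a hypothetical list of (source, target) pairs from a site crawl:

```python
from collections import Counter

# Hypothetical crawl output: (source_page, target_page) internal links.
links = [
    ("/", "/services"),
    ("/", "/blog/post-a"),
    ("/blog/post-a", "/services"),
    ("/blog/post-b", "/services"),
    ("/blog/post-a", "/contact"),
]

inbound = Counter(target for _, target in links)
# Pages with the fewest inbound links are the hardest for Google to find.
for page, count in sorted(inbound.items(), key=lambda kv: kv[1]):
    print(page, count)
```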
Security and Manual Actions
Manual Actions
If Google has applied a manual penalty to your site, it will appear here. Manual actions are relatively rare, but they are devastating when they occur. Common triggers include unnatural link patterns, thin or scraped content, cloaking, and structured data violations.
If you receive a manual action, address the issue immediately and submit a reconsideration request. Our penalty recovery service helps businesses navigate this process.
Security Issues
GSC alerts you to detected malware, hacking, phishing, or other security issues on your site. These are critical to resolve immediately, as Google may display warnings to users or remove affected pages from search results entirely.
Advanced Features
URL Inspection Tool
The URL Inspection tool lets you check how Google sees a specific URL. It shows:
- Whether the URL is indexed
- The canonical URL Google has selected
- When it was last crawled
- Whether it is mobile-usable
- Any detected structured data and its validity
Use this tool to debug individual pages that are not performing as expected. It is also invaluable after making changes — you can request indexing to prompt Google to recrawl the page.
Removals Tool
If you need to temporarily remove a URL from search results — perhaps you published sensitive information by mistake — the Removals tool lets you request a temporary block. This is faster than waiting for Google to recrawl and deindex the page naturally.
Note that this is a temporary measure (approximately six months). For permanent removal, you also need to add a noindex tag or return a 404/410 status code.
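The permanent-removal signals look like the following; the meta tag goes in the page's <head>, and the server block is one possible nginx configuration, shown as an illustration rather than the only approach:

```
<!-- Option 1: keep the page live but tell Google not to index it -->
<meta name="robots" content="noindex">

# Option 2 (nginx): serve 410 Gone for a permanently removed URL
location = /old-page {
    return 410;
}
```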
Search Appearance Filters
The Performance report includes filters for search appearance types — web, image, video, news, and Discover. These help you understand which types of search your content appears in and how it performs in each context.
If your content appears in Google Discover, pay particular attention. Discover traffic can be significant but is also volatile — articles can spike one day and disappear the next. It is a useful supplementary traffic source but not one to build your strategy around.
Common Mistakes to Avoid
Only Checking GSC Occasionally
GSC data is most valuable when you monitor it regularly. Set up email alerts for critical issues (manual actions, security problems, significant indexing drops) and review performance data at least weekly.
Ignoring Non-Indexed Pages
A high number of "crawled but not indexed" pages is a signal that Google does not find those pages valuable enough to include. Rather than ignoring this, treat it as feedback. Improve or consolidate thin content and ensure your site's overall quality bar is high.
Misinterpreting Average Position
Average position is an average across all queries and appearances. A page might have an average position of 15 but rank third for its primary keyword and fiftieth for dozens of long-tail variations that drag the average down. Always look at position data in the context of specific queries, not just as a page-level aggregate.
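A quick worked example shows how long-tail appearances drag the aggregate down. The impression counts are invented, and the impression-weighted average here approximates how GSC aggregates position across appearances:

```python
# One page's appearances: (query, impressions, position); figures invented.
appearances = [("primary keyword", 1000, 3.0)]
appearances += [(f"long-tail variant {i}", 30, 50.0) for i in range(20)]

total_impressions = sum(impr for _, impr, _ in appearances)
weighted_position = sum(impr * pos for _, impr, pos in appearances) / total_impressions

# Rank 3 on the head term, yet the page-level aggregate lands near 21.
print(f"page-level average position: {weighted_position:.1f}")
```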
Not Connecting GSC with GA4
GSC and GA4 provide complementary data. GSC tells you about search performance before the click (impressions, CTR, position). GA4 tells you what happens after the click (engagement, conversions, revenue). Connecting the two platforms in GA4 (Admin > Search Console Links) gives you a unified view.
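A simple way to get that unified view outside the UI is to join per-page exports from both tools on page path. The figures below are hypothetical:

```python
# Hypothetical per-page exports from each platform, keyed by page path.
gsc = {
    "/services": {"clicks": 420, "impressions": 9000},
    "/blog/post-a": {"clicks": 130, "impressions": 5200},
}
ga4 = {
    "/services": {"sessions": 510, "conversions": 24},
    "/blog/post-a": {"sessions": 150, "conversions": 2},
}

# Merge pre-click (GSC) and post-click (GA4) metrics for shared pages.
combined = {path: {**gsc[path], **ga4[path]} for path in gsc.keys() & ga4.keys()}

for path, row in combined.items():
    conversions_per_click = row["conversions"] / row["clicks"]
    print(path, f"conversions per click: {conversions_per_click:.1%}")
```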
Make the Most of Your Data
Google Search Console is free, authoritative, and indispensable. But data is only valuable if you act on it. The insights in GSC should directly inform your SEO strategy — which pages to optimise, which content to create, which technical issues to fix, and where to invest your resources for maximum impact.
If you want help turning GSC data into actionable strategy, get in touch with our team. We will review your Search Console data, identify the highest-impact opportunities, and build a plan to capitalise on them.