JavaScript frameworks dominate modern web development. React, Next.js, Vue, and their derivatives power millions of commercial websites — from early-stage startups to enterprise platforms. They offer developer productivity, rich interactive experiences, and component-based architecture that makes complex applications manageable.
They also introduce SEO challenges that, if unaddressed, can silently suppress your organic performance for months before anyone notices the cause.
The good news: JavaScript SEO is a well-understood discipline in 2026. The problems it creates are real but solvable, and Next.js in particular has matured to the point where — with the right configuration — it produces sites that are excellent for SEO. The bad news: you have to know what to configure, and many development teams don't.
This guide covers how Google crawls and indexes JavaScript sites, the most common SEO mistakes in React and Next.js implementations, and the specific fixes you need to apply.
How Googlebot Processes JavaScript
To understand JavaScript SEO, you need to understand how Googlebot works — because it does not work like a regular browser.
Googlebot operates in two stages when it encounters a JavaScript-heavy page:
Stage 1 — Initial fetch: Googlebot fetches the HTML document and indexes any content available in that initial response. For a traditional server-rendered page, this is the full content. For a client-rendered React application, this might be nothing more than <div id="root"></div> — an empty shell that the JavaScript will populate once it runs.
Stage 2 — Rendering: Googlebot adds the page to a rendering queue, where headless Chrome renders the JavaScript and produces a fully rendered version of the page. Google then re-crawls and indexes the rendered content.
The critical point: rendering is delayed and queued. Googlebot has limited rendering capacity, and rendering happens at Google's own schedule — often hours, days, or sometimes weeks after the initial fetch. Content that only exists after JavaScript executes may take significantly longer to be indexed than content delivered in the initial HTML response.
For new pages, this delay can mean ranking lag. For heavily JavaScript-dependent sites without server-side rendering, some content may never be indexed at all if the rendering queue deprioritises it.
Client-Side Rendering: The Fundamental Problem
A pure client-side rendering (CSR) architecture — where the server delivers a minimal HTML shell and React populates everything on the client — is problematic for SEO for several reasons:
- Indexing delay: Content requires the two-stage crawl process described above
- Render budget: Large, complex JavaScript applications may not fully render within Googlebot's resource limits
- JavaScript errors: Any uncaught error in your JavaScript can cause the page to fail rendering, resulting in an empty index entry
- Core Web Vitals: CSR applications often have poor Largest Contentful Paint (LCP) and First Contentful Paint (FCP) scores, which affect both user experience and ranking signals
- Discovery issues: If your links are rendered only in JavaScript, they may be discovered later and crawled less frequently than server-rendered links
Pure CSR is appropriate for web applications that require authentication and are not intended to be indexed (dashboards, admin panels, SaaS products). For any public-facing content that needs to rank in search, it should be avoided.
Server-Side Rendering (SSR) and Static Generation (SSG)
The two primary solutions to JavaScript SEO challenges in Next.js are server-side rendering and static site generation.
Server-Side Rendering (SSR)
With SSR, the HTML is generated on the server at request time and delivered to the browser (and Googlebot) as a complete document. Googlebot receives fully populated HTML in the initial fetch — no rendering queue required.
In Next.js App Router (Next.js 13+), Server Components render on the server by default. This is the correct default for SEO-sensitive content.
When to use SSR:
- Pages with user-specific content (requires authentication)
- Pages where data changes frequently and must be fresh on every request
- APIs or pages that generate content in real time
The trade-off: SSR adds server processing time and cost. For high-traffic sites, this can be significant.
Static Site Generation (SSG)
SSG generates HTML at build time. The pre-rendered pages are served as static files — no server processing at request time. Googlebot receives complete HTML immediately, and the pages load extremely quickly for users.
In Next.js, SSG is achieved with generateStaticParams() in the App Router, or getStaticProps and getStaticPaths in the Pages Router.
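As a sketch, here is what generateStaticParams might look like for a blog route. The getPosts function and its data are placeholders — substitute your own CMS or database query:

```typescript
// app/insights/[slug]/page.tsx (App Router)

// Hypothetical data source -- replace with your CMS or database query.
type Post = { slug: string; title: string };

async function getPosts(): Promise<Post[]> {
  return [
    { slug: 'javascript-seo', title: 'JavaScript SEO' },
    { slug: 'core-web-vitals', title: 'Core Web Vitals' },
  ];
}

// Next.js calls this at build time and pre-renders one static
// page for each returned params object.
export async function generateStaticParams() {
  const posts = await getPosts();
  return posts.map((post) => ({ slug: post.slug }));
}
```

At build time, Next.js generates /insights/javascript-seo and /insights/core-web-vitals as static HTML, so Googlebot receives complete content in the initial fetch.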
When to use SSG:
- Content that doesn't change frequently (blog posts, product pages, service pages)
- High-volume pages where server costs matter
- Any page where maximum performance and fastest indexing are priorities
SSG is typically the preferred approach for marketing sites, blogs, and ecommerce product catalogues.
Incremental Static Regeneration (ISR)
ISR is Next.js's hybrid approach: pages are statically generated at build time but automatically regenerated in the background at a defined interval (or on-demand). This allows you to get SSG performance while serving content that stays reasonably up-to-date.
Configure ISR with the revalidate export:
```typescript
export const revalidate = 3600; // regenerate at most every hour
```
ISR is excellent for content like blog indexes, product listings, or any page where static generation is preferred but content updates more than once per build cycle.
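The staleness rule behind revalidate is simple. As an illustration (a simplified model, not Next.js source code): the cached page is served immediately, and if it is older than the revalidate window, it is regenerated in the background for subsequent requests:

```typescript
// Illustrative only: a simplified model of ISR's staleness check.
// Next.js always serves the cached page immediately; if it is stale,
// the page is regenerated in the background for the next request.
function isStale(
  generatedAtMs: number,     // when the cached page was built
  revalidateSeconds: number, // the page's `revalidate` export
  nowMs: number = Date.now()
): boolean {
  return nowMs - generatedAtMs >= revalidateSeconds * 1000;
}
```

The practical consequence: a visitor may briefly see content up to one revalidation interval old, which is an acceptable trade for static-file performance on most marketing and listing pages.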
Key Technical SEO Issues in React/Next.js Sites
1. Missing or Incorrect Metadata
In Next.js App Router, metadata is defined with the metadata export or generateMetadata function. The most common mistakes:
- Forgetting metadata on dynamic routes ([slug]/page.tsx)
- Using the same title and description across multiple pages (duplicate metadata)
- Not setting canonical URLs — especially important for paginated content or pages accessible via multiple URL patterns
- Missing Open Graph and Twitter Card metadata
```typescript
// app/insights/[slug]/page.tsx
export async function generateMetadata({ params }: Props): Promise<Metadata> {
  const post = await getPost(params.slug);
  return {
    title: post.title,
    description: post.description,
    alternates: { canonical: `https://yoursite.com/insights/${params.slug}` },
    openGraph: {
      title: post.title,
      description: post.description,
      url: `https://yoursite.com/insights/${params.slug}`,
      type: 'article',
    },
  };
}
```
2. Client Components Hiding Content from Googlebot
In Next.js App Router, 'use client' components run on the client after hydration. Next.js does pre-render a Client Component's initial HTML on the server, but any content it fetches after mount (for example in useEffect or via a client-side data library) is absent from the initial server response, which is exactly where Googlebot first looks for your content.
The fix: Keep content-bearing sections as Server Components. Use Client Components only for interactive elements (forms, carousels, accordions). The content architecture should be: Server Component shell → Client Component interactive islands.
3. Soft 404s
A soft 404 is a page that returns a 200 status code but contains content indicating that nothing was found (e.g., "No results found" or an empty product grid). Googlebot interprets these as valid pages and may index them, diluting your crawl budget and creating duplicate/thin content issues.
In Next.js, use notFound() from next/navigation to return a proper 404 response:
```typescript
import { notFound } from 'next/navigation';

export default async function ProductPage({ params }: Props) {
  const product = await getProduct(params.slug);
  if (!product) notFound();
  // ...
}
```
4. Internal Links Rendered Only in Client Components
Links that are only rendered after client-side JavaScript executes may be discovered later and followed less frequently by Googlebot. Ensure your primary navigation links are rendered server-side in your layout.
Next.js <Link> components in Server Components are fully rendered in the initial HTML response and work perfectly for SEO.
5. Paginated Content Without Proper URL Structure
Pagination in JavaScript applications often uses client-side state rather than distinct URLs (e.g., clicking "Next" loads more results without changing the URL). From an SEO perspective, this means pages 2, 3, and beyond don't exist as indexable URLs.
The correct approach: paginate via URL parameters or path segments:
- /insights (page 1)
- /insights/page/2 (page 2)
- /insights/page/3 (page 3)
Each page should have its own metadata, canonical URL, and be linked from the previous/next page for proper crawl chain continuity.
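A small helper keeps paginated URLs, canonicals, and prev/next links consistent. This is a sketch: the /insights section, the page/N path convention, and the domain are placeholders for your own routing scheme:

```typescript
// Hypothetical helper: builds the URL and pagination metadata for a
// paginated listing using path segments rather than client-side state.
const BASE_URL = 'https://yoursite.com'; // placeholder domain

function paginatedPath(section: string, page: number): string {
  // Page 1 lives at the section root; later pages get /page/N.
  return page <= 1 ? `/${section}` : `/${section}/page/${page}`;
}

function paginationMeta(section: string, page: number, totalPages: number) {
  return {
    canonical: `${BASE_URL}${paginatedPath(section, page)}`,
    prev: page > 1 ? `${BASE_URL}${paginatedPath(section, page - 1)}` : null,
    next: page < totalPages ? `${BASE_URL}${paginatedPath(section, page + 1)}` : null,
  };
}
```

Each paginated page canonicalises to itself (not to page 1), and the prev/next URLs give Googlebot a crawlable chain through the full listing.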
6. Lazy Loading Below-the-Fold Content
Lazy loading images and components is good for performance. However, if critical content (text, headings, links) is lazy loaded, it may not be included in Googlebot's rendering. Use lazy loading only for media and non-critical UI elements — never for primary textual content or navigation.
7. Unoptimised Core Web Vitals
Core Web Vitals — Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint (INP) — are confirmed ranking signals. React and Next.js sites frequently struggle with:
- LCP: Slow server response time, unoptimised hero images, or render-blocking resources
- CLS: Layout shifts caused by dynamically loaded content, ads, or web fonts loading
- INP: Long main thread tasks caused by large JavaScript bundles or unoptimised event handlers
Next.js provides built-in optimisation tools — <Image> for automatic image optimisation, <Script> for controlled script loading, and next/font for web font optimisation. Use them.
Auditing a JavaScript Site for SEO Issues
When auditing a React or Next.js site, add these checks to your standard technical SEO audit:
- Render comparison: Fetch pages as Googlebot (using curl -A "Googlebot") and compare the raw HTML to the browser-rendered version. Any content present in the browser but absent in the raw HTML is client-rendered and carries indexing risk.
- Google Search Console coverage report: Look for "Crawled — currently not indexed" pages, which can indicate rendering issues.
- URL Inspection Tool: Use the Google Search Console URL Inspection tool to trigger a live render and see exactly what Googlebot sees when it renders your pages.
- JavaScript error monitoring: Uncaught JavaScript errors that prevent page rendering will result in empty index entries. Monitor your JavaScript error rate in your analytics or error tracking tool.
- Core Web Vitals: Check your CrUX data in Search Console and run Lighthouse audits on your critical landing pages.
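The render comparison can be partially automated. As a rough sketch (a regex pass, not a full HTML parser), this helper flags headings that appear in the rendered DOM but not in the raw server response — those sections are client-rendered:

```typescript
// Rough sketch: extract heading text from HTML with a regex and report
// headings present only in the rendered version. A production audit
// would use a real HTML parser, but this illustrates the comparison.
function extractHeadings(html: string): string[] {
  const matches = html.matchAll(/<h[1-6][^>]*>(.*?)<\/h[1-6]>/gis);
  return [...matches].map((m) => m[1].replace(/<[^>]+>/g, '').trim());
}

function clientOnlyHeadings(rawHtml: string, renderedHtml: string): string[] {
  const raw = new Set(extractHeadings(rawHtml));
  return extractHeadings(renderedHtml).filter((h) => !raw.has(h));
}
```

Feed it the curl output as rawHtml and the DOM serialised from a headless browser as renderedHtml; any headings it reports are candidates for moving into Server Components.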
Our technical SEO audit includes a comprehensive JavaScript crawl analysis, Core Web Vitals assessment, and structured remediation roadmap. Get in touch to book an audit of your Next.js or React site.



