
How Your JavaScript Framework Affects SEO Rankings

Xyle Team

Tags: javascript seo, framework seo, ssr, csr, nextjs seo, react seo, nuxt seo, vue seo

Your JavaScript framework choice has a direct impact on whether search engines can index your content. Google can render JavaScript — but with delays, resource limits, and no guarantee that every page gets rendered. Other search engines and AI answer engines often cannot render JavaScript at all.

If your site ships an empty <div id="root"></div> and relies on client-side JavaScript to fill it in, you are making a bet that every crawler will execute your bundle correctly. That is a bet you will lose.

The Rendering Problem

Search engine crawlers work in two phases. First, they fetch the HTML response from your server. Second, they optionally render JavaScript to see the final DOM. The problem is that phase two is expensive, delayed, and inconsistent.

Google's renderer (WRS — Web Rendering Service) uses a recent version of Chromium, but it queues pages for rendering and processes them later. The gap between crawl and render can be hours or days. During that gap, Google sees only your initial HTML.

Bing's rendering is less capable and less consistent. AI engines like ChatGPT and Perplexity typically do not render JavaScript at all — they work with whatever HTML your server returns.

This means your initial HTML response is your SEO baseline. If it is empty, your baseline is zero content.
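A quick way to approximate what a non-rendering crawler sees is to strip the scripts and markup from the raw HTML response and check whether any text remains. This is an illustrative sketch (a crude regex heuristic, not a real parser and not Xyle's detection logic):

```typescript
// Does the *initial* HTML response contain real content,
// or just an empty app shell? Regex-based heuristic, illustrative only.
function looksServerRendered(html: string): boolean {
  // Grab the body, drop scripts, then drop all remaining tags
  const body = html.match(/<body[^>]*>([\s\S]*)<\/body>/i)?.[1] ?? "";
  const text = body
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, "")
    .trim();
  return text.length > 0;
}

// An empty CSR shell fails the check; SSR output passes
console.log(looksServerRendered('<body><div id="root"></div><script src="/b.js"></script></body>')); // false
console.log(looksServerRendered('<body><div id="root"><h1>Premium Widget</h1></div></body>')); // true
```

Run it against your page's response body (not the DevTools DOM); if it returns false, your SEO baseline is an empty shell.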

CSR vs SSR vs SSG vs Hybrid: What Each Means for SEO

There are four rendering strategies, and each has different SEO implications.

| Strategy | How It Works | Initial HTML | SEO Impact | Best For |
|----------|--------------|--------------|------------|----------|
| CSR (Client-Side Rendering) | Browser downloads JS bundle, renders in browser | Empty shell | Poor — crawlers see no content | Authenticated dashboards, internal tools |
| SSR (Server-Side Rendering) | Server renders HTML on each request | Full content | Good — crawlers see complete page | Dynamic content, e-commerce, news |
| SSG (Static Site Generation) | HTML pre-built at build time | Full content | Excellent — fastest TTFB, full content | Blogs, docs, landing pages, marketing |
| Hybrid (SSR + Client Hydration) | Server renders, browser hydrates for interactivity | Full content | Good — content visible, interactive after hydration | Most modern web apps |

CSR: The SEO Problem

A standard React app (Create React App or Vite) ships this initial HTML:

<!DOCTYPE html>
<html>
  <head>
    <title>My App</title>
  </head>
  <body>
    <div id="root"></div>
    <script src="/assets/bundle.js"></script>
  </body>
</html>

A search crawler that does not execute JavaScript sees a page with no content, no headings, no text, and no links. Even Google, which does render JS, may take days to get to your page in the rendering queue.

SSR: Content on First Response

With SSR, the server executes your components and returns complete HTML:

<!DOCTYPE html>
<html>
  <head>
    <title>Product Page — My Store</title>
    <meta name="description" content="High-quality widget with free shipping..." />
  </head>
  <body>
    <div id="root">
      <h1>Premium Widget</h1>
      <p>High-quality widget with free shipping on orders over $50...</p>
      <!-- Full rendered content -->
    </div>
    <script src="/assets/bundle.js"></script>
  </body>
</html>

Every crawler sees the full page on the first request. No rendering queue. No JavaScript dependency.

SSG: Pre-Built and Fast

Static generation takes this further by building all HTML at deploy time. There is no server computation per request — pages are served from a CDN. This gives you the fastest possible time-to-first-byte (TTFB) and ensures every crawler gets complete content instantly.

Framework-by-Framework SEO Guide

Next.js

Next.js is the strongest choice for SEO among React frameworks. It supports SSR, SSG, and hybrid rendering out of the box. Pages are server-rendered by default in the App Router.

Detection markers: __NEXT_DATA__ script tag, /_next/ asset paths. Xyle automatically detects Next.js and its rendering mode when you crawl a page.

SEO strengths: Built-in metadata API, automatic sitemap generation, image optimization with next/image, SSR/SSG by default.
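As a sketch of that metadata API — the slug-based title, description, and canonical URL below are illustrative placeholders, not a real project:

```typescript
// app/blog/[slug]/page.tsx — Next.js App Router metadata (illustrative sketch)
import type { Metadata } from "next";

// generateMetadata runs on the server, so these tags
// land in the initial HTML that crawlers fetch
export async function generateMetadata(
  { params }: { params: { slug: string } }
): Promise<Metadata> {
  return {
    title: `${params.slug} — My Store`,
    description: "Per-page description, present in the server-rendered HTML.",
    alternates: { canonical: `https://yoursite.com/blog/${params.slug}` },
  };
}
```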

React (CRA / Vite)

Plain React with Create React App or Vite is CSR by default. Your pages ship as an empty shell until JavaScript executes in the browser.

SEO impact: Poor without additional setup. If you need SEO, you have two options: migrate to Next.js, or add a pre-rendering service.

# Check if your React app is CSR-only
# View source should show actual content, not just <div id="root"></div>
curl -s https://yoursite.com | grep -c "<h1>"
# If this returns 0, crawlers see no headings

Vue / Nuxt

Plain Vue is CSR. Nuxt adds SSR and SSG support for Vue, similar to what Next.js does for React. If you are building a Vue app that needs SEO, use Nuxt.

Detection markers: __NUXT__ or __NUXT_DATA__ in the HTML, /_nuxt/ asset paths.

SEO recommendation: Use Nuxt with ssr: true (the default) for all public-facing pages. For a detailed Nuxt-specific guide, see our Nuxt.js SEO guide.

Is Nuxt.js Good for SEO?

Yes — Nuxt is one of the best choices for SEO among JavaScript frameworks. It ships with SSR enabled by default, which means every crawler sees your full HTML content on the first request. No rendering queue, no JavaScript dependency.

Nuxt 3 also supports hybrid rendering through routeRules, letting you mix SSR, SSG, and CSR per route:

// nuxt.config.ts — hybrid rendering for optimal SEO
export default defineNuxtConfig({
  routeRules: {
    "/": { prerender: true },              // SSG for homepage
    "/blog/**": { prerender: true },       // SSG for blog posts
    "/products/**": { swr: 3600 },         // SSR with caching
    "/dashboard/**": { ssr: false },       // CSR for private pages
  },
})

Nuxt's built-in useSeoMeta composable makes per-page meta tags straightforward:

<script setup>
useSeoMeta({
  title: 'Your Page Title',
  description: 'Your meta description for this specific page.',
  ogImage: 'https://yoursite.com/og-image.png',
})
</script>

The Nuxt module ecosystem covers the remaining SEO essentials: @nuxtjs/sitemap for automatic sitemap generation, nuxt/image for optimized images, and @nuxtjs/robots for robots.txt management.

Key Nuxt SEO pitfalls to avoid:

  • Do not set ssr: false globally — this kills SEO for all pages
  • Do not fetch content in onMounted() — use useFetch or useAsyncData so data is included in the SSR output
  • Do not rely on client-only plugins for meta tags — they will not appear in the initial HTML

Angular / Angular Universal

Plain Angular is CSR. Angular Universal adds server-side rendering. The setup is more involved than Next.js or Nuxt, but it works.

Detection markers: ng-version attribute on the root element, ngsw service worker references.

SEO recommendation: Use Angular Universal or the newer Angular SSR (@angular/ssr) for any page that needs indexing.

Svelte / SvelteKit

SvelteKit renders on the server by default and produces minimal client-side JavaScript. The compiled output is significantly smaller than React or Angular bundles, which benefits Core Web Vitals.

Detection markers: SvelteKit-specific data attributes, /_app/ asset paths.

SEO recommendation: SvelteKit is excellent for SEO out of the box. Keep the default SSR behavior.
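As a sketch of the default SSR data flow, a SvelteKit universal load function runs on the server for the first request, so its data is part of the crawlable HTML (the /api/posts endpoint is an assumed example):

```typescript
// src/routes/blog/[slug]/+page.ts — SvelteKit load function (illustrative)
import type { PageLoad } from "./$types";

export const load: PageLoad = async ({ params, fetch }) => {
  // Runs server-side on the first request, so the post
  // appears in the initial HTML that crawlers receive
  const res = await fetch(`/api/posts/${params.slug}`);
  return { post: await res.json() };
};
```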

WordPress

WordPress is server-rendered by default — PHP generates complete HTML. SEO issues with WordPress are rarely about rendering. They are about plugin bloat slowing page speed, poor heading hierarchy in themes, and missing structured data.

Detection markers: wp-content paths, wp-json API, meta generator tag.

Astro

Astro takes a different approach: it ships zero JavaScript to the browser by default. Your components render to static HTML at build time, and interactive elements are hydrated on demand using "islands" architecture.

Detection markers: astro- prefixed attributes, data-astro- attributes in the HTML.

SEO strengths: Best-in-class Core Web Vitals (near-zero JavaScript), SSG by default, built-in image optimization, and native sitemap integration. Astro is the strongest choice for content-heavy sites where performance is a priority.

# Astro SEO essentials
npx astro add sitemap    # Auto-generate sitemap
npx astro add image      # Optimized images
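The sitemap integration is then wired up in the Astro config; a minimal sketch, assuming your production URL:

```typescript
// astro.config.mjs — @astrojs/sitemap needs the canonical site URL
import { defineConfig } from "astro/config";
import sitemap from "@astrojs/sitemap";

export default defineConfig({
  site: "https://yoursite.com", // placeholder — required for absolute sitemap URLs
  integrations: [sitemap()],
});
```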

SEO recommendation: Astro is excellent for blogs, docs, and marketing sites. If your site is primarily content, Astro delivers the best performance and SEO baseline of any framework.

Gatsby

Gatsby generates static HTML at build time (SSG). Excellent for blogs, documentation, and marketing sites. Limited for dynamic content since every change requires a rebuild.

Detection markers: gatsby- prefixed elements, ___gatsby root element.

How to Detect Your Rendering Type

You can check your rendering type manually by viewing the page source (not the DevTools DOM — that shows the rendered result). If the source HTML contains your content, you are using SSR or SSG. If it contains only a shell, you are using CSR.

The faster way is to use Xyle:

$ xyle crawl --url https://yoursite.com --json

The output includes a rendering section:

{
  "rendering": {
    "framework": "Next.js",
    "rendering_type": "SSR",
    "is_spa": false,
    "has_client_hydration": true,
    "seo_impact": "Positive — full content available on initial HTML response"
  }
}

This tells you exactly what framework is in use, how the page renders, and whether it is an SPA — without manually digging through source code.
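The detection markers listed throughout this guide can be combined into a simple classifier. This is an illustrative sketch of the idea, not Xyle's actual implementation:

```typescript
// Minimal framework detection from raw HTML, based on well-known markers.
// Illustrative sketch — a real detector would be more robust than substring checks.
function detectFramework(html: string): string {
  if (html.includes("__NEXT_DATA__") || html.includes("/_next/")) return "Next.js";
  if (html.includes("__NUXT__") || html.includes("__NUXT_DATA__") || html.includes("/_nuxt/")) return "Nuxt";
  if (html.includes("ng-version")) return "Angular";
  if (html.includes("___gatsby")) return "Gatsby";
  if (html.includes("data-astro-") || html.includes("astro-island")) return "Astro";
  if (html.includes("wp-content") || html.includes("wp-json")) return "WordPress";
  return "unknown";
}

console.log(detectFramework('<script id="__NEXT_DATA__" type="application/json">{}</script>')); // "Next.js"
```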

Fixing CSR SEO Problems

If your site is CSR and you need SEO, here are your options in order of preference.

Option 1: Migrate to an SSR Framework

The most robust solution. If you are using React, migrate to Next.js. If Vue, migrate to Nuxt. This gives you SSR/SSG with minimal configuration changes to your components.

# Next.js migration — your existing React components mostly work as-is
npx create-next-app@latest my-app --typescript
# Move your components to app/ or pages/ directory
# Add metadata exports for SEO

Option 2: Pre-Rendering Service

If migration is not feasible, a pre-rendering service runs a headless browser to generate static HTML snapshots of your pages. You serve these snapshots to crawlers while real users get the SPA experience.

// middleware.ts — detect crawler user agents and serve pre-rendered HTML
import { NextRequest, NextResponse } from "next/server";

const CRAWLER_AGENTS = [
  "googlebot", "bingbot", "slurp", "duckduckbot",
  "baiduspider", "yandexbot", "facebot", "ia_archiver",
];

export function middleware(request: NextRequest) {
  const userAgent = request.headers.get("user-agent")?.toLowerCase() || "";
  const isCrawler = CRAWLER_AGENTS.some((bot) => userAgent.includes(bot));

  if (isCrawler) {
    // Rewrite to the pre-rendered version
    const prerenderUrl = `https://prerender.yourservice.com/${request.url}`;
    return NextResponse.rewrite(prerenderUrl);
  }

  return NextResponse.next();
}

This is a workaround, not a solution. It adds complexity, latency, and a maintenance burden. Prefer migrating to SSR.

Option 3: Add Noscript Fallback Content

The simplest stopgap — add critical content in <noscript> tags so crawlers that do not execute JavaScript still see something:

<noscript>
  <h1>Your Page Title</h1>
  <p>Key content that crawlers should index...</p>
</noscript>

This is the weakest option. It only helps crawlers that completely skip JavaScript, and the content can easily drift from your actual rendered page.

Technical SEO Checks for JavaScript Sites

Beyond rendering, JavaScript sites have specific technical SEO pitfalls.

Canonical Tags in SPAs

Single-page applications change the URL via client-side routing without a page reload. Make sure your canonical tag updates with each route change:

// Next.js App Router — canonical is handled via metadata
export const metadata = {
  alternates: {
    canonical: "https://yoursite.com/current-page",
  },
};

For client-rendered SPAs, you need to dynamically update the canonical tag in the <head> when the route changes, or use a library like react-helmet:

import { Helmet } from "react-helmet";

function ProductPage({ slug }: { slug: string }) {
  return (
    <>
      <Helmet>
        <link rel="canonical" href={`https://yoursite.com/products/${slug}`} />
        <meta property="og:url" content={`https://yoursite.com/products/${slug}`} />
      </Helmet>
      <h1>Product Details</h1>
      {/* Page content */}
    </>
  );
}

Open Graph Tags with SSR

OG tags must be in the initial HTML response — social media crawlers never execute JavaScript. If you are using SSR, this works naturally through your framework's metadata API. If you are using CSR, OG tags will not work without pre-rendering.
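You can sanity-check this by extracting og: tags from the raw server response. A regex-based sketch — it assumes the common property-before-content attribute order and is a heuristic, not a real HTML parser:

```typescript
// Extract Open Graph tags from a raw HTML string.
// Heuristic sketch: only matches property="og:..." followed by content="..."
function getOgTags(html: string): Record<string, string> {
  const tags: Record<string, string> = {};
  const re = /<meta\s+property="og:([^"]+)"\s+content="([^"]*)"/gi;
  let m: RegExpExecArray | null;
  while ((m = re.exec(html)) !== null) tags[m[1]] = m[2];
  return tags;
}

const html =
  '<meta property="og:title" content="Premium Widget" />' +
  '<meta property="og:url" content="https://yoursite.com/widget" />';
console.log(getOgTags(html).title); // "Premium Widget"
```

Run it on the output of a plain fetch of your page; if the result is empty while the tags show up in DevTools, your OG tags exist only after hydration and social crawlers will never see them.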

Verifying Your Technical SEO

After fixing rendering issues, verify that your technical SEO checks pass:

$ xyle crawl --url https://yoursite.com --json

The technical_seo section shows whether your canonical, robots, viewport, charset, lang, HTTPS, OG tags, and heading structure are correct:

{
  "technical_seo": {
    "has_canonical": true,
    "has_robots_meta": true,
    "has_viewport": true,
    "has_charset": true,
    "has_lang": true,
    "is_https": true,
    "has_og_tags": true,
    "h1_count": 1,
    "title_length": 54,
    "meta_description_length": 148
  }
}

If any check fails, you know exactly what to fix.

Framework SEO Comparison

Here is a side-by-side comparison of the major frameworks for SEO-critical features:

| Feature | Next.js | Nuxt 3 | Astro | SvelteKit | Gatsby |
|---------|---------|--------|-------|-----------|--------|
| Default Rendering | SSR | SSR | SSG | SSR | SSG |
| SSG Support | Yes | Yes | Yes (default) | Yes | Yes (default) |
| Hybrid Rendering | Yes (per-page) | Yes (routeRules) | Yes (islands) | Yes (per-page) | Limited |
| JS Shipped | React runtime | Vue runtime | Zero by default | Minimal | React runtime |
| Meta Tag API | metadata export | useSeoMeta | <head> in layout | svelte:head | react-helmet |
| Image Optimization | Built-in | nuxt/image | Built-in | Manual | gatsby-image |
| Sitemap | next-sitemap | @nuxtjs/sitemap | @astrojs/sitemap | Manual | gatsby-plugin-sitemap |
| Best For | React apps | Vue apps | Content sites | Performance-first apps | Legacy static sites |

Key takeaway: For pure content sites, Astro delivers the best SEO performance due to zero client-side JavaScript. For full-stack applications, Next.js and Nuxt are the strongest choices in their respective ecosystems. SvelteKit offers a middle ground with minimal JS overhead.

Prerendering Vue.js Apps for Better Crawlability

If you are using plain Vue.js (without Nuxt) and cannot migrate immediately, prerendering is the most practical path to SEO-friendly output.

Option A: vite-ssg for Static Generation

vite-ssg adds static site generation to a standard Vite + Vue app:

npm install vite-ssg

// src/main.ts
import { ViteSSG } from 'vite-ssg'
import App from './App.vue'
import routes from './routes'

export const createApp = ViteSSG(App, { routes })

At build time, this generates static HTML for every route. Crawlers see full content, and you keep the SPA experience for users.

Option B: Prerendering Service

If static generation is not feasible (highly dynamic content, too many routes), use a prerendering service that serves cached HTML snapshots to crawlers while real users get the SPA:

// Detect crawlers and serve pre-rendered HTML
const AI_AND_SEARCH_CRAWLERS = [
  'googlebot', 'bingbot', 'chatgpt-user', 'claude-web',
  'perplexitybot', 'gptbot', 'duckduckbot',
]

function isCrawler(userAgent: string): boolean {
  const ua = userAgent.toLowerCase()
  return AI_AND_SEARCH_CRAWLERS.some(bot => ua.includes(bot))
}

Note that AI crawlers (GPTBot, Claude-Web, PerplexityBot) typically do not render JavaScript at all. If your Vue app is CSR-only, these crawlers see nothing. Prerendering is essential if you want AI visibility.

Option C: Migrate to Nuxt

The best long-term solution. Nuxt is built on Vue, so your existing components work with minimal changes. You get SSR by default, a mature meta tag API, and an ecosystem of SEO modules. See our Nuxt.js SEO guide for the full walkthrough.

Frequently Asked Questions

Is JavaScript bad for SEO?

No — JavaScript itself is not bad for SEO. The problem is client-side rendering (CSR), where the initial HTML is empty and content only appears after JavaScript executes. Frameworks that use SSR or SSG (Next.js, Nuxt, Astro, SvelteKit) deliver full HTML to crawlers and have excellent SEO.

Which JavaScript framework is best for SEO?

For pure content sites, Astro is the strongest choice — it ships zero JavaScript and produces the fastest pages. For full-stack applications, Next.js (React) and Nuxt (Vue) are both excellent with SSR/SSG support. Choose based on your team's framework preference rather than SEO capability alone.

Do AI crawlers render JavaScript?

Most do not. GPTBot, Claude-Web, and PerplexityBot typically work with whatever HTML your server returns without executing JavaScript. Google's AI Overviews use the rendered version from Google's indexer, but other AI answer engines rely on the initial HTML. If your site is CSR-only, AI crawlers likely see no content.

Is Nuxt good for SEO?

Yes. Nuxt is one of the best JavaScript frameworks for SEO because it ships with SSR enabled by default. Every crawler sees your full content on the first request. Nuxt 3 also supports hybrid rendering, letting you use SSG for static pages and SSR for dynamic pages. See our Nuxt.js SEO guide for details.

Getting Started

Here is a three-step audit for your JavaScript site:

  1. Check your rendering. Run xyle crawl --url <url> --json and look at the rendering section. If it shows CSR, you have work to do.
  2. Verify content in source HTML. View your page source (not DevTools) and confirm your headings, text, and meta tags are present without JavaScript execution.
  3. Fix what is broken. Migrate to SSR if possible, add pre-rendering if not, and verify technical SEO checks pass.

Your framework choice is a foundation decision. Getting rendering right means every other SEO optimization you make — from structured data to content quality — actually reaches the crawlers. Getting it wrong means none of it matters.

Run your first crawl with Xyle and see exactly how search engines experience your JavaScript site.

Ready to optimize your search rankings?

Xyle connects to Google Search Console, analyzes content gaps with AI, and gives you actionable fixes — from the terminal or dashboard.

Read the Docs