How do SPAs affect SEO?

JavaScript

The short answer

SPAs can hurt SEO because search engine crawlers may not execute the JavaScript that renders the content. The page arrives as an empty HTML shell, and JavaScript builds the content on the client; a crawler that does not run JavaScript sees an empty page. The solutions are server-side rendering (SSR), static generation (SSG), or pre-rendering.

The problem

A typical SPA sends this to the browser:

<!DOCTYPE html>
<html>
  <body>
    <div id="root"></div>
    <script src="app.js"></script>
  </body>
</html>

The actual content is generated by JavaScript after the page loads. Google's crawler can execute JavaScript, but there are issues:

  • Crawling JavaScript pages uses more resources, so Google may not crawl them as frequently
  • Other search engines (Bing, DuckDuckGo) have limited JavaScript execution
  • Social media crawlers (for link previews) often do not execute JavaScript at all
  • There can be a delay between when the page is crawled and when JavaScript-rendered content is indexed
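The issue above can be made concrete by stripping the JavaScript and checking whether any meaningful text remains in the initial HTML — roughly what a crawler that never runs JavaScript would see. A minimal sketch (the hasIndexableContent helper and the sample pages are invented for illustration, not taken from any real crawler):

```javascript
// Simulate what a crawler that does not run JavaScript "sees":
// only the markup in the initial HTML response.
function hasIndexableContent(html) {
  // Drop script tags, then strip all remaining tags and whitespace.
  const withoutScripts = html.replace(/<script[\s\S]*?<\/script>/gi, "");
  const text = withoutScripts.replace(/<[^>]*>/g, "").trim();
  return text.length > 0;
}

const spaShell = `<!DOCTYPE html>
<html><body>
<div id="root"></div>
<script src="app.js"></script>
</body></html>`;

const ssrPage = `<!DOCTYPE html>
<html><body>
<h1>Widget Pro</h1>
<script src="app.js"></script>
</body></html>`;

console.log(hasIndexableContent(spaShell)); // false — empty shell, nothing to index
console.log(hasIndexableContent(ssrPage));  // true — content is in the initial HTML
```

The SPA shell fails the check even though the rendered app may be full of content — that gap is exactly what the rendering strategies below close.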

Solutions

Server-side rendering (SSR):

The server runs React and sends fully rendered HTML:

// Next.js App Router: async server components render on the server
export default async function ProductPage({ params }) {
  const product = await getProduct(params.id);
  // Crawler receives the rendered markup: <h1>Widget Pro</h1>
  return <h1>{product.name}</h1>;
}

Static generation (SSG):

Pages are pre-built as HTML files at build time. Crawlers get complete HTML instantly.

Pre-rendering:

A service like Prerender.io detects crawler requests and serves a pre-rendered HTML snapshot instead of the SPA.
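The routing logic behind such a service can be sketched as a user-agent check: crawler requests get the snapshot, everyone else gets the SPA. The bot list and function names here are illustrative — a real service maintains a far longer, regularly updated list:

```javascript
// Substrings that commonly appear in crawler user-agent strings (illustrative subset).
const BOT_PATTERNS = [
  "googlebot",
  "bingbot",
  "duckduckbot",
  "facebookexternalhit",
  "twitterbot",
  "linkedinbot",
];

function isCrawler(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  return BOT_PATTERNS.some((bot) => ua.includes(bot));
}

// Decide which response to serve for a given request.
function selectResponse(userAgent) {
  return isCrawler(userAgent) ? "prerendered-snapshot" : "spa-shell";
}

console.log(selectResponse("Mozilla/5.0 (compatible; Googlebot/2.1)")); // "prerendered-snapshot"
console.log(selectResponse("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0")); // "spa-shell"
```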

Meta tags for social sharing:

For link previews, set meta tags on the server:

<head>
  <meta property="og:title" content="Product Name" />
  <meta property="og:description" content="Product description" />
  <meta property="og:image" content="https://example.com/image.jpg" />
</head>

These must be in the initial HTML — social crawlers do not run JavaScript.
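Generating these tags on the server can be sketched as a small template step; renderMetaTags and escapeHtml are illustrative helpers, not a standard API:

```javascript
// Escape values so page data cannot break out of the HTML attribute.
function escapeHtml(value) {
  return String(value)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

// Build Open Graph meta tags for inclusion in the initial HTML response.
function renderMetaTags({ title, description, image }) {
  return [
    `<meta property="og:title" content="${escapeHtml(title)}" />`,
    `<meta property="og:description" content="${escapeHtml(description)}" />`,
    `<meta property="og:image" content="${escapeHtml(image)}" />`,
  ].join("\n");
}

console.log(renderMetaTags({
  title: "Product Name",
  description: "Product description",
  image: "https://example.com/image.jpg",
}));
```

With SSR or SSG this happens automatically as part of rendering the page; the key is that the tags are in the response body, not injected later by client-side JavaScript.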

Other SPA SEO concerns

  • URL structure — use real URLs with React Router (not hash-based /#/page), so each page has a unique, crawlable URL
  • Sitemaps — generate a sitemap so crawlers know which pages exist
  • Page titles and meta descriptions — update them dynamically for each route using document.title or a library like react-helmet
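The sitemap bullet above can be sketched as a build-time generator that emits standard sitemap XML from the app's route list (buildSitemap, the routes, and the base URL are illustrative):

```javascript
// Emit a minimal sitemap.xml from a list of routes, typically run at build time.
function buildSitemap(baseUrl, routes) {
  const urls = routes
    .map((route) => `  <url><loc>${baseUrl}${route}</loc></url>`)
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>`;
}

console.log(buildSitemap("https://example.com", ["/", "/products", "/about"]));
```

This matters for SPAs in particular: when pages are built by client-side routing, crawlers cannot always discover them by following links, so the sitemap is how they learn the full set of URLs.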

Interview Tip

Explain the core problem (crawlers may not run JavaScript), then cover SSR and SSG as solutions. Mentioning meta tags for social sharing shows you think about practical SEO beyond just search engines. If you know Next.js handles this, say so.

Why interviewers ask this

SEO is critical for businesses that depend on organic traffic. Interviewers want to see if you understand the tradeoffs of SPAs, know how to make them SEO-friendly, and can recommend the right rendering strategy.