If your website runs on React, Vue, or Angular, there’s a good chance Google is only seeing a fraction of your content. I’ve audited dozens of JavaScript-heavy sites for Singapore businesses, and the pattern is consistent: beautiful front-end, hollow HTML shell, and rankings that flatline. These six JavaScript SEO best practices will help you fix that gap between what your users see and what Googlebot actually indexes.
This isn’t a surface-level overview. I’m going to walk you through the technical decisions, the specific tools, and the exact steps we use at bestseo.sg when a client comes to us with a JS-rendered site that’s underperforming in organic search.
How Googlebot Actually Processes Your JavaScript Site
Before we get into the fixes, you need to understand the problem at a mechanical level. Google doesn’t process your JavaScript site the way your browser does. It uses a two-phase system, and the gap between those phases is where most rankings get lost.
Phase 1: Crawling the Raw HTML
Googlebot sends a request to your URL and receives back whatever your server delivers. For a traditional HTML site, that’s a complete page with all your content, headings, links, and structured data baked in. For a client-side rendered JavaScript app, that response is often a near-empty HTML document with a single <div id="app"></div> and a bundle of JS files referenced in script tags.
At this stage, Googlebot extracts whatever it can. If your content isn’t in that initial HTML response, it doesn’t exist yet as far as Phase 1 is concerned.
Phase 2: Rendering (The Queue)
Google places your page into a rendering queue. When resources become available, Googlebot’s Web Rendering Service (WRS) executes your JavaScript, fetches additional API calls, builds the DOM, and then reads the fully rendered page. Only after this step can Google see your actual content.
Here’s the problem. That rendering queue isn’t instant. Google’s own documentation acknowledges this delay can range from seconds to days. During a 2023 audit for a Singapore e-commerce client running a React SPA, we found that 34% of their product pages hadn’t been rendered and indexed after two weeks of being live. That’s two weeks of zero organic visibility for new products.
The rendering queue is also resource-constrained. Google allocates a finite crawl budget to your site. Every JavaScript file that needs to be fetched and executed eats into that budget. If your bundle is 2MB of unminified code, you’re burning through resources that could be spent discovering and indexing more of your pages.
The 6 JavaScript SEO Best Practices You Should Implement
Now that you understand the mechanics, let’s get into the specific practices that close the gap between what users see and what Google indexes.
1. Pick a Rendering Strategy That Serves Google Complete HTML
This is the single most impactful decision you’ll make for JavaScript SEO. Your rendering strategy determines whether Google gets a full page on the first request or has to queue your site for later processing.
Server-Side Rendering (SSR) is the gold standard. Your server executes the JavaScript, builds the complete HTML, and sends it to both users and crawlers. Googlebot receives a fully formed page on the first request. No rendering queue. No delay. Frameworks like Next.js (for React), Nuxt.js (for Vue), and Angular Universal make this achievable without rewriting your entire application.
Static Site Generation (SSG) is even better for pages that don’t change frequently. Your build process pre-generates HTML files at deploy time. These are served as static files, which means they’re fast and completely crawlable. For a Singapore property listing site we worked on, switching from pure client-side rendering to SSG for their neighbourhood guide pages resulted in a 62% increase in indexed pages within three weeks.
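To make this concrete, here's a minimal sketch of an SSG product page using Next.js's Pages Router. The API endpoints, route, and field names are placeholders for illustration, not a prescription for any particular stack:

```tsx
// pages/products/[slug].tsx — a hypothetical product page pre-rendered at build time (SSG)
import type { GetStaticPaths, GetStaticProps } from 'next';

type Product = { slug: string; name: string; description: string };

export const getStaticPaths: GetStaticPaths = async () => {
  // Assumes a hypothetical internal API that lists products
  const products: Product[] = await fetch('https://api.example.com/products').then((r) => r.json());
  return {
    paths: products.map((p) => ({ params: { slug: p.slug } })),
    fallback: 'blocking', // new products are rendered on the server the first time they're requested
  };
};

export const getStaticProps: GetStaticProps = async ({ params }) => {
  const slug = String(params?.slug);
  const product: Product = await fetch(`https://api.example.com/products/${slug}`).then((r) => r.json());
  return { props: { product }, revalidate: 3600 }; // regenerate this page at most once per hour
};

export default function ProductPage({ product }: { product: Product }) {
  // This markup is in the server response, so Googlebot sees it without queuing for rendering
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
    </main>
  );
}
```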
Dynamic rendering is a pragmatic middle ground. You detect whether the request comes from a bot or a user, then serve pre-rendered HTML to bots and the normal JS experience to users. Google has explicitly stated this is acceptable and not cloaking, as long as the content is equivalent. Tools like Rendertron and Prerender.io handle this automatically.
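Here's a simplified sketch of the bot-detection idea at the Node/reverse-proxy layer. In practice you'd normally use Prerender.io's official middleware or a CDN worker rather than rolling your own, and the pre-render endpoint below is hypothetical:

```ts
// Simplified bot-detection middleware (Express, Node 18+ for global fetch).
import express from 'express';

const BOT_UA = /googlebot|bingbot|baiduspider|yandexbot|duckduckbot/i;
const RENDER_SERVICE = 'https://prerender.example.com/render?url='; // hypothetical pre-render endpoint

const app = express();

app.use(async (req, res, next) => {
  const userAgent = req.headers['user-agent'] ?? '';
  if (!BOT_UA.test(userAgent)) return next(); // humans get the normal client-side app

  // Bots receive pre-rendered HTML with equivalent content — acceptable to Google, not cloaking
  const targetUrl = `https://www.example.com${req.originalUrl}`;
  const rendered = await fetch(RENDER_SERVICE + encodeURIComponent(targetUrl));
  res.status(rendered.status).type('html').send(await rendered.text());
});

app.listen(3000);
```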
Here’s how to decide:
- Content-heavy pages that need SEO (blog posts, product pages, service pages): use SSR or SSG.
- Highly interactive dashboards or logged-in areas that don’t need indexing: client-side rendering is fine.
- Large existing CSR apps where a full SSR migration isn’t feasible yet: implement dynamic rendering as a bridge solution.
2. Verify That Google Actually Sees Your Content
Choosing the right rendering method is step one. Confirming it works is step two. I’ve seen teams implement SSR and assume the job is done, only to discover that certain API calls were failing during server-side execution, leaving entire sections blank.
Google Search Console’s URL Inspection Tool is your primary diagnostic instrument. Paste any URL from your site, click “Test Live URL,” and then view the rendered page. You’ll see exactly what Googlebot sees after rendering. Compare this against what you see in your browser. Any missing text, images, or navigation elements indicate a rendering failure.
For a more systematic check, use the Rich Results Test at search.google.com/test/rich-results. It renders your page and shows both the HTML output and any structured data detected. This is particularly useful for checking whether your schema markup, which is often injected via JavaScript, is actually visible to Google.
Here’s a manual technique I recommend to every client. Open Chrome, go to Settings, then Site Settings, then JavaScript, and disable it. Now browse your site. Every piece of content that disappears is content that depends entirely on JavaScript rendering. If your H1, your product descriptions, or your pricing information vanishes, you have a problem that needs fixing.
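If you'd rather script that comparison, here's a rough sketch using Node and Puppeteer. The URL and key phrase are placeholders, and Puppeteer only approximates Googlebot's Web Rendering Service:

```ts
// Compare what Phase 1 (raw HTML) and Phase 2 (rendered DOM) see for a given page.
// Assumes Node 18+ (global fetch) and the puppeteer package.
import puppeteer from 'puppeteer';

async function compareRawAndRendered(url: string, keyPhrase: string) {
  const rawHtml = await fetch(url).then((r) => r.text()); // what the crawler gets before rendering

  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const renderedHtml = await page.content(); // roughly what the Web Rendering Service ends up with
  await browser.close();

  console.log('Phrase in raw HTML:     ', rawHtml.includes(keyPhrase));
  console.log('Phrase in rendered HTML:', renderedHtml.includes(keyPhrase));
}

compareRawAndRendered('https://www.example.com/services', 'technical SEO audit');
```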
For Singapore businesses running bilingual sites (English and Chinese), pay special attention to language-switched content. We’ve found that JavaScript-based language toggles frequently fail during server-side rendering, meaning Google only indexes one language version. Check both language variants separately in Search Console.
3. Fix JavaScript Pagination So Google Can Crawl Every Page
Infinite scroll and JavaScript-powered pagination are ranking killers when implemented incorrectly. The core issue is simple: if clicking “Load More” or scrolling down doesn’t change the URL, Google has no way to discover or index the additional content.
Think of it like a hawker centre menu that only shows the first five items. If the rest of the menu requires you to tap a screen but there’s no separate page for each section, Google will only ever know about those first five dishes.
Use the History API to update URLs. When a user navigates to page 2 of your product listing, call history.pushState() to update the browser URL to something like /products?page=2 or /products/page/2. This creates a unique, crawlable URL for each paginated state. Each URL must return the correct content when accessed directly.
Include proper <a> tags with href attributes for your pagination controls. Don’t use <button onclick="loadPage(2)"> for pagination. Instead, use <a href="/products?page=2">Page 2</a>. You can still attach JavaScript event handlers for a smooth user experience, but the underlying HTML link gives Googlebot a path to follow.
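Put together, that progressive-enhancement pattern looks roughly like this. The selectors and markup are illustrative assumptions, not a drop-in snippet:

```ts
// A real <a href> for crawlers, fetch + pushState for users.
// Assumes markup like <a class="pagination-link" href="/products?page=2">Page 2</a>
// and that the server returns full HTML for /products?page=N when requested directly.
document.querySelectorAll<HTMLAnchorElement>('a.pagination-link').forEach((link) => {
  link.addEventListener('click', async (event) => {
    event.preventDefault(); // JS users get an in-place update
    const html = await fetch(link.href).then((r) => r.text());
    const nextDoc = new DOMParser().parseFromString(html, 'text/html');
    document.querySelector('#product-grid')!.innerHTML =
      nextDoc.querySelector('#product-grid')!.innerHTML;
    history.pushState({}, '', link.href); // the address bar now shows a crawlable URL
  });
});
```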
Add all paginated URLs to your XML sitemap. Don’t just include page 1. If you have 50 pages of products, all 50 URLs should be in your sitemap. Submit this through Google Search Console. For a Singapore fashion retailer we audited, adding paginated URLs to their sitemap led to 340 additional product pages being indexed within 10 days.
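A build-time sketch of that sitemap generation might look like the following; the page size and product count are made-up numbers:

```ts
// Emit every paginated listing URL into the sitemap, not just page 1.
const PER_PAGE = 24;
const totalProducts = 1200;
const pageCount = Math.ceil(totalProducts / PER_PAGE); // 50 paginated URLs

const urlEntries = Array.from({ length: pageCount }, (_, i) =>
  `  <url><loc>https://www.example.com/products?page=${i + 1}</loc></url>`
).join('\n');

const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urlEntries}
</urlset>`;
```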
You can also include rel="next" and rel="prev" link elements in your page head. Google has confirmed it no longer uses them as an indexing signal, but other search engines may still read them, and they provide a clear structural hint about the relationship between paginated pages.
4. Build Internal Links That Search Engines Can Actually Follow
Internal linking is how Google discovers new pages and understands your site’s hierarchy. JavaScript frameworks love to handle navigation with custom click events, router links, and programmatic redirects. Many of these are invisible to crawlers.
The rule is absolute: every navigational link must be a standard <a> tag with a valid href attribute. Not a <div> with an onClick. Not a <span> styled to look like a link. Not a JavaScript router.push() without a corresponding anchor element.
Here’s a quick test. Open your browser’s developer tools, go to the Elements panel, and search for all your navigation links. If you find elements like <div class="nav-item" @click="navigate('/about')">, those links are invisible to Googlebot during the crawl phase. Replace them with <a href="/about"> elements.
For React Router, use the <Link> component, which renders as a proper <a> tag. For Vue Router, use <router-link>, which does the same. Angular’s routerLink directive also produces correct anchor elements. The framework tools exist. You just need to make sure your developers are using them consistently.
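As a quick before/after illustration with React Router (the Vue and Angular equivalents follow the same principle), here's a minimal sketch:

```tsx
// Crawlable navigation: router link components render real <a href> elements.
import { Link } from 'react-router-dom';

// Invisible to the crawl phase — no href to follow:
//   <div className="nav-item" onClick={() => navigate('/about')}>About</div>

export function MainNav() {
  return (
    <nav>
      {/* Renders as <a href="/about">About</a>, so Googlebot gets a real link to follow */}
      <Link to="/about">About</Link>
      <Link to="/services">Services</Link>
    </nav>
  );
}
```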
Your XML sitemap becomes even more critical for JavaScript sites. It serves as a safety net, ensuring Google knows about pages that might be difficult to discover through link-following alone. Generate your sitemap automatically during your build process, submit it through Google Search Console (Google re-fetches it periodically), and audit it monthly to catch pages that have been removed or URLs that have changed.
5. Optimise JavaScript Performance for Faster Rendering
Page speed isn’t just a user experience metric. It directly affects how efficiently Google can crawl and render your site. Every kilobyte of JavaScript that Googlebot has to download and execute consumes crawl budget. Slow pages mean fewer pages crawled per session, which means slower indexing across your entire site.
Minify and compress your JavaScript bundles. Minification strips whitespace, shortens variable names, and removes comments. Compression (gzip or Brotli) reduces the transfer size further. A typical unminified React bundle might be 1.8MB. After minification and Brotli compression, that same bundle drops to around 280KB. That’s an 84% reduction in what Googlebot needs to download. Tools like Webpack, Vite, and esbuild handle this automatically in production builds.
Implement code splitting aggressively. Your homepage doesn’t need the JavaScript for your checkout flow. Code splitting breaks your application into chunks that load on demand. The browser (and Googlebot) only downloads the code needed for the current page. In Next.js, this happens automatically at the page level. For custom setups, use dynamic import() statements to split at logical boundaries.
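Outside of a framework, the same idea looks roughly like this; './checkout' stands in for whichever module you want to defer:

```ts
// Code-splitting sketch: the checkout bundle is only fetched when the user needs it.
const checkoutButton = document.querySelector('#start-checkout');

checkoutButton?.addEventListener('click', async () => {
  // Bundlers such as Webpack, Vite, and esbuild emit this as a separate chunk, loaded on demand
  const { startCheckout } = await import('./checkout');
  startCheckout();
});
```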
Defer or async non-critical scripts. Third-party scripts like analytics, chat widgets, and ad tracking don’t need to block rendering. Add the defer or async attribute to these script tags. Better yet, load them after the main content has rendered using requestIdleCallback or an intersection observer.
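A small sketch of that idle-loading pattern, with a placeholder widget URL:

```ts
// Load a non-critical third-party widget after the main thread is idle.
function loadChatWidget() {
  const script = document.createElement('script');
  script.src = 'https://widget.example.com/chat.js'; // placeholder — swap in your chat or analytics script
  script.defer = true;
  document.body.appendChild(script);
}

// requestIdleCallback isn't supported everywhere (e.g. Safari), so keep a timeout fallback
if ('requestIdleCallback' in window) {
  requestIdleCallback(loadChatWidget);
} else {
  setTimeout(loadChatWidget, 2000);
}
```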
Monitor your Core Web Vitals closely. For JavaScript-heavy sites, the metrics that suffer most are:
- Largest Contentful Paint (LCP): Often delayed because the main content waits for JS execution. Target under 2.5 seconds.
- Interaction to Next Paint (INP): Heavy JavaScript can block the main thread, making interactions feel sluggish. Target under 200 milliseconds.
- Cumulative Layout Shift (CLS): Dynamically injected content pushes elements around. Target under 0.1.
Use Chrome's Lighthouse tool or PageSpeed Insights to measure these. For Singapore-hosted sites, test from a Singapore-based server to get accurate latency measurements. If your hosting is overseas, consider a CDN with Singapore edge nodes to reduce time-to-first-byte for local users; Googlebot itself crawls mostly from US data centres, so global edge coverage speeds up crawling as well.
6. Set Up Ongoing Monitoring and Troubleshooting
JavaScript SEO isn’t a one-time fix. Every code deployment can introduce new rendering issues. A developer adds a new API call that fails server-side. A third-party script update breaks your structured data injection. A new route gets created without a corresponding sitemap entry. You need systems that catch these problems before they erode your rankings.
Google Search Console is your early warning system. Check the Pages report (formerly Coverage report) weekly. Look for increases in “Discovered, currently not indexed” or “Crawled, currently not indexed” pages. A sudden spike in either category often signals a rendering problem. The URL Inspection Tool lets you drill into specific pages to see exactly what went wrong.
Use a JavaScript-capable crawler for regular audits. Screaming Frog (with JavaScript rendering enabled) and Sitebulb can simulate Googlebot’s rendering process across your entire site. Run a full crawl monthly, or after any significant code deployment. Compare the rendered HTML against the raw HTML for each page. Any page where the rendered version has significantly more content than the raw version is a page that depends on client-side rendering, and is therefore at risk of indexing delays.
Set up automated alerts for Core Web Vitals regressions. Google Search Console provides this data, but the open-source web-vitals JavaScript library can send real-user metrics to your analytics platform in real time. When a deployment causes LCP to spike from 2.1 seconds to 4.8 seconds, you want to know within hours, not weeks.
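A minimal real-user monitoring sketch with the web-vitals library might look like this; the /analytics endpoint is an assumption, so point it at whatever your stack actually ingests:

```ts
// Real-user Core Web Vitals reporting sketch using the web-vitals library.
import { onCLS, onINP, onLCP } from 'web-vitals';

function sendToAnalytics(metric: { name: string; value: number; id: string }) {
  // sendBeacon survives page unloads, so metrics collected late in the visit still arrive
  navigator.sendBeacon('/analytics', JSON.stringify(metric));
}

onCLS(sendToAnalytics);
onINP(sendToAnalytics);
onLCP(sendToAnalytics);
```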
Keep a rendering changelog. Document every change to your rendering strategy, framework version, or build configuration. When indexing issues appear, you can cross-reference against recent changes to identify the cause quickly. This sounds basic, but it’s saved us dozens of hours in debugging for clients.
Framework-Specific Guidance for Singapore Development Teams
Most Singapore web agencies build on one of three major frameworks. Here’s what you need to know about each from a JavaScript SEO perspective.
React Sites
Default Create React App (CRA) setups use pure client-side rendering. This is the worst option for SEO. If your React site was scaffolded with CRA and you need organic traffic, you have two practical paths forward.
The first is migrating to Next.js, which gives you SSR, SSG, and incremental static regeneration out of the box. Next.js 14’s App Router makes it straightforward to choose rendering strategies on a per-page basis. Your marketing pages can be statically generated while your dynamic product pages use SSR.
The second path is implementing a dynamic rendering layer using Prerender.io or a self-hosted Rendertron instance. This is less invasive than a full framework migration and can be deployed at the CDN or reverse proxy level. For a Singapore fintech client regulated by MAS, we chose this approach because their compliance team couldn’t approve a full framework change within the required timeline. The result was a 41% increase in organic impressions within six weeks.
Vue Sites
Vue's ecosystem mirrors React's SEO challenges. Default Vue CLI projects render client-side. Nuxt is the established solution, offering SSR, SSG, and hybrid rendering via route rules in Nuxt 3, including ISR-style regeneration on supported hosts.
One Vue-specific issue I see frequently: developers using Vue’s v-if directive to conditionally render content based on API responses. If the API call hasn’t resolved during server-side rendering, that content won’t appear in the initial HTML. Use asyncData or useFetch in Nuxt to ensure data is fetched before the page renders on the server.
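A hedged sketch of the Nuxt 3 pattern, with a hypothetical internal API endpoint and field names:

```vue
<script setup lang="ts">
// useFetch resolves during server-side rendering, so this content is in the initial HTML
// rather than appearing only after hydration.
const { data: guide } = await useFetch<{ title: string; description: string }>(
  '/api/neighbourhoods/tiong-bahru'
);
</script>

<template>
  <article v-if="guide">
    <h1>{{ guide.title }}</h1>
    <p>{{ guide.description }}</p>
  </article>
</template>
```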
Angular Sites
Angular applications tend to be larger and more complex, often used for enterprise-grade projects. Angular Universal provides server-side rendering capabilities. The setup is more involved than Next.js or Nuxt.js, but the Angular CLI includes schematics that automate much of the configuration.
A common Angular-specific pitfall: using setTimeout or setInterval in components that run during SSR. These cause the server-side render to hang or timeout, resulting in incomplete HTML being sent to Googlebot. Use Angular’s isPlatformBrowser check to ensure browser-only code doesn’t execute on the server.
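A small sketch of that guard; the component and its behaviour are illustrative, not taken from any specific client codebase:

```ts
// Keep browser-only timers out of the server render (Angular Universal sketch).
import { Component, Inject, OnInit, PLATFORM_ID } from '@angular/core';
import { isPlatformBrowser } from '@angular/common';

@Component({
  selector: 'app-stock-ticker',
  template: `<p>{{ statusMessage }}</p>`,
})
export class StockTickerComponent implements OnInit {
  statusMessage = 'Checking availability…';

  constructor(@Inject(PLATFORM_ID) private platformId: Object) {}

  ngOnInit(): void {
    // Only start the interval in the browser; on the server it would delay or hang the render
    if (isPlatformBrowser(this.platformId)) {
      setInterval(() => {
        this.statusMessage = `Last checked at ${new Date().toLocaleTimeString()}`;
      }, 5000);
    }
  }
}
```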
JavaScript SEO and Mobile-First Indexing in Singapore
Google uses mobile-first indexing, meaning it primarily crawls and indexes the mobile version of your site. For JavaScript sites, this has specific implications you need to address.
Singapore has one of the highest smartphone penetration rates globally, at over 97%. Your mobile experience isn’t secondary. It’s the primary version Google evaluates. If your JavaScript renders differently on mobile versus desktop, or if certain scripts fail on mobile viewport sizes, Google will index the broken mobile version.
Test your rendered output at mobile dimensions. Use Search Console’s URL Inspection Tool, which crawls as a mobile user agent by default. Compare this against a desktop render. Any content present on desktop but missing on mobile is content Google won’t index.
Watch your JavaScript bundle sizes carefully for mobile. A 500KB bundle that loads in 1.2 seconds on a desktop fibre connection might take 4+ seconds on a mobile 4G connection. In Singapore, 5G coverage is extensive, but not universal. Test on throttled connections to get realistic performance numbers.
Lazy load below-the-fold content intelligently. Use intersection observers to load images and non-critical content as users scroll. But make sure your primary content, the text and elements above the fold, renders immediately without waiting for scroll events. Googlebot doesn't scroll or click the way a user does; it renders the page with a tall viewport, so content loaded via intersection observers is generally captured in the rendered DOM, but content gated behind user interaction events (like click-to-expand) may not be.
Common JavaScript SEO Mistakes I See on Singapore Websites
After auditing hundreds of Singapore-based sites, these are the recurring issues that cost businesses the most organic traffic.
Hash-based routing without fallback. URLs like example.com/#/products are problematic because Googlebot ignores the fragment identifier (everything after the #). Your entire SPA appears as a single page to Google. Switch to HTML5 history mode routing, which produces clean URLs like example.com/products.
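In Vue Router, for example, the switch looks roughly like this (other routers have equivalent history modes; the routes shown are placeholders):

```ts
// Switching Vue Router from hash URLs to HTML5 history mode (sketch).
import { createRouter, createWebHistory } from 'vue-router';

export const router = createRouter({
  history: createWebHistory(), // produces /products instead of /#/products
  routes: [
    { path: '/products', component: () => import('./pages/ProductList.vue') },
    { path: '/products/:slug', component: () => import('./pages/ProductDetail.vue') },
  ],
});
```

Keep in mind that history mode only works if your server or CDN returns the application (or, ideally, server-rendered HTML) for deep URLs instead of a 404.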
Blocking Googlebot from JavaScript resources. Check your robots.txt file. If you’re disallowing access to your /static/, /assets/, or /js/ directories, Googlebot can’t download the scripts it needs to render your pages. The rendered output will be an empty shell. Allow access to all resources required for rendering.
Relying on client-side redirects for canonical URLs. JavaScript-based redirects (using window.location) are slower for Googlebot to process than server-side 301 redirects. In some cases, Googlebot may not follow them at all. Always implement redirects at the server level.
Missing meta tags in the initial HTML. Your title tag, meta description, and canonical tag must be present in the server-rendered HTML, not injected by JavaScript after page load. If these tags are only added client-side, Google may use the empty or default values from your HTML template instead. For React, use next/head in Next.js. For Vue, use Nuxt’s useHead composable. For Angular, use the Meta and Title services within Angular Universal.
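For the Next.js Pages Router case, a minimal sketch looks like this; the titles and URLs are placeholders:

```tsx
// Title, description, and canonical emitted in the server-rendered HTML via next/head.
import Head from 'next/head';

export default function ServicePage() {
  return (
    <>
      <Head>
        <title>Technical SEO Audit Singapore | Example Agency</title>
        <meta
          name="description"
          content="See exactly what Googlebot indexes on your JavaScript site."
        />
        <link rel="canonical" href="https://www.example.com/services/technical-seo-audit" />
      </Head>
      <main>{/* page content */}</main>
    </>
  );
}
```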
Let’s Fix Your JavaScript Site’s SEO
If you’ve read this far, you probably already suspect your JavaScript site has indexing issues. The good news is that every problem described here is fixable, and the results tend to be dramatic. We’ve seen sites go from 30% indexed pages to 95% within a month of implementing proper rendering strategies.
Start with the URL Inspection Tool in Google Search Console. Check your five most important pages. If the rendered output doesn’t match what you see in your browser, you know where to begin. Work through the six practices above in order, starting with your rendering strategy, because everything else depends on getting that right.
If you’d rather have someone who’s done this hundreds of times handle the technical audit, reach out to us at bestseo.sg. We’ll run a full JavaScript rendering audit on your site and show you exactly what Google is missing. No obligation, just clarity on what needs fixing and how to fix it.
