I build React-based business/marketing websites - content-driven sites where SEO comes into play.
Now, common wisdom dictates that such a website should be static or at least SSR, so you want NextJS, Gatsby or another similar framework. But I’ve been reading about how crawlers can execute JavaScript now, so in theory, if your client-side JS is fast enough, it should still work for SEO purposes.
Does anyone have experience optimizing client-side React Apps for SEO? Is this doable or am I deluding myself?
It depends on what your site is for. Ecom? Unless you’re using the new merchant API, you’re shooting yourself in the foot on timing if you’re not SSRing. News sites just need to make sure they have a proper news sitemap. Everyone else should be OK - but you’re still going to fall behind the competition in a few smaller ranking races.
To break it down a little more - search engines have multiple kinds of bots that assess your site, discovery vs rendering for example. Discovery bots are rudimentary and do not render (in fact, the simplest bot is basically wget). If you have a bunch of links that only appear once client-side rendering runs, they won’t be discovered nearly as quickly, because they aren’t seen until a rendering bot visits. On top of that, the dissonance between the different records Google now holds for a single page (the raw HTML vs the rendered version) does cause a further delay.
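To make the discovery-vs-rendering gap concrete, here’s a minimal sketch of the “discovery view” of a page: fetch the raw HTML the way a non-rendering bot would and list the links that exist before any JavaScript runs (assuming Node 18+ for the global fetch; the URL is just a placeholder).

```ts
// Fetch the raw HTML of a page - roughly what a non-rendering discovery bot
// (or wget) sees - and list the links that are present before any JS executes.
// Assumes Node 18+; run against your own URL.

async function main() {
  const url = process.argv[2] ?? "https://example.com/";

  const html = await (await fetch(url, { redirect: "follow" })).text();

  // Naive href extraction - fine for a sanity check, not a real parser.
  const links = [...html.matchAll(/<a\s[^>]*href=["']([^"'#]+)["']/gi)].map(
    (m) => m[1]
  );

  console.log(`Raw HTML size: ${html.length} bytes`);
  console.log(`Links visible WITHOUT rendering: ${links.length}`);
  for (const href of links.slice(0, 20)) console.log("  " + href);
}

main().catch(console.error);
```

On a pure CSR app this usually prints little more than the app shell, which is exactly why render-injected links get picked up late.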
Here’s a case study on JS effects on indexing. Also be sure to watch your CrUX dataset as CSR tends to be a LOT harder on your mobile audience than you realize.
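If you want to keep an eye on that mobile CrUX data programmatically rather than through PageSpeed Insights, something like the sketch below works - assuming you have a CrUX API key (the CRUX_API_KEY variable and the origin are placeholders) and the origin has enough traffic to appear in the public dataset.

```ts
// Hedged sketch: pull the p75 mobile metrics for an origin from the CrUX API,
// the same dataset surfaced in Search Console / PageSpeed Insights.
// Assumes Node 18+ and a CRUX_API_KEY environment variable.

async function mobileCrux(origin: string) {
  const key = process.env.CRUX_API_KEY;
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${key}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ origin, formFactor: "PHONE" }),
    }
  );
  if (!res.ok) throw new Error(`CrUX query failed: ${res.status}`);
  const { record } = await res.json();

  // p75 is the value Google uses to judge whether a metric "passes".
  for (const [name, metric] of Object.entries(record.metrics)) {
    console.log(`${name}: p75 = ${(metric as any).percentiles?.p75}`);
  }
}

mobileCrux("https://example.com").catch(console.error);
```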
If your concern is SEO, simplify your site and put the content front and center via SSR. Seriously. The faster the search engine can get to your content and index it, the better.
If it has to load up a browser, wait for things to render, then poke around… that is time and money they are losing because of your choices.
The way I optimize React-based client apps… I remove React and move to SSR. The site loads faster and is more responsive with 1/10th of the bandwidth.
If you have only a few public pages, you can use plain HTML/CSS/JS for them and React for the actual app. If you are building a site with a lot of public pages but little reactivity, consider Astro or something similar. If you are building a site with a lot of public pages and a lot of reactivity, use Next or another SSR framework.
I had to switch to SSR for some pages. Google (and the engines that use Google’s data) indexes your JavaScript-rendered page content (with conditions and restrictions), but it doesn’t, for example, handle dynamic meta tag changes (fetch data, then change the page title and description based on that data).
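For reference, a rough sketch of what that switch can look like with Next.js (App Router assumed; getProduct, the API URL and the route params are made-up placeholders): the data fetch and the title/description end up in the server-rendered HTML instead of being patched in after a client-side fetch.

```tsx
// page.tsx sketch: compute title/description on the server so the meta tags
// are already in the HTML the crawler downloads. All names are placeholders.
import type { Metadata } from "next";

type Props = { params: { slug: string } };

// Hypothetical data fetch - swap in your own CMS/API call.
async function getProduct(slug: string) {
  const res = await fetch(`https://api.example.com/products/${slug}`);
  return res.json() as Promise<{ name: string; summary: string }>;
}

export async function generateMetadata({ params }: Props): Promise<Metadata> {
  const product = await getProduct(params.slug);
  return {
    title: product.name,
    description: product.summary,
  };
}

export default async function ProductPage({ params }: Props) {
  const product = await getProduct(params.slug);
  return <h1>{product.name}</h1>;
}
```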
Tavi said:
Most crawlers don’t execute JavaScript. If they do, it takes resources to load it, so your website ranks lower.
It’s not that it’s “most”… it’s that it takes a split second for wget versus something like a networkidle2 Puppeteer wait. On a 5-page site, no big deal. On a 50k-page ecom site? Big deal. G may be huge, but there are still costs involved here, and they correlate directly with crawl budgeting. So yes, your site gets hit more often by rudimentary bots, but rendering bots are just as busy - just with a bigger workload.
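A rough way to see the gap being described - compare a plain HTTP fetch against a headless-Chrome render that waits for networkidle2 (assumes Node 18+ and Puppeteer installed; the URL is a placeholder):

```ts
// Illustration of the crawl-cost gap: a plain HTTP fetch (the "wget" view)
// vs a full headless-Chrome render that waits for networkidle2.
import puppeteer from "puppeteer";

const url = "https://example.com/";

async function main() {
  // 1. What a rudimentary discovery bot pays: one HTTP round trip.
  let t = Date.now();
  await (await fetch(url)).text();
  console.log(`plain fetch: ${Date.now() - t} ms`);

  // 2. What a rendering bot pays: boot Chrome, run the JS, wait for the
  //    network to go quiet before the DOM is considered "done".
  t = Date.now();
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle2" });
  const rendered = await page.content();
  await browser.close();
  console.log(
    `rendered (networkidle2): ${Date.now() - t} ms, ${rendered.length} bytes`
  );
}

main().catch(console.error);
```

Multiply the difference by 50k URLs per crawl cycle and it’s clear why the rendering queue lags behind the simple fetches.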
@Ellis
Makes sense, rendering is way heavier, so even though Google has resources, they’re not infinite. Crawl budget definitely isn’t something to ignore on large sites.
@Ari
They do, they just do it slower and penalize you.
There is no beating SSR for things like SEO, period. For some apps what you get with CSR is good enough or close enough, but SSR will be the king of SEO for a long time still.