Most websites are built with SEO in mind; however, some do far better than others.
It is a well-known issue that JavaScript-based websites often run into problems when it comes to SEO. With more and more websites using modern JS frameworks and libraries such as React, Angular, or Vue.js, these SEO issues become even more prevalent and they have to be dealt with.
The fact of the matter is that developers are still in the early stages of optimizing modern JS frameworks for search engines, and despite the popularity of JavaScript websites, some of them still fail with Google.
Back in 2014, Google claimed that it was doing pretty well at rendering JavaScript websites; however, it was upfront about the fact that certain issues still existed. As mentioned in the article:
“Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately . . .”
Despite all of this, it is still entirely possible to set up proper SEO for JavaScript websites.
The basics of JavaScript SEO are a must-know for SEO professionals, as well as for developers dealing with websites that run on JS frameworks such as Angular.
In this article, we’re going to explore this topic in depth, so let’s get straight to the point.
Known Issues
Most issues with JavaScript websites come down to these three things:
- Crawling (Google’s ability to crawl a JavaScript website)
- Rendering (Google’s ability to render a JavaScript website)
- Crawl Budget (The time it takes to accomplish these tasks)
Google uses bots (crawlers, aka Googlebot) to collect data from the web and display it to its users. With traditional, HTML-based websites, this process is usually simple and there are rarely any problems. However, Google appears to struggle with websites based on modern JavaScript frameworks, for the reasons listed above.
If you're a Google Chrome user, you're probably reading this article in Chrome 70 or above. But did you know that Googlebot actually uses Chrome 41 to render websites?
That browser version was released in 2015, so it's no wonder it runs into problems when trying to crawl and render websites built on modern JavaScript frameworks. Keep in mind that we are talking about a four-year-old browser here.
Also, since JavaScript payloads are often heavy, they eat into Google's crawl budget, meaning it takes more time to crawl and index your pages; in some cases, pages may not get indexed at all.
Now you know why Google runs into problems when it comes to JavaScript, but it’s not all doom and gloom.
A Single JavaScript Error Can Be Fatal for SEO
In 2017, Bartosz Góralewicz conducted a JavaScript SEO experiment to see how Google handles websites built with modern JS frameworks. He found that Googlebot couldn't render Angular 2, a framework built by Google's own team. It turned out that the Google Angular team had made a small mistake, and it was fatal for SEO.
If you want to learn more about this fascinating case, check out Bartosz’s article covering this case.
Diagnosing JavaScript SEO Issues
The first step in this battle is to find out whether your website has a rendering problem at all.
As Google mentioned, Googlebot is able to render JavaScript websites successfully in most cases. Nonetheless, you should make sure your JS website renders properly, and you can do that by diagnosing it.
To diagnose your website, try loading your pages in Googlebot's default browser: Chrome 41.
Take the time to download Chrome 41 and see how your website behaves in it. In most cases, if your website works properly in Chrome 41, it should also work just fine for Googlebot. If it doesn't, check Chrome 41's console logs to see what the cause might be.
If you want to learn more about rendering on Google Search, check out this official article from Google covering this topic.
Additionally, Google recently updated its Mobile-Friendly Test to support JavaScript websites, so developers can use it to diagnose their sites further. The test now shows the rendered HTML version of your page, as well as console logs you can use to find out whether anything is broken.
What happens if I want to use modern JavaScript features for my website?
Introducing: Graceful Degradation, Transpilers, and Polyfills
JavaScript is now used more than ever; however, some features are simply incompatible with older browsers (and Chrome 41 is definitely one of them), and therefore impossible to render. To deal with this, Google suggests using graceful degradation.
If some of your JS features only work in modern browsers, you should make sure your website at least degrades gracefully in older ones. That's what transpilers and polyfills are for: they let you translate modern JavaScript statements into ones that older browsers (and Googlebot!) can understand.
For example, Chrome 41 doesn't support ES6, so you should make sure to transpile your JavaScript to ES5.
To quote Tomas Rudzki from Elephate:
“For example, when a transpiler encounters “let x=5” (a statement that many older browsers can’t understand), it translates it to “var x=5” (an expression which is totally understandable by older browsers, including Chrome 41 which is used by Google for rendering)”.
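If your build uses Babel, a configuration along these lines can handle that translation automatically. This is only a minimal sketch under the assumption that you use @babel/preset-env together with core-js; the browser targets and options shown here are examples to adjust to your own project:

```javascript
// babel.config.js: a minimal sketch, assuming @babel/preset-env and core-js 3.
// The browser targets are an example; tune them to your actual audience.
module.exports = {
  presets: [
    ['@babel/preset-env', {
      targets: { chrome: '41' },   // cover Googlebot's ageing rendering engine
      useBuiltIns: 'usage',        // inject core-js polyfills only where features are used
      corejs: 3,
    }],
  ],
};
```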
If you want to learn more about transpilers and polyfills, this article should explain just about everything.
Google’s Solution to JavaScript SEO
Until Google finds a way to make Googlebot support the latest technological advancements, it suggests that developers use dynamic rendering to process JavaScript successfully.
In short, dynamic rendering means you can switch between client-side rendered and server-side, pre-rendered content for specific users (in this case, Google's crawlers).
Dynamic rendering is a good fit for websites that use modern JavaScript features not supported by crawlers. Nothing changes for your regular users; they still load the same page in their browsers. Crawlers, however, are served a pre-rendered page, which sidesteps the three known issues we talked about earlier.
The approach is simple: where needed, the dynamic renderer serves content that's suitable for crawlers. In the case of modern JavaScript statements, for example, it serves a static HTML version instead.
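Conceptually, dynamic rendering comes down to inspecting the User-Agent header and deciding which version of a page to serve. The following is a minimal, hypothetical sketch of that idea using an Express server; the bot pattern and the prerenderPage() helper are simplified placeholders for illustration, not a real API (a production-ready option is covered in the next section):

```javascript
// A minimal, hypothetical sketch of dynamic rendering with Express.
// The bot pattern and prerenderPage() are simplified placeholders for illustration.
const express = require('express');
const path = require('path');
const app = express();

const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

// Placeholder: a real setup would return HTML produced by a headless browser
// (for example via the Rendertron service discussed below).
async function prerenderPage(url) {
  return `<html><body><h1>Pre-rendered content for ${url}</h1></body></html>`;
}

app.get('*', async (req, res) => {
  const userAgent = req.headers['user-agent'] || '';
  if (BOT_PATTERN.test(userAgent)) {
    // Crawlers receive a static, pre-rendered HTML snapshot
    res.send(await prerenderPage(req.originalUrl));
  } else {
    // Regular users receive the usual client-side rendered app
    res.sendFile(path.resolve('dist', 'index.html'));
  }
});

app.listen(8080);
```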
But which renderer should I use for dynamic rendering?
Although Google lists three options for this, its latest update (Feb. 2019) shows how to use Rendertron.
Rendertron is an open-source project that uses Headless Chrome to render content. It takes care of server-side rendering and lets you show a pre-rendered JavaScript page to crawlers. It uses the User-Agent HTTP header to determine whether a request comes from a user's browser or from a bot, and then serves the relevant content to each.
By default, Rendertron's middleware does not treat Googlebot as a bot, but you can fix that by adding Googlebot to the list of bot user agents:
```javascript
const rendertron = require('rendertron-middleware');
// Add Googlebot to the middleware's default list of bot user agents
const BOTS = rendertron.botUserAgents.concat('googlebot');
const BOT_UA_PATTERN = new RegExp(BOTS.join('|'), 'i');
```
To properly configure Rendertron dynamic rendering for your website, you should refer to Google’s official guide on Dynamic Rendering with Rendertron.
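To give a rough idea of what that configuration looks like in practice, here is a minimal sketch of an Express server using rendertron-middleware; the proxy URL is a placeholder for wherever your own Rendertron instance is deployed:

```javascript
// A rough sketch of serving pre-rendered pages to bots via rendertron-middleware.
// The proxyUrl is a placeholder; point it at your own deployed Rendertron instance.
const express = require('express');
const rendertron = require('rendertron-middleware');

const BOTS = rendertron.botUserAgents.concat('googlebot');
const BOT_UA_PATTERN = new RegExp(BOTS.join('|'), 'i');

const app = express();

// Requests whose User-Agent matches the bot pattern are proxied through Rendertron;
// everything else falls through to the normal client-side app.
app.use(rendertron.makeMiddleware({
  proxyUrl: 'https://my-rendertron-instance.example.com/render',
  userAgentPattern: BOT_UA_PATTERN,
}));

app.use(express.static('dist'));
app.listen(8080);
```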
It's still uncertain whether Google ranks JavaScript-based websites on par with HTML websites, but that part is out of our hands. The best thing we can do for our JS websites' SEO is to make sure Googlebot gets enough content to work with, and that's exactly what this article was meant to help you with. Enjoy!