SEO goes beyond simply ranking on Google; it also extends to positioning your website so search engines can find, understand, and rank it.
Starting with the basics, SEO comes in two flavors: on-page (what you do on the web page to rank) and off-page (what you do outside the web page to rank). Both draw on familiar SEO tactics like topic clusters, pillar pages, keyword research, backlinking, alt text, meta descriptions, and more.
One thing we often fail to discuss is how search engines like Google see our content. Google and the other big players, like Bing and DuckDuckGo, can only show your website on their SERPs because they crawl it first.
Here’s the kicker: not all websites can be crawled.
So, what do you do when you have excellent content and all your SEO checkboxes ticked, but you’re not showing up in any organic search?
That’s where Dynamic Rendering comes in.
Dynamic rendering means creating one version of your content specifically for search engine bots and another for users.
This is the technical part of SEO few people address. In today’s post, we’ll cover all you need to know about dynamic rendering and how it directly impacts your SEO.
What is Dynamic Rendering?
Dynamic rendering is the process of sending the server-side rendered version of your website to search engine bots and the client-side version to users. This means two versions are created: one client-side rendered version of your site for human users and a separate, server-side version for search engines.
This technique decides how to render content based on the agent requesting it: a human user or a search engine bot.
On the server side, the JavaScript content of your website is converted into the static HTML version search engine bots prefer. This version allows them to fully access, crawl, and index all content.
This method is one of the advanced SEO tips endorsed by Google.
“Currently, it's difficult to process JavaScript, and not all search engine crawlers are able to process it successfully or immediately. In the future, we hope that this problem can be fixed, but in the meantime, we recommend dynamic rendering as a workaround solution to this problem. Dynamic rendering means switching between client-side rendered and pre-rendered content for specific user agents.”
Why Does Google Recommend Dynamic Rendering?
JavaScript rendering is user-friendly but not bot-friendly.
On the implementation of dynamic rendering, Google’s Martin Splitt said:
“...serving and rendering the page will take some time, and we want to serve the page as quickly as possible to avoid timeouts (2:03)…. Even though Googlebot can execute JavaScript, we don’t want to rely on that now (2:37).”
The core issue is that bots and users need the same content rendered in different ways; serving each a format it handles well is the solution.
What is Rendering?
Rendering, in this context, is the process where Googlebot retrieves your pages, runs your code, and assesses your content to understand the layout or structure of your website.
Here is a visual representation of the rendering process.
The rendering process for web pages occurs in two phases:
- Initial HTML
- DOM
Initial HTML
This phase is the server’s response to Googlebot’s request: the bot crawls the entire page and identifies the JS and CSS files needed to build it.
DOM
The Document Object Model, abbreviated as DOM and sometimes called the rendered HTML, is an interface that represents how the browser reads your HTML (and XML). Through the DOM, JavaScript manipulates the structure and style of your website.
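To make the difference between the initial HTML and the rendered DOM concrete, here’s a minimal simulated sketch. The renderApp helper and the markup are hypothetical, standing in for a real browser executing a page’s JavaScript:

```javascript
// Simulated sketch only: renderApp and the markup are hypothetical,
// standing in for a real browser executing a page's script.
const initialHtml = '<div id="app"></div>';

// What the page's JavaScript would do in the browser: populate the DOM.
function renderApp(html) {
  return html.replace(
    '<div id="app"></div>',
    '<div id="app"><h1>Welcome</h1></div>'
  );
}

const renderedDom = renderApp(initialHtml);
// A crawler that doesn't execute JavaScript sees only initialHtml;
// a browser (or a dynamic renderer) sees renderedDom.
```

The content only exists after the script runs, which is exactly what a non-JavaScript crawler never sees.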
During this rendering process, Google collects all the vital information it needs to determine whether your content matches searchers’ intent, and how you compare with other websites in the same niche (this is the SEO part; we’ll come back to it later).
There’s a rule the Googlebot follows:
“Google will not index content it can’t render.”
This raises the next question.
How Does Google Index My Website?
Before Googlebot reports back to Google and indexes your website, it processes your plain HTML page in two steps: crawl and index.
Crawl
The crawl process starts with the HTML on a web page. The bot then reads all text and links (internal and external) and identifies content keywords to determine what the page is about.
Index
After the crawl process, Googlebot indexes the website, allowing it to appear on SERPs.
For Google, Bing, and every other search engine, rendering content in static HTML is best.
When your content is in JavaScript, it becomes difficult for search engines to index because of how many passes it can take before the content is processed successfully.
Also, executing your JavaScript is resource-intensive, and it eats into the time the bot has allocated to spend on the page.
As we discussed earlier, Google can index JavaScript content, but it takes considerable resources to get through the whole process of crawling, rendering, and indexing. Other search engines, such as Bing and DuckDuckGo, struggle to parse JavaScript content at all.
Dynamic rendering is the go-to solution for most, if not all, websites.
Dynamic Rendering for Bots and Users
Using dynamic rendering is preferable: serving plain static HTML to bots and JavaScript to users. This hybrid solution provides search engine bots with a machine-readable, text-only web page that’s simple to crawl and index.
Users, meanwhile, get a render-optimized and fully functional version of the web page. Serving it in JavaScript allows them to interact with the website and get the full web experience.
What Type of Websites Can Best Use Dynamic Rendering?
Websites with large amounts of rapidly changing, JavaScript-heavy content are the best fit for dynamic rendering.
Another use case is companies struggling with crawl-budget issues, primarily large websites. An excellent example is e-commerce companies with ever-changing inventories, catalogs, user preferences, user locations, and more.
An advantage for large websites using dynamic rendering is that deployment is simpler than with full server-side rendering.
If you’re thinking of opting for dynamic rendering but aren’t convinced, below are a few questions to help.
- Is the content on your website dynamic in its original form, i.e., does it change rapidly?
- Is your engineering team limited, and will implementing server-side rendering be a hassle or out of budget?
- Is the website you’re planning on implementing dynamic rendering indexable?
- If yes, do you need Javascript to render all or part of the content?
- Are you facing crawl-budget issues due to how large or dynamic your website content is?
- Is the load on your server exceeding your bandwidth, or are you facing related issues that keep your content from rendering properly?
If you answered yes to one or more of the questions above, then dynamic rendering may be your best solution.
What Problems Can Dynamic Rendering Solve?
Implementing dynamic rendering lets search engines crawl your website faster and more easily. Another big win is page speed, one of the significant factors both users and bots use to judge a website’s performance.
If your page speed is slow (according to Google, anything over 3-5 seconds), users will likely become frustrated and leave the website, making them “bounce.”
For bots, this means they won’t have enough time to crawl your web pages and index them all successfully. Without indexing all your pages, appearing on SERPs becomes next to impossible.
Since dynamic rendering is a solution for search-engine bots, the SEO benefits and impacts are based on whether your web page shows up on SERPs.
Dynamic Rendering In SEO: How Does It Help?
The fundamental purpose of dynamic rendering is to fix web pages not showing up on SERPs.
Users have complained that Google isn’t indexing some or all of their pages, hurting their SEO because those pages can’t rank.
Google noted that the main reason it can’t index some or all content is the presence of JavaScript on specific web pages. On one hand, this hurts a website’s overall SEO. On the other, Google can’t simply expand its crawl budget. This is where dynamic rendering comes in to solve both problems.
Starting with crawl budgets: major search engines like Bing and Google limit the time their bots spend on an entire website. This limit is called the crawl budget, and it varies per website.
According to Google,
“The amount of time and resources that Google devotes to crawling a site is commonly called the site’s crawl budget.
Note that not everything crawled on your website will necessarily be indexed; each page must be evaluated, consolidated, and assessed to determine whether it will be indexed after it has been crawled.
Crawl budget is dependent on two main elements: Crawl Capacity and Crawl Demand.”
So far, Google is the only search engine to share its formula for crawl budget:
Crawl Budget = Crawl Capacity + Crawl Demand
Crawl capacity depends on:
- The speed of your page(s)
- The number of errors the bot runs into
- The limit set in Google’s Search Console
Crawl demand depends on:
- The popularity of your pages
- The time of publishing or when it was last updated
Improving your SEO becomes much easier once you understand these key factors influencing your crawl budget.
Why’s it so important?
Googlebot crawling your website is an essential part of your SEO. SEO is simply your process of ranking on the SERPs to get traffic to your website. Before you can show up on SERPs, your site needs to be crawled and indexed. If the Googlebot isn’t crawling and indexing your website, your SEO is useless.
How Does Dynamic Rendering Help JavaScript-Enabled Websites?
The Problem
Not all websites can switch to plain static HTML; many depend on JavaScript to function. That’s a problem for search engines, as they have to render the page before the crawl bot can see the content JavaScript loads. As discussed earlier, the problem with loading JavaScript content is the time and resources it takes to execute successfully. So search engine bots defer JavaScript content until they have enough resources to execute it.
And even in instances where search engine bots do render JavaScript content, the result can differ from what users actually see.
The Solution
JavaScript is the problem for bots, so removing it from their path is the solution. With dynamic rendering, search engine bots get a static HTML version of your web page, taking JavaScript out of the equation.
The goal of dynamic rendering is to serve one version to bots and another to humans. So you may wonder:
Is Dynamic Rendering The Same As Cloaking?
Dynamic Rendering is not the same as cloaking.
If you use dynamic rendering to serve completely different content to bots than to humans, then it is cloaking.
Used as intended, however, Google does not consider dynamic rendering to be cloaking.
Here’s a statement from Google.
“Googlebot generally doesn't consider dynamic rendering as cloaking. So as long as your dynamic rendering produces similar content, Googlebot won't view dynamic rendering as cloaking.
When you're setting up dynamic rendering, your site may produce error pages. Googlebot doesn't consider these error pages as cloaking and treats the error as any other error page.
Using dynamic rendering to serve completely different content to users and crawlers can be considered cloaking. For example, a website that serves a page about cats to users and a page about dogs to crawlers can be considered cloaking.”
The site owner’s intent is the difference between dynamic rendering and cloaking.
How To Implement Dynamic Rendering
If you try implementing dynamic rendering on your own without technical knowledge, there’s a chance you’ll waste time and resources.
It’s best to hire an experienced developer team to set up a system that distinguishes search engine bots from humans. This system determines what content to serve for each agent’s request.
Here’s an overview of the setup Google recommends.
- Install and configure a dynamic renderer to transform your web content from JavaScript into static HTML. Depending on your team's system of choice, some standard dynamic renderers are Puppeteer, Rendertron, and Prerender.io.
- Choose the agents you want to receive your static HTML: search engine bots, in this case.
Here’s a sample list of bot user agents:

```javascript
export const botUserAgents = [
  'googlebot',
  'bingbot',
  'linkedinbot',
  'mediapartners-google',
];
```
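To decide which version to serve, a server can check the incoming request’s User-Agent header against such a list. Here’s a minimal sketch (the isBot helper is our own illustration, not part of any library):

```javascript
// Hypothetical helper: decide whether a request comes from a known crawler.
// The list mirrors the botUserAgents example above.
const botUserAgents = [
  'googlebot',
  'bingbot',
  'linkedinbot',
  'mediapartners-google',
];

function isBot(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return botUserAgents.some((bot) => ua.includes(bot));
}

// A server would branch on this result, e.g.:
// isBot(req.headers['user-agent']) ? servePrerenderedHtml() : serveJsApp();
```

Real bot user-agent strings contain the crawler name (for example, “Mozilla/5.0 (compatible; Googlebot/2.1; …)”), which is why a case-insensitive substring match is enough here.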
- If your server slows down due to pre-rendering problems, or you see a high volume of pre-rendering requests, try implementing a cache of pre-rendered content.
- Next, determine whether the user agent is on mobile or desktop. Then use dynamic serving to provide the appropriate content to each.
Dynamic Serving
Dynamic Serving is a setup where the server responds with a different HTML (and CSS) on the same URL depending on which user agent requests the page (mobile, tablet, or desktop).
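As a rough illustration of dynamic serving, the device check can be sketched as a plain function. The chooseTemplate helper and the template names are hypothetical; a real server should also send a Vary: User-Agent header so caches keep the two versions separate:

```javascript
// Hypothetical sketch of dynamic serving: same URL, different HTML per device.
// chooseTemplate is our own helper, not part of any framework.
function chooseTemplate(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  // Deliberately naive device detection, for illustration only.
  const isMobile = /mobile|android|iphone/.test(ua);
  return isMobile ? 'mobile.html' : 'desktop.html';
}

// In a real server you would also set "Vary: User-Agent" so intermediate
// caches don't serve the mobile HTML to desktop visitors, e.g.:
// res.set('Vary', 'User-Agent').send(render(chooseTemplate(req.get('user-agent'))));
```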
- Configure your server to deliver static HTML to the crawlers that you selected. Depending on your technology, there are several ways you can do this; below are a few examples.
- Proxy requests coming from crawlers to the dynamic renderer.
- Pre-render as part of your deployment process and make your server serve the static HTML to crawlers.
- Build dynamic rendering into your custom server code.
- Serve static content from a pre-rendering service to crawlers.
- Use middleware on your server. A good example is Rendertron, an open-source solution based on headless Chromium.
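As an example of the middleware option, the open-source rendertron-middleware package for Express proxies requests from known bots to a Rendertron instance. A minimal sketch, assuming that package and a placeholder Rendertron URL:

```javascript
// Sketch only: assumes the express and rendertron-middleware npm packages;
// the proxy URL is a placeholder for your own Rendertron deployment.
const express = require('express');
const rendertron = require('rendertron-middleware');

const app = express();

// Requests from known bot user agents are proxied to the Rendertron
// instance, which returns fully rendered static HTML; everyone else
// falls through to the normal JavaScript application below.
app.use(rendertron.makeMiddleware({
  proxyUrl: 'https://my-rendertron-instance.example.com/render',
}));

app.use(express.static('dist')); // the client-side app for human users
app.listen(8080);
```

The middleware route keeps dynamic rendering out of your application code entirely, which is part of why Google highlights it as the simplest path for existing sites.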
Verifying Your Dynamic Rendering Configuration
After implementing dynamic rendering for your content, verify that it works. Do the following:
Mobile-Friendly Test
This is to see if your page is mobile-friendly based on your new implementation.
Visit Google's mobile test page and insert your address.
If you get a success message, your website works for users on mobile. Here’s a visual representation of our website testing out the tool.
URL Inspection Tool
Use the URL Inspection Tool to ensure that your desktop content is also visible on the rendered page. The rendered page is how Googlebot sees your page once it has been crawled and indexed.
Rich Results Test
The Rich Results Test is mainly for websites with structured data. This is to ensure your structured data renders appropriately.
Using Edgemesh Client As A Dynamic Rendering Solution
Edgemesh’s cloud software converts complex enterprise websites into static HTML for bots and creates pre-rendered, cached structured data for each page on the client side. This way, we give your users the optimum browsing experience and bots a seamless crawling and indexing process.
Wrapping Up
We introduced dynamic rendering to enable easy crawling and indexing of web pages in line with searchers’ intent. Understanding it gives you the insight to pinpoint issues when search engine bots aren’t indexing your pages.
JavaScript-heavy websites are typically difficult to crawl. If your business depends on JavaScript, consider opting for dynamic rendering today.
That brings an end to today’s topic, but not the end of the conversation. If you have any questions relating to dynamic rendering, feel free to talk to us about it.
If you enjoyed this article and would like to see more of our content, please feel free to take a look at some of the ones below.
- Complete Guide On Time To First Byte (TTFB)
- Complete Guide On First Contentful Paint (FCP)
- All You Need To Know About Total Blocking Time (TBT)
- What Is Start Render Time, and How Do You Improve It?
- The Complete Guide To Google’s Core Web Vitals: Largest Contentful Paint (LCP)
Do customer experience, good conversions, low bounce rates, and above all, speed matter to you? Then you’ll love Edgemesh’s Enterprise-Grade Web Acceleration.
Our intelligent, automated, and next-generation client-side caching readies your website to move at full speed—with just a single line of code. Plus, it takes under 5 minutes to set up.
What do you say?
Start your 14-day trial to get a feel for what speed means to your business.