The Complete Guide to Dynamic Rendering & How It Impacts SEO
SEO goes beyond simply ranking on Google; it also covers how you position your website so that it can be ranked in the first place.
Starting with the basics, SEO comes in two forms: on-page (what you do on the web page to rank) and off-page (what you do outside the web page to rank). Both rely on familiar tactics such as topic clusters, pillar pages, keyword research, backlinks, alt text, and meta descriptions.
One thing we rarely discuss is how search engines like Google see our content. Google and other major engines such as Bing and DuckDuckGo can show your website on their SERPs only because they can crawl it.
Here’s the kicker: not all websites can be crawled.
So, what do you do when you have excellent content and every SEO checkbox ticked, but you're not showing up in organic search?
That’s where Dynamic Rendering comes in.
Dynamic rendering is creating a version of your content specifically for search engine bots and creating another for users.
This is the technical part of SEO few people address. In today’s post, we’ll cover all you need to know about dynamic rendering and how it directly impacts your SEO.
What is Dynamic Rendering?
Dynamic rendering is the process of sending the server-side rendered version of your website to search engine bots and the client-side version to users.
In other words, it’s a pre-rendering of content on a web page for search engine crawl bots such as Googlebot, DuckDuckBot, Bingbot, and others.
This technique decides how to render content based on the agent requesting it: a user or a search engine bot.
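As a minimal sketch of that decision (the function name and the list of crawler substrings here are illustrative, not a standard API), the idea can be expressed as:

```javascript
// Illustrative sketch: decide which version of a page to serve
// based on the requesting agent's User-Agent header.
// The substrings below are examples of common crawler identifiers.
const botUserAgents = ['googlebot', 'bingbot', 'duckduckbot'];

function chooseVersion(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  const isBot = botUserAgents.some((bot) => ua.includes(bot));
  // Bots get the pre-rendered (server-side) HTML; humans get the
  // normal client-side rendered application.
  return isBot ? 'server-side' : 'client-side';
}

console.log(chooseVersion('Mozilla/5.0 (compatible; Googlebot/2.1)')); // server-side
console.log(chooseVersion('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0')); // client-side
```

A real implementation would hang this check off the incoming HTTP request, but the branching logic is this simple at its core.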
This method is one of the advanced SEO tips endorsed by Google.
Why Does Google Recommend Dynamic Rendering?
On the implementation of dynamic rendering, Google’s Martin Splitt said:
The main issue with bots and users is rendering—having a common interest is the solution here.
What is Rendering?
Rendering, in this context, is the process where Googlebot retrieves your pages, runs your code, and assesses your content to understand the layout or structure of your website.
Here is a visual representation of the rendering process.
The rendering process for web pages occurs in two phases:
- Initial HTML: this is the server’s first response. Googlebot crawls the page’s markup to read its text and links and to identify the JS and CSS files needed to build the page.
- Rendered HTML: once rendering resources are available, Googlebot executes the page’s JavaScript and processes the resulting content, much as a browser would.
During this rendering process, Google collects all vital information to allow them to determine if your content correlates with searchers’ intent, and your position compared with other websites in the same niche (this is the SEO part—we’ll come back to this later).
There’s a rule the Googlebot follows:
“Google will not index content it can’t render.”
This raises the next question.
How Does Google Index My Website?
Before Googlebot reports back to Google and your website is indexed, it processes your plain HTML page in two steps: crawl and index.
The crawl process works by first crawling the HTML on a web page. Then it reads all text and links (internal and external), and identifies content keywords to give meaning to what the page is about.
After the crawl process, Googlebot then indexes the website, allowing it to appear on SERPs.
For Google, Bing, and every other search engine, rendering content in static HTML is best.
Dynamic rendering is the go-to solution for most, if not all, websites.
Dynamic Rendering for Bots and Users
What Type of Websites Can Best Use Dynamic Rendering?
The primary candidates are large, JavaScript-heavy websites whose content changes rapidly. Another use case is companies struggling with crawl-budget issues, again primarily large websites. An excellent example is e-commerce companies with an ever-changing inventory, catalog, user preferences, user locations, and more.
An advantage of large websites using dynamic rendering is the simplicity of deployment compared to server-side rendering.
If you’re thinking of opting for dynamic rendering but aren’t convinced, below are a few questions to help.
- Is the content on your website dynamic in its original form, i.e., does it change rapidly?
- Is your engineering team limited, and will implementing server-side rendering be a hassle or out of budget?
- Is the website you’re planning on implementing dynamic rendering indexable?
- Are you facing crawl-budget issues due to how large or dynamic your website content is?
- Is the load on your server exceeding the recommended bandwidth, or are you facing related issues affecting your content rendering dynamically?
If you answered yes to one or more of the questions above, then dynamic rendering may be your best solution.
What Problems Can Dynamic Rendering Solve?
Implementing dynamic rendering allows search engines to crawl your website faster and more easily. Another big one is page speed. Page speed is one of the significant factors both users and bots agree is key in estimating a website’s performance.
If your page speed is slow (according to Google, anything over 3-5 seconds), users will likely become frustrated and leave the website, making them “bounce.”
For bots, this means they won’t have enough time to crawl your web pages and index them all successfully. Without indexing all your pages, appearing on SERPs becomes next to impossible.
Since dynamic rendering is a solution for search-engine bots, the SEO benefits and impacts are based on whether your web page shows up on SERPs.
Dynamic Rendering In SEO: How Does It Help?
The fundamental basis of dynamic rendering is to fix web pages not showing up on SERPs.
Users have complained that Google isn’t indexing all or some of their pages, hurting their SEO because those pages can’t rank.
Starting with crawl budgets, major search engines like Bing and Google have limited time their bots can spend on an entire website. They set up a time limit called the crawl budget and it varies per website.
According to Google,
“The amount of time and resources that Google devotes to crawling a site is commonly called the site’s crawl budget.
Note that not everything crawled on your website will necessarily be indexed; each page must be evaluated, consolidated, and assessed to determine whether it will be indexed after it has been crawled.
Crawl budget is dependent on two main elements: Crawl Capacity and Crawl Demand.”
So far, we know only Google’s formula for crawl budget:
Crawl Budget = Crawl Capacity + Crawl Demand
Crawl capacity depends on:
- The speed of your page(s)
- The number of errors the bot runs into
- The limit set in Google’s Search Console
Crawl demand depends on:
- The popularity of your pages
- The time of publishing or when it was last updated
Improving your SEO becomes far more manageable once you understand the key factors influencing your crawl budget.
Why’s it so important?
Googlebot crawling your website is an essential part of your SEO. SEO is simply your process of ranking on the SERPs to get traffic to your website. Before you can show up on SERPs, your site needs to be crawled and indexed. If the Googlebot isn’t crawling and indexing your website, your SEO is useless.
The goal of dynamic rendering is serving a version to the bot and another to humans. So you may wonder:
Is Dynamic Rendering The Same As Cloaking?
Dynamic Rendering is not the same as cloaking.
If you use dynamic rendering to serve completely different content to bots than to humans, then it’s cloaking.
Google considers dynamic rendering not to be cloaking.
Here’s a statement from Google.
“Googlebot generally doesn't consider dynamic rendering as cloaking. So as long as your dynamic rendering produces similar content, Googlebot won't view dynamic rendering as cloaking.
When you're setting up dynamic rendering, your site may produce error pages. Googlebot doesn't consider these error pages as cloaking and treats the error as any other error page.
Using dynamic rendering to serve completely different content to users and crawlers can be considered cloaking. For example, a website that serves a page about cats to users and a page about dogs to crawlers can be considered cloaking.”
The site owner’s intention is the difference between dynamic rendering and cloaking.
How To Implement Dynamic Rendering
If you try implementing dynamic rendering on your own without technical knowledge, there’s a chance you’ll waste time and resources.
It’s best to hire an experienced developer team to set up a system that identifies search engine bots versus humans. This system will help determine what content to serve each agent’s request.
Here’s an overview of what to expect from Google.
- Choose the agents you want to receive your static HTML (search engine bots, in this case).
Here’s a sample snippet listing a few bot user agents:
export const botUserAgents = ['googlebot', 'bingbot', 'duckduckbot'];
- If your server slows down due to pre-rendering problems, or you see a high volume of pre-rendering requests, try implementing a cache of pre-rendered content.
- Next, try to determine the source of the user agent—mobile or desktop. Then, use dynamic serving to help provide the appropriate content to either user agent.
Dynamic Serving is a setup where the server responds with a different HTML (and CSS) on the same URL depending on which user agent requests the page (mobile, tablet, or desktop).
- Configure your server to deliver static HTML to the crawlers that you selected. Depending on your technology, there are several ways you can do this; below are a few examples.
- Proxy requests coming from crawlers to the dynamic renderer.
- Pre-render as part of your deployment process and make your server serve the static HTML to crawlers.
- Build dynamic rendering into your custom server code.
- Serve static content from a pre-rendering service to crawlers.
- Use middleware on your server; a good example is Rendertron, an open-source solution based on headless Chromium.
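The steps above can be sketched as one serving decision. This is a hedged illustration, not a production setup: the function names and return strings are hypothetical, and a real deployment would typically delegate the bot branch to a pre-rendering service or middleware such as Rendertron.

```javascript
// Illustrative sketch of the serving decision described above.
// Names and the returned template labels are hypothetical.
const botUserAgents = ['googlebot', 'bingbot', 'duckduckbot', 'linkedinbot'];

function isBot(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return botUserAgents.some((bot) => ua.includes(bot));
}

function isMobile(userAgent) {
  // Very rough device check, used for dynamic serving (mobile vs. desktop).
  return /mobile|android|iphone/i.test(userAgent || '');
}

function selectResponse(userAgent) {
  if (isBot(userAgent)) {
    // Crawlers get pre-rendered static HTML (e.g., from a cache or a
    // pre-rendering service such as Rendertron).
    return 'static-html';
  }
  // Humans get the client-side app; dynamic serving can still vary
  // the markup by device on the same URL.
  return isMobile(userAgent) ? 'client-app-mobile' : 'client-app-desktop';
}
```

Caching the pre-rendered pages, as the steps recommend, keeps the bot branch from re-rendering on every crawl request.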
Verifying Your Dynamic Rendering Configuration
After implementing dynamic rendering for your content, verify that it works. Do the following:
Mobile-Friendly Test
Visit Google’s Mobile-Friendly Test page and insert your URL. This shows whether your page is mobile-friendly with your new implementation in place.
If you get a success message, your website works for users on mobile. Here’s a visual representation of our website testing out the tool.
URL Inspection Tool
Use the URL Inspection Tool to ensure that your desktop content is also visible on the rendered page. Once crawled and indexed, the rendered page is how the Googlebot sees your page.
Rich Results Test
The Rich Results Test is mainly for websites with structured data. This is to ensure your structured data renders appropriately.
Using Edgemesh Client As A Dynamic Rendering Solution
Edgemesh’s cloud software solution converts complex enterprise websites into static HTML for bots and creates pre-rendered, cached structured data for each page on the client side. This way, we give your users the optimum browsing experience and bots a seamless crawling and indexing process.
We introduce dynamic rendering to make crawling and indexing web pages easy for search engines. Understanding this gives you the insight to pinpoint issues when search engine bots aren’t indexing your pages.
That brings an end to today’s topic, but not the end of the conversation. If you have any questions relating to dynamic rendering, feel free to talk to us about it.
If you enjoyed this article and would like to see more of our content, please feel free to take a look at some of the ones below.
- Complete Guide On Time To First Byte (TTFB)
- Complete Guide On First Contentful Paint (FCP)
- All You Need To Know About Total Blocking Time (TBT)
- What Is Start Render Time, and How Do You Improve It?
- The Complete Guide To Google’s Core Web Vitals: Largest Contentful Paint (LCP)
Do customer experience, good conversions, low bounce rates, and overall speed matter to you? Then you’ll love Edgemesh’s Enterprise-Grade Web Acceleration.
Our intelligent, automated, and next-generation client-side caching readies your website to move at full speed—with just a single line of code. Plus, it takes under 5 minutes to set up.
What do you say?
Start your 14-day trial to get a feel for what speed means to your business.