Can Google really see everything on your page? How Develocraft ended their SEO nightmare

Quick summary
Develocraft is a digital product agency that recently launched a brand new website. Despite a solid user experience, they discovered that Google’s crawlers were unable to render much of their on-page content, potentially devastating their SEO. Here’s their story of how Onely’s guidance helped them get back on track.

Why SEO matters

Fact: a study found that, on average, moving up one spot on Google’s SERP (Search Engine Results Page) increased the click-through rate by 30.8%.

The same 2019 study, which analyzed 5,000,000 search results, also found that the number one organic result gets an average of 31.7% of click-throughs.

The number one organic result on Google gets over 30% of all clicks

If your business is like ours, you want to attract as many leads through your web presence as possible. That can make statistics like these equal parts informative and terrifying.

Why? As of February 2020, Google had a search engine market share of 92.07% according to Statcounter, which monitors aggregate data for over 10 billion page views per month.

This means that a slight advantage on Google for your preferred keywords – which absolutely should be tied to the core products and services you offer – can mean hammering your competitors (assuming you can convert your traffic).

Background – Develocraft’s new site

At Develocraft, we recently undertook a total brand refresh. As part of this, we were excited to build a brand new website.

We chose to build it from the ground up ourselves with React, rather than rely too heavily on premade solutions like WordPress, for the following reasons:

  • We aren’t limited by the functionality and design constraints imposed by certain platforms
  • It allows us to train our developers in a real environment on skills they may need for our clients
  • Our headless CMS allows us to alter any text on the website in a range of ways
  • It gives us the option to build our site into a customer data platform
  • We now have a testbed for experiments with web development practices that we can use on behalf of our clients in the future

Why Google doesn’t always index JavaScript-powered websites like ours

You can see why it’s so important for Google and other search engines to be able to see (and ‘index’) our content. But why couldn’t they? Here’s a blow-by-blow of some of the underlying factors.

Factors that put us at risk

  • Because we wanted a dynamic website, we made JavaScript (JS) the main programming language for our site.
  • As a JavaScript-focused site using React.js, the majority of our case study pages weren’t built using static HTML.
  • Although we had some HTML, this was restricted to a very small part of our content, such as section headings and footers.
  • Google’s current indexing process (how it finds and makes content available to searchers) is focused on HTML because it defines the actual content of the page.
  • Google’s bots – creatively named Googlebot – have a lot more to do when crawling a JS page. The process is more complex, and if rendering takes too long, they may simply skip the page. This can be due to exceeding your ‘crawl budget.’
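To make that risk concrete, here’s a minimal sketch (the markup is hypothetical, not our actual site) contrasting the initial HTML a client-rendered React app serves with server-rendered HTML:

```javascript
// Client-side rendering (CSR): the HTML shell is nearly empty.
// All visible content only appears after the browser runs the JS bundle.
const csrHtml = `
  <html>
    <body>
      <div id="root"></div>
      <script src="/bundle.js"></script>
    </body>
  </html>`;

// Server-side rendering (SSR): the content is already in the HTML,
// so a crawler can index it without executing any JavaScript.
const ssrHtml = `
  <html>
    <body>
      <div id="root">
        <h1>Case study: example project</h1>
        <p>The full article content is present in the initial response.</p>
      </div>
      <script src="/bundle.js"></script>
    </body>
  </html>`;

// A crawler that doesn't render JS only "sees" the raw HTML:
console.log(csrHtml.includes('Case study')); // false
console.log(ssrHtml.includes('Case study')); // true
```

If rendering is skipped or fails, Googlebot is effectively stuck with the CSR-style empty shell.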


empty Google cache for develocraft's page

What Google saw when looking at one of the case studies on our page, according to the public cache. Although the public cache may not offer a perfect view of what Google can index, it can be a helpful indication.

Common reasons Google can’t render a page

Those were the general factors that put us at a higher risk of running into this issue. The specific reasons Google might not render your pages properly, however, can be harder to determine.

Common causes include:

  • Timeouts while rendering
  • Errors during the rendering process
  • Googlebot blocked (usually unintentionally) from accessing important JS files by the website admin
  • A site structure that didn’t allow Googlebot to discover the page

Why Google couldn’t see our content

There are various reasons why your JavaScript-dependent content might not be properly served to Googlebot, and the way your server delivers that content is a big determining factor in the type of issues you may face. There are two main ways of serving content to both robots and users:

  • Server-side rendering (SSR)
  • Client-side rendering (CSR)

You can find more detail about each of these in Onely’s guide to JavaScript SEO, but server-side rendering – which we use at Develocraft – is our focus here.

SSR involves supplying the bot with an HTML file describing all on-page content as soon as it arrives. Usually, this is the approach that leads to the fewest problems.

At Develocraft, our server-side rendering was blocked by a bug in the part of our JS script responsible for fetching data from the content management system (in our case – Strapi).

We’d recently introduced a new feature to help us manage the elements of our on-page content, allowing us to easily and flexibly integrate different elements into our case studies, articles, and blog posts.

Develocraft's tool for managing on-page content

Inside Apollo, the GraphQL client library we use, queries are created to fetch data flexibly via heuristic fragment matching. Unfortunately, our first attempt at introducing this didn’t contain the code required to deliver the fetched information (such as paragraphs of content, images, and so forth) into the HTML received by Googlebot.
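The bug boiled down to a query whose fragments didn’t actually select the content fields, so the server-side render had nothing to put into the HTML. A hypothetical sketch of the shape of the problem (the type and field names are illustrative, not our real schema):

```graphql
# Broken version: the fragment matches the section type,
# but selects no content fields, so SSR renders an empty section.
fragment ParagraphFields on ParagraphSection {
  __typename
  # text   <- missing: the paragraph body was never fetched
}

# Fixed version: the fetched data now includes everything
# the server needs to render the HTML.
fragment ParagraphFieldsFixed on ParagraphSection {
  __typename
  text
  images {
    url
    alt
  }
}

query CaseStudy($slug: String!) {
  caseStudy(slug: $slug) {
    title
    sections {
      ...ParagraphFieldsFixed
    }
  }
}
```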

While trying to understand what had gone wrong, Onely’s Ultimate Guide to JavaScript SEO along with their WWJD tool were extremely valuable in helping to uncover our issues.

In particular, Chapter 3 helped us:

  • See if Google could technically render our website.
  • Identify possible reasons why it couldn’t.
  • Learn how to use a range of tools, such as Google Search Console, to troubleshoot the issue.

If you’d like more detail on how to discover similar problems on your own pages, you can follow that link, or read on to learn how to use Google Search Console to make sure you aren’t facing the same problems.

Google Search Console showing Develocraft's URL available to Google

After making the adjustments needed, you can check back in with each tab in Google Search Console. Here you can see that Googlebot was able to render our content.

How to check if Google can index your pages

Google Search Console is invaluable for spotting problems with your page. Here’s how to check if Google can render your JS content, step by step.

Checking if Google can render your page

  1. Navigate to your Google Search Console (GSC) page.
  2. Go to the URL Inspection tool.
  3. Use the Live Test function.
  4. Click on the screenshot tab.
  5. Inspect the screenshot to see if Google renders your main content and any other important elements, like links, articles, and products.
  6. Check the HTML tab for the content, too.
Google Search Console showing a URL that's available to Google but has issues

What we saw when testing one of our case study pages following the above steps. Google could render the ‘fake content’, but that was it.

If you’re not seeing the content you want people to find when viewing a SERP, it’s time to take immediate action. 

Remember to check all pages on your site. In our case, the Home and Services pages were displaying correctly, but our case studies pages were not.

Perform these checks several times to make sure it’s not simply a hiccup in your server.

How to see what resources Google couldn’t load

There can be multiple issues with your pages, meaning it’s best to use all GSC features to get a comprehensive view of what’s holding your SEO back.

Clicking the More Info tab in the URL Inspection Tool will display a detailed list of the resources Google couldn’t load, giving you:

  • The number of resources that couldn’t be loaded versus the total number of page resources
  • The type of resource, e.g. script, font, or image
  • An identifying URL for each
Google Search Console showing the More Info tab for a URL with issues

Checking your HTML with Google Search Console

JavaScript is a valuable tool for web developers because it allows them to change the HTML on their pages dynamically. 

But what if your HTML isn’t being affected properly?

You can find out by taking the following steps:

  1. Navigate to your Google Search Console (GSC) page
  2. Perform a URL inspection – found in the bar on the left
  3. Click ‘View Tested Page’ – this will open a side window on the right displaying the on-page HTML Google could find
  4. Select some text from the page that you want Google searchers to be able to find, then copy it to your clipboard with CTRL + C or CMD + C 
  5. Go back to GSC and check whether the text is making it into the HTML seen by the bots by opening a search box with CTRL + F or CMD + F
  6. Paste your copied text into the search box with CTRL + V or CMD + V. Your browser will highlight any matching results. You can also copy the HTML from GSC to work with it in your preferred text editor.
Develocraft's homepage with a highlighted crucial keyword

When checking to see if Google is picking up the most important content on your page, it’s a good idea to pick keywords you want to rank for on SERPs.
Google Search Console showing a URL that's available to Google and the highlighted keywords visible for robots

Instead of scanning the HTML by eyeball, use a simple CTRL + F or CMD + F search to find out if your JavaScript really is properly manipulating your HTML.

Your next steps

If you discover that you have problems such as these or other technical SEO issues, you can schedule a fifteen-minute call via the Onely homepage. It only takes a moment but can be remarkably valuable to your commercial success.

If you’re interested in having a digital product such as a mobile or web app developed, at Develocraft we have a strong track record of helping founders get their ideas off the drawing board and onto people’s devices. You can find out more about us on the Develocraft website.

Author Bio

Alex Smithers helps Develocraft create content that startup founders, even non-technical ones, will find useful as they build their business. He currently serves as Develocraft’s Content Marketing Manager.

A huge thank you to Kamil Kozicki and Filip Szyler for helping keep this content accurate. 
