From Zero to Hero – Why is my Site Invisible in Google?

The Best Trick of All

Let me present to you my greatest trick, the best of them all, my grand finale! I call it THE VANISHING WEBSITE!

Okay, let’s be serious for a moment. Have you ever wondered what might make your site totally invisible in search results? Of course you have! You obviously want valuable traffic on your website and are worried about all the ways you can lose it.

I’ll show you a few things to look at that might explain why your site doesn’t appear in search results, or performs really, really poorly when it does.

Let’s get down to business!

Rendering Problems

It’s obvious that you want your site to be as modern as it can be. To do that, you’re willing to use the latest technologies, crazy JavaScript frameworks, and other cool stuff. It sounds great, but it might also be a trap.

Someone once said, “With using great frameworks comes great responsibility.” Or something like that.

You cannot forget about Googlebot when you use frameworks like Angular, Vue.js, React, and many more. Googlebot is not as trendy with technology as you and your site might be.

The main reason is that Googlebot uses a 3-year-old browser – Chrome 41 – to render all websites, yours included.

If your site is JavaScript-heavy, Googlebot might have some problems rendering it properly.

So how can you check whether Googlebot renders your site properly? There are a few things you can try:

  • Chrome 41 – as I said earlier, Chrome 41 is the version of the Chrome browser that Google uses to render websites. Because it’s 3 years old (as of this writing), it obviously isn’t 100% compatible with all of the cutting-edge JS technologies (for example, it doesn’t fully support ES6). You can download Chrome 41 and check for yourself whether your site loads properly in this browser. For more information on Chrome 41, check our R&D Specialist Tomek Rudzki’s great article on this topic.
  • Fetch and Render/Fetch as Google – thanks to this tool you can also check how your site looks when it’s rendered by Googlebot.
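To illustrate the Chrome 41 point above, here is a minimal sketch (the function names are made up for illustration) of an ES6 snippet that a browser without ES6 support may fail to even parse, next to an ES5 equivalent that older browsers can handle:

```javascript
// ES6 version – uses const, an arrow function, and a template literal.
// A browser without ES6 support can throw a SyntaxError on code like this,
// and the whole script (and anything it was supposed to render) fails.
const buildLink = (slug) => `<a href="/products/${slug}">${slug}</a>`;

// ES5 equivalent – the same logic written so older browsers can parse it.
// Transpilers such as Babel produce code in this style automatically.
function buildLinkEs5(slug) {
  return '<a href="/products/' + slug + '">' + slug + '</a>';
}

console.log(buildLink('red-shoes'));
console.log(buildLinkEs5('red-shoes'));
```

This is why transpiling your bundle to ES5 (or testing in Chrome 41 itself) matters: one unparseable file can take your whole rendered page down with it.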

Fetch as Google offers a comparison between the version of your site that users get and the version that Googlebot receives.

More information on Fetch as Google can also be found in Tomek’s previously mentioned article. This tool and Chrome 41 should be the first things you grab when you want to check your site’s rendering.

  • Mobile Friendly Test and Rich Results Test – both of these tools provide you with the DOM (Document Object Model) that Googlebot receives when it visits your site. The DOM is like the next evolution of your web page’s source code.

There is a big difference between the source code and the DOM. To build the DOM, the browser parses the HTML and executes the JavaScript; the result of that process is the DOM. You could say the source code is the list of ingredients, and the DOM is the baked cake.
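To make the ingredients-vs-cake analogy concrete, here is a minimal, hypothetical HTML page. In the source code the menu is empty; only after the script runs does the DOM contain the links, so anything that reads only the raw source sees no links at all:

```html
<!-- Source code: the <ul> is empty – no links exist here yet. -->
<ul id="menu"></ul>
<script>
  // After this script executes, the DOM contains three links
  // that never appear anywhere in the raw source code.
  ['shoes', 'shirts', 'hats'].forEach(function (slug) {
    var li = document.createElement('li');
    li.innerHTML = '<a href="/' + slug + '/">' + slug + '</a>';
    document.getElementById('menu').appendChild(li);
  });
</script>
```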

With the Mobile Friendly Test and the Rich Results Test you can, for example, check whether Googlebot can see all of your internal links. If you want your products to be crawled by Googlebot, these tools will show you whether the links to them are actually visible to it.

Thanks to these tools, you can see if Googlebot is able to properly execute the JavaScript that is implemented on your website.

If your site is JavaScript-heavy and you actually find some rendering problems, you can always try to fix them using server-side rendering, or prerender your site using an external prerendering service.

Thanks to prerendering, all of the complex JS files are executed before your site is viewed by Googlebot. Googlebot receives only static HTML and resources that it fully understands, and will be able to see the beauty of your website.
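A common way prerendering setups decide who gets which version is to inspect the User-Agent header: known crawlers get the prerendered static HTML, regular visitors get the normal JS app. This is only a sketch of that idea – the bot list and the branching below are simplified assumptions, not any particular tool’s implementation:

```javascript
// Simplified crawler detection by User-Agent.
// The pattern below is a small, illustrative subset of real bot lists.
var CRAWLER_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

function shouldServePrerendered(userAgent) {
  return CRAWLER_PATTERN.test(userAgent || '');
}

// In a real server you would branch on this result, e.g.:
// if (shouldServePrerendered(req.headers['user-agent'])) {
//   ...serve the static, prerendered HTML snapshot...
// } else {
//   ...serve the normal JavaScript application...
// }

console.log(shouldServePrerendered(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
)); // true
console.log(shouldServePrerendered(
  'Mozilla/5.0 (Windows NT 10.0) Chrome/70.0 Safari/537.36'
)); // false
```

Note that with server-side rendering (as opposed to this kind of crawler-only prerendering) every visitor receives the rendered HTML, so no user-agent branching is needed.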

JavaScript might make your site invisible, but there is always a way to fix it.

Page Loading Animation

This point is related to rendering problems. A page loading animation is a small thing that can cause massive problems with your site’s visibility. If you’re using a JavaScript loading animation that covers the entire page until it’s 100% loaded, Googlebot might not be able to see any of your pages because of it.

As a perfect example, I can show you the Searchmetrics visibility chart of one of our clients. Their site was optimized properly and contained valuable content, but none of their pages were ranking high in search results. All of this was caused by the JavaScript page loading animation that was covering the whole page until it was 100% loaded.

This loading cover was present only for a fraction of a second and was barely visible to users, but it was too much for Googlebot. Across the whole site, Googlebot saw only a blank page with the loading animation.
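For illustration, a typical loading cover of this kind looks something like the hypothetical fragment below – a full-screen overlay that JavaScript removes on the window load event. A crawler that fails to run (or wait for) that script sees only the overlay, i.e., a blank page with a spinner:

```html
<!-- Full-screen loading cover, removed only after everything has loaded. -->
<div id="loading-cover" style="position: fixed; top: 0; left: 0;
     width: 100%; height: 100%; background: #fff; z-index: 9999;">
  Loading…
</div>
<script>
  window.addEventListener('load', function () {
    var cover = document.getElementById('loading-cover');
    cover.parentNode.removeChild(cover);
  });
</script>
```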

So we deleted the loading cover, and look what happened!

Making sure that Googlebot is able to render your site properly is what you should check first when you’re looking for a reason why your site is invisible in search results.

Black Hat and Penalties

Black hat SEO is bad for your site! Who would have thought?!

There are many different things that are interpreted as black hat SEO, like cloaking, keyword stuffing, backlinks from spammy sites, and so on. Any of these might be the reason why your site is invisible in search results, or might cause parts of your website to be deindexed.

Some black hat activities worked pretty well a couple of years ago, but Google is getting smarter and smarter at distinguishing good SEO from bad SEO. And when Google finds black hat techniques on your site, it can hurt you a lot.

Welcome to Google Penalty World!

As you can see on the chart above, right after the Penguin 4.0 update, the visibility of this site dropped significantly due to the spammy link building that had been done for it. Google penalties are real, and they hurt!

A great example of the results of “not so white hat” SEO can be found in Tomek’s analysis of the Giphy visibility drop.

Overall, learn what black hat SEO is, and don’t try to trick Google by using it because you’ll probably regret it.

Massive Technical Problems

I think this point is pretty self-explanatory. If your site contains a massive amount of technical issues, there is no way it will perform well in search results.

If you index a lot of garbage, auto-generated, or duplicate content, or your valuable content is buried deep due to poor Information Architecture, or you’ve got a load of 404s on your site, and most of the internal linking on your site gives you something like this:

and on top of that, your website needs something like 40 seconds to load every page…

Then it’s no mystery why your site is performing poorly. Google is not a fan of these kinds of websites. Do something about that. Now!

Proper on-page technical optimization might save your butt and let your website grow to its full potential.

Stupid Mistakes

This paragraph wouldn’t be here if I hadn’t seen all of these for myself.

Your site might be invisible in SERPs because, for example, you blocked your whole site in robots.txt by accident. Or maybe you made a mistake with the canonical tags on your website – maybe they’re all pointing to a single URL, for example, your homepage. Or maybe you deployed your new site, but for some reason, the canonical tags point to the password-protected staging version of your site.
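For illustration, here’s what two of those accidents typically look like (the domains are made up). First, the innocent-looking robots.txt pair of lines that blocks the entire site from all crawlers:

```text
User-agent: *
Disallow: /
```

And second, canonical tags gone wrong – every page pointing at the homepage, or at a staging environment nobody outside the company can access:

```html
<!-- Every page claims the homepage is the canonical version: -->
<link rel="canonical" href="https://www.example.com/" />

<!-- Or points at a password-protected staging copy: -->
<link rel="canonical" href="https://staging.example.com/some-page/" />
```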

I know it sounds ridiculous, but it happens. No more words needed.

We’re all people. We all make mistakes.

What Should I Do?

I’ll try to wrap up a couple of things you can do to get your site out of the Invisible Zone. Let’s divide them into two categories: Technical Optimization and Content Optimization.

Technical Optimization

I would divide it into two parts:

  • Fixing – that’s pretty self-explanatory – just fix your website, right? I’d like you to focus on the things that are very important for SEO, which were listed in the Massive Technical Problems section.

    Make sure your site is rendered properly by Googlebot, optimize your site’s loading time, perform a proper link audit for your website, and fix all the smaller technical issues that have a negative influence on your visibility – then you’re good to move on to the second point.


  • Developing – here I want to highlight two major activities: Keyword Research and Information Architecture development. These two depend on each other, and they are crucial for developing the structure of your website.

    Thanks to proper Keyword Research, you can discover a load of valuable key phrases that your site can rank for. And as you learn about new key phrases, you may also get inspiration and ideas for developing your site’s Information Architecture. You can create new categories for your products, new content for your blog, and much more.

    If you want to know more about both KR and IA, you can check our Senior SEO Specialist Kamia Spodymek’s article on Information Architecture and my article on Keyword Research.

Content Optimization

  • Make good, unique content – simply put: create content for your niche and your audience, and make it unique. Avoid duplicating content from other sites like the plague. Content is king – that’s what they say, right?


  • Attack those long tails – this is one thing I observed on one particular site I was working on. It was a brand new site, and most of the clicks it gained were from very specific long tail phrases for which they created articles that ranked in the top 3 in search results. But how was that possible?

    It’s easier to achieve higher rankings in search results for long tail phrases because they’re not as competitive – not that many strong sites try to rank for them. It’s also worth noting that long tail phrases are often said to be responsible for around 80% of searches on the web, so they can bring you REAL TRAFFIC. That’s the way to get your first clicks!


  • Link Building – it’s really important to develop a good link building strategy. There are many unethical, short-term backlinking strategies that might result in ranking penalties.

    That’s why it’s important to have a high-quality, thoughtful link building strategy. To do that, build connections with trusted influencers in your niche who produce thematic content that resonates with your readers. As a result of a white hat link building strategy, your website will earn stronger search engine rankings and, with them, more traffic.


First clicks, first users, AND… POOF! There is your traffic!

By implementing the above steps where appropriate, your site shouldn’t be invisible in search results anymore. So you, your visitors, and of course the one and only Googlebot can appreciate it in all its glory! Enjoy your traffic!