Top Insights from “The Limits of Search Engines & How We Can Overcome Them”

On November 30th, 2021, Bartosz Góralewicz and Tomasz Rudzki sat down with Fabrice Canel, Principal Program Manager at Bing, to discuss a core challenge search engines face today: how can they keep crawling and indexing a web that is constantly growing bigger and heavier?

If you want to know how the whole conversation went, you can watch the webinar here. 

In this article, I want to present the most exciting insights and announcements from the webinar.

IndexNow helps optimize the crawl budget

The conversation started with a popular topic in the SEO industry right now – what problem is IndexNow trying to solve?

Search engines have little insight into what's happening on the internet at any given moment, so they need to keep recrawling pages to discover new or updated content. As a result, they might waste crawl budget on low-quality pages or pages that haven't changed.

We believe it’s good for the industry to change this model. It will help you to get the latest content indexed in search engines, and you can help us to minimize crawling, which is good for the whole industry.
source: Fabrice Canel

Search engines don’t know when you’re going to post your next blog article – but you do! Notifying them about new content on your site brings mutual benefits – your content can appear quicker in SERP, and search engines can reduce the amount of resources they need to crawl and index a website.

The idea of pinging search engines is not new. A few years ago, webmasters could already submit URLs to search engines directly. The problem was that it was easy to spam search engines with unnecessary submissions, so the mechanism needed improvement.

Compared to the old solution with a simple ping sent to the search engines, IndexNow adds a layer of trust. You need to generate an API key and host it on your website. This way, search engines can verify that the submitted URL belongs to you, minimizing the noise in the resulting data. 
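To make this concrete, here is a minimal sketch of that trust layer, following the public IndexNow protocol: you generate a hex key, host its value in a text file at `https://{your-host}/{key}.txt`, and POST your changed URLs to the shared `api.indexnow.org` endpoint. The `example.com` host and the URLs below are hypothetical placeholders.

```python
import json
import secrets
import urllib.request

# Hypothetical site -- replace with your own domain.
HOST = "example.com"

def generate_key() -> str:
    """Generate a hex API key (the IndexNow spec allows 8-128 hex characters).
    Its value must be hosted in a text file at https://{HOST}/{key}.txt so
    search engines can verify that submitted URLs belong to you."""
    return secrets.token_hex(16)  # 32 hex characters

def build_submission(host: str, key: str, urls: list[str]) -> urllib.request.Request:
    """Build a bulk-submission POST request for the shared IndexNow endpoint."""
    payload = {
        "host": host,
        "key": key,
        "urlList": urls,
    }
    return urllib.request.Request(
        "https://api.indexnow.org/indexnow",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )

key = generate_key()
req = build_submission(HOST, key, [f"https://{HOST}/new-article"])
# urllib.request.urlopen(req) would actually send it; per the spec, a
# 200/202 response means the submission was received (202 while key
# verification is still pending).
print(req.full_url)
```

Submitting through the shared endpoint notifies all participating search engines at once, which is the point of the protocol: one ping, many engines.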

Sitemaps are not enough

If you're wondering why well-optimized sitemaps aren't enough to support crawling and why we still need IndexNow, the answer is simple: sitemaps alone can't keep up.

In general, Bing will try to download sitemaps more or less once a day. […] But the internet is more than once a day. You don’t want to wait one day when you have a press release coming. Maybe you need to notify the search engines immediately.
source: Fabrice Canel

So IndexNow helps you push time-sensitive content to search engines. This matters most for news websites, but any website benefits from getting new content indexed as quickly as possible.

Additionally, sitemaps are great for listing URLs but not for signaling content updates. The lastmod tag is often abused or set incorrectly.
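The most common lastmod abuse is stamping every URL with the current time on each sitemap rebuild, which tells crawlers nothing. A more honest approach is to derive lastmod from when the content actually changed. Below is a sketch that uses a file's real modification time; the URL-to-file mapping is a hypothetical stand-in for however your CMS tracks content changes.

```python
from datetime import datetime, timezone
from pathlib import Path
from xml.sax.saxutils import escape

def sitemap_entries(pages: dict[str, Path]) -> str:
    """Emit <url> entries whose <lastmod> reflects each backing file's real
    modification time (in W3C datetime format), instead of stamping every
    URL with 'now' on every rebuild."""
    entries = []
    for url, path in pages.items():
        # Real change date, not the sitemap generation date.
        mtime = datetime.fromtimestamp(path.stat().st_mtime, tz=timezone.utc)
        entries.append(
            f"<url><loc>{escape(url)}</loc>"
            f"<lastmod>{mtime.strftime('%Y-%m-%dT%H:%M:%S+00:00')}</lastmod></url>"
        )
    return "\n".join(entries)
```

A crawler that trusts your lastmod values can skip unchanged pages entirely, which is exactly the crawl-budget saving this whole discussion is about.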

IndexNow is less noisy than sitemaps. Fabrice stated that the data coming from IndexNow is very promising: it has proven to guide bots to high-quality content while minimizing crawling.

One thing to remember: if you submit a URL even though it hasn't changed, Bing might lose trust in your signals and stop crawling the pages you submit via IndexNow.

All this doesn't mean you should stop using XML sitemaps. Fabrice pointed out that sitemaps work particularly well when paired with IndexNow – it's not a matter of choosing one or the other.

Cloudflare is now supporting IndexNow

During the webinar, Fabrice announced that Cloudflare is now supporting IndexNow.

Cloudflare has a feature called Crawler Hints, which provides data for search engines about changes in your content. And now, Crawler Hints supports IndexNow and can directly notify search engines about any new, updated, or deleted content via this new protocol.

More than 60,000 websites have already turned on the ability in Cloudflare to support IndexNow. It means billions of URLs per day that are coming to tell us what changed on the internet. […] It brings benefits for Bing and other search engines, as well as website owners.
source: Fabrice Canel

All you need to do is enable Crawler Hints on your website with one simple click, and Cloudflare will take care of the rest! You can find more information on how to turn on Crawler Hints here.

IndexNow doesn’t give a ranking boost

Bartosz asked Fabrice about the observation that many domains that started using IndexNow now seem to get more traffic from Bing. But no, IndexNow doesn't give you any ranking boost.

We are not prioritizing based on IndexNow. It’s not a ranking signal. Enabling IndexNow won’t help you rank higher.
source: Fabrice Canel

However, with the amount of Internet content that’s being continuously created, the sooner you get indexed, the more chances you have to appear higher in SERPs. 

It’s especially important for sites with news sections. It’s crucial to be indexed as soon as possible. If search engines are late with crawling and indexing your content, you might fall behind your competition.

That’s why websites that use IndexNow might do better in SERPs. Webmasters know best when their content has changed and can notify search engines to crawl and index the page quickly. 

IndexNow frees up resources needed for rendering

As Fabrice mentioned in the webinar, rendering is vital for search engines to understand the content. That’s why Bing strives to render as much content as possible. 

By adopting IndexNow, you can help search engines free up the resources for crawling additional content, like JavaScript, necessary for rendering the page. 

Additionally, how well search engines render a website depends on its size. They might do fine if you have only a few pages. But if you run a big eCommerce website with millions of JavaScript-rich pages, their resources might not be enough to render everything. In that case, it's crucial to tell search engine bots what they should crawl and help them save crawl budget.

In a few years, we want to be able to render all content on the internet. However, be reasonable with creating your sites – if customers need to click on something on a page to access some content, search engines will need to do the same, but they are not smarter than customers yet. Ensure all content on a page is displayed immediately and don't hide it.
source: Fabrice Canel

Future of search engines collaboration

In the past, search engines collaborated on essential initiatives like the robots.txt protocol.

During the webinar, Fabrice was asked about potential areas where such cooperation may occur in the near future. He pointed to the issue of hreflang tags and how they are responsible for trillions of redundant URLs that create waste for search engines and the internet overall.

Fabrice said he would like to work with other search engines on new ways to target international markets – ways that would also allow search engines to consolidate signals to a single page per language. This would require a solution letting website owners direct users of the same language to market-specific content (e.g., Spanish speakers in Argentina and Uruguay would land on the same page and then be directed to content specific to each country).

Fabrice summed up that there should be no need to optimize your website for specific search engines in the future. Instead, search engines should follow common protocols and create solutions to improve the current limitations and benefit the industry.