I recently created a trivial demo SPA (single page app) with Ember.js. I wanted to see how Google might naturally crawl my site and index it for Google Search results.
My website uses a unique, nonsensical name that overlaps with no other search results, so it was easy to search for and see what was and was not indexed. Initially, Google had no search results for my site’s name.
I created a Google Webmaster entry for my trivial site and used the Fetch as Google tool to Fetch and Render my site. I also clicked Request Indexing so that Google would Crawl this URL and its direct links.
I waited a few days to see how Google had indexed my content.
The home page was indexed, as was the /api page, but some of my content was not searchable and did not seem to be picked up by Google at all. Specifically, I noticed the /about page was not indexed, which seemed odd since it was directly linked from my home page.
robots.txt allows all, so nothing should be blocked for Googlebot. And although I have no Sitemap, which would “inform search engines about URLs on a website that are available for crawling,” my content had a number of links that should have been crawlable by Googlebot.
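For reference, an allow-everything robots.txt like the one described here amounts to just two lines (this is a generic sketch, not the file verbatim from my site):

```
User-agent: *
Allow: /
```

With this in place, no path on the site is off-limits to Googlebot or any other crawler.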
I looked at the result of Google’s Webmaster Crawl Tool to Fetch as Google my / home (index) page URL.
It was at this point I realized that Google’s crawl process was not seeing my nav bar. The navigation was completely missing from Google’s crawl and render of my site.
The result of that crawl request does not show my nav bar even though I can see the nav bar in my modern browser. Google states that “This is how Googlebot saw the page:” and “This is how a visitor to your website would have seen the page:”. See here a screenshot of the result.
That same navigation bar in my single page app renders without issue in production in Safari, Firefox, and Chrome. See here a screenshot of what the home page (/) looks like in Chrome 62.0.3202.94 (Official Build) (64-bit). Note the nav bar at the top.
At this point I was perplexed. Clearly Googlebot was having an issue with an aspect of my site, but I had a hard time understanding why Google would vary so significantly from what I could see in my browser. If I can see the nav bar in my modern browser, why can’t Googlebot? Why is the navigation bar not rendered by Google’s crawl process?
Why is the navigation bar not rendered by Google’s crawl process?
The nav bar had a linear-gradient CSS rule. If that rule is removed, the nav bar renders fine when using the Fetch and Render tool in the Google Webmaster Search Console. It is interesting to note that this one CSS rule seems to prevent Googlebot from rendering the entire navigation DOM element.
So, this does not seem to be an SPA/AJAX-driven site problem, but rather a CSS problem for Googlebot’s rendering process.
Once I removed the linear-gradient and changed the background to a simple color, Googlebot could see my nav bar. Though, I must admit I do not understand why there is a discrepancy between the sections stating “This is how Googlebot saw the page:” and “This is how a visitor to your website would have seen the page:”.
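As a sketch of the change (the selector and color values here are hypothetical, not copied from my actual stylesheet), the fix amounted to something like:

```css
/* Before: the rule that Googlebot's Fetch and Render choked on */
.navbar {
  background: linear-gradient(to bottom, #4a4a4a, #2b2b2b);
}

/* After: a plain solid color, which Fetch and Render handles fine */
.navbar {
  background: #3a3a3a;
}
```

A more defensive pattern is to declare the solid color first and the gradient second in the same rule, so a renderer that drops the gradient still falls back to the solid background, though in my case Googlebot failed to render the whole element rather than just the gradient.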
If I can see the nav bar in my modern browser, why can’t Googlebot?
I initially assumed my nav bar issue arose because I was using an SPA, despite what Google says about crawling an AJAX-powered SPA like the one I am using.
Based on Google’s statement about crawling, I had assumed that if I can see the content with my human eye in my modern browser for my website, then Google’s crawl process via Googlebot should be able to render and analyze that content too, but clearly that is not the case. It has nothing to do with my site being an SPA. Rather, it is because my site was using a CSS rule that did not play well with Googlebot.
Googlebot is based on an older build of Chrome; specifically, a build of version 41. It is possible to install Chrome 41 on a compatible OS. I did so, and the nav bar renders fine on a desktop with Chrome 41 and works as expected. See here the nav bar rendering fine in Chrome 41.
Although my site works fine on Chrome 41, my experience suggests that Googlebot’s web rendering service (WRS) is not the same as Chrome.
Other articles highlight this discrepancy.
From what we noticed, Google Search Console renders CSS a little bit different than Chrome 41. This doesn’t happen often, but as with most tools, we need to double check whenever possible.
Although the immediate fix for this was updating my CSS, my fundamental confusion and interest is why Googlebot does not behave like a modern browser for this CSS rule. The CSS works fine in Chrome 41, so why not in Googlebot?
I do not have a specific answer to that question aside from the fact that Googlebot is not the same as Chrome. It does not render content the same way, and it is clear to me now that modern CSS can cause issues for Googlebot and prevent content from being crawled.
Test your site against Google’s Webmaster Tools early and often.