What does that mean?
For optimal rendering and indexing, make sure that Googlebot has access to the JavaScript, CSS, and image files that your pages use. Disallowing crawling of JavaScript or CSS files in your site’s robots.txt directly harms how well Google’s algorithms render and index your content, and will most likely result in suboptimal rankings.
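As a rough illustration (the paths below are hypothetical), a robots.txt that keeps rendering resources crawlable simply avoids Disallow rules covering script, style, and image directories, and can use Allow to carve out exceptions where a broader block exists:

    User-agent: Googlebot
    # Avoid rules like these, which hide rendering resources from Googlebot:
    # Disallow: /assets/js/
    # Disallow: /assets/css/

    # If a parent directory must stay blocked, explicitly allow the assets inside it:
    Disallow: /private/
    Allow: /private/assets/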
Advice for optimal indexing
Google’s indexing systems used to rely heavily on the text-only cached version of a page (Lynx-style). Now that indexing is based on page rendering, that is no longer the case: the new system behaves much more like a modern web browser. Accordingly, Google recommends the following tips:
1. Ensure that your web design follows the principles of progressive enhancement. This helps Google’s systems see usable content and basic functionality even when certain web design features are not yet supported (see the first sketch after this list).
2. Work hard to make your pages render quickly. This is beneficial for both user experience and efficient indexing. You should aim to:
– Get rid of unnecessary downloads
– Merge your separate CSS and JavaScript files and configure your server to serve them compressed (see the server configuration sketch after this list)
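To illustrate the progressive-enhancement point above, here is a rough sketch (the element names and the specific enhancement are made up for illustration): the core content lives in plain HTML that any crawler or browser can read, and JavaScript only adds behaviour on top of it.

    <!-- Core content is ordinary HTML, readable even if scripts never run -->
    <article id="product">
      <h1>Product name</h1>
      <p>The full description is present in the markup itself.</p>
    </article>

    <!-- JavaScript only enhances what is already there -->
    <script>
      var article = document.getElementById('product');
      if (article) {
        // e.g. hook up collapsible sections, lazy-loaded galleries, etc.
        article.classList.add('enhanced');
      }
    </script>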
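For the compression advice in the second point, the exact setup depends on your web server. As one hedged example, an nginx configuration along these lines serves stylesheets and scripts gzip-compressed (Apache offers the equivalent via mod_deflate):

    # Sketch of an nginx snippet enabling gzip for stylesheets and scripts
    gzip on;
    gzip_types text/css application/javascript;
    gzip_min_length 1024;   # skip very small files where compression gains little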
You’ll also have to check that your server can handle the additional load of serving JavaScript and CSS files to Googlebot.

Google has also updated the “Fetch as Google” feature in Webmaster Tools with a “Fetch and Render” option, allowing you to see how its systems render the page. As a result, you’ll be able to identify a number of indexing issues, such as improper robots.txt restrictions and redirects that Googlebot cannot follow.
For more information, visit the Google Webmaster Guidelines directly.