Googlebot remains one of the most important crawlers for anyone working in search engine optimisation (SEO) and web development. To keep crawling and indexing running smoothly and accurately, Googlebot imposes a 15MB limit on subresources. What are the implications of this limit for SEO and web development, and what solutions can you adopt to avoid indexing and crawling issues?
The 15MB Limit on Subresources
Google's John Mueller recently confirmed that Googlebot stops processing additional subresources once it encounters more than 15MB of data. This confirmation came after webmasters noticed that their websites were not being indexed properly despite no apparent issues with their pages. Subresources include images, JavaScript, CSS, and other files required to render a webpage fully. While the 15MB limit may seem generous, some websites, particularly those with large media files, may exceed this threshold, causing Googlebot to skip the remaining subresources.
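As a quick sanity check, the sketch below compares the reported size of each subresource against a 15MB threshold. It assumes Node 18+ (for the built-in fetch) and uses a hypothetical, hand-maintained list of subresource URLs; a real audit would extract these from the rendered HTML.

```typescript
// Minimal sketch: flag subresources whose reported size exceeds 15MB.
const LIMIT_BYTES = 15 * 1024 * 1024; // 15MB

// Hypothetical list of subresource URLs extracted from a page.
const subresources = [
  "https://example.com/styles/main.css",
  "https://example.com/scripts/app.js",
  "https://example.com/images/hero.jpg",
];

async function auditSubresources(urls: string[]): Promise<void> {
  for (const url of urls) {
    // A HEAD request avoids downloading the full file; some servers omit
    // Content-Length, in which case the size is reported as unknown.
    const response = await fetch(url, { method: "HEAD" });
    const length = Number(response.headers.get("content-length") ?? NaN);

    if (Number.isNaN(length)) {
      console.log(`${url}: size unknown (no Content-Length header)`);
    } else if (length > LIMIT_BYTES) {
      console.log(`${url}: ${length} bytes -- exceeds the 15MB limit`);
    } else {
      console.log(`${url}: ${length} bytes -- OK`);
    }
  }
}

auditSubresources(subresources).catch(console.error);
```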
Impact on SEO
Exceeding the 15MB limit can significantly harm a website's SEO performance. If Googlebot is unable to crawl all the subresources on a page, it may not index the content accurately, leading to a decline in search engine rankings. This creates a ripple effect on the website's overall visibility, impacting organic traffic and conversions. Additionally, Google's algorithm may be unable to assess the content's quality and relevance effectively, resulting in decreased visibility in search results. This makes it even more challenging for a website to maintain or improve its search engine ranking position in highly competitive niches.
Impact on Web Development
Web developers must now pay close attention to the size of their subresources to ensure optimal indexing and crawling. Websites that exceed the 15MB limit may need to be redesigned or optimised to fit within Googlebot's constraints, which requires a thorough understanding of current web development and optimisation techniques. This may involve compressing or resizing images, minifying JavaScript and CSS files, or removing content that does not contribute to the user experience. In some cases, web developers may need to reconsider their design approach or prioritise certain features while balancing aesthetics, functionality, and search engine optimisation. Adhering to the 15MB limit ensures better compatibility with Googlebot, improving search rankings and delivering a more seamless user experience.
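To make the image-compression step concrete, here is a minimal sketch assuming the sharp library is installed in a Node project; the file names, target width, and quality setting are illustrative only, not values recommended by Google.

```typescript
import sharp from "sharp"; // assumes the sharp image library is installed

// Resize and re-encode a hypothetical hero image as WebP. Capping the width
// and re-encoding in a modern format typically cuts the file size sharply.
async function optimiseImage(input: string, output: string): Promise<void> {
  const info = await sharp(input)
    .resize({ width: 1600, withoutEnlargement: true }) // cap display width
    .webp({ quality: 75 }) // WebP generally compresses better than JPEG/PNG
    .toFile(output);

  console.log(`Wrote ${output}: ${info.size} bytes`);
}

optimiseImage("hero.jpg", "hero.webp").catch(console.error);
```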
Solutions for Web Developers
To avoid the negative impact of Googlebot's 15MB limit, web developers should consider the following strategies:
- Image optimisation: Compressing images and using appropriate file formats can drastically reduce the size of image files without sacrificing quality.
- Minification: Minifying JavaScript and CSS files removes unnecessary whitespace, comments, and characters, thus reducing their size.
- Content Delivery Network (CDN): Utilising a CDN can distribute the load of serving subresources, leading to faster page loading times and, through features such as automatic compression, potentially reducing the total subresource size.
- Lazy loading: Implementing lazy loading techniques can defer the loading of subresources until they are needed, allowing Googlebot to crawl more efficiently.
- Code splitting: Breaking up large JavaScript bundles into smaller chunks can help reduce the size of individual subresources and improve page loading times (a combined sketch of lazy loading and code splitting follows this list).
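The browser-side sketch below combines the last two techniques: images are loaded only as they scroll into view, and a heavy module is imported on demand rather than shipped in the main bundle. The img[data-src] convention, the #charts-section element, and the ./charts module with its renderCharts function are hypothetical, and the dynamic import() only produces a separate chunk when built with a bundler such as webpack or Vite.

```typescript
// Lazy-load any <img data-src="..."> element as it approaches the viewport.
const imageObserver = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src ?? ""; // swap in the real source
    observer.unobserve(img);
  }
});

document.querySelectorAll<HTMLImageElement>("img[data-src]").forEach((img) => {
  imageObserver.observe(img);
});

// Code splitting: load the charting code only when its section becomes visible.
const chartsSection = document.querySelector("#charts-section");
if (chartsSection) {
  const moduleObserver = new IntersectionObserver(async (entries, observer) => {
    if (entries.some((entry) => entry.isIntersecting)) {
      observer.disconnect();
      // The bundler emits a separate chunk for this dynamic import.
      const { renderCharts } = await import("./charts");
      renderCharts(chartsSection);
    }
  });
  moduleObserver.observe(chartsSection);
}
```

Deferring both images and non-critical scripts in this way keeps the initial set of subresources small, which is exactly the behaviour the list above recommends.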
The 15MB limit on subresources set by Googlebot is a reminder to web developers to build efficient, optimised web pages. By implementing best practices such as image optimisation, minification, and lazy loading, developers can ensure that their websites are properly indexed and ranked by search engines. In turn, this leads to improved SEO performance and greater visibility in search results.