The Google Webmaster Guidelines just got a big update.
Following last week’s revamp of the Webmasters page, the new Google Webmaster Guidelines page has been updated for the changing user base. The Guidelines are divided into two sections—General and Quality Guidelines—that offer a simple list of best practices to follow.
Although the Quality Guidelines have been pretty much untouched, there are many updates to the General Guidelines section.
Here’s what’s new and what’s changed.
The New General Google Webmaster Guidelines
This section now features three drop-down menus that go into how Google finds your pages, how Google understands your pages, and how visitors use your pages. Changes are in bold, our emphasis.
Help Google find your pages.
“Ensure that a link from another findable page can reach all pages on the site. The referring link should include either text or, for images, an alt attribute relevant to the target page.”
This part used to talk about static links, but Google now recognizes more kinds of links, such as those in plain text. Also, note the part about image alt attributes. This is part of Google’s big push for website accessibility. And relevancy is always essential.
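As an illustration (the URLs and filenames here are hypothetical), a referring link can carry its relevance either in the anchor text or in the image’s alt attribute:

```html
<!-- A text link whose anchor text describes the target page -->
<a href="/pricing">See our pricing plans</a>

<!-- An image link; the alt attribute tells Google (and screen readers)
     what the target page is about -->
<a href="/pricing"><img src="/img/pricing-badge.png" alt="Pricing plans"></a>
```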
“Provide a sitemap file with links pointing to your site’s important pages. Also, provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page).”
A sitemap file helps Google’s spiders understand how your site is organized. The updated Guidelines also ask for a second, human-readable site map page so your visitors can find your important pages, too.
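A minimal XML sitemap following the sitemaps.org protocol might look like this (example.com is a placeholder domain):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```

The human-readable counterpart is just an ordinary page of links to the same important URLs.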
“Limit the number of links on a page to a reasonable number (a few thousand at most).”
A couple of years ago, Google removed the 100 links-per-page guideline in favor of the vaguer “reasonable number.” Now that has been clarified to “a few thousand at most.”
“Use the robots.txt file on your web server to manage your crawling budget by preventing the crawling of infinite spaces such as search result pages. Keep your robots.txt file up to date. Learn how to manage crawling with the robots.txt file. Test the coverage and syntax of your robots.txt file using the robots.txt testing tool.”
Google’s been getting a lot more specific about how you should submit your robots.txt file and what you should put in it. This update offers you a little more direction.
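For instance, a robots.txt sketch that keeps crawlers out of internal search results might look like this (the paths are hypothetical, and wildcard rules like these are understood by Googlebot but not by every crawler):

```text
# Keep crawlers out of infinite spaces like internal search result pages
User-agent: *
Disallow: /search
Disallow: /*?sort=

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```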
Help Google understand your pages.
“Ensure that your <title> elements and alt attributes are descriptive, specific, and accurate.”
Google added “specific” to provide extra guidance on what you should be doing with your alt attributes.
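For example (product and file names invented for illustration), compare a vague title and alt text with a descriptive, specific one:

```html
<!-- Vague -->
<title>Home</title>
<img src="p1.jpg" alt="image">

<!-- Descriptive, specific, and accurate -->
<title>Handmade Leather Wallets | Example Co.</title>
<img src="p1.jpg" alt="Brown bifold leather wallet, open to show card slots">
```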
“Design your site to have a clear conceptual page hierarchy.”
A small wording change here: “design your site” replaces “make your site.” It emphasizes planning your website to be as organized as possible.
“Follow our recommended best practices for images, video, and structured data.”
You’re already used to following Google’s guidelines, and now you’re explicitly told to “follow” these best practices rather than just “read” them, as the old wording said. That’s a much more direct instruction.
“When using a content management system (for example, Wix or WordPress), make sure that it creates pages and links that search engines can crawl.”
This change references Wix and WordPress since they are two popular CMS platforms.
“To help Google fully understand your site’s contents, allow all site assets that would significantly affect page rendering to be crawled: for example, CSS and JavaScript files that affect the understanding of the pages. The Google indexing system renders a web page as the user would see it, including images, CSS, and JavaScript files. To see which page assets Googlebot cannot crawl or to debug directives in your robots.txt file, use the blocked resources report in Search Console and the Fetch as Google and robots.txt Tester tools.”
The only change here is that this guideline now specifies CSS and JavaScript as files that could affect your page being crawled.
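A quick robots.txt sketch of the point (directory names are hypothetical):

```text
# Don't do this - blocking assets keeps Google from rendering pages properly
# Disallow: /css/
# Disallow: /js/

# Assets that affect rendering should stay crawlable
User-agent: Googlebot
Allow: /css/
Allow: /js/
```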
“Allow search bots to crawl your site without session IDs or URL parameters that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may be unable to eliminate URLs that look different but point to the same page.”
This section was expanded to explain the reason behind not using session IDs and URL parameters.
“Make your site’s important content visible by default. Google can crawl HTML content hidden inside navigational elements such as tabs or expanding sections. However, we consider this content less accessible to users and believe you should make your most important information visible in the default page view.”
We bolded the first part for irony: the new Webmaster Guidelines page hides its own content behind drop-down menus. Although Google can crawl content tucked into tabs or expanding sections, your best material should sit out in the open. We’re guessing “Follow these guidelines” is the most important information Google wants visible.
“Make a reasonable effort to ensure that advertisement links on your pages do not affect search engine rankings. For example, use robots.txt or rel="nofollow" to prevent a crawler from following advertisement links.”
Google changed this to a specific solution to avoid “affecting search engine rankings.”
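For example, a nofollowed ad link might look like this (the advertiser URL is a placeholder):

```html
<!-- Keep the ad link from passing ranking signals -->
<a href="https://advertiser.example.com/offer" rel="nofollow">Sponsored offer</a>
```

The robots.txt alternative is to route ad clicks through a redirect directory (say, /ads/) and disallow that directory from crawling.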
Help visitors use your pages.
“Try to use text instead of images to display important names, content, or links. If you must use images for textual content, use the alt attribute to include a few words of descriptive text.”
Here’s the mention of alt attributes again. You already know not to expect Google to pick up text within images, so alt attributes serve a double purpose here—getting text read by Google and adding accessibility.
“Ensure that all links go to live web pages. Use valid HTML.”
The wording changed from “check for broken links and correct HTML.” Google Search Console can help you find broken links, but it’s on you to make sure every link points to a live, accessible page. There is no hidden meaning here!
“Optimize your page loading times. Fast sites make users happy and improve the overall quality of the web (especially for those users with slow Internet connections). Google recommends that you use tools like PageSpeed Insights and Webpagetest.org to test the performance of your page.”
Again, Google offers a specific thing you can do to make your site faster for the user’s benefit.
“Design your site for all device types and sizes, including desktops, tablets, and smartphones. Use the mobile-friendly testing tool to test how well your pages work on mobile devices and get feedback on what must be fixed.”
This change reflects the recent move away from desktops and toward mobile devices.
“If possible, secure your site’s connections with HTTPS. Encrypting interactions between the user and your website is a good practice for communication on the web.”
This is a brand new one, and it’s in line with Google’s desire to keep its users safe while they search. That it’s in the official Guidelines says quite a bit.
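If your site runs on Apache with mod_rewrite enabled (an assumption; other servers have their own equivalents), a common way to push all traffic to HTTPS is a 301 redirect in .htaccess:

```apache
# Permanently redirect all HTTP requests to their HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```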
“Ensure that your pages are useful for readers with visual impairments, for example, by testing usability with a screen reader.”
Here’s another new one that stresses making your site accessible.
How Should You Follow the New Guidelines?
Let’s be honest here: the Guidelines aren’t merely suggestions. Google even removed the softening language that used to say, “Even if you choose not to implement any of these suggestions,” leaving only “We strongly encourage you to pay very close attention to the Google Webmaster Quality Guidelines.” In other words, you should follow the Webmaster Guidelines to the letter.
Some guidelines were removed altogether. The one warning that not all search engines can crawl dynamic pages is gone (dynamic pages are pretty standard now). The Quality Guidelines themselves were untouched, save for added emphasis on “Avoid the following techniques.”
All these updates point to the best practices you should follow for both Google and your users.
Read an excellent breakdown of the changes at The SEM Post, or check out the Webmaster Guidelines here.
About the Author
Dalton Grant, a seasoned veteran with over 15 years in the Internet Marketing industry, brings extensive experience and dynamic leadership to SEO Inc. He initially served in various roles within the company, displaying his multifaceted talent and versatile expertise. Today, he holds several critical roles at SEO Inc., including Senior SEO Analyst, Linking Director, Brand Specialist, and Google Link Penalty Guru.
Mr. Grant boasts comprehensive knowledge of our AI SEO and content platform, having been instrumental in its integration since its client roll-out in 2022. His constant drive for growth and innovation propels the company forward, and he consistently brings fresh, groundbreaking ideas to the table.