What is the Fastest Way to Deindex Pages on Google?

The fastest way to remove a page from Google’s index is with the URL removal tool in Google Search Console. But the tool isn’t an efficient solution if you’re dealing with a larger site with many URLs. In those cases, Google’s John Mueller recommends keeping the URLs you’d like to deindex in your XML sitemap, with the lastmod date reflecting the change in your meta robots tag values.

  • Change the meta robots tag to noindex.
  • Set the lastmod date to reflect the change.
  • Update your XML sitemap.
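In practice, the steps above look like this (the domain, path, and date are illustrative, not from the thread):

```html
<!-- Step 1: add to the <head> of each page you want deindexed -->
<meta name="robots" content="noindex">
```

```xml
<!-- Steps 2–3: keep the URL in your sitemap, with lastmod set to the
     date the noindex tag was added, so Google is prompted to recrawl -->
<url>
  <loc>https://example.com/discontinued-product</loc>
  <lastmod>2024-05-01</lastmod>
</url>
```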

None other than John Mueller, Webmaster Trends Analyst at Google, showed up to answer the question in the thread below. Here’s what he had to say:

“Having them in a sitemap with a lastmod date reflecting the change in robots meta tag values would be better than removing them from the sitemap file.”

If you want to deindex your pages from Google, you’ve got the answer straight from the mouth of Google.

We recently came across an interesting conversation on the BigSEO subreddit. A user was trying to deindex many product pages on their site, but the pages were still showing in Google’s search results three weeks later. Even Search Console showed no changes after the deindex. Is your site having a similar issue, or are your internet marketing initiatives not delivering results? Check out our award-winning SEO services.

Here’s the catch: the user had deindexed pages for the same products on another of their sites, and those were deindexed immediately.

“Any ideas why Google has ignored my noindex tags?” the user asked.

Another user suggested OP remove the pages from the sitemap and resubmit it. In response to this particular suggestion, a familiar face intervened.

Here’s a screenshot of the thread and Mueller’s response:


Clearing Up Robots.txt Confusion

John Mueller also weighed in on using your robots.txt file and meta robots tags to block pages from being indexed. He explained that if a page is blocked through the robots.txt file, Google won’t see the noindex tag on it. He also added that robots.txt alone won’t remove pages from Google’s index.
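To illustrate Mueller’s point: if robots.txt blocks crawling of a page, Googlebot never fetches the HTML, so it never sees the noindex tag, and the URL can linger in the index. The path below is a hypothetical example:

```text
# robots.txt — this PREVENTS Google from seeing a noindex tag
# on any page under /products/, because those pages can't be crawled
User-agent: *
Disallow: /products/
```

To get such pages deindexed, remove the Disallow rule first so Googlebot can crawl the pages and read the noindex tag; only then will they drop out of the index.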


To recap: the fastest way to remove a page from Google’s index is the URL removal tool in Google Search Console. If you’re dealing with a larger site with a large number of URLs, though, the tool isn’t an efficient solution. In those cases, John Mueller recommends keeping the URLs you’d like to deindex in your XML sitemap, with the lastmod date reflecting the change in your meta robots tag values.
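For sites with many URLs, updating lastmod by hand isn’t practical. Here is a minimal sketch of Mueller’s advice in Python, using only the standard library: it keeps the noindexed URLs in the sitemap and bumps their lastmod to the date the noindex tag was added. The file path and URL list are assumptions for illustration.

```python
# Sketch: bump <lastmod> for URLs that were just set to noindex,
# keeping them in the sitemap so Google is prompted to recrawl them.
import xml.etree.ElementTree as ET
from datetime import date

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # write the sitemap without ns0: prefixes


def bump_lastmod(sitemap_path, noindexed_urls, today=None):
    """Set <lastmod> to today's date for each noindexed URL kept in the sitemap."""
    today = today or date.today().isoformat()
    tree = ET.parse(sitemap_path)
    for url in tree.getroot().findall(f"{{{NS}}}url"):
        loc = url.find(f"{{{NS}}}loc").text.strip()
        if loc in noindexed_urls:
            lastmod = url.find(f"{{{NS}}}lastmod")
            if lastmod is None:  # add the element if the entry lacked one
                lastmod = ET.SubElement(url, f"{{{NS}}}lastmod")
            lastmod.text = today
    tree.write(sitemap_path, xml_declaration=True, encoding="utf-8")
```

After running this against your sitemap, resubmit it in Search Console as usual; the refreshed lastmod dates signal that the listed pages changed and should be recrawled.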

Ready to Collaborate? Contact Us!
