What is the Fastest Way to Deindex Pages on Google?

The fastest way to remove a page from Google’s index is the URL removal tool in Google Search Console. But if you’re dealing with a larger site with many URLs, the tool isn’t an efficient solution. In those cases, John Mueller recommends keeping the URLs you’d like to deindex in your XML sitemap, with the lastmod date reflecting the change in your meta robots tag values.

  • Change the meta robots tag to noindex
  • Set the lastmod date to reflect the change
  • Keep the URLs in your updated XML sitemap
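Put together, the steps above might look like the following sketch. The URL and dates here are hypothetical placeholders, not values from Mueller’s advice:

```xml
<!-- 1. On the page itself, change the meta robots tag in the HTML head: -->
<!--    <meta name="robots" content="noindex"> -->

<!-- 2. In the XML sitemap, keep the URL listed and update lastmod to the
     date the tag changed, so Google knows to recrawl it: -->
<url>
  <loc>https://www.example.com/discontinued-product</loc>
  <lastmod>2024-05-01</lastmod>
</url>
```

The lastmod date acts as the recrawl signal: it tells Google the page has changed, prompting it to fetch the page again and discover the new noindex tag.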

None other than John Mueller, Webmaster Trends Analyst at Google, showed up to answer the question in the thread below. Here’s what he had to say:

“Having them in a sitemap with a lastmod date reflecting the change in robots meta tag values would be better than removing them from the sitemap file.”

If you want to deindex your pages from Google, you’ve got the answer straight from the mouth of Google.

We recently came across an interesting conversation on the BigSEO subreddit. A user was trying to deindex many product pages on their site, but they were still showing in Google’s search results three weeks later. Even Search Console showed no changes after the attempted deindexing. Is your site having a similar issue, or are your current internet marketing initiatives not getting results? Check out our award-winning SEO services.

Here’s the catch: the user had deindexed pages for the same products on another of their sites, and those were deindexed immediately.

“Any ideas why Google has ignored my noindex tags?” the user asked.

Another user suggested the OP remove the pages from the sitemap and resubmit it. In response to that suggestion, a familiar face intervened.

Here’s a screenshot of the thread and Mueller’s response:

Google's John Mueller weighed in on a Reddit thread on how best to deindex pages.

Clearing Up Robots.txt Confusion

John Mueller also weighed in on using your robots.txt file and meta robots tags to block pages from being indexed. He indicated that Google won’t see a noindex tag on a page that is blocked through the robots.txt file, because a blocked page is never crawled. He also added that robots.txt alone won’t remove pages from Google’s index.
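As a hypothetical illustration of the conflict Mueller describes (the path here is a placeholder):

```
# robots.txt -- blocking crawling like this means Googlebot never fetches
# these pages, so it can never see a noindex tag in their HTML:
User-agent: *
Disallow: /discontinued/

# To deindex these pages, remove the Disallow rule so Googlebot can crawl
# them, and rely on the meta robots noindex tag on each page instead.
```

In other words, robots.txt controls crawling, not indexing: a disallowed URL can still appear in search results if Google discovers it through links, which is why the noindex tag has to be reachable by the crawler to take effect.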

John Mueller from Google talks robots.txt file and deindexing pages.

Ready to Collaborate? Contact Us!
