Remove Outdated Website Pages From Google Search


Learn how to block URLs from Google Search


Ever worried about outdated pages of your website showing up in Google search?


Want to remove outdated pages of your website from Google search? Don't worry, there are a few ways you can remove them from Google's search engine results pages.


1. Using Google's tool for removing outdated URLs:

Simply enter your website's outdated URL in this Google tool and click "REQUEST REMOVAL":
https://www.google.com/webmasters/tools/removals



2. Block Search Indexing with Meta Tags

You can also de-index outdated pages of your website by adding a robots meta tag just before the closing </head> tag.
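For example, the robots meta tag looks like this (it goes in the <head> section of each page you want removed):

```html
<!-- Tells search engine crawlers not to index this page -->
<meta name="robots" content="noindex">
```

Once Googlebot recrawls the page and sees the tag, the page is dropped from the index. Note that the page must not also be blocked in robots.txt, otherwise the crawler never fetches the page and never sees the tag.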



3. Using robots.txt

This is another way to keep outdated URLs out of Google search results. Update your robots.txt file by adding directives like the ones below. For example, to block the "cgi-bin", "tmp", and "admin-login" directories of my website from being crawled, I add the following rules to my robots.txt file. Keep in mind that robots.txt only stops crawling; for pages that are already indexed, combine it with the removal tool above.


User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /admin-login/
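You can check what these rules actually block before uploading them. A quick sketch using Python's standard-library robots.txt parser (the rules and URLs below are the example values from above; no network access is needed):

```python
import urllib.robotparser

# The example robots.txt rules from above, parsed locally as plain lines.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /admin-login/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard "*" group here.
print(parser.can_fetch("Googlebot", "/tmp/old-page.html"))   # False: blocked
print(parser.can_fetch("Googlebot", "/blog/new-post.html"))  # True: allowed
```

This is handy for catching typos in Disallow paths, since a rule that does not match the URL you meant to block silently allows it.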

If you want to remove your entire website from Google search, just add:

User-agent: *
Disallow: /

If you haven't uploaded a robots.txt file to your website yet, don't worry, just follow the steps below.


You can find more information from Google here: https://support.google.com/webmasters/answer/6062602?hl=en&ref_topic=4598466


How to create a robots.txt file?

Open Notepad (or any plain-text editor)
Write this code:
User-agent: *
Disallow:





Now save the file as "robots.txt". It must be saved in plain .txt format. Then upload this file to the root of your website, so it is reachable at yoursite.com/robots.txt.
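If you prefer, the same file can be created with a short script instead of Notepad. A minimal Python sketch (the empty Disallow line means all crawlers may access everything, as in the example above):

```python
# Write a minimal robots.txt that allows all crawlers full access.
content = "User-agent: *\nDisallow:\n"

with open("robots.txt", "w") as f:
    f.write(content)

# Show what was written; this file is then uploaded to the site root.
print(open("robots.txt").read())
```

You can then add Disallow rules to this file as your site grows.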
