A webmaster faces the task of removing individual pages or an entire site from search engines when content changes, the domain changes, several resources are merged, or for a number of other reasons. There are several ways to solve the problem, depending on the desired result.
Instructions
Step 1
Delete the page you want de-indexed so that the server returns the error HTTP/1.1 404 Not Found when anyone tries to open it. The change takes effect only after the robot revisits the page.
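A minimal sketch of how to verify the 404 response, using only the Python standard library. The page name `deleted_page.html` is a hypothetical example, and a throwaway local server stands in for the real site:

```python
# Sketch: confirm that a removed page now answers with HTTP 404.
# The local test server and the page name are illustrative assumptions.
import http.server
import threading
import urllib.error
import urllib.request

def get_status(url: str) -> int:
    """Return the HTTP status code for url, including error codes."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# Local demo server with no such file: the path yields 404.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

status = get_status(f"http://127.0.0.1:{port}/deleted_page.html")
print(status)  # expect 404
server.shutdown()
```

In practice you would point `get_status` at the real URL of the deleted page; anything other than 404 (or 410 Gone) means the robot will keep the page in the index.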
Step 2
Use the robots.txt file in the site root to exclude selected sections or pages from search engine indexing. To keep the admin panel out of search results, use the directive: User-Agent: * Disallow: /admin/ Or, to exclude a single page from indexing, enter: User-Agent: * Disallow: /selected_page.html The changes take effect after the robot revisits the site.
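The rules above can be checked before deployment with the standard-library robots.txt parser. This is a minimal sketch; the domain `example.com` and the page names are the illustrative ones from the step:

```python
# Sketch: verify robots.txt directives with urllib.robotparser.
from urllib.robotparser import RobotFileParser

rules = """User-Agent: *
Disallow: /admin/
Disallow: /selected_page.html
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed section and page are blocked for all robots ("*"):
print(rp.can_fetch("*", "https://example.com/admin/login.php"))    # False
print(rp.can_fetch("*", "https://example.com/selected_page.html")) # False
# Everything else stays crawlable:
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

Testing the file this way catches typos like a misspelled Disallow directive, which robots would silently ignore.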
Step 3
Use the meta-tags method by adding the robots meta tag to the HTML code of every page you want excluded: <meta name="robots" content="noindex, nofollow">. This keeps unwanted pages out of search results.
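A sketch of where the tag sits in a page; the title and filename are illustrative, and the tag itself goes inside the head element:

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <!-- Tell robots not to index this page and not to follow its links -->
  <meta name="robots" content="noindex, nofollow">
  <title>Page excluded from search</title>
</head>
<body>
  <p>Content that should not appear in search results.</p>
</body>
</html>
```

Unlike robots.txt, the robot must still be allowed to crawl the page to see this tag, so do not combine it with a Disallow rule for the same URL.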
Step 4
Alternatively, use the X-Robots-Tag method to send the same directives in the HTTP header, where they are not visible in the page code: X-Robots-Tag: noindex, nofollow. This method is most useful for excluding selected pages or sections from indexing by foreign search engines.
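On an Apache server the header can be attached in the site configuration or an .htaccess file. A minimal sketch, assuming the hypothetical page name from the earlier steps and that mod_headers is enabled:

```apache
# Send the noindex directive in the HTTP header for one page.
# Requires mod_headers; the filename is an illustrative assumption.
<Files "selected_page.html">
    Header set X-Robots-Tag "noindex, nofollow"
</Files>
```

Because the directive travels in the header rather than the markup, this also works for non-HTML files such as PDFs and images.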
Step 5
Use the special page-removal tool for webmasters in Yandex: https://webmaster.yandex.ru/deluri.xml or in Google: https://www.google.com/webmasters/tools. This lets you request that a page, a section, or the entire site be withheld from the selected search engine.