Bas_1981 Posted November 16, 2016
Hello, when I check my domain on Google with site:[mydomain].com, I see some pages that I don't use (like "stores"). Should I simply delete them from the server, or is it better to change my robots.txt and add "disallow" for these pages?
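For reference, the rule I had in mind would look something like this (assuming the unused page lives at /stores; the path is just an example):

User-agent: *
# Ask crawlers not to fetch the unused page
Disallow: /stores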
Johann Posted November 18, 2016
Changing the robots.txt file won't remove those pages from Google's index. It will just ask Google not to crawl them anymore, not to forget them. Simply deleting the files from the server won't change the situation either. You have to insert this tag: <meta name="robots" content="noindex"> to let Google remove a page from its index the next time it crawls the page.
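The tag goes inside the <head> of each page you want removed. A minimal sketch (the title and body content here are placeholders):

<!DOCTYPE html>
<html>
<head>
  <!-- Tells crawlers to drop this page from the index on the next crawl -->
  <meta name="robots" content="noindex">
  <title>Stores</title>
</head>
<body>
  ...
</body>
</html>

Note that Google has to be able to crawl the page to see the tag, so don't block the same page in robots.txt at the same time.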
Bas_1981 Posted December 5, 2016
Okay, thank you for your answer. I'll add that tag! Sorry for my late reaction; the notification email was delivered to my spam folder.