
Tips on how to add pages to the robots.txt file? I think I made a big mistake



So recently I installed the Cleanurls module, which lets you get rid of the ID number in your pages' URLs. However, I didn't think about the effect this would have on Google search: now when I search for specific pages on my website, Google displays the old URL, which leads to a page that says "This page is not available".

 

Because of this, I want to add all of my old URLs that have the ID numbers in them to my robots.txt file, so they will no longer show up in Google search.

 

Is this possible?

 

And if so, what would be the code I would add to block each specific page from Google?
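For reference, from what I've read, robots.txt rules are just path prefixes listed under a `Disallow:` directive, one per line. A minimal sketch of what I'm imagining (the paths here are hypothetical placeholders, not my real URLs):

```txt
# robots.txt — block crawling of the old ID-based URLs
User-agent: *
Disallow: /1024-example-page/
Disallow: /1025-another-page/
# Google's crawler also supports * wildcards, e.g. to match
# any URL whose query string carries an id parameter:
Disallow: /*?id=
```

Though I gather robots.txt only stops crawling, and already-indexed URLs can linger in results for a while, so 301 redirects from the old URLs to the new ones may be the better fix.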

 

Thanks heaps!

 