Mister Denial Posted September 10, 2012 (edited)

Hey everyone, I just noticed that Google now ignores the `Disallow: /*p=` rule in my robots.txt file and lists a whole bunch of duplicate meta tag errors in Webmaster Tools. Does anyone know why, or is anyone else having the same issue? And what could I do to prevent it? I recently upgraded to 1.4.9 and wonder if that is the cause. I am also using Tomerg's Presto-Changeo module "Duplicate URL Redirect". Help would be much appreciated! Regards, Dan

EDIT: must have been a Google glitch, the robots.txt file works again as it should.

Edited September 21, 2012 by Mister Denial
Mister Denial Posted September 11, 2012 (Author)

I tried adding this line: `Disallow: /*?p=` with the question mark, instead of just `/*p=`, in robots.txt. Does anyone know whether that will work to get rid of the duplicate meta tags?
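For background: the original robots.txt standard only defines prefix matching, but Google documents `*` as matching any sequence of characters and `$` as an end-of-URL anchor, so both `/*p=` and `/*?p=` should block `/category?p=2` for Googlebot. A minimal sketch of that matching behavior (the helper `googlebot_blocks` is hypothetical, written here only to illustrate the rule semantics, not Google's actual implementation):

```python
import re

def googlebot_blocks(pattern: str, path: str) -> bool:
    """Check whether a Googlebot-style Disallow pattern matches a URL path.

    Per Google's documented robots.txt handling: '*' matches any run of
    characters, a trailing '$' anchors the pattern to the end of the URL,
    and otherwise the pattern matches as a prefix.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Translate the Disallow pattern into a regex, escaping everything
    # except '*', which becomes '.*'.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    regex = "^" + regex + ("$" if anchored else "")
    return re.match(regex, path) is not None

print(googlebot_blocks("/*?p=", "/category?p=2"))  # True
print(googlebot_blocks("/*p=", "/category?p=2"))   # True
print(googlebot_blocks("/*p=", "/category"))       # False
```

So by these rules, both forms should exclude paginated URLs; if GWT still reports them, the rules themselves are probably not the problem.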
Mister Denial Posted September 17, 2012 (Author)

Okay, neither `Disallow: /*?p=` nor `Disallow: /*p=` is working; GWT still reports duplicate meta titles and descriptions for all shop pages with more than one page. Does anyone have an idea how to fix this?