Vitazzz, posted October 10, 2015 (edited):

I've added this rule to my website's robots.txt:

Disallow: /*p=

Since then Google has crawled my pages several times, but it still shows these category pages and marks them as duplicate content. The robots.txt Tester, though, shows that the block is set up properly. There are also pages where, instead of the meta description I entered, it uses the HTML description text and flags it as a duplicate meta description. I'm so sick of this...
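For context, the rule above would normally sit under a user-agent line in robots.txt (the `User-agent: *` line is assumed here, not quoted from the original post):

```
# Apply to all crawlers (assumed; the post only quotes the Disallow line)
User-agent: *
# Block any URL whose path contains "p=", e.g. /category?p=2
Disallow: /*p=
```

Note that a robots.txt Disallow only stops crawling; URLs Google already knows about can still appear in the index, which is consistent with the behavior described above.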
innovacy, posted October 15, 2015:

It would help to have the URL of the site, to be able to check the exact details; otherwise it's just suggestions in the blind.
Dh42, posted October 16, 2015:

You really need to do a custom modification and point the canonical of the paginated pages back to the category landing page.
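The modification Dh42 describes would emit something like the following in the `<head>` of each paginated category page (the URLs are illustrative, not taken from the thread, and the exact template file to edit depends on the shop's theme):

```html
<!-- On a paginated URL such as https://example.com/12-category?p=2,
     point the canonical back to the category landing page -->
<link rel="canonical" href="https://example.com/12-category" />
```

With this in place, Google should consolidate the paginated duplicates onto the landing page instead of reporting them as duplicate content, and the robots.txt rule blocking `p=` URLs becomes unnecessary (the canonical tag can only be seen if the page is crawlable).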