cocothecat Posted May 14, 2013

I bought the Canonical URL PRO module. I've already contacted the developer, but I'm also looking for ideas and suggestions here.

The module works perfectly; however, it's been pointed out to me that once JS is disabled (which is how Google crawls a site), the canonical URL structure fails. Is there a way around this problem?

For example, if I'm viewing page 2, my canonical URL points to:

<link rel="canonical" href="http://URL/5-category" />

which is perfect. But when I visit the same page with JS turned off, I get:

<link rel="canonical" href="http://URL/5-category?p=2" />

which is useless to me and anyone else who has bought this module. I have seen hard-coding suggestions on this forum, but my question is: does this issue affect them too?
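The symptom suggests the module rewrites the canonical tag with JavaScript, which a non-JS crawler never executes. A server-side fix would print the tag in PHP before the page is served. A minimal sketch, not the module's actual code, assuming pagination travels in a "p" query parameter:

<?php
// Build the canonical URL server-side so crawlers see it without
// executing any JavaScript. The "p" pagination parameter is assumed.
function canonical_url($requestUri, $host)
{
    $parts = parse_url($requestUri);
    $params = array();
    if (isset($parts['query'])) {
        parse_str($parts['query'], $params);
    }
    unset($params['p']); // strip the pagination parameter
    $query = http_build_query($params);
    return 'http://' . $host . $parts['path'] . ($query !== '' ? '?' . $query : '');
}

// Printed in the <head> template on every request:
echo '<link rel="canonical" href="'
    . htmlspecialchars(canonical_url($_SERVER['REQUEST_URI'], $_SERVER['HTTP_HOST']))
    . '" />';

With this, /5-category?p=2 yields a canonical of http://URL/5-category whether or not the visitor runs JavaScript.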
Trip Posted May 14, 2013

Correct me if I am wrong, but I have no idea why a category needs a canonical link. There is no chance that you have two categories with the same name and content. Second, within a product you can set the preferred category where it should appear, so IMO you don't need a canonical tag there.

When you open, for example, www.jing-shop.com/10-thai-fisherman-pants-black-cotton.html on my page, you are automatically redirected to http://www.jing-shop.com/de/aladinhosen-fischerhosen-wickelhosen/10-fischerhose-schwarz.html, ergo you have only one page per product. Every duplicate is redirected with a 301 (http://en.wikipedia.org/wiki/HTTP_301) when you enable "Automatically redirect to the canonical URL" under the SEO settings in PS. You can see this in detail here: http://www.webpagete..._HS5/1/details/ ... the first thing that happens is a 301 to the main product page.

All the best, trip
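To illustrate the 301 pattern Trip describes, here is a minimal PHP sketch (not PrestaShop's actual implementation; in practice the canonical path would be looked up per product):

<?php
// Any non-canonical URL for a product is permanently redirected, so
// search engines consolidate every duplicate onto the one real page.
$requested = $_SERVER['REQUEST_URI'];
$canonical = '/de/aladinhosen-fischerhosen-wickelhosen/10-fischerhose-schwarz.html';

if ($requested !== $canonical) {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.jing-shop.com' . $canonical);
    exit;
}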
cocothecat (Author) Posted May 14, 2013

If you have page 1, page 2, or any filters, for example, this can lead to duplicate content being indexed.
Trip Posted May 14, 2013

In my case each filter generates a special URL with a # tag, so there are no duplicates. The robots.txt prevents Google from indexing suboptimal content:

User-agent: Googlebot
Disallow: /*orderby=
Disallow: /*orderway=
Disallow: /*tag=
Disallow: /*id_currency=
Disallow: /*search_query=
Disallow: /*id_lang=
Disallow: /*back=
Disallow: /*utm_source=
Disallow: /*utm_medium=
Disallow: /*utm_campaign=
Disallow: /*n=

Even if there is duplicate content, Google handles it pretty well. The main purpose of the canonical tag is that you can show search engines which page you want displayed in the search results; if you don't, Google picks a more or less random one. The penalty thing is more or less a fairy tale. Please read this article from Google:

"Let's put this to bed once and for all, folks: There's no such thing as a 'duplicate content penalty.' At least, not in the way most people mean when they say that."

and:

"When we detect duplicate content, such as through variations caused by URL parameters, we group the duplicate URLs into one cluster. We select what we think is the 'best' URL to represent the cluster in search results. We then consolidate properties of the URLs in the cluster, such as link popularity, to the representative URL."

Before playing around with canonical tags, I would look in Webmaster Tools to see whether you actually have problems. I, for example, have submitted about 3,000 URLs and 2,900 are indexed, which looks pretty good to me. If you have about 1,000 pages but 100,000 are indexed, then I would start to worry, and then check whether my configuration is correct, e.g. the robots.txt. I see in Webmaster Tools that 33,000 URLs are blocked by robots.txt, so no danger from that.

Last but not least:

"However, rel=canonical can be a bit tricky because it's not very obvious when there's a misconfiguration."
http://googlewebmastercentral.blogspot.de/2013/04/5-common-mistakes-with-relcanonical.html

so I would proceed with caution.
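On the "#" point: fragments are stripped by the browser before the request is sent, so every filter state resolves to the same crawlable URL on the server side. A small PHP illustration (the filtered URL is a made-up example):

<?php
// The fragment never reaches the server, so Googlebot only ever
// sees the base category URL, not one URL per filter combination.
$filtered = 'http://URL/5-category#/colour-black';
$parts = parse_url($filtered);

echo $parts['path'], "\n";     // "/5-category", the only part requested from the server
echo $parts['fragment'], "\n"; // "/colour-black", handled client-side only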
Trip Posted May 14, 2013

Sorry, here is the link: Demystifying the "duplicate content penalty"
http://googlewebmastercentral.blogspot.de/2008/09/demystifying-duplicate-content-penalty.html
cocothecat (Author) Posted May 14, 2013

I get what you're saying, but when the client hires a third-party company to deal with the SEO and this is what they request, advising only goes so far. When it comes to SEO, I think everyone has their own opinion, and each company has a set of best practices they want to stick to. It does help prevent the same page being indexed twice, though, and robots.txt only goes so far if you ask me.
Trip Posted May 14, 2013

Yeah, no worries about that. What I wanted to point out is that the 1.5.x version is IMO pretty good at avoiding DC. In earlier versions there were significantly more problems, like the indexing of orderby and so on. I just wanted to make clear that in most cases there is no need to panic about duplicate content. I strongly advise checking what Webmaster Tools tells you; there you can analyse very well whether you have issues or not.

I'll say it once again: I see no need for a canonical tag on categories. If this is done wrong, then maybe only the first page of a category is indexed and the following pages are not. That result is not what you want, as the products on the following pages are not indexed, or they get no link juice from the category, which is very important. And this is not my opinion; this is what Google tells us, and it is the only source I would trust when it comes to SEO.

I did not analyse the page, but if the ratio of submitted pages is more or less equal to the indexed pages, I would request a second opinion from another SEO company, because IMO the advice is more or less hot air. It is IMO better to concentrate on on-page SEO, descriptions etc., link building, whatever, but this is my personal opinion.

Another easy way to find out what is indexed in Google is to type

site:www.url_of_shop.com

into the search form. Then you can see whether unwanted content is indexed.

All the best, trip
cocothecat (Author) Posted May 14, 2013

No, I agree with what you're saying; it's just that with this project my hands are tied. There is an external third-party company involved after issues with a previous "SEO person", and there are millions of bad links; the site is basically being rebuilt from the ground up, and to give this agency a chance I'm just doing what they request. I don't want to step on toes or make things harder for the client than they need to be. I've pointed out some things on both sides, but like anything it only goes so far; I can make my point and have it on record.

A few things have been raised which I 100% disagree with. To be honest, the whole SEO thing is annoying as hell. I've been doing this for 14-15 years now and I don't think anything has changed since the '90s: old-school ways are still the best when it comes to this stuff. Meta tags, unique content and on-site techniques are what count, none of this fancy URL trickery and animal-proof-update BS.

** On a side note, I think the problem has been resolved. The module author was prompt to reply, and it seems all I needed to do was disallow the "p" tag. I'm waiting on confirmation from said SEO company that this is now fixed!
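If "disallow the 'p' tag" means blocking the pagination parameter via robots.txt, in the style of Trip's rule list above, the entry would look like the following. This is an assumption: the module may instead expose its own setting for excluding parameters from the canonical URL, so it is worth confirming with the author.

User-agent: Googlebot
Disallow: /*?p=
Disallow: /*&p=

Anchoring on "?"/"&" keeps the pattern from matching parameter names that merely end in "p" (e.g. "group="). Also note that robots.txt stops crawling entirely, which is a different mechanism from canonical-tag consolidation, so the two fixes are not interchangeable.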