annette Posted March 27, 2009

I love PrestaShop. Before switching I used CS-Cart; my page rank was terrible and still is. I converted to PS yesterday after much testing on another domain. I resubmitted my sitemap to Google, but I am getting warnings:

Duplicate meta descriptions
Pages with duplicate meta descriptions:
/index.php
/index.php?mode=catalog
/index.php?target=auth&mode=recover_password
/index.php?target=categories&category_id=165
/index.php?target=categories&category_id=166
/index.php?target=checkout&mode=cart
/index.php?target=checkout
/index.php?target=events&mode=access_key
/index.php?target=forms&name=contact_us
/index.php?target=gift_certificates
/index.php?target=orders
/index.php?target=pages&page_id=about
/index.php?target=pages&page_id=privacy_policy
/index.php?target=products&mode=update&product_id={$product_id}
/index.php?target=products&product_id=29786
/index.php?target=products&product_id=29787
/index.php?target=profiles&mode=add
/index.php?target=search
/index.php?target=sitemap
/index.php?target=topics&topic_id=9

These are all from the old script, which has been removed. How do I fix this with Google? I see no place to resubmit. Forgive me if I sound ignorant; SEO has always been difficult for me. Thanks in advance for any assistance.

Annette
Trip Posted March 27, 2009

Hi Annette, I think it just needs more time until Google updates its index and the changes appear in Webmaster Tools. I still have errors from 14.03.2009 in Webmaster Tools that I resolved a week ago. Old pages can sometimes still be found for a couple of months even though they are already gone.

Greetz, Trip
annette Posted March 27, 2009 (Author)

Thanks for your quick reply! I was just reading over at Google Webmaster Tools and noticed that there is a tool to request removal of cached pages. I wonder if that would help?

"Tell us what cached content to remove

You must apply a noarchive meta tag to your pages to successfully remove cached pages from Google search results. These pages will then be removed the next time we crawl the site. If you need to expedite your content removal, make sure you have applied the noarchive meta tag first, before submitting your removal request below.

If you have changed the content of your page, we will update the cached version to reflect this change the next time we crawl the page. However, if you want to remove the older, cached version of the content, you can submit your removal request below. This will remove the description and cached copy of your page from Google search results for a minimum of 6 months. If the page does not use a noarchive meta tag, the content of the page must have changed from the cached version in order for this request to be successful."

Any thoughts? I think I'll go ahead and give it a try.
Trip Posted March 27, 2009

Puuh, I think when you don't know exactly what you are doing, it is better to wait a couple of weeks. Google normally does this job automatically, and if you do it wrong you can get your whole site banned from the index. Just my thoughts; maybe someone else knows better.

Greets, Trip
annette Posted March 27, 2009 (Author)

I don't know if this is going to fix the problem or not, but I added the following meta tag to my header.tpl page:

<meta name="googlebot" content="noarchive">

That tells Google not to keep a cached copy of the indexed pages. Hopefully the duplicate description errors will disappear at that point. Once the errors are gone, I'm going to remove the tag. Otherwise I have serious problems with page rank.
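For anyone wondering where the tag goes, this is roughly how it sits inside the <head> section of header.tpl. The surrounding lines below are only generic examples; the variables and markup in your own theme's file will differ:

<head>
    <title>{$meta_title|escape:'htmlall':'UTF-8'}</title>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    <meta name="description" content="{$meta_description|escape:'htmlall':'UTF-8'}" />
    <!-- tells Googlebot not to store a cached copy of this page -->
    <meta name="googlebot" content="noarchive" />
</head>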
Spark Posted April 1, 2009

Hi there, same here! I am having trouble getting my URLs indexed. I was quite worried when I saw a big fat ZERO indexed, even though I submitted more than 400 URLs last week. Anyone here able to resolve it? I remember there's a module for this (I think Google Sitemap v1.1); I'll have to look through the forum again.
arowana Posted July 8, 2009

I tried all the versions of the Google Sitemap module from 1.0 to 1.4, and all give me a big ZERO indexed. Come on... there must be some [spam-filter] out there who can help!
eqilibrium Posted July 8, 2009

I had a lot of problems with the Google Sitemap module too. Google told me I had duplicate content, etc. Finally, I decided to uninstall the module, generate the sitemap elsewhere, and upload it afterwards. I used http://www.xml-sitemaps.com/, and my pages are now indexed without further problems.
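For anyone who hasn't made one by hand: the file such generators produce is just a plain XML sitemap along these lines (the URLs, dates, and priorities below are made-up placeholders, not from my shop):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want Google to crawl -->
  <url>
    <loc>http://www.example-shop.com/</loc>
    <lastmod>2009-07-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example-shop.com/category.php?id_category=12</loc>
    <lastmod>2009-07-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

You upload it to the shop root and point Webmaster Tools at it.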
arowana Posted July 8, 2009

eqilibrium, thanks for sharing! This is great!