
How do I even begin fixing Google Crawl Errors?


noesac


If the pages are in fact 404s, you can request removal of the URLs yourself through Google Webmaster Tools, or wait for Google to eventually catch up. If it is a page you don't want Google viewing, edit your robots.txt file (I restrict all pages with a "?" in them) and then ask Google to remove the page. You cannot control outside links coming to your site or pages that may have been moved. The best thing you can do is set up a 301 redirect, or just let Google catch up.
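For moved pages, a 301 (permanent) redirect can be added to an Apache .htaccess file, for example. This is just a sketch, assuming Apache with mod_alias enabled, and the paths here are made up for illustration:

```apache
# Permanently redirect a moved page to its new location,
# so Google transfers the old URL's ranking instead of reporting a 404.
# (Hypothetical paths; requires Apache with mod_alias enabled.)
Redirect 301 /old-category/old-product.html /new-category/new-product.html
```

Once Google recrawls the old URL and sees the 301, the crawl error should drop out of Webmaster Tools on its own.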



Hi,

How do I ask Google to delete the URLs? I have a problem with crawl errors. My whole website is like one big crawl error. I don't understand why :(

My website is http://www.forever.lt/aloe-vera/

Thank you for any help.


If you have language packs installed other than the one you are using, the simplest thing to do is delete them from the back office. This will stop the 404 errors for products that aren't there.

Make sure you add this to your robots.txt file:

Disallow: /*?*

This will get rid of the nasty URLs (make sure you have enabled friendly URLs in PrestaShop).
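Putting it together, a minimal robots.txt with that rule might look like this (the `User-agent: *` line applies the rule to all crawlers; note that wildcard patterns like `*?*` are understood by Googlebot but are not part of the original robots.txt standard, so other crawlers may ignore them):

```
User-agent: *
Disallow: /*?*
```

This blocks any URL containing a query string (a "?"), which is where PrestaShop's non-friendly URLs come from.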

You can request that Google remove URLs that return 404 errors, or that are blocked by your robots.txt rules, using Google Webmaster Tools: select your site, then click Site Configuration > Crawler Access > Remove URL. You will have to type them in or copy/paste and submit them one by one.
