Search the Community
Showing results for tags 'robot.txt'.
-
Are you struggling to get your PrestaShop website ranked higher in search engine results pages (SERPs)? Are you spending a lot of time and effort on SEO but not seeing the results you want? If so, you're not alone. SEO can be complex and time-consuming, especially for busy business owners, but ranking well in SERPs is essential if you want to attract more visitors and customers. That's where SEO Audit comes in.

SEO Audit is a comprehensive PrestaShop SEO module that can help you improve your website's SEO and traffic. With SEO Audit, you can:
- Analyze your website's SEO performance and identify areas that need improvement
- Generate engaging, optimized content for your website
- Optimize your website's URLs
- Create a sitemap and submit it to search engines
- Customize your robots.txt file
- Optimize your website for social media sharing
- And more!

SEO Audit is easy to use and requires no coding knowledge. So if you're looking for a way to improve your website's SEO and traffic, SEO Audit is the solution for you.

The SEO Audit module is now available on the PrestaShop Addons Marketplace.
Download now: https://addons.prestashop.com/en/seo-natural-search-engine-optimization/28279-seo-audit-best-seo-practices-2023-advanced-seo.html
Compatibility: PrestaShop 8 and PrestaShop 1.7
Demo: Front office | Back office

HOW CAN THE "SEO AUDIT" MODULE IMPROVE YOUR PRESTASHOP WEBSITE'S SEO RANKING?

SEO Audit is a powerful tool that can improve your PrestaShop website's SEO ranking in a number of ways, including:
- SEO analysis: SEO Audit provides a comprehensive SEO analysis of your website, including a review of your content, URLs, sitemap, robots.txt file, and social media integration. This analysis helps you identify areas that need improvement.
- Content generation: SEO Audit integrates with ChatGPT to help you generate engaging, optimized content for your website. ChatGPT is an AI-powered tool that can help you create content that is both informative and relevant to your target audience.
- URL optimization: SEO Audit can help you optimize your website's URLs for SEO, including removing IDs from URLs, creating 301 redirects for duplicate URLs, and shortening URLs so they are easier to remember and type.
- Sitemap generation: SEO Audit can automatically generate a sitemap for your website and submit it to search engines, helping them crawl and index your website more effectively.
- Social media optimization: SEO Audit can help you optimize your website for social media sharing, for example by adding Open Graph tags to your pages so your content appears more prominently in social media feeds.

Introduction video:

Join the success story of our "SEO Audit" module, with over 3,000 downloads and a thriving community of satisfied PrestaShop users. But why take our word for it when you can hear it straight from our customers?

Fortune F: "Fortune found our module to work seamlessly and was especially impressed with the support provided by PrestaHero. When they needed a specific feature, our Support and Development team stepped in to deliver exactly what was needed."

Carl-Johan Hallum-Olesen: "Carl-Johan is a big fan of our module, citing the time it saved and the valuable insights it provided for improving his website. While expressing satisfaction, he also shared some suggestions for future updates, highlighting the potential for even greater functionality."

NextGen Music GmbH, Dejan Popadic: "Dejan Popadic and the NextGen Music GmbH team have found our module to be a comprehensive solution. They praised its ease of use, extensive functions, and the reliable support provided by PrestaHero. We're proud to have them as a satisfied customer and partner in our journey."

With such positive feedback and a growing user base, the module continues to evolve to meet the needs of PrestaShop website owners. Download SEO Audit and improve your website's SEO ranking today!

Have questions or need assistance? Feel free to leave a comment or contact our technical support directly on the Addons Marketplace.
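For readers unfamiliar with Open Graph tags mentioned above: the module's exact output is not shown in this post, but a minimal, purely illustrative set of tags for a product page (all values are placeholders) looks like this:

<meta property="og:type" content="website" />
<meta property="og:title" content="Product name" />
<meta property="og:description" content="Short product description" />
<meta property="og:image" content="https://example.com/img/product.jpg" />
<meta property="og:url" content="https://example.com/product-url" />

Social networks read these tags to choose the title, description, and image shown when a page is shared.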
-
Hello everyone, I have a problem with the indexing of my site. I generated the robots.txt file automatically from Settings -> SEO & URLs. I discovered that my site is not being indexed, and a check with Visual SEO Studio returned the following:

Pages downloaded: 0
Pages per second: 0
Seconds per page: 0
Total crawl time: 00:00:00
Total download time: 00:00:00
Termination cause: Unexpected HTTP response code in robots.txt
An unexpected HTTP response code ('306 Unused') in robots.txt prevents crawling

I can't figure out what the problem is or how to fix it. Did I generate robots.txt incorrectly? Thanks for your help!
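One quick way to see what the server actually returns for robots.txt, independently of how the file was generated, is to request only the response headers, for example with curl (replace the placeholder domain with your own):

curl -I https://www.example.com/robots.txt

A correctly served file should come back with a 200 status; anything else, such as the 306 reported above, usually points to a server, redirect, or security-module configuration issue rather than to the contents of the file itself.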
-
Hello everyone, today I received a notification from Google Search Console that reads as follows: "Our systems have recently detected an issue with your homepage that affects how our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help us understand that your website works properly. By blocking access to these assets, it is possible that the site will not rank as well as it could." After researching the subject, it is indeed recommended not to restrict access to these files, so that Google can index the site properly. Looking at the robots.txt file generated automatically by PrestaShop, I don't see any rule that explicitly forbids Googlebot from accessing these files (see the screenshot). Does anyone have a lead? Thanks in advance
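A common workaround, sketched here rather than an official PrestaShop fix, is to explicitly allow static assets in the Googlebot section of robots.txt:

User-agent: Googlebot
Allow: /*.js$
Allow: /*.css$

It is also worth checking the directories the generated file disallows (such as /modules/), since they can contain CSS and JavaScript files that the theme or modules load on the page even though no rule names those files directly.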
-
Hello everyone: Let's see if anyone can help me solve something I've been stuck on for days and days without finding a solution. When I submit the sitemap of my PrestaShop 1.5 site to Google and Bing Webmaster Tools, I get the following error: "Blocked by robots.txt". What can I do to fix this? Google's and Bing's help pages have been no use to me. Does this prevent my site from being indexed in the search engines? Thanks, I would appreciate any kind of help.
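Two things are usually worth checking in this situation (the file name below is an example; it depends on how the sitemap was generated): that the sitemap file itself is not covered by a Disallow rule, and that the URLs listed inside it are not blocked either. Declaring the sitemap in robots.txt also helps crawlers find it:

Sitemap: http://www.example.com/sitemap.xml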
-
Hello, I'm reaching out because I use Lengow to manage my product feeds, and I've run into a problem. I have an issue with Google Shopping, which seems not to crawl a large number of pages because of the robots.txt file. I think I've actually found the reason, but I don't know how to work around it. My feed contains URLs with utm_source/utm_medium/utm_campaign parameters, which is expected since that is what I configured in my Lengow interface to track my campaigns in GA, for example: http://www.outy-store.fr/visseuse/68-boulonneuse-a-chocs-18v-440nm-btw450z-makita-0088381084949.html?LGWCODE=68;8621;599&utm_source=Googleshopping&utm_term=BTW450Z&utm_medium=comparateur&utm_campaign=Googleshopping But in my robots.txt I have:

Disallow: /*utm_source=
Disallow: /*utm_medium=
Disallow: /*utm_campaign=

So it makes sense that crawling is blocked. What do you advise? Remove those Disallow lines from my robots.txt? That's what comes to mind, but if PrestaShop includes them in its default robots.txt there must be a reason, right? Thanks for your help, A
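If the rules are to be kept for other bots but Google should still be able to fetch the tagged URLs, one sketch is to add explicit Allow rules to the existing Googlebot section (Google prefers an Allow over a Disallow of equal specificity; whether the Shopping crawler honors the generic Googlebot group should be verified against Google's current documentation):

User-agent: Googlebot
Allow: /*utm_source=
Allow: /*utm_medium=
Allow: /*utm_campaign=

The default Disallow lines are presumably there to keep tracking-parameter URLs, which are duplicates of the canonical product pages, out of the index; allowing them for the feed crawler trades that protection for crawlability of the feed URLs.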
-
Hello everyone. I have a multistore setup with some extra stores that sell the same thing. The domain names are different, but Google has flagged them as duplicates. Now I'm trying to have only the main site indexed and to disallow Google from the other multistore URLs. But how can I do this? Can I put "Disallow: /example.com" in robots.txt and still have my main store indexed? PrestaShop only has one robots.txt for all stores. I have put this in theme/multistorethemeiwantexcluded/header.tpl: <meta name="robots" content="index,nofollow"> Thanks
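Robots.txt rules apply only to the host that serves the file, so another domain cannot be disallowed from within a single shared file. One common workaround on an Apache host is to serve a different robots.txt per domain via .htaccess; this is a sketch, with the host name and file name as placeholders, and it assumes mod_rewrite is available:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?secondary-store\.example$ [NC]
RewriteRule ^robots\.txt$ robots_secondary.txt [L]

where robots_secondary.txt simply contains:

User-agent: *
Disallow: /

Note also that a meta tag with content="index,nofollow" still allows indexing; keeping a store out of the index via a meta tag generally requires "noindex".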
-
Hi, I should start by saying I'm new to the PrestaShop world, so let me explain my problem right away. I used PrestaShop to create my online shop; so far so good. Since my site is not finished yet (I'll launch in about a month) but it is already online, I wanted to block indexing by search engines. I tried opening the robots.txt file I found in the theme folder, but honestly I haven't understood what I should change in it:

# GoogleBot specific
User-agent: Googlebot
Disallow: /*orderby=
Disallow: /*orderway=
Disallow: /*tag=
Disallow: /*id_currency=
Disallow: /*search_query=
Disallow: /*id_lang=
Disallow: /*back=
Disallow: /*utm_source=
Disallow: /*utm_medium=
Disallow: /*utm_campaign=
Disallow: /*n=
# All bots
User-agent: *
# Directories
Disallow: /classes/
Disallow: /config/
Disallow: /download/
Disallow: /mails/
Disallow: /modules/
Disallow: /translations/
Disallow: /tools/
Disallow: /it/
# Files
Disallow: /addresses.php
Disallow: /address.php
Disallow: /authentication.php
Disallow: /cart.php
Disallow: /discount.php
Disallow: /footer.php
Disallow: /get-file.php
Disallow: /header.php
Disallow: /history.php
Disallow: /identity.php
Disallow: /images.inc.php
Disallow: /init.php
Disallow: /my-account.php
Disallow: /order.php
Disallow: /order-opc.php
Disallow: /order-slip.php
Disallow: /order-detail.php
Disallow: /order-follow.php
Disallow: /order-return.php
Disallow: /order-confirmation.php
Disallow: /pagination.php
Disallow: /password.php
Disallow: /pdf-invoice.php
Disallow: /pdf-order-return.php
Disallow: /pdf-order-slip.php
Disallow: /product-sort.php
Disallow: /search.php
Disallow: /statistics.php
Disallow: /attachment.php
Disallow: /guest-tracking
# Sitemap

I also tried opening the header.tpl file, where I wanted to insert my own code:

<head>
<title>...</title>
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
</head>

Unfortunately I couldn't find where to put it; I only found this line:

<meta name="robots" content="{if isset($nobots)}no{/if}index,follow" />

What do I need to do to prevent the entire site (every page, module, etc.) from being indexed by search engines? My site: http://www.lunar-tech.com I hope you can help me. Thank you very much
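A minimal sketch of how an entire site can be kept out of search engines while it is under construction is to replace, rather than edit, the generated rules (and restore the original file before launch):

User-agent: *
Disallow: /

Alternatively, the template line quoted above can be changed so that every page emits a noindex meta tag; this is an illustrative edit to that snippet, not an official PrestaShop setting:

<meta name="robots" content="noindex,nofollow" />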
-
Hello everyone, does anyone know whether there is a module to generate a robots.txt file suited to PrestaShop, so I don't have to write it by hand? Thanks in advance!
-
I have thousands of crawl errors due to robots.txt restrictions; some of them are understandable, since one cannot allow access to certain files or folders, but a lot of these WMT errors come from my site search, for example: http://www.caprice-shop.ro/search.php?tag=verighete%205.5%20mm I have just commented out the line Disallow: /search.php in robots.txt. Is this OK? Or should I leave it in effect? Have a nice weekend, everybody!
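For context, the two states described above look like this in robots.txt (a sketch; keeping internal search results out of crawling is the PrestaShop default, and "blocked by robots.txt" reports for such URLs are usually expected rather than errors to fix):

# Option 1: keep internal search pages blocked (the default rule)
Disallow: /search.php

# Option 2: let crawlers fetch search pages (the line commented out, as described above)
# Disallow: /search.php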
-
I don't know what to do with this issue. Please, can anyone help me? My web hosting company suspended my site with the following message:

"System administration has identified your account as using higher resources on the server housing your account. This is impacting other users, and we may be forced to suspend or have already suspended your site in order to stabilize the server. We noticed that your site (PrestaShop-based site), VARIABLE_1, is being heavily 'crawled' by search engines. Search engines tend to mimic the effect of hundreds of visitors going through every portion of your site, often all at once. You may wish to implement a robots.txt file in order to reduce this effect. This file contains instructions for well-behaved 'robots' on how to crawl your site. You can find more information about this here: http://www.robotstxt.org/. The basic format would be as follows to block robots from the following (example) directories:

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /tmp/
Disallow: /private/

To use this effectively, you will need to review your site and see which parts might be the most intensive. An alternative to blocking a search engine is to request that its robots not crawl through your site as quickly as they normally would. It is an unofficial extension to the robots.txt standard, but one that most popular search engines use. This is an example of how you request that robots fetch pages only every ten seconds:

User-agent: *
Crawl-delay: 10

This is especially useful for parts of your site like forums or 'tag clouds' that, while useful to human visitors, are troublesome in terms of how robots aggressively pass through them repeatedly. You can also use your access logs to see how search engines are hitting your site. Let us know if you need help finding your logs in our control panel and we'll be glad to help. If your site is currently suspended, please contact us to lift the suspension in order to implement the above recommendation. As always, feel free to contact us with any further questions."

Would these recommendations really help? I have been having many problems with PrestaShop SEO lately.
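As a rough sketch of how the host's advice could be applied to a PrestaShop shop, combining a crawl delay with the parameter and directory rules from the default robots.txt shown earlier in this thread (the delay value is arbitrary, and not every search engine honors Crawl-delay; Googlebot in particular ignores it):

User-agent: *
Crawl-delay: 10
# Parameterized URLs that generate many near-duplicate pages
Disallow: /*orderby=
Disallow: /*orderway=
Disallow: /*tag=
Disallow: /*search_query=
# Internal directories from the default PrestaShop robots.txt
Disallow: /classes/
Disallow: /config/
Disallow: /modules/
Disallow: /translations/
Disallow: /tools/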