A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google.
Tobi Lütke, Shopify's CEO, broke the news this evening on Twitter, adding that it might take until Monday for the feature to roll out to everyone.
All Shopify stores start with the same robots.txt, which the company says works for most sites, but now the file can be edited through the robots.txt.liquid theme template.
Site owners can make the following edits to the robots.txt file:
- Allow or disallow certain URLs from being crawled
- Add crawl-delay rules for certain crawlers
- Add extra sitemap URLs
- Block certain crawlers
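To make these options concrete, here is a hypothetical robots.txt fragment showing each kind of rule; the URLs and crawler names are illustrative, not taken from Shopify's default file:

```
# Allow or disallow certain URLs from being crawled
User-agent: *
Disallow: /checkout/
Allow: /collections/

# Add a crawl-delay rule for a certain crawler
User-agent: Bingbot
Crawl-delay: 10

# Block a certain crawler entirely
User-agent: ExampleBadBot
Disallow: /

# Add an extra sitemap URL
Sitemap: https://example.com/extra-sitemap.xml
```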
Shopify recommends using Liquid to add or remove directives in the robots.txt.liquid template, rather than replacing its contents with plain text, because Liquid edits let Shopify continue to update the generated file automatically.
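As a sketch of that approach, the template below loops over Shopify's default rule groups and appends a custom Disallow rule for one crawler. The object names (`robots.default_groups`, `group.user_agent`, `group.rules`, `group.sitemap`) follow Shopify's documented pattern for this template, but the specific rule and path are hypothetical:

```liquid
{%- comment -%}
  Output the default groups, then add a custom rule
  for the wildcard user agent. The /internal-search/
  path here is an example, not a Shopify default.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor -%}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /internal-search/' }}
  {%- endif -%}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Because the defaults are rendered from Liquid objects rather than hard-coded, any future changes Shopify makes to its recommended rules flow through automatically.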