How to Edit robots.txt in Shopify

How to customize your Shopify store's robots.txt using the robots.txt.liquid template. Default rules, common customizations, and gotchas.

Shopify generates your store's robots.txt automatically, and for years there was no way to customize it. That changed when Shopify introduced the robots.txt.liquid template. You can now add custom rules, but the process is different from uploading a file to a traditional web server.

This guide covers Shopify's default robots.txt, how to customize it with the liquid template, common modifications you will want to make, and the gotchas that trip people up.

Shopify's Default robots.txt

Every Shopify store comes with a pre-configured robots.txt. You can see yours at https://yourstore.com/robots.txt. The default file is extensive -- Shopify blocks quite a few paths out of the box.

Here is what Shopify's default robots.txt typically includes:

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /carts
Disallow: /account
Disallow: /collections/*sort_by*
Disallow: /*/collections/*sort_by*
Disallow: /collections/*+*
Disallow: /collections/*%2B*
Disallow: /collections/*%2b*
Disallow: /*/collections/*+*
Disallow: /*/collections/*%2B*
Disallow: /*/collections/*%2b*
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /blogs/*%2b*
Disallow: /*/blogs/*+*
Disallow: /*/blogs/*%2B*
Disallow: /*/blogs/*%2b*
Disallow: /*design_theme_id*
Disallow: /*preview_theme_id*
Disallow: /*preview_scaffold_builder*
Disallow: /search
Disallow: /apple-app-site-association
Disallow: /.well-known

Sitemap: https://yourstore.com/sitemap.xml

Shopify blocks these paths for good reasons:

  • /admin, /cart, /checkout, /account -- These are session-specific pages that should never be indexed.
  • /*sort_by* and /*+* -- These are collection filter and sort parameters that create thousands of duplicate URLs.
  • /search -- Internal search results are thin content.
  • /*design_theme_id* and /*preview_theme_id* -- Theme preview URLs that should not be public.

Avoid removing Shopify's default rules
 
The robots.txt.liquid template is designed for adding rules. The defaults are rendered by Liquid inside the template itself, so they can technically be edited or removed, but Shopify strongly recommends leaving them in place -- they protect checkout, cart, and duplicate-content URLs. In the standard setup, your custom rules are appended after the defaults.
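For example, with a hypothetical custom GPTBot block added, the rendered file looks roughly like this -- defaults first, your custom group after (defaults truncated here for brevity):

```
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /account
Sitemap: https://yourstore.com/sitemap.xml

User-agent: GPTBot
Disallow: /
```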

Creating the robots.txt.liquid Template

To customize your Shopify robots.txt, you need to create a robots.txt.liquid template in your theme.

1. Open the theme editor

In your Shopify admin, go to Online Store > Themes. Click "Actions" (or the three-dot menu) on your active theme, then select "Edit code."

2. Add a new template

In the Templates folder, click "Add a new template." Select robots.txt from the template type dropdown. Shopify will create a robots.txt.liquid file.

3. Edit the template

The template starts with default Liquid that outputs Shopify's standard rules. Add your custom rules below it.

4. Save the template

Click "Save." Your changes are live immediately.

When Shopify generates the file, the default robots.txt.liquid template contains a Liquid loop that renders the standard rules:

{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}

Leave this loop in place so the defaults keep rendering. To add custom rules, write them as plain text below the loop, mixed with Liquid logic if needed.
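Alternatively, when a rule belongs inside the default catch-all group rather than in a new group of its own, Shopify's developer docs show a pattern of injecting it into the loop. A sketch, where the disallowed path is only an example:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- comment -%} Example: add one rule to the catch-all (*) group {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /pages/internal-use-only' }}
  {%- endif -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```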

Adding Custom Rules

Here is how to add custom rules to your Shopify robots.txt.

Block a specific path:

{% comment %} Shopify's default rules render above {% endcomment %}

User-agent: *
Disallow: /pages/internal-use-only
Disallow: /pages/old-landing-page

Block AI crawlers:

{% comment %} Shopify's default rules render above {% endcomment %}

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

Add an additional sitemap:

{% comment %} Shopify's default rules render above {% endcomment %}

Sitemap: https://yourstore.com/sitemap-custom.xml


Common Shopify Customizations

Here are the customizations Shopify store owners most frequently need.

Blocking Specific Pages

If you have pages that should not be indexed -- internal landing pages, thank-you pages, or draft content:

{% comment %} Shopify's default rules render above {% endcomment %}

User-agent: *
Disallow: /pages/thank-you
Disallow: /pages/coming-soon
Disallow: /pages/staff-only

Blocking Tag Filter Pages

Shopify creates tag-filter URLs within collections, in the form /collections/collection-handle/tag-name. These can generate massive amounts of near-duplicate content:

{% comment %} Shopify's default rules render above {% endcomment %}

User-agent: *
Disallow: /collections/*/*

Be careful with this wildcard -- depending on your store structure it can also block legitimate nested collection URLs, so test it before relying on it.
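Wildcard matching in robots.txt follows RFC 9309: `*` matches any run of characters and a trailing `$` anchors the pattern to the end of the URL path. A small Python sketch of that matching logic (the function and sample URLs are illustrative, not part of any Shopify tooling) makes it easy to see what a broad rule like /collections/*/* actually catches:

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Return True if a robots.txt path pattern matches a URL path.

    Implements RFC 9309 matching: '*' matches any sequence of
    characters, a trailing '$' anchors the pattern to the end of
    the path, and everything else matches literally.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# Shopify's default sort_by rule catches sorted collection URLs
print(rule_matches("/collections/*sort_by*", "/collections/shoes?sort_by=price"))  # True
# The broad tag-blocking rule catches any nested collection path
print(rule_matches("/collections/*/*", "/collections/summer/products"))  # True
print(rule_matches("/collections/*/*", "/collections/summer"))           # False
```

Running a prospective rule against a sample of your real URLs this way is a quick sanity check before you publish it.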

Blocking Vendor and Type Pages

Shopify auto-generates vendor and type pages at URLs like /collections/vendors?q=Vendor-Name and /collections/types?q=Product-Type:

{% comment %} Shopify's default rules render above {% endcomment %}

User-agent: *
Disallow: /collections/vendors?
Disallow: /collections/types?

Adding Crawl-delay for Aggressive Bots

If a specific bot is hitting your store too hard, Crawl-delay can throttle it. Note that Googlebot ignores Crawl-delay, but SEO crawlers such as AhrefsBot and SemrushBot generally honor it:

{% comment %} Shopify's default rules render above {% endcomment %}

User-agent: AhrefsBot
Crawl-delay: 10

User-agent: SemrushBot
Crawl-delay: 10

Using Liquid Logic

Since the template supports Liquid, you can use conditional logic:

{% comment %} Shopify's default rules render above {% endcomment %}

{% if shop.domain contains 'staging' %}
User-agent: *
Disallow: /
{% endif %}

User-agent: GPTBot
Disallow: /

This blocks all crawlers if the store is on a staging domain, and always blocks GPTBot regardless.

You can also use Liquid to dynamically include shop information:

{% comment %} Shopify's default rules render above {% endcomment %}

Sitemap: https://{{ shop.domain }}/sitemap.xml


Testing After Changes

After saving your robots.txt.liquid template, verify the changes are live.

1. Check the live file

Open https://yourstore.com/robots.txt in your browser. Your custom rules should appear alongside Shopify's defaults.

2. Verify rule order

Shopify's default rules appear first, followed by your custom rules. Make sure your custom groups do not duplicate or contradict the defaults.

3. Test specific URLs

Use a robots.txt testing tool to verify that the URLs you want blocked are actually blocked, and the URLs you want indexed are still accessible.

4. Check Google Search Console

After a few days, check Google Search Console's Pages report to ensure no important pages are newly blocked.
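For quick local checks, Python's standard-library urllib.robotparser can evaluate rules against a pasted robots.txt. One caveat: it implements the original robots.txt spec without `*` wildcard support, so use a wildcard-aware tester for Shopify's default patterns. The rules below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# A pasted robots.txt to check (illustrative rules, not Shopify's full defaults)
robots_txt = """\
User-agent: *
Disallow: /pages/thank-you

User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The catch-all group blocks only the thank-you page for normal crawlers
print(parser.can_fetch("Googlebot", "/pages/thank-you"))   # False
print(parser.can_fetch("Googlebot", "/products/widget"))   # True
# The GPTBot group blocks everything for that agent
print(parser.can_fetch("GPTBot", "/products/widget"))      # False
```

Point it at your live file (or paste the output) and probe the exact URLs you care about before and after each change.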

Gotchas and Limitations

Overriding Shopify's defaults is discouraged

In the standard setup, your custom rules are added alongside Shopify's defaults, not instead of them. Removing a default requires editing the Liquid that renders it, which Shopify advises against -- and paths like /checkout should stay blocked regardless.

Theme updates may reset your template

If you switch themes or a theme update overwrites your templates, you may lose your robots.txt.liquid customizations. Keep a backup.

Liquid errors break the file

A syntax error in your Liquid code can render an inline Liquid error message or truncated output instead of valid directives. Always check the live output after saving.

Headless Shopify stores need separate handling

If you use Shopify as a headless backend with a custom storefront, the robots.txt from your storefront domain is what matters, not Shopify's. You will need to manage robots.txt on your storefront hosting.

Multiple storefronts have separate files

If you use Shopify Markets with separate domains, each domain has its own robots.txt. Customizations apply per-theme, so test each domain.

Keep it simple

Shopify's defaults cover the most important rules. Most stores only need to add AI crawler blocks and perhaps block a few specific pages. Do not over-engineer your robots.txt -- each new rule is a potential source of errors.


Shopify handles the basics. You handle the rest.
