How to Edit robots.txt in WordPress

Edit your WordPress robots.txt file using Yoast SEO, Rank Math, or direct file editing. Step-by-step instructions for each method.

WordPress handles robots.txt differently from most platforms. By default, it generates a virtual robots.txt file dynamically -- there is no physical file on disk. This catches many developers off guard when they try to edit it directly.

This guide covers three methods to edit your WordPress robots.txt: through Yoast SEO, through Rank Math, and by creating a physical file. Each approach has trade-offs.

How WordPress Handles robots.txt

WordPress generates a virtual robots.txt file automatically. When a crawler requests https://yoursite.com/robots.txt, WordPress intercepts the request and outputs a default set of rules.

The default WordPress virtual robots.txt looks like this:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yoursite.com/wp-sitemap.xml

This blocks crawlers from the WordPress admin area (except the AJAX endpoint, which some themes and plugins need) and points to the built-in WordPress sitemap.

Key things to understand:

  • There is no physical robots.txt file in your WordPress installation by default. The file is generated on the fly.
  • If you create a physical file, WordPress will serve that instead of the virtual one.
  • Plugins can modify the virtual file by hooking into the robots_txt filter.
  • If your site is set to "Discourage search engines" (Settings > Reading), WordPress adds Disallow: / to the virtual file, blocking all crawlers from everything.
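
Before editing anything, it helps to confirm how a given set of rules will actually be interpreted. As a rough local sanity check, Python's standard-library parser can evaluate rules like the defaults above (the yoursite.com URLs are placeholders; note that this parser uses the original first-match semantics rather than Google's longest-match rule, so keep the narrow Allow line above the broad Disallow to keep both interpretations in agreement):

```python
# Sanity-check robots.txt rules with Python's standard-library parser.
# Caveat: urllib.robotparser applies rules first-match-wins (the original
# spec), not Google's longest-match semantics, so list the narrow Allow
# before the broad Disallow to keep both interpretations in agreement.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The admin area is blocked, but the AJAX endpoint stays reachable.
print(parser.can_fetch("*", "https://yoursite.com/wp-admin/"))                # False
print(parser.can_fetch("*", "https://yoursite.com/wp-admin/admin-ajax.php"))  # True
```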

Check your Reading settings

Go to Settings > Reading and make sure "Discourage search engines from indexing this site" is unchecked. This is a common cause of accidental de-indexing, especially on sites that were set up as staging environments and then went live.

Method 1: Editing with Yoast SEO

Yoast SEO provides a built-in robots.txt editor. This is the most popular method and does not require FTP access.

Step 1: Open Yoast SEO settings

In your WordPress admin, go to Yoast SEO > Tools. Click on "File editor."

Step 2: Edit the robots.txt content

You will see a text area with your current robots.txt content. If the file does not exist yet, Yoast will offer to create it. Edit the content as needed.

Step 3: Save your changes

Click "Save changes to robots.txt." Yoast creates a physical robots.txt file in your WordPress root directory.

Step 4: Verify the changes

Open https://yoursite.com/robots.txt in your browser to confirm the updated content is being served.

Yoast creates a physical file

When you use Yoast's file editor, it creates an actual robots.txt file in your WordPress root directory. This file overrides the virtual robots.txt entirely. Any changes made via the robots_txt filter or other plugins will no longer take effect.

If you do not see the "File editor" option in Yoast, your hosting environment may have file editing disabled. Check if DISALLOW_FILE_EDIT is set to true in your wp-config.php.

Method 2: Editing with Rank Math

Rank Math also includes a robots.txt editor, and it works similarly to Yoast's.

Step 1: Open Rank Math settings

Go to Rank Math > General Settings > Edit robots.txt.

Step 2: Edit the content

Rank Math shows your current robots.txt content in an editable text area. Modify the directives as needed.

Step 3: Save your changes

Click "Save Changes." Rank Math will create or update the physical robots.txt file.

Step 4: Test the result

Visit https://yoursite.com/robots.txt in your browser to verify.

Rank Math also provides a robots.txt validation feature that checks for basic syntax errors before saving.


Method 3: Direct File Editing

If you do not use Yoast or Rank Math, or if you prefer full control, you can create and edit the robots.txt file directly.

Via FTP or SFTP:

  1. Connect to your server using an FTP client (FileZilla, Cyberduck, etc.)
  2. Navigate to your WordPress root directory (the same directory that contains wp-config.php)
  3. Create or edit the robots.txt file
  4. Upload the file

Via your hosting file manager:

Most hosting control panels (cPanel, Plesk, etc.) include a file manager. Navigate to your WordPress root directory and create or edit the robots.txt file.

Via SSH:

# Connect to your server
ssh user@yourserver.com

# Navigate to WordPress root
cd /var/www/html

# Create or edit robots.txt
nano robots.txt

When you create a physical file, it completely replaces the virtual robots.txt. WordPress will not add its default rules or sitemap reference automatically. You are responsible for the entire file content.
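
To make that concrete, here is a minimal sketch that writes a complete physical robots.txt, default admin rules and sitemap reference included. The docroot path and the sitemap URL are placeholders; adjust them for your server:

```python
# Writing a physical robots.txt replaces the virtual one entirely, so the
# file must carry the default admin rules and the sitemap reference itself.
# DOCROOT and the sitemap URL are placeholders for this sketch.
from pathlib import Path

DOCROOT = Path(".")  # your WordPress root, e.g. /var/www/html

content = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/

Sitemap: https://yoursite.com/wp-sitemap.xml
"""

(DOCROOT / "robots.txt").write_text(content, encoding="utf-8")
```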

WordPress-Specific robots.txt Rules

Here is a well-configured robots.txt for a typical WordPress site:

User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-json/
Disallow: /wp-login.php
Disallow: /xmlrpc.php
Disallow: /readme.html
Disallow: /license.txt
Disallow: /?s=
Disallow: /search/
Disallow: /author/
Disallow: /*?replytocom=
Disallow: /tag/*/page/
Disallow: /category/*/page/

# Allow CSS, JS, and images for rendering
Allow: /wp-content/uploads/
Allow: /wp-content/themes/

Sitemap: https://yoursite.com/wp-sitemap.xml
Sitemap: https://yoursite.com/sitemap_index.xml

Let me break down the reasoning:

  • /wp-admin/ and /wp-includes/ -- Backend files that should not be indexed
  • /wp-content/plugins/ -- Plugin files that should not appear in search results. Be aware that plugins also serve CSS and JavaScript from this path; if Google Search Console reports rendering problems, add Allow rules for the affected assets.
  • /wp-json/ -- REST API endpoints that produce raw JSON, not user-facing pages
  • /?s= and /search/ -- Internal search results, which are thin content
  • /author/ -- Author archive pages, often low-value duplicate content on single-author sites
  • /*?replytocom= -- Comment reply URLs that create duplicate page versions
  • /tag/*/page/ and /category/*/page/ -- Paginated tag and category archives


What You Should Never Block in WordPress

Do not block these paths

  • /wp-content/uploads/ -- This is where your media files live. Blocking it hides images from Google Image Search.
  • /wp-content/themes/ -- This contains your CSS and JavaScript. Blocking it prevents Google from rendering your pages.
  • /wp-admin/admin-ajax.php -- Many themes and plugins use this endpoint for front-end functionality. Always add an explicit Allow rule for it.

Handling WooCommerce

If you run WooCommerce, add these rules to prevent crawling of cart, checkout, and account pages:

Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /*?add-to-cart=*
Disallow: /*?orderby=*
Disallow: /*?filter_*

These pages are session-specific and should not be indexed. The filter and orderby parameters create thousands of duplicate URLs that waste crawl budget.
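
The wildcard-free rules above can be verified with the same standard-library parser used earlier (it cannot evaluate the *-parameter rules, which are a Google extension); the URLs here are illustrative:

```python
# Verify the static WooCommerce path rules with Python's built-in parser.
# The *-pattern parameter rules need Google-style wildcard matching,
# which urllib.robotparser does not implement.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://yoursite.com/cart/"))  # False
print(rp.can_fetch("*", "https://yoursite.com/shop/"))  # True
```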

Virtual vs. Physical File: Which to Use

Approach | Pros | Cons
---------|------|-----
Virtual (default) | Auto-generated, includes the sitemap, managed by WordPress | Cannot customize without code or plugins
Plugin editor (Yoast/Rank Math) | Easy GUI editing, no FTP needed | Creates a physical file that overrides the virtual one
Physical file (manual) | Full control, version controllable | Must manage everything yourself, including sitemap references

For most WordPress sites, using a plugin editor is the best balance of convenience and control. If you manage your site through version control and deployment pipelines, a physical file that you maintain in your repository is the cleanest approach.


WordPress makes robots.txt easy to edit, but easy to break. Test after every change.
