When managing a website, controlling what search engines can crawl and index is a core part of SEO and data privacy strategy. The robots.txt file is a simple but powerful configuration file that sits at the root of your website and gives instructions to web crawlers (like Googlebot, Bingbot, etc.) about which pages or folders they are allowed or disallowed to visit.
Whether you're preventing duplicate content from being crawled, hiding internal admin or staging URLs, or optimizing your crawl budget by excluding low-value pages, a well-configured robots.txt can make or break your site's performance in search engines.
However, updating this file often requires direct server access or developer involvement — which may not be possible in agile or no-code environments. That's where redirection.io becomes a powerful ally.
What does this redirection.io recipe do?
This recipe allows you to serve a completely customized robots.txt file directly through redirection.io, without modifying your hosting setup or application code. You can define the entire content of the file within the redirection.io dashboard and ensure it's instantly available to search engines at https://example.com/robots.txt.
With this solution, marketing and SEO teams can autonomously manage crawler access instructions, make immediate changes, and respond to evolving SEO priorities without developer bottlenecks.
Sample robots.txt file you can use
User-agent: *
Disallow: /admin/
Disallow: /staging/
Disallow: /checkout/confirmation
Allow: /blog/
Sitemap: https://example.com/sitemap.xml
This sample file tells all bots (User-agent: *) not to crawl the /admin/, /staging/, or checkout confirmation pages, while explicitly allowing the blog and listing the location of the sitemap.
It is always a good idea to check the validity of your robots.txt file! Use our robots.txt validator before deploying the new file on your website.
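In addition to the validator, you can run a quick programmatic sanity check on the rules before publishing them. The sketch below uses Python's standard urllib.robotparser module with the sample rules and URLs shown above (placeholder values, not part of the recipe); note that this parser applies a basic interpretation of the directives and may differ slightly from how individual search engines read them.

from urllib.robotparser import RobotFileParser

# Sample robots.txt content from above (placeholder rules, adjust to your own file)
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /staging/
Disallow: /checkout/confirmation
Allow: /blog/
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Check which URLs a generic crawler ("*") would be allowed to fetch
for path in ("/blog/my-article", "/admin/users", "/checkout/confirmation"):
    url = "https://example.com" + path
    print(path, "->", "allowed" if parser.can_fetch("*", url) else "disallowed")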
Best practices for robots.txt files
- the robots.txt file only gives directives to crawlers, but those may not respect it! If a page is private, rather use the noindex header to prevent it from being indexed, or even use our HTTP Authentication recipe to require a password to access this resource (a minimal illustration of the noindex header follows this list)
- Always include the Sitemap directive for better crawl efficiency
- Validate your robots.txt before deploying it
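To illustrate the noindex advice above, here is a minimal, hypothetical sketch of where the X-Robots-Tag response header sits, using Python's standard http.server module. This is unrelated to how redirection.io applies headers; it only shows the header a private page would send so that crawlers do not index it.

from http.server import BaseHTTPRequestHandler, HTTPServer

class NoIndexHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the page normally, but tell search engines not to index it
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("X-Robots-Tag", "noindex")
        self.end_headers()
        self.wfile.write(b"<h1>Private page</h1>")

if __name__ == "__main__":
    # Local demo server on port 8000 (placeholder address)
    HTTPServer(("127.0.0.1", 8000), NoIndexHandler).serve_forever()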
How to install this recipe on my website with redirection.io?
Installing this recipe on your website requires the following steps:
- click on "Install on my website"
- review the newly created rule and edit if necessary
- publish the ruleset
A few seconds later, the new robots.txt file will be served on your website.
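Once the ruleset is published, you can verify that the file is served as expected. A minimal check, assuming example.com is replaced by your own domain:

from urllib.request import urlopen

# Replace example.com with your own domain before running
with urlopen("https://example.com/robots.txt") as response:
    print(response.status)           # expect 200
    print(response.read().decode())  # should match the content defined in the dashboard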