Robots.txt

Living in the root of millions of websites is a little file called robots.txt that tells search engines and other "robots" what they can and can't crawl on your website. This file doesn't actually prevent robots from accessing your site; it works on the honor system, but Google and other major search engines do respect it.

The default template is called robots.dust and is pretty small. It exposes your sitemap to search engines and asks them not to index any of the admin panel's pages.

You don't need to include robots.dust in your theme unless you want to override the default; if it's absent from your theme, Postleaf falls back to its built-in template.
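
If you do decide to override it, here's a minimal sketch of the kind of output a custom robots.dust might produce. The /admin/ path and the sitemap URL are assumptions for illustration only; check your installation's actual admin route and sitemap location rather than copying these values.

```
# Minimal robots.txt sketch (paths below are hypothetical examples)
# Ask all robots to stay out of the admin panel
User-agent: *
Disallow: /admin/

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```

Because robots.txt is an honor system, rules like these keep well-behaved crawlers out of private areas but are not a substitute for real access control on the admin panel.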

Learn more about how search engines and other "robots" use robots.txt.