$317 of $898.88 has been raised. Last updated May 18 at 4:20pm.
I hate asking for money — I prefer to earn it. But today I'm going to do something I hate because it will benefit the project and make Postleaf available to millions of users who don't have desktops or laptops.
Living in the root of millions of websites is a little file called robots.txt that tells search engines and other "robots" what they can and can't index on your website. This file doesn't actually restrict robots from crawling your site (it works on the honor system), but Google and the other major search engines do respect it.
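For example, a minimal robots.txt might look like this (an illustrative sketch, not any particular site's file):

```
# Apply these rules to every crawler
User-agent: *

# Ask crawlers to stay out of a private area; compliance is voluntary
Disallow: /private/
```

Well-behaved crawlers read these rules before indexing; badly behaved ones can simply ignore them.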
The default template, robots.dust, is pretty small. It currently exposes your sitemap to search engines and asks them not to index any of the admin panel's pages.
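For reference, here's a rough sketch of what that template might contain. This is a hypothetical reconstruction based on the description above; the actual contents of robots.dust, including whatever helper it uses for the sitemap URL, may differ:

```
# Ask crawlers not to index the admin panel
User-agent: *
Disallow: /admin/

# Expose the sitemap ({@sitemapUrl /} is a made-up helper for illustration)
Sitemap: {@sitemapUrl /}
```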
Embeds are a great feature of the web that lets you include third-party content in your posts. Think YouTube videos, Google Maps, and more! All you have to do is copy and paste the embed code from the provider's website.
Searching for embed code can be tiresome, though. Each provider has their own way of exposing the code, so users are left to hunt for it. Wouldn't it be better if you could just copy the URL from the top of your browser and paste it into your post?
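That's essentially what the oEmbed protocol makes possible: hand a provider a plain URL and it hands back the embed markup for it. Here's a minimal sketch of the idea in TypeScript. The endpoint shown is YouTube's real oEmbed endpoint, but the function itself is illustrative, not Postleaf's actual implementation:

```typescript
// Sketch: resolve a pasted URL to embed markup via the oEmbed protocol.
// The function name and error handling here are illustrative only.
async function fetchEmbedHtml(pastedUrl: string): Promise<string> {
  const endpoint =
    "https://www.youtube.com/oembed?format=json&url=" +
    encodeURIComponent(pastedUrl);

  const response = await fetch(endpoint);
  if (!response.ok) {
    throw new Error(`oEmbed lookup failed: HTTP ${response.status}`);
  }

  // Per the oEmbed spec, video responses carry the embed code in `html`.
  const { html } = (await response.json()) as { html: string };
  return html;
}

// Usage: paste a plain watch URL, get back ready-to-insert <iframe> markup.
fetchEmbedHtml("https://www.youtube.com/watch?v=VIDEO_ID").then((html) =>
  console.log(html)
);
```

With something like this in place, the editor can accept a bare URL and quietly swap in the provider's embed code behind the scenes.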
A discussion on GitHub this morning inspired this post. It touches on something I've wanted to do in my software for years but haven't, because of user demand.
Why would I want to remove something that so many users seem to want? I'm glad you asked.