May 30, 2017 – The funding blitz has concluded! Many, many thanks to my awesome users who contributed a total of $500 towards the original goal. An additional $743.93 was contributed by Surreal CMS, a content management service I created in 2008.
The overflow funds were used to purchase the larger 12.9-inch iPad Pro, which is even better since I can cover just about any test case with this device. The total cost came to $1,243.93 after the iPad, Smart Keyboard, and sales tax. You can follow this issue on GitHub as I work to improve iOS support!
Living in the root of millions of websites is a little file called
robots.txt that tells search engines and other "robots" what they can and can't index on your website. This file doesn't actually restrict robots from crawling your website — it works like an honor system — but Google and other major search engines do respect it.
The default template is called
robots.dust and is pretty small by default. It currently exposes your sitemap to search engines and asks them not to index any of the admin panel's pages.
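For reference, the file that template renders to would look something like this. This is a hedged sketch based on the description above (expose the sitemap, block the admin panel); the exact paths and sitemap URL depend on your site's configuration:

```txt
# Hypothetical rendered robots.txt — actual paths vary by install
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
```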
Embeds are a great feature of the Web that lets you include third-party content in your posts. Think YouTube videos, Google Maps, and more! All you have to do is copy and paste the embed code from the provider's website.
Searching for embed code can be tiresome, though. Each provider has their own way of exposing the code, so users are left to hunt for it. Wouldn't it be better if you could just copy the URL from the top of your browser and paste it into your post?
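The mechanism commonly used for this kind of URL-to-embed conversion is oEmbed: you send the pasted URL to the provider's oEmbed endpoint, and it responds with metadata that includes ready-to-use embed HTML. Here is a minimal sketch of building such a request; the helper name is my own, and while YouTube does publish an oEmbed endpoint at the path shown, other providers advertise theirs differently:

```python
from urllib.parse import urlencode

def oembed_request_url(endpoint: str, content_url: str, fmt: str = "json") -> str:
    """Build an oEmbed API request for a URL the user pasted.

    The provider's response includes an `html` field with the embed code,
    so the user never has to hunt for it themselves.
    """
    query = urlencode({"url": content_url, "format": fmt})
    return f"{endpoint}?{query}"

# Example: turn a plain YouTube watch URL into an oEmbed request.
request = oembed_request_url(
    "https://www.youtube.com/oembed",
    "https://www.youtube.com/watch?v=dQw4w9WgXcQ",
)
```

Fetching that request URL and reading the `html` field of the JSON response gives you the same markup you would otherwise copy by hand from the provider's share dialog.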
A discussion on GitHub this morning was the inspiration for this post. It addresses something I've been wanting to do in my software for years, but user demand has held me back.
Why would I want to remove something that so many users seem to want? I'm glad you asked.