Optimizes your site's robots.txt to reduce server load and CO2 footprint by blocking unnecessary crawlers while allowing major search engines and specific tools.
- Test this plugin on the WordPress Playground.
- Download the plugin right here and install it.
Warning
The plugin will delete your existing robots.txt file if one exists on your server, although it will try to back it up first. When you uninstall the plugin, it will restore the backup if it can.
The default output of this plugin can be seen on joost.blog and on emilia.capital.
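As a purely illustrative sketch of the general shape such an output takes (this is not the plugin's actual default; see the links above for that), a robots.txt that allows a major search engine while blocking everything else could look like this:

```
# Illustrative only - not the plugin's default output.
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
```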
The plugin exposes the following filters (a usage sketch follows this list):

- `emilia/ecofriendly_robots/allowed_spiders` - filters an `array` of user agents.
- `emilia/ecofriendly_robots/blocked_paths` - filters an `array` of blocked paths.
- `emilia/ecofriendly_robots/allowed_paths` - filters an `array` of allowed paths; these should be subsets of `blocked_paths`.
- `emilia/ecofriendly_robots/output` - filters the entire output as a `string`.
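As a sketch of how these filters might be hooked from a site's own code (for instance in a small must-use plugin), assuming they receive the array and string values described above; the user agent and path values below are placeholders, not the plugin's defaults:

```php
<?php
// Allow an extra crawler on top of the plugin's defaults ('ExampleBot' is a placeholder).
add_filter( 'emilia/ecofriendly_robots/allowed_spiders', function ( array $spiders ) {
	$spiders[] = 'ExampleBot';
	return $spiders;
} );

// Block an additional path ('/private/' is a placeholder).
add_filter( 'emilia/ecofriendly_robots/blocked_paths', function ( array $paths ) {
	$paths[] = '/private/';
	return $paths;
} );

// Append a comment to the generated robots.txt output.
add_filter( 'emilia/ecofriendly_robots/output', function ( string $output ) {
	return $output . "\n# Adjusted via the output filter.\n";
} );
```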
If you're developing this plugin, you will probably want to run the tests and linters. You can do that with the following commands:
- PHP code style: `composer check-cs`
- PHP code style autofixer: `composer fix-cs`
- PHP lint: `composer lint`
- PHP unit tests: `composer test`