bnomei/kirby3-robots-txt
Manage a virtual robots.txt from the Kirby config file
Date: 2018-10-05 15:56
innoweb/silverstripe-robots
Adds a Robots.txt file that is configurable from /admin/settings/.
Date: 2018-09-21 07:14
verschuur/laravel-robotstxt
Set the robots.txt content dynamically based on the Laravel app environment.
Date: 2017-01-16 15:14
nickmoline/robots-checker
Class to check a URL against all available robots-exclusion methods
Date: 2016-12-26 23:21
devgeniem/wp-noindex-testing-staging-robots
Uses robots.txt to keep search engines from indexing development/testing environments of a site
Date: 2016-10-30 18:57
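Several of the packages above (verschuur/laravel-robotstxt, devgeniem/wp-noindex-testing-staging-robots, and mgargano/non-production-robots-ignore further down) share one idea: serve a disallow-all robots.txt everywhere except production. A minimal, language-agnostic sketch of that gating, written here in Python with a hypothetical APP_ENV variable (this is not any of these packages' actual API):

```python
import os

# robots.txt payloads: an empty Disallow permits everything,
# "Disallow: /" blocks every crawler from the whole site.
ALLOW_ALL = "User-agent: *\nDisallow:\n"
BLOCK_ALL = "User-agent: *\nDisallow: /\n"

def robots_txt(env=None):
    """Return robots.txt content: block crawlers outside production.

    APP_ENV is an assumed variable name; real plugins read whatever
    their framework's environment convention is.
    """
    env = env or os.environ.get("APP_ENV", "production")
    return ALLOW_ALL if env == "production" else BLOCK_ALL
```

The same conditional can live in a route handler, a generated static file, or a deploy step; the packages differ mainly in where they hook it into their framework.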
execut/yii2-robots-txt
Module for generating a robots.txt file from URL rules
Date: 2016-07-13 13:23
vipnytt/useragentparser
User-Agent parser for robot rule sets
Date: 2016-04-08 16:48
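The job such a parser performs can be sketched briefly: robots.txt rules are grouped by User-agent token, and a crawler picks the group whose token best matches its own product token, falling back to the wildcard group. A rough Python illustration of that selection logic (an assumption about the general technique, not this library's API):

```python
def select_group(groups, user_agent):
    """Pick the rule group whose token best matches the crawler's UA string.

    Matching is case-insensitive and the longest matching token wins;
    "*" is the fallback group. `groups` maps token -> list of rule lines.
    """
    ua = user_agent.lower()
    candidates = [t for t in groups if t != "*" and t.lower() in ua]
    if candidates:
        return groups[max(candidates, key=len)]
    return groups.get("*", [])

rules = {
    "*": ["Disallow: /tmp/"],
    "googlebot": ["Disallow: /private/"],
}
```

With these rules, "Googlebot/2.1" gets the googlebot group while an unlisted bot falls back to the "*" group.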
vipnytt/robotstxtparser
Robots.txt parsing library, with full support for every directive and specification.
Date: 2016-04-08 10:58
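For a sense of what a robots.txt parser's interface typically looks like, here is the same task done with Python's standard-library urllib.robotparser (shown only as an illustration of the technique, not as this PHP library's API):

```python
from urllib.robotparser import RobotFileParser

# A small robots.txt rule set, parsed from in-memory lines
# rather than fetched from a live site.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths under /private/ are blocked for every user agent;
# everything else is allowed by default.
print(parser.can_fetch("MyBot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("MyBot", "https://example.com/index.html"))         # True
```

Full-featured libraries add support for non-standard directives (Crawl-delay, Host, Sitemap) on top of this basic allow/deny check.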
vipnytt/sitemapparser
XML Sitemap parser class compliant with the Sitemaps.org protocol.
Date: 2016-04-03 19:25
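The Sitemaps.org format is plain XML with a fixed namespace, so the core of any such parser is namespace-aware extraction of <loc> entries. A minimal Python sketch using the standard library (again an illustration of the protocol, not this package's API):

```python
import xml.etree.ElementTree as ET

# All Sitemaps.org elements live in this namespace.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

xml_doc = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2016-04-03</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>"""

def extract_urls(xml_text):
    """Collect every <loc> value from a Sitemaps.org urlset document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

print(extract_urls(xml_doc))  # ['https://example.com/', 'https://example.com/about']
```

A complete parser also handles sitemap index files (which nest <sitemap> entries pointing at further sitemaps) and the optional <lastmod>, <changefreq>, and <priority> fields.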
anomaly/robots-extension
A simple robots.txt generator.
Date: 2016-03-12 04:39
hofff/contao-robots-txt-editor
Editor for automatic creation and modification of robots.txt.
Date: 2015-10-26 21:42
mgargano/non-production-robots-ignore
Non Production Robots Ignore is a simple WordPress plugin, used with vlucas/phpdotenv (or similar), that reads environment variables and sets robots.txt to disallow all crawlers in every environment except production.
Date: 2015-09-16 12:21
beeg99/phpcrawl
PHPCrawl is a web crawler/spider library written in PHP. It supports filters, limiters, cookie handling, robots.txt handling, multiprocessing, and much more.
Date: 2015-09-14 14:23
dawid-z/phpcrawl
PHPCrawl is a web crawler/spider library written in PHP. It supports filters, limiters, cookie handling, robots.txt handling, multiprocessing, and much more.
Date: 2015-09-12 13:48