mauricerenck/darkvisitors
Latest stable version: 1.1.3
Composer install command:
composer require mauricerenck/darkvisitors
Package description
Kirby robots.txt plugin for blocking AI Crawlers and Bots
README
Dark Visitors is a plugin for Kirby 3 and 4 that blocks unwanted AI Crawlers from your website using robots.txt. It uses the Dark Visitors API to identify and block unwanted visitors.
It also allows you to add custom rules and your sitemaps to your robots.txt file.
Installation
composer require mauricerenck/darkvisitors
Or download the latest release, unzip it, and copy it to site/plugins/dark-visitors.
Get the access token
You need a Dark Visitors access token to use this plugin. Go to https://darkvisitors.com/, create an account, and create your own project. Open the project and copy your access token from its settings.
Usage
Edit your config.php and add the following line:
'mauricerenck.dark-visitors.token' => 'YOUR TOKEN'
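For context, a minimal sketch of how this option fits into a typical Kirby site/config/config.php (the token value is a placeholder; replace it with the token from your Dark Visitors project settings):

```php
<?php

// site/config/config.php
// Minimal sketch: 'YOUR TOKEN' is a placeholder for your real access token.
return [
    'mauricerenck.dark-visitors.token' => 'YOUR TOKEN',
];
```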
AI crawlers
Set which types of AI crawlers you want to block:
'mauricerenck.dark-visitors.aiTypes' => ['AI Assistant', 'AI Data Scraper', 'AI Search Crawler'],
Your custom rules
Add your custom rules to the robots.txt file:
'mauricerenck.dark-visitors.agents' => [
    [
        'userAgents' => ['Googlebot', 'Bingbot'],
        'disallow' => ['/admin'],
    ],
    [
        'userAgents' => ['Bingbot'],
        'allow' => ['/microsoft'],
    ],
],
Setting your custom rules will overwrite the default rules, which are:
[
    'userAgents' => ['*'],
    'disallow' => ['/kirby', '/site'],
]
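As an illustration only (the exact output depends on the plugin version and any AI-crawler rules fetched from the Dark Visitors API), the default rules above would produce robots.txt entries along these lines:

```
User-agent: *
Disallow: /kirby
Disallow: /site
```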
Sitemaps
Add your sitemaps to the robots.txt file:
'mauricerenck.dark-visitors.sitemaps' => [
    'Sitemap: https://your-site.tld/sitemap.xml',
    'Sitemap: https://your-site.tld/sitemap2.xml',
],
Tracking/Analytics
Dark Visitors also offers a tracking feature. If you want to use it, enable it in your config:
'mauricerenck.dark-visitors.analytics' => true,
Learn more about robots.txt and AI crawlers
Statistics
- Total downloads: 304
- Monthly downloads: 0
- Daily downloads: 0
- Favers: 18
- Clicks: 0
- Dependents: 0
- Suggesters: 0
Other information
- License: MIT
- Last updated: 2024-04-18