
mediawiki/crawlable-all-pages

Latest stable version: 1.0.0

Composer install command:

composer require mediawiki/crawlable-all-pages

Package description

Extension to remove robot restrictions from Special:AllPages in MediaWiki

README

This extension overrides Special:AllPages by changing the HTML header of the page. This is a relatively easy way to allow a search engine crawler to index all the pages in your wiki.

The HTML removed is simply:

<meta name="robots" content="noindex,nofollow"/>
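You can confirm the change took effect by fetching Special:AllPages and looking for this tag in the response. A minimal sketch of that check; the curl URL is illustrative, and the offline portion below simulates the page header so the grep can be seen working without a live wiki:

```shell
# Live check (illustrative URL -- replace with your wiki's address):
# curl -s 'https://example.org/wiki/Special:AllPages' | grep -io '<meta name="robots"[^>]*>'

# Offline simulation with a sample header, so the grep behavior is visible:
html='<head><meta name="robots" content="noindex,nofollow"/></head>'
printf '%s\n' "$html" | grep -io '<meta name="robots"[^>]*>'
```

If the extension is installed and working, the live check should print nothing (or a policy without `noindex,nofollow`) for Special:AllPages.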

Installation without composer

  • Download and place the files in a directory called CrawlableAllPages in your extensions/ folder.
  • Add the following code at the bottom of your LocalSettings.php:
wfLoadExtension( 'CrawlableAllPages' );
  • ✓ Done – Navigate to Special:Version on your wiki to verify that the extension is successfully installed.
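The manual steps above can be sketched as shell commands. Assumptions: `$MW` stands in for your MediaWiki root (a temporary directory here, so the sketch is safe to run as-is), and the extension files have already been downloaded and unpacked:

```shell
# Stand-in for your MediaWiki root; in practice this is e.g. /var/www/mediawiki
MW=$(mktemp -d)

# Place the extension files under extensions/CrawlableAllPages
mkdir -p "$MW/extensions/CrawlableAllPages"

# Append the load directive to the bottom of LocalSettings.php
touch "$MW/LocalSettings.php"   # stand-in for your existing settings file
echo "wfLoadExtension( 'CrawlableAllPages' );" >> "$MW/LocalSettings.php"

tail -n 1 "$MW/LocalSettings.php"
```

On a real installation you would then visit Special:Version to confirm the extension is listed.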

Installation with composer

  • If you do not have a composer.local.json file in your MediaWiki installation, create one:
echo '{ "require": { "mediawiki/crawlable-all-pages": "dev-master" } }' > composer.local.json
  • If you have jq and sponge (from moreutils) installed and an existing composer.local.json, you can use the following command to add this extension to your composer.local.json file:
jq '.require += { "mediawiki/crawlable-all-pages": "dev-master" }' \
   composer.local.json | sponge composer.local.json
  • Run composer update
composer update
  • ✓ Done – Navigate to Special:Version on your wiki to verify that the extension is successfully installed.
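For the first bullet above, note that composer.local.json must be valid JSON: "require" must be quoted and the braces balanced, or Composer will reject the file. A minimal sketch that writes the file by hand and validates it, assuming python3 is available for the validation step:

```shell
# Work in a throwaway directory so nothing real is overwritten
tmp=$(mktemp -d)
cd "$tmp"

# Write a well-formed composer.local.json requiring the extension
echo '{ "require": { "mediawiki/crawlable-all-pages": "dev-master" } }' > composer.local.json

# Validate the JSON (assumes python3; `jq . composer.local.json` works too)
python3 -m json.tool composer.local.json
```

If the file is malformed, the validator exits non-zero and prints the parse error, which is cheaper to catch here than mid-way through `composer update`.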

Statistics

  • Total downloads: 22
  • Monthly downloads: 0
  • Daily downloads: 0
  • Favorites: 0
  • Hits: 0
  • Dependent projects: 0
  • Suggesters: 0

GitHub information

  • Stars: 0
  • Watchers: 0
  • Forks: 0
  • Language: PHP

Other information

  • License: GPL-3.0-or-later
  • Last updated: 2019-03-20