
fuelviews/laravel-robots-txt

Latest stable version: v1.0.0



Laravel Robots.txt is a robust and easy-to-use solution designed to automatically generate and serve dynamic robots.txt files for your Laravel application. The package provides intelligent caching, environment-based rules, and seamless integration with your application's routing system.

Requirements

  • PHP ^8.3
  • Laravel ^10.0 || ^11.0 || ^12.0

Installation

Install the package via Composer:

composer require fuelviews/laravel-robots-txt

Publish the configuration file:

php artisan vendor:publish --tag="robots-txt-config"

Basic Usage

Automatic Route Registration

The package automatically registers a route at /robots.txt that serves your dynamic robots.txt file:

https://yoursite.com/robots.txt

Configuration

Configure your robots.txt rules in config/robots-txt.php:

<?php

return [
    /**
     * The disk where the robots.txt file will be saved
     */
    'disk' => 'public',

    /**
     * User agent rules for different paths
     */
    'user_agents' => [
        '*' => [
            'Allow' => [
                '/',
            ],
            'Disallow' => [
                '/admin',
                '/dashboard',
            ],
        ],
        'Googlebot' => [
            'Allow' => [
                '/',
            ],
            'Disallow' => [
                '/admin',
            ],
        ],
    ],

    /**
     * Sitemaps to include in robots.txt
     */
    'sitemap' => [
        'sitemap.xml',
    ],
];

Environment Behavior

Development/Staging Environments

In non-production environments (app.env !== 'production'), the package automatically generates a restrictive robots.txt:

User-agent: *
Disallow: /

This prevents search engines from indexing your development or staging sites.

Production Environment

In production, the package uses your configured rules to generate the robots.txt file.
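For illustration, the mapping from the user_agents and sitemap config arrays to the generated robots.txt text can be sketched in plain PHP. This is a simplified sketch of the idea, not the package's actual implementation; the function name and signature are assumptions:

```php
<?php

// Sketch: render a robots.txt body from a config array shaped like
// config/robots-txt.php. Illustrative only -- not the package's code.
function renderRobotsTxt(array $userAgents, array $sitemaps, string $baseUrl): string
{
    $blocks = [];

    // One block per user agent, listing each directive's paths.
    foreach ($userAgents as $agent => $rules) {
        $lines = ["User-agent: {$agent}"];
        foreach ($rules as $directive => $paths) {
            foreach ($paths as $path) {
                $lines[] = "{$directive}: {$path}";
            }
        }
        $blocks[] = implode("\n", $lines);
    }

    // Sitemap entries become absolute URLs in a final block.
    if ($sitemaps !== []) {
        $blocks[] = implode("\n", array_map(
            fn (string $s) => 'Sitemap: ' . rtrim($baseUrl, '/') . "/{$s}",
            $sitemaps
        ));
    }

    return implode("\n\n", $blocks) . "\n";
}
```

Called with the default config shown above and a base URL of https://yoursite.com, this produces output matching the "Example Generated Output" section below.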

Advanced Usage

Using the Facade

use Fuelviews\RobotsTxt\Facades\RobotsTxt;

// Get robots.txt content
$content = RobotsTxt::getContent();

// Generate fresh content (bypasses cache)
$content = RobotsTxt::generate();

// Save to a specific disk and path
RobotsTxt::saveToFile('s3', 'seo/robots.txt');

Direct Class Usage

use Fuelviews\RobotsTxt\RobotsTxt;

$robotsTxt = app(RobotsTxt::class);

// Get content (regenerated automatically if stale)
$content = $robotsTxt->getContent();

// Generate and save to custom location
$robotsTxt->saveToFile('public', 'custom-robots.txt');

Named Routes

The package registers a named route that you can reference:

// In your views
<link rel="robots" href="{{ route('robots') }}">

// Generate URL
$robotsUrl = route('robots');

Configuration Options

Disk Configuration

Specify which Laravel filesystem disk to use for storing the robots.txt file:

'disk' => 'public', // or 's3', 'local', etc.
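Whatever name you use here must correspond to a disk defined in config/filesystems.php. For context, the standard Laravel 'public' disk looks like this (this is Laravel's default definition, shown for reference):

```php
// config/filesystems.php (excerpt)
'disks' => [
    'public' => [
        'driver' => 'local',
        'root' => storage_path('app/public'),
        'url' => env('APP_URL') . '/storage',
        'visibility' => 'public',
    ],
],
```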

User Agent Rules

Define rules for different user agents:

'user_agents' => [
    '*' => [
        'Allow' => ['/'],
        'Disallow' => ['/admin', '/dashboard'],
    ],
    'Googlebot' => [
        'Allow' => ['/api/public/*'],
        'Disallow' => ['/api/private/*'],
    ],
    'Bingbot' => [
        'Crawl-delay' => ['1'],
        'Disallow' => ['/admin'],
    ],
],

Sitemap Integration

Include sitemap URLs in your robots.txt:

'sitemap' => [
    'sitemap.xml',
    'posts-sitemap.xml',
    'categories-sitemap.xml',
],

This generates:

Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/posts-sitemap.xml
Sitemap: https://yoursite.com/categories-sitemap.xml

Caching System

The package uses an intelligent caching system that regenerates the robots.txt file only when:

  • The configuration changes
  • The application environment changes
  • The application URL changes
  • The cached file doesn't exist
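The invalidation conditions above amount to a checksum comparison over the regeneration inputs. The following is an illustrative sketch of that logic, not the package's internal code (only the cache key robots-txt.checksum is confirmed by this README):

```php
<?php

// Sketch: regeneration is needed when a checksum over the inputs
// (config, environment, app URL) differs from the cached checksum.
// Illustrative only -- not the package's internal implementation.
function needsRegeneration(array $config, string $env, string $url, ?string $cachedChecksum): bool
{
    $checksum = md5(serialize([$config, $env, $url]));

    // A null cached checksum means no file has been generated yet.
    return $checksum !== $cachedChecksum;
}
```

Changing any input (for example, adding a Disallow rule or switching APP_ENV) changes the checksum, which triggers a rebuild on the next request.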

Cache Management

Cache is automatically managed, but you can clear it manually:

use Illuminate\Support\Facades\Cache;

// Clear the robots.txt cache
Cache::forget('robots-txt.checksum');

File Storage

Automatic Storage

The package automatically stores the generated robots.txt file to your configured disk at robots-txt/robots.txt.

Custom Storage

use Fuelviews\RobotsTxt\Facades\RobotsTxt;

// Save to specific location
RobotsTxt::saveToFile('s3', 'seo/robots.txt');

// Save to multiple locations
RobotsTxt::saveToFile('public', 'robots.txt');
RobotsTxt::saveToFile('backup', 'robots-backup.txt');

Example Generated Output

Production Environment

User-agent: *
Allow: /
Disallow: /admin
Disallow: /dashboard

User-agent: Googlebot
Allow: /
Disallow: /admin

Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/posts-sitemap.xml

Non-Production Environment

User-agent: *
Disallow: /

Testing

Run the package tests:

composer test

Troubleshooting

Robots.txt Not Updating

If your robots.txt isn't reflecting configuration changes:

  1. Clear the application cache: php artisan cache:clear
  2. Ensure your configuration is valid
  3. Check file permissions for the storage disk

Route Conflicts

If you have an existing /robots.txt route or static file:

  1. Remove any static public/robots.txt file so the dynamic route can respond (the package also removes it automatically)
  2. Ensure no other routes conflict with /robots.txt

Changelog

Please see CHANGELOG for more information on what has changed recently.

Contributing

Please see CONTRIBUTING for details.

Security Vulnerabilities

Please review our security policy on how to report security vulnerabilities.

Credits

📜 License

The MIT License (MIT). Please see License File for more information.

Built with ❤️ by the Fuelviews team

⭐ Star us on GitHub | 📦 View on Packagist
