Parse robots.txt, robots meta and headers
Determine if a page may be crawled from robots.txt, robots meta tags and robot headers.
Support us
We invest a lot of resources into creating best in class open source packages. You can support us by buying one of our paid products.
We highly appreciate you sending us a postcard from your hometown, mentioning which of our package(s) you are using. You'll find our address on our contact page. We publish all received postcards on our virtual postcard wall.
Installation
You can install the package via composer:
composer require spatie/robots-txt
Usage
$robots = Spatie\Robots\Robots::create();

$robots->mayIndex('https://www.spatie.be/nl/admin');

$robots->mayFollowOn('https://www.spatie.be/nl/admin');
You can also specify a user agent:
$robots = Spatie\Robots\Robots::create('UserAgent007');
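Put together, a simple crawl loop could consult both checks before fetching a page and following its links. The sketch below is a minimal, hypothetical example that only relies on the mayIndex() and mayFollowOn() calls shown above; the $urlsToVisit queue, the fetch call and the commented-out extractLinks() helper are placeholders for your own crawler code, not part of this package.

use Spatie\Robots\Robots;

$robots = Robots::create('UserAgent007');

// Hypothetical crawl queue; replace with your own source of URLs.
$urlsToVisit = ['https://www.spatie.be/'];

while ($url = array_shift($urlsToVisit)) {
    if (! $robots->mayIndex($url)) {
        continue; // skip pages that the robots rules disallow for this user agent
    }

    $html = file_get_contents($url); // fetch the page however your crawler normally does

    if ($robots->mayFollowOn($url)) {
        // extractLinks() is a placeholder for your own link extraction:
        // $urlsToVisit = array_merge($urlsToVisit, extractLinks($html));
    }
}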
By default, Robots will look for a robots.txt file on https://host.com/robots.txt.
Another location can be specified like so:
$robots = Spatie\Robots\Robots::create()
    ->withTxt('https://www.spatie.be/robots-custom.txt');

$robots = Spatie\Robots\Robots::create()
    ->withTxt(__DIR__ . '/public/robots.txt');
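As a sketch, and assuming the user-agent argument and withTxt() compose as the two examples above suggest, you can point the checks at a local robots.txt copy while still honouring a specific user agent:

use Spatie\Robots\Robots;

// Assumption: create('UserAgent007') and withTxt() can be chained together.
$robots = Robots::create('UserAgent007')
    ->withTxt(__DIR__ . '/public/robots.txt');

// true or false depending on the rules in the local robots.txt for UserAgent007
$mayIndex = $robots->mayIndex('https://www.spatie.be/nl/admin');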
Testing
composer test
Changelog
Please see CHANGELOG for more information on what has changed recently.
Contributing
Please see CONTRIBUTING for details.
Security Vulnerabilities
Please review our security policy on how to report security vulnerabilities.
Postcardware
You're free to use this package, but if it makes it to your production environment we highly appreciate you sending us a postcard from your hometown, mentioning which of our package(s) you are using.
Our address is: Spatie, Kruikstraat 22, 2018 Antwerp, Belgium.
We publish all received postcards on our company website.
Credits
Brent Roose
All Contributors
License
The MIT License (MIT). Please see License File for more information.