[WordPress] Plugin Spotlight: Better Robots.txt – AI-Ready Crawl Control & Bot Governance

6,000+ active installs
★★★★ 4.5/5 (102 reviews)
Last updated: 8 days ago
Support issues resolved: 100%
Requires WordPress 5.0+, PHP 7.4+ · v3.0.0 · Listed: 2018-05-15

Overview

Better Robots.txt is a smart WordPress plugin that replaces the default robots.txt workflow with a configurable, previewable one. It helps users manage how search engines, AI crawlers, and other automated agents interact with their site, and suits beginners and advanced users alike.

【Key Features】
• Guided setup wizard
• Preview of the final robots.txt output
• Management of multiple crawler categories
• Free, Pro, and Premium editions
• Cleanup of low-value crawl paths
• Editable core WordPress protection rules


⬇ Download the latest version (v3.0.0) or install via search

① Download the ZIP → Dashboard "Plugins › Add New › Upload Plugin"
② Search "Better Robots.txt – AI-Ready Crawl Control & Bot Governance" in the dashboard → install directly (recommended)
📦 Download previous versions

Original plugin description

Better Robots.txt replaces the default WordPress robots.txt workflow with a smarter, structured version you can configure and preview before publishing.
Instead of a blank textarea, you get a guided wizard with presets, plain-language explanations, and a final Review & Save step so you can inspect the generated robots.txt before it goes live.
Built for beginners and advanced users alike, Better Robots.txt helps you control how search engines, AI crawlers, SEO tools, archive bots, bad bots, social preview bots, and other automated agents interact with your site.
Trusted by thousands of WordPress sites, Better Robots.txt is designed for the AI era without resorting to hype, vague promises, or hidden rules.
Better Robots.txt is available in Free, Pro, and Premium editions. The free plugin covers the guided workflow and essential crawl control features, while Pro and Premium unlock additional governance, protection, and AI-ready modules. Some screenshots on the plugin page show features from all three editions.

Why Better Robots.txt is different
Most robots.txt plugins fall into one of three categories:

Simple text editor
Virtual robots.txt manager
Single-purpose AI or policy add-on

Better Robots.txt goes further.
It gives you a complete, guided crawl control workflow so you can:

Choose a preset that matches your goals
Control major crawler categories without writing everything by hand
Keep core WordPress protection rules visible and editable
Clean up low-value crawl paths that waste crawl budget
Generate a cleaner robots.txt output
Preview the final result before saving

What you can control
Better Robots.txt helps you manage:

Search engine visibility
AI and LLM crawler behavior
AI usage signals such as search, ai-input, and ai-train preferences
SEO tool crawlers
Bad bots and abusive crawlers
Archive and Wayback access
Feed crawlers and crawl traps
WooCommerce crawl cleanup
CSS, JavaScript, and image crawling rules
Social media preview crawlers
ads.txt and app-ads.txt allowance
llms.txt generation
Advanced directives such as crawl-delay and custom rules
Final review before publishing
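To make these categories concrete, here is a hedged sketch of what a generated robots.txt covering several of them could look like. The specific user-agent names and paths below are common illustrative examples, not the plugin's actual output, which varies by preset and edition:

```text
# Allow major search engines
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

# Restrict an AI training crawler (example bot name)
User-agent: GPTBot
Disallow: /

# Slow down an SEO tool crawler (crawl-delay is honored by some bots, not all)
User-agent: AhrefsBot
Crawl-delay: 10

# Core WordPress protection and crawl cleanup
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /?s=
Disallow: /cart/
```

Note that crawl-delay is a non-standard directive: some crawlers respect it, while others (including Googlebot) ignore it, which is one reason a final review step is useful.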

Editions
Better Robots.txt is available in three editions:

Free – Includes the guided setup, the Essential preset, core crawl control features, and the final Review & Save workflow.
Pro – Adds more advanced governance and protection modules, including additional AI, crawler, and cleanup controls.
Premium – Unlocks the most restrictive and advanced protection options, including the Fortress preset and additional high-control modules.

Some options shown in the interface are marked Free, Pro, or Premium so users can immediately understand which modules belong to each edition.
Presets
Setup starts with four presets:

Essential – A clean, practical configuration for most websites that want a better robots.txt without complexity.
AI-First – For publishers and content sites that want AI-ready governance without shutting down discovery.
Fortress – For websites that want stronger protection against scraping, archive capture, and unnecessary crawl activity.
Custom – For users who prefer to configure each module manually.

For many sites, one preset plus a quick review is enough.
Built for beginners and experts
Beginners get:

A guided setup instead of a raw robots.txt box
Preset-based configuration
Plain-language explanations for important choices
A safer workflow with a final preview step

Advanced users get:

Editable core WordPress protection rules
Fine-grained crawler controls by category
WooCommerce-oriented cleanup options
Consolidated output options
Advanced directives and custom rules
A final output they can inspect before publishing

AI-ready, without hype
Better Robots.txt includes features for modern AI-related crawl governance, including:

AI crawler handling
Optional llms.txt support
AI usage signals for compliant systems
Optional machine-readable governance signals for advanced use cases

These features help you express how you want automated systems to use your content.
However, Better Robots.txt does not claim to control AI by force. Like robots.txt itself, these signals are most useful with compliant systems and good-faith crawlers.
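For context, llms.txt (a community proposal, not yet a formal standard) is a Markdown file served at the site root that gives language models a curated summary of the site. A minimal sketch of what generated output could look like, with a placeholder site name and URLs:

```text
# Example Site

> A short plain-language summary of what this site is about and
> which content matters most.

## Docs

- [Getting started](https://example.com/docs/start): overview of the basics

## Optional

- [Archive](https://example.com/archive): older posts
```

Like the AI usage signals above, this file is advisory: it is only effective with systems that choose to read and respect it.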
What Better Robots.txt is
Better Robots.txt is:

A robots.txt governance plugin for WordPress
A guided configuration workflow instead of a raw text editor
A crawl control layer to reduce wasteful crawling
A practical bridge between SEO, crawl hygiene, and AI-era policy signaling
A way to keep your crawl policy clearer for humans and machines

Technical reference for advanced users: Better Robots.txt also maintains a public GitHub repository with product definition, governance notes, and machine-readable artefacts.
What Better Robots.txt is not
Better Robots.txt is not:

A firewall or Web Application Firewall (WAF)
An anti-scraping enforcement engine
A legal compliance engine
A guarantee that every bot will obey your rules
A replacement for server-level security or access control

It helps you publish a clearer crawl policy.
It does not replace infrastructure-level protection.
Typical use cases
Use Better Robots.txt if you want to:

Clean up a weak or noisy default robots.txt
Reduce crawl waste on WordPress or WooCommerce
Keep major search engines allowed while restricting other bots
Control whether archive bots can snapshot your site
Publish AI usage preferences more clearly
Keep social preview bots allowed while limiting scrapers
Review the final file before making it live

Key Features

Guided step-by-step wizard
Preset-based setup: Essential, AI-First, Fortress, Custom
Search engine visibility controls
AI and LLM crawler governance
AI usage signals support
SEO tool crawler controls
Bad bot and abusive crawler options
Archive and Wayback access controls
Spam, feed, and crawl trap cleanup
WooCommerce crawl cleanup options
CSS, JavaScript, and image crawling rules
Social media preview crawler controls
ads.txt and app-ads.txt allowance
Optional llms.txt generation
Consolidated output option
Core WordPress protection rules remain visible and editable
Final Review & Save preview screen

Related plugins

Posts
Filter
Apply Filters
Mastodon