[WordPress] Plugin Spotlight: Odyssey LLMS

100+ active installations
★★★★★ 5/5 (2 reviews)
Last updated: 46 days ago
Support issues resolved: 100%
Requires WordPress 5.8+ · Version 6.1.14 · Listed: 2025-12-03

Description

Summary: Odyssey LLMS is the definitive control panel for your website's AI presence. It generates the critical files used by Large Language Models (such as ChatGPT, Claude, and Gemini) to understand and cite your content.

Questions & Answers:
1. What is llms.txt?
- Think of llms.txt as a "sitemap for AI". Humans use HTML pages and search engines use XML sitemaps, while AI agents look for an llms.txt file in your root directory. This file gives them a clean, prioritised list of links, ensuring they train on your best content and ignore the junk.
2. What is llms-full.txt (Markdown)?
- This is an optional advanced feature (RAG-ready). Instead of just providing links, llms-full.txt converts your actual website content into clean, lightweight Markdown.
- Why it's useful: it lets AI agents ingest your website's knowledge immediately, without visiting and scraping every single HTML page. This reduces server load and ensures the AI gets accurate data for Retrieval-Augmented Generation (RAG).
3. What is llms.jsonl (fine-tuning)?
- This file formats your content into prompt-completion pairs (JSON Lines), the standard format for fine-tuning models such as GPT-4 or Llama 3.
New features in 6.0:
- JSONL generator: create a dataset ready for fine-tuning custom AI models.
- Content cleaning: use CSS selectors (e.g. .sidebar, .comments) to strip unwanted elements from your Markdown and JSONL files.
- Visual analytics: see exactly which AI bots (ChatGPT, Claude, Google AI, etc.) are accessing your files, with a built-in dashboard widget and charts.
- WooCommerce integration: automatically generates a structured "Products" section including price, stock status, and SKU.
Key features:
- Clean tabbed interface: organised into General Rules, Content Sourcing, Analytics, Robots.txt, Security, and Tools.
- Granular bot-specific rules: set detailed Allow or Disallow rules for individual AI crawlers (GPTBot, Google-Extended, etc.).
- "Block All by Default" mode: create a secure whitelist by blocking all crawlers by default and only allowing the bots you explicitly enable.
- Settings import & export: perfect for agencies. Easily back up, restore, and migrate your settings between sites with JSON import/export.
- Advanced scheduling: regenerate your files on save, manually, or on a recurring schedule (hourly, daily, weekly, monthly).
- Safety validator: prevents accidentally blocking all traffic in robots.txt.

Plugin Tags

Developer Team

⬇ Download the latest version (v6.1.14) or install via search

① Download the ZIP → Dashboard "Plugins › Add New › Upload Plugin"
② Search for "Odyssey LLMS" in the dashboard → install directly (recommended)
📦 Download previous versions

Original Plugin Description

Odyssey LLMS is the definitive control panel for your website’s AI presence. It generates the critical files used by Large Language Models (like ChatGPT, Claude, and Gemini) to understand and cite your content.
For Beginners: Just activate it. A comprehensive and optimised llms.txt file is instantly generated. No configuration is needed.
For Power Users: Manage every aspect of your AI strategy. Track bot traffic with built-in analytics, generate JSONL datasets for fine-tuning, and clean up your content with CSS selectors.
Concepts Explained: Why do you need this?
1. What is llms.txt?
Think of llms.txt as a “Sitemap for AI”. While humans use HTML pages and Search Engines use XML sitemaps, AI agents look for an llms.txt file in your root directory. This file gives them a clean, prioritised list of links to crawl, ensuring they train on your best content and ignore the junk.
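To make the format concrete: an llms.txt file is just a small Markdown document of prioritised links. The sketch below follows the common llms.txt convention (H1 title, blockquote summary, H2 sections with link lists); the site name and URLs are invented for illustration, not output from this plugin:

```markdown
# Example Site

> A short description of what the site covers.

## Docs

- [Getting Started](https://example.com/docs/start): installation and setup guide
- [API Reference](https://example.com/docs/api): endpoint details

## Blog

- [Launch Announcement](https://example.com/blog/launch)
```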
2. What is llms-full.txt (Markdown)?
This is an optional advanced feature (RAG-Ready). Instead of just providing links, llms-full.txt provides your actual website content converted into clean, lightweight Markdown format.
* Why it’s useful: It allows AI agents to ingest your website’s knowledge immediately without needing to visit and scrape every single HTML page. This reduces server load and ensures the AI gets accurate data for “Retrieval Augmented Generation” (RAG).
* ⚠️ WARNING regarding Virtual Mode Limits: When using Virtual Mode to generate this file, the item limit for the llms-full.txt file is securely capped at 50 by default. Manually increasing this limit beyond 50 in the ‘Tools’ settings will drastically increase server load and risks causing immediate 500/503 server crashes. Use this feature at your own risk. If you require more than 50 items in your llms-full.txt file, we recommend using Physical Mode instead.
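For comparison with plain llms.txt, llms-full.txt inlines the page content itself rather than linking out to it. A sketch of what a single entry might look like (the page text is invented for illustration):

```markdown
# Getting Started

URL: https://example.com/docs/start

Install the plugin from the WordPress dashboard and activate it.
A default llms.txt file is generated immediately; no configuration
is required.
```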
3. What is llms.jsonl (Fine-Tuning)?
This file formats your content into prompt-completion pairs (JSON Lines). This is the standard format used to fine-tune models like GPT-4 or Llama 3 on your specific data.
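A prompt-completion pair in JSON Lines format is one JSON object per line. The `prompt`/`completion` field names below follow the common fine-tuning convention, and the text is invented for illustration:

```json
{"prompt": "What does the Getting Started guide cover?", "completion": "It explains how to install and activate the plugin from the WordPress dashboard."}
{"prompt": "Is any configuration required after activation?", "completion": "No. A default llms.txt file is generated immediately."}
```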
New Features in 6.0:

JSONL Generator: Create a dataset ready for fine-tuning custom AI models.
Content Cleaning: Use CSS selectors (e.g., .sidebar, .comments) to strip unwanted elements from your Markdown and JSONL files.
Visual Analytics: Visualise exactly which AI bots (ChatGPT, Claude, Google AI, etc.) are accessing your files with a built-in dashboard widget and charts.
WooCommerce Integration: Automatically generates a structured “Products” section including Price, Stock, and SKU.
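The content-cleaning idea above can be sketched in a few lines of Python. This is a minimal illustration of the technique (strip elements whose class matches a configured selector list before extracting text), not the plugin's actual code, which is PHP; the class names `.sidebar` and `.comments` are the examples from the feature list:

```python
# Minimal sketch of CSS-selector content cleaning using only the
# standard library: skip any element whose class is in the configured
# list, and collect the remaining text.
from html.parser import HTMLParser

class Cleaner(HTMLParser):
    def __init__(self, skip_classes):
        super().__init__()
        self.skip_classes = set(skip_classes)
        self.depth = 0   # > 0 while inside a skipped element
        self.out = []

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "").split()
        if self.depth or self.skip_classes & set(classes):
            self.depth += 1   # entering (or nested inside) a skipped element

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1   # leaving a skipped subtree

    def handle_data(self, data):
        if not self.depth and data.strip():
            self.out.append(data.strip())

html = ('<article><h1>Post Title</h1><p>Main content.</p>'
        '<div class="sidebar">Widgets</div>'
        '<div class="comments">Comment thread</div></article>')

# Selectors the user might configure, e.g. ".sidebar, .comments"
cleaner = Cleaner(["sidebar", "comments"])
cleaner.feed(html)
print("\n".join(cleaner.out))  # sidebar and comments are stripped
```

A real implementation would support full CSS selectors (descendant combinators, IDs, etc.) rather than bare class names, but the flow is the same: remove matching subtrees, then serialise what remains to Markdown.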

Key Features:

Clean Tabbed Interface: Organised into General Rules, Content Sourcing, Analytics, Robots.txt, Security, and Tools.
Granular Bot-Specific Rules: Set detailed Allow or Disallow rules for individual AI crawlers (GPTBot, Google-Extended, etc.).
“Block All by Default” Mode: Create a secure “whitelist” by blocking all crawlers by default and only allowing the bots you explicitly enable.
Settings Import & Export: Perfect for agencies. Easily back up, restore, and migrate your settings between sites with JSON import/export.
Advanced Scheduling: Regenerate your file on save, manually, or on a recurring schedule (Hourly, Daily, Weekly, Monthly).
Safety Validator: Prevents accidental blocking of all traffic in robots.txt.
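The bot-specific rules described above ultimately translate into ordinary robots.txt directives. A hand-written sketch of a "block some AI crawlers, allow one" policy, using the user-agent tokens documented by the respective vendors (GPTBot for OpenAI, Google-Extended for Google's AI training opt-out, ClaudeBot for Anthropic):

```
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: ClaudeBot
Allow: /
```

In "Block All by Default" mode the plugin inverts this: every known AI crawler gets a Disallow rule unless you explicitly enable it.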
