[WordPress] Plugin share: AI Provider for exo

New plugin · No ratings yet · Last updated: 10 days ago
Requires: WordPress 7.0+ · PHP 8.3+ · Version: v1.0.1 · Listed: 2026-04-21

⬇ Download the latest version (v1.0.1), or install via search

① Download the ZIP → Dashboard "Plugins › Add New › Upload Plugin"
② Search "AI Provider for exo" from the dashboard → install directly (recommended)
📦 Download previous versions

Original plugin description

This plugin registers exo as an AI provider in WordPress 7’s AI Client SDK and adds it to the Connectors page.
exo connects all your devices into an AI cluster, enabling you to run frontier models locally. It exposes an OpenAI-compatible API that this plugin connects to.
Features:

Registers exo as a WordPress AI provider
OpenAI-compatible text generation via exo’s chat completions API
Auto-detect active models from your running exo cluster
Capability detection — displays model capabilities (Text, Code, Thinking, Vision) as badges
“Connect & Detect” / “Save & Re-detect” connector flow
Optional API key authentication with secure storage
Configurable endpoint URL (default: http://localhost:52415)
Settings integrated into WordPress 7’s Connectors page
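Since the plugin talks to exo's OpenAI-compatible chat completions API, the same request can be issued by hand to sanity-check the cluster before wiring it into WordPress. A minimal sketch, assuming the default endpoint; the model name is illustrative, so substitute one your cluster has actually loaded:

```shell
# Build a standard OpenAI-style chat completions request body.
# Model name below is an example -- use a model loaded on your cluster.
BODY='{"model": "mlx-community/Llama-3.2-3B-Instruct-8bit",
       "messages": [{"role": "user", "content": "Suggest a title for a post about local AI."}]}'

# Send it to exo's chat completions endpoint (default port 52415).
curl -s http://localhost:52415/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "$BODY" || echo "exo cluster not reachable at localhost:52415"
```

If the cluster is up and the model is loaded, the response follows the usual OpenAI chat completion shape, with the generated text under the first choice's message content.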

Choosing a Model:
exo exposes every model in its catalog, but only models actively loaded on your cluster will respond. Use “Connect & Detect” on the Connectors page to discover which models are running.
Recommended — Instruct models produce clean, usable output for WordPress AI features (title generation, content suggestions, etc.):

Llama-3.2-3B-Instruct-8bit — ~3 GB, fast, great for short tasks
Meta-Llama-3.1-8B-Instruct-4bit — ~4 GB, good balance of speed and quality
Llama-3.3-70B-Instruct-4bit — ~35 GB, best quality, needs a larger cluster

Avoid — Reasoning/thinking models (Qwen3.5, DeepSeek, GLM, Nemotron-Nano) spend most tokens on internal chain-of-thought, producing slow responses with minimal visible output.
To load a model: exo run mlx-community/Llama-3.2-3B-Instruct-8bit
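After loading a model, you can confirm what the cluster exposes from the command line. This is a hedged sketch: it assumes exo's OpenAI-compatible API serves the standard /v1/models listing endpoint (which is what the plugin's auto-detection implies), and uses the same endpoint variable the plugin reads:

```shell
# Use the configured endpoint if set, otherwise exo's default.
ENDPOINT="${AIPRFOEX_ENDPOINT:-http://localhost:52415}"

# Assumption: exo's OpenAI-compatible API serves the standard model listing.
curl -s "$ENDPOINT/v1/models" || echo "exo cluster not reachable at $ENDPOINT"
```

The model you just loaded with `exo run` should appear in the returned list; that same list is what "Connect & Detect" surfaces on the Connectors page.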
Requirements:

WordPress 7.0 or later
PHP 8.3 or later
A running exo cluster (see exo documentation)

Configuration
The plugin can be configured via the Connectors page or environment variables:

AIPRFOEX_ENDPOINT — exo API endpoint (default: http://localhost:52415)
AIPRFOEX_API_KEY — Optional API key for authentication
AIPRFOEX_MODEL — Model name to use (auto-detected if empty)

You can also define these as constants in wp-config.php.
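The three documented variables can be sketched as environment settings like this (the values are illustrative; set them in the web server's environment, or define same-named constants in wp-config.php as noted above):

```shell
# Configure the plugin via its documented environment variables.
export AIPRFOEX_ENDPOINT="http://localhost:52415"   # exo API endpoint (default shown)
export AIPRFOEX_API_KEY=""                          # optional API key; empty = no auth
export AIPRFOEX_MODEL=""                            # empty = auto-detect from the cluster
```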
