
Ultimate AI Connector for Compatible Endpoints

Author: David Stone
Connects the WordPress AI Client to Ollama, LM Studio, or any AI endpoint that uses the standard chat completions API format.
Version
2.0.0
Last updated
Apr 24, 2026

This plugin extends the WordPress AI Client to support any AI service or server that uses the standard chat completions API format (/v1/chat/completions and /v1/models endpoints).
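To make the "standard chat completions API format" concrete, here is a minimal sketch of a request body in that format. The endpoint URL and model name are illustrative (Ollama's OpenAI-compatible server listens at http://localhost:11434/v1 by default); this is not code from the plugin itself.

```python
import json

# Illustrative endpoint; any server implementing the standard format works.
ENDPOINT = "http://localhost:11434/v1/chat/completions"

# A minimal chat completions request payload.
payload = {
    "model": "llama3",  # any model ID reported by the /v1/models endpoint
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
    "temperature": 0.7,
}

# The request body is sent as JSON via HTTP POST to the endpoint above.
body = json.dumps(payload)
print(body)
```

The response follows the same convention across compatible servers: a JSON object whose `choices` list carries the generated messages.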

Supported services include:

  • Ollama – Run open-source models (Llama, Mistral, Gemma, etc.) locally on your own hardware.
  • LM Studio – Desktop application for local LLM inference with a one-click server.
  • OpenRouter – Unified API providing access to 100+ models from multiple providers.
  • vLLM – High-throughput inference server for production deployments.
  • LocalAI – Drop-in replacement for running models locally.
  • text-generation-webui – Popular web UI with API server mode.
  • Any compatible endpoint – Works with any server implementing the standard format.

Requirements:

  • WordPress 7.0+ – The AI Client SDK is included in core. This plugin works on its own without any additional dependencies.

Why it matters:

Other AI-powered plugins that use the WordPress AI Client (such as AI Experiments) can automatically discover and use any model you connect through this plugin. Configure your endpoint once and every AI feature on your site can use it.

How it works:

  1. Install and activate the plugin.
  2. Go to Settings > Connectors and configure the connector with your endpoint URL (e.g. http://localhost:11434/v1 for Ollama).
  3. Optionally provide an API key for services that require authentication.
  4. The plugin registers a provider with the WordPress AI Client and dynamically discovers all available models from your endpoint.
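The discovery step in point 4 can be sketched as follows. The response shape shown (a `data` list of entries with an `id` field) is the standard `/v1/models` format; the helper function is a hypothetical illustration, not the plugin's actual code.

```python
import json

# Sample of the standard /v1/models response shape.
sample_response = json.dumps({
    "object": "list",
    "data": [
        {"id": "llama3", "object": "model"},
        {"id": "mistral", "object": "model"},
    ],
})

def discover_models(raw: str) -> list[str]:
    """Return the model IDs advertised by a /v1/models response."""
    return [entry["id"] for entry in json.loads(raw).get("data", [])]

print(discover_models(sample_response))  # ['llama3', 'mistral']
```

Because the model list is fetched from the endpoint rather than hard-coded, any model you pull into Ollama or load in LM Studio appears automatically.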

The plugin also handles practical concerns like extended HTTP timeouts for slow local inference and non-standard port support.
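As a rough sketch of those two concerns: local inference can be slow, so a localhost endpoint gets a much longer timeout, and a non-standard port in the configured URL is preserved. The helper and the timeout values are illustrative assumptions, not the plugin's actual logic.

```python
from urllib.parse import urlparse

DEFAULT_TIMEOUT = 30   # seconds; assumed value for remote services
LOCAL_TIMEOUT = 300    # seconds; assumed extended value for local inference

def request_settings(endpoint: str) -> dict:
    """Derive host, port, and timeout from a configured endpoint URL."""
    parts = urlparse(endpoint)
    is_local = parts.hostname in ("localhost", "127.0.0.1", "::1")
    return {
        "host": parts.hostname,
        # Keep a non-standard port (e.g. Ollama's 11434) if one is given.
        "port": parts.port or (443 if parts.scheme == "https" else 80),
        "timeout": LOCAL_TIMEOUT if is_local else DEFAULT_TIMEOUT,
    }

print(request_settings("http://localhost:11434/v1"))
# {'host': 'localhost', 'port': 11434, 'timeout': 300}
```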

Free on paid plans
By installing, you agree to the WordPress.com Terms of Service and the Third-Party Plugin Terms.
Tested up to
WordPress 7.0
This plugin is available to download for use on your site.