The Drupal AI Module: What It Is and How to Use It in Production
The Drupal AI Module ecosystem turns Drupal into an AI-native CMS. Here's what's in the module suite, what actually works in production, and what still needs engineering work.
The Drupal AI Module (or more accurately, the Drupal AI module ecosystem) is a suite of contributed modules that bring LLM-powered features directly into Drupal. It was started in 2023, has grown fast, and as of early 2026 is the de facto standard way to integrate AI into Drupal sites.
The ecosystem is larger than a single module. What most people call “the AI module” is actually a family of modules:
- `ai`: the core framework. Defines the provider abstraction, the operation types (chat, embeddings, classification, etc.), and the plugin system.
- `ai_provider_openai`, `ai_provider_anthropic`, `ai_provider_gemini`, `ai_provider_ollama`, `ai_provider_azure`: individual provider implementations. Swap them without rewriting your code.
- `ai_agents`: lets you define AI agents that can call Drupal tools and functions. This is where the interesting work is happening right now.
- `ai_search`: embeddings-based search over your Drupal content. Plugs into Search API.
- `ai_assistant_api`: backend infrastructure for chatbot-style assistants.
- `ai_content_suggestions`: editorial assistance: generate meta descriptions, alt text, summaries, and tags directly from the node edit form.
- `ai_interpolator`: automatic field population during content creation. Fill an alt text field by calling a vision model on the uploaded image.
- `ai_automators`: the successor to `ai_interpolator`; same idea, newer API.
What actually works in production
Not everything in the ecosystem is equally ready. Here’s what we’ve actually shipped to clients, and what we’d still approach carefully.
Works well ✅
Content suggestion workflows. AI-generated meta descriptions, alt text, summaries, and image descriptions are the low-risk, high-value features. They save editors time, failures are obvious (the editor can just reject a bad suggestion), and the provider abstraction means you can swap OpenAI for Claude or Gemini without touching the integration code.
Embedding-based search. ai_search plugged into Search API gives you semantic search across your content. The indexing step is slow, and you need a vector store (we usually use Postgres with pgvector), but the search quality jump over Solr keyword matching is dramatic.
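The mechanics behind that quality jump are simple: instead of matching keywords, semantic search embeds the query and ranks documents by vector similarity. A minimal sketch of the ranking step, in plain Python with toy vectors standing in for real model embeddings (no Drupal or `ai_search` APIs are used here):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank(query_vec, docs):
    """Return (doc_id, score) pairs, best match first.

    docs maps doc_id -> embedding vector. In a real deployment the
    vectors live in a store such as Postgres with pgvector, and the
    query is embedded by the configured provider at search time.
    """
    scored = [(doc_id, cosine_similarity(query_vec, vec))
              for doc_id, vec in docs.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Toy 3-dimensional vectors; real model embeddings have hundreds of dimensions.
docs = {
    "pricing-page": [0.9, 0.1, 0.0],
    "contact-page": [0.1, 0.8, 0.3],
}
print(rank([0.85, 0.15, 0.05], docs)[0][0])  # best match: "pricing-page"
```

The indexing cost mentioned above is the one-time work of computing an embedding vector for every document; queries only embed the query string and run this comparison.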
Translation. Multi-provider translation via ai_tmgmt works well for first-pass translations that a human reviewer then polishes. Don’t ship AI-translated copy without review.
Mixed results ⚠️
AI Agents. The agent framework is ambitious and the pieces are there, but the developer experience is still rough. We’ve built production agents, but they required a lot of custom tool definitions and prompt engineering. Approach agents as “build a custom thing on top of the framework,” not “install a module and you’re done.”
Chatbots via ai_assistant_api. The backend plumbing works, but the frontend UX is up to you. If you want a chatbot that actually grounds in your site content and captures leads to a CRM, plan on writing the frontend from scratch — the module gives you the API, not the chat UI.
Not ready for production ❌
Layout Builder + AI canvas. The idea is compelling (describe a page, AI builds a Layout Builder layout), but we’ve hit consistency issues. Use for demos, not real editorial workflows. Reassess in 6-12 months.
Code generation modules. Several modules promise to generate Drupal code from natural language. They’re neat toys but we don’t ship them to production.
How we actually use the AI Module
Most of our production deployments follow this pattern:
1. **Start with the provider abstraction.** Always route AI calls through `ai_provider_*` modules; never hit OpenAI's SDK directly from your custom code. This gives you swappability and one place to handle rate limits, logging, and fallbacks.
2. **Put observability in first.** We log every AI request with token count, latency, model, and outcome. You will need this data when debugging quality issues and when optimizing costs. Add it before you build the feature, not after.
3. **Use `ai_content_suggestions` for editorial AI.** Custom node form integrations almost always reinvent this module badly. Use it, extend it, don't replace it.
4. **Build agents from scratch, but use `ai_agents` as the harness.** The agent framework gives you a sensible structure for tool definitions and conversation state. Everything else (system prompt, tool set, safety guardrails) you're going to build custom anyway.
5. **Don't skip evals.** Set up at least a handful of test inputs with known-good outputs, and run them against every provider/model swap. A provider change that looks identical on a spot-check can degrade subtle outputs in ways you won't notice for weeks.
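The observability and evals points can be sketched together. Below is a minimal eval loop in plain Python; `call_model` is a hypothetical stub standing in for whatever provider plugin you route through, and the point is the shape of the record you log (model, latency, pass/fail) for every case on every provider swap:

```python
import time

def call_model(model, prompt):
    """Hypothetical provider call. In Drupal this would go through an
    ai_provider_* plugin; stubbed with canned answers for illustration."""
    canned = {
        "Summarize: Drupal is a CMS.": "Drupal is a CMS.",
        "Alt text for: photo of a red bicycle": "A red bicycle.",
    }
    return canned.get(prompt, "")

# Known-good eval cases: an input plus a check the output must pass.
EVAL_CASES = [
    ("Summarize: Drupal is a CMS.", lambda out: "CMS" in out),
    ("Alt text for: photo of a red bicycle", lambda out: "bicycle" in out.lower()),
]

def run_evals(model):
    results = []
    for prompt, check in EVAL_CASES:
        start = time.monotonic()
        output = call_model(model, prompt)
        latency_ms = (time.monotonic() - start) * 1000
        # In production, ship each record to your logging pipeline,
        # alongside token counts from the provider response.
        results.append({"model": model, "prompt": prompt,
                        "latency_ms": round(latency_ms, 1),
                        "passed": check(output)})
    return results

results = run_evals("example-model")
print(f"{sum(r['passed'] for r in results)}/{len(results)} evals passed")
```

Even a harness this small catches the "provider swap silently degraded our alt text" failure mode weeks earlier than editor complaints will.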
Costs to plan for
The module itself is free. The LLM calls are not.
For editorial features (alt text, summaries, descriptions) on a site producing 50-100 pieces of content a week, expect $5-50/month in API costs depending on model. For AI search with embeddings, expect an upfront indexing cost (one-time, a few dollars for most sites) and cheap ongoing queries. For a production chatbot with non-trivial traffic, plan $50-500/month minimum.
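As a sanity check on those numbers, here is the back-of-envelope arithmetic. The per-token prices and token counts are illustrative assumptions, not any provider's actual rates:

```python
# Rough monthly API cost for editorial AI features.
# Both rates below are illustrative assumptions, not real provider pricing.
PRICE_PER_1K_INPUT_TOKENS = 0.003   # dollars, assumed
PRICE_PER_1K_OUTPUT_TOKENS = 0.015  # dollars, assumed

def monthly_cost(items_per_week, input_tokens, output_tokens, calls_per_item=3):
    """Cost of running a few AI assists (e.g. alt text, summary,
    meta description) per content item, per month."""
    calls = items_per_week * 4.33 * calls_per_item  # ~4.33 weeks per month
    cost_per_call = (input_tokens / 1000 * PRICE_PER_1K_INPUT_TOKENS
                     + output_tokens / 1000 * PRICE_PER_1K_OUTPUT_TOKENS)
    return calls * cost_per_call

# 100 items/week, ~1500 input and ~200 output tokens per call:
print(f"${monthly_cost(100, 1500, 200):.2f}/month")
```

Under these assumptions the result lands around $10/month, comfortably inside the $5-50 range above; a pricier model or longer prompts moves you toward the top of it.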
The real cost is engineering time, not API calls. We’ve seen teams under-budget integration effort by 3-5x because “it’s just an API call.” It isn’t — production AI means observability, evals, fallbacks, prompt versioning, safety filters, and ongoing tuning.
Should you add the AI Module to your Drupal site?
Yes, if:
- You have specific editorial workflows that will benefit from AI assistance (translation, alt text, summarization, classification).
- You can assign one engineer to own the AI integration — not just set it up, but keep it tuned.
- You’re willing to budget for observability and evals up front.
No, if:
- You want to “add AI to Drupal” as a strategic initiative without a specific use case. You’ll end up with five half-working features that nobody uses.
- You don’t have engineering capacity to maintain it. AI features drift — providers update their models, prompt behavior changes, costs move. Without someone owning the integration, it’ll rot.
- You’re hoping it’ll solve a content quality problem. AI makes good editorial teams faster; it doesn’t turn bad editorial teams into good ones.
TL;DR
The Drupal AI Module ecosystem is real, mature enough for production in specific places (content suggestions, search, translation), and ambitious in places that are still rough (agents, chatbots, Layout Builder AI). The right approach is to pick one or two concrete editorial problems, use the existing modules as a framework, and invest in observability from day one.
We build production AI features on Drupal. If you’re planning an AI integration and want an honest read on what’s ready and what isn’t — tell us about your project.