If you’re looking at chatbots for your website, you’ve probably noticed there are two very different species. Traditional chatbots — the ones with button menus and decision trees — have been around for years. Docs-based chatbots, powered by AI and your actual content, are the newer approach.
They solve the same problem (answering customer questions) in fundamentally different ways. Here’s how they compare.
Traditional Chatbots: The Decision Tree Approach
Traditional chatbots work like a phone menu. The customer picks an option, the bot follows a scripted path, and hopefully lands on the right answer. Tools like Intercom, Drift, and Zendesk have offered this for years.
How they work:
- You define a set of intents (“billing question”, “shipping status”)
- You write specific responses for each intent
- The bot matches the customer’s message to an intent and serves the scripted reply
- If no match, it falls back to “I didn’t understand, here are some options”
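The matching loop above can be sketched in a few lines. This is a toy illustration, not any vendor's actual engine: the intents, keywords, and replies are made up, and real products use trained intent classifiers rather than keyword sets.

```python
import re

# Hypothetical intents and canned replies (illustrative only).
INTENTS = {
    "billing question": ({"invoice", "billing", "charge", "refund"},
                         "You can view invoices under Settings > Billing."),
    "shipping status": ({"shipping", "order", "delivery", "track"},
                        "Track your order from the Orders page."),
}
FALLBACK = "I didn't understand. Try: billing question, shipping status."

def respond(message: str) -> str:
    words = set(re.findall(r"[a-z]+", message.lower()))
    for keywords, reply in INTENTS.values():
        if words & keywords:          # message matched a predicted intent
            return reply
    return FALLBACK                   # unpredicted phrasing dead-ends here
```

`respond("Where is my order?")` hits the shipping intent, but `respond("my package has not arrived")` falls through to the fallback — exactly the brittleness described below.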
The good:
- Predictable — you know exactly what it will say
- Good for transactional tasks (track my order, reset password)
- No AI hallucination risk
The bad:
- Painful to set up — you have to anticipate every possible question
- Brittle — any phrasing you didn't predict hits a dead end
- Maintenance nightmare — every product change means updating flows
- Customers dislike them — most users report frustration with rigid, menu-driven bots
Docs Chatbots: The AI-Grounded Approach
Docs chatbots work differently. Instead of predefined scripts, they read your actual documentation and answer questions in natural language, grounded in your content.
How they work:
- You upload your docs (help articles, FAQs, product guides, markdown files)
- The system chunks and indexes them using embeddings
- When a customer asks a question, the most relevant chunks are retrieved
- An LLM generates a natural answer using only those chunks as context
- If the answer isn’t in your docs, it says so
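The retrieve-then-answer loop above can be sketched like this. It is a minimal sketch under stated assumptions: the bag-of-words `embed` is a toy stand-in for a real embedding model, the docs and similarity threshold are invented, and the final step only marks where the LLM call would go.

```python
import math
import re
from collections import Counter

# Toy docs corpus (illustrative only).
DOCS = [
    "To reset your password, open Settings and choose Reset Password.",
    "We ship worldwide; delivery takes 3-5 business days.",
    "Pro plans are billed monthly and can be cancelled any time.",
]

def embed(text: str) -> Counter:
    # Stand-in for a real embedding model: bag-of-words counts.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

INDEX = [(chunk, embed(chunk)) for chunk in DOCS]  # the chunk-and-index step

def answer(question: str) -> str:
    q = embed(question)
    chunk, score = max(((c, cosine(q, e)) for c, e in INDEX),
                       key=lambda pair: pair[1])
    if score < 0.1:                   # nothing relevant was retrieved
        return "I couldn't find that in the docs."
    # A real system would pass the retrieved chunks to an LLM as context;
    # here we just return the best-matching chunk.
    return f"Based on the docs: {chunk}"
```

Note the refusal branch: when no chunk scores above the threshold, the bot says it doesn't know instead of guessing — that's the grounding.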
The good:
- Handles any question about your product, not just ones you predicted
- Natural conversation — understands paraphrasing, follow-ups, context
- Low maintenance — update your docs and the bot picks up the changes automatically
- Sets up in minutes, not weeks
The bad:
- Can’t take actions (yet) — can’t process refunds or change accounts
- Quality depends on your documentation quality
- Slightly less predictable than scripted responses
Head-to-Head Comparison
| | Traditional | Docs Chatbot |
|---|---|---|
| Setup time | Weeks to months | Minutes |
| Maintenance | Constant updates needed | Update docs only |
| Question coverage | Only predicted questions | Anything in your docs |
| Natural language | Basic intent matching | Fully conversational |
| Actions | Yes (with integrations) | Limited (improving) |
| Cost | £50-500/month | £0-29/month |
| Accuracy | High (but limited scope) | High (within docs scope) |
When to Choose Which
Choose a traditional chatbot if:
- You need the bot to take actions (process returns, modify accounts)
- Your support is mostly transactional with few question types
- You have a team to build and maintain conversation flows
Choose a docs chatbot if:
- Most support questions are “how do I...” or “what is...”
- You already have documentation (help centre, FAQ, guides)
- You want something working today, not next quarter
- You don’t have a dedicated support engineering team
The Best of Both Worlds
In practice, many teams use both. A docs chatbot handles the knowledge questions (which make up the majority of support volume), while a traditional bot or human agent handles the transactional stuff.
With DeskPilot, you can deploy a docs chatbot in minutes — upload your help articles, get a shareable chat link, and start deflecting the knowledge questions immediately. Then layer on your existing tools for actions and escalation.
It’s not either/or. But if you can only pick one starting point, a docs chatbot gives you the most coverage for the least effort.