Building conversational chatbots with knowledge using CrewAI and Mem0
A few months ago, I wrote a blog post on building conversational chatbots with CrewAI, but because the AI world moves so fast, I believe it’s now due for an update. In the previous version, I built a simple chatbot that used Mem0 for long-term memory. Now that Mem0 support is built into crews, it’s a matter of setting it up. You can view the final chatbot code on GitHub.
Setting Up Our Environment
Prerequisites:
- Basic knowledge of Python
- Python 3.10 or later installed
- pip package manager
- uv package manager
First, let's install CrewAI using the pip package manager. You can find the installation instructions here. Open your terminal and run:
pip install crewai crewai-tools
This will install CrewAI and give us access to the CrewAI CLI tool, which we'll use to create our conversational chatbot. The next step is to create our crewai_knowledge_chatbot project using the CLI tool:
crewai create crew crewai_knowledge_chatbot
During the creation process, you'll select a model provider (like OpenAI or Anthropic) and specify which model to use. The CLI will guide you through these choices.
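Under the hood, those choices end up in the project's .env file. The exact keys depend on the provider you select; with OpenAI, for example, it looks roughly like this (values below are placeholders):
# .env (illustrative; actual keys depend on your provider choice)
MODEL=gpt-4o-mini
OPENAI_API_KEY=sk-your-openai-api-key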
Once your project is created, install the dependencies:
$ cd crewai_knowledge_chatbot
$ crewai install
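After installing, you get the standard CrewAI project layout. The files we'll be touching in this post live here (main.py is the scaffold's default entry point and is where our chat loop will go):
crewai_knowledge_chatbot/
├── .env
├── knowledge/                  # reference documents go here (added later)
├── pyproject.toml
└── src/
    └── crewai_knowledge_chatbot/
        ├── config/
        │   ├── agents.yaml
        │   └── tasks.yaml
        ├── crew.py
        └── main.py             # entry point for the chat loop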
Building our chatbot
Our chatbot architecture consists of 6 key components:
- Task - Defines what the chatbot needs to do
- Agent - The AI entity that handles conversations
- Crew - Orchestrates the agent and task
- Memory - Stores conversation history
- Knowledge - Provides reference information
- Chat loop - Manages the conversation flow
Chat task
First, we define the chat task in YAML. This task description tells our agent how to handle conversations:
# src/crewai_knowledge_chatbot/config/tasks.yaml
chat_task:
  description: >
    Use the conversation history to build your response to the user:
    {history}
    Respond to the user's message: {user_message}
  expected_output: >
    Your output should be a relevant, accurate, and engaging response that
    directly addresses the user's query or continues the conversation logically.
  agent: assistant
Assistant agent
The agent configuration defines the AI assistant's role, goals, and personality:
# src/crewai_knowledge_chatbot/config/agents.yaml
assistant:
  role: >
    Virtual Assistant
  goal: >
    To provide accurate, helpful, and engaging responses to a wide range of user queries,
    enhancing user experience and knowledge across various topics.
  backstory: >
    You are an advanced AI assistant with vast knowledge spanning multiple disciplines,
    designed to engage in diverse conversations and provide helpful information.
    Your primary function is to assist users by answering questions, offering explanations,
    and providing insights, adapting your communication style to suit different users and contexts.
    While you lack personal experiences or emotions, you're programmed to respond with empathy,
    use appropriate humor, and always strive to provide accurate, helpful, and tailored information,
    admitting when a topic is outside your knowledge base.
This configuration shapes how the agent interacts with users, setting expectations for its behavior and capabilities.
Configuring memory
Memory enables our chatbot to maintain context across conversations. We use Mem0 for this purpose. You can find instructions on how to set up Mem0 with CrewAI here. You need to sign up, get a Mem0 API key, and place it in the .env file as MEM0_API_KEY.
# src/crewai_knowledge_chatbot/crew.py
memory_config = {
    "provider": "mem0",
    "config": {"user_id": "User"},
}
This configuration tells CrewAI to use Mem0 as the memory provider and associates memories with a specific user ID. The memory system helps the chatbot remember previous interactions and maintain contextual awareness.
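The Mem0 API key lives in the same .env file the CLI created earlier (placeholder value shown):
# .env
MEM0_API_KEY=your-mem0-api-key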
Setting up knowledge
Knowledge integration allows our chatbot to reference specific information sources. To use knowledge in CrewAI, you need a knowledge folder in the project root. You can find the documentation for setting up knowledge here. For the demo chatbot, we are going to use the Cognitive Architectures for Language Agents (CoALA) paper. Here's how we set up a PDF knowledge source:
from crewai.knowledge.source.pdf_knowledge_source import PDFKnowledgeSource

# Create a PDF knowledge source
pdf_source = PDFKnowledgeSource(
    file_paths=["CoALA.pdf"]
)
This code loads a PDF document as a knowledge source, allowing the chatbot to reference and cite information from it during conversations. The knowledge system helps ground the chatbot's responses in specific source material.
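For this to work, CoALA.pdf needs to sit inside the knowledge folder at the project root (the file_paths entry is resolved relative to that folder). One way to get it there, assuming the paper's arXiv PDF URL below (grab it however you like):
$ mkdir -p knowledge
$ curl -L -o knowledge/CoALA.pdf https://arxiv.org/pdf/2309.02427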
Chat crew
The crew configuration brings together all components:
# src/crewai_knowledge_chatbot/crew.py
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task
from crewai.knowledge.source.pdf_knowledge_source import PDFKnowledgeSource

pdf_source = PDFKnowledgeSource(file_paths=["CoALA.pdf"])

memory_config = {
    "provider": "mem0",
    "config": {"user_id": "User"},
}


@CrewBase
class CrewaiKnowledgeChatbot:
    """CrewaiKnowledgeChatbot crew"""

    agents_config = "config/agents.yaml"
    tasks_config = "config/tasks.yaml"

    @agent
    def assistant(self) -> Agent:
        return Agent(
            config=self.agents_config["assistant"],
            memory=True,
            memory_config=memory_config,
            verbose=False,
        )

    @task
    def chat_task(self) -> Task:
        return Task(
            config=self.tasks_config["chat_task"],
        )

    @crew
    def crew(self) -> Crew:
        return Crew(
            agents=self.agents,
            tasks=self.tasks,
            process=Process.sequential,
            knowledge_sources=[pdf_source],
            verbose=False,
        )
Chat loop
The chat loop manages the ongoing conversation:
from mem0 import MemoryClient

from crewai_knowledge_chatbot.crew import CrewaiKnowledgeChatbot

# Mem0 client used to persist user messages as long-term memories
client = MemoryClient()


def run():
    history = []
    while True:
        user_input = input("User: ")
        if user_input.lower() in ["exit", "quit", "bye"]:
            print("Chatbot: Goodbye! It was nice talking to you.")
            break

        # Pass the running conversation history to the crew as context
        chat_history = "\n".join(history)
        inputs = {
            "user_message": f"{user_input}",
            "history": f"{chat_history}",
        }

        # Kick off the crew to generate a response
        response = CrewaiKnowledgeChatbot().crew().kickoff(inputs=inputs)

        # Track the exchange locally and store the message in Mem0
        history.append(f"User: {user_input}")
        history.append(f"Assistant: {response}")
        client.add(user_input, user_id="User")

        print(f"Assistant: {response}")
The chat loop:
- Maintains a conversation history
- Handles user input and exit commands
- Formats the conversation context for the crew
- Triggers the chatbot's response
- Stores interactions in memory
- Manages the display of responses
Running the chatbot
To start your chatbot, navigate to the project root directory and run:
crewai run
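A session then looks something like this (the assistant's reply here is illustrative):
$ crewai run
User: What is the CoALA paper about?
Assistant: The CoALA paper proposes Cognitive Architectures for Language Agents, a framework that organizes language agents around memory, action spaces, and decision-making...
User: bye
Chatbot: Goodbye! It was nice talking to you.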
Conclusion
CrewAI provides a feature-rich framework for creating AI agent workflows quickly and efficiently. This simple chatbot demonstration showcases several powerful features:
- Project scaffolding through the CrewAI CLI
- Built-in memory support for maintaining conversation context
- Integrated knowledge capabilities for grounding responses in specific sources
- Flexible agent and task configuration
- Simple but powerful conversation management
These features come together to create a chatbot that can maintain context, access specific knowledge, and provide informed responses without requiring complex custom code. You can view the final chatbot code on GitHub.
Visit crewai.com to start your AI automation journey today.
For enterprise inquiries or consulting services, reach out to me on X (formerly Twitter) or LinkedIn so that we can see how CrewAI can be integrated into your workflows.
AI should drive results, not complexity. AgentemAI helps businesses build scalable, efficient, and secure AI solutions. See how we can help.