Building a travel chatbot with AutoGen and Groq
Generative AI has taken the world by storm, and the possibilities seem endless. A lot of research and experimentation is under way, and one area that has received particular attention is AI agents. This blog post will introduce you to the world of AI agents using Python, AutoGen, and Groq.
AutoGen, developed by Microsoft, provides us with a flexible framework for building collaborative AI agents. Meanwhile, Groq offers state-of-the-art language models to leverage for natural language processing tasks. By combining these tools, we'll create an AI chatbot that's greater than the sum of its parts. Before building our chatbot, let's clarify a fundamental concept.
If you want to skip straight to the code, you can view it on GitHub.
What is an AI Agent?
First, we need to answer the question: what is an AI agent? An AI agent is a computer program that can make decisions and act in an environment to achieve specific goals. This means agents take in inputs from their environment, process those inputs using an LLM, take actions in pursuit of their goals, and adapt their behavior based on past experience.
AI agents vary in complexity. Their use cases range from simple chatbots to autonomous vehicles. The key feature of an AI agent is its ability to operate autonomously to some degree, making decisions and taking actions without constant human intervention.
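Stripped of any framework, that loop is simple to picture. Here is a deliberately tiny sketch of the perceive-decide-act cycle; observe, llm_decide, and act are hypothetical stand-ins, not part of AutoGen or Groq:

# A framework-free sketch of the agent loop.
# observe, llm_decide, and act are illustrative stand-ins only.

def observe(step: int) -> str:
    return f"observation {step}"              # perception: read something from the environment

def llm_decide(observation: str, history: list) -> str:
    # A real agent would call an LLM here; we return a canned decision instead.
    return f"respond to {observation}"

def act(decision: str) -> str:
    print(f"acting: {decision}")              # action: affect the environment
    return "ok"

history = []                                  # memory the agent can adapt from
for step in range(3):                         # a real agent loops until its goal is met
    observation = observe(step)
    decision = llm_decide(observation, history)
    result = act(decision)
    history.append((observation, decision, result))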
In this post, we'll use AutoGen to create the structure and decision-making process of our agent, while Groq will provide the language model that powers its understanding and generation capabilities.
For our practical example, we are going to be creating a travel chatbot. Users will be able to ask the assistant questions about any cities they might be planning to visit, which activities they can do there, and the best time to visit those cities. This example will demonstrate the fundamental concepts of AI agents and how they are connected to LLMs.
Setting up the environment
Before we start building our AI agent, let's set up our development environment.
Python Installation
If you don't have Python installed, visit the official Python website and follow the installation instructions for your operating system.
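You can check which version you have from a terminal (AutoGen needs a reasonably recent Python 3 release):

python --version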
Installing AutoGen
We'll use PyAutoGen, the Python package for AutoGen, along with the Groq SDK and python-dotenv (which we'll use later to load our API key from a .env file):

pip install pyautogen groq python-dotenv
Setting up Groq
Groq will provide our language model. Follow these steps:
- Go to https://console.groq.com/ and create an account
- Navigate to the API keys section
- Create a new API key for the project
With these steps completed, your environment should be ready for building our AI travel assistant!
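Optionally, you can sanity-check the key before wiring it into AutoGen. The sketch below uses the Groq Python SDK directly and assumes GROQ_API_KEY is set in your environment (we'll put it in a .env file in the next section):

import os
from groq import Groq

# Ask the model for a one-line reply to confirm the API key works.
client = Groq(api_key=os.getenv("GROQ_API_KEY"))
response = client.chat.completions.create(
    model="llama-3.1-8b-instant",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(response.choices[0].message.content)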
Building the agent
Let's create our AI travel assistant step by step. Create a project directory named travel-agent. Inside this directory, create a .env file to store our API key and configuration:

GROQ_API_KEY=gsk_secret_key_xxxxxxxxxxxxx
AUTOGEN_USE_DOCKER=False

The AUTOGEN_USE_DOCKER=False setting disables the use of Docker for code execution.
Create a file named travel-assistant.py in your project directory. We'll build our agent in this file.
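At this point your project layout should look like this:

travel-agent/
├── .env
└── travel-assistant.py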
Import Required Libraries
import os
from dotenv import load_dotenv
from autogen import ConversableAgent, UserProxyAgent
load_dotenv()
The load_dotenv() function loads environment variables from our .env file, which we can then access through the os package.
Configure the Language Model
We'll create an llm_config dictionary to connect our agents to Groq:
llm_config = {
    "config_list": [
        {
            "model": "llama-3.1-8b-instant",
            "api_key": os.getenv("GROQ_API_KEY"),
            "api_type": "groq",
        }
    ]
}
Groq provides various LLM models. We're using llama-3.1-8b-instant from Meta, but you can choose a different one; the full list is available at https://console.groq.com/docs/models
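If you prefer to check the catalogue from code, the Groq SDK also exposes a model-listing call. A quick sketch, assuming the SDK's models.list() endpoint and a GROQ_API_KEY in your environment:

import os
from groq import Groq

# Print the id of every model your Groq account can use.
client = Groq(api_key=os.getenv("GROQ_API_KEY"))
for model in client.models.list().data:
    print(model.id)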
Create the Agents
We'll create two agents:
- A UserProxyAgent that acts on behalf of the user.
- A ConversableAgent that serves as our travel assistant.
user_proxy_agent = UserProxyAgent(name="User Agent", system_message="User proxy agent")
assistant_agent = ConversableAgent(
    name="Assistant",
    system_message="You are a helpful travel assistant. Provide detailed and accurate information about travel destinations, activities, and best times to visit. Offer personalized recommendations based on the user's preferences.",
    llm_config=llm_config,
)
The UserProxyAgent initiates the chat and executes commands. By default, the UserProxyAgent has human_input_mode="ALWAYS", which allows the user to ask follow-up questions.
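If you want different behaviour, you can pass those options explicitly. For example, here is a sketch of a user proxy that never pauses for extra human input and has local code execution switched off (same constructor, different arguments):

user_proxy_agent = UserProxyAgent(
    name="User Agent",
    system_message="User proxy agent",
    human_input_mode="NEVER",        # don't pause to ask the human for input mid-conversation
    code_execution_config=False,     # disable local code execution entirely
)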
def main():
    # Keep prompting for questions until the user chooses to leave.
    while True:
        user_input = input("User: ")
        if user_input.lower() in ["exit", "quit", "bye"]:
            print("Assistant: Goodbye! Have a great day!!")
            break
        # Hand the question to the assistant via the user proxy.
        user_proxy_agent.initiate_chat(assistant_agent, message=user_input)

if __name__ == "__main__":
    main()
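One thing to be aware of: with the default settings the two agents keep the conversation going until it terminates on its own or you type exit when prompted. If you'd rather have each question answered in a single exchange, initiate_chat also accepts a max_turns argument in recent AutoGen releases (treat this as an assumption and check the docs for your version):

user_proxy_agent.initiate_chat(assistant_agent, message=user_input, max_turns=2)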
Running the Agent
To start the agent, navigate to your project directory in the terminal and run:
python travel-assistant.py
Conclusion
AutoGen and Groq are powerful tools that let you build almost any LLM-powered application you can imagine. As a next step, I suggest experimenting with the different models offered by Groq and comparing their output.
In this post, we've just scratched the surface of what's possible. Our travel assistant chatbot demonstrates the fundamental concepts, but the potential applications are vast and varied. From personal productivity assistants to complex decision-making systems in business environments, the possibilities are truly exciting.
Remember, the code is available on GitHub. I hope this tutorial has inspired you to dive deeper into the world of AI applications. If you have any questions, you can reach out to me on X (formerly Twitter) or LinkedIn. Happy coding!