Lesson 4: AI Chatbot with Groq API
Build an AI-powered chatbot using Groq's fast inference API to answer user questions with intelligent, conversational responses.
Lesson Objectives
- Understand what Groq is and why it's fast for AI inference
- Set up a Groq API key and integrate it into Flask
- Create the chatbot blueprint with form handling
- Call the Groq API to generate AI responses
- Build a user interface for asking questions and displaying responses
- Test the chatbot with questions like 'When is Christmas this year?'
AI Chatbot with Groq API - Notes
What is Groq?
Groq is an AI inference platform that provides extremely fast responses using optimized hardware (LPU - Language Processing Unit).
Key Benefits:
- Very fast inference speeds (responses in milliseconds)
- Free tier available for developers
- Access to models like Llama, Mixtral, and more
- Simple API similar to OpenAI
Getting Started with Groq
Step 1: Sign up for a Groq account at console.groq.com
Step 2: Generate an API key from your dashboard
Step 3: Install the Groq Python library:
pip install groq

Chatbot Blueprint Structure
Your chatbot blueprint should have these routes:
from flask import Blueprint, render_template, request

chatbot_bp = Blueprint('chatbot', __name__)

@chatbot_bp.route('/chatbot')
def index():
    # Display chatbot interface
    return render_template('chatbot/index.html')

@chatbot_bp.route('/chatbot/ask', methods=['POST'])
def ask_question():
    question = request.form.get('question')
    response = get_groq_response(question)
    return render_template('chatbot/index.html',
                           question=question,
                           response=response)
Integrating Groq API
Here's how to use Groq to generate AI responses:
from groq import Groq
import os
def get_groq_response(question):
    # Initialize Groq client
    client = Groq(
        api_key=os.environ.get('GROQ_API_KEY')
    )

    # Create chat completion
    chat_completion = client.chat.completions.create(
        messages=[
            {
                "role": "user",
                "content": question,
            }
        ],
        model="llama3-8b-8192",  # or mixtral-8x7b-32768
    )

    return chat_completion.choices[0].message.content

Available Models:
- llama3-8b-8192 - Fast, good for general questions
- llama3-70b-8192 - More powerful, better reasoning
- mixtral-8x7b-32768 - Excellent balance of speed and quality
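If you want to switch between these models without duplicating code, one option is to make the model name a parameter of the helper shown above. This is just a sketch, not part of the lesson's required code; the default value mirrors the example above:

def get_groq_response(question, model="llama3-8b-8192"):
    # Same Groq call as above, but callers can pass a different
    # model id (e.g. "llama3-70b-8192") for harder questions.
    client = Groq(api_key=os.environ.get('GROQ_API_KEY'))
    chat_completion = client.chat.completions.create(
        messages=[{"role": "user", "content": question}],
        model=model,
    )
    return chat_completion.choices[0].message.content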
Chatbot Form Implementation
Create a simple form for users to ask questions:
<form action="/chatbot/ask" method="POST">
    <div class="mb-3">
        <label for="question" class="form-label">Ask a Question</label>
        <textarea class="form-control" name="question"
                  id="question" rows="3"
                  placeholder="e.g., When is Christmas this year?"
                  required></textarea>
    </div>
    <button type="submit" class="btn btn-primary">
        Send Question
    </button>
</form>

{% if response %}
<div class="alert alert-success mt-4">
    <h5>Response:</h5>
    <p>{{ response }}</p>
</div>
{% endif %}

Example Conversation
User asks: "When is Christmas this year?"
Groq responds: "Christmas is on December 25th. So it's on a Monday."
Tips for better responses:
- Be specific with your questions
- Use llama3-70b for complex questions
- Handle API errors gracefully (show user-friendly messages)
- Consider adding a system message for consistent personality - both ideas are sketched below
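Here is a minimal sketch combining the last two tips, building on get_groq_response from earlier. The system prompt wording and the fallback message are assumptions, and the broad except is only to keep the example short:

def get_groq_response(question):
    client = Groq(api_key=os.environ.get('GROQ_API_KEY'))
    try:
        chat_completion = client.chat.completions.create(
            messages=[
                # System message keeps the chatbot's tone consistent
                {"role": "system",
                 "content": "You are a friendly, concise assistant."},
                {"role": "user", "content": question},
            ],
            model="llama3-8b-8192",
        )
        return chat_completion.choices[0].message.content
    except Exception:
        # Catching broadly to keep the sketch simple; the groq SDK
        # raises more specific exceptions you can handle individually.
        return "Sorry, something went wrong. Please try again."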
Environment Variables
Store your Groq API key securely using environment variables:
Create a .env file:
GROQ_API_KEY=gsk_your_api_key_here

Load it in your Flask app:
from dotenv import load_dotenv
import os
load_dotenv()
# Access the key
api_key = os.environ.get('GROQ_API_KEY')

Important: Never commit your .env file to GitHub! Add it to .gitignore.
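For example, a .gitignore for this project might include the following entries (only the .env line is essential here; the others are common Python additions):

.env
__pycache__/
venv/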
Module 6 Complete!
Congratulations! You've now learned how to build all four blueprints:
- Tickers: Stock price tracking with financial API
- Weather: City weather tracking with weather API
- Movies: Movie database with OMDB API and full CRUD
- Chatbot: AI-powered Q&A with Groq API
Next, you'll learn theory concepts and then see the complete demo of all four working together!