In the era of AI, chatbots have revolutionized how we interact with technology. Perhaps one of the most impactful uses is in the healthcare industry. Chatbots can deliver fast, accurate information and help people manage their health more effectively. In this article, we will learn how to develop a medical chatbot using Gemini 2.0, Flask, HTML, and Bootstrap. The project is about creating a personalized, user-friendly platform that answers health-related queries with accuracy and speed.
Learning Objectives
- Understand the key components and architecture of a medical chatbot built with Gemini 2.0, and how it enhances healthcare interactions.
- Learn how to integrate Gemini 2.0 into a medical chatbot to provide accurate, context-aware responses tailored to patient needs.
- Explore the use of HTML and Bootstrap to design a responsive, user-friendly chatbot interface.
- Gain hands-on experience in setting up and deploying a personalized medical chatbot.
- Discover the role of FAISS in improving search efficiency for the chatbot.
What is Gemini 2.0?
Announced in December 2024, Gemini 2.0 is the latest iteration of Google's large language model (LLM) series, developed by Google DeepMind. It introduces several key enhancements, including multimodal output, native tool use, and agentic abilities, positioning it as a versatile AI model for diverse applications.
Building on its predecessor, Gemini 1.5, Gemini 2.0 extends the ability to process and generate text, images, video, and audio. It adds native image generation and multilingual text-to-speech output for more natural, interactive user experiences.
One of the most notable features of Gemini 2.0 is its agentic AI, which allows the system to plan and execute tasks independently. Experimental projects like Project Astra demonstrate this capability by integrating with Google services such as Search and Maps to provide real-time, contextual assistance. Another example is Project Mariner, a Chrome extension that navigates the web autonomously to perform tasks such as online shopping.
Key Features of Gemini 2.0
- Multimodal Output: Gemini 2.0 processes and generates multiple data types, including text, images, audio, and video, enabling more natural and context-rich interactions.
- Native Tool Use: The model integrates seamlessly with various tools and platforms, enhancing its utility across different applications.
- Agentic Abilities: Gemini 2.0 introduces AI agents capable of executing complex tasks with minimal human intervention, marking a step toward more autonomous AI systems.
Versions of Gemini 2.0
Gemini 2.0 is available in several versions, each tailored for specific use cases:
- Gemini 2.0 Flash Experimental: An experimental model focused on speed and efficiency, suitable for rapid task execution.
- Gemini 2.0 Pro: Designed for a wide range of tasks, offering a balance between performance and cost.
- Gemini 2.0 Ultra: Optimized for highly complex tasks, providing superior performance for demanding applications.
Flask
- Flask is a lightweight web framework in Python, ideal for building scalable and efficient web applications.
- In the chatbot, Flask handles backend operations, including API integration with Gemini 2.0, routing, and managing user interactions.
- Its simplicity and flexibility make it a good fit for rapid development and integration tasks.
HTML and Bootstrap
- HTML forms the structural foundation of the chatbot's interface, ensuring semantic and accessible web design.
- Bootstrap, a CSS framework, enhances the interface by providing responsive and visually appealing design components. It ensures the chatbot works seamlessly across devices, from desktops to smartphones.
Key Features of the Medical Chatbot
- Conversational Interface: The chatbot engages users in natural, human-like interactions powered by Gemini 2.0.
- User-friendly Design: Bootstrap-enabled responsive design ensures ease of use on any device.
- Health Query Support: Capable of addressing a variety of medical questions, from symptoms to general health advice.
- Accessibility: Designed to be easy to navigate for users of all technical proficiencies.
Facebook AI Similarity Search (FAISS)
Meta (formerly Facebook) developed FAISS as an open-source library for efficient similarity search and clustering of dense vectors. FAISS is widely used in machine learning, especially for tasks involving large-scale vector search and nearest neighbor retrieval. It is optimized for high-dimensional data, making it ideal for applications such as recommendation systems, natural language processing, and image retrieval.
In a nutshell, FAISS indexes dense vectors and supports fast approximate or exact search over them. It uses techniques such as product quantization, HNSW (Hierarchical Navigable Small World graphs), and IVF (Inverted File Index) to balance the trade-off between speed and accuracy. These techniques dramatically reduce computational complexity and memory usage while keeping search precision high. FAISS also supports both CPU and GPU acceleration, making it suitable for datasets with millions or even billions of vectors.
One of FAISS's key strengths is its versatility. It offers multiple indexing strategies, enabling users to choose the most appropriate approach for their specific use case. For example, flat indexes offer exact search, while quantization-based indexes prioritize efficiency. Its Python and C++ APIs make it accessible to a wide range of developers, and its modular design allows for easy integration into existing machine learning pipelines.
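To make the flat-versus-approximate trade-off concrete, here is a minimal, self-contained sketch using the FAISS Python API (faiss-cpu). The dimensionality, random data, and cluster counts are arbitrary illustration values and are not part of the chatbot itself:
import numpy as np
import faiss  # pip install faiss-cpu

d = 128                                                # vector dimensionality (illustrative)
xb = np.random.random((10_000, d)).astype("float32")   # database vectors
xq = np.random.random((5, d)).astype("float32")        # query vectors

# Exact search with a flat (brute-force) index
flat_index = faiss.IndexFlatL2(d)
flat_index.add(xb)
D_exact, I_exact = flat_index.search(xq, 5)

# Approximate search with an IVF index: faster on large datasets, slightly less accurate
quantizer = faiss.IndexFlatL2(d)
ivf_index = faiss.IndexIVFFlat(quantizer, d, 100)      # 100 coarse clusters
ivf_index.train(xb)                                    # IVF indexes must be trained before adding vectors
ivf_index.add(xb)
ivf_index.nprobe = 10                                  # clusters visited per query (speed/recall knob)
D_approx, I_approx = ivf_index.search(xq, 5)

print(I_exact[0], I_approx[0])                         # nearest-neighbour ids from each index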
Learn more about vector databases here.
Flowchart Description: Medical Chatbot Workflow
Below is the flow diagram:

- Start: The user begins on the Home Page (index.html) of the medical chatbot.
- Upload PDFs to Build the Knowledge Base:
- Users upload PDF files through the File Upload functionality.
- The backend processes the PDFs and builds the Knowledge Base, storing it in a Vector Store (vector_store.pkl) for future queries.
- If invalid files are uploaded, the system handles the error and prompts the user for valid files.
- Ask Medical Questions:
- Users submit medical questions through the Ask Medical Questions (/ask) feature.
- If a Knowledge Base exists, the system retrieves relevant documents.
- If no Knowledge Base exists, an error message is displayed or a redirect is initiated.
- Generate Response:
- Relevant documents are retrieved and passed to the Gemini model to generate a response.
- The model processes the input and provides an accurate medical response.
- Display or Redirect:
- The generated response is displayed to the user, or the user is redirected to an appropriate page for further interaction.
- End: The user receives the response and may choose to interact further or end the session.
This workflow ensures smooth user interaction, efficient error handling, and accurate response generation using the Gemini model, for a seamless medical chatbot experience.
Setting Up the Environment
Begin by installing the required dependencies, configuring the API key, and setting up the frontend to prepare your environment for the medical chatbot.
Install the requirements.txt
pip install -r https://raw.githubusercontent.com/Gouravlohar/Medical-Chatbot/refs/heads/master/requirements.txt
API Key
Get your Gemini 2.0 API key from here.

HTML Front-End Code for the Medical Chatbot
This HTML code forms the front-end user interface of the medical chatbot application. It creates an interactive web page where users can:
- Upload PDF files to provide additional context for the chatbot.
- Send chat messages to interact with the AI-based medical chatbot.
The interface uses Bootstrap for styling and jQuery for handling user interactions dynamically. It includes features such as a typing indicator for the chatbot and seamless message display. The code integrates with the Flask back-end to process user inputs and return AI-generated responses. A minimal version of the template (templates/index.html) is shown below.
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Medical Chatbot</title>
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css" rel="stylesheet">
</head>
<body class="bg-light">
<div class="container py-4">
    <h1 class="text-center">Welcome to Medical Chatbot</h1>
    <p class="text-muted text-center">Note: This is an AI chatbot and may make mistakes. Please verify the information provided.</p>
    {% with messages = get_flashed_messages() %}
        {% if messages %}
            <div class="alert alert-info">{{ messages[0] }}</div>
        {% endif %}
    {% endwith %}
    <!-- PDF upload form: builds the knowledge base on the backend -->
    <form action="/upload" method="POST" enctype="multipart/form-data" class="mb-3">
        <input type="file" name="pdf_files" class="form-control mb-2" multiple>
        <button type="submit" class="btn btn-primary">Upload PDFs</button>
    </form>
    <div id="messages-container" class="border rounded p-3 mb-3 bg-white" style="height: 300px; overflow-y: auto;"></div>
    <form id="chat-form" class="d-flex">
        <input type="text" id="chat-input" class="form-control me-2" placeholder="Ask a medical question...">
        <button type="submit" class="btn btn-success">Send</button>
    </form>
</div>
<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
<script>
    $(function () {
        $("#chat-form").on("submit", function (e) {
            e.preventDefault();
            let question = $("#chat-input").val().trim();
            if (!question) return;
            // Show the user's message and clear the input box
            $("#messages-container").append(`<div class="text-end mb-2"><span class="badge bg-primary">${question}</span></div>`);
            $("#chat-input").val("");
            // Temporary typing indicator while the backend responds
            let typingIndicator = `<div id="typing-indicator" class="text-muted mb-2">Bot is typing...</div>`;
            $("#messages-container").append(typingIndicator);
            // Send the question to the Flask /ask route
            $.ajax({
                type: "POST",
                url: "/ask",
                data: { 'prompt': question },
                success: function (data) {
                    $("#typing-indicator").remove();
                    // Preserve line breaks in the model's answer
                    let botMessage = `<div class="mb-2"><span class="badge bg-secondary text-wrap text-start">${data.replace(/\n/g, "<br>")}</span></div>`;
                    $("#messages-container").append(botMessage);
                }
            });
        });
    });
</script>
</body>
</html>
UI Output

Building a Document-Powered AI Question Answering System
Imagine being able to upload a few PDFs and instantly ask questions about their content, receiving precise, AI-generated answers in seconds. That is the promise of a document-powered AI question-answering system. By combining the power of AI models like Gemini, document embedding techniques, and a Flask-based web interface, you can create an intelligent tool capable of understanding, processing, and responding to user queries based on uploaded documents. Below, we walk through the steps to build such a system, from setting up the environment to implementing advanced features like similarity search and real-time responses.
Step 1: Imports and Setup
Begin by importing the necessary libraries and modules, such as Flask for the web application, Google Generative AI for model integration, and LangChain for document handling and vector store management.
from flask import Flask, render_template, request, redirect, url_for, flash
import google.generativeai as genai
from langchain_core.prompts import PromptTemplate
from langchain.chains import LLMChain
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.vectorstores import FAISS
from langchain_huggingface import HuggingFaceEmbeddings
import os
import logging
import pickle
Step 2: Flask App Initialization and Configuration
Set up the Flask app, configure key settings such as the upload folder for PDFs, and define a secret key for session management.
app = Flask(__name__)
app.config['UPLOAD_FOLDER'] = 'uploads'
app.secret_key = 'supersecretkey'
os.makedirs(app.config['UPLOAD_FOLDER'], exist_ok=True)
Step 3: Logging Setup
Configure logging to capture important information and errors, ensuring smooth debugging and monitoring during the app's operation.
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
Step 4: Gemini Model
Initialize the Gemini AI model and configure it with your API key to enable interaction with the model for generating content.
model = genai.GenerativeModel('gemini-2.0-flash-exp')
my_api_key_gemini = os.getenv('GOOGLE_API_KEY')
genai.configure(api_key=my_api_key_gemini)
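Before wiring the model into Flask, a quick standalone sanity check (a sketch; the prompt text is arbitrary) confirms that the GOOGLE_API_KEY environment variable is set and the model responds:
# Standalone sanity check for the Gemini setup; run once in a REPL or separate script.
import os
import google.generativeai as genai

genai.configure(api_key=os.getenv('GOOGLE_API_KEY'))
model = genai.GenerativeModel('gemini-2.0-flash-exp')
print(model.generate_content("Reply with OK if you can read this.").text)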
Step 5: Vector Store Setup
Set up a vector store to hold document embeddings, loading it if one already exists, so that efficient document similarity searches can be performed later.
vector_store = None

# Load the existing vector store if available
if os.path.exists('vector_store.pkl'):
    with open('vector_store.pkl', 'rb') as f:
        vector_store = pickle.load(f)
- vector_store = None: Initializes a variable to hold the vector store.
- if os.path.exists('vector_store.pkl'): Checks whether a previously saved vector store file (vector_store.pkl) exists.
If it exists, the system loads the vector store from the file using pickle.load(f).
This stores document embeddings (numerical representations of the documents) for efficient similarity search. An alternative persistence approach is sketched after this list.
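Pickling the whole store works, but LangChain's FAISS wrapper also ships its own persistence helpers. A minimal sketch of that alternative is shown below; the directory name faiss_index is arbitrary, and recent langchain_community versions require explicitly opting in to deserialization:
# Alternative to pickle: persist the FAISS vector store with its built-in helpers.
import os
from langchain_community.vectorstores import FAISS
from langchain_huggingface import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings()

# After building the store elsewhere:
# vector_store.save_local("faiss_index")

# On startup, reload it if the directory exists:
vector_store = None
if os.path.exists("faiss_index"):
    vector_store = FAISS.load_local(
        "faiss_index", embeddings, allow_dangerous_deserialization=True
    )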
Step 6: Error Handling (404)
Define a handler for 404 errors to redirect users to the home page when they access a non-existent route.
@app.errorhandler(404)
def page_not_found(e):
    return redirect(url_for('index'))
If a user accesses a non-existent page (i.e., a 404 error occurs), the system redirects them to the home page (index).
Step 7: Main Page Route (/)
Create the route for the main page of the web application, rendering the initial HTML template to the user.
@app.route('/')
def index():
    return render_template('index.html')
Step 8: File Upload Route (/upload)
Implement the file upload route, allowing users to upload PDF files, process them, and convert them into embeddings for the vector store.
@app.route('/upload', methods=['POST'])
def upload():
    global vector_store
    try:
        if 'pdf_files' not in request.files:
            flash("No file part")
            return redirect(url_for('index'))
        files = request.files.getlist('pdf_files')
        documents = []
        for file in files:
            if file.filename == '':
                flash("No selected file")
                return redirect(url_for('index'))
            file_path = os.path.join(app.config['UPLOAD_FOLDER'], file.filename)
            file.save(file_path)
            pdf_loader = PyPDFLoader(file_path)
            documents.extend(pdf_loader.load())

        # Create embeddings using HuggingFaceEmbeddings
        embeddings = HuggingFaceEmbeddings()
        if vector_store is None:
            # Create a new vector store if it doesn't exist
            vector_store = FAISS.from_documents(documents, embeddings)
        else:
            # Add new documents to the existing vector store
            vector_store.add_documents(documents)

        # Save the updated vector store
        with open('vector_store.pkl', 'wb') as f:
            pickle.dump(vector_store, f)

        flash("PDFs uploaded and processed successfully. The knowledge base is ready.")
        return redirect(url_for('index'))
    except Exception as e:
        logger.error("An error occurred while processing the PDFs: %s", e)
        flash("An error occurred while processing the PDFs.")
        return redirect(url_for('index'))
- This route allows users to upload PDF files.
- The request.files object is checked for the presence of the pdf_files key.
- Files are saved to the uploads directory.
- PyPDFLoader is used to extract text from each uploaded PDF.
- The extracted text is then converted into embeddings using HuggingFaceEmbeddings.
- The embeddings are stored in the vector_store (created using FAISS).
- The updated vector_store is saved to vector_store.pkl for future use.
- If the upload is successful, a success message is shown. If there is an error, an error message is shown. A quick script to exercise this route is shown after this list.
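For a quick local test of the upload route, you can post a file from a script. This is only a sketch: it assumes the app is running on Flask's default http://127.0.0.1:5000 and that a file named sample.pdf exists; both are placeholders.
# Quick local test of the /upload route using the requests library.
import requests

with open("sample.pdf", "rb") as f:
    resp = requests.post(
        "http://127.0.0.1:5000/upload",
        files=[("pdf_files", ("sample.pdf", f, "application/pdf"))],
    )
print(resp.status_code)  # 200 once the redirect back to the index page is followed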
Step 9: Question Answering Route (/ask)
Implement the question-answering route, where users can enter questions and the app retrieves relevant documents and generates AI-powered responses based on their content.
@app.route('/ask', methods=['POST'])
def ask():
    global vector_store
    if vector_store is None:
        return "Knowledge base is not ready. Please upload PDFs first."
    question = request.form['prompt']

    # Retrieve relevant documents based on the question
    relevant_docs = vector_store.similarity_search(question)
    context = " ".join([doc.page_content for doc in relevant_docs])

    custom_prompt = f"You are the best doctor. Only provide medical-related answers. Context: {context} Question: {question}"
    response = model.generate_content(custom_prompt)

    if response.text:
        return response.text
    else:
        return "Sorry, but I think Gemini didn't want to answer that!"
- This route allows users to ask questions.
- If the vector store is not ready (i.e., no PDFs have been uploaded yet), the app prompts the user to upload PDFs first.
- The user's question is extracted from the form.
- The app performs a similarity search (vector_store.similarity_search) to find relevant documents based on the question.
- The app builds a custom prompt using the context from the retrieved documents, instructing the AI model (Gemini) to provide medical-related answers.
- The model generates a response using model.generate_content(custom_prompt).
- If the model provides a response, the app returns it to the user. Otherwise, the app returns a default message. A quick script to exercise this route is shown after this list.
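Similarly, once the knowledge base has been built, you can hit the /ask route from a script. Again a sketch, assuming the default local address; the question is only an example.
# Quick local test of the /ask route after PDFs have been uploaded.
import requests

resp = requests.post(
    "http://127.0.0.1:5000/ask",
    data={"prompt": "How many types of headache are there?"},
)
print(resp.text)  # the chatbot's answer, or a prompt to upload PDFs first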
Step 10: Running the Flask Application
Finally, run the Flask app in debug mode to start the web application and make it accessible for users to interact with.
if __name__ == '__main__':
    app.run(debug=True)
Get the code on GitHub here.
Output
PDF I used for testing: link


Prompt
How many types of Headache?

After uploading the PDF, the system provides a response directly from its content.
Conclusion
In this blog, we discussed how to create a Flask-based web application that uses AI tools and techniques to build a knowledge base from uploaded PDFs. By integrating generative models like Google Gemini and vector search mechanisms with LangChain, the application lets users ask medical-related questions and get contextually relevant answers based on the content of the uploaded documents. In such a system, AI combined with modern web development tools can fully automate information retrieval as an intelligent, interactive experience.
By understanding the basic structure of this code, from file uploading to question answering, we see how a simple Flask app can be extended with powerful AI capabilities. Whether you are developing a knowledge management system or simply designing a chatbot, the same technologies can come in handy and get you started.
Key Takeaways
- By using tools like LangChain and FAISS, we can convert unstructured PDF documents into a powerful, searchable knowledge base that responds intelligently to user queries.
- A medical chatbot built with Gemini 2.0 enables personalized, real-time medical assistance by leveraging advanced AI models for accurate responses.
- You can use Google's Gemini model to generate contextually relevant answers by interacting with the vectorized documents, offering a dynamic and interactive user experience.
- Flask serves as the lightweight framework that brings everything together, handling file uploads, processing, and user interactions seamlessly.
- FAISS, a powerful vector search engine, helps find the most relevant documents for user queries, improving the quality and accuracy of responses.
Frequently Asked Questions
Q. How does the chatbot answer questions?
A. The /ask route allows users to submit questions. The app then uses the uploaded PDFs to find relevant information and generates a response using Google's Gemini AI model.
Q. How are the uploaded PDFs processed?
A. The application uses PyPDFLoader to extract text from uploaded PDFs. This text is then embedded into vectors using HuggingFaceEmbeddings and stored in a FAISS vector store for fast similarity searches.
Q. Can the app be adapted to domains other than healthcare?
A. Yes, you can adapt the app to various domains. By changing the prompt, you can customize the question-answering functionality to match different fields, such as legal, educational, or technical.
Q. How does the knowledge base persist across sessions?
A. The vector store is saved as a .pkl file using Python's pickle module. The app checks for the file's existence on startup and loads it if available, ensuring that previously uploaded documents persist across sessions.
Q. What do I need to run this application?
A. You need Python and Flask installed, along with dependencies like google.generativeai, langchain, FAISS, and HuggingFaceEmbeddings. You also need an API key for Google's Gemini model. Make sure to set up a virtual environment to manage the dependencies.