HeadlinesBriefing.com

Building a Local RAG AI Agent for Airline Reviews

DEV Community

A developer built a fully local AI agent using Retrieval-Augmented Generation to answer questions about airline reviews without internet access. The project used Ollama as the LLM runtime, llama3.2 for answering, and mxbai-embed-large for embeddings, all running offline on standard hardware.
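The article does not include the project's code, but talking to a local Ollama instance is a small amount of plumbing. The sketch below is a minimal stdlib-only illustration, assuming Ollama's default endpoint at http://localhost:11434 and its /api/embeddings and /api/generate routes; the helper names are hypothetical, not from the original project.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def embed_payload(text: str, model: str = "mxbai-embed-large") -> dict:
    """Request body for Ollama's /api/embeddings endpoint."""
    return {"model": model, "prompt": text}

def generate_payload(prompt: str, model: str = "llama3.2") -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def call_ollama(path: str, payload: dict) -> dict:
    """POST a JSON payload to the local Ollama server and decode the reply."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# With a running Ollama instance, the two calls would look like:
# vec = call_ollama("/api/embeddings", embed_payload("Great legroom"))["embedding"]
# ans = call_ollama("/api/generate", generate_payload("Summarise reviews"))["response"]
```

Because everything stays on localhost, no review text or query ever leaves the machine.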

The system processes a Kaggle CSV dataset of airline reviews, embedding each review and storing the vectors in a Chroma database. A strict prompt constrains answers to the retrieved reviews, reducing hallucination: when no relevant data exists, such as for an unrelated question about cars, the agent correctly responds with "No reviews were found."
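The retrieve-then-refuse logic described above can be sketched without any external services. The toy below is an assumption-laden illustration (the function names, the 3-dimensional stand-in vectors, and the 0.5 relevance threshold are all hypothetical): in the real project the vectors would come from mxbai-embed-large and the store would be Chroma, but the grounding rule is the same.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, store, k=2, min_score=0.5):
    """Return the top-k review texts whose similarity clears the threshold."""
    scored = sorted(store, key=lambda it: cosine(query_vec, it["vec"]), reverse=True)
    return [it["text"] for it in scored[:k] if cosine(query_vec, it["vec"]) >= min_score]

def answer(query_vec, store):
    """Grounded answer: hand retrieved reviews to the LLM, or refuse outright."""
    hits = retrieve(query_vec, store)
    if not hits:
        return "No reviews were found."
    # In the real agent, `hits` would be interpolated into the strict prompt
    # and sent to llama3.2; here we just return the grounding context.
    return " | ".join(hits)

# Toy 3-dimensional "embeddings" standing in for mxbai-embed-large output.
store = [
    {"text": "Crew were friendly, seats cramped.", "vec": [0.9, 0.1, 0.0]},
    {"text": "Lost my luggage twice this year.",   "vec": [0.2, 0.9, 0.1]},
]

print(answer([0.88, 0.15, 0.0], store))  # close to the seating review
print(answer([0.0, 0.0, 1.0], store))    # off-topic query -> refusal
```

The key design point is the refusal branch: if nothing in the store is similar enough to the query, the model never sees a prompt at all, which is how the agent avoids answering the unrelated car question.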

This proof-of-concept demonstrates that meaningful AI projects can start without massive cloud infrastructure. Running Ollama on a local machine avoids API costs and keeps the data private. The author suggests the approach could scale toward production-ready systems with larger datasets and better hardware.