HeadlinesBriefing.com

Local AI Stack Generates Terraform Code

DEV Community

Frustrated by endless copy‑paste of Terraform snippets, a developer turned to local AI tools to generate fresh infrastructure code. He gathered reference material—slides, old GitHub repos, a Manning book, and Terraform Registry modules—and built a self‑contained stack using Ollama for LLM inference, Langflow for no‑code agent orchestration, OpenSearch as a vector store, and Docling to ingest PDF chapters. A custom Dockerfile pre‑pulls the Granite4:tiny‑h model from Ollama, cutting startup time.
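The summary doesn't reproduce the Dockerfile, but the pre-pull trick is typically a build step that starts the Ollama server, downloads the model, and lets the weights bake into an image layer. A minimal sketch, assuming the official ollama/ollama base image (the sleep duration and tag are illustrative):

```dockerfile
# Sketch: bake granite4:tiny-h into the image so containers start with it already cached.
FROM ollama/ollama:latest

# Start the server in the background, give it a moment to come up,
# then pull the model; the downloaded weights persist in this layer.
RUN ollama serve & \
    sleep 10 && \
    ollama pull granite4:tiny-h

# The base image's entrypoint still serves the API on port 11434.
```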

Because Langflow lacked the Docling dependency, the author extended the official image with the required libraries. Retrieval‑augmented generation proved essential, and when OpenSearch's built‑in index auto‑creation failed, the vector index had to be created by hand; an AI assistant named Project Bob supplied a working script and Dockerfile within minutes. All services (Ollama, Langflow, OpenSearch, and the HashiCorp Terraform MCP server) are wired together via Docker Compose, which exposes their ports for local testing.
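None of these files appear in the summary, so the following are hedged sketches rather than the author's actual artifacts. Extending the Langflow image with Docling might be as small as this (the image tag and install command are assumptions):

```dockerfile
# Sketch: add Docling on top of the stock Langflow image so its document
# components can parse the PDF chapters.
FROM langflowai/langflow:latest
RUN pip install --no-cache-dir docling
```

The manually created vector index presumably needs OpenSearch's k-NN mapping; a minimal Python sketch using the opensearch-py client, where the index name, field names, embedding dimension, and credentials are all placeholders:

```python
# Sketch: create the vector index by hand after auto-creation failed.
from opensearchpy import OpenSearch

client = OpenSearch(
    hosts=[{"host": "localhost", "port": 9200}],
    http_auth=("admin", "admin"),   # placeholder credentials
    use_ssl=False,
)

index_body = {
    "settings": {"index": {"knn": True}},   # enable k-NN vector search
    "mappings": {
        "properties": {
            "text": {"type": "text"},       # raw chunk text
            "embedding": {                  # vector field the RAG pipeline writes to
                "type": "knn_vector",
                "dimension": 768,           # must match the embedding model's output size
            },
        }
    },
}

client.indices.create(index="terraform-rag", body=index_body)
print(client.indices.get(index="terraform-rag"))
```

The Compose wiring could look roughly like this; service names, build contexts, environment variables, and the Langflow and MCP-server ports are assumptions, while 11434 and 9200 are Ollama's and OpenSearch's usual defaults (OpenSearch security settings are omitted):

```yaml
# Sketch: wire the four services together for local testing.
services:
  ollama:
    build: ./ollama              # custom image with granite4:tiny-h pre-pulled
    ports:
      - "11434:11434"
  langflow:
    build: ./langflow            # official image extended with Docling
    ports:
      - "7860:7860"
    environment:
      - OLLAMA_URL=http://ollama:11434          # hypothetical variable names
      - OPENSEARCH_URL=http://opensearch:9200
    depends_on:
      - ollama
      - opensearch
  opensearch:
    image: opensearchproject/opensearch:latest
    environment:
      - discovery.type=single-node              # single-node dev cluster
    ports:
      - "9200:9200"
  terraform-mcp:
    image: hashicorp/terraform-mcp-server       # HashiCorp Terraform MCP server
    ports:
      - "8080:8080"                             # port is an assumption
```

With something like this in place, a single docker compose up brings the whole sandbox up on localhost.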

Running everything locally avoids burning cloud credits and keeps costs low, while offering a sandbox for experimenting with AI‑driven IaC generation. Future work will automate ingestion of the author's books, slides, and code into the RAG pipeline.