HeadlinesBriefing.com

QNAP Unveils Edge AI Storage Server for Private LLM Deployment

TechPowerUp News

QNAP has unveiled the QAI-h1290FX Edge AI storage server, targeting enterprises seeking private on-premises AI infrastructure. The device combines server-grade AMD EPYC processing with NVIDIA RTX GPU acceleration and twelve U.2 NVMe/SATA SSD slots, creating a high-performance platform for deploying large language models, retrieval-augmented generation (RAG) search engines, and generative AI applications without cloud dependency.

Powered by QNAP's ZFS-based QuTS hero operating system, the QAI-h1290FX delivers enterprise-grade data integrity with near-limitless snapshots and inline deduplication. It supports GPU access in containers through Container Station and GPU passthrough for virtual machines via Virtualization Station. Users can run inference models and AI applications with full control over performance and resource allocation.
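Since Container Station runs Docker-compatible containers, GPU access for an inference workload can be declared in Compose form. The fragment below is a sketch, not QNAP-specific syntax: it assumes the NVIDIA container runtime is enabled on the NAS, and the service and volume names are illustrative.

```yaml
# Compose sketch: reserve the NAS GPU for a local inference container.
# Assumes the Docker engine has the NVIDIA runtime enabled; names are
# illustrative, not taken from QNAP documentation.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-models:/root/.ollama   # persist pulled model weights
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
volumes:
  ollama-models:
```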

The QAI-h1290FX comes preloaded with AI tools such as AnythingLLM, OpenWebUI, and Ollama for fast deployment of private LLM workflows. Its 16-core AMD EPYC 7302P processor provides server-class compute power, while optional NVIDIA RTX PRO 6000 GPU support offers up to 96GB of GPU memory. The system addresses growing demand for on-prem AI infrastructure that maintains data privacy and operational control.
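With Ollama among the preloaded tools, a private LLM on the box can be queried over Ollama's standard local HTTP API. A minimal Python sketch, assuming Ollama is listening on its default port (11434) and that a model such as "llama3" has already been pulled — the model name and prompt are illustrative:

```python
import json
from urllib import request

# Ollama's default local endpoint for one-shot (non-streaming) generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming generate request for Ollama's /api/generate."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """POST the prompt to the local Ollama server and return its reply text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Example query against a locally hosted model (requires a running server).
    print(ask("llama3", "Summarize ZFS snapshots in one sentence."))
```

Because everything runs against localhost, prompts and responses never leave the appliance, which is the point of the private-deployment pitch.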