
Figma’s Real‑Time Data Pipeline Upgrade

ByteByteGo

In 2020, Figma’s data sync ran a single cron job that dumped entire tables into S3 and Snowflake, a design that held up while the platform was small. As teams added FigJam, Dev Mode and global localization, the database ballooned. By 2023, a full‑table copy stretched to six hours, and replica costs ran into millions.

Figma opted for incremental change capture rather than another full copy. By reading PostgreSQL's write‑ahead log, the team streamed inserts, updates and deletes into Kafka topics—one per table—then let Snowflake pull the stream at its own pace. This decoupling kept production traffic smooth while still delivering near‑real‑time analytics.
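This decoupling can be sketched, very loosely, in plain Python. The event shape and the `cdc.<table>` topic-naming convention are illustrative assumptions (the source doesn't specify them), and a plain dict stands in for Kafka:

```python
from collections import defaultdict

def route_changes(changes):
    """Route decoded WAL change events to per-table 'topics'.

    Each change mimics an insert/update/delete decoded from PostgreSQL's
    write-ahead log; a dict of lists stands in for Kafka topics here.
    """
    topics = defaultdict(list)
    for change in changes:
        # One topic per table, so Snowflake can pull each table's
        # stream independently, at its own pace.
        topics[f"cdc.{change['table']}"].append(change)
    return dict(topics)

# Hypothetical change events, in commit order:
changes = [
    {"table": "files", "op": "insert", "row": {"id": 1, "name": "design.fig"}},
    {"table": "users", "op": "update", "row": {"id": 7, "plan": "pro"}},
    {"table": "files", "op": "delete", "row": {"id": 1}},
]
topics = route_changes(changes)
```

In a real deployment this routing is handled by a CDC connector reading the replication slot, not application code; the point is only that production writes and analytics reads no longer share a schedule.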

The upgrade required careful snapshot timing to avoid data gaps. Figma starts the CDC stream slightly before a nightly snapshot finishes, so that events occurring during the export survive. Duplicates can be merged away later; missing data cannot be recovered. The result: a three‑hour merge schedule that keeps analytics tables up to date without draining production resources.
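The overlap-then-deduplicate idea can be illustrated with a minimal, hypothetical merge: snapshot and stream deliberately overlap, and a last-write-wins pass on the primary key collapses the duplicates. The row shapes and field names here are made up for the example:

```python
def merge_snapshot_and_cdc(snapshot_rows, cdc_events):
    """Merge a table snapshot with an overlapping CDC stream.

    Because the stream starts before the snapshot finishes, some rows
    arrive twice; keying by primary key with last write wins collapses
    them. A gap (stream started too late) would silently lose rows,
    which nothing downstream could repair.
    """
    table = {row["id"]: row for row in snapshot_rows}
    for event in cdc_events:  # events assumed ordered oldest -> newest
        if event["op"] == "delete":
            table.pop(event["row"]["id"], None)
        else:  # insert or update both upsert the row
            table[event["row"]["id"]] = event["row"]
    return table

# Snapshot taken while the stream was already running:
snapshot = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
cdc = [
    {"op": "insert", "row": {"id": 2, "v": "b"}},   # duplicate of snapshot
    {"op": "update", "row": {"id": 1, "v": "a2"}},  # newer than snapshot
    {"op": "insert", "row": {"id": 3, "v": "c"}},   # after snapshot
]
merged = merge_snapshot_and_cdc(snapshot, cdc)
```

In the real pipeline this step would be a periodic Snowflake `MERGE` into the analytics table rather than in-memory Python, but the invariant is the same: overlap is cheap to reconcile, gaps are not.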

Beyond performance, the move frees Figma engineers to innovate on new features. With near‑real‑time data, product teams can iterate faster on user experience metrics, track feature adoption across regions, and feed insights into AI‑driven design assistants. The architecture—RDS snapshots, Kafka streams, Snowflake merges—remains lightweight and cost‑effective, proving a scalable blueprint for other SaaS products.