
JavaScript Math Explained: 0.1 + 0.2 != 0.3

DEV Community

Developers often panic when `0.1 + 0.2` evaluates to `0.30000000000000004` in JavaScript. It feels like a bug, but it’s actually a fundamental quirk of how computers handle math. Unlike languages that hide the mess, JavaScript exposes the raw truth.
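A quick check in any browser console or Node.js REPL reproduces it:

```javascript
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false
```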

The issue stems from the IEEE 754 floating-point standard. Computers store numbers in binary, and decimal fractions like `0.1` have no finite binary representation—they repeat forever, much like `1/3` becomes `0.333…` in base 10. With only 64 bits available, the computer rounds the value to the nearest representable double, creating tiny precision errors that accumulate during addition.
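You can peek at the rounded values the doubles actually store by asking `toFixed` for more digits than the default formatting shows:

```javascript
// The nearest 64-bit doubles to 0.1 and 0.2 are both slightly too large,
// and the two rounding errors add up in the sum.
console.log((0.1).toFixed(20));       // 0.10000000000000000555
console.log((0.2).toFixed(20));       // 0.20000000000000001110
console.log((0.1 + 0.2).toFixed(20)); // 0.30000000000000004441
console.log((0.3).toFixed(20));       // 0.29999999999999998890
```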

This isn't unique to JavaScript. Python, Java, and C++ all face the same underlying limitation—Python and Java even print the same `0.30000000000000004`, while C++'s default output precision rounds the result away, masking the error. JavaScript’s single `Number` type (always a 64-bit double) simply makes the imprecision hard to miss.
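JavaScript can do the same display-time rounding on demand; `toPrecision` or `toFixed` trims away the digits that reveal the error:

```javascript
const sum = 0.1 + 0.2;
console.log(sum.toString());                  // "0.30000000000000004"
console.log(sum.toPrecision(15));             // "0.300000000000000"
console.log(Number(sum.toFixed(10)) === 0.3); // true — round first, then compare
```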

To fix this in code, developers rely on an epsilon comparison for equality checks, or—where exact decimals matter, as with money—on an arbitrary-precision library like Decimal.js, as sketched below.
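A minimal sketch of both approaches; `nearlyEqual` is a hypothetical helper name, and the Decimal.js lines assume the library is installed (`npm install decimal.js`):

```javascript
// Epsilon comparison: treat two numbers as equal if they differ by less
// than Number.EPSILON (2^-52, the gap between 1 and the next double).
// For values far from 1, scale the epsilon to the magnitude being compared.
function nearlyEqual(a, b, epsilon = Number.EPSILON) {
  return Math.abs(a - b) < epsilon;
}

console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true

// Decimal.js works on decimal strings, so no binary rounding ever occurs:
// const Decimal = require('decimal.js');
// new Decimal('0.1').plus('0.2').equals('0.3'); // true
```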