It will be 0.30000000000000004.
Why does 0.1 + 0.2 equal 0.30000000000000004?
In everyday life we mostly use decimal. Because 10's prime factors are 2 and 5, only fractions whose denominators are built from 2s and 5s, such as 1/2, 1/4, 1/5, 1/8, and 1/10, can be written as terminating decimals; fractions like 1/3, 1/6, 1/7, and 1/9 cannot. For example, 1/3 is 0.33333333... repeating endlessly.
In binary, only fractions whose denominators are powers of 2, such as 1/2, 1/4, and 1/8, terminate; everything else, including 0.1 (1/10) and 0.2 (1/5), repeats endlessly. Because a computer's memory is limited, the language stores each number in a fixed amount of space (JavaScript uses 64-bit IEEE 754 doubles), so those repeating binary fractions must be rounded. Adding the two rounded values produces the strange-looking 0.30000000000000004.
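A quick way to see the rounding is to print more digits than usual; the digits below are what a standard 64-bit IEEE 754 double actually stores for these values:

console.log((0.1).toFixed(20));       // "0.10000000000000000555"
console.log((0.2).toFixed(20));       // "0.20000000000000001110"
console.log((0.1 + 0.2).toFixed(20)); // "0.30000000000000004441"

Neither 0.1 nor 0.2 is stored exactly, and their small rounding errors add up to just enough to show through at the 17th significant digit.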
How can you avoid related problems?
console.log((0.1 + 0.2).toFixed(1));     // "0.3"
console.log((0.1 + 0.2).toPrecision(1)); // "0.3"
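Note that toFixed and toPrecision return strings, so they are best suited for display. If you only need to compare two computed results, a common alternative, sketched here with a tolerance based on Number.EPSILON (the exact tolerance is an assumption you may want to adjust), is to accept a tiny rounding error:

function nearlyEqual(a, b) {
  // Treat the numbers as equal if they differ by less than a tiny tolerance.
  return Math.abs(a - b) < Number.EPSILON;
}

console.log(0.1 + 0.2 === 0.3);           // false
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true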