I recently migrated from (classic) .NET Framework 4.8 to .NET 6. I am seeing differences in the precision of floating-point types (float, double). Changing the behavior manually (with Math.Round() or something similar) would be very hard because the project is quite large. Is there any way to overcome this issue?
For example, a multiplication operation in .NET 4.8 returns 3748.9, whereas the same operation returns 3748.899999999996 in .NET 6.
Here is the line of code:
Math.Abs(quantity * Price).ToString(CultureInfo.CurrentCulture);
where quantity and Price are of type double.
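If the difference comes from string formatting rather than from the multiplication itself (starting with .NET Core 3.0, the default double.ToString() emits the shortest string that round-trips the exact binary value, whereas .NET Framework truncated to 15 significant digits), one option is to force the old rendering with the "G15" format string. Below is a minimal sketch using hypothetical quantity and price values, not the asker's real data:

```csharp
using System;
using System.Globalization;

class FormatComparison
{
    static void Main()
    {
        // Hypothetical values chosen only to make the binary rounding visible.
        double quantity = 1.1;
        double price = 3.0;

        double result = Math.Abs(quantity * price);

        // On .NET 6 the default "G" format shows the full round-trippable value,
        // so trailing "noise" digits appear (here: 3.3000000000000003).
        Console.WriteLine(result.ToString(CultureInfo.CurrentCulture));

        // "G15" caps the output at 15 significant digits, which matches the old
        // .NET Framework default rendering (here: 3.3).
        Console.WriteLine(result.ToString("G15", CultureInfo.CurrentCulture));
    }
}
```

Applying "G15" everywhere still means touching each ToString call, but it changes only the display, not the stored values.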
Comments:
- Have you compared the raw bytes (using BitConverter.GetBytes(), for instance) to see if there's a true change in values?
- Consider using decimal, which works more like humans expect (but which is slower and larger).
- "a multiplication operation in .NET 4.8 is returning 3748.9": what multiplication? How did you inspect the result? Are you sure that the result wasn't a result of text formatting changes? The real change in .NET Core was in formatting, not multiplication.
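To check the first comment's suggestion, i.e. whether the underlying double actually changed or only its string form did, you can dump the exact bit pattern on both runtimes. A minimal sketch, again with hypothetical values, using BitConverter.DoubleToInt64Bits (equivalent to inspecting the bytes from BitConverter.GetBytes()):

```csharp
using System;
using System.Globalization;

class BitInspection
{
    static void Main()
    {
        // Hypothetical values; substitute the real quantity and Price.
        double quantity = 1.1;
        double price = 3.0;

        double result = Math.Abs(quantity * price);

        // The raw 64-bit pattern of the double. If this hex string is identical under
        // .NET Framework 4.8 and .NET 6, the arithmetic result is the same and only
        // the default ToString() formatting differs.
        Console.WriteLine(BitConverter.DoubleToInt64Bits(result).ToString("X16"));

        // Round-trip format versus the old 15-digit .NET Framework default.
        Console.WriteLine(result.ToString("R", CultureInfo.InvariantCulture));
        Console.WriteLine(result.ToString("G15", CultureInfo.InvariantCulture));
    }
}
```

If exact decimal behavior matters (prices and quantities usually do), the second comment's suggestion of switching to decimal avoids the binary-representation issue entirely, at the cost of speed and storage size.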