I’m working on a finance project, which naturally requires accuracy.
I’m seeing that in JavaScript, subtracting decimals doesn’t always return the exact amount.
Obviously I can round the result to 2 decimal places.
My question is: will Round(x, 2) take care of this issue in every circumstance, or is there ever going to be a situation where Round(0.9 - 0.89, 2) would equal 0.00?
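For concreteness, here is the behaviour being described, as seen in any JavaScript console (the exact error digits may vary, but the principle holds):

console.log(0.9 - 0.89);
// prints something like 0.010000000000000009, not 0.01,
// because binary floating point cannot represent 0.9 or 0.89 exactly
console.log(Math.round((0.9 - 0.89) * 100) / 100);
// 0.01 - here the error is tiny enough that rounding recovers the cent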
Are you dealing with money (i.e. dollars and cents)? If so, you’re better off storing the values as whole numbers of cents (so 114.57 would be stored as 11457).
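As a rough sketch of that approach (the helper names here are just for illustration):

// Hold every amount as an integer number of cents, so + and - are exact.
function toCents(dollars) { return Math.round(dollars * 100); }
function toDollars(cents) { return (cents / 100).toFixed(2); }

var a = toCents(0.90);          // 90
var b = toCents(0.89);          // 89
console.log(toDollars(a - b));  // "0.01" - integer subtraction, no drift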
The Round() function can be implemented with the JavaScript code below. Note that JavaScript’s Math.round() does not have a decimal-places argument, unlike the VBScript Round() function.
// VBScript-style Round(n, d) in JavaScript
function Round(n, d) {
  // Default to 0 decimal places; !d already covers undefined, null, "" and 0.
  if (!d) {
    d = 0;
  }
  d = Math.floor(d);        // precision must be a whole number
  d = d < 1 ? 0 : d;        // treat negative precision as 0
  var factor = Math.pow(10, d);
  return Math.round(n * factor) / factor;
}
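Used against the case in the question, this Round() does return the cent, because the subtraction error is far smaller than half a cent. The second line below is the classic caveat: rounding to 2 places is not bulletproof when a value lands right on a half-cent boundary, which is another argument for working in integer cents.

console.log(Round(0.9 - 0.89, 2)); // 0.01 - the tiny error is rounded away
console.log(Round(1.005, 2));      // 1, not 1.01: 1.005 * 100 is 100.49999999999999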
I save them as integers in SQL. My PHP scripts divide by 100 when returning any info to the front end. Would you recommend I keep everything as integers, do the calculations as integers, and then divide by 100 to display on screen?
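For what it’s worth, a minimal sketch of that workflow (JavaScript here for consistency; the function name and the 8% rate are just illustrative):

function formatCents(cents) {
  // Keep all arithmetic in integer cents; divide by 100 only at display time.
  return (cents / 100).toFixed(2);
}

var subtotal = 11457;                      // 114.57 stored as integer cents
var tax = Math.round(subtotal * 0.08);     // round back to whole cents after any fractional step
console.log(formatCents(subtotal + tax));  // "123.74"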