c# - .NET division rounding/decimal differences between web servers


I've got a problem that's baffling me a bit.

Here's a bit of background: I'm in the process of upgrading an ASP.NET 2.0 web app to .NET 4 on a 64-bit server. I have the test app deployed on both the new and old servers to make sure things work as expected before we go live; both of them point at the same database on the same server.

here's problem:

double totalgross;
double totalnet = 9999999.00;
float taxrate = 15.00f;

totalgross = totalnet * (1 + (taxrate / 100));

On the old server, calling .ToString() on totalgross produces: 11499998.85

On the new server, calling .ToString() on totalgross produces: 11499998.6115814

I'm currently at a loss as to why this might be. It's not even as if the latter value represents the first number un-rounded?

By the way - I'm not after ways to correct/improve the code... I'm after possible reasons why this might happen!
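A plausible explanation for the second figure: on x64 the JIT uses SSE instructions and rounds every float intermediate to 32 bits, whereas the old 32-bit JIT could leave intermediates on the x87 FPU stack at 80-bit precision. Here's a minimal sketch (not the original app code, just an illustration) that reproduces both figures by making that per-step rounding explicit:

using System;

class PrecisionDemo
{
    static void Main()
    {
        double totalnet = 9999999.00;
        float taxrate = 15.00f;

        // x64-style: every float intermediate is rounded to 32 bits.
        float rate = taxrate / 100;  // ~0.15000001f - 0.15 has no exact float representation
        float factor = 1 + rate;     // ~1.14999998f
        Console.WriteLine(totalnet * factor);  // 11499998.6115814

        // x86-style (simulated): intermediates kept at higher precision,
        // approximated here by doing the whole calculation in double.
        Console.WriteLine(totalnet * (1 + ((double)taxrate / 100)));  // 11499998.85
    }
}

The C# spec explicitly allows floating-point operations to be performed at higher precision than their operand types, which is why both JITs are "correct" despite disagreeing.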

Update

I created a console app, built it in both x86 and x64, and ran both versions on the server - they output the 2 different figures. There does indeed seem to be some sort of loss of precision between 32-bit and 64-bit when using double. What surprises me is that the 'loss of precision' in question is around 0.2, which doesn't seem very precise to me - quite a difference! It's been suggested that it's better to use the decimal type (in my defence, I didn't write the code :p)
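For reference, a sketch of that suggested decimal version; decimal arithmetic is done in software in base 10 rather than on the FPU, so it produces the same answer in 32-bit and 64-bit processes:

using System;

class DecimalDemo
{
    static void Main()
    {
        decimal totalnet = 9999999.00m;
        decimal taxrate = 15.00m;

        // decimal doesn't use the FPU, so x86 and x64 agree.
        decimal totalgross = totalnet * (1 + (taxrate / 100));
        Console.WriteLine(totalgross);  // 11499998.85, possibly with trailing zeros from decimal's scale
    }
}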

I suppose the old server is a 32-bit machine? Check http://msdn.microsoft.com/en-us/library/system.double.aspx, the paragraph on floating-point values and loss of precision. You can force the new server to run the process as 32-bit - does that change anything?
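If you want to test that with the console-app repro, one quick (hypothetical) check is to print the process bitness alongside the calculation. Compiling with csc /platform:x86 (or setting the platform target to x86 in the project's build settings) forces the 32-bit JIT even on a 64-bit OS; for the web app itself, the IIS equivalent is enabling 32-bit applications on the application pool:

using System;

class BitnessCheck
{
    static void Main()
    {
        // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit one;
        // it works on .NET 2.0 as well as .NET 4.
        Console.WriteLine("Running as a {0}-bit process", IntPtr.Size * 8);

        double totalnet = 9999999.00;
        float taxrate = 15.00f;
        Console.WriteLine(totalnet * (1 + (taxrate / 100)));
    }
}

Even in a 32-bit process the 11499998.85 figure isn't guaranteed, though: keeping intermediates at 80-bit x87 precision is an implementation detail of the 32-bit JIT, not something the language promises.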

