
Mac (PowerPC) vs. Intel double precision

I don't know how many of you answered the Bermuda Option question correctly on the 9821 final, but I know I got it wrong. I finally figured out what I did wrong and matched Andy's & Alain's numbers. I'm using a Mac (a PowerBook) with a PowerPC chip, and it seems that the precision of the 'double' primitive differs slightly between PowerPC & Intel chips. However, the 'long double' primitive (which has higher precision than a 'double') seems to behave the same on both. Here is some code to illustrate this:
C++:
    #include <iostream>
    using namespace std;
    
    int main()
    {
        double num = 5.0;
        long double long_num = 5.0L;
    
        double denom = 100.0;
        long double long_denom = 100.0L;
    
        double fraction = num / denom;                      // 0.05, not exactly representable in binary
        long double long_fraction = long_num / long_denom;
    
        double total = 0.0;
        long double long_total = 0.0L;
        for (int i = 0; i < 100; i++)
        {
            total += fraction;
            long_total += long_fraction;
        }
    
        cout.setf(ios::fixed, ios::floatfield);
        cout.precision(18);
    
        cout << "       num=" << num << endl;
        cout << "     total=" << total << endl;
        cout << "long_total=" << long_total << endl;
        return 0;
    }
Output from my PowerBook:
C++:
[Session started at 2007-01-03 23:42:45 -0500.]
       num=5.000000000000000000
     total=4.999999999999990230
long_total=5.000000000000000000

Final has exited with status 0.
Output from my PC (Pentium 4, Windows XP):
C++:
$ ./MTH9821.exe
       num=5.000000000000000000
     total=4.999999999999990200
long_total=5.000000000000000000
Press any key to continue . . .
The code above sums the fraction 5/100 in a loop 100 times using both primitive types: 'double' and 'long double'. The 'long double' gives a much better result, while the 'double' gives slightly different numbers between PowerPC & Intel chips; a subtle difference, but it can drive one nuts!

I'm going to start using 'long double' more often now. I know a few of us are using a Mac so hopefully you don't have to spend hours debugging your code.
 
I'm glad you finally found the bug, Hien. This would drive one absolutely nuts. :wall
I've encountered cases where double numbers ran out of precision and went haywire. But for all intents and purposes, the double type should be what everyone uses. Except those Mac people ;)
The latest generation of Macs with Intel CPU should fix this problem, I assume
 

Yeah... for all intents & purposes, double should be enough, but my intent & purpose here is to get past the 6-digit accuracy requirement and get full points on the hw & exam questions :)
 
Woody, aren't you the owner of one of the latest Macs? The black MacBook with Boot Camp? I thought that came with an Intel CPU.

If it has a bootcamp, it must be an Intel processor ;)

This whole story with numbers sounds strange... overall, it's good to know your own computer ;)
 

Yup...knowing your computer is definitely a good thing!
 

I do have an Intel MacBook, but I haven't run this test. I can compare some of my code between my Mac and my ThinkPad at work.
 
My first guess is that this might come down to the compiler. I know and (mostly) trust VC++.
Maybe your compiler flags were off?
 