
What exactly is AAD (Adjoint Algorithmic differentiation)?

Indeed, there is a lot of confusion around AD, AAD, automatic/algorithmic/adjoint forward/reverse diff, backprop, and so forth, and the profusion of names doesn't help. You may find Leif Andersen's preface to my book (Modern Computational Finance: AAD and Parallel Simulations (Table of Contents and Preface) by Antoine Savine, Leif Andersen :: SSRN) entertaining and informative on that point.

It doesn't change the fact that it is only adjoint differentiation (also called reverse mode, backprop, and probably many other names) that makes a tremendous difference, not only in finance but also in deep learning, meteorology, and probably many other fields, with an almost magical, impossibly fast computation of thousands of differentials.

Yes, the entire computation graph must be held in memory, but this is not as bad as it sounds, and it is easily mitigated by checkpointing. With Monte-Carlo risks, the memory load is typically under 100MB, the differentials being computed pathwise and the tape being wiped between paths.
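To make the mechanics concrete, here is a minimal sketch of tape-based reverse-mode (adjoint) differentiation. All class and function names are illustrative, not taken from any particular AAD library; the point is that one recording of the computation plus one backward sweep yields the derivatives with respect to all inputs at once, and the tape can be wiped between Monte-Carlo paths to keep memory bounded.

```python
class Tape:
    """Records each operation so adjoints can be propagated backwards."""
    def __init__(self):
        self.nodes = []  # each node: list of (parent_index, local_partial)

    def add_node(self, parents):
        self.nodes.append(parents)
        return len(self.nodes) - 1

    def wipe(self):
        # Between Monte-Carlo paths the tape is cleared, keeping memory bounded.
        self.nodes.clear()

class Var:
    """A number that records its operations on a tape."""
    def __init__(self, value, tape, index=None):
        self.value = value
        self.tape = tape
        self.index = tape.add_node([]) if index is None else index

    def __add__(self, other):
        idx = self.tape.add_node([(self.index, 1.0), (other.index, 1.0)])
        return Var(self.value + other.value, self.tape, idx)

    def __mul__(self, other):
        # Local partials of x*y are y (w.r.t. x) and x (w.r.t. y).
        idx = self.tape.add_node([(self.index, other.value),
                                  (other.index, self.value)])
        return Var(self.value * other.value, self.tape, idx)

def backward(tape, result):
    """One backward sweep gives derivatives w.r.t. ALL inputs at once."""
    adjoints = [0.0] * len(tape.nodes)
    adjoints[result.index] = 1.0
    for i in reversed(range(len(tape.nodes))):
        for parent, partial in tape.nodes[i]:
            adjoints[parent] += adjoints[i] * partial
    return adjoints

tape = Tape()
x = Var(3.0, tape)
y = Var(4.0, tape)
z = x * y + x                      # dz/dx = y + 1 = 5, dz/dy = x = 3
adj = backward(tape, z)
print(adj[x.index], adj[y.index])  # 5.0 3.0
tape.wipe()                        # ready for the next path
```

The cost of the backward sweep is proportional to the cost of the original computation, independent of the number of inputs, which is where the speed-up over bumping comes from.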

On the contrary, we have experimented with many kinds of AD for years and found forward mode to be practically useless. It may be an elegant construction, but it is even slower than bumping, harder to implement and debug, and more prone to error.

Maybe my 15-minute video above can help navigate the main ideas and programming considerations? Note that I don't even cover forward mode, considering it a waste of time and attention and an unnecessary source of additional confusion.

My experience with Boost is very different from yours. I initiated discussions with them to develop Boost.AAD (with proper reverse-mode differentiation, in a general and efficient implementation, platform-independent and header-only, in good Boost order) and never received a reply.

I suppose I contacted the wrong people. Perhaps you would be so kind as to introduce me to the right people to discuss these things?

Many thanks in advance,

Kind regards,

Antoine
 
Thanks for the feedback, Antoine. I found a good definition from risk.net:


Adjoint algorithmic differentiation is a mathematical technique used to significantly speed up the calculation of sensitivities of derivatives prices to underlying factors, called Greeks. It is widely used in the risk management of complex derivatives and valuation adjustments.

Greeks have traditionally been calculated by making small adjustments to the values of the inputs in the pricing of a derivative and recalculating the output value each time, a process known as bumping. This can be time-consuming for a portfolio of thousands of trades, because the valuation of each trade involves many steps, each requiring the output from the previous step in order to proceed.
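For illustration, bumping can be sketched in a few lines; the `price` function below is a toy stand-in for an expensive valuation, not a real pricing model. Note that each Greek costs one full extra revaluation, so the total cost grows linearly with the number of inputs.

```python
def price(inputs):
    # Toy placeholder for an expensive valuation: spot, vol, rate.
    s, v, r = inputs
    return s * v + r * s

def bump_greeks(inputs, h=1e-6):
    """Forward-difference bump-and-revalue: one extra pricing per input."""
    base = price(inputs)
    greeks = []
    for i in range(len(inputs)):
        bumped = list(inputs)
        bumped[i] += h                          # bump one input at a time
        greeks.append((price(bumped) - base) / h)
    return greeks

g = bump_greeks([100.0, 0.2, 0.01])
# Analytically: d/ds = v + r = 0.21, d/dv = s = 100, d/dr = s = 100
```

With thousands of inputs, those thousands of revaluations are exactly what adjoint differentiation removes.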

AAD breaks this valuation process into a number of steps that can be carried out simultaneously instead of sequentially. This is made possible by exploiting a key mathematical property that applies to sensitivities called the chain rule of differentiation, which links the derivatives of parts of a function to the derivative of the whole. This allows the backward propagation of sensitivities of the output with respect to the variables in the intermediate steps, until the sensitivities with respect to the inputs are achieved.
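The chain-rule mechanics described above can be written compactly. For a valuation expressed as a composition of steps, $y = f_N(f_{N-1}(\cdots f_1(x)\cdots))$ with intermediates $x_k = f_k(x_{k-1})$ and $x_0 = x$, the adjoints (sensitivities of the output) propagate backwards through the steps:

$$\bar{x}_N = 1, \qquad \bar{x}_{k-1} = \left(\frac{\partial f_k}{\partial x_{k-1}}\right)^{\!\top} \bar{x}_k,$$

so after a single backward sweep, $\bar{x}_0 = \partial y / \partial x$ contains the sensitivities with respect to every input at once.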

The technique has been shown to compute Greeks up to 1,000 times faster compared with the bumping method. Disadvantages of AAD include lengthy development time and the need for highly skilled quantitative programmers.
 
Here are the Boost maths guys. In the past I found John Maddock friendly, and I know Thijs vd Bergh, who does quant and ML.
Maybe have another go? Disclaimer: I don't know the inner workings of how Boost is organized.
Having a header-only public AAD library would be good.

 
Thank you. The small paragraph explaining AAD is very good indeed.
How do I reach out to these guys, please? And could you maybe introduce us by email, so that I don't once again knock unannounced...
 