# Proof of fixed-point theorem

#### Quasar Chunawala

Hey guys,

I tried to write a proof of Banach's contraction mapping theorem, which is extremely important for fixed-point iteration to numerically solve for the zeros of an equation. I think it even extends to PDEs, where a function that solves the PDE is a fixed point in an infinite-dimensional function space. Do you think my proof is rigorous and technically correct?

Cheers,
Quasar.

#### Attachments

• banach_contraction_mapping_theorem.pdf (94.3 KB)

#### Quasar Chunawala

Thanks, Daniel. It looks like my work for making the distance $$|y_{n+m} - y_m|$$ smaller than an arbitrary epsilon is similar to the one on the Wiki page.
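
For reference, the key chain of estimates there (a sketch in my notation, with iterates $$y_{n+1} = T(y_n)$$ and contraction constant $$0 \le q < 1$$) is

$$|y_{n+m} - y_m| \le \sum_{k=m}^{m+n-1} |y_{k+1} - y_k| \le |y_1 - y_0| \sum_{k=m}^{\infty} q^k = \frac{q^m}{1-q}\,|y_1 - y_0| \to 0 \quad \text{as } m \to \infty,$$

so the iterates form a Cauchy sequence.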

#### Daniel Duffy

##### C++ author, trainer
Now, the icing on the cake:

Solve x^2 = 2 by the fixed-point theorem and measure the (monotonic!) convergence. Linear.
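
A minimal sketch of that exercise (my choice of contraction map, not prescribed above: g(x) = x - (x^2 - 2)/4 has |g'(sqrt(2))| ≈ 0.29 < 1, so starting from x_0 = 1 the iterates increase monotonically to sqrt(2) at a linear rate):
Python:
# Fixed-point iteration for x^2 = 2 using the contraction g(x) = x - (x^2 - 2)/4
import math

def g(x):
    return x - (x*x - 2.0)/4.0

x = 1.0                     # initial guess
root = math.sqrt(2.0)
for n in range(15):
    x_new = g(x)
    # the successive error ratio settles near |g'(sqrt(2))| ~ 0.29: linear convergence
    print(n, x_new, abs(x_new - root)/abs(x - root))
    x = x_new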

#### Quasar Chunawala

Thanks for sharing, I didn't know about Aitken's acceleration!

It would be a good exercise for me to show that, for the recursive sequence $$x_{n+1} = \frac{1}{2}\left(x_n + \frac{2}{x_n}\right)$$, Aitken acceleration has a quadratic (if I understood correctly?) rate of convergence, i.e. $$\lim_{n\to\infty} \frac{|(Ax)_n - l|}{|x_n - l|^2} = \mu$$ for some $$\mu > 0$$.

#### Quasar Chunawala

I recently purchased a copy of this book: An Introduction to Numerical Analysis by Süli and Mayers. It assumes only a first course in analysis as background, and the presentation of the material is intuitive, rigorous and clear - no hand-waving. I find it to be a very nicely written undergraduate-level introduction.


#### Daniel Duffy

##### C++ author, trainer
> Thanks for sharing, I didn't know about Aitken's acceleration!
>
> It would be a good exercise for me to show that, for the recursive sequence $$x_{n+1} = \frac{1}{2}\left(x_n + \frac{2}{x_n}\right)$$, Aitken acceleration has a quadratic (if I understood correctly?) rate of convergence, i.e. $$\lim_{n\to\infty} \frac{|(Ax)_n - l|}{|x_n - l|^2} = \mu$$ for some $$\mu > 0$$.

And experimentally.
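
For instance, an experimental check might look like this (a sketch; it applies Aitken's delta-squared formula to the linearly convergent map g(x) = x - (x^2 - 2)/4 from earlier, where the acceleration effect is easiest to see):
Python:
# Aitken's delta-squared: (Ax)_n = x_n - (x_{n+1} - x_n)^2 / (x_{n+2} - 2 x_{n+1} + x_n)
import math

def g(x):
    # linearly convergent contraction for sqrt(2), |g'(sqrt(2))| ~ 0.29
    return x - (x*x - 2.0)/4.0

xs = [1.0]
for _ in range(15):
    xs.append(g(xs[-1]))

ax = [xs[n] - (xs[n+1] - xs[n])**2 / (xs[n+2] - 2.0*xs[n+1] + xs[n])
      for n in range(len(xs) - 2)]

root = math.sqrt(2.0)
for n in range(8):
    # raw error vs accelerated error
    print(n, abs(xs[n] - root), abs(ax[n] - root))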

#### Daniel Duffy

##### C++ author, trainer
A nice feature of fixed-point/contraction theorems is that they converge for any initial guess, in contrast to, say, Newton's method.

#### Quasar Chunawala

If I remember correctly, that's precisely why one has to run a bracketing routine prior to Newton-Raphson. Geometrically speaking, if $$x_n$$ is such that $$f'(x_n)$$ is close to zero, then $$x_{n+1}$$ can be far away from the root.
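
A classic toy illustration of that geometry (my own example, not from the thread): Newton on $$f(x) = \arctan(x)$$, whose only root is $$x = 0$$. Since $$f'(x) = 1/(1+x^2)$$ flattens away from the origin, a start beyond roughly $$|x_0| \approx 1.39$$ makes each iterate overshoot farther than the last:
Python:
# Newton-Raphson on f(x) = arctan(x): x_{n+1} = x_n - arctan(x_n) * (1 + x_n^2)
import math

def newton_arctan(x0, n_iter=6):
    x = x0
    for n in range(n_iter):
        x = x - math.atan(x)*(1.0 + x*x)
        print(n, x)
    return x

newton_arctan(1.0)   # converges rapidly to the root 0
newton_arctan(2.0)   # each step overshoots farther: divergence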

#### Quasar Chunawala

By the way, do you have a suggestion for a good intro book to C++ (using C++17)? I definitely have to re-learn modern C++, and I want to implement some of the algorithms I learn in my numerics book.

QuantLib implements many of the solvers in C++, but I'd like to make an attempt at writing my own code.

SO has a well-written post on this here. But (1) the answer was first written a while back, and (2) there aren't a lot of books that combine both numerical methods and modern C++.

#### Daniel Duffy

##### C++ author, trainer
> By the way, do you have a suggestion for a good intro book to C++ (using C++17)? I definitely have to re-learn modern C++, and I want to implement some of the algorithms I learn in my numerics book.
>
> QuantLib implements many of the solvers in C++, but I'd like to make an attempt at writing my own code.
>
> SO has a well-written post on this here. But (1) the answer was first written a while back, and (2) there aren't a lot of books that combine both numerical methods and modern C++.
Do you already know C++11? Or even C++03? BTW, C++17 is only a stepping stone to C++20. TBH I don't think you are ready.

It's like judo:

white -> yellow -> orange -> green -> blue -> brown -> black

And you, in C++ terms?

You can do all these algos in C. Or Python.


#### Quasar Chunawala

If you ask me, I'm still a white belt in C++.
I think I know basic object-oriented programming, but I need to come up to speed on, for example, what you mean when you write C++11.

#### Daniel Duffy

##### C++ author, trainer
> If you ask me, I'm still a white belt in C++. I think I know basic object-oriented programming, but I need to come up to speed on, for example, what you mean when you write C++11.
QN C++

#### Daniel Duffy

##### C++ author, trainer
Another way to solve the fixed-point problem

x = g(x)

is to recast it as the least-squares minimisation problem

min (x - g(x))^2

and use your favourite optimiser.
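
For instance (a sketch taking scipy.optimize.minimize_scalar as the "favourite optimiser", with the Babylonian map g(x) = (x + 2/x)/2, whose fixed point is sqrt(2)):
Python:
# Fixed point of g recast as a least-squares problem: minimise (x - g(x))^2
from scipy.optimize import minimize_scalar

def g(x):
    return 0.5*(x + 2.0/x)

def objective(x):
    return (x - g(x))**2

res = minimize_scalar(objective, bounds=(0.5, 2.0), method='bounded')
print(res.x, objective(res.x))   # ~1.41421356, objective ~0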

#### Daniel Duffy

##### C++ author, trainer
> If I remember correctly, that's precisely why one has to run a bracketing routine prior to Newton-Raphson. Geometrically speaking, if $$x_n$$ is such that $$f'(x_n)$$ is close to zero, then $$x_{n+1}$$ can be far away from the root.

For example. And 1) several roots or 2) no roots are possible.

#### Quasar Chunawala

> For example. And 1) several roots or 2) no roots are possible.
$$f(x) = \frac{1}{2}-\frac{1}{1+M|x - 1.05|}$$ has roots $$x_1=1.05-\frac{1}{M}$$ and $$x_2=1.05+\frac{1}{M}$$. It'll be fun to see the behavior of a 1D solver (or a bracketing routine) when $$1/M$$ is near or smaller than the machine epsilon.
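
A quick way to run that experiment (a sketch using scipy.optimize.brentq; the bracket [1.05, 2] straddles the right-hand root for each M shown, since f(1.05) = -1/2 < 0 and f(2) > 0):
Python:
# Root-finding on f(x) = 1/2 - 1/(1 + M|x - 1.05|) for increasingly large M.
# The roots 1.05 +/- 1/M become indistinguishable from 1.05 once 1/M
# approaches the machine-epsilon spacing around 1.05 (~2.3e-16).
from scipy.optimize import brentq

def make_f(M):
    return lambda x: 0.5 - 1.0/(1.0 + M*abs(x - 1.05))

for M in (1.0e3, 1.0e9, 1.0e16):
    f = make_f(M)
    root = brentq(f, 1.05, 2.0)
    print(M, root, root - 1.05)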

#### Daniel Duffy

##### C++ author, trainer
Even for M = 1.0e+9 etc. it will be a challenge for methods that use derivatives. It is like a boundary/internal layer problem.

SHGO is a global optimiser that finds all roots.

It's very easy to use.
Python:
from scipy.optimize import shgo
import math

#DJD
print("boundary layer problem")
M = 1000

# squared residual of f(x) = 1/2 - 1/(1 + M|x - 1.05|); its zeros are the
# two roots 1.05 +/- 1/M, so every global minimum of func is a root of f
def func(x):
    z = 1.0/(1.0 + M*math.fabs(x[0] - 1.05))   # shgo passes x as a 1-D array
    return math.pow(0.5 - z, 2)

# shgo samples a finite box, so the unbounded (0, None) is replaced by [0, 2]
bnds = [(0, 2)]
result = shgo(func, bounds=bnds, sampling_method='sobol')
print("global minimum  ", result.x)
print("success?  ", result.success)
print(result.fun)
print("local minima  ", result.xl)
print("values at local minima  ", result.funl)


#### Daniel Duffy

##### C++ author, trainer
A relevant example:

Compute implied volatility using

. Newton
. Banach
. Aitken

Compare them.

These are in my 2018 C++ book.
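
For anyone who wants to try this in Python first, here is a minimal Newton sketch of my own (not the book's C++ code; bs_call, bs_vega and implied_vol_newton are names I made up). The Banach and Aitken variants iterate on sigma in the same loop structure.
Python:
# Implied volatility by Newton-Raphson on the Black-Scholes call price
import math

def norm_cdf(x):
    return 0.5*(1.0 + math.erf(x/math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    d1 = (math.log(S/K) + (r + 0.5*sigma*sigma)*T)/(sigma*math.sqrt(T))
    d2 = d1 - sigma*math.sqrt(T)
    return S*norm_cdf(d1) - K*math.exp(-r*T)*norm_cdf(d2)

def bs_vega(S, K, T, r, sigma):
    d1 = (math.log(S/K) + (r + 0.5*sigma*sigma)*T)/(sigma*math.sqrt(T))
    return S*math.sqrt(T)*math.exp(-0.5*d1*d1)/math.sqrt(2.0*math.pi)

def implied_vol_newton(price, S, K, T, r, sigma0=0.2, tol=1.0e-12):
    sigma = sigma0
    for _ in range(100):
        diff = bs_call(S, K, T, r, sigma) - price
        if abs(diff) < tol:
            break
        sigma -= diff/bs_vega(S, K, T, r, sigma)   # Newton step on sigma
    return sigma

# round-trip check: recover the sigma used to generate the price
target = bs_call(100.0, 100.0, 1.0, 0.05, 0.25)
print(implied_vol_newton(target, 100.0, 100.0, 1.0, 0.05))   # ~0.25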

#### Daniel Duffy

##### C++ author, trainer
“The first rule of discovery is to have brains and good luck. The second rule of discovery is to sit tight and wait till you get a bright idea.”

George Pólya

#### Cecian

> "The first rule of discovery is to have brains and good luck. The second rule of discovery is to sit tight and wait till you get a bright idea."
>
> George Pólya

A very interesting and correct statement; there is something to think about here.
