
Polynomial regression help?

Hi guys, I've been a lurker for some time now, but I have a question. If anyone could help, it would be greatly appreciated.

I'm trying to fit a polynomial regression line to a set of data, and I want to do it by hand instead of through a software package. I know that to estimate \(\beta\) in \(y = a + \beta x\) I use:

\(\beta = \dfrac{\sum (x - \bar{x})(y - \bar{y})}{\sum (x - \bar{x})^{2}}\)

But how can I estimate \(\beta_{2}\) in \(y = a + \beta_{1}x + \beta_{2}x^{2}\)?

Any help at all would be appreciated.
 
Yes, ordinary least squares will work for this -- it's still a linear system \(Ax = b\) (solved in the least-squares sense), and the solution is still \(x = (A^{\mathsf{T}}A)^{-1}A^{\mathsf{T}}b\).

You can do it in Excel -- just put a formula like =A2^2 in a second column to square each x value, and then regress with both of these columns as independent variables, against whatever y variable you want.

For the long-hand formulas with the subtracted means (\(x - \bar{x}\), etc.), look in a regression book under "multivariate regression" and use those formulas -- one good multivariate statistics book is Alvin Rencher's Linear Models. Matrix notation is more compact, though, with \(Ax = b\).
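As a concrete illustration of the matrix approach (a minimal sketch, not from the thread -- the data and variable names are made up), here it is in Python with NumPy. It uses numpy.linalg.lstsq, which computes the same least-squares solution as \((A^{\mathsf{T}}A)^{-1}A^{\mathsf{T}}b\) but more stably:

```python
import numpy as np

# Example data (made up for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 7.2, 13.1, 20.8])

# Design matrix A with columns [1, x, x^2], so the model
# y = a + b1*x + b2*x^2 becomes A @ [a, b1, b2] = y.
A = np.column_stack([np.ones_like(x), x, x**2])

# Least-squares solution; equivalent to (A'A)^{-1} A'y.
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
a, b1, b2 = coeffs
print(a, b1, b2)
```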
 
Least Squares Method

Hello,

you need the least squares method in a multidimensional model.

If you want to do it by hand, you either need to be lucky or you need a one-dimensional problem. You have to find the curve that best fits a series of data points, as in curve fitting or interpolation. If the equation is a second-degree polynomial, like

\(y = \alpha + \beta_{1}x + \beta_{2}x^{2},\)

then three data points are the minimum required to determine the parameters \(\alpha\), \(\beta_{1}\), and \(\beta_{2}\); with exactly three points, the curve passes through them exactly.


I would recommend not computing this by hand.
 
I think you can proceed as follows:

Denote by \(J(\alpha, \beta_{1}, \beta_{2})\) the objective function (the sum of squared residuals). Then write down the first-order conditions (the gradient should be zero). You will get a 3x3 linear system to solve with respect to \(\alpha, \beta_{1}, \beta_{2}\). You can find the system here: Least Squares Fit of a Quadratic Curve to Data


Hth
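For concreteness, here is the objective function and one of the three first-order conditions written out (a standard derivation, not spelled out in the thread):

\[J(\alpha, \beta_{1}, \beta_{2}) = \sum^{n}_{i=1}\left(y_{i} - \alpha - \beta_{1}x_{i} - \beta_{2}x_{i}^{2}\right)^{2}, \qquad \frac{\partial J}{\partial \alpha} = -2\sum^{n}_{i=1}\left(y_{i} - \alpha - \beta_{1}x_{i} - \beta_{2}x_{i}^{2}\right) = 0\]

Setting all three partial derivatives to zero and rearranging gives the 3x3 linear system shown in the reply below.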
 
Least Squares Fit of a Quadratic Curve to Data

Yes Iulian,

this computes the polynomial regression in the one-dimensional case:

\(\begin{pmatrix} n & \sum^{n}_{i=1}x_{i} & \sum^{n}_{i=1}x_{i}^{2} \\ \sum^{n}_{i=1}x_{i} & \sum^{n}_{i=1}x_{i}^{2} & \sum^{n}_{i=1}x_{i}^{3} \\ \sum^{n}_{i=1}x_{i}^{2} & \sum^{n}_{i=1}x_{i}^{3} & \sum^{n}_{i=1}x_{i}^{4} \end{pmatrix} \begin{pmatrix} \alpha \\ \beta_{1} \\ \beta_{2} \end{pmatrix} = \begin{pmatrix} \sum^{n}_{i=1}y_{i} \\ \sum^{n}_{i=1}x_{i}y_{i} \\ \sum^{n}_{i=1}x_{i}^{2}y_{i} \end{pmatrix}\)


Here \(n\) is the number of observations, and \(x_{i}, y_{i}\) are the \(i\)-th observations of the variable \(x\) and the output \(y\).
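A minimal Python sketch of solving exactly this system by hand-computed sums (my own illustration, with made-up data; numpy.linalg.solve handles the 3x3 system):

```python
import numpy as np

# Example data (made up for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 7.2, 13.1, 20.8])

n = len(x)
# Sums of powers of x that appear in the normal equations.
Sx1, Sx2, Sx3, Sx4 = (np.sum(x**k) for k in (1, 2, 3, 4))
# Right-hand side: sums of y, x*y, x^2*y.
b = np.array([np.sum(y), np.sum(x * y), np.sum(x**2 * y)])

# The 3x3 matrix from the system above.
M = np.array([[n,   Sx1, Sx2],
              [Sx1, Sx2, Sx3],
              [Sx2, Sx3, Sx4]])

alpha, beta1, beta2 = np.linalg.solve(M, b)
print(alpha, beta1, beta2)
```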
 