Short note on Linear Regression.
Linear Regression
Linear regression is a type of regression that models the relationship between a target variable and one or more independent variables using a straight line. The linear regression equation is
Y = a + b*X + e.
Where,
a represents the intercept
b represents the slope of the regression line
e represents the error
X and Y represent the predictor and target variables, respectively.
If X consists of more than one variable, the model is termed multiple linear regression.
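As a rough illustration of the multi-variable case, the sketch below fits two predictors plus an intercept with NumPy's least-squares solver; the data values are made up for this example.

import numpy as np

# Two predictor columns; values are made up purely for illustration.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([6.1, 6.9, 12.2, 13.1, 16.0])

# Prepend a column of ones so the first fitted coefficient is the intercept a.
X_design = np.column_stack([np.ones(len(X)), X])

# Least-squares solution of X_design @ coeffs ~= y.
coeffs, residuals, rank, _ = np.linalg.lstsq(X_design, y, rcond=None)
print("intercept:", coeffs[0], "slopes:", coeffs[1:])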
In linear regression, the best-fit line is obtained by the least squares method, which minimizes the total sum of the squared deviations from each data point to the regression line. Because every deviation is squared, positive and negative deviations do not cancel each other out.
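A minimal sketch of this fit for a single predictor, assuming NumPy is available and using made-up data:

import numpy as np

# Toy data, made up for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 6.2, 8.0, 9.8])

# Least-squares estimates of the slope b and intercept a for y = a + b*x.
x_mean, y_mean = x.mean(), y.mean()
b = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
a = y_mean - b * x_mean

# Sum of squared deviations (residuals), the quantity the fit minimizes.
residuals = y - (a + b * x)
sse = np.sum(residuals ** 2)
print(f"a = {a:.3f}, b = {b:.3f}, sum of squared errors = {sse:.3f}")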
Linear Regression
Regression is a statistical methodology developed by Sir Francis Galton (1822-1911). Straight-line regression analysis involves a response variable y and a single predictor variable x.
The simplest form of regression is
y = a + bx
where y is the response variable, x is the single predictor variable, and y is modeled as a linear function of x; a and b are the regression coefficients.
Because the regression coefficients can also be thought of as weights, the equation may be written as:
y = w0 + w1x
These coefficients are solved for by the method of least squares, which estimates the best-fitting straight line as the one that minimizes the error between the actual data and the estimates given by the line.
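For the single-predictor case, the least-squares estimates have a standard closed form, with x̄ and ȳ denoting the sample means of x and y:

w1 = Σ(xi - x̄)(yi - ȳ) / Σ(xi - x̄)²
w0 = ȳ - w1 x̄

These are the same quantities computed numerically in the sketch above.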