Question

In: Statistics and Probability

Elaborate what least-squares lines and regression are.

Solutions

Expert Solution

The method of least squares is a technique for curve fitting; the curves or lines fitted by this method are called least-squares curves or least-squares lines. It is a device for finding the equation of a specific type of curve that best fits a given set of observations. The method rests on the principle of least squares, which states that for the "best-fitting" curve, the sum of the squares of the differences between the observed and the corresponding estimated values should be as small as possible.

Suppose we are given n pairs of observations (x1, y1), (x2, y2), ..., (xn, yn) and wish to fit a straight line to these data. Take the general equation of a straight line, y = a + bx, where a and b are constants. Any values of a and b give a straight line, and once these values are fixed, an estimate of y is obtained by substituting a value of x. That is, the estimated values of y at x = x1, x2, ..., xn are a + bx1, a + bx2, ..., a + bxn respectively. For the equation y = a + bx to represent the relationship between x and y well, these estimated values should, on the whole, be close to the corresponding observed values y1, y2, ..., yn. The problem of finding the best-fitting straight line therefore reduces to choosing the values of a and b for which the estimates of y are as close as possible to the observed values. This can be done in different ways; according to the principle of least squares, however, the best-fitting equation is the one that minimises the sum of squared differences, S = Σ(yi − a − bxi)².
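As a minimal sketch of this minimisation, setting the partial derivatives of the sum of squared differences to zero gives the standard normal equations, which can be solved directly for a and b. The data points below are made up purely for illustration:

```python
# Fit y = a + b*x by least squares via the normal equations:
#   n*a  + b*sum(x)   = sum(y)
#   a*sum(x) + b*sum(x^2) = sum(x*y)
xs = [1.0, 2.0, 3.0, 4.0]        # illustrative data, not from the text
ys = [2.1, 4.1, 5.9, 8.2]
n = len(xs)

sx = sum(xs)
sy = sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# Solve the two normal equations for the slope b and intercept a.
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a = (sy - b * sx) / n

print(a, b)
```

The same closed-form solution is what library routines (e.g. a one-degree polynomial fit) compute internally for simple linear regression.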

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

The word regression is used to denote estimation or prediction of the average value of one variable for a specified value of the other variable. The estimation is done by means of suitable equations derived from the available bivariate data. Such an equation is known as a regression equation, and its geometrical representation is called a regression curve.

In linear regression the relationship between the variables is assumed to be linear. The estimate of y (say, y') is obtained from an equation of the form

y' − ȳ = byx (x − x̄), where byx = r (σy / σx) .........................................(1)

(x̄, ȳ denote the means and σx, σy the standard deviations of x and y)

and the estimate of x (say, x') from another equation of the form

x' − x̄ = bxy (y − ȳ), where bxy = r (σx / σy) .........................................(2)

Equation (1) is called the regression equation of y on x.

Equation (2) is called the regression equation of x on y.

byx is the regression coefficient of y on x.

bxy is the regression coefficient of x on y.

The geometrical representations of (1) and (2) are called regression lines.

Here r is the correlation coefficient between x and y.
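The two regression coefficients and r can be computed directly from sums of deviations about the means; a useful check is that the product of the two coefficients equals r². The data below are made up for illustration:

```python
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]   # illustrative data, not from the text
ys = [2.0, 3.0, 5.0, 4.0, 6.0]
n = len(xs)

mx = sum(xs) / n                 # mean of x
my = sum(ys) / n                 # mean of y
sxx = sum((x - mx) ** 2 for x in xs)                    # sum of squared deviations of x
syy = sum((y - my) ** 2 for y in ys)                    # sum of squared deviations of y
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # sum of cross-deviations

r = sxy / math.sqrt(sxx * syy)   # correlation coefficient
byx = sxy / sxx                  # regression coefficient of y on x
bxy = sxy / syy                  # regression coefficient of x on y

# Both regression lines pass through the point of means (mx, my),
# and byx * bxy == r**2.
print(r, byx, bxy)
```

Note that the two regression lines coincide only when r = ±1; otherwise they intersect at the point of means (x̄, ȳ).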

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------


