Q: Linear regression model

Best Answer

y(i) = a + b1·x1(i) + b2·x2(i) + b3·x3(i) + ... + bk·xk(i) + e(i), for i = 1, 2, ..., n,

where the n observations are indexed by i, x1, x2, ..., xk are the independent variables, y is the dependent variable, and a and b1, ..., bk are the regression parameters.

The e(i) are independent, identically distributed random variables representing the error.


Wiki User

8y ago
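For illustration, here is a minimal sketch of fitting this model by ordinary least squares, assuming Python with NumPy and simulated data (the variable names, sample size and true parameter values are made up for the example):

```python
import numpy as np

# Simulated data: n observations of k = 3 independent variables.
rng = np.random.default_rng(0)
n, k = 100, 3
X = rng.normal(size=(n, k))                  # columns play the role of x1, x2, x3
a_true, b_true = 2.0, np.array([1.5, -0.7, 0.3])
e = rng.normal(scale=0.5, size=n)            # i.i.d. error terms e(i)
y = a_true + X @ b_true + e                  # y(i) = a + b1*x1(i) + ... + e(i)

# Least-squares estimates of the regression parameters a, b1, ..., bk.
X_design = np.column_stack([np.ones(n), X])  # column of ones for the intercept a
coef, *_ = np.linalg.lstsq(X_design, y, rcond=None)
print("estimated a:", coef[0])
print("estimated b:", coef[1:])
```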

Continue Learning about Math & Arithmetic

What is the difference between the logistic regression and regular regression?

In an ordinary (linear) regression model the dependent variable is continuous and is modelled as a linear function of the explanatory variables. In a logistic regression model the response variable is categorical (typically binary), and its relationship with the explanatory variables is non-linear: the model is linear only on the log-odds (logit) scale.
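As an illustration of the difference, here is a minimal sketch, assuming Python with NumPy and scikit-learn and simulated data (the cut-off at x = 5 and all parameter values are made up for the example):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=(200, 1))

# Ordinary regression: continuous response, modelled as linear in x.
y_cont = 3.0 + 2.0 * x[:, 0] + rng.normal(scale=1.0, size=200)
lin = LinearRegression().fit(x, y_cont)
print("linear model prediction at x=5:", lin.predict([[5.0]]))

# Logistic regression: categorical (0/1) response, non-linear in x
# because the model is linear only on the log-odds scale.
p = 1.0 / (1.0 + np.exp(-(x[:, 0] - 5.0)))   # true probability of class 1
y_cat = rng.binomial(1, p)
log = LogisticRegression().fit(x, y_cat)
print("P(class 1 | x=5):", log.predict_proba([[5.0]])[0, 1])
```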


What are some of the advantages and disadvantages of making forecasts using regression methods?

+ Linear regression is a simple statistical procedure and so is easy to carry out.
+ Some non-linear relationships can be converted to linear relationships using simple transformations (see the sketch below).
- The error structure may not be suitable for regression (the errors may not be independent and identically distributed).
- The regression model used may not be appropriate, or an important variable may have been omitted.
- The residual error may be too large.
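As a sketch of the second point (converting a non-linear relationship to a linear one with a transformation), assuming Python with NumPy and a simulated exponential relationship:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(1, 10, size=100)

# An exponential relationship y = c * exp(b*x) * noise is non-linear in x,
# but taking logs gives log(y) = log(c) + b*x + error, which is linear.
y = 2.0 * np.exp(0.4 * x) * np.exp(rng.normal(scale=0.1, size=100))

X_design = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X_design, np.log(y), rcond=None)
print("estimated log(c):", coef[0], "estimated b:", coef[1])
```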


What can you conclude if the global test of regression does not reject the null hypothesis?

You can conclude that there is not enough evidence to reject the null hypothesis, or that your model was incorrectly specified. Consider the exact equation y = x^2. A regression of y against x over a symmetric range (-a < x < a) will give a slope coefficient of 0, not because there is no relationship between y and x but because the relationship is not linear: the model is wrong. Do a regression of y against x^2 and you will get a perfect fit.
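A minimal sketch of this example, assuming Python with NumPy (the range -3 < x < 3 is an arbitrary choice):

```python
import numpy as np

# Symmetric design: -a < x < a, and y = x**2 exactly (no noise).
x = np.linspace(-3, 3, 101)
y = x ** 2

# Regressing y on x: the slope is essentially zero even though y depends on x.
slope_x = np.polyfit(x, y, 1)[0]

# Regressing y on x**2: a perfect fit.
slope_x2 = np.polyfit(x ** 2, y, 1)[0]

print("slope of y on x:  ", slope_x)    # ~0: the linear model is wrong
print("slope of y on x^2:", slope_x2)   # 1: the correct model fits perfectly
```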


What is multiple and partial correlation?

Multiple correlation: Suppose you calculate the linear regression of a single dependent variable on more than one independent variable, including an intercept (mean) in the linear model. The multiple correlation is analogous to the correlation statistic obtainable from a linear model with just one independent variable. It measures the degree to which the linear model given by the regression is valuable as a predictor of the dependent variable. For calculation details you may wish to see the Wikipedia article on this statistic.

Partial correlation: Say you have a dependent variable Y and a collection of independent variables X1, X2, X3, and you are interested in the partial correlation of Y and X3. You would calculate the linear regression of Y on just X1 and X2. Knowing the coefficients of this model, you would calculate the residuals, that is, the parts of Y unaccounted for by the model: the differences between the Y values and the fitted values b1X1 + b2X2, where b1 and b2 are the coefficients from that regression. You would then calculate the correlation between these residuals and the X3 values to obtain the partial correlation of X3 with Y given X1 and X2. Intuitively, the first regression and the residual calculation account for the explanatory power of X1 and X2; the final correlation coefficient shows whether any explanatory power is left for X3 to 'mop up'.
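A minimal sketch of the partial-correlation recipe described above, assuming Python with NumPy and simulated data. It follows the description literally by correlating the Y residuals with the raw X3 values (a textbook partial correlation would also residualise X3 on X1 and X2):

```python
import numpy as np

def residuals(y, X):
    """Residuals from the least-squares regression of y on X (with intercept)."""
    X_design = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X_design, y, rcond=None)
    return y - X_design @ coef

rng = np.random.default_rng(3)
n = 500
X1, X2, X3 = rng.normal(size=(3, n))
Y = 1.0 + 2.0 * X1 - 1.0 * X2 + 0.5 * X3 + rng.normal(scale=0.5, size=n)

# Regress Y on X1 and X2 only, then correlate what is left over with X3.
res_Y = residuals(Y, np.column_stack([X1, X2]))
partial = np.corrcoef(res_Y, X3)[0, 1]
print("partial correlation of Y and X3 given X1, X2:", partial)
```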


What is the difference between classical regression analysis and spatial regression analysis?

How can a regression model approach be useful in the lean construction concept for the mass production of houses?

Related questions

How is linear regression used?

Linear regression can be used in statistics to build a model relating a dependent scalar variable to one or more explanatory variables. Linear regression has applications in finance, economics and environmental science.


What is the difference between simple and multiple linear regression?

I want to develop a regression model for predicting YardsAllowed as a function of Takeaways, and I need to explain the statistical significance of the model.


Where ridge regression is used?

Ridge regression is used in linear regression to deal with multicollinearity. It reduces the mean squared error (MSE) of the coefficient estimates in exchange for introducing some bias.
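A minimal sketch, assuming Python with NumPy and scikit-learn, using two nearly collinear simulated predictors (the alpha value is an arbitrary choice):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(4)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)     # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 3.0 + 1.0 * x1 + 1.0 * x2 + rng.normal(scale=0.5, size=n)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)           # alpha controls the amount of shrinkage (bias)

print("OLS coefficients:  ", ols.coef_)      # typically unstable / inflated
print("Ridge coefficients:", ridge.coef_)    # shrunk, lower variance
```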


What is true about the y-intercept in the linear regression model?

Its value depends on the slope of the line: in a least-squares fit the intercept equals the mean of y minus the slope times the mean of x, so the fitted line passes through the point of means.




Is it true that the y-intercept in the linear regression model is always 0?

No. It could take any value.


Why are your predictions inaccurate using a linear regression model?

There are many possible reasons. Here are some of the more common ones: the underlying relationship may not be linear; the regression may have very poor predictive power (regression coefficient close to zero); the errors may not be independent, identically and normally distributed; outliers may be distorting the regression; or there may be a calculation error.


What is the purpose of a residual analysis in simple linear regression?

One of the main reasons is to check that the assumption that the errors are independent and identically distributed holds. If it does not, then simple linear regression is not an appropriate model.
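A minimal sketch of a residual check, assuming Python with NumPy and simulated data (the lag-1 autocorrelation is just one crude way to probe the independence assumption):

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 10, 100)
y = 1.0 + 0.5 * x + rng.normal(scale=0.3, size=100)

# Fit the simple linear regression and compute the residuals.
slope, intercept = np.polyfit(x, y, 1)
resid = y - (intercept + slope * x)

# Crude checks of the i.i.d. error assumption:
print("mean of residuals:", resid.mean())                                   # should be ~0
print("lag-1 autocorrelation:", np.corrcoef(resid[:-1], resid[1:])[0, 1])   # ~0 if independent
# A plot of residuals against x (or against fitted values) should show no pattern;
# curvature or a funnel shape suggests the simple linear model is not appropriate.
```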


Which characteristic of a data set makes a linear regression model unreasonable?

A correlation coefficient close to 0 makes a linear regression model unreasonable, because if the correlation between the two variables is close to zero we cannot expect one variable to explain the variation in the other.
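A minimal sketch, assuming Python with NumPy and two simulated, unrelated variables:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=200)
y_unrelated = rng.normal(size=200)            # no (linear) relationship with x

r = np.corrcoef(x, y_unrelated)[0, 1]
print("correlation coefficient r:", r)        # close to 0
print("r squared:", r ** 2)                   # fraction of variance a linear model would explain
```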


What has the author George Portides written?

George Portides has written: 'Robust regression with application to generalized linear model'


What has the author O A Sankoh written?

O. A. Sankoh has written: 'Influential observations in the linear regression model and Trenkler's iteration estimator' -- subject(s): Regression analysis, Estimation theory

