Best Answer

It gives a measure of the extent to which values of the dependent variable move with values of the independent variables, which lets you judge whether or not the model has any useful predictive power (significance). It also estimates the expected change in the value of the dependent variable that accompanies a given change in an independent variable.

A regression model cannot offer an explanation. The fact that two variables move together does not mean that changes in one cause changes in the other. Furthermore, it is possible for very closely related variables to show no correlation at all because of a wrongly specified model. For example, a LINEAR model fitted to y = x² over a range symmetric about zero will show zero correlation!
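A quick sketch of that last claim: over a range symmetric about zero, the linear (Pearson) correlation between x and y = x² comes out exactly 0, even though y is a deterministic function of x.

```python
# Pearson correlation computed from first principles (no external libraries).
def pearson(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

xs = list(range(-5, 6))   # symmetric range -5 .. 5
ys = [x ** 2 for x in xs] # y = x^2: perfectly related, yet linearly uncorrelated
print(pearson(xs, ys))    # 0.0
```

The positive and negative halves of the range cancel exactly in the covariance, which is why a linear model sees "no relationship" here.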

Wiki User

14y ago
Q: What does a regression model predict about the dependent variable?

Continue Learning about Math & Arithmetic

What is the difference between the logistic regression and regular regression?

In an ordinary (linear) regression model the dependent variable is continuous and is modelled as a linear function of the explanatory variables. In a logistic regression model the response variable is categorical (typically binary), and the relationship between the response probability and the explanatory variables is non-linear: the linear predictor is passed through the logit link.
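A minimal sketch of the contrast, using hypothetical fitted coefficients b0 and b1 (not from any real data set): the linear model returns an unbounded value, while the logistic model passes the same linear index through the sigmoid (inverse logit) to get a probability.

```python
import math

# Hypothetical coefficients, for illustration only.
b0, b1 = -1.0, 0.5

def linear_prediction(x):
    # Linear regression: prediction can be any real number,
    # which suits a continuous response.
    return b0 + b1 * x

def logistic_prediction(x):
    # Logistic regression: the linear index is squashed by the sigmoid,
    # giving a probability in (0, 1) for a binary response.
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

print(linear_prediction(10))    # 4.0 -- unbounded
print(logistic_prediction(10))  # roughly 0.98 -- always between 0 and 1
```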


What is Mincer- zarnowitz regression?

A Mincer-Zarnowitz regression regresses the actual values of a variable (the dependent variable, y) on its fitted or forecast counterpart. It is often used to assess the accuracy and unbiasedness of a model's forecasts.


In the regression model the predictor variable is useful in predicting the response variable provided that B1 equals 0?

Not useful. If the slope coefficient B1 equals 0, the fitted model ignores the predictor entirely and returns the same value for every observation.
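A tiny sketch of why a zero slope makes the predictor useless (the intercept value here is arbitrary):

```python
# With b1 = 0 the fitted model collapses to a constant.
b0, b1 = 7.5, 0.0

def predict(x):
    return b0 + b1 * x

# Every x yields the same prediction, so x carries no information about y.
print(predict(-100), predict(0), predict(100))  # 7.5 7.5 7.5
```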


Can independent variables be changed?

Yes. In fact, in multiple regression that is often part of the analysis: you add or remove independent variables from the model so as to get the best fit between the observed values of the dependent variable and the values the model predicts from the given set of independent variables.
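One common way to judge whether adding a variable helps is to compare the residual sum of squares (RSS) of the model with and without it. This is a sketch on synthetic data (the variable names and data are made up): x2 is deliberately irrelevant, so adding it barely reduces the RSS.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                              # irrelevant by construction
y = 2.0 + 1.5 * x1 + rng.normal(scale=0.5, size=n)   # y depends only on x1

def rss(X, y):
    # Ordinary least squares fit, then residual sum of squares.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

ones = np.ones(n)
rss_base = rss(np.column_stack([ones, x1]), y)       # model without x2
rss_full = rss(np.column_stack([ones, x1, x2]), y)   # model with x2
print(rss_base, rss_full)  # adding x2 reduces the RSS only trivially
```

Adding a variable can never increase the RSS, so the question is always whether the reduction is large enough to justify the extra term.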


What is multiple and partial correlation?

Multiple correlation: suppose you calculate the linear regression of a single dependent variable on more than one independent variable, and that you include an intercept in the linear model. The multiple correlation is analogous to the statistic obtainable from a linear model that includes just one independent variable: it measures the degree to which the linear model given by the regression is valuable as a predictor of the dependent variable. For calculation details you might wish to see the Wikipedia article on this statistic.

Partial correlation: say you have a dependent variable Y and a collection of independent variables X1, X2, X3, and you are interested in the partial correlation of Y and X3. You would calculate the linear regression of Y on just X1 and X2. Knowing the coefficients of this model, you would calculate the residuals: the parts of Y unaccounted for by the model, i.e. the differences between the Y values and the values given by b1X1 + b2X2, where b1 and b2 are the model coefficients from the regression. You would then calculate the correlation between these residuals and the X3 values to obtain the partial correlation of X3 with Y given X1 and X2.

Intuitively, the first regression and the residual calculation account for the explanatory power of X1 and X2. The final correlation coefficient then tells you whether any explanatory power is left for X3 to 'mop up'.
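The recipe above can be sketched directly on synthetic data (coefficients and sample size are arbitrary): regress Y on X1 and X2, keep the residuals, then correlate them with X3.

```python
import numpy as np

# Synthetic data in which X3 genuinely contributes to Y.
rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + 0.8 * x3 + rng.normal(size=n)

# Step 1: regress Y on just X1 and X2 (with an intercept).
X12 = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X12, y, rcond=None)

# Step 2: residuals -- the part of Y unaccounted for by X1 and X2.
resid_y = y - X12 @ beta

# Step 3: correlate the residuals with X3.
partial = np.corrcoef(resid_y, x3)[0, 1]
print(partial)  # clearly positive: X3 has explanatory power left to 'mop up'
```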

Related questions

What is the difference between simple and multiple linear regression?

Simple linear regression models the dependent variable as a function of a single independent variable; multiple linear regression uses two or more independent variables. For example, predicting YardsAllowed from Takeaways alone is a simple regression; adding further predictors would make it a multiple regression.


What is binary logistic regression?

Binary logistic regression is a statistical method used to model the relationship between a categorical dependent variable with two levels and one or more independent variables. It estimates the probability that an observation belongs to one of the two categories based on the values of the independent variables. The output is in the form of odds ratios, which describe the influence of the independent variables on the probability of the outcome.


which characteristics of a data set makes a linear regression model unreasonable?

A correlation coefficient close to 0 makes a linear regression model unreasonable, because if the correlation between the two variables is close to zero, we cannot expect one variable to explain the variation in the other.


How is linear regression used?

Linear regression is used in statistics to model a continuous dependent variable as a function of one or more explanatory variables. It has applications in finance, economics and environmental science.


What Is a Logistic Regression Algorithm?

Logistic regression is a statistical analysis method that predicts a binary outcome, such as yes or no, from real-world data. A logistic regression model forecasts a binary dependent variable by examining the relationship between it and one or more existing independent variables.


What is stochastic error term?

A stochastic error term is a term added to a regression equation to capture all of the variation in Y that cannot be explained by the included Xs. It is, in effect, a symbol of the econometrician's ignorance of, or inability to model, all of the movements of the dependent variable.
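Concretely, for each observation the error is what is left of Y after subtracting the systematic part of the equation. A tiny sketch with made-up data and hypothetical coefficients:

```python
# Hypothetical fitted equation: y = b0 + b1*x + error
b0, b1 = 2.0, 3.0
x = [1.0, 2.0, 3.0]
y = [5.4, 7.9, 11.2]   # made-up observed values

# The stochastic error is the part of y the included x cannot explain.
errors = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
print([round(e, 1) for e in errors])  # [0.4, -0.1, 0.2]
```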


What does variable held constant mean?

The distinction between these two types of variables is whether or not the variable regresses on another variable. In a linear regression the dependent variable (DV) regresses on the independent variable (IV), meaning that the DV is predicted by the IV. Within SEM modelling this means that an exogenous variable is one that another variable regresses on. Exogenous variables can be recognised in a graphical version of the model as the variables sending out arrowheads, denoting which variables they predict. A variable that regresses on another variable is always an endogenous variable, even if this same variable is itself regressed on by other variables.

