Q: Does a strong correlation indicate a cause-and-effect relationship between variables?
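
The short answer is no. As the correlation-and-regression answer further down notes, correlation does not indicate whether X causes Y or Y causes X; a strong correlation can arise simply because both variables depend on a third, hidden factor. The snippet below is a minimal sketch of that situation (plain Python, assuming NumPy is available; all variable names are illustrative): x and y are each driven by a hidden variable z, so they come out strongly correlated even though neither causes the other.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hidden common cause (confounder): neither x nor y causes the other,
    # but both are driven by z.
    z = rng.normal(size=1000)
    x = z + 0.3 * rng.normal(size=1000)
    y = z + 0.3 * rng.normal(size=1000)

    r = np.corrcoef(x, y)[0, 1]
    print(f"Pearson r between x and y: {r:.2f}")  # strongly positive, around 0.9
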
Continue Learning about Math & Arithmetic

If doubling the manipulating variable results in a doubling of the responding variable, the relationship between the variables is a what?

This would indicate that the responding variable is directly proportional to the manipulating variable: a linear relationship of the form y = kx, whose graph is a straight line through the origin.
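
A minimal numeric sketch of that statement (plain Python; the constant k and the value of x are arbitrary illustrations): if the responding variable is y = k * x, then doubling the manipulating variable doubles the responding variable.

    k = 3.0        # arbitrary constant of proportionality
    x = 5.0        # manipulating (independent) variable
    y = k * x      # responding (dependent) variable

    # Doubling the manipulating variable doubles the responding variable.
    print(k * (2 * x) == 2 * y)   # True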


What does a correlation coefficient of 1.1 mean?

A correlation coefficient of 1.1 is not possible, because correlation coefficients range from -1 to 1. If you meant 1.0, that would indicate a perfect positive linear relationship between two variables: as one variable increases, the other increases at a constant rate, with every data point lying exactly on an upward-sloping straight line.
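
To make the 1.0 case concrete, here is a small sketch (assuming NumPy is available; the data are invented): points that lie exactly on an upward-sloping straight line give a Pearson coefficient of 1.0, and no data set can push the coefficient outside the interval from -1 to 1.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = 2.5 * x + 1.0   # every point lies exactly on a rising straight line

    r = np.corrcoef(x, y)[0, 1]
    print(r)            # 1.0 (up to floating-point rounding)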


When r is close to -1, does it indicate a weak negative correlation?

No, it indicates an extremely strong negative correlation. The sign of r gives the direction of the relationship, and its closeness to -1 or +1 gives the strength.
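
A quick sketch of the corrected statement (assuming NumPy is available; the data are invented): a variable that falls steadily as the other rises gives an r near -1, i.e. a strong negative correlation.

    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0, 10, 50)
    y = -3.0 * x + rng.normal(scale=0.5, size=50)   # falls as x rises, small noise

    r = np.corrcoef(x, y)[0, 1]
    print(f"r = {r:.3f}")   # close to -1: a strong negative correlation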


What does dispersion indicate about the data?

Dispersion indicates how spread out the data values are around a central value such as the mean; measures like the range, variance, and standard deviation quantify it. In a scatter plot, small dispersion about the line of best fit corresponds to a strong correlation, and large dispersion to a weak one.
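
As a concrete sketch (assuming NumPy is available; both samples are invented), here are two data sets with the same mean but very different dispersion; the standard deviation and range quantify how spread out the values are.

    import numpy as np

    tight = np.array([49.0, 50.0, 50.0, 51.0])    # low dispersion
    spread = np.array([20.0, 45.0, 55.0, 80.0])   # high dispersion

    for name, data in [("tight", tight), ("spread", spread)]:
        print(name,
              "mean =", data.mean(),
              "std =", round(float(data.std()), 2),
              "range =", data.max() - data.min())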


What is the difference between correlation and regression?

I've included links to both these terms; definitions from those links are given below. Correlation and regression are frequently misunderstood terms. Correlation suggests or indicates that a linear relationship may exist between two random variables, but it does not indicate whether X causes Y or Y causes X. In regression, we make the assumption that X, as the independent variable, can be related to Y, the dependent variable, and that an equation describing this relationship is useful.

Definitions from Wikipedia:

In probability theory and statistics, correlation (often measured as a correlation coefficient) indicates the strength and direction of a linear relationship between two random variables.

In statistics, regression analysis refers to techniques for the modeling and analysis of numerical data consisting of values of a dependent variable (also called a response variable) and of one or more independent variables (also known as explanatory variables or predictors). The dependent variable in the regression equation is modeled as a function of the independent variables, the corresponding parameters ("constants"), and an error term. The error term is treated as a random variable; it represents unexplained variation in the dependent variable. The parameters are estimated so as to give a "best fit" to the data. Most commonly the best fit is evaluated using the least squares method, but other criteria have also been used.
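
A compact sketch of the distinction (assuming NumPy is available; the data are invented): the correlation coefficient summarises the strength and direction of the linear association in a single number, while least-squares regression estimates an actual equation y = a*x + b that models the dependent variable as a function of the independent one.

    import numpy as np

    rng = np.random.default_rng(2)
    x = np.linspace(0, 10, 30)
    y = 4.0 * x + 2.0 + rng.normal(scale=3.0, size=30)   # linear trend plus noise

    # Correlation: one number giving strength and direction of the linear relationship.
    r = np.corrcoef(x, y)[0, 1]

    # Regression: least-squares estimates of slope a and intercept b in y = a*x + b.
    slope, intercept = np.polyfit(x, y, 1)

    print(f"correlation r = {r:.3f}")
    print(f"regression line: y = {slope:.2f} * x + {intercept:.2f}")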