You cannot say it because it is not true.
First of all, correlation simply states that two variables change together in a systematic way: consistent changes in one variable are accompanied by consistent changes in the other. Correlation alone gives no way to determine whether (1) changes in the first variable cause changes in the second, (2) changes in the second cause changes in the first, (3) each causes the other, or (4) a third, hidden variable drives both.
A simplistic example from economics will illustrate the first three. Capital investment (spending on machinery, for example) by a company and the company's profits are positively correlated. But the direction of the causal relationship is not simple to establish. A company needs to be profitable before it can raise the money to invest. On the other hand, by investing well, it becomes more competitive and so is more profitable.
As an example of the fourth type, in the UK there is a significant correlation between the sales of ice cream and swimming accidents. This is not because ice cream causes swimming accidents, nor because swimming accidents cause ice cream sales. The hidden variable is hot weather: in hot weather people are more likely to eat ice cream, and they are also more likely to go to beaches and swim.
The converse of the statement in the question is also untrue: the absence of correlation does not prove that there is no causation. Suppose you have a variable X which takes values symmetrically on the interval (-a, a) for some positive number a, and let Y = X^2. There is clearly a perfect relationship between the two variables. However, because the X-values are symmetric about 0, the symmetry of the relationship ensures that the correlation coefficient is 0! No correlation, but a perfect relationship.
Correlation of data means that two different variables are linked in some way. This might be positive correlation, which means one goes up as the other goes up (for instance, people who are heavier tend to be taller), or negative correlation, which means one goes up as the other goes down (for instance, people who are older tend to play video games less often). Correlation just means a link: knowing one variable (a person is really tall) is enough to make a guess at the other one (that person is probably also pretty heavy).

Note that there is a very common mistake people make about correlation, and it needs to be addressed. In short, the mistake is "correlation implies causation". It doesn't. If I have data showing that people who volunteer more often tend to be happier, I cannot then say "Volunteer! It makes you happy!", because correlation doesn't imply causation. It might be that if you're happy you're more likely to volunteer, and the causation runs the other way around. Or it might be that if you're rich, you're both more likely to be happy and more likely to volunteer, so the data is driven by a different variable entirely.
It's not only economists who offer this warning; it applies anywhere correlation coefficients are interpreted. Let me offer an example from psychology. In many populations there's a significant correlation between the shoe sizes of people and their intelligence quotients. But no one would say that increasing a person's shoe size would increase their intelligence!
A relationship between variables
If by "positive" you mean that an increase in the independent variable is accompanied by an increase in the dependent variable, then this will be indicated by a correlation coefficient close to one. What counts as "close to one" depends on the field of study. In some fields, where it can be quite difficult to establish relationships between variables, a correlation of, say, 0.35 might be considered important, provided of course that it has been shown to be statistically significant.
There is not enough information to say much. To start with, the correlation may not be significant. Furthermore, a linear relationship may not be an appropriate model. If you assume that a linear model is appropriate and if you assume that there is evidence to indicate that the correlation is significant (by this time you might as well assume anything you want!) then you could say that the dependent variable decreases by 0.13 units for every unit change in the independent variable - within the range of the independent variable.