-2
I assume you mean.....
Y = - X - 3
Y = X - 1
==============substitute (set the two expressions for Y equal)
- X - 3 = X - 1
- 2X = 2
X = - 1
==============back-substitute into Y = X - 1
Y = - 1 - 1
Y = - 2
==============so the system is consistent, with the single solution (X, Y) = (-1, -2).
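If you want to check that with code, here is a minimal C sketch (the variable names, the sample slopes/intercepts and the exact-equality test are my own, for illustration) that solves any pair of lines given in slope-intercept form by setting the two expressions for y equal:

#include <stdio.h>

int main(void) {
    /* Two lines in slope-intercept form: y = m1*x + c1 and y = m2*x + c2.
       Here m1 = -1, c1 = -3 (Y = -X - 3) and m2 = 1, c2 = -1 (Y = X - 1). */
    double m1 = -1.0, c1 = -3.0;
    double m2 =  1.0, c2 = -1.0;

    if (m1 == m2) {
        /* Equal slopes: parallel lines, so no unique intersection. */
        printf("No unique solution (parallel lines).\n");
        return 0;
    }

    /* m1*x + c1 = m2*x + c2  =>  x = (c2 - c1) / (m1 - m2). */
    double x = (c2 - c1) / (m1 - m2);
    double y = m1 * x + c1;

    printf("x = %g, y = %g\n", x, y);   /* prints x = -1, y = -2 */
    return 0;
}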
What do you mean by "compute"? Do you want to graph it? Factor it? Calculate its function given a set of points that lie on it?

If you're looking to compute the function given three points that fall on the parabola, then I have just the code for you. If you're given three points (x1, y1), (x2, y2) and (x3, y3), then you can compute the coefficients of your quadratic equation like this:

a = (y1 * (x2 - x3) + y2 * (x3 - x1) + y3 * (x1 - x2)) / (x1 * x1 * (x2 - x3) + x2 * x2 * (x3 - x1) + x3 * x3 * (x1 - x2));
b = (y1 - y2) / (x1 - x2) - a * (x1 + x2);
c = y1 - (x1 * x1) * a - x1 * b;

You can now calculate the y co-ordinate of any point given its x co-ordinate by saying:

y = a * x * x + b * x + c;
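Wrapped up as a complete, runnable C program (the function name fit_parabola and the sample points are my own illustrative choices, not part of the original answer), those formulas look like this:

#include <stdio.h>

/* Coefficients of y = a*x^2 + b*x + c through three given points,
   using the formulas quoted above. */
static void fit_parabola(double x1, double y1, double x2, double y2,
                         double x3, double y3,
                         double *a, double *b, double *c) {
    *a = (y1 * (x2 - x3) + y2 * (x3 - x1) + y3 * (x1 - x2)) /
         (x1 * x1 * (x2 - x3) + x2 * x2 * (x3 - x1) + x3 * x3 * (x1 - x2));
    *b = (y1 - y2) / (x1 - x2) - *a * (x1 + x2);
    *c = y1 - (x1 * x1) * (*a) - x1 * (*b);
}

int main(void) {
    double a, b, c;

    /* Three points taken from y = 2x^2 - 3x + 1: (0,1), (1,0), (2,3). */
    fit_parabola(0.0, 1.0, 1.0, 0.0, 2.0, 3.0, &a, &b, &c);
    printf("a = %g, b = %g, c = %g\n", a, b, c);   /* a = 2, b = -3, c = 1 */

    /* Evaluate the fitted parabola at x = 4. */
    double x = 4.0;
    printf("y(4) = %g\n", a * x * x + b * x + c);  /* 2*16 - 3*4 + 1 = 21 */
    return 0;
}

Note that the formulas assume the three x values are distinct; otherwise the denominators above are zero.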
y(i) = a + b1*x1(i) + b2*x2(i) + b3*x3(i) + ... + bk*xk(i) + e(i)

where i = 1, 2, ..., n indexes the n observations of the independent variables x1, x2, ..., xk; y is the dependent variable; a and the b's are regression parameters; and the e(i) are independent, identically distributed random variables (representing the error).
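To make this concrete, here is a minimal C sketch of fitting such a model with k = 2 independent variables by ordinary least squares: it accumulates the sums that appear in the normal equations and solves the resulting 3x3 system with Cramer's rule. The data and variable names are invented for the example; real work would normally use a statistics library rather than hand-rolled Cramer's rule.

#include <stdio.h>

/* Determinant of a 3x3 matrix. */
static double det3(double m[3][3]) {
    return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
         - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
         + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
}

int main(void) {
    /* Illustrative data generated from y = 1 + 2*x1 + 3*x2 with no noise,
       so the fitted coefficients should come back as a = 1, b1 = 2, b2 = 3. */
    double x1[] = {0, 1, 2, 3, 4, 5};
    double x2[] = {1, 0, 2, 1, 3, 2};
    double y[]  = {4, 3, 11, 10, 18, 17};
    int n = 6;

    /* Accumulate the sums that appear in the normal equations. */
    double Sx1 = 0, Sx2 = 0, Sy = 0, S11 = 0, S22 = 0, S12 = 0, S1y = 0, S2y = 0;
    for (int i = 0; i < n; i++) {
        Sx1 += x1[i];          Sx2 += x2[i];          Sy  += y[i];
        S11 += x1[i] * x1[i];  S22 += x2[i] * x2[i];  S12 += x1[i] * x2[i];
        S1y += x1[i] * y[i];   S2y += x2[i] * y[i];
    }

    /* Normal equations A * (a, b1, b2)^T = r, solved by Cramer's rule. */
    double A[3][3] = {{(double)n, Sx1, Sx2}, {Sx1, S11, S12}, {Sx2, S12, S22}};
    double r[3] = {Sy, S1y, S2y};
    double D = det3(A);

    double coef[3];
    for (int j = 0; j < 3; j++) {
        double Aj[3][3];
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 3; col++)
                Aj[row][col] = (col == j) ? r[row] : A[row][col];
        coef[j] = det3(Aj) / D;
    }

    printf("a = %g, b1 = %g, b2 = %g\n", coef[0], coef[1], coef[2]);
    return 0;
}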
You need two variables. Make them x and y: make y your dependent variable and x your independent variable.

y = x^1 is a line
y = 4x^1 is a line
y = 4x^1 + 3 is a line
y = x^2 is a curve
y = x^3 is a curve

Make sure your x value has no power to it except 1, i.e. x = x^1.
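One quick numerical way to see the difference (this check is my own illustration, not part of the answer above): for equally spaced x values, a linear function such as y = 4x + 3 has constant first differences, while y = x^2 does not. A small C sketch:

#include <stdio.h>

int main(void) {
    for (int x = 0; x < 5; x++) {
        /* First differences y(x+1) - y(x) for y = 4x + 3 and for y = x^2. */
        double diff_line  = (4.0 * (x + 1) + 3.0) - (4.0 * x + 3.0);
        double diff_curve = (double)(x + 1) * (x + 1) - (double)x * x;
        printf("x=%d  diff(4x+3)=%g  diff(x^2)=%g\n", x, diff_line, diff_curve);
    }
    /* diff(4x+3) is always 4; diff(x^2) keeps growing (1, 3, 5, 7, 9). */
    return 0;
}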
This is the actual question: Suppose X1, X2, X3, ..., Xn form a random sample from a population with density function f(x; y) = 1/y for 0 < x < y, where y > 0 is an unknown parameter. Let T = max(X1, X2, ..., Xn). Show that the estimator (1 + 1/n)T is an unbiased estimator of y.
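For reference, here is a sketch of the standard calculation (assuming, as the density f(x; y) = 1/y on 0 < x < y says, that the X_i are independent and uniform on (0, y)):

% CDF of T = max(X_1, ..., X_n):  P(T <= t) = (t/y)^n for 0 <= t <= y,
% so the density of T is f_T(t) = n t^{n-1} / y^n.
\begin{align*}
E[T] &= \int_0^y t \cdot \frac{n t^{n-1}}{y^n} \, dt
      = \frac{n}{y^n} \cdot \frac{y^{n+1}}{n+1}
      = \frac{n}{n+1}\, y, \\
E\left[\left(1 + \tfrac{1}{n}\right) T\right]
     &= \frac{n+1}{n} \cdot \frac{n}{n+1}\, y
      = y,
\end{align*}
% which is exactly the unbiasedness claim.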
Multiple correlation: Suppose you calculate the linear regression of a single dependent variable on more than one independent variable, and that you include a mean in the linear model. The multiple correlation is analogous to the statistic that is obtainable from a linear model that includes just one independent variable. It measures the degree to which the linear model given by the regression is valuable as a predictor of the dependent variable. For calculation details you might wish to see the Wikipedia article for this statistic.

Partial correlation: Let's say you have a dependent variable Y and a collection of independent variables X1, X2, X3. You might for some reason be interested in the partial correlation of Y and X3. Then you would calculate the linear regression of Y on just X1 and X2. Knowing the coefficients of this linear model, you would calculate the so-called residuals: the parts of Y unaccounted for by the model, or in other words the differences between the Y's and the values given by b1X1 + b2X2, where b1 and b2 are the model coefficients from the regression. Now you would calculate the correlation between these residuals and the X3 values to obtain the partial correlation of X3 with Y given X1 and X2.

Intuitively, we use the first regression and residual calculation to account for the explanatory power of X1 and X2. Having done that, we calculate the correlation coefficient to learn whether any more explanatory power is left for X3 to 'mop up'.
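As a simplified sketch of that residual recipe in C (stripped down to a single control variable X1, so the first step is a simple rather than multiple regression; the names and data are invented for the illustration): regress Y on X1, form the residuals, then correlate the residuals with a second variable X2.

#include <stdio.h>
#include <math.h>

#define N 8

/* Pearson correlation of two arrays of length n. */
static double pearson(const double *u, const double *v, int n) {
    double su = 0, sv = 0, suu = 0, svv = 0, suv = 0;
    for (int i = 0; i < n; i++) {
        su += u[i]; sv += v[i];
        suu += u[i] * u[i]; svv += v[i] * v[i]; suv += u[i] * v[i];
    }
    double cov = suv - su * sv / n;
    return cov / sqrt((suu - su * su / n) * (svv - sv * sv / n));
}

int main(void) {
    /* Invented example data. */
    double x1[N] = {1, 2, 3, 4, 5, 6, 7, 8};
    double x2[N] = {2, 1, 4, 3, 6, 5, 8, 7};
    double y[N]  = {3.1, 4.9, 7.2, 8.8, 11.1, 12.9, 15.2, 16.8};

    /* Step 1: regression of Y on X1 with a mean, y ~ a + b*x1. */
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int i = 0; i < N; i++) {
        sx += x1[i]; sy += y[i]; sxx += x1[i] * x1[i]; sxy += x1[i] * y[i];
    }
    double b = (N * sxy - sx * sy) / (N * sxx - sx * sx);
    double a = (sy - b * sx) / N;

    /* Step 2: residuals, i.e. the part of Y not accounted for by X1. */
    double resid[N];
    for (int i = 0; i < N; i++)
        resid[i] = y[i] - (a + b * x1[i]);

    /* Step 3: correlate the residuals with X2 to see whether X2 can
       'mop up' any explanatory power that X1 left behind. */
    printf("correlation(residuals, x2) = %g\n", pearson(resid, x2, N));
    return 0;
}

With two control variables, step 1 would instead be a two-predictor regression (like the normal-equations sketch shown earlier), and the residuals would be Y minus those fitted values.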