The joint probability function for two variables is a probability function whose domain is a subset of two-dimensional space. The joint probability function for discrete random variables X and Y is given as
pr(x, y) = pr(X = x and Y = y). If X and Y are independent random variables, then this equals pr(X = x)*pr(Y = y).
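As a minimal sketch of the discrete case (the two fair dice here are my own made-up example, not part of the answer above), the joint probability function of independent variables can be checked against the product of the marginals:

```python
# Joint pmf of two independent fair dice, checked against the product
# of the marginal pmfs. Exact fractions avoid floating-point noise.
from fractions import Fraction

px = {x: Fraction(1, 6) for x in range(1, 7)}   # pmf of X
py = {y: Fraction(1, 6) for y in range(1, 7)}   # pmf of Y

# Joint pmf built directly: pr(X = x and Y = y) for every pair (x, y)
joint = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

# Independence means pr(x, y) = pr(X = x) * pr(Y = y) for every pair
assert all(joint[x, y] == px[x] * py[y] for (x, y) in joint)
print(joint[3, 5], px[3] * py[5])   # 1/36 1/36
```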
For continuous variables, the joint cumulative distribution function is defined analogously:
F(x, y) = pr(X ≤ x and Y ≤ y),
and the joint probability density function f(x, y) is obtained by differentiating F with respect to both x and y.
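A short sketch of the continuous case, assuming two independent standard normal variables so that the joint cumulative distribution function factors into a product of one-dimensional CDFs:

```python
# Joint CDF F(x, y) = pr(X <= x and Y <= y) for two independent
# standard normal variables: it factors as Phi(x) * Phi(y).
from scipy.stats import norm

def joint_cdf(x, y):
    return norm.cdf(x) * norm.cdf(y)

print(joint_cdf(0.0, 0.0))   # 0.25: each variable is below 0 with probability 1/2
```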
If f(x, y) is the joint probability distribution function of two random variables X and Y, then the sum (or integral) of f(x, y) over all possible values of y is the marginal probability function of X. The definition can be extended analogously to joint and marginal distribution functions of more than two variables.
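A small sketch of that marginalisation with a made-up joint probability table; summing the joint function over all values of y leaves the marginal function of X:

```python
# Made-up joint pmf over (x, y) pairs; the probabilities sum to 1.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Marginal pmf of X: sum f(x, y) over every possible y.
marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

print(marginal_x)   # roughly {0: 0.3, 1: 0.7}, up to floating-point rounding
```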
Let X and Y be two random variables.

Case (1) - Discrete case: if P(X = x) denotes the probability that the random variable X takes the value x, then the joint probability of X and Y is P(X = x and Y = y).

Case (2) - Continuous case: if P(a < X < b) is the probability of the random variable X taking a value in the real interval (a, b), then the joint probability of X and Y is P(a < X < b and c < Y < d).

Basically, joint probability is the probability of two events happening (or not).
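In the continuous case, P(a < X < b and c < Y < d) comes from integrating a joint density over the rectangle (a, b) x (c, d). Here is a sketch using a made-up density, two independent standard normals, which could be swapped for any valid joint density:

```python
# P(a < X < b and c < Y < d) as a double integral of a joint density f(x, y).
from scipy.integrate import dblquad
from scipy.stats import norm

a, b, c, d = -1.0, 1.0, 0.0, 2.0

# Example joint density: two independent standard normals, f(x, y) = phi(x)*phi(y).
# dblquad integrates func(y, x) with x over [a, b] and y over [c, d].
prob, _ = dblquad(lambda y, x: norm.pdf(x) * norm.pdf(y),
                  a, b, lambda x: c, lambda x: d)
print(prob)   # P(-1 < X < 1 and 0 < Y < 2)
```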
The marginal probability distribution function.
The probability mass function is used to characterize the distribution of discrete random variables, while the probability density function is used to characterize the distribution of absolutely continuous random variables. You might want to read more about this at www.statlect.com/prbdst1.htm.
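A brief sketch of that contrast, using a binomial and a standard normal from scipy.stats as stand-in examples:

```python
# A pmf assigns probability to individual values of a discrete variable;
# a pdf is a density and only yields probabilities when integrated.
from scipy.stats import binom, norm

print(binom.pmf(3, 10, 0.5))           # P(X = 3) for X ~ Binomial(n=10, p=0.5)
print(norm.pdf(0.0))                   # density of N(0, 1) at 0 -- not a probability
print(norm.cdf(1.0) - norm.cdf(-1.0))  # P(-1 < X < 1), found by integrating the pdf
```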