I'm not sure about only two requirements; I would say all of the requirements in the four-point list given further down.
The Poisson distribution, for example, has a countably infinite set of possible values (0, 1, 2, ...).
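To see this concretely, here is a minimal Python sketch of the Poisson pmf (the rate 3.0 is a hypothetical value chosen only for illustration):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson random variable with rate lam, defined for k = 0, 1, 2, ..."""
    return lam**k * exp(-lam) / factorial(k)

# The support is countably infinite: any nonnegative integer is possible,
# yet the probabilities still add up to 1 (checked here approximately).
lam = 3.0  # hypothetical rate, for illustration only
print(sum(poisson_pmf(k, lam) for k in range(100)))  # ~1.0
```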
A continuous distribution can take any value between two given values, whereas a discrete distribution cannot.
No, it resembles a normal distribution, but it is discrete. The distribution of the sum of two dice is:

Sum   Probability
2     1/36
3     2/36
4     3/36
5     4/36
6     5/36
7     6/36
8     5/36
9     4/36
10    3/36
11    2/36
12    1/36
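A quick way to check that table is to enumerate all 36 equally likely outcomes of two dice and tally each sum; a small Python sketch:

```python
from collections import Counter

# Enumerate all 36 equally likely (die1, die2) outcomes and tally each sum.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
for total in range(2, 13):
    print(f"{total}: {counts[total]}/36")
```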
A fixed number of independent trials, each with only two possible outcomes, where the probability of "success" remains constant from trial to trial.
Independent trials, each with two possible outcomes and a constant probability of success.
A binomial experiment is a probability experiment that satisfies the following four requirements:
1. Each trial can have only two outcomes, or outcomes that can be reduced to two outcomes. These outcomes can be considered as either success or failure.
2. There must be a fixed number of trials.
3. The outcomes of each trial must be independent of each other.
4. The probability of a success must remain the same for each trial.
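When those four requirements hold, the probability of exactly k successes in n trials is C(n, k) p^k (1 - p)^(n - k). A minimal Python sketch (the coin-flip numbers are just an illustration):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent trials, each with success probability p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Hypothetical example: probability of exactly 3 heads in 10 fair coin flips.
print(binomial_pmf(3, 10, 0.5))  # ~0.1172
```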
A discrete uniform distribution assigns the same probability to two or more possible events. For example, there is a discrete uniform distribution associated with flipping a coin: 'heads' is assigned a probability of 1/2 as is the event 'tails'. (Note that the probabilities are equal or 'uniform'.) There is also a discrete uniform distribution associated with tossing a die in that there is a 1/6 probability for seeing each possible side of the die.
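To make the "uniform" point concrete, here is a small Python sketch that builds such a pmf for the coin and die examples:

```python
from fractions import Fraction

def discrete_uniform_pmf(outcomes):
    """Assign the same probability to every outcome in the list."""
    return {outcome: Fraction(1, len(outcomes)) for outcome in outcomes}

print(discrete_uniform_pmf(['heads', 'tails']))    # each outcome gets 1/2
print(discrete_uniform_pmf([1, 2, 3, 4, 5, 6]))    # each side of the die gets 1/6
```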
I will assume that you are asking about probability distribution functions. There are two types: discrete and continuous. Some might argue that a third type exists, which is a mix of discrete and continuous distributions. When representing discrete random variables, the probability distribution is the probability mass function or "pmf." For continuous distributions, the theoretical distribution is the probability density function or "pdf." Some textbooks refer to pmfs as discrete probability distributions. Common pmfs are the binomial, multinomial, discrete uniform, and Poisson. Common pdfs are the uniform, normal, log-normal, and exponential. Two common pdfs used in sample size calculations, hypothesis testing, and confidence intervals are the t distribution and the chi-square. Finally, the F distribution is used in more advanced hypothesis testing and regression.
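If you want to experiment with these named distributions, a library such as SciPy (a sketch, assuming SciPy is installed) exposes the discrete ones through a pmf method and the continuous ones through a pdf method:

```python
from scipy import stats

# Discrete distributions expose a probability mass function (pmf) ...
print(stats.binom.pmf(3, 10, 0.5))   # binomial: P(3 successes in 10 trials, p = 0.5)
print(stats.poisson.pmf(2, 4.0))     # Poisson:  P(X = 2) when the mean is 4

# ... while continuous distributions expose a probability density function (pdf).
print(stats.norm.pdf(0.0))           # standard normal density at 0
print(stats.t.pdf(0.0, 5))           # t distribution with 5 degrees of freedom
print(stats.chi2.pdf(1.0, 3))        # chi-square with 3 degrees of freedom
```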
A discrete distribution is one that can take only certain separate values (such as whole numbers), while a continuous distribution is one that can assume any value between two numbers.
If the distribution is discrete, you need to add together the probabilities of all the values between the two given ones, whereas if the distribution is continuous you will need to integrate the probability density function (pdf) between those limits. The above process may require you to use numerical methods if the distribution is not readily integrable. For example, the Gaussian (normal) distribution is one of the most common continuous pdfs, but it is not analytically integrable. You will need to work with tables that have been computed using numerical methods.
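In practice those numerically computed tables amount to the normal cumulative distribution function. A Python sketch using the error function from the standard library gives the same numbers a z-table would:

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for a normal distribution, computed via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Probability that a standard normal value falls between -1 and 1,
# i.e. the difference of the cumulative probabilities at the two limits.
print(normal_cdf(1.0) - normal_cdf(-1.0))  # ~0.6827
```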
(From Wolfram Alpha)
In some situations X is continuous but Y is discrete. For example, in a logistic regression, one may wish to predict the probability of a binary outcome Y conditional on the value of a continuously distributed X. In this case, (X, Y) has neither a probability density function nor a probability mass function in the sense of the terms given above. On the other hand, a "mixed joint density" can be defined in either of two ways:

f_{X,Y}(x, y) = f_{X|Y=y}(x) P(Y = y) = P(Y = y | X = x) f_X(x)

Formally, f_{X,Y}(x, y) is the probability density function of (X, Y) with respect to the product measure on the respective supports of X and Y. Either of these two decompositions can then be used to recover the joint cumulative distribution function. The definition generalizes to a mixture of arbitrary numbers of discrete and continuous random variables.
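A concrete instance of the second decomposition, sketched in Python (the standard-normal X and the logistic-regression coefficients are assumptions made only for illustration):

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of a normal distribution with mean mu and standard deviation sigma."""
    return exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

def logistic(z):
    return 1.0 / (1.0 + exp(-z))

def mixed_joint_density(x, y, beta0=0.0, beta1=1.0):
    """f_{X,Y}(x, y) = P(Y = y | X = x) * f_X(x) for continuous X and binary Y.

    Assumes X is standard normal and Y | X follows a logistic regression with
    hypothetical coefficients beta0 and beta1 (illustration only).
    """
    p = logistic(beta0 + beta1 * x)
    return (p if y == 1 else 1.0 - p) * normal_pdf(x)

print(mixed_joint_density(0.5, 1))
```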
I have included two links. A normal random variable is a random variable whose associated probability distribution is the normal probability distribution. By definition, a random variable has to have an associated distribution. The normal distribution (probability density function) is defined by a mathematical formula with a mean and standard deviation as parameters. The normal distribution is often called a bell-shaped curve because of its symmetrical shape. It is not the only symmetrical distribution. The two links should provide more information beyond this simple definition.
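The mathematical formula mentioned above is the density exp(-(x - mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi)). A small Python sketch (the mean 100 and standard deviation 15 are arbitrary values chosen only for illustration) shows the symmetry of the bell shape:

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    """The normal density: exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))."""
    return exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

# Symmetry of the bell curve: points equally far above and below the mean
# have the same density.
mu, sigma = 100.0, 15.0  # hypothetical parameters, for illustration only
print(normal_pdf(mu - 10, mu, sigma), normal_pdf(mu + 10, mu, sigma))
```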
We compute it by taking the difference of the cumulative probabilities at the two values.