A discrete probability distribution is a function that assigns to each element of X(S) = {x1, x2, …, xi, …}, where X is a given discrete random variable and S is its sample space, the probability that that event occurs. This function f, defined on X(S) by f(xi) = P(X = xi), is sometimes called the probability mass function.
This probability mass is generally represented in the form of a table. Since X is a discrete random variable, X(S) has either a finite or a countably infinite number of values. Among the most common discrete probability distributions are the uniform distribution, the binomial distribution and the Poisson distribution.
The probability distribution function must meet the following conditions:

a) p(xi) ≥ 0 for every value xi.

b) Σi p(xi) = 1, where the sum runs over all the values that X can take.
Furthermore, if X takes only a finite number of values (for example x1, x2, …, xn), then p(xi) = 0 for i > n and, therefore, the infinite series in condition b becomes a finite series.
This function also fulfills the following properties:
Let B be an event associated with the random variable X. This means that B is contained in X(S). Specifically, suppose that B = {xi1, xi2, …}. Therefore:

P(B) = P(X = xi1) + P(X = xi2) + …
In other words: the probability of an event B is equal to the sum of the probabilities of the individual outcomes associated with B.
From this we can conclude that if a < b, the events (X ≤ a) and (a < X ≤ b) are mutually exclusive and, furthermore, their union is the event (X ≤ b), so we have:

P(X ≤ b) = P(X ≤ a) + P(a < X ≤ b)
A random variable X is said to follow a distribution that is uniform on n points if each value is assigned the same probability. Its probability mass function is:

p(xi) = 1/n, for i = 1, 2, …, n
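As a minimal sketch, the uniform distribution simply assigns probability 1/n to each of its n points; the six-sided fair die below is an assumed example, not part of the original text:

```python
from fractions import Fraction

def uniform_pmf(n):
    # Discrete uniform distribution on n points: each value gets probability 1/n
    return [Fraction(1, n)] * n

pmf = uniform_pmf(6)   # assumed example: a fair six-sided die
print(pmf[0])          # each face has probability 1/6
print(sum(pmf))        # the probabilities sum to 1
```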
Suppose we have an experiment with two possible outcomes: it could be the toss of a coin, whose possible outcomes are heads or tails, or the choice of an integer, whose result can be an even or an odd number. This type of experiment is known as a Bernoulli trial.
Generally, the two possible outcomes are called success and failure, where p is the probability of success and 1 - p is the probability of failure. We can determine the probability of x successes in n mutually independent Bernoulli trials with the following distribution.
It is the function that represents the probability of obtaining x successes in n independent Bernoulli trials, each with probability of success p. Its probability mass function is:

P(X = x) = C(n, x) p^x (1 - p)^(n - x), for x = 0, 1, …, n

where C(n, x) = n! / (x!(n - x)!) is the binomial coefficient.
The following graph represents the probability mass function for different values of the parameters of the binomial distribution.
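The binomial probability mass function can be computed directly with Python's standard library; the parameters n = 10 and p = 0.3 below are illustrative assumptions, not taken from the text:

```python
from math import comb

def binomial_pmf(x, n, p):
    # Probability of exactly x successes in n independent Bernoulli trials
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Illustrative parameters (assumed): n = 10 trials with success probability p = 0.3
probs = [binomial_pmf(x, 10, 0.3) for x in range(11)]
print(round(sum(probs), 10))  # the pmf sums to 1
```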
The following distribution owes its name to the French mathematician Simeon Poisson (1781-1840), who obtained it as the limit of the binomial distribution.
A random variable X is said to have a Poisson distribution with parameter λ when it can take the non-negative integer values 0, 1, 2, 3, … with the following probability:
In this expression λ is the average number corresponding to the occurrences of the event for each unit of time, and x is the number of times the event occurs.
Its probability mass function is:

P(X = x) = e^(-λ) λ^x / x!, for x = 0, 1, 2, …
Next, a graph that represents the probability mass function for different values of the parameters of the Poisson distribution.
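A short sketch of the Poisson probability mass function; the value λ = 2 used below is an assumed example:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # P(X = x) = e^(-lam) * lam^x / x!
    return exp(-lam) * lam**x / factorial(x)

# Illustrative parameter (assumed): lam = 2 occurrences per unit of time
for x in range(5):
    print(x, round(poisson_pmf(x, 2), 4))
```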
Note that, as long as the probability of success is low and the number of trials of a binomial distribution is high, the binomial can be approximated by a Poisson distribution, since the Poisson distribution is the limit of the binomial distribution.
The main difference between these two distributions is that, while the binomial depends on two parameters, namely n and p, the Poisson depends only on λ, which is sometimes called the intensity of the distribution.
So far we have only discussed probability distributions for cases in which the different trials are independent of each other; that is, when the outcome of one is not affected by the outcome of another. When the trials are not independent, the hypergeometric distribution is very useful.
Let N be the total number of objects in a finite set, of which we can identify k in some way, thus forming a subset K whose complement consists of the remaining N - k elements.
If we randomly choose n objects, the random variable X that represents the number of objects belonging to K in that choice has a hypergeometric distribution with parameters N, n and k. Its probability mass function is:

P(X = x) = C(k, x) C(N - k, n - x) / C(N, n)
The following graph represents the probability mass function for different values of the parameters of the hypergeometric distribution.
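A sketch of the hypergeometric probability mass function; the parameters N = 50, k = 5, n = 10 below are assumed for illustration:

```python
from math import comb

def hypergeom_pmf(x, N, k, n):
    # Probability of drawing x objects from the subset K (size k)
    # when n of the N objects are chosen without replacement
    return comb(k, x) * comb(N - k, n - x) / comb(N, n)

# Illustrative parameters (assumed): N = 50 objects, k = 5 marked, n = 10 drawn
for x in range(6):
    print(x, round(hypergeom_pmf(x, 50, 5, 10), 4))
```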
Suppose that the probability that a radio tube (placed in a certain type of equipment) will operate for more than 500 hours is 0.2. If 20 tubes are tested, what is the probability that exactly k of them will run for more than 500 hours, for k = 0, 1, 2, …, 20?
If X is the number of tubes that work for more than 500 hours, we will assume that X has a binomial distribution with n = 20 and p = 0.2. Then
And so:

P(X = k) = C(20, k) (0.2)^k (0.8)^(20 - k), for k = 0, 1, …, 20
For k ≥ 11, the probabilities are less than 0.001.
Thus we can observe how the probability that k of these tubes work for more than 500 hours increases until it reaches its maximum value (at k = 4) and then begins to decrease.
A coin is tossed 6 times. When the result is heads, we will say that it is a success. What is the probability that exactly two heads come up?
For this case we have n = 6, and the probabilities of success and failure are both p = q = 1/2.
Therefore, the probability that two heads come up (that is, k = 2) is

P(X = 2) = C(6, 2) (1/2)^2 (1/2)^4 = 15/64
What is the probability of finding at least four heads?
For this case we have k = 4, 5 or 6:

P(X ≥ 4) = P(X = 4) + P(X = 5) + P(X = 6) = [C(6, 4) + C(6, 5) + C(6, 6)] (1/2)^6 = 22/64 = 11/32
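Both coin-toss answers can be verified directly with the binomial pmf for n = 6 and p = 1/2:

```python
from math import comb

n, p = 6, 0.5  # six tosses of a fair coin

def pmf(k):
    # Probability of exactly k heads in six tosses
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(pmf(2))                            # exactly two heads, i.e. 15/64
print(sum(pmf(k) for k in range(4, 7)))  # at least four heads, i.e. 22/64
```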
Suppose that 2% of the items produced in a factory are defective. Find the probability P that there are three defective items in a sample of 100 items.
For this case we could apply the binomial distribution with n = 100 and p = 0.02, obtaining as a result:

P = C(100, 3) (0.02)^3 (0.98)^97 ≈ 0.182
However, since p is small, we use the Poisson approximation with λ = np = 2. Thus,

P = e^(-2) 2^3 / 3! ≈ 0.180
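The closeness of the exact binomial probability and its Poisson approximation can be checked with the problem's own values:

```python
from math import comb, exp, factorial

n, p, x = 100, 0.02, 3  # values from the problem statement
lam = n * p             # Poisson parameter, lam = 2

binom = comb(n, x) * p**x * (1 - p)**(n - x)   # exact binomial probability
poisson = exp(-lam) * lam**x / factorial(x)    # Poisson approximation

print(round(binom, 3), round(poisson, 3))  # the two values are very close
```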