Binomial distribution
The binomial distribution is one of the most important discrete probability distributions.
It describes the number of successes in a series of similar and independent experiments, each of which has exactly two possible outcomes ("success" or "failure"). Such series of experiments are also called Bernoulli processes.
If $p$ is the probability of success on a single trial and $n$ the number of trials, then $B(k \mid p, n)$ denotes the probability of achieving exactly $k$ successes (see section Definition; several alternative notations are in use).
The binomial distribution and the Bernoulli process can be illustrated with the help of the Galton board. This is a mechanical apparatus into which balls are thrown. These then fall randomly into one of several compartments, where the distribution of the balls corresponds to the binomial distribution. Depending on the construction, different parameters $n$ and $p$ are possible.
Although the binomial distribution was known long before, the term was first used in 1911 in a book by George Udny Yule.
Binomial distributions for $p = 0.5$, with $n$ and $k$ as in Pascal's triangle.
The probability that a ball in a Galton board with eight levels ($n = 8$) falls into the middle compartment ($k = 4$) is $B(4 \mid 0.5, 8) = 70/256 \approx 0.27$.
Examples
The probability of rolling a number greater than 2 with an ordinary die is $p = 2/3$; the probability that this is not the case is $1 - p = 1/3$. Suppose the die is rolled 10 times ($n = 10$). Then there is a small probability that a number greater than 2 is never rolled, or, conversely, is rolled every single time. The probability of rolling such a number exactly $k$ times is described by the binomial distribution $B(k \mid 2/3, 10)$.
The process described by the binomial distribution is often illustrated by a so-called urn model. An urn contains, for example, 6 balls, 2 of them black and the others white. One reaches into the urn 10 times, takes out a ball, notes its color, and puts the ball back. In a particular interpretation of this process, drawing a white ball is understood as a "positive event" with probability $p = 2/3$, and drawing a non-white ball as a "negative event". The probabilities are distributed in the same way as in the previous dice example.
Definition
Probability function, (cumulative) distribution function, properties
The discrete probability distribution with the probability function
$B(k \mid p, n) = \binom{n}{k} p^k (1-p)^{n-k}$ for $k = 0, 1, \dots, n$
is called the binomial distribution for the parameters $n$ (number of trials) and $p$ (the success or hit probability).
Note: This formula uses the convention $0^0 = 1$ (see zero to the power of zero).
The above formula can be understood as follows: among all $n$ trials we need exactly $k$ successes, each of probability $p$, and consequently exactly $n - k$ failures, each of probability $1 - p$. However, the $k$ successes can occur on any of the $n$ trials, so we still have to multiply by the number of $k$-element subsets of an $n$-element set, because there are exactly $\binom{n}{k}$ ways to select the $k$ successful trials from all $n$ trials.
The failure probability complementary to the success probability is often abbreviated as $q = 1 - p$.
As required for a probability distribution, the probabilities for all possible values $k$ must sum to 1. This follows from the binomial theorem:
$\sum_{k=0}^{n} B(k \mid p, n) = \sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k} = (p + (1-p))^n = 1^n = 1.$
A random variable $X$ distributed according to $B(n, p)$ is accordingly called binomially distributed with parameters $n$ and $p$, with distribution function
$F(x) = P(X \le x) = \sum_{k=0}^{\lfloor x \rfloor} \binom{n}{k} p^k (1-p)^{n-k}$,
where $\lfloor x \rfloor$ denotes the floor function.
Several other notations for the cumulative binomial distribution are also in common use.
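The probability function and the distribution function can be sketched directly from these definitions. A minimal Python sketch (the function names `binom_pmf` and `binom_cdf` are our own, not from the article):

```python
from math import comb, floor

def binom_pmf(k, n, p):
    """B(k | p, n) = C(n, k) * p^k * (1-p)^(n-k): probability of exactly
    k successes in n independent trials with success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binom_cdf(x, n, p):
    """F(x) = P(X <= x): sum of the probability function up to floor(x)."""
    return sum(binom_pmf(k, n, p) for k in range(floor(x) + 1))

# Summing the whole support gives 1, as the binomial theorem guarantees.
total = binom_cdf(10, 10, 0.3)
```

Summing the pmf over $k = 0, \dots, n$ is exactly the normalization argument above; `total` is 1 up to floating-point rounding.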
Derivation as Laplace probability
Experiment scheme: An urn contains $N$ balls, of which $M$ are black and $N - M$ white. The probability of drawing a black ball is therefore $p = M/N$. One after another, $n$ balls are taken at random, their color is determined, and they are put back.
We calculate the number of possibilities in which black balls can be found, and from this we calculate the so-called Laplace probability ("number of possibilities favorable to the event divided by the total number of (equally probable) possibilities").
For each of the $n$ draws there are $N$ possibilities, so in total there are $N^n$ possibilities for the choice of balls. For exactly $k$ of these balls to be black, exactly $k$ of the $n$ draws must yield a black ball. For each black ball there are $M$ possibilities, and for each white ball $N - M$ possibilities. The $k$ black balls may still be distributed over the $n$ draws in $\binom{n}{k}$ possible ways, so there are
$\binom{n}{k} M^k (N - M)^{n-k}$
cases in which exactly $k$ black balls are selected. The probability of finding exactly $k$ black balls among the $n$ drawn is thus:
$\frac{\binom{n}{k} M^k (N - M)^{n-k}}{N^n} = \binom{n}{k} p^k (1-p)^{n-k} = B(k \mid p, n).$
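The Laplace counting argument can be cross-checked by simulating the urn scheme and comparing the relative frequency with the binomial formula. A sketch under the assumptions of the dice/urn example above (6 balls, 2 black, 10 draws); the helper `draw_black` is ours:

```python
import random
from math import comb

def draw_black(N, M, n, rng):
    """Simulate n draws with replacement from an urn with M black balls
    among N; return the number of black balls drawn."""
    return sum(rng.randrange(N) < M for _ in range(n))

rng = random.Random(1)
trials = 200_000
# Relative frequency of "exactly 3 black balls in 10 draws".
rel_freq = sum(draw_black(6, 2, 10, rng) == 3 for _ in range(trials)) / trials
# Laplace probability from the closed formula, about 0.260.
exact = comb(10, 3) * (2 / 6) ** 3 * (4 / 6) ** 7
```

With 200,000 repetitions the simulated frequency agrees with the formula to about three decimal places.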
Properties
Symmetry
- The binomial distribution is symmetric in the special cases $p = 0$, $p = 1/2$ and $p = 1$, and otherwise asymmetric.
- The binomial distribution has the property $B(k \mid p, n) = B(n - k \mid 1 - p, n)$.
Expected value
The binomial distribution has the expected value $E(X) = np$.
Proof
The expected value $\mu$ is calculated directly from the definition and the binomial theorem:
$\mu = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = np \sum_{k=1}^{n} \binom{n-1}{k-1} p^{k-1} (1-p)^{n-k} = np \, (p + (1-p))^{n-1} = np.$
Alternatively, one can use the fact that a $B(n, p)$-distributed random variable $X$ can be written as a sum $X = X_1 + \dots + X_n$ of $n$ independent Bernoulli-distributed random variables with $E(X_i) = p$. With the linearity of the expected value it then follows that
$E(X) = E(X_1) + \dots + E(X_n) = np.$
Alternatively, one can also give the following proof using the binomial theorem: differentiating both sides of the equation
$(a + b)^n = \sum_{k=0}^{n} \binom{n}{k} a^k b^{n-k}$
with respect to $a$ yields
$n (a + b)^{n-1} = \sum_{k=0}^{n} k \binom{n}{k} a^{k-1} b^{n-k}$,
so
$a \, n (a + b)^{n-1} = \sum_{k=0}^{n} k \binom{n}{k} a^k b^{n-k}$.
With $a = p$ and $b = 1 - p$, the desired result $np = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = E(X)$ follows.
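The identity $E(X) = np$ can be verified numerically by summing $k \cdot B(k \mid p, n)$ over the whole support. A small sketch (helper names are ours):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def mean_from_definition(n, p):
    """E[X] = sum over k of k * B(k | p, n), evaluated term by term."""
    return sum(k * binom_pmf(k, n, p) for k in range(n + 1))
```

For instance, `mean_from_definition(10, 0.3)` agrees with $np = 3$ up to floating-point rounding.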
Variance
The binomial distribution has the variance $\operatorname{Var}(X) = npq$ with $q = 1 - p$.
Proof
Let $X$ be a $B(n, p)$-distributed random variable. The variance is determined directly from the shift theorem $\operatorname{Var}(X) = E(X^2) - (E(X))^2$ as
$\operatorname{Var}(X) = \sum_{k=0}^{n} k^2 \binom{n}{k} p^k (1-p)^{n-k} - (np)^2 = n(n-1)p^2 + np - (np)^2 = np(1-p) = npq.$
Alternatively, from Bienaymé's equation applied to the variance of independent random variables, considering that the $n$ identical individual processes satisfy the Bernoulli distribution with $\operatorname{Var}(X_i) = p(1-p)$, one obtains
$\operatorname{Var}(X) = \operatorname{Var}(X_1 + \dots + X_n) = \operatorname{Var}(X_1) + \dots + \operatorname{Var}(X_n) = np(1-p).$
The second equality holds because the individual experiments are independent, so the individual variables are uncorrelated.
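The shift-theorem computation can be reproduced numerically by summing the raw moments from the probability function. A sketch (helper names are ours):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def raw_moment(n, p, r):
    """r-th raw moment E[X^r], summed directly from the pmf."""
    return sum(k**r * binom_pmf(k, n, p) for k in range(n + 1))

n, p = 12, 0.25
# Shift theorem: Var(X) = E[X^2] - (E[X])^2; should equal n*p*(1-p) = 2.25.
variance = raw_moment(n, p, 2) - raw_moment(n, p, 1) ** 2
```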
Coefficient of variation
From the expected value and the variance one obtains the coefficient of variation
$\operatorname{VarK}(X) = \frac{\sqrt{np(1-p)}}{np} = \sqrt{\frac{1-p}{np}}.$
Skewness
The skewness is given by
$\operatorname{v}(X) = \frac{1 - 2p}{\sqrt{np(1-p)}}.$
Kurtosis
The kurtosis can also be represented in closed form as
$\beta_2 = 3 + \frac{1 - 6pq}{npq}.$
Thus the excess is
$\gamma = \frac{1 - 6pq}{npq}.$
Mode
The mode, i.e. the value with the maximum probability, is $k = \lfloor (n+1)p \rfloor$ for $p < 1$, and $k = n$ for $p = 1$. If $(n+1)p$ is a natural number, then $(n+1)p - 1$ is also a mode. If the expected value $np$ is a natural number, the expected value is equal to the mode.
Proof
Let $0 < p < 1$ without loss of generality. We consider the quotient
$\alpha_k = \frac{B(k+1 \mid p, n)}{B(k \mid p, n)} = \frac{(n-k)\,p}{(k+1)(1-p)}.$
Now $\alpha_k > 1$ if $k < (n+1)p - 1$ and $\alpha_k < 1$ if $k > (n+1)p - 1$. Thus the probabilities increase up to $k = \lfloor (n+1)p \rfloor$ and decrease from there on, so this value is a mode. Only in the case $k = (n+1)p - 1$ does the quotient take the value 1, i.e. $B(k+1 \mid p, n) = B(k \mid p, n)$, which yields the second mode $(n+1)p - 1$ when $(n+1)p$ is a natural number.
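The formula $\lfloor (n+1)p \rfloor$ for the mode can be checked against an exhaustive search over the support. A sketch (helper names are ours):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def mode_by_search(n, p):
    """Mode found by exhaustive search over the support k = 0..n."""
    return max(range(n + 1), key=lambda k: binom_pmf(k, n, p))
```

For example, for $n = 10$, $p = 0.3$ the search returns 3, matching $\lfloor 11 \cdot 0.3 \rfloor = 3$.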
Median
It is not possible to give a general formula for the median of the binomial distribution. Therefore, different cases have to be considered which provide a suitable median:
- If $np$ is a natural number, then expected value, median, and mode agree and are equal to $np$.
- A median $m$ lies in the interval $\lfloor np \rfloor \le m \le \lceil np \rceil$. Here, $\lfloor \cdot \rfloor$ denotes the floor function and $\lceil \cdot \rceil$ the ceiling function.
- A median $m$ cannot deviate too much from the expected value: $|m - np| \le \min\{\ln 2, \max\{p, 1-p\}\}$.
- The median is unique and coincides with $m = \operatorname{round}(np)$ if either $p \le 1 - \ln 2$ or $p \ge \ln 2$ or $|\operatorname{round}(np) - np| \le \min\{p, 1-p\}$ (except when $p = 1/2$ and $n$ is odd).
- If $p = 1/2$ and $n$ is odd, then every number $m$ in the interval $\tfrac{1}{2}(n-1) \le m \le \tfrac{1}{2}(n+1)$ is a median of the binomial distribution with parameters $n$ and $p$. If $p = 1/2$ and $n$ is even, then $m = n/2$ is the unique median.
Cumulants
Analogous to the Bernoulli distribution, the cumulant-generating function is
$g_X(t) = n \ln\!\left(p e^t + q\right)$ with $q = 1 - p$.
Thus the first cumulants are $\kappa_1 = np$ and $\kappa_2 = npq$, and the recursion equation
$\kappa_{m+1} = pq \, \frac{\partial \kappa_m}{\partial p}$
holds.
Characteristic function
The characteristic function has the form
$\varphi_X(t) = \left(q + p e^{it}\right)^n$ with $q = 1 - p$.
Probability generating function
For the probability-generating function we get
$m_X(t) = \left(pt + (1-p)\right)^n.$
Moment generating function
The moment-generating function of the binomial distribution is
$M_X(t) = \left(p e^t + (1-p)\right)^n.$
Sum of binomial distributed random variables
For the sum $Z = X + Y$ of two independent binomially distributed random variables $X$ and $Y$ with parameters $(n_1, p)$ and $(n_2, p)$, the individual probabilities are obtained by applying Vandermonde's identity:
$P(Z = k) = \sum_{j=0}^{k} \binom{n_1}{j} p^j q^{n_1 - j} \binom{n_2}{k-j} p^{k-j} q^{n_2 - k + j} = \binom{n_1 + n_2}{k} p^k q^{n_1 + n_2 - k}, \quad q = 1 - p.$
The sum is thus again a binomially distributed random variable, but with the parameters $n_1 + n_2$ and $p$. Thus for the convolution
$B(n_1, p) * B(n_2, p) = B(n_1 + n_2, p)$
holds. The binomial distribution is therefore reproductive for fixed $p$, i.e. it forms a convolution semigroup.
If the sum $Z = X + Y$ is known, then under this condition each of the random variables $X$ and $Y$ follows a hypergeometric distribution. To see this, one calculates the conditional probability:
$P(X = j \mid Z = k) = \frac{P(X = j)\, P(Y = k - j)}{P(Z = k)} = \frac{\binom{n_1}{j} \binom{n_2}{k-j}}{\binom{n_1 + n_2}{k}}.$
This is a hypergeometric distribution.
In general: If the random variables $X_i$, $i = 1, \dots, m$, are stochastically independent and binomially distributed with parameters $(n_i, p)$, then the sum $X_1 + \dots + X_m$ is also binomially distributed, but with parameters $n_1 + \dots + n_m$ and $p$. If binomially distributed random variables with different success probabilities $p_i$ are added, a generalized binomial distribution is obtained.
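The convolution identity $B(n_1, p) * B(n_2, p) = B(n_1 + n_2, p)$ can be verified term by term. A sketch (helper names and the chosen parameters are ours):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n1, n2, p = 4, 6, 0.35
# Convolution of B(n1, p) and B(n2, p), computed directly.
conv = [sum(binom_pmf(j, n1, p) * binom_pmf(k - j, n2, p)
            for j in range(max(0, k - n2), min(n1, k) + 1))
        for k in range(n1 + n2 + 1)]
```

Each entry `conv[k]` agrees with `binom_pmf(k, n1 + n2, p)`, as Vandermonde's identity predicts.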
Relationship to other distributions
Relationship to Bernoulli distribution
A special case of the binomial distribution with $n = 1$ is the Bernoulli distribution. The sum of $n$ independent and identically Bernoulli($p$)-distributed random variables therefore satisfies the binomial distribution $B(n, p)$.
Relationship to the generalized binomial distribution
The binomial distribution is a special case of the generalized binomial distribution with $p_i = p$ for all $i$. More precisely, for fixed expected value and fixed order $n$, it is the generalized binomial distribution with maximum entropy.
Transition to normal distribution
According to the theorem of de Moivre-Laplace, the binomial distribution converges to a normal distribution in the limit $n \to \infty$; that is, the normal distribution can be used as a useful approximation of the binomial distribution if the sample size is sufficiently large and the proportion of the characteristic of interest is not too small. The Galton board can be used to experimentally recreate this approximation to the normal distribution.
With $\mu = np$ and $\sigma = \sqrt{np(1-p)}$, substituting into the distribution function $\Phi$ of the standard normal distribution yields
$P(X \le k) = F(k \mid p, n) \approx \Phi\!\left(\frac{k - np}{\sqrt{np(1-p)}}\right).$
As can be seen, the result is nothing but the function value of the normal distribution for $x = k$, $\mu = np$ and $\sigma^2 = np(1-p)$, which can also be visualized as the area of the $k$-th strip of the histogram of the standardized binomial distribution, with $1/\sigma$ as its width and $\sigma \, B(k \mid p, n)$ as its height. The approximation of the binomial distribution by the normal distribution is used in the normal approximation to quickly determine the probabilities of many levels of the binomial distribution, especially when no table values are (any longer) available for them.
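The local form of this approximation can be checked numerically: for large $n$, the binomial probability at $k$ is close to the normal density with $\mu = np$ and $\sigma^2 = np(1-p)$ evaluated at $k$. A sketch (helper names are ours):

```python
from math import comb, exp, pi, sqrt

def binom_pmf(k, n, p):
    """Binomial probability C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def normal_density_at(k, n, p):
    """Normal density with mu = n*p and sigma^2 = n*p*(1-p), evaluated
    at k; by de Moivre-Laplace this approximates B(k | p, n) for large n."""
    mu, var = n * p, n * p * (1 - p)
    return exp(-((k - mu) ** 2) / (2 * var)) / sqrt(2 * pi * var)
```

For $n = 1000$, $p = 0.5$ the two values at $k = 500$ agree to about five decimal places.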
Transition to Poisson distribution
A binomial distribution whose expected value $np$ converges to a constant $\lambda$ for $n \to \infty$ and $p \to 0$ can be approximated by the Poisson distribution. The value $\lambda$ is then the expected value of all the binomial distributions considered in the limit as well as of the resulting Poisson distribution. This approximation is also called the Poisson approximation, the Poisson limit theorem, or the law of rare events.
A rule of thumb is that this approximation is useful when $n \ge 50$ and $p \le 0.05$.
The Poisson distribution is therefore the limiting distribution of the binomial distribution for large $n$ and small $p$; this is convergence in distribution.
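How close the Poisson approximation gets can be checked numerically for a large $n$ and small $p$. A sketch (helper names and the parameters $n = 1000$, $p = 0.003$ are our choices):

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """Binomial probability C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson probability exp(-lam) * lam^k / k!."""
    return exp(-lam) * lam**k / factorial(k)

# Large n, small p, lambda = n*p fixed: the two pmfs nearly coincide.
n, p = 1000, 0.003
max_diff = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, n * p))
               for k in range(20))
```

Here the pointwise difference stays well below one part in a thousand.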
Relationship to geometric distribution
The number of failures until a success occurs for the first time is described by the geometric distribution.
Relationship to the negative binomial distribution
The negative binomial distribution, on the other hand, describes the probability distribution of the number of trials required to achieve a given number of successes in a Bernoulli process.
Relationship to hypergeometric distribution
In the binomial distribution, the selected samples are returned to the population, so they can be selected again at a later time. If, in contrast, the samples are not returned to the population, the hypergeometric distribution applies. The two distributions merge when the size $N$ of the population is large and the size $n$ of the sample is small. As a rule of thumb, for $n/N \le 0.05$ the binomial distribution can be used instead of the mathematically more demanding hypergeometric distribution even if the samples are not returned, since in this case both yield only insignificantly different results.
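The rule of thumb can be illustrated numerically: with a sample that is small relative to the population, the hypergeometric and binomial probabilities nearly coincide. A sketch (helper names and the population sizes are our choices):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def hypergeom_pmf(k, N, K, n):
    """Probability of k successes when drawing n balls without replacement
    from a population of N balls containing K successes."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Sample small relative to the population (n/N = 0.001): near agreement.
N, K, n = 10_000, 2_000, 10
max_diff = max(abs(hypergeom_pmf(k, N, K, n) - binom_pmf(k, n, K / N))
               for k in range(n + 1))
```

With $n/N = 0.001$ the largest pointwise deviation is on the order of $10^{-4}$.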
Relationship to multinomial distribution
The binomial distribution is a special case of the multinomial distribution.
Relationship to Rademacher distribution
If $X$ is binomially distributed with parameters $p = 1/2$ and $n$, then $X$ can be represented as a scaled sum of $n$ Rademacher-distributed random variables $Z_1, \dots, Z_n$:
$X = \frac{1}{2}\left(\sum_{i=1}^{n} Z_i + n\right).$
This is used in particular for the symmetric random walk on $\mathbb{Z}$.
Relationship to Panjer distribution
The binomial distribution is a special case of the Panjer distribution, which combines the binomial, negative binomial, and Poisson distributions in one class of distributions.
Relationship to beta distribution
For many applications it is necessary to evaluate the distribution function
$F(k \mid p, n) = P(X \le k) = \sum_{j=0}^{k} \binom{n}{j} p^j (1-p)^{n-j}$
concretely (for example, for statistical tests or for confidence intervals).
The following relationship to the beta distribution helps here. For positive integer parameters $k$ and $n$:
$F(k \mid p, n) = (n-k) \binom{n}{k} \int_0^{1-p} t^{n-k-1} (1-t)^k \, dt.$
To prove this equation, one can proceed as follows:
- The left and right sides agree for $p = 0$ (both sides are equal to 1).
- The derivatives with respect to $p$ of the left and right sides of the equation agree, namely they are both equal to $-(n-k) \binom{n}{k} p^k (1-p)^{n-k-1}$.
Relationship to beta binomial distribution
A binomial distribution whose parameter $p$ is beta-distributed is called a beta-binomial distribution. It is a mixture distribution.
Relationship to the Pólya distribution
The binomial distribution is a special case of the Pólya distribution (choose $c = 0$, i.e. drawing with simple replacement).
Examples
Symmetric binomial distribution (p = 1/2)
p = 0.5 and n = 4, 16, 64
Mean value subtracted
Scaling with standard deviation
This case occurs for the $n$-fold coin toss with a fair coin (probability of heads equal to that of tails, i.e. both equal to 1/2). The first figure shows the binomial distribution for $p = 0.5$ and for different values of $n$ as a function of $k$. These binomial distributions are mirror-symmetric about the value $k = n/2$:
$B(k \mid 1/2, n) = B(n - k \mid 1/2, n).$
This is illustrated in the second figure. The width of the distribution grows in proportion to the standard deviation $\sigma = \tfrac{1}{2}\sqrt{n}$. The function value at $k = n/2$, i.e. the maximum of the curve, decreases in proportion to $1/\sigma$.
Accordingly, binomial distributions with different $n$ can be scaled onto each other by dividing the abscissa $k - n/2$ by $\sigma$ and multiplying the ordinate by $\sigma$ (third figure above).
The adjacent graph shows rescaled binomial distributions once more, now for other values of $n$ and in a plot that better illustrates that all function values converge to a common curve with increasing $n$. By applying Stirling's formula to the binomial coefficients, one sees that this curve (solid black in the figure) is a Gaussian bell curve:
$f(x) = \frac{1}{\sqrt{2\pi}} \, e^{-x^2/2}.$
This is the probability density of the standard normal distribution. In the central limit theorem, this finding is generalized in such a way that sequences of other discrete probability distributions also converge to the normal distribution.
The second graph on the right shows the same data in a semi-logarithmic plot. This is recommended if you want to check whether rare events that deviate from the expected value by several standard deviations also follow a binomial or normal distribution.
Pulling balls
There are 80 balls in a container, 16 of which are yellow. A ball is removed 5 times and put back each time. Because of the replacement, the probability of drawing a yellow ball is the same for all removals, namely 16/80 = 1/5 = 0.2. The value $B(k \mid 0.2, 5)$ gives the probability that exactly $k$ of the removed balls are yellow. As an example we calculate $k = 3$:
$B(3 \mid 0.2, 5) = \binom{5}{3} \cdot 0.2^3 \cdot 0.8^2 = 10 \cdot 0.008 \cdot 0.64 = 0.0512.$
So in about 5% of the cases, exactly 3 yellow balls are drawn.
B(k | 0.2; 5)
k | Probability in %
0 | 32.768
1 | 40.96
2 | 20.48
3 | 5.12
4 | 0.64
5 | 0.032
Σ | 100
Expected value | 1
Variance | 0.8
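The values in the table above can be reproduced directly from the probability function. A sketch (the helper name is ours):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Reproduce the B(k | 0.2; 5) table: probabilities in percent.
table = {k: 100 * binom_pmf(k, 5, 0.2) for k in range(6)}
```

The entries sum to 100%, and `table[3]` matches the worked example $B(3 \mid 0.2, 5) = 0.0512$, i.e. 5.12%.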
Number of people with birthday at the weekend
The probability that a person has a birthday on a weekend this year is (for the sake of simplicity) 2/7. There are 10 people in a room. The value $B(k \mid 2/7, 10)$ indicates (in the simplified model) the probability that exactly $k$ of the people present have a birthday on a weekend this year.
B(k | 2/7; 10)
k | Probability in % (rounded)
0 | 3.46
1 | 13.83
2 | 24.89
3 | 26.55
4 | 18.59
5 | 8.92
6 | 2.97
7 | 0.6797
8 | 0.1020
9 | 0.009063
10 | 0.0003625
Σ | 100
Expected value | 2.86
Variance | 2.04
Common birthday in the year
253 people have come together. The value $B(k \mid 1/365, 253)$ indicates the probability that exactly $k$ of those present have a birthday on a randomly chosen day (ignoring the year).
B(k | 1/365; 253)
k | Probability in % (rounded)
0 | 49.95
1 | 34.72
2 | 12.02
3 | 2.76
4 | 0.47
Thus, the probability that "anyone" of these 253 people, i.e. one or more, has a birthday on that day is $1 - B(0 \mid 1/365, 253) \approx 50.05\%$.
For 252 persons, the probability is about 49.91%. That is, 253 is the threshold number of people above which the probability that at least one of them has a birthday on a randomly chosen day exceeds 50% (see also Birthday Paradox).
The direct calculation of the binomial distribution can be difficult because of the large factorials. An approximation via the Poisson distribution is permissible here ($n \ge 50$, $p \le 0.05$). With the parameter $\lambda = 253/365 \approx 0.6932$, the following values result:
P253/365(k)
k | Probability in % (rounded)
0 | 50
1 | 34.66
2 | 12.01
3 | 2.78
4 | 0.48
Confidence interval for a probability
In an opinion poll among $n$ persons, $k$ individuals indicate that they will vote for party A. Determine a 95% confidence interval for the unknown proportion of voters who vote for party A in the total electorate.
A solution to the problem without recourse to the normal distribution can be found in the article Confidence Interval for the Success Probability of the Binomial Distribution.
Utilization model
The following formula can be used to calculate the probability that $k$ out of $n$ people simultaneously perform an activity that takes an average of $t$ minutes per hour:
$P(X = k) = B\!\left(k \,\Big|\, \tfrac{t}{60}, n\right) = \binom{n}{k} \left(\frac{t}{60}\right)^{k} \left(1 - \frac{t}{60}\right)^{n-k}.$
Statistical error of class frequency in histograms
The display of independent measurement results in a histogram leads to the grouping of the measured values into classes.
The probability of $n_i$ entries in class $i$ is given by the binomial distribution
$B(n_i \mid p_i, n)$,
where $p_i$ is the probability of class $i$ and $n$ the total number of entries.
Expected value and variance of $n_i$ are then
$E(n_i) = n p_i$ and $\operatorname{Var}(n_i) = n p_i (1 - p_i)$.
Thus, the statistical error of the number of entries in class $i$ is
$\sigma(n_i) = \sqrt{n p_i (1 - p_i)}$.
When the number of classes is large, $p_i$ becomes small and $\sigma(n_i) \approx \sqrt{n p_i} \approx \sqrt{n_i}$.
For example, the statistical accuracy of Monte Carlo simulations can be determined.
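The error formula and the small-$p_i$ approximation can be sketched in a few lines (the helper name and the chosen numbers are ours):

```python
from math import sqrt

def bin_count_sigma(n, p_i):
    """Standard deviation of the count in a histogram class with
    probability p_i when n independent entries are filled in."""
    return sqrt(n * p_i * (1 - p_i))

# With many classes p_i is small, so sigma is close to sqrt(n * p_i),
# i.e. roughly the square root of the expected bin count.
n, p_i = 10_000, 0.02
sigma = bin_count_sigma(n, p_i)  # exact: sqrt(196) = 14
```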
Binomial distributions with p = 0.5 (with shift by -n/2 and scaling) for n = 4, 6, 8, 12, 16, 23, 32, 46; the same data in a semi-logarithmic plot.
Random numbers
Random numbers for the binomial distribution are usually generated using the inversion method.
Alternatively, one can exploit the fact that the sum of $n$ Bernoulli-distributed random variables is binomially distributed. To do this, one generates $n$ Bernoulli($p$)-distributed random numbers and sums them up; the result is a $B(n, p)$-distributed random number.
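The Bernoulli-sum method can be sketched in a few lines of Python (the function name `binomial_rv` is ours):

```python
import random

def binomial_rv(n, p, rng):
    """One B(n, p) sample as the sum of n Bernoulli(p) random numbers:
    each uniform draw below p counts as one success."""
    return sum(rng.random() < p for _ in range(n))

rng = random.Random(42)
samples = [binomial_rv(20, 0.3, rng) for _ in range(50_000)]
sample_mean = sum(samples) / len(samples)  # should be near n*p = 6
```

This is simple but needs $n$ uniform random numbers per sample; the inversion method mentioned above gets by with one uniform number per sample.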