Using this definition, one can write the probability that $X$ takes a value in a certain interval $[a,b]$ without using an integral. Recall that previously this probability was defined in terms of a PDF:
$$P(a \leq X \leq b) = \int_a^b f_X(x)\,dx.$$
Now the probability is rewritten as the difference of values of the CDF:
$$P(a \leq X \leq b) = F_X(b) - F_X(a).$$
So the CDF gives the amount of area underneath the PDF between two points. It increases from zero (for very low values of $x$) to one (for very high values of $x$). This is because as $x \to -\infty$, there is no probability that $X$ will be found that far out if the PDF is normalized, while $x \to \infty$ corresponds to $P(X \leq \infty)$, which is one because it is certain that $X$ takes some finite value.

In the case of a discrete random variable, the value of $F_X$ makes a discrete jump at each possible value of $x$; the size of the jump corresponds to the probability $P(X = x)$ of that value. In the case of a continuous random variable, the function increases continuously; it is not meaningful to speak of the probability that $X = x$, because this probability is always zero. Instead one considers the probability that the value of $X$ lies in a given interval:
$$P(X \in [a,b]) = P(a \leq X \leq b) = F_X(b) - F_X(a).$$
Note that it does not matter whether the inequalities are strict (whether the interval is $[a,b]$ or $(a,b)$, for example): since the probability of any given value is zero, the endpoints can be included or excluded without changing any probabilities.

Still, one frequently wants to make use of the probability density function $f_X(x)$ rather than the CDF. Since the CDF corresponds to the integral of the PDF, the PDF corresponds to the derivative of the CDF:
$$f_X(x) = F_X'(x) = \frac{dF_X}{dx}.$$
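The identity $P(a \leq X \leq b) = F_X(b) - F_X(a)$ can be checked numerically. The sketch below uses an exponential distribution with an assumed rate $\lambda = 0.5$ as the example and compares the CDF difference against a Riemann sum for the area under the PDF:

```python
import math

# A minimal numerical check that F_X(b) - F_X(a) equals the area under
# the PDF between a and b. Assumed example: exponential distribution
# with rate lam = 0.5, whose PDF and CDF are known in closed form.
lam = 0.5
pdf = lambda x: lam * math.exp(-lam * x)
cdf = lambda x: 1 - math.exp(-lam * x)

a, b = 1.0, 4.0

# Midpoint Riemann sum approximating the integral of the PDF over [a, b]
n = 100_000
h = (b - a) / n
integral = sum(pdf(a + (i + 0.5) * h) for i in range(n)) * h

print(cdf(b) - cdf(a))  # difference of CDF values
print(integral)         # area under the PDF -- the two should agree closely
```

The same comparison works for any distribution whose CDF is available; only the `pdf` and `cdf` lambdas would change.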
The probability density function of a certain random variable $X$ is
$$f_X(x) = \lambda e^{-\lambda x},$$
where $x$ takes values in $[0,\infty)$. Find the probability that $X < 100$.

$1 - e^{-100\lambda}$
$\lambda e^{-100\lambda}$
$e^{100\lambda}$
$-e^{-100\lambda}$

One question that often comes up in applications of continuous probability is the following: given the PDF of a random variable, is it possible to find the PDF of an arbitrary function of that random variable? The answer is yes, and the easiest method uses the CDF of the random variable.

The general case goes as follows: consider the CDF $F_X(x)$ of the random variable $X$, and let $Z = g(X)$ be a function of $X$. It is important to note the distinction between upper and lower case: $X$ is a random variable, while $x$ is a real number. Recall that the PDF is given by the derivative of the CDF:
$$f_X(x) = \frac{d}{dx} F_X(x) = \frac{d}{dx} P(X \leq x).$$
Now write the formula for the PDF of $Z$. If $g$ is invertible and increasing, then $g(X) \leq z$ exactly when $X \leq g^{-1}(z)$, so
$$f_Z(z) = \frac{d}{dz} P(Z \leq z) = \frac{d}{dz} P(g(X) \leq z) = \frac{d}{dz} P(X \leq g^{-1}(z)) = \frac{d}{dz} F_X(g^{-1}(z)),$$
and by the chain rule
$$f_Z(z) = f_X(g^{-1}(z)) \frac{dg^{-1}(z)}{dz}.$$
This formula can be generalized straightforwardly to cases where $g$ is not invertible or increasing.
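The change-of-variables formula can be illustrated concretely. The sketch below assumes $X \sim \text{Exponential}(1)$ and the increasing, invertible function $g(x) = \sqrt{x}$, so $g^{-1}(z) = z^2$ and $dg^{-1}/dz = 2z$; it checks the density formula against a finite-difference derivative of $F_Z(z) = F_X(g^{-1}(z))$:

```python
import math

# Sketch of the change-of-variables formula for Z = g(X), with g
# invertible and increasing. Assumed example: X ~ Exponential(1),
# g(x) = sqrt(x), so g^{-1}(z) = z^2 and d g^{-1}/dz = 2z.
f_X = lambda x: math.exp(-x)        # PDF of X on [0, inf)
F_X = lambda x: 1 - math.exp(-x)    # CDF of X
g_inv = lambda z: z * z

# Density of Z from the formula f_Z(z) = f_X(g^{-1}(z)) * dg^{-1}/dz
f_Z = lambda z: f_X(g_inv(z)) * 2 * z

# The CDF of Z is F_X(g^{-1}(z)), as in the derivation above
F_Z = lambda z: F_X(g_inv(z))

# f_Z should match the finite-difference derivative of F_Z
z, h = 1.3, 1e-6
approx = (F_Z(z + h) - F_Z(z - h)) / (2 * h)
print(f_Z(z), approx)  # the two values should agree closely
```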
What is the relationship between the probability density function and the cumulative distribution function?
p(x) = F′(x); the probability density is the derivative of the cumulative distribution function. This in turn implies that the probability density is always nonnegative, p(x) ≥ 0, because F is monotone increasing.
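This relationship is easy to check numerically for any distribution with a known CDF. A small sketch, using the standard normal as an assumed example (its CDF can be written with `math.erf`):

```python
import math

# Check that the density is the derivative of the CDF, and hence
# nonnegative where F is increasing. Assumed example: standard normal.
F = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))          # CDF
p = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)   # density

h = 1e-6
for x in (-2.0, 0.0, 1.5):
    deriv = (F(x + h) - F(x - h)) / (2 * h)  # central finite difference
    print(x, deriv, p(x))  # the finite difference matches the density
```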
Is the cumulative distribution function the same as the probability distribution function?
The cumulative distribution function is used to describe the probability distribution of random variables. It can be used to describe the probability for a discrete, continuous, or mixed variable. It is obtained by summing (or integrating) the probability density function to get the cumulative probability for a random variable.
What is the difference between probability and cumulative probability?
Probability is the measure of the possibility that a given event will occur. Cumulative probability is the measure of the chance that two or more events will happen. Usually this consists of events in a sequence, such as flipping "heads" twice in a row on a coin toss, but the events may also be concurrent.
What is the major difference between the CDF and the PMF (or PDF)?
The PMF is one way to describe the distribution of a discrete random variable. A PMF cannot be defined for continuous random variables. The cumulative distribution function (CDF) of a random variable is another method to describe the distribution of random variables.
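The discrete case described above, where the CDF jumps at each possible value and the jump size recovers the PMF, can be sketched with a fair six-sided die as an assumed example:

```python
from fractions import Fraction

# Sketch: for a discrete random variable, the CDF is a step function
# and the jump at each value k equals the PMF value P(X = k).
# Assumed example: a fair six-sided die.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def cdf(x):
    """P(X <= x), obtained by summing the PMF -- a step function."""
    return sum(p for k, p in pmf.items() if k <= x)

# The jump at each value k is cdf(k) - cdf(k - 1) = P(X = k)
for k in range(1, 7):
    print(k, cdf(k) - cdf(k - 1))  # each jump is 1/6
```

Exact `Fraction` arithmetic is used here only so the jumps come out as exact probabilities rather than floats; the same logic works with ordinary floats.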