RANDOM VARIABLE-DISCRETE RANDOM VARIABLE
RANDOM VARIABLE
A random variable X is a real-valued function that assigns a real number to each outcome in the sample space S of a random experiment.
*________________________________________________*
PROBABILITY DISTRIBUTION FUNCTION OF X
If X is a random variable, then the function $F(x)$, defined by $F(x)=P\lbrace X \leq x \rbrace$, is called the distribution function of X.
*________________________________________________*
DISCRETE RANDOM VARIABLE
A random variable whose set of possible values is either finite or countably infinite is called a discrete random variable.
*________________________________________________*
PROBABILITY MASS FUNCTION
If X is a discrete random variable, then the function $p(x)=P[X=x]$ is called the probability mass function of X.
*________________________________________________*
PROBABILITY DISTRIBUTION
The values assumed by the random variable X, presented together with their corresponding probabilities, are known as the probability distribution of X.
\(X\) | \(x_1\) | \(x_2\) | \(x_3\) | …
---|---|---|---|---
\(P(X=x)\) | \(P_1\) | \(P_2\) | \(P_3\) | …
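For instance (an illustrative example), let X be the number of heads obtained in two tosses of a fair coin. Its probability distribution is

\(X\) | 0 | 1 | 2
---|---|---|---
\(P(X=x)\) | \(1/4\) | \(1/2\) | \(1/4\)

Note that the probabilities are non-negative and sum to 1.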
*________________________________________________*
CUMULATIVE DISTRIBUTION OR DISTRIBUTION FUNCTION OF X (FOR DISCRETE RANDOM VARIABLE)
The cumulative distribution function F(x) of a discrete random variable X with probability distribution P(x) is given by
\[ F(x)=P(X \leq x) = \sum_{t\leq x} p(t) \qquad \text{for } -\infty < x < \infty \]
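Continuing the illustrative two-coin example above, the distribution function is
\[ F(x)=\begin{cases} 0 & x<0 \\ 1/4 & 0\leq x<1 \\ 3/4 & 1\leq x<2 \\ 1 & x\geq 2 \end{cases} \]
so F(x) is a step function that jumps by $p(x_i)$ at each possible value $x_i$ and is constant in between.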
*________________________________________________*
PROPERTIES OF DISTRIBUTION FUNCTION
(1.) $F(-\infty)=0$
(2.) $F(\infty)=1$
(3.) $0 \leq F(x) \leq 1$
(4.) $F(x_1)\leq F(x_2)$ if $x_1< x_2$
(5.) $P(x_1 < X \leq x_2)=F(x_2)-F(x_1)$
(6.) $P(x_1 \leq X \leq x_2)=F(x_2)-F(x_1)+P[X=x_1]$
(7.) $P(x_1 < X < x_2)=F(x_2)-F(x_1)-P[X=x_2]$
(8.) $P(x_1 \leq X < x_2)=F(x_2)-F(x_1)-P(X=x_2)+P(X=x_1)$
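As an illustration of property (5), using the two-coin example: $P(0 < X \leq 2)=F(2)-F(0)=1-\tfrac{1}{4}=\tfrac{3}{4}$, which agrees with $p(1)+p(2)=\tfrac{1}{2}+\tfrac{1}{4}$.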
*________________________________________________*
RESULTS
(1) $P(X \leq \infty )=1$
(2) $P(X \leq -\infty)=0$
(3) If $x_1 \leq x_2$ then $P(X \leq x_1)\leq P(X \leq x_2)$
(4) $P(X>x)=1-P[X \leq x]$
(5) $P(X \leq x)=1-P(X>x)$
*________________________________________________*
EXPECTED VALUE OF A DISCRETE RANDOM VARIABLE X
Let X be a discrete random variable assuming the values $x_1,x_2,\dots,x_n$ with corresponding probabilities $P_1,P_2,\dots,P_n$. Then \[ E(X)=\sum_{i} x_i p(x_i) \] is called the expected value of X. $E(X)$ is also commonly called the mean or the expectation of X. A useful identity states that for a function g, \[ E(g(X))=\sum_{i} g(x_i) p(x_i) \]
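As a quick illustration with the two-coin example above,
\[ E(X)=0\cdot\tfrac{1}{4}+1\cdot\tfrac{1}{2}+2\cdot\tfrac{1}{4}=1 \]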
THE VARIANCE OF A RANDOM VARIABLE X
It is defined by $Var(X)=E[X-E(X)]^2$. The variance is the expected value of the square of the difference between X and its expected value; it is a measure of the spread of the possible values of X. An equivalent computational form is \[ Var(X)=E[X^2]-[E(X)]^2 \] The quantity $\sqrt{Var(X)}$ is called the \textbf{standard deviation} of X.
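Continuing the same illustrative example, $E(X^2)=0\cdot\tfrac{1}{4}+1\cdot\tfrac{1}{2}+4\cdot\tfrac{1}{4}=\tfrac{3}{2}$, so $Var(X)=\tfrac{3}{2}-1^2=\tfrac{1}{2}$ and the standard deviation is $\sqrt{1/2}$.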
FORMULA
(1) $\sum_{i}p(x_i)=1$
(2) $F(x)=P[X\leq x]$, e.g. $P[X\leq 4]=F(4)=P(0)+P(1)+P(2)+P(3)+P(4)$
(3) $P(1)=F(1)-F(0)$
$P(2)=F(2)-F(1)$
$P(3)=F(3)-F(2)$
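In general, for a discrete random variable taking integer values, $P(k)=F(k)-F(k-1)$. For example (with illustrative values), if $F(1)=0.3$ and $F(2)=0.7$, then $P(X=2)=0.7-0.3=0.4$.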
Prove that $E(aX+b)=aE(X)+b$
$E[aX+b]=\sum_{i}(ax_i+b)p(x_i)=a\sum_{i}x_i p(x_i)+b\sum_{i}p(x_i)=aE(X)+b$, since $E(X)=\sum_{i} x_i p(x_i)$ and $\sum_{i}p(x_i)=1$.
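For instance, if $E(X)=1$ (as in the two-coin illustration), then $E(2X+3)=2(1)+3=5$.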
THEOREM
If X is a random variable, then show that \[ Var(aX+b)=a^2\,Var(X) \]
Let Y=aX+b
$E(Y)=E[aX+b]=aE(X)+b$
$Var(Y)=E[Y-E(Y)]^2=E[aX+b-aE(X)-b]^2=E\left[a\left(X-E(X)\right)\right]^2=a^2E[X-E(X)]^2=a^2\,Var(X)$
i.e., $Var(aX+b)=a^2\,Var(X)$
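For example, with $Var(X)=\tfrac{1}{2}$ from the two-coin illustration, $Var(2X+3)=2^2\cdot\tfrac{1}{2}=2$; the additive constant b does not affect the spread.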
RESULT
$\star$ $P[X=x_i]=p(x_i)$
$\to $ Probability function (or) Probability distribution (or) Probability mass function (p.m.f)
PROBLEMS UNDER THE DISTRIBUTION FUNCTION FOR DISCRETE RANDOM VARIABLE
Example 1: For the following probability distribution, (i) find the distribution function of X; (ii) what is the smallest value of x for which $P(X\leq x) > 0.5$?