Wednesday, April 21, 2021

RANDOM VARIABLE-DISCRETE RANDOM VARIABLE

RANDOM VARIABLE

 A real-valued function defined on the outcomes of a probability experiment is called a random variable.

*________________________________________________*

PROBABILITY DISTRIBUTION FUNCTION OF X

 If X is a random variable, then the function F(x), defined by $F(x)=P\lbrace X \leq x \rbrace $, is called the distribution function of X.

    *________________________________________________*

DISCRETE RANDOM VARIABLE

 A random variable whose set of possible values is either finite or countably infinite is called a discrete random variable.

    *________________________________________________*

PROBABILITY MASS FUNCTION

If X is a discrete random variable, then the function $P(x)=P[X=x]$ is called the probability mass function of X.

    *________________________________________________*

PROBABILITY DISTRIBUTION

The values assumed by the random variable X, presented with their corresponding probabilities, are known as the probability distribution of X.


\[
\begin{array}{c|ccc}
X & x_1 & x_2 & x_3 \\
\hline
P(X=x) & P_1 & P_2 & P_3
\end{array}
\]


   *________________________________________________*

CUMULATIVE DISTRIBUTION OR DISTRIBUTION FUNCTION OF X (FOR DISCRETE RANDOM VARIABLE)

The cumulative distribution function F(x) of a discrete random variable X with probability distribution P(x) is given by
\[ F(x)=P(X \leq x) = \sum_{t\leq x} p(t), \qquad -\infty < x < \infty \]
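As a minimal sketch, the cumulative sum above can be computed directly from a probability mass function. The pmf values below are a hypothetical example, not taken from the text:

```python
# Hypothetical pmf for illustration: X takes the values 0, 1, 2, 3.
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

def cdf(x):
    """F(x) = P(X <= x) = sum of p(t) over all t <= x."""
    return sum(p for t, p in pmf.items() if t <= x)

print(round(cdf(1), 10))   # F(1) = p(0) + p(1) = 0.4
print(round(cdf(3), 10))   # F(3) = 1.0 (total probability)
```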

   *________________________________________________*

PROPERTIES OF DISTRIBUTION FUNCTION


(1.) $F(-\infty)=0$

(2.) $F(\infty)=1$

(3.) $0 \leq F(x) \leq 1$

(4.) $F(x_1)\leq F(x_2)$ if $x_1 < x_2$

(5.) $P(x_1 < X \leq x_2)=F(x_2)-F(x_1)$

(6.) $P(x_1 \leq X \leq x_2)=F(x_2)-F(x_1)+P[X=x_1]$

(7.) $P(x_1 < X < x_2)=F(x_2)-F(x_1)-P[X=x_2]$

(8.) $P(x_1 \leq X < x_2)=F(x_2)-F(x_1)-P(X=x_2)+P(X=x_1)$
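Properties (5) and (6) can be checked numerically against a concrete distribution. The pmf below is a hypothetical assumption chosen only for the check:

```python
# Hypothetical pmf (an assumption, not from the text).
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

def P(x):                      # P[X = x]
    return pmf.get(x, 0.0)

def F(x):                      # F(x) = P(X <= x)
    return sum(p for t, p in pmf.items() if t <= x)

x1, x2 = 1, 3
lhs5 = sum(P(t) for t in pmf if x1 < t <= x2)        # P(x1 < X <= x2)
assert abs(lhs5 - (F(x2) - F(x1))) < 1e-12           # property (5)

lhs6 = sum(P(t) for t in pmf if x1 <= t <= x2)       # P(x1 <= X <= x2)
assert abs(lhs6 - (F(x2) - F(x1) + P(x1))) < 1e-12   # property (6)
print("properties (5) and (6) hold")
```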

   *________________________________________________* 

RESULTS

(1) $P(X \leq \infty )=1$

(2) $P(X \leq -\infty)=0$

(3) If $ x_1 \leq x_2 $ then $F(x_1)\leq F(x_2)$

(4) $P(X>x)=1-P[X \leq x]$

(5) $P(X \leq x)=1-P(X>x)$

   *________________________________________________*

EXPECTED VALUE OF A DISCRETE RANDOM VARIABLE X

Let X be a discrete random variable assuming values $x_1,x_2,\dots,x_n$ with corresponding probabilities $P_1,P_2,\dots,P_n$. Then \[ E(X)=\sum_{i} x_i p(x_i) \] is called the expected value of X.
$E(X)$ is also commonly called the mean or the expectation of X. A useful identity states that for a function g, \[ E(g(X))=\sum_{x_i} g(x_i) p(x_i)  \]
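As a small sketch, both $E(X)$ and the identity for $E(g(X))$ reduce to weighted sums. The distribution below is a hypothetical assumption:

```python
# Hypothetical distribution: X takes 1, 2, 3 with the probabilities below.
xs = [1, 2, 3]
ps = [0.2, 0.5, 0.3]

E_X  = sum(x * p for x, p in zip(xs, ps))        # E(X) = sum of x_i * p(x_i)
E_X2 = sum(x**2 * p for x, p in zip(xs, ps))     # E(g(X)) with g(x) = x^2

print(round(E_X, 10))    # 2.1
print(round(E_X2, 10))   # 4.9
```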
   *________________________________________________*

THE VARIANCE OF A RANDOM VARIABLE X

It is defined by $Var(X)=E[(X-E(X))^2]$.

The variance is the expected value of the squared difference between X and its expected value. It is a measure of the spread of the possible values of X. An equivalent formula is \[ Var(X)=E[X^2]-[E(X)]^2 \] The quantity $\sqrt{Var(X)}$ is called the \textbf{standard deviation of X}.
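Both forms of the variance can be computed and compared on a concrete example; the pmf below is a hypothetical assumption:

```python
import math

# Hypothetical distribution (an assumption, not from the text).
xs = [1, 2, 3]
ps = [0.2, 0.5, 0.3]

E_X = sum(x * p for x, p in zip(xs, ps))
var_def   = sum((x - E_X)**2 * p for x, p in zip(xs, ps))   # E[(X - E(X))^2]
var_short = sum(x**2 * p for x, p in zip(xs, ps)) - E_X**2  # E[X^2] - [E(X)]^2
assert abs(var_def - var_short) < 1e-12                     # the two forms agree

std = math.sqrt(var_def)                                    # standard deviation
print(round(var_def, 10), round(std, 10))   # 0.49 0.7
```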
   *________________________________________________*

FORMULA

(1) $\sum_{i}p(x_i)=1$

(2) $F(x)=P[X\leq x]$,   e.g. $P[X\leq 4]=F(4)=P(0)+P(1)+P(2)+P(3)+P(4)$

(3) $P(1)=F(1)-F(0)$
     $P(2)=F(2)-F(1)$
     $P(3)=F(3)-F(2)$
 
(4) Mean $=E(X)=\sum x_i p(x_i)=$ Expected value
 
(5) $E[X^2]=\sum x_i^2 p(x_i)$
 
(6) Variance=$Var[X]=E[X^2]-[E(X)]^2$
 
(7) $E[aX+b]=aE[X]+b$
 
(8) $Var[aX \pm b]=a^2\, Var(X)$
 
(9) Probability mass function $p(x)=P[X=x]$
 
(10) Standard deviation $=\sqrt{Var(X)}$
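Formula (3) in the list above says the pmf can be recovered from the cdf by differencing. A minimal sketch, with a hypothetical pmf:

```python
# Hypothetical pmf (an assumption, not from the text).
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

def F(x):
    """F(x) = P(X <= x)."""
    return sum(p for t, p in pmf.items() if t <= x)

# Formula (3): p(x) = F(x) - F(x-1) for integer-valued X.
recovered = {x: F(x) - F(x - 1) for x in pmf}
for x in pmf:
    assert abs(recovered[x] - pmf[x]) < 1e-12
print("pmf recovered from cdf")
```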
  *________________________________________________* 
 
Theorem 1:
    Prove that $E(aX+b)=aE(X)+b$

Proof:

 $E[aX+b] = \sum_{x_i} (ax_i+b)p(x_i)$                    [$E(g(X))=\sum_{x_i} g(x_i)p(x_i)$]

 $= \sum_{x_i} [ax_i p(x_i)+b p(x_i)]$                    [Distributive property: $(a+b)\times c =(a\times c)+(b \times c)$]

 $= a \sum_{x_i} x_i p(x_i)+b\sum_{x_i} p(x_i)$           [$\sum (a+b)=\sum a+\sum b$]

 $= aE(X) +b$                                             [$E(X)=\sum x_i p(x_i)$, $\sum p(x_i)=1$]

$E[aX+b] = aE(X) +b$

         HENCE PROVED
*________________________________________________* 

Theorem 2:

    If X is a random variable, then show that \[ Var(aX+b)=a^2\,Var(X) \]


Proof:

Let $Y = aX+b$.

$E(Y) = E[aX+b] = aE(X)+b$                               [by Theorem 1]

$Y-E(Y) = (aX+b) - (aE(X)+b)$                            [substitute $Y$ and $E(Y)$]

$Y-E(Y) = aX - aE(X)$                                    [the $b$ terms cancel]

$Y-E(Y) = a[X-E(X)]$                                     [take out the common factor $a$]

$[Y-E(Y)]^2 = a^2 [X-E(X)]^2$                            [square both sides]

Taking expectations on both sides,

$E[Y-E(Y)]^2 = a^2\,E[X-E(X)]^2$

$Var(Y) = a^2\,Var(X)$                                   [$Var(Y)=E[Y-E(Y)]^2$, $Var(X)=E[X-E(X)]^2$]

i.e., $Var(aX+b) = a^2\,Var(X)$                          [$Y=aX+b$]

HENCE PROVED
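Both theorems can be verified numerically on a concrete distribution. The pmf and the constants $a,b$ below are hypothetical assumptions:

```python
# Hypothetical distribution and constants (assumptions, not from the text).
xs = [0, 1, 2]
ps = [0.25, 0.5, 0.25]
a, b = 3, 7

def E(vals):
    """Expectation of a function of X, given its values at x_1, x_2, ..."""
    return sum(v * p for v, p in zip(vals, ps))

E_X   = E(xs)
Var_X = E([x**2 for x in xs]) - E_X**2

ys    = [a * x + b for x in xs]          # Y = aX + b takes a*x_i + b with prob p_i
E_Y   = E(ys)
Var_Y = E([y**2 for y in ys]) - E_Y**2

assert abs(E_Y - (a * E_X + b)) < 1e-12      # Theorem 1: E(aX+b) = aE(X)+b
assert abs(Var_Y - a**2 * Var_X) < 1e-12     # Theorem 2: Var(aX+b) = a^2 Var(X)
print(E_Y, Var_Y)   # 10.0 4.5
```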
*________________________________________________*  

RESULT


 $\star$ $P[X=x_i]=p(x_i)$
    $\to $ Probability function (or) probability distribution (or) probability mass function (p.m.f)
   
$\star$ $f(x)\to$ Probability density function (p.d.f) (or) density function
   
$\star$  $F(x) \to $ Cumulative distribution function (c.d.f) (or) distribution function

 
*________________________________________________*  

PROBLEMS UNDER THE DISTRIBUTION FUNCTION FOR DISCRETE RANDOM VARIABLE

Example 1
    For the following probability distribution: (i) Find the distribution function of X. (ii) What is the smallest value of $x$ for which $P(X\leq x) > 0.5$?
