Jointly Distributed Random Variables: 11 Important Facts


Jointly distributed random variables

     Jointly distributed random variables are two or more random variables whose probabilities are specified together, as a joint distribution. In other words, in experiments where several outcomes are observed at once, the common probability assigned to the different outcomes is known as the joint distribution of the corresponding random variables. Such situations occur frequently when dealing with problems of chance.

Joint distribution function | Joint cumulative probability distribution function | Joint probability mass function | Joint probability density function

    For the random variables X and Y the distribution function, or joint cumulative distribution function, is

F(a, b) = P\{X \leq a, Y \leq b\}, \qquad -\infty < a, b < \infty

where the nature of the joint probability depends on whether the random variables X and Y are discrete or continuous, and the individual distribution functions for X and Y can be obtained from this joint cumulative distribution function as

F_X(a) = P\{X \leq a\} = P\{X \leq a, Y < \infty\} = F(a, \infty)

similarly for Y as

F_Y(b) = P\{Y \leq b\} = P\{X < \infty, Y \leq b\} = F(\infty, b)

These individual distribution functions of X and Y are known as marginal distribution functions when a joint distribution is under consideration. They are very helpful for getting probabilities like

P\{X > a, Y > b\} = 1 - F_X(a) - F_Y(b) + F(a, b)

and in addition the joint probability mass function for the discrete random variables X and Y is defined as

p(x, y) = P\{X = x, Y = y\}

The individual probability mass or density functions for X and Y can be obtained with the help of the joint probability mass or density function; in terms of discrete random variables,

p_X(x) = \sum_{y} p(x, y), \qquad p_Y(y) = \sum_{x} p(x, y)

and in terms of continuous random variables, the joint probability density function f(x, y) gives

P\{(X, Y) \in C\} = \iint_{(x, y) \in C} f(x, y)\, dx\, dy

where C is any set in the two-dimensional plane, and the joint distribution function for continuous random variables will be

F(a, b) = P\{X \leq a, Y \leq b\} = \int_{-\infty}^{b} \int_{-\infty}^{a} f(x, y)\, dx\, dy

The probability density function can be obtained from this distribution function by differentiating:

f(a, b) = \frac{\partial^2}{\partial a\, \partial b} F(a, b)

and the marginal probabilities follow from the joint probability density function

P\{X \in A\} = P\{X \in A, Y \in (-\infty, \infty)\} = \int_A \int_{-\infty}^{\infty} f(x, y)\, dy\, dx = \int_A f_X(x)\, dx

with the marginal densities

f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy

and

f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx

with respect to the random variables X and Y respectively.
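
As a quick illustration of these definitions, here is a minimal Python sketch (the joint table is a made-up example, not one from this article) that recovers the marginal mass functions from a joint one by summing over the other variable:

```python
import numpy as np

# Hypothetical joint pmf p(x, y): rows index x, columns index y.
p = np.array([[0.10, 0.20],
              [0.30, 0.40]])

p_X = p.sum(axis=1)   # marginal pmf of X: sum over y
p_Y = p.sum(axis=0)   # marginal pmf of Y: sum over x

print("p_X =", p_X)         # [0.3 0.7]
print("p_Y =", p_Y)         # [0.4 0.6]
print("total =", p.sum())   # 1.0, as a joint pmf must sum to one
```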

Examples on Joint distribution

  1. Find the joint probabilities for the random variables X and Y representing the numbers of mathematics and statistics books when 3 books are taken at random from a set containing 3 mathematics, 4 statistics and 5 physics books.

Since each of the \binom{12}{3} = 220 possible samples is equally likely, the joint probability mass function is

p(i, j) = \binom{3}{i} \binom{4}{j} \binom{5}{3-i-j} \Big/ \binom{12}{3}, \qquad i + j \leq 3

for example, p(0, 0) = \binom{5}{3} \big/ \binom{12}{3} = \frac{10}{220}.
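
A short numerical check of this mass function is easy in Python; the sketch below just re-evaluates the formula above and confirms the probabilities sum to one:

```python
from math import comb

# X = mathematics books, Y = statistics books among 3 drawn from
# 3 mathematics + 4 statistics + 5 physics = 12 books.
def p(i, j):
    if i < 0 or j < 0 or i + j > 3:
        return 0.0
    return comb(3, i) * comb(4, j) * comb(5, 3 - i - j) / comb(12, 3)

print(p(0, 0))  # 10/220 ≈ 0.0455
print(sum(p(i, j) for i in range(4) for j in range(4)))  # 1.0
```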
  2. Find the joint probability mass function of the numbers of boys and girls in a family chosen at random from a sample in which 15% of families have no child, 20% have 1 child, 35% have 2 children and 30% have 3 children, with each child equally likely to be a boy or a girl.

With X the number of boys and Y the number of girls, we find the joint probability by using the definition as

P\{X = i, Y = j\} = P\{i + j \text{ children}\}\, \binom{i+j}{i} \left(\frac{1}{2}\right)^{i+j}

and this we can illustrate in tabular form as follows:

P{X=i, Y=j}    j=0       j=1       j=2       j=3
i=0            0.15      0.10      0.0875    0.0375
i=1            0.10      0.175     0.1125    0
i=2            0.0875    0.1125    0         0
i=3            0.0375    0         0         0
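
The table can be reproduced with a few lines of Python; the sketch below simply encodes the family-size probabilities and the binomial split of boys and girls given above:

```python
from math import comb

# P{n children}: 15% none, 20% one, 35% two, 30% three; each child is a boy
# with probability 1/2 independently of the others.
p_children = {0: 0.15, 1: 0.20, 2: 0.35, 3: 0.30}

def p(i, j):  # P{X = i boys, Y = j girls}
    n = i + j
    if n not in p_children:
        return 0.0
    return p_children[n] * comb(n, i) * 0.5 ** n

for i in range(4):
    print([round(p(i, j), 4) for j in range(4)])
# e.g. row i = 1: [0.1, 0.175, 0.1125, 0.0], matching the table above
```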
  3. Calculate the probabilities

P\{X > 1, Y < 1\}, \qquad P\{X < Y\}, \qquad P\{X < a\}

if for the random variables X and Y the joint probability density function is given by

f(x, y) = \begin{cases} 2e^{-x} e^{-2y} & 0 < x < \infty,\ 0 < y < \infty \\ 0 & \text{otherwise} \end{cases}

With the help of the definition of joint probability for continuous random variables,

P\{(X, Y) \in C\} = \iint_C f(x, y)\, dx\, dy

and the given joint density function, the first probability for the given range will be

P\{X > 1, Y < 1\} = \int_0^1 \int_1^{\infty} 2e^{-x} e^{-2y}\, dx\, dy
= \int_0^1 2e^{-2y} \left(-e^{-x} \Big|_1^{\infty}\right) dy
= e^{-1} \int_0^1 2e^{-2y}\, dy
= e^{-1}(1 - e^{-2})

In the similar way, the probability

P\{X < Y\} = \iint_{(x, y):\, x < y} 2e^{-x} e^{-2y}\, dx\, dy
= \int_0^{\infty} \int_0^{y} 2e^{-x} e^{-2y}\, dx\, dy
= \int_0^{\infty} 2e^{-2y} (1 - e^{-y})\, dy
= 1 - \frac{2}{3} = \frac{1}{3}

and finally

P\{X < a\} = \int_0^{a} \int_0^{\infty} 2e^{-2y} e^{-x}\, dy\, dx
= \int_0^{a} e^{-x}\, dx
= 1 - e^{-a}
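
These three answers can be verified numerically; here is a minimal sketch with scipy (note that dblquad integrates func(inner, outer) with the inner variable first, so the argument order matters):

```python
import numpy as np
from scipy import integrate

# f(x, y) = 2 e^{-x} e^{-2y} on x, y > 0.

# P{X > 1, Y < 1}: outer y in (0, 1), inner x in (1, inf).
p1, _ = integrate.dblquad(lambda x, y: 2 * np.exp(-x - 2 * y),
                          0, 1, lambda y: 1, lambda y: np.inf)
print(p1, np.exp(-1) * (1 - np.exp(-2)))   # both ≈ 0.3181

# P{X < Y}: outer y in (0, inf), inner x in (0, y).
p2, _ = integrate.dblquad(lambda x, y: 2 * np.exp(-x - 2 * y),
                          0, np.inf, lambda y: 0, lambda y: y)
print(p2, 1 / 3)                           # ≈ 0.3333

# P{X < a} with a = 2: outer x in (0, a), inner y in (0, inf).
a = 2.0
p3, _ = integrate.dblquad(lambda y, x: 2 * np.exp(-x - 2 * y),
                          0, a, lambda x: 0, lambda x: np.inf)
print(p3, 1 - np.exp(-a))                  # both ≈ 0.8647
```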
  4. Find the density function of the quotient X/Y of the random variables X and Y if their joint probability density function is

f(x, y) = \begin{cases} e^{-(x+y)} & 0 < x < \infty,\ 0 < y < \infty \\ 0 & \text{otherwise} \end{cases}

To find the probability density function of X/Y we first find its distribution function and then differentiate the result.

By the definition of the distribution function and the given probability density function we have

F_{X/Y}(a) = P\left\{\frac{X}{Y} \leq a\right\}
= \iint_{x/y \leq a} e^{-(x+y)}\, dx\, dy
= \int_0^{\infty} \int_0^{ay} e^{-(x+y)}\, dx\, dy
= \int_0^{\infty} \left(1 - e^{-ay}\right) e^{-y}\, dy
= \left[-e^{-y} + \frac{e^{-(a+1)y}}{a+1}\right]_0^{\infty} = 1 - \frac{1}{a+1}

Thus, by differentiating this distribution function with respect to a, we get the density function

f_{X/Y}(a) = \frac{1}{(a+1)^2}, \qquad 0 < a < \infty
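
A Monte Carlo check of this density is straightforward, since the given joint density makes X and Y independent rate-1 exponentials (the sample size and seed below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# With f(x, y) = e^{-(x+y)}, X and Y are independent Exp(1) variables.
x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)
q = x / y

# Empirical CDF of X/Y versus F(a) = 1 - 1/(a + 1).
for a in (0.5, 1.0, 3.0):
    print(a, (q <= a).mean(), 1 - 1 / (a + 1))
```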

Independent random variables and joint distribution

     In the joint distribution, the two random variables X and Y are said to be independent if

P\{X \in A, Y \in B\} = P\{X \in A\}\, P\{Y \in B\}

where A and B are any sets of real numbers. As we already know in terms of events, independent random variables are random variables whose events are independent.

Thus for any values of a and b

P\{X \leq a, Y \leq b\} = P\{X \leq a\}\, P\{Y \leq b\}

and the joint distribution or cumulative distribution function for the independent random variables X and Y will be

F(a, b) = F_X(a)\, F_Y(b)

If we consider the discrete random variables X and Y, then

p(x, y) = p_X(x)\, p_Y(y)

since

P\{X \in A, Y \in B\} = \sum_{y \in B} \sum_{x \in A} p(x, y)
= \sum_{y \in B} \sum_{x \in A} p_X(x)\, p_Y(y)
= \sum_{y \in B} p_Y(y) \sum_{x \in A} p_X(x)
= P\{X \in A\}\, P\{Y \in B\}

and similarly for the continuous random variables

f(x, y) = f_X(x)\, f_Y(y)
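
A small numerical illustration of this factorization for a discrete joint pmf (the marginal values below are made up):

```python
import numpy as np

# Made-up marginals; the joint pmf of independent X and Y is their outer product.
p_X = np.array([0.2, 0.8])
p_Y = np.array([0.5, 0.3, 0.2])
p = np.outer(p_X, p_Y)

# Recover the marginals and confirm p(x, y) = p_X(x) p_Y(y) everywhere.
print(np.allclose(p.sum(axis=1), p_X))   # True
print(np.allclose(p.sum(axis=0), p_Y))   # True
print(np.allclose(p, np.outer(p.sum(axis=1), p.sum(axis=0))))  # True
```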

Example of independent joint distribution

  1. If on a specific day the patients entering a hospital are Poisson distributed with parameter λ, and the probability of a patient being male is p while the probability of a patient being female is (1 − p), then show that the numbers of male and female patients entering the hospital are independent Poisson random variables with parameters λp and λ(1 − p).

Denote the numbers of male and female patients by the random variables X and Y; then, conditioning on the total number of patients,

P\{X = i, Y = j\} = P\{X = i, Y = j \mid X + Y = i + j\}\, P\{X + Y = i + j\}

since the event \{X = i, Y = j\} can occur only when X + Y = i + j.

As X + Y is the total number of patients entering the hospital, which is Poisson distributed,

P\{X + Y = i + j\} = e^{-\lambda} \frac{\lambda^{i+j}}{(i+j)!}

As the probability of a male patient is p and of a female patient is (1 − p), the number of males among a fixed total of i + j patients follows the binomial probability

P\{X = i, Y = j \mid X + Y = i + j\} = \binom{i+j}{i} p^{i} (1-p)^{j}

Using these two values we get the above joint probability as

P\{X = i, Y = j\} = \binom{i+j}{i} p^i (1-p)^j\, e^{-\lambda} \frac{\lambda^{i+j}}{(i+j)!}
= \frac{e^{-\lambda} (\lambda p)^i (\lambda(1-p))^j}{i!\, j!}
= e^{-\lambda p} \frac{(\lambda p)^i}{i!}\, e^{-\lambda(1-p)} \frac{(\lambda(1-p))^j}{j!}

thus the probability of i male patients will be

P\{X = i\} = \sum_{j=0}^{\infty} P\{X = i, Y = j\} = e^{-\lambda p} \frac{(\lambda p)^i}{i!}

and of j female patients

P\{Y = j\} = e^{-\lambda(1-p)} \frac{(\lambda(1-p))^j}{j!}

which shows that both of them are Poisson random variables with parameters λp and λ(1 − p), and since P{X = i, Y = j} = P{X = i} P{Y = j}, they are independent.
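
A simulation makes this splitting (often called Poisson thinning) easy to see; λ = 30, p = 0.4 and the sample size below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, p, days = 30.0, 0.4, 10**5

# Daily totals are Poisson(λ); each patient is male with probability p.
n = rng.poisson(lam, days)
males = rng.binomial(n, p)
females = n - males

print(males.mean(), lam * p)              # ≈ 12 = λp
print(females.mean(), lam * (1 - p))      # ≈ 18 = λ(1 - p)
print(np.corrcoef(males, females)[0, 1])  # ≈ 0, consistent with independence
```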

2. Find the probability that at a meeting the first to arrive has to wait more than ten minutes for the other, if a person and a client each arrive at a time uniformly distributed between 12 noon and 1 pm, independently of each other.

Let the random variables X and Y denote, in minutes after 12, the arrival times of the person and the client, so each is uniform on (0, 60). The required probability is, by symmetry,

P\{X + 10 < Y\} + P\{Y + 10 < X\} = 2\, P\{X + 10 < Y\}
= 2 \iint_{x + 10 < y} \frac{1}{60} \cdot \frac{1}{60}\, dx\, dy
= \frac{2}{60^2} \int_{10}^{60} \int_0^{y - 10} dx\, dy
= \frac{2}{60^2} \int_{10}^{60} (y - 10)\, dy
= \frac{25}{36}
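
A Monte Carlo check of the 25/36 answer, using the equivalent event |X − Y| > 10 (the first to arrive waits more than ten minutes exactly when the arrival times differ by more than ten):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6

# Arrival times in minutes after 12, independent and uniform on (0, 60).
x = rng.uniform(0, 60, n)
y = rng.uniform(0, 60, n)

print((np.abs(x - y) > 10).mean(), 25 / 36)  # both ≈ 0.6944
```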

3. Calculate

P\{X \geq YZ\}

where X, Y and Z are independent uniform random variables over the interval (0, 1).

Here the probability will be

P\{X \geq YZ\} = \iiint_{x \geq yz} f(x, y, z)\, dx\, dy\, dz

For the uniform distribution the joint density function over the given range is

f(x, y, z) = f_X(x)\, f_Y(y)\, f_Z(z) = 1, \qquad 0 \leq x, y, z \leq 1

so

P\{X \geq YZ\} = \int_0^1 \int_0^1 \int_{yz}^1 dx\, dy\, dz
= \int_0^1 \int_0^1 (1 - yz)\, dy\, dz
= \int_0^1 \left(1 - \frac{z}{2}\right) dz
= \frac{3}{4}
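
The same answer falls out of a quick Monte Carlo estimate (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)

# X, Y, Z independent uniforms on (0, 1).
x, y, z = rng.uniform(size=(3, 10**6))

print((x >= y * z).mean(), 3 / 4)  # both ≈ 0.75
```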

Sums of independent random variables by joint distribution

  For the sum of independent continuous random variables X and Y with probability density functions f_X and f_Y, the cumulative distribution function will be

F_{X+Y}(a) = P\{X + Y \leq a\}
= \iint_{x + y \leq a} f_X(x)\, f_Y(y)\, dx\, dy
= \int_{-\infty}^{\infty} \int_{-\infty}^{a - y} f_X(x)\, f_Y(y)\, dx\, dy
= \int_{-\infty}^{\infty} F_X(a - y)\, f_Y(y)\, dy

By differentiating this cumulative distribution function, the probability density function of the sum of these independent variables is the convolution

f_{X+Y}(a) = \frac{d}{da} \int_{-\infty}^{\infty} F_X(a - y)\, f_Y(y)\, dy
= \int_{-\infty}^{\infty} \frac{d}{da} F_X(a - y)\, f_Y(y)\, dy
= \int_{-\infty}^{\infty} f_X(a - y)\, f_Y(y)\, dy
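
The convolution formula can be approximated on a grid; the sketch below convolves two standard exponential densities and compares with the exact Gamma(2, 1) sum density (grid spacing and test point are arbitrary choices):

```python
import numpy as np

dx = 0.001
t = np.arange(0, 10, dx)
f = np.exp(-t)                  # Exp(1) density sampled on a grid

# Discrete approximation of f_{X+Y}(a) = ∫ f_X(a - y) f_Y(y) dy.
conv = np.convolve(f, f)[:len(t)] * dx

a = 2.0
i = int(a / dx)
print(conv[i], a * np.exp(-a))  # exact sum density is a e^{-a} ≈ 0.2707
```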

Using these two results, we now work out the sums of some specific independent continuous random variables.

Sum of independent uniform random variables

   For random variables X and Y uniformly distributed over the interval (0, 1), the probability density function of each of these independent variables is

f_X(a) = f_Y(a) = \begin{cases} 1 & 0 < a < 1 \\ 0 & \text{otherwise} \end{cases}

so for the sum X + Y we have

f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a - y)\, f_Y(y)\, dy = \int_0^1 f_X(a - y)\, dy

For any value of a lying between zero and one,

f_{X+Y}(a) = \int_0^a dy = a

and if we restrict a between one and two, it will be

f_{X+Y}(a) = \int_{a-1}^1 dy = 2 - a

This gives the triangular-shaped density function

f_{X+Y}(a) = \begin{cases} a & 0 \leq a \leq 1 \\ 2 - a & 1 < a < 2 \\ 0 & \text{otherwise} \end{cases}

If we generalize to n independent uniform random variables X_1, ..., X_n, then their distribution function, by mathematical induction, will be

F_{X_1 + \cdots + X_n}(x) = \frac{x^n}{n!}, \qquad 0 \leq x \leq 1
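
Both the triangular case and the general result F(x) = xⁿ/n! on [0, 1] are easy to check by simulation (the test points below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10**6

# Two summands: F(0.5) = 0.5^2 / 2! = 0.125.
s2 = rng.uniform(size=(2, n)).sum(axis=0)
print((s2 <= 0.5).mean(), 0.5**2 / 2)

# Three summands: F(0.9) = 0.9^3 / 3! = 0.1215.
s3 = rng.uniform(size=(3, n)).sum(axis=0)
print((s3 <= 0.9).mean(), 0.9**3 / 6)
```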

Sum of independent gamma random variables

    If we have two independent gamma random variables X and Y with parameters (s, λ) and (t, λ) and the usual density function

f(y) = \frac{\lambda e^{-\lambda y} (\lambda y)^{t-1}}{\Gamma(t)}, \qquad 0 < y < \infty

then, following the convolution formula, the density of the sum of the independent gamma random variables is

f_{X+Y}(a) = \frac{1}{\Gamma(s)\Gamma(t)} \int_0^a \lambda e^{-\lambda(a-y)} (\lambda(a-y))^{s-1}\, \lambda e^{-\lambda y} (\lambda y)^{t-1}\, dy
= K e^{-\lambda a} \int_0^a (a - y)^{s-1} y^{t-1}\, dy
= K e^{-\lambda a} a^{s+t-1} \int_0^1 (1 - x)^{s-1} x^{t-1}\, dx \qquad (\text{substituting } y = ax)
= C e^{-\lambda a} a^{s+t-1}

where K and C collect the constants; since the density must integrate to one,

f_{X+Y}(a) = \frac{\lambda e^{-\lambda a} (\lambda a)^{s+t-1}}{\Gamma(s+t)}, \qquad 0 < a < \infty

This shows that the sum of independent gamma random variables with a common rate λ is again a gamma random variable, with shape parameter s + t.
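
A distributional check with scipy (the shape and rate values below are arbitrary; note that scipy and numpy parametrize the gamma by scale = 1/λ):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
s, t, lam, n = 2.0, 3.5, 1.5, 10**5

# Sample Gamma(s, λ) and Gamma(t, λ); the sampler takes shape and scale = 1/λ.
x = rng.gamma(shape=s, scale=1 / lam, size=n)
y = rng.gamma(shape=t, scale=1 / lam, size=n)

# Kolmogorov-Smirnov test of X + Y against Gamma(s + t, λ).
res = stats.kstest(x + y, stats.gamma(a=s + t, scale=1 / lam).cdf)
print(res.pvalue)  # typically large: consistent with Gamma(s + t, λ)
```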

Sum of independent exponential random variables

    In a similar way to the gamma random variables, we can obtain the density and distribution functions for a sum of independent exponential random variables by specifically assigning the gamma parameters: an exponential with rate λ is Gamma(1, λ), so the sum of n independent such exponentials is Gamma(n, λ).

Sum of independent normal random variables | Sum of independent normal distributions

                If we have n independent normal random variables X_i, i = 1, 2, ..., n, with respective means μ_i and variances σ_i², then their sum is also a normal random variable with mean Σμ_i and variance Σσ_i².

    We first show that the sum of two independent normal random variables is normally distributed. Take X with parameters 0 and σ² and Y with parameters 0 and 1, and find the probability density function of the sum X + Y from

f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a - y)\, f_Y(y)\, dy

In the joint distribution density function, with the help of the definition of the density function of the normal distribution,

f_X(a - y)\, f_Y(y) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left\{-\frac{(a - y)^2}{2\sigma^2}\right\} \frac{1}{\sqrt{2\pi}} \exp\left\{-\frac{y^2}{2}\right\}

and completing the square in the exponent,

-\frac{(a - y)^2}{2\sigma^2} - \frac{y^2}{2} = -\frac{a^2}{2(1 + \sigma^2)} - \frac{1 + \sigma^2}{2\sigma^2} \left(y - \frac{a}{1 + \sigma^2}\right)^2

thus the density function will be

f_{X+Y}(a) = \frac{1}{2\pi\sigma} e^{-a^2 / (2(1 + \sigma^2))} \int_{-\infty}^{\infty} \exp\left\{-\frac{1 + \sigma^2}{2\sigma^2} \left(y - \frac{a}{1 + \sigma^2}\right)^2\right\} dy
= \frac{1}{2\pi\sigma} e^{-a^2 / (2(1 + \sigma^2))} \sqrt{\frac{2\pi\sigma^2}{1 + \sigma^2}}
= \frac{1}{\sqrt{2\pi(1 + \sigma^2)}} \exp\left\{-\frac{a^2}{2(1 + \sigma^2)}\right\}

which is nothing but the density function of a normal distribution with mean 0 and variance (1 + σ²). Following the same argument, for general parameters

X_1 + X_2 \sim N(\mu_1 + \mu_2,\ \sigma_1^2 + \sigma_2^2)

with the usual means and variances, since shifting and scaling reduce the general case to the one above: the sum is normally distributed with mean the sum of the respective means and variance the sum of the respective variances.

Thus, in the same way, the sum of n independent normal random variables is a normally distributed random variable with mean Σμ_i and variance Σσ_i².
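
A quick simulation check of the additivity of means and variances (the parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10**6

# X ~ N(1, 2^2) and Y ~ N(-3, 1.5^2) independent.
x = rng.normal(1.0, 2.0, n)
y = rng.normal(-3.0, 1.5, n)
s = x + y

print(s.mean(), 1.0 - 3.0)       # ≈ -2, the sum of the means
print(s.var(), 2.0**2 + 1.5**2)  # ≈ 6.25, the sum of the variances
```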

Sums of independent Poisson random variables

If we have two independent Poisson random variables X and Y with parameters λ₁ and λ₂, then their sum X + Y is also a Poisson random variable, Poisson distributed with parameter λ₁ + λ₂.

Since X and Y are Poisson distributed and we can write their sum as a union of disjoint events,

P\{X + Y = n\} = \sum_{k=0}^{n} P\{X = k, Y = n - k\}

and by using the probability of independent random variables,

= \sum_{k=0}^{n} P\{X = k\}\, P\{Y = n - k\}
= \sum_{k=0}^{n} e^{-\lambda_1} \frac{\lambda_1^k}{k!}\, e^{-\lambda_2} \frac{\lambda_2^{n-k}}{(n-k)!}
= e^{-(\lambda_1 + \lambda_2)} \frac{1}{n!} \sum_{k=0}^{n} \frac{n!}{k!\,(n-k)!} \lambda_1^k \lambda_2^{n-k}
= e^{-(\lambda_1 + \lambda_2)} \frac{(\lambda_1 + \lambda_2)^n}{n!}

where the last step uses the binomial theorem.

So we get that the sum X + Y is also Poisson distributed, with mean λ₁ + λ₂.
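
Since the pmf of a sum of independent variables is the convolution of the pmfs, this result can be checked exactly on a truncated grid (the λ values below are arbitrary):

```python
import numpy as np
from scipy import stats

lam1, lam2 = 2.0, 3.0
k = np.arange(20)

# Convolving the two pmfs gives the pmf of the sum exactly on 0..19.
conv = np.convolve(stats.poisson.pmf(k, lam1),
                   stats.poisson.pmf(k, lam2))[:len(k)]

print(np.allclose(conv, stats.poisson.pmf(k, lam1 + lam2)))  # True
```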

Sums of independent binomial random variables

                If we have two independent binomial random variables X and Y with parameters (n, p) and (m, p), then their sum X + Y is also a binomial random variable, binomially distributed with parameters (n + m, p).

Let us use the probability of the sum with the definition of the binomial distribution, writing q = 1 − p:

P\{X + Y = k\} = \sum_{i=0}^{k} P\{X = i, Y = k - i\}
= \sum_{i=0}^{k} P\{X = i\}\, P\{Y = k - i\}
= \sum_{i=0}^{k} \binom{n}{i} p^i q^{n-i} \binom{m}{k-i} p^{k-i} q^{m-k+i}
= p^k q^{n+m-k} \sum_{i=0}^{k} \binom{n}{i} \binom{m}{k-i}

which, by the Vandermonde identity \sum_{i} \binom{n}{i} \binom{m}{k-i} = \binom{n+m}{k}, gives

P\{X + Y = k\} = \binom{n+m}{k} p^k q^{n+m-k}

so the sum X + Y is also binomially distributed with parameters (n + m, p).
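
The same convolution check works for the binomial case, which numerically confirms the Vandermonde step above (the parameter values are arbitrary):

```python
import numpy as np
from scipy import stats

n, m, p = 5, 7, 0.3

# pmf of X + Y as the convolution of the two binomial pmfs.
pmf_sum = np.convolve(stats.binom.pmf(np.arange(n + 1), n, p),
                      stats.binom.pmf(np.arange(m + 1), m, p))

print(np.allclose(pmf_sum, stats.binom.pmf(np.arange(n + m + 1), n + m, p)))  # True
```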

Conclusion:

The concept of jointly distributed random variables, which gives the distribution of more than one random variable jointly, was discussed, along with the basic concept of independence via the joint distribution and the sums of independent random variables for several distributions, with examples and their parameters. If you require further reading, go through the books listed below.

Sheldon Ross, A First Course in Probability

Schaum's Outlines of Probability and Statistics

Rohatgi and Saleh, An Introduction to Probability and Statistics

https://en.wikipedia.org