In today's post we discuss discrete random variables from the textbook we have been referring to. A random variable X is a function from a finite or countably infinite sample space S to the real numbers. It associates a real number with each possible outcome of an experiment, which lets us work with the probability distribution induced on the resulting set of numbers. Random variables can also be defined for uncountably infinite sample spaces, but here we will only look at random variables that are discrete.
For a random variable X and a real number x, we define the event X = x to be {s in S : X(s) = x}; thus Pr[X = x] = sum of Pr[s] over all s in S with X(s) = x.
The function f(x) = Pr[X = x] is the probability density function of the random variable X.
By the axioms of probability, we know that Pr[X = x] >= 0 for every x,
and that the sum of Pr[X = x] over all values x is equal to 1.
If we take the example of a pair of dice, with six possible outcomes each, and we define a random variable X to be the maximum of the two values showing on the dice, then we have
Pr[X = 3] = 5/36
because there are 36 equally likely outcomes when we take the values in ordered pairs,
and X takes the value 3 for exactly 5 of them: (1,3), (2,3), (3,3), (3,2), (3,1).
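To make this concrete, here is a minimal Python sketch (my own, not from the textbook) that enumerates all 36 outcomes and tabulates the probability density function of X. I use Fraction so the probabilities stay exact instead of becoming floating-point approximations.

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Sample space: all 36 equally likely ordered pairs of dice values.
outcomes = list(product(range(1, 7), repeat=2))

# X maps each outcome to the maximum of the two values showing.
counts = Counter(max(a, b) for a, b in outcomes)

# Probability density function f(x) = Pr[X = x].
pmf = {x: Fraction(c, len(outcomes)) for x, c in counts.items()}

print(pmf[3])             # 5/36, matching the count above
print(sum(pmf.values()))  # 1, as the axioms require
```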
It is common for several random variables to be defined on the same sample space. If there are two random variables defined on the same sample space, say X and Y, then the function
Pr[X = x and Y = y]
is their joint probability distribution.
If we fix one of the values, we can vary the other and sum the joint probabilities.
For a fixed value y, Pr[Y = y] = sum over all x of Pr[X = x and Y = y].
The same goes for a fixed value of x, where we sum over all y.
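Here is a quick sketch of how the marginals fall out of a joint distribution. The joint probabilities, and the helper names pr_x and pr_y, are made up purely for illustration:

```python
from fractions import Fraction

# A toy joint distribution Pr[X = x and Y = y] over x, y in {0, 1},
# stored as a dict keyed by (x, y). The numbers are made up for
# illustration and sum to 1.
joint = {
    (0, 0): Fraction(1, 10), (0, 1): Fraction(2, 10),
    (1, 0): Fraction(3, 10), (1, 1): Fraction(4, 10),
}

def pr_y(y):
    """Marginal Pr[Y = y]: sum the joint over all values of x."""
    return sum(p for (x, yy), p in joint.items() if yy == y)

def pr_x(x):
    """Marginal Pr[X = x]: sum the joint over all values of y."""
    return sum(p for (xx, y), p in joint.items() if xx == x)

print(pr_y(1))  # 2/10 + 4/10 = 3/5
print(pr_x(0))  # 1/10 + 2/10 = 3/10
```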
We can extend this to conditional probabilities as well. For example, provided Pr[Y = y] > 0,
Pr[X = x | Y = y] = Pr[X = x and Y = y] / Pr[Y = y]
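Continuing in the same style, a small sketch of the conditional probability formula, again with illustrative numbers:

```python
from fractions import Fraction

# The same toy joint distribution as in the previous sketch.
joint = {
    (0, 0): Fraction(1, 10), (0, 1): Fraction(2, 10),
    (1, 0): Fraction(3, 10), (1, 1): Fraction(4, 10),
}

def pr_x_given_y(x, y):
    """Pr[X = x | Y = y] = Pr[X = x and Y = y] / Pr[Y = y]."""
    pr_y = sum(p for (xx, yy), p in joint.items() if yy == y)
    return joint[(x, y)] / pr_y

print(pr_x_given_y(0, 1))  # (2/10) / (6/10) = 1/3
```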
We say that two random variables X and Y are independent if for all x and y
the events X = x and Y = y are independent, which we can express as
Pr[X = x and Y = y] = Pr[X = x] * Pr[Y = y]
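Here is a rough sketch of that check, with an is_independent helper of my own; it flags the earlier toy joint as dependent and a product of marginals as independent:

```python
from fractions import Fraction

def is_independent(joint):
    """Check Pr[X = x and Y = y] == Pr[X = x] * Pr[Y = y] for all x, y."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    px = {x: sum(joint[(x, y)] for y in ys) for x in xs}  # marginals of X
    py = {y: sum(joint[(x, y)] for x in xs) for y in ys}  # marginals of Y
    return all(joint[(x, y)] == px[x] * py[y] for x in xs for y in ys)

# The toy joint from the previous sketches is not independent:
dependent = {
    (0, 0): Fraction(1, 10), (0, 1): Fraction(2, 10),
    (1, 0): Fraction(3, 10), (1, 1): Fraction(4, 10),
}
print(is_independent(dependent))  # False: 1/10 != (3/10) * (4/10)

# A joint built as a product of marginals is independent by construction:
independent = {(x, y): p * q
               for x, p in {0: Fraction(1, 4), 1: Fraction(3, 4)}.items()
               for y, q in {0: Fraction(1, 2), 1: Fraction(1, 2)}.items()}
print(is_independent(independent))  # True
```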
The simplest summary of the distribution of a random variable is the average of the values it takes, its expected value E[X] = sum over all x of x * Pr[X = x].
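As a quick illustration, a sketch that computes this average for the dice example above. Since each outcome s has probability 1/36, summing X(s) * Pr[s] over the outcomes gives the same value as summing x * Pr[X = x] over x:

```python
from itertools import product
from fractions import Fraction

# E[X] for X = max of two fair dice, computed as sum of X(s) * Pr[s].
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(outcomes))  # each of the 36 outcomes has probability 1/36
expectation = sum(max(a, b) * p for a, b in outcomes)

print(expectation)  # 161/36, about 4.47
```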