Question

In: Statistics and Probability

(A universal random number generator.) Let X have a continuous, strictly increasing cdf F. Let Y = F(X). Find the density of Y. This is called the probability integral transform. Now let U ∼ Uniform(0,1) and let X = F⁻¹(U). Show that X ∼ F. Now write a program that takes Uniform(0,1) random variables and generates random variables from an Exponential(β) distribution.

Solutions

Expert Solution

Probability Integral Transform method:

If X has a continuous, strictly increasing cdf F, then Y = F(X) satisfies, for 0 < y < 1,

P(Y ≤ y) = P(F(X) ≤ y) = P(X ≤ F⁻¹(y)) = F(F⁻¹(y)) = y,

so Y has density f_Y(y) = 1 on (0,1); that is, Y ∼ Uniform(0,1). Conversely, if U ∼ Uniform(0,1) and X = F⁻¹(U), then P(X ≤ x) = P(F⁻¹(U) ≤ x) = P(U ≤ F(x)) = F(x), so X ∼ F. This means we can generate observations from any such distribution F by generating Uniform(0,1) random variables (which most software packages can do easily) and applying the transformation X = F⁻¹(U).
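As a quick numerical sanity check of the first direction (this example, using the standard normal as F, is an illustration not taken from the original solution), the transformed values Y = F(X) should look like Uniform(0,1) draws:

```r
# Probability integral transform check:
# if X ~ Normal(0,1), then Y = pnorm(X) = F(X) should be Uniform(0,1).
set.seed(1)
x <- rnorm(1e5)        # draws from F = standard normal cdf
y <- pnorm(x)          # Y = F(X)
mean(y)                # should be close to 1/2, the mean of Uniform(0,1)
var(y)                 # should be close to 1/12, the variance of Uniform(0,1)
```

The same check works for any continuous, strictly increasing F; only `rnorm`/`pnorm` would change.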

Suppose you want to generate instances of an Exponential(β) random variable, using the rate parameterization (so the mean is 1/β). The cdf is F(x) = 1 − e^(−βx) for x ≥ 0. Solving u = F(x) for x gives the inverse cdf F⁻¹(u) = −(1/β) log(1 − u).

The following R code generates 10 Exponential(β) random variables from Uniform(0,1) random variables by applying F⁻¹.

n_random <- 10
beta <- 2                                # rate parameter
U <- runif(n_random, min = 0, max = 1)   # Uniform(0,1) draws
Y <- -(1 / beta) * log(1 - U)            # apply F^{-1}(u) = -(1/beta) * log(1 - u)
Y
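To verify that the method really produces Exponential(β) draws, one can repeat it with a large sample and compare the sample mean to the theoretical mean 1/β (this check is a sketch added here, not part of the original solution; it assumes the same rate-β convention as the code above):

```r
# Inverse-transform sampler for Exponential(beta), rate parameterization:
# F(x) = 1 - exp(-beta * x), so F^{-1}(u) = -(1/beta) * log(1 - u).
rexp_inverse <- function(n, beta) {
  u <- runif(n)                  # Uniform(0,1) draws
  -(1 / beta) * log(1 - u)       # transformed to Exponential(beta)
}

set.seed(42)
x <- rexp_inverse(1e5, beta = 2)
mean(x)    # should be close to the theoretical mean 1/beta = 0.5
```

With 100,000 draws the sample mean should agree with 1/β to about two decimal places; one could also compare the empirical cdf of `x` against `pexp(., rate = 2)`.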

