Probabilistically, the law of large numbers fails, as you can see in the following simulation exercise: in the Cauchy experiment (with the default parameter values), a light source is 1 unit from position 0 on an infinite straight wall.

But the last integral is $$\mu_n$$, so by the induction hypothesis, $$\mu_{n+1} = \frac{n + 1}{n} \frac{n}{r} = \frac{n + 1}{r}$$.

There is a better way, given by the change of variables theorem for expected value.

Show that if $$X$$ takes values in the set of nonnegative integers, then $$\E(X) = \sum_{n=0}^\infty \P(X \gt n)$$. If $$0 \lt a \lt 1$$, the expected value is infinite.

We summarize some elementary properties of expected value and variance in the following theorem. Suppose that $$X$$ and $$Y$$ are independent random variables taking values in general spaces $$S$$ and $$T$$ respectively, and that $$u: S \to \R$$ and $$v: T \to \R$$ are measurable. Then $$\E\left[u(X) v(Y)\right] = \E\left[u(X)\right] \E\left[v(Y)\right]$$, assuming that the expected values exist.

The arcsine distribution is studied in more generality in the chapter on Special Distributions.

Let $$Y = X_1 + X_2$$, the sum of the scores. Let $$Y$$ denote the number of type 1 objects in the sample, so that $$Y = \sum_{i=1}^n X_i$$.

By definition of expected value and additivity of the integral, \begin{align} \E(X + Y) & = \int_{S \times T} (x + y) f(x, y) \, d(x, y) \\ & = \int_{S \times T} x f(x, y) \, d(x, y) + \int_{S \times T} y f(x, y) \, d(x, y) = \E(X) + \E(Y) \end{align}

Suppose that $$T$$ has the exponential distribution with rate parameter $$r$$.

$\E(X) = \frac{1}{n} \sum_{i=0}^{n-1} (a + i h) = \frac{1}{n}\left(n a + h \frac{(n - 1) n}{2}\right) = a + \frac{(n - 1) h}{2} = \frac{a + b}{2}$ where $$b = a + (n - 1) h$$ is the largest value in the support.

This app simulates the first arrival in a Poisson process.

Suppose that $$X$$ and $$Y$$ are real-valued, independent random variables, and that $$\E(X) = 5$$ and $$\E(Y) = -2$$.

$f(n) = e^{-a} \frac{a^n}{n!}, \quad n \in \N$ where $$a \in (0, \infty)$$ is a parameter.

Sketch the graph of $$f$$ and show the location of the mean, median, and mode on the $$x$$-axis. As usual, we assume that the indicated expected values exist.
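To see the failure of the law of large numbers numerically, here is a minimal Python sketch (not part of the original simulation app; the function name and seed are illustrative). It generates standard Cauchy values as $$\tan[\pi(U - 1/2)]$$ for $$U$$ uniform on $$(0, 1)$$, mirroring the light-source model, and tracks the running sample mean.

```python
import math
import random

def cauchy_running_means(n, seed=1):
    """Running sample means of n standard Cauchy draws.

    A standard Cauchy variate is tan(pi * (U - 1/2)) for U uniform on (0, 1):
    the hit position on the wall of a ray emitted at a uniformly random
    angle from a source 1 unit away.
    """
    rng = random.Random(seed)
    total = 0.0
    means = []
    for i in range(1, n + 1):
        total += math.tan(math.pi * (rng.random() - 0.5))
        means.append(total / i)
    return means

# Unlike distributions with a finite mean, the running means do not settle
# down as n grows; occasional enormous values keep jerking the average around.
print(cauchy_running_means(10_000)[-1])
```

Plotting the returned list for increasing $$n$$ shows the characteristic jumps; there is no limiting value to converge to, because the Cauchy distribution has no mean.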
For selected values of the parameters, run the simulation 1000 times and compare the empirical mean to the distribution mean.

Table 6.2: Tossing a coin three times.
X: HHH HHT HTH HTT THH THT TTH TTT
Y:   1   2   3   2   2   3   2   1

This is the uniform distribution on the interval $$[a, a + w]$$. Suppose that there are 5 duck hunters, each a perfect shot. The parameters are $$m, \, r, \, n \in \N$$ with $$r \le m$$ and $$n \le m$$. Let $$Y = \sum_{i=1}^n X_i$$, the sum of the variables.

In particular, if $$\bs{1}_A$$ is the indicator variable of an event $$A$$, then $$\E\left(\bs{1}_A\right) = \P(A)$$, so in a sense, expected value subsumes probability. The expected value of a real-valued random variable gives the center of the distribution of the variable.

Recall that a Bernoulli trials process is a sequence $$\bs{X} = (X_1, X_2, \ldots)$$ of independent, identically distributed indicator random variables. As usual, we start with a random experiment modeled by a probability space $$(\Omega, \mathscr F, \P)$$. Run the experiment 1000 times and compare the sample mean to the distribution mean.

Expected value can also be defined conditioned on a given event $$B$$ for the experiment (with $$\P(B) \gt 0$$).

Recall the geometric PDF $$g$$ with success parameter $$p$$, and the binomial PDF $$f$$ with parameters $$n$$ and $$p$$: \[ g(n) = p (1 - p)^{n-1}, \quad n \in \N_+ \] $f(y) = \binom{n}{y} p^y (1 - p)^{n - y}, \quad y \in \{0, 1, \ldots, n\}$

If $$Y$$ has the binomial distribution with parameters $$n$$ and $$p$$ then $$\E(Y) = n p$$. The critical tools that we need involve binomial coefficients: the identity $$y \binom{n}{y} = n \binom{n - 1}{y - 1}$$ for $$y, \, n \in \N_+$$, and the binomial theorem. Thus, expected value is a linear operation on the collection of real-valued random variables for the experiment.

Suppose that $$X$$ has the Cauchy distribution, with PDF $$f$$ given by $f(x) = \frac{1}{\pi \left(1 + x^2\right)}, \quad x \in \R$ A much better proof uses the additive property and the representation of $$Y$$ as a sum of indicator variables.
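The claim $$\E(Y) = n p$$ can be checked numerically against the definition. The following Python sketch (illustrative, not from the text) sums $$y f(y)$$ directly over the binomial PDF:

```python
from math import comb

def binomial_mean(n, p):
    """E(Y) computed directly from the binomial PDF f(y) = C(n,y) p^y (1-p)^(n-y)."""
    return sum(y * comb(n, y) * p**y * (1 - p)**(n - y) for y in range(n + 1))

# The identity y C(n,y) = n C(n-1,y-1) together with the binomial theorem
# collapses this sum to n * p; the direct computation agrees.
print(binomial_mean(20, 0.3))  # 6.0, up to floating-point rounding
```

The indicator-variable proof is reflected here too: each of the $$n$$ trials contributes $$p$$ to the mean, so the sum must come out to $$n p$$ for any choice of $$n$$ and $$p$$.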
The proof is by induction on $$n$$, so let $$\mu_n$$ denote the mean when the shape parameter is $$n \in \N_+$$. Note again how much easier and more intuitive the second proof is than the first.

Open the Brownian motion experiment and select the last zero. Run the simulation 1000 times and note the behavior of the empirical mean.

Prove Jensen's inequality: if $$g$$ is a convex function on an interval containing the support of $$X$$, then $$\E\left[g(X)\right] \ge g\left[\E(X)\right]$$.

For the expected value above to make sense, the sum must be well defined, as in the discrete case; the integral must be well defined, as in the continuous case; and we must avoid the dreaded indeterminate form $$\infty - \infty$$. The expected value depends, of course, on the probability measure.

Suppose that $$X$$ has a continuous distribution on $$S \subseteq \R$$ with PDF $$f$$. The exponential distribution is studied in detail in the chapter on the Poisson Process. It follows from the last result that independent random variables are uncorrelated (a concept that we will study in a later section). The change of variables theorem is the main tool you will need.

For $$a \gt 1$$, $\E(X) = \int_1^\infty x \frac{a}{x^{a+1}} \, dx = \int_1^\infty \frac{a}{x^a} \, dx = \frac{a}{-a + 1} x^{-a + 1} \bigg|_1^\infty = \frac{a}{a - 1}$

All results that we obtain for expected value in general have analogues for these conditional expected values. For selected values of the parameters, run the simulation 1000 times and compare the empirical mean to the distribution mean.

Suppose that $$N$$ has a discrete distribution with probability density function $$f$$ given by $$f(n) = \frac{1}{50} n^2 (5 - n)$$ for $$n \in \{1, 2, 3, 4\}$$. Find the following expected values:
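For the distribution of $$N$$ above, the expected value follows directly from the definition $$\E(N) = \sum_n n f(n)$$. A short Python sketch (illustrative) verifies that $$f$$ is a valid PDF and computes the mean:

```python
def f(n):
    """PDF of N: f(n) = n^2 (5 - n) / 50 for n in {1, 2, 3, 4}."""
    return n**2 * (5 - n) / 50

# Sanity check: the probabilities sum to 1 over the support {1, 2, 3, 4}.
assert abs(sum(f(n) for n in range(1, 5)) - 1.0) < 1e-12

# E(N) = sum of n * f(n): (4 + 24 + 54 + 64) / 50 = 146 / 50 = 2.92.
mean_N = sum(n * f(n) for n in range(1, 5))
print(mean_N)
```

The same pattern (sum of $$g(n) f(n)$$) gives the other requested expected values via the change of variables theorem, e.g. $$\E(N^2)$$ or $$\E(1/N)$$.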