Let $X=2e^{-2\theta}$; I am asked to calculate $E[X^2]$. Here's the right way to calculate it:$$E[X^2]=Var[X]+(E[X])^2={1\over 2^2} + \left({1\over 2}\right)^2={1\over 2},$$but if I do it another way,$$E[X^2]=E[(2e^{-2\theta})^2] = E[4e^{-4\theta}]= {1\over 4},$$it ends up as $1\over 4$. What's wrong with the second calculation?...Read more
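A Monte Carlo sanity check (a sketch, assuming the intended setup is $X \sim \mathrm{Exponential}(\mathrm{rate}=2)$, i.e. $f(x)=2e^{-2x}$ is the *density*; the second calculation plugs the density formula into the expectation as if it were the random variable itself):

```python
import numpy as np

rng = np.random.default_rng(0)
# Assuming X ~ Exponential(rate 2), i.e. density f(x) = 2*exp(-2x);
# numpy parameterises the exponential by the scale 1/rate.
x = rng.exponential(scale=0.5, size=1_000_000)

print(x.mean())       # ~ 0.5, since E[X] = 1/2
print((x ** 2).mean())  # ~ 0.5, since E[X^2] = Var[X] + E[X]^2 = 1/4 + 1/4
```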

In a binomial experiment we know that every trial is independent and that the probability of success, $p$, is the same in every trial. This also means that the expected value of any individual trial is $p$. So if we have a sample of size $n$, then by the linearity of expectation, the expected value of the sample is just $n \cdot p$. This is all intuitive. When the population size is finite and we don't replace the items after every trial, we can't use the binomial distribution to get the probability of $k$ successes in a sample of s...Read more
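Where the truncated question is heading: sampling without replacement gives a hypergeometric count, and by linearity of expectation its mean is still $n \cdot p$. A quick simulation sketch (the population numbers below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, n = 50, 20, 10          # population size, successes in population, sample size
p = K / N

# Draws without replacement follow a hypergeometric distribution;
# linearity of expectation still gives E[number of successes] = n * p,
# because each draw is marginally a success with probability K/N.
draws = rng.hypergeometric(ngood=K, nbad=N - K, nsample=n, size=200_000)
print(draws.mean())  # ~ 4.0 = n * p
```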

I am trying to solve a question from a past exam paper. Suppose you have a single observation $X$ from a continuous distribution for which the probability density function (pdf) is either $f_0$ or $f_1$, where $$ f_0(x) = \begin{cases} 1, & \text{for } 0 < x < 1 \\ 0, & \text{otherwise} \end{cases} $$ $$ f_1(x) = \begin{cases} 4x^3, & \text{for } 0 < x < 1 \\ 0, & \text{otherwise} \end{cases} $$ On the basis of one observation, we would like to determine whether $f_0$ or $f_1$ is the ...Read more
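The question is cut off, but since the likelihood ratio $f_1(x)/f_0(x) = 4x^3$ is increasing in $x$, a Neyman-Pearson test would reject $f_0$ for large $x$. A minimal sketch with a hypothetical threshold $c$:

```python
# Likelihood ratio f1(x)/f0(x) = 4x^3 is increasing in x, so a
# Neyman-Pearson test of f0 vs f1 rejects f0 when x > c.
def type_i_error(c):
    # P(X > c | f0): X is uniform on (0, 1) under f0
    return 1 - c

def type_ii_error(c):
    # P(X <= c | f1): density 4x^3 on (0, 1) has cdf x^4
    return c ** 4

c = 0.9  # hypothetical threshold giving a size-0.1 test
print(type_i_error(c))   # 0.1  (size alpha)
print(type_ii_error(c))  # 0.9^4 = 0.6561, so power is about 0.344
```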

I have an agent with a utility function of the form $ u(x) = -(x-g)^2 $, with $ g $ a constant, and $ x \sim \mathrm{Beta}(a, b, \alpha, \beta) $, where $ [a,b] $ is the support and $ \alpha, \beta $ are the shape parameters. I would like to calculate the expected utility of the agent. I have tried to approach this in several ways, including Taylor series approximations, and I always get a different result. I am sure I am making a mistake somewhere; would you be able to point it out? Here are the two main approaches I have tried, and I have no ide...Read more
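Whatever the truncated approaches were, for quadratic utility the expectation has a closed form, $E[u(x)] = -\left(Var[x] + (E[x]-g)^2\right)$. A sketch with made-up parameter values, using the four-parameter beta's mean and variance:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 0.0, 10.0              # support [a, b] (hypothetical values)
alpha, beta, g = 2.0, 3.0, 4.0  # shape parameters and target (hypothetical)

# Four-parameter beta: a + (b - a) * standard Beta(alpha, beta).
mean = a + (b - a) * alpha / (alpha + beta)
var = (b - a) ** 2 * alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))

# Quadratic utility: E[u(x)] = -E[(x - g)^2] = -(Var[x] + (E[x] - g)^2).
eu_exact = -(var + (mean - g) ** 2)

# Monte Carlo cross-check of the closed form.
x = a + (b - a) * rng.beta(alpha, beta, size=500_000)
eu_mc = -((x - g) ** 2).mean()
print(eu_exact, eu_mc)  # the two should agree closely
```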

I've spent so much time trying to prove to myself that these are equivalent, but it's not working out. Given some $X = u + Y$, where $Y \sim N(0,1)$. I have some estimator $Z = 2x_0 + \sum_{i=0}^N x_i^2 + 2x_{N+1}$. $$Var[Z] = E[Z^2] = E\left[\left(2x_0 + \sum_{i=0}^N x_i^2 + 2x_{N+1}\right)^2\right] = 4 + N + 4 = N + 8,$$ because the variance distributes over independent terms: $ Var[2x_0] + Var[\sum_{i=0}^N x_i^2] + Var[2x_{N+1}] $. However, when I try to distribute the terms as a polynomial instead, I definitely don't get that answer. I know I'm doing something wrong somewhere. $Var[Z] = E[Z^2] = E...Read more

Suppose that $X$ and $Y$ are mean-zero, unit-variance random variables. If least squares regression (without intercept) of $Y$ against $X$ gives a slope of $\beta$ (i.e. it minimises $\mathbb{E}\left[ (Y-\beta X)^2 \right]$), what is the slope of the regression of $X$ against $Y$? Is my understanding correct that the minimisation factor isn't important here, because both random variables have identical variances?$$\beta = \frac{\mathrm{Cov}(X,Y)}{\mathrm{Var}(X)} = \frac{\mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y]}{\mathrm{Var}(X)} = \frac{\math...Read more
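A quick numerical check (a sketch; the correlation of 0.6 below is an arbitrary choice) that with both variables standardised, the two regression slopes coincide:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
y = 0.6 * x + 0.8 * rng.standard_normal(n)  # corr(X, Y) = 0.6, both unit variance

# Standardise so both have exactly mean 0 and variance 1 in-sample.
x = (x - x.mean()) / x.std()
y = (y - y.mean()) / y.std()

beta_yx = (x * y).sum() / (x * x).sum()  # slope of Y on X
beta_xy = (x * y).sum() / (y * y).sum()  # slope of X on Y
print(beta_yx, beta_xy)  # both ~ Cov(X, Y) = 0.6
```

With unit variances both slopes equal the correlation, which is symmetric in $X$ and $Y$.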

Let $Z$ be Gaussian white noise with mean $0$ and variance $1$, and let $c \in \mathbb{R}$ be a constant. Is the time series stationary? I computed the mean and variance and they look constant; am I right? If so, what is the value of the auto-covariance function $\mathbb{C}(X_j, X_{j+l})$?$$\mathbb{E}(X_j) = \mathbb{E}(Z_0 \cos(cj))= \cos(cj) \mathbb{E}(Z_0) = 0$$$$\mathbb{V}(X_j) = \cos^2(cj) \mathbb{V}(Z_0) = \cos^2(cj)$$...Read more
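A simulation sketch, assuming the series is $X_j = Z_0\cos(cj)$ as the computations suggest; it shows the empirical variance tracking $\cos^2(cj)$, i.e. varying with $j$, which is what decides weak stationarity here:

```python
import numpy as np

rng = np.random.default_rng(0)
c = 0.7          # arbitrary constant
reps = 200_000   # number of independent realisations of the series

# Assuming X_j = Z_0 * cos(c * j): one draw of Z_0 fixes the whole path.
z0 = rng.standard_normal(reps)
for j in [0, 1, 2, 3]:
    xj = z0 * np.cos(c * j)
    # Empirical Var(X_j) should match cos^2(cj), so it depends on j.
    print(j, xj.var(), np.cos(c * j) ** 2)
```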

I'm studying Markov processes and I came across the following exercise in my reference book (Daniel W. Stroock, An Introduction to Markov Processes): Let $\{Y_n:n\geq 1\}$ be a sequence of mutually independent, identically distributed random variables satisfying $E[Y_1]<\infty$. Set $X_n=\sum_{m=1}^n Y_m$ for $n\geq 1$. The Weak Law of Large Numbers says that $$P\left(\left|\frac{X_n}{n}-E[Y_1]\right|\geq \epsilon\right)\rightarrow 0\;\;\;\text{for all } \epsilon>0.$$ In fact, $$\lim_{n\rightarrow \infty} E\left[\left|...Read more
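A small simulation of the quoted $L^1$ convergence (a sketch, using i.i.d. Exponential(1) draws as an arbitrary example, so $E[Y_1]=1$):

```python
import numpy as np

rng = np.random.default_rng(0)
reps = 20_000

# Estimate E|X_n/n - E[Y_1]| for growing n; the quoted limit says
# this expectation itself tends to 0, which is stronger than the WLLN.
errors = []
for n in [10, 100, 500]:
    xbar = rng.exponential(size=(reps, n)).mean(axis=1)  # X_n / n
    errors.append(np.abs(xbar - 1.0).mean())
print(errors)  # decreasing, roughly like 1/sqrt(n) for this example
```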

Suppose that each entry of the $n \times p$ matrix $X$ has a standard normal distribution $\mathcal{N}(0,1)$. I am interested in finding a proof that $\mathbb{E}(X(X'X)^{-1}X') = \frac pn \cdot I_n$, which is my guess coming from simulations. I have tried to attack the problem using the Wishart distribution, but without results. I have shown that $X(X'X)^{-1}X'$ is symmetric, that its trace equals $p$, and that it is an idempotent matrix...Read more
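A compact version of the simulation behind the guess (a sketch with small, arbitrary $n$ and $p$):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, reps = 6, 3, 20_000

acc = np.zeros((n, n))
for _ in range(reps):
    X = rng.standard_normal((n, p))
    H = X @ np.linalg.inv(X.T @ X) @ X.T  # hat matrix: symmetric, idempotent, trace p
    acc += H
avg = acc / reps
print(np.round(avg, 2))  # ~ (p / n) * I_n = 0.5 * I_n here
```

The diagonal averages to $p/n$ because the $n$ diagonal entries are exchangeable and always sum to the trace $p$; the off-diagonal entries average to $0$ by the sign symmetry of each row of $X$.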

How do I solve this? Let $X_1,X_2,\dots,X_n$ be a random sample from a $N(0,1)$ population. Define $$ Y_1=\left|\frac{1}{n}\sum_{i=1}^nX_i \right|,\ \ Y_2=\frac{1}{n}\sum_{i=1}^n|X_i|. $$ Calculate $E[Y_1]$ and $E[Y_2]$ and establish the inequality between them. My attempt: $$E[Y_1]=E\left[ \left|\frac{1}{n}\sum_{i=1}^nX_i \right|\right]=\frac{1}{n}\left|E\sum_{i=1}^nX_i\right|$$ and $$E[Y_2]=\frac{1}{n}\sum_{i=1}^nE[|X_i|]$$...Read more
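A Monte Carlo check (a sketch with an arbitrary $n$) against the closed forms $E[Y_2]=\sqrt{2/\pi}$ and $E[Y_1]=\sqrt{2/(\pi n)}$, which follow from $E|W|=\sigma\sqrt{2/\pi}$ for $W\sim N(0,\sigma^2)$ and $\bar X \sim N(0, 1/n)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 25, 200_000

x = rng.standard_normal((reps, n))
y1 = np.abs(x.mean(axis=1))   # |sample mean|
y2 = np.abs(x).mean(axis=1)   # mean of absolute values

# Triangle inequality gives Y_1 <= Y_2 pathwise, hence E[Y_1] <= E[Y_2].
print(y1.mean(), np.sqrt(2 / (np.pi * n)))  # ~ sqrt(2/(pi*n))
print(y2.mean(), np.sqrt(2 / np.pi))        # ~ sqrt(2/pi)
```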

Let $F$ be the distribution function of the random variable $X$, that is $F(t)=Prob[X<t]$, and suppose we know that $E[X]=0$ and $Var[X]=1$. I am trying to understand how to show the following inequality: $\int_{1/u}^{\infty} e^{ut}dF(t) \leq\sum_{k=1}^\infty e^{k+1} Prob[X\geq k/u]$...Read more
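One standard route (a sketch, assuming $u>0$): split $[1/u,\infty)$ at the points $k/u$ and bound $e^{ut}$ by $e^{k+1}$ on each piece, since $ut < k+1$ there:

```latex
\int_{1/u}^{\infty} e^{ut}\,dF(t)
  = \sum_{k=1}^{\infty}\int_{k/u}^{(k+1)/u} e^{ut}\,dF(t)
  \leq \sum_{k=1}^{\infty} e^{k+1}\,Prob\!\left[\tfrac{k}{u}\leq X<\tfrac{k+1}{u}\right]
  \leq \sum_{k=1}^{\infty} e^{k+1}\,Prob\!\left[X\geq \tfrac{k}{u}\right].
```

(The moment conditions $E[X]=0$, $Var[X]=1$ are presumably used in the truncated remainder of the problem, e.g. via Chebyshev to bound $Prob[X\geq k/u]$.)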

Assume $X_1, X_2, X_3$ are independent random variables from a normal distribution with unknown mean $\mu$ and variance $\sigma^2$. Let $W_1=X_1-X_2$ and $W_2=X_1+X_2-2X_3$. I need to show that $W_1$ and $W_2$ are independent. Can I show this using the fact that if $Cov(W_1,W_2)=0$ then they are independent? My workings so far:$$\begin{aligned}Cov(W_1,W_2)&=Cov(X_1-X_2,\,X_1+X_2-2X_3)\\&=Cov(X_1,X_1)+Cov(X_1,X_2)+Cov(X_1,-2X_3)+Cov(-X_2,X_1)+Cov(-X_2,X_2)+Cov(-X_2,-2X_3)\\&= 2Cov(X_2,X_3)-2Cov(X_1,X_3)\end{aligned}$$ (the $Cov(X_1,X_1)=Var(X_1)$ and $-Cov(X_2,X_2)=-Var(X_2)$ terms cancel, as do $Cov(X_1,X_2)$ and $-Cov(X_2,X_1)$) $$Cov(W_1,W_2)...Read more
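A quick simulation check of the zero covariance (a sketch with made-up $\mu$ and $\sigma$); note that zero covariance implies independence here only because $W_1$ and $W_2$ are jointly normal, being linear functions of the same Gaussian vector:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, reps = 2.0, 1.5, 500_000  # hypothetical parameter values

x1, x2, x3 = rng.normal(mu, sigma, size=(3, reps))
w1 = x1 - x2
w2 = x1 + x2 - 2 * x3

# Zero covariance alone does not give independence in general;
# joint normality of (W1, W2) is the extra ingredient needed.
print(np.cov(w1, w2)[0, 1])  # ~ 0
```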

Let us assume that data is generated according to a true model $$y_i = \beta_{true}x_i + \epsilon_i$$ for $i = 1, \dots, n$. Assume that the $x_i$ are fixed, and $\epsilon_i \sim N(0, \sigma^2)$ independently. Let $$\hat\beta =\frac{\sum^{n}_{i=1}y_ix_i}{\sum^{n}_{i=1}x^2_i + \lambda}$$ be the shrinkage estimator from ridge regression. How do I calculate the expectation and variance of $\hat\beta$, and the mean squared error $E[(\hat\beta - \beta_{true})^2]$? I'm stuck on this part for the expectation. What should I do next?$$E(\hat\beta)= E(\frac{\beta_{true}\sum^{n}_{i=1...Read more
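A Monte Carlo check of the closed forms implied by the setup, $E[\hat\beta]=\beta_{true}\,S_{xx}/(S_{xx}+\lambda)$ and $Var[\hat\beta]=\sigma^2 S_{xx}/(S_{xx}+\lambda)^2$ with $S_{xx}=\sum_i x_i^2$ (a sketch; the design and parameter values below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
beta_true, sigma, lam, reps = 2.0, 1.0, 5.0, 100_000  # hypothetical values
x = np.linspace(0.1, 3.0, 20)                          # fixed design
sxx = (x ** 2).sum()

eps = rng.normal(0.0, sigma, size=(reps, x.size))
y = beta_true * x + eps
beta_hat = (y * x).sum(axis=1) / (sxx + lam)

# Closed forms implied by the linearity of beta_hat in y:
#   E[beta_hat]   = beta_true * sxx / (sxx + lam)   (shrunk toward 0)
#   Var[beta_hat] = sigma^2 * sxx / (sxx + lam)^2
print(beta_hat.mean(), beta_true * sxx / (sxx + lam))
print(beta_hat.var(), sigma ** 2 * sxx / (sxx + lam) ** 2)
```

The MSE then follows as $Var[\hat\beta] + (E[\hat\beta]-\beta_{true})^2$, i.e. variance plus squared shrinkage bias.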

Can someone give me the final answers for these? I found the pdf easily; the expectation and variance, however, have taken my time and I'm lost....Read more

Let $t_i\sim N(0,1)$, $i = 1,2,\dots,n$, be i.i.d., and \begin{equation}X = \frac{\sum_i t_i}{\sum_i t_i^2}.\end{equation}How do I calculate the first two moments of $X$, i.e., $\mathrm{E}(X)$ and $\mathrm{E}(X^2)$? I did some simulation studies and am almost sure that $\mathrm{E}(X) = 0$ and $\mathrm{E}(X^2) = \frac{1}{n-2}$. However, I have not managed to work out the derivation in detail. Thanks in advance....Read more
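A simulation sketch supporting the conjecture (with an arbitrary $n$):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 400_000

t = rng.standard_normal((reps, n))
x = t.sum(axis=1) / (t ** 2).sum(axis=1)

print(x.mean())                       # ~ 0 by symmetry (t -> -t flips the sign of X)
print((x ** 2).mean(), 1 / (n - 2))   # conjectured E[X^2] = 1/(n-2)
```

One route to the second moment: expanding $E[(\sum_i t_i)^2/(\sum_i t_i^2)^2]$, the cross terms $E[t_it_j/(\sum t^2)^2]$ vanish by sign symmetry, leaving $E[\sum_i t_i^2/(\sum t^2)^2] = E[1/\chi^2_n] = 1/(n-2)$.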