### pr.probability - On the moments of Lévy processes

For a Brownian motion $B_t$, the evolution of the moments with $t$ obeys the simple rule
$$\mathbb{E}[|B_t|^p] = \kappa_p |t|^{p/2},$$
with $\kappa_p<\infty$. The proof only requires observing that the random variables $\frac{B_t}{\sqrt{t}}$ are Gaussian with variance $1$. I would like to know whether, more generally, this question has been studied for Lévy processes. Of course, in the general case the moments can be infinite. When they are not, do we have an estimate of the evolution of $\mathbb{E}[|X_t|^p]$ for a Lévy process $X_t$, or at least som…
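The Brownian scaling rule is easy to check by Monte Carlo, using the closed form $\kappa_p = \mathbb{E}[|Z|^p] = 2^{p/2}\Gamma(\frac{p+1}{2})/\sqrt{\pi}$ for a standard Gaussian $Z$ (a standard fact, not stated in the question). A minimal sketch:

```python
import numpy as np
from math import gamma, sqrt, pi

# Monte Carlo check of the scaling E[|B_t|^p] = kappa_p * t^(p/2):
# B_t ~ N(0, t), so B_t / sqrt(t) is standard normal and the p-th
# absolute moment factorizes as kappa_p * t^(p/2).
rng = np.random.default_rng(0)
p, t = 3.0, 4.0
samples = rng.standard_normal(1_000_000) * np.sqrt(t)  # B_t ~ N(0, t)
empirical = np.mean(np.abs(samples) ** p)
# kappa_p = E[|Z|^p] for Z ~ N(0, 1), via the gamma function
kappa_p = 2 ** (p / 2) * gamma((p + 1) / 2) / sqrt(pi)
theory = kappa_p * t ** (p / 2)
print(empirical, theory)  # agree to about 1% with 1e6 samples
```
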

### pr.probability - Suprema of stochastic processes

Let $X$ be a continuous stochastic process. I know that, for $t>s$,
$$P(|X(t) - X(s)|>\delta) < \frac{|t-s|}{\delta}.$$
Is it possible to say anything (e.g. estimate the decay of the tail) about
$$Y=\sup_{s \in [0,1]} |X(s)|?$$
I suspect that the answer is no (it would be an easy question if we had $|t-s|^{1+a}$, for any $a>0$). I wonder what the "standard" methods for estimating the suprema of continuous stochastic processes are. So far in my problem I have tried to use the Garsia-Rodemich-Rumsey inequality. …
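For calibration, there is one concrete case where the supremum's tail is known exactly: for standard Brownian motion the reflection principle gives $P(\sup_{[0,1]} B_s > a) = 2P(B_1 > a)$. This does not answer the general question, but it shows what a sharp tail estimate looks like; a quick simulation check (my own illustration, not from the question):

```python
import numpy as np
from math import erfc, sqrt

# Reflection principle: P(sup_{[0,1]} B_s > a) = 2 P(B_1 > a) = erfc(a / sqrt(2)).
# Compare that exact tail with a discretized Brownian path simulation.
rng = np.random.default_rng(5)
paths, steps, a = 10_000, 500, 1.5
dB = rng.standard_normal((paths, steps)) / np.sqrt(steps)   # increments
sup = np.maximum(np.cumsum(dB, axis=1).max(axis=1), 0.0)    # sup includes B_0 = 0
empirical = (sup > a).mean()
exact = erfc(a / sqrt(2))
print(empirical, exact)
```

The small downward bias of the empirical value comes from discretizing the path; it shrinks like $1/\sqrt{\text{steps}}$.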

### pr.probability - Entropy of edit distance

The edit (or Levenshtein) distance between two strings is the minimum number of single-character insertions, deletions, and substitutions needed to transform one string into the other. If we take random binary strings $A$ and $B$ of the same length $n$, is it known what the entropy of the edit distance between $A$ and $B$ is?

Update. I would be happy with a proof that the entropy is asymptotically $c \log n$ for some (unknown) $c>0$, as suggested by Anthony Quas (or even $\Theta(\log n)$, which potentially saves having to prove convergence). …
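The quantity in question is easy to estimate empirically, even though a simulation cannot settle the $c\log n$ asymptotics. A sketch using the standard dynamic-programming Levenshtein distance:

```python
import numpy as np

def edit_distance(a, b):
    # standard dynamic-programming Levenshtein distance, two-row version
    n, m = len(a), len(b)
    prev = list(range(m + 1))
    for i in range(1, n + 1):
        cur = [i] + [0] * m
        for j in range(1, m + 1):
            cur[j] = min(prev[j] + 1,                          # deletion
                         cur[j - 1] + 1,                       # insertion
                         prev[j - 1] + (a[i - 1] != b[j - 1])) # substitution
        prev = cur
    return prev[m]

# Empirical entropy (in bits) of d(A, B) for random binary strings of length n.
rng = np.random.default_rng(1)
n, trials = 32, 1000
counts = {}
for _ in range(trials):
    a = rng.integers(0, 2, n).tolist()
    b = rng.integers(0, 2, n).tolist()
    d = edit_distance(a, b)
    counts[d] = counts.get(d, 0) + 1
probs = np.array(list(counts.values())) / trials
entropy = -np.sum(probs * np.log2(probs))
print(f"empirical entropy of d(A, B) for n = {n}: {entropy:.2f} bits")
```
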

### pr.probability - What is this probability distribution?

Suppose we have a family $F_0,F_1,\dots$ of independent random variables which take the value $1$ with probability $p$ and $0$ otherwise, and let $\delta$ be a number between $0$ and $1$. Let
$$X_n = \sum_{k=0}^n \delta^{n-k} F_k.$$
I'm interested in the distribution of $X_n$. It seems straightforward enough to be known and have a name -- does anybody know what it is? …
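One observation worth making explicit: $X_n$ satisfies the recursion $X_n = \delta X_{n-1} + F_n$, i.e. it is an AR(1) process driven by Bernoulli($p$) noise, and for large $n$ its law approaches the stationary law of that recursion (for $p = 1/2$ this is related to Bernoulli convolutions). A minimal simulation sketch:

```python
import numpy as np

# Sample X_n = sum_k delta^(n-k) F_k directly; the stationary mean of the
# recursion X_n = delta * X_{n-1} + F_n is p / (1 - delta).
rng = np.random.default_rng(2)
p, delta, n, trials = 0.5, 0.7, 200, 100_000
F = rng.random((trials, n + 1)) < p          # F_k ~ Bernoulli(p), i.i.d.
weights = delta ** np.arange(n, -1, -1)      # delta^(n-k), k = 0..n
X = F @ weights                              # one X_n per trial
print(X.mean(), p / (1 - delta))             # empirical vs stationary mean
```
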

### pr.probability - Tighter Caratheodory on the moment curve?

The moment curve is the set of points of the form $$(t,t^2,t^3,\dots,t^n) \in \mathbb{R}^n.$$ Let $M$ be the portion of the moment curve where $t\in [0,1]$, and let $\overline{M}$ be the convex hull of $M$. Carathéodory's theorem tells us that any point in $\overline{M}$ can be expressed as a convex combination of $n+1$ points in $M$. However, when $n=1$ and $n=2$, we can get by with only $n$ points instead of $n+1$.

Question: Is this true for all $n$? I.e., can we express any point in $\overline{M}$ as a convex combination of at most $n$ points from…
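For $n=2$ the two-point claim can be made fully explicit: any $(x, y)$ in the hull (i.e. $x\in[0,1]$, $x^2 \le y \le x$) is a convex combination of the curve points at $t_1 = 0$ and $t_2 = y/x$. That particular choice of $t_1, t_2$ is my own illustration, not from the question:

```python
# Decompose (x, y) in the hull of {(t, t^2) : t in [0, 1]} as
# mu * (t1, t1^2) + (1 - mu) * (t2, t2^2), with t1 = 0 fixed.
# Requires 0 < x <= 1 and x^2 <= y <= x (i.e. (x, y) in the hull).
def two_point_decomposition(x, y):
    t2 = y / x                 # in [0, 1] because y <= x
    one_minus_mu = x * x / y   # in (0, 1] because y >= x^2
    return (0.0, t2, 1.0 - one_minus_mu)

x, y = 0.5, 0.4
t1, t2, mu = two_point_decomposition(x, y)
# verify: the convex combination reproduces (x, y)
print(mu * t1 + (1 - mu) * t2, mu * t1**2 + (1 - mu) * t2**2)
```
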

### pr.probability - Exchangeable normal distribution mixing measure

I have a zero-mean multivariate normal distribution where, WLOG, each marginal variance is unity and all pairwise correlation coefficients are equal and positive. The number of elements in the random vector is arbitrary. I am interested in the probability distribution of the number of elements which exceed some threshold -- the same threshold for each element. The outcomes for the individual elements are identically distributed, dependent Bernoulli random variables, so by de Finetti's exchangeability theorem this probability distribu…
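The equicorrelated case admits a one-factor representation $X_i = \sqrt{\rho}\,Z + \sqrt{1-\rho}\,\varepsilon_i$ with $Z, \varepsilon_i$ i.i.d. $N(0,1)$; conditional on $Z$, the indicators $\mathbf{1}\{X_i > c\}$ are i.i.d. Bernoulli with success probability $\Phi\!\bigl((\sqrt{\rho}Z - c)/\sqrt{1-\rho}\bigr)$, which is exactly the de Finetti mixing structure in question. A simulation sketch of the exceedance-count law:

```python
import numpy as np

# One-factor construction of equicorrelated standard normals, and the
# empirical distribution of the number of coordinates exceeding c.
rng = np.random.default_rng(3)
d, rho, c, trials = 10, 0.3, 1.0, 200_000
Z = rng.standard_normal((trials, 1))          # shared factor
e = rng.standard_normal((trials, d))          # idiosyncratic noise
X = np.sqrt(rho) * Z + np.sqrt(1 - rho) * e   # corr(X_i, X_j) = rho
counts = (X > c).sum(axis=1)                  # exceedances per trial
print(np.bincount(counts, minlength=d + 1) / trials)
```
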

### pr.probability - Gaussian distributions as fixed points in Some distribution space

I'm taking a course on topology and probability. Today, the professor remarked something along the lines of: "If you look at the space of probability distributions with mean $0$ and variance $1$, equipped with convolution, then the Gaussian distribution is characterized as the fixed point of each orbit." He also said this was a nice way to appreciate the importance of the Gaussian distribution, and to gain insight into the central limit theorem. I asked for references on this point of view, but he said it's not standard and recalled only hearing abo…
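My reading of the remark (an interpretation, not the professor's exact construction): the relevant map sends the law of $X$ to the law of $(X + X')/\sqrt{2}$, where $X'$ is an independent copy. It preserves mean $0$ and variance $1$, the standard Gaussian is a fixed point, and iterating it drives any such law toward the Gaussian, which is one way to see the CLT. A Monte Carlo sketch:

```python
import numpy as np

# Iterate the renormalized-convolution map T: law of X -> law of (X + X')/sqrt(2),
# starting from a uniform law with mean 0 and variance 1. Each iteration pairs
# independent halves of the sample, so the sample size halves.
rng = np.random.default_rng(4)
samples = rng.uniform(-np.sqrt(3), np.sqrt(3), 500_000)   # mean 0, var 1
for _ in range(6):
    half = len(samples) // 2
    samples = (samples[:half] + samples[half:2 * half]) / np.sqrt(2)
# the fourth moment should approach the Gaussian value of 3
print(np.mean(samples ** 2), np.mean(samples ** 4))
```
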

### pr.probability - Bound on the total variation distance for multiple samples $d_{tv}(P^n,Q^n)$

Given two discrete distributions $P$ and $Q$ with computable total variation distance $d_{TV}(P,Q)=\|P - Q\|_1$, is there a precise bound for $d_{TV}(P^n,Q^n)=\|P^n - Q^n\|_1$, as needed to estimate the power of an optimal test for multiple samples? Moreover, is it possible to exactly compute $d_{TV}(P^n,Q^n)$ without enumerating all combinations? The best bound that I could find is based on the Chernoff information, in The complexity of distinguishing distributions. …
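For tiny alphabets one can compute $d_{TV}(P^n, Q^n)$ exactly by brute force and compare it with the elementary subadditivity bound $d_{TV}(P^n,Q^n) \le n\, d_{TV}(P,Q)$. A sketch (using the $\tfrac12\|\cdot\|_1$ convention for $d_{TV}$; enumeration is exponential in $n$, so this is only an illustration, not an answer to the question):

```python
from itertools import product

P = [0.5, 0.5]
Q = [0.6, 0.4]

def tv_product(P, Q, n):
    # exact d_TV(P^n, Q^n) = (1/2) * sum over x in alphabet^n of |p(x) - q(x)|
    total = 0.0
    for x in product(range(len(P)), repeat=n):
        p = q = 1.0
        for i in x:
            p *= P[i]
            q *= Q[i]
        total += abs(p - q)
    return total / 2

tv1 = tv_product(P, Q, 1)
for n in (1, 2, 5, 10):
    print(n, tv_product(P, Q, n), min(1.0, n * tv1))  # exact vs union bound
```
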

### pr.probability - Reference request for a "well-known identity" in a paper of Shepp and Lloyd

I ran into a "well-known identity" on page 345 of Shepp and Lloyd's On ordered cycle lengths in a random permutation:
$$\int_x^{\infty} \frac{\exp(-y)}{y}\, dy = \int_0^x \frac{1-\exp(-y)}{y}\, dy - \log x - \gamma,$$
where $\gamma$ is the Euler constant. I am clueless as to how it is derived. Any reference to the derivation of such formulae would suffice, but an explicit solution will also be appreciated. …
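A sketch of one derivation (my own reconstruction, not from Shepp and Lloyd): both sides have the same derivative in $x$,

```latex
\frac{d}{dx}\int_x^{\infty}\frac{e^{-y}}{y}\,dy=-\frac{e^{-x}}{x},
\qquad
\frac{d}{dx}\!\left[\int_0^x\frac{1-e^{-y}}{y}\,dy-\log x-\gamma\right]
=\frac{1-e^{-x}}{x}-\frac{1}{x}=-\frac{e^{-x}}{x},
```

so they differ by a constant. Letting $x\to\infty$, the left side tends to $0$, and the bracket also tends to $0$ by the classical representation $\gamma=\lim_{x\to\infty}\left(\int_0^x\frac{1-e^{-y}}{y}\,dy-\log x\right)$, so the constant is zero.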

### pr.probability - Extracting moments from a special Z-transform

Suppose I have a sequence of positive continuous random variables $\{X_k\}_{k=1}^\infty$ with (unknown) MGFs $M_{X_k}(s)$. Furthermore, it is known that
$$\frac{X_n-n\mu}{\sqrt{n}\,\sigma}\rightarrow\mathcal{N}(0,1)$$
for some known $\mu$ and unknown $\sigma$. Given the function
$$F[z,s]=\sum_{n=0}^\infty z^{-n} M_{X_n}(s),$$
is it possible to extract $\sigma$ without the use of inverse transforms?

For example:
$$F[z,s]=\frac{zs}{1-e^s+zs}.$$
Answer: $\sigma^2=\frac{1}{12}$. …
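The example is consistent with $X_n$ being a sum of $n$ i.i.d. Uniform$(0,1)$ variables (an inference from the answer $\sigma^2 = 1/12$, not stated in the question): then $M_{X_n}(s) = ((e^s-1)/s)^n$ and the series over $n$ is geometric, resumming to exactly the given $F[z,s]$. A numerical check of that resummation:

```python
from math import exp

# Geometric resummation: sum_n z^(-n) * M^n = 1 / (1 - M/z) with
# M = (e^s - 1)/s, the MGF of a single Uniform(0, 1) variable.
# This equals z*s / (1 - e^s + z*s), matching the question's example.
z, s = 3.0, 0.7
M = (exp(s) - 1) / s
series = sum((M / z) ** n for n in range(200))   # truncated sum, |M/z| < 1
closed = z * s / (1 - exp(s) + z * s)
print(series, closed)
```
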

### pr.probability - Inequality of Partial Taylor Series

Hi,

For a given $\theta < 1$ and a positive integer $N$, I am trying to find an $x > 0$ (preferably the smallest such $x$) such that the following inequality holds:
$$\sum_{k=0}^{N} \frac{x^k}{k!} \leq \theta e^{x}.$$
In my application $N$ is actually an integer function of $x$, i.e. $N = N(x)$, but for simplicity's sake let's assume $N$ is given for now.

Any ideas? Thanks for reading.

Fred
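One useful reformulation: dividing by $e^x$, the inequality reads $P(\mathrm{Poisson}(x) \le N) \le \theta$, and that CDF decreases monotonically from $1$ to $0$ in $x$, so the smallest admissible $x$ is well defined and can be found numerically by bisection. A sketch (the bracket `hi=1e6` is an arbitrary assumption, adequate only for moderate $N$):

```python
from math import exp, factorial

def poisson_cdf(x, N):
    # P(Poisson(x) <= N) = exp(-x) * sum_{k=0}^{N} x^k / k!
    return exp(-x) * sum(x ** k / factorial(k) for k in range(N + 1))

def smallest_x(theta, N, hi=1e6, tol=1e-9):
    # bisection: poisson_cdf(x, N) decreases in x, so the set of valid x
    # is a half-line [x*, infinity); return its left endpoint
    lo = 0.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if poisson_cdf(mid, N) <= theta:
            hi = mid          # inequality holds; try smaller x
        else:
            lo = mid
    return hi

print(smallest_x(0.5, 10))
```
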