pr.probability - Random algebraic numbers are linearly disjoint almost surely?

I already posted this question at MSE here, but since it received no answer or comment so far I cross-post it here. It is well-known that if one considers a “random” monic polynomial of fixed degree, say $X^n+\sum_{k=0}^{n-1}a_kX^k$ where $(a_0,a_1,\ldots, a_{n-1})$ is drawn from the discrete uniform distribution on $[-N,N]^{n}$, then this polynomial will be irreducible and have Galois group $S_n$ “almost surely”, i.e. the probability of this event tends to $1$ as $N\to \infty$. Now, suppose one considers two random monic polynomials $P=X^n+\sum_{k=0...Read more
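
A minimal Monte Carlo sketch of the one-polynomial statement (my own illustration, not from the post), using SymPy's irreducibility test; the degree $4$ and the sample sizes are arbitrary choices:

```python
# Estimate the probability that a random monic integer polynomial of
# degree n, with lower coefficients uniform in [-N, N], is irreducible
# over Q; the fraction should approach 1 as N grows.
import random
from sympy import Poly, symbols, ZZ

x = symbols('x')

def random_monic(n, N):
    # coefficients from leading to constant term: monic, then a_{n-1}..a_0
    coeffs = [1] + [random.randint(-N, N) for _ in range(n)]
    return Poly(coeffs, x, domain=ZZ)

def irreducible_fraction(n, N, trials=500):
    return sum(random_monic(n, N).is_irreducible for _ in range(trials)) / trials

for N in (1, 10, 100, 1000):
    print(N, irreducible_fraction(4, N))
```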

pr.probability - Non-central Wishart as mixture of central Wisharts?

The non-central chi-square distribution with $\nu$ degrees of freedom has a density which can be expressed as$$f(x) = \sum_{i=0}^{\infty} c_i f_{\nu + 2i}(x),$$where $c_i$ is a function of the non-centrality parameter and $i$, and where $f_{j}(x)$ is the density of a chi-square distribution with $j$ degrees of freedom. (The non-central chi-square distribution is a Poisson-weighted mixture of central chi-square distributions.) My question is whether this property extends to the multivariate case, the non-central Wishart distribution: can we ex...Read more
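
For the univariate identity one can check the Poisson-mixture representation numerically; a quick sketch with SciPy's standard parametrization (weights $c_i$ taken as Poisson$(\lambda/2)$ probabilities, where $\lambda$ is the non-centrality parameter):

```python
# Verify: ncx2(df, nc) density = sum_i Poisson(nc/2).pmf(i) * chi2(df + 2i) density.
import numpy as np
from scipy.stats import ncx2, chi2, poisson

df, nc = 5.0, 3.0
x = np.linspace(0.1, 20, 50)

mixture = sum(poisson.pmf(i, nc / 2) * chi2.pdf(x, df + 2 * i)
              for i in range(200))  # truncate the infinite series
print(np.max(np.abs(mixture - ncx2.pdf(x, df, nc))))  # ~ 1e-16
```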

pr.probability - Question on Jensen's inequality

Let $(X,Y)$ be a martingale on $\mathbb R$ and $\psi:\mathbb R\to\mathbb R$ be a convex function. Then it follows by Jensen's inequality that $$\mathbb E[\psi(X)]~~\le~~ \mathbb E[\psi(Y)],$$and if $\psi$ is strictly convex and $\mathbb E[\psi(X)]=\mathbb E[\psi(Y)]$, then $X=Y$ almost surely. Now let us consider the special convex function $\psi(x)=(x-K)^+$ (which is not strictly convex). My question is the following: if $\mathbb P[X\neq Y]>0$ and $\mathbb E[(X-K)^+]=\mathbb E[(Y-K)^+]$, could we show that $X, Y\le K$ or $X, Y\ge K$? Many thanks!...Read more
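
A tiny exact computation with a hypothetical two-point martingale pair can probe when equality of the two call values holds; here $X$ is uniform on $\{-1,1\}$ and $Y=X+\varepsilon$ with $\varepsilon$ uniform on $\{-1/2,1/2\}$, independent of $X$:

```python
# E[Y | X] = X and P[X != Y] = 1; compare E[(X-K)^+] and E[(Y-K)^+].
import itertools

xs = [-1.0, 1.0]
es = [-0.5, 0.5]

for K in (-2.0, -1.0, 0.0, 1.0, 2.0):
    cx = sum(max(x - K, 0.0) for x in xs) / 2
    cy = sum(max(x + e - K, 0.0) for x, e in itertools.product(xs, es)) / 4
    print(K, cx, cy)  # equal at K = -2, 0, 2; cx < cy at K = -1, 1
```

Note that in this example equality holds at $K=0$ even though the values of $X$ and $Y$ lie on both sides of $K$; conditionally on $X$, however, $Y$ never crosses $K$ there.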

pr.probability - Random walk over a function

Let $\{X_n\}_{n\geq 0}$ be a random walk. Assume that $\mathbb{E}X_1 =0$ and $\mathbb{E}X_1^2=1$. Let also $\mathbb{E}\exp(c|X_1|)<+\infty$ for some $c>0$, and let $X_1$ have a law with unbounded support. I conjecture that for any $A>0$ $$\mathbb{P}(\forall_{i\in \{1,2,\ldots,n\} }\ X_i \geq A \sqrt{i} ) \sim n^{-C},$$ where $C>0$ is some constant depending on $A$. I can prove this claim for some special classes of RWs (e.g. with Gaussian steps). Does anyone know of general results of this kind? Further, faster functions, e.g. $\mathbb{P}(...Read more
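
A brute-force Monte Carlo sketch for the Gaussian-step case (an illustration only; the constants and sample sizes are arbitrary):

```python
# Estimate p_m = P(X_i >= A*sqrt(i) for all i <= m) and the apparent
# polynomial exponent log p_m / log m for Gaussian steps.
import numpy as np

rng = np.random.default_rng(1)
A, n, trials = 0.5, 100, 100_000

walks = np.cumsum(rng.standard_normal((trials, n)), axis=1)
boundary = A * np.sqrt(np.arange(1, n + 1))

for m in (10, 30, 100):
    p = np.all(walks[:, :m] >= boundary[:m], axis=1).mean()
    print(m, p, np.log(p) / np.log(m))  # exponent should level off near -C
```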

pr.probability - On the moments of Lévy processes

For a Brownian motion $B_t$, the evolution of the moments with $t$ obeys the simple rule:$$\mathbb{E}[|B_t|^p] = \kappa_p\, t^{p/2},$$with $\kappa_p<\infty$. The proof only requires noting that the random variables $\frac{B_t}{\sqrt{t}}$ are Gaussian with variance $1$. I am interested to know whether, more generally, this question has been studied for Lévy processes. Of course, in the general case, the moments can be infinite. When they are not, do we have an estimate of the evolution of $\mathbb{E}[|X_t|^p]$ for a Lévy process $X_t$, or at least som...Read more
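
For self-similar examples the scaling can at least be illustrated numerically; a sketch assuming SciPy's `levy_stable` parametrization, using the exact self-similarity $X_t \overset{d}{=} t^{1/\alpha} X_1$ to sample (so the constancy below is built in, which is precisely the point):

```python
# E[|B_t|^p] / t^(p/2) and E[|X_t|^p] / t^(p/alpha) should be constant
# in t (here p < alpha, so the stable moment is finite).
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(2)
p, alpha = 0.5, 1.5

for t in (0.1, 1.0, 10.0):
    bm = np.sqrt(t) * rng.standard_normal(10**5)
    st = t ** (1 / alpha) * levy_stable.rvs(alpha, 0.0, size=10**5,
                                            random_state=rng)
    print(t,
          np.mean(np.abs(bm) ** p) / t ** (p / 2),
          np.mean(np.abs(st) ** p) / t ** (p / alpha))
```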

pr.probability - Suprema of stochastic processes

Let $X$ be a continuous stochastic process. I know that, for $t>s$, $$\mathbb{P}(|X(t) - X(s)|>\delta) < \frac{|t-s|}{\delta}.$$ Is it possible to say anything (e.g. estimate the decay of the tail) about $$Y=\sup_{s \in [0,1]} |X(s)|?$$ I suspect that the answer is no (it would be an easy question if we had $|t-s|^{1+a}$ for some $a>0$). I wonder what the "standard" methods are for estimating the suprema of continuous stochastic processes. So far in my problem I have tried to use the Garsia–Rodemich–Rumsey inequality....Read more
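
Not an answer to the general question, but a numerical look at the Brownian special case, where the reflection principle gives the tail bound $\mathbb{P}(Y>y)\le 4\,\mathbb{P}(B_1>y)$:

```python
# Monte Carlo tail of Y = sup_{[0,1]} |B_s| vs. the reflection bound.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
trials, n = 20_000, 500
paths = np.cumsum(rng.standard_normal((trials, n)) / np.sqrt(n), axis=1)
Y = np.abs(paths).max(axis=1)

for y in (1.0, 1.5, 2.0):
    print(y, (Y > y).mean(), 4 * norm.sf(y))
```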

pr.probability - Entropy of edit distance

The edit or Levenshtein distance between two strings is the minimum number of single-character insertions, deletions and substitutions needed to transform one string into the other. If we take random binary strings $A$ and $B$ of the same length $n$, is it known what the entropy of the edit distance between $A$ and $B$ is? Update. I would be happy with a proof that the entropy is asymptotically $c \log n$ for some (unknown) $c>0$, as suggested by Anthony Quas (or even $\Theta(\log{n})$, which potentially saves having to prove convergence)....Read more
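
An empirical sketch (my own, with arbitrary sample sizes): estimate the entropy of the edit-distance distribution from samples and compare its growth against $\log n$:

```python
# Sample random binary strings, compute Levenshtein distances with the
# standard dynamic program, and estimate the Shannon entropy in bits.
import numpy as np

rng = np.random.default_rng(4)

def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

for n in (16, 32, 64):
    d = [levenshtein(rng.integers(0, 2, n).tolist(),
                     rng.integers(0, 2, n).tolist()) for _ in range(2000)]
    p = np.bincount(d) / len(d)
    p = p[p > 0]
    print(n, -(p * np.log2(p)).sum())  # compare with c * log2(n)
```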

pr.probability - What is this probability distribution?

Suppose we have a family $F_0,F_1,\dots$ of independent random variables which take the value $1$ with probability $p$ and $0$ otherwise; let $\delta$ be a number between $0$ and $1$. Let $X_n = \sum_{k=0}^n \delta^{n-k} F_k$. I'm interested in the distribution of $X_n$. It seems straightforward enough to be known and have a name - does anybody know what it is?...Read more
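
Equivalently $X_n$ solves the AR(1) recursion $X_n=\delta X_{n-1}+F_n$, which is easy to simulate; a sketch checking the stationary mean and variance (parameter values are arbitrary):

```python
# Iterate X <- delta * X + Bernoulli(p) and compare empirical moments
# with the stationary values p/(1-delta) and p(1-p)/(1-delta^2).
import numpy as np

rng = np.random.default_rng(5)
p, delta, n, trials = 0.5, 0.7, 200, 200_000

X = np.zeros(trials)
for _ in range(n):
    X = delta * X + (rng.random(trials) < p)

print(X.mean(), p / (1 - delta))
print(X.var(), p * (1 - p) / (1 - delta**2))
```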

pr.probability - Tighter Carathéodory on the moment curve?

The moment curve is the set of points of the form $$(t,t^2,t^3,\ldots,t^n) \in \mathbb{R}^n.$$ Let $M$ be the portion of the moment curve where $t\in [0,1]$, and let $\overline{M}$ be the convex hull of $M$. Carathéodory's theorem tells us that any point in $\overline{M}$ can be expressed as a convex combination of $n+1$ points in $M$. However, when $n=1$ and $n=2$, we can get by with only $n$ points instead of $n+1$ points. Question: Is this true for all $n$? That is, can we express any point in $\overline{M}$ as a convex combination of at most $n$ points from...Read more
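
One can at least experiment numerically: discretize $M$, pick a random point of the hull, and ask an LP solver for a representation; basic feasible solutions of the LP use at most $n+1$ atoms (the $n$ moment constraints plus the mass constraint), so the interest is in whether the count drops to $n$. A sketch with `scipy.optimize.linprog`:

```python
# Represent a random point of conv(M) as a convex combination of grid
# points on the moment curve and count the atoms the solver uses.
import numpy as np
from scipy.optimize import linprog

n, grid = 4, 2001
t = np.linspace(0, 1, grid)
M = np.vstack([t**k for k in range(1, n + 1)])  # columns = curve points

rng = np.random.default_rng(6)
w = rng.dirichlet(np.ones(50))
target = M[:, rng.choice(grid, 50)] @ w  # a random point of the hull

A_eq = np.vstack([M, np.ones(grid)])
b_eq = np.append(target, 1.0)
res = linprog(c=np.zeros(grid), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(res.status, int(np.sum(res.x > 1e-9)))  # number of atoms used
```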

pr.probability - Exchangeable normal distribution mixing measure

I have a zero-mean multivariate normal probability distribution where WLOG each marginal variance is unity and all pairwise correlation coefficients are equal and positive. The number of elements in the random vector is arbitrary. I am interested in the probability distribution of the number of elements which exceed some threshold -- the same threshold for each element. The outcomes for the elements are identically distributed, dependent Bernoulli random variables, so by de Finetti's exchangeability theorem this probability distribu...Read more
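
The equicorrelated case has the standard one-factor representation $X_i=\sqrt{\rho}\,Z+\sqrt{1-\rho}\,\varepsilon_i$ with iid standard normal $Z,\varepsilon_i$, which makes the mixing measure explicit: conditionally on $Z=z$ the exceedance indicators are iid Bernoulli$(q(z))$. A sketch of the resulting count distribution (my own parameter choices):

```python
# P(count = k) = int binom.pmf(k, d, q(z)) * phi(z) dz, where
# q(z) = P(X_i > c | Z = z) = Phi_bar((c - sqrt(rho) z) / sqrt(1 - rho)).
import numpy as np
from scipy.stats import norm, binom
from scipy.integrate import quad

rho, c, d = 0.3, 1.0, 5  # correlation, threshold, dimension

def q(z):
    return norm.sf((c - np.sqrt(rho) * z) / np.sqrt(1 - rho))

def pmf(k):
    return quad(lambda z: binom.pmf(k, d, q(z)) * norm.pdf(z), -10, 10)[0]

probs = [pmf(k) for k in range(d + 1)]
print(probs, sum(probs))  # the probabilities sum to 1
```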

pr.probability - Gaussian distributions as fixed points in Some distribution space

I'm taking a course on topology and probability. Today, the professor remarked something along the lines of: "If you look at the space of probability distributions with $0$ mean and variance $1$, equipped with convolution, then the Gaussian distribution is characterized as the fixed point of each orbit." He also said this was a nice way to appreciate the importance of the Gaussian distribution, and to gain insight into the central limit theorem. I asked for references on this point of view, but he said it's not standard and recalled only hearing abo...Read more
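
The fixed-point statement is usually phrased for the convolve-and-rescale map $\mu \mapsto$ law of $(X+X')/\sqrt 2$, which preserves mean $0$ and variance $1$; a Monte Carlo sketch of the orbit converging to $N(0,1)$ (pairing a sample with a random permutation of itself as a stand-in for an independent copy):

```python
# Iterate X <- (X + X') / sqrt(2) starting from a uniform law with
# mean 0 and variance 1, and track the KS distance to N(0, 1).
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(7)
X = rng.uniform(-np.sqrt(3), np.sqrt(3), size=10**6)

for it in range(6):
    X = (X + rng.permutation(X)) / np.sqrt(2)
    print(it, kstest(X[:5000], 'norm').statistic)  # shrinks toward 0
```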

pr.probability - Bound on the total variation distance for multiple samples $d_{TV}(P^n,Q^n)$

Given two discrete distributions $P$ and $Q$, with computable total variation distance $d_{TV}(P,Q)=||P - Q||_1$, is there a precise bound for $d_{TV}(P^n,Q^n)=||P^n - Q^n||_1$, as needed to estimate the power of an optimal test for multiple samples? Moreover, is it possible to exactly compute $d_{TV}(P^n,Q^n)=||P^n - Q^n||_1$ without enumerating all combinations? The best bound that I could find is based on the Chernoff information, in "The complexity of distinguishing distributions"....Read more
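
For small alphabets and small $n$ one can compute $d_{TV}(P^n,Q^n)$ exactly by enumeration and compare with the Hellinger tensorization bound (a sketch; I use the $\tfrac12\|\cdot\|_1$ normalization of TV, together with $H^2(P^n,Q^n)=1-(1-H^2(P,Q))^n$ and $TV \le \sqrt{H^2(2-H^2)}$):

```python
# Exact TV between product measures (exponential in n) vs. the
# Hellinger-based upper bound.
import itertools
import numpy as np

P = np.array([0.5, 0.3, 0.2])
Q = np.array([0.4, 0.4, 0.2])

def tv_product(P, Q, n):
    total = 0.0
    for outcome in itertools.product(range(len(P)), repeat=n):
        idx = list(outcome)
        total += abs(np.prod(P[idx]) - np.prod(Q[idx]))
    return total / 2

H2 = 1 - np.sum(np.sqrt(P * Q))  # squared Hellinger distance
for n in (1, 2, 5, 8):
    H2n = 1 - (1 - H2) ** n      # tensorization
    print(n, tv_product(P, Q, n), np.sqrt(H2n * (2 - H2n)))
```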

pr.probability - Reference request for a "well-known identity" in a paper of Shepp and Lloyd

I ran into a "well-known identity" on page 345 of Shepp and Lloyd's On ordered cycle lengths in a random permutation:$$\int_x^{\infty} \frac{\exp(-y)}y dy = \int_0^x \frac{1-\exp(-y)}y dy - \log x - \gamma, $$ where $\gamma$ is the Euler constant. I am clueless as to how it is derived. Any reference to the derivation of such formulae would suffice, but an explicit solution will also be appreciated....Read more
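
The left-hand side is the exponential integral $E_1(x)$, and $\int_0^x \frac{1-\exp(-y)}{y}\,dy$ is the entire function usually written $\mathrm{Ein}(x)$; the identity $E_1(x)=\mathrm{Ein}(x)-\log x-\gamma$ is standard (see e.g. Abramowitz and Stegun, Chapter 5). A quick numerical confirmation with SciPy:

```python
# Check E_1(x) = Ein(x) - log(x) - gamma numerically.
import numpy as np
from scipy.special import exp1
from scipy.integrate import quad

for x in (0.1, 1.0, 5.0):
    lhs = exp1(x)  # = int_x^inf exp(-y)/y dy
    ein = quad(lambda y: (1 - np.exp(-y)) / y, 0, x)[0]
    print(x, lhs, ein - np.log(x) - np.euler_gamma)
```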

pr.probability - Extracting moments from a special Z-transform

Suppose I have a sequence of positive continuous random variables $\{X_k\}_{k=1}^\infty$ with (unknown) MGFs $M_{X_k}(s)$. Furthermore, it is known that\begin{equation}\frac{X_n-n\mu}{\sqrt{n}\sigma}\rightarrow\mathcal{N}(0,1)\end{equation}in distribution, for some known $\mu$ and unknown $\sigma$. Given the function\begin{equation}F[z,s]=\sum_{n=0}^\infty z^{-n} M_{X_n}(s),\end{equation}is it possible to extract $\sigma$ without the use of inverse transforms? For example:\begin{equation}F[z,s]=\frac{zs}{1-e^s+zs}.\end{equation}Answer: $\sigma^2=\frac{1}{12}$....Read more
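
For the stated example the extraction can be made concrete with a computer algebra sketch (my own illustration): expanding $F[z,s]$ in powers of $1/z$ recovers $M_{X_n}(s)=\left(\frac{e^s-1}{s}\right)^n$, the MGF of a sum of $n$ iid Uniform$(0,1)$ variables, whence $\sigma^2=\frac{1}{12}$:

```python
# Recover M_{X_n}(s) as the coefficient of z^{-n} in F[z, s], then read
# off the variance from the cumulant generating function log M_{X_n}.
import sympy as sp

z, s, w = sp.symbols('z s w')
F = z * s / (1 - sp.exp(s) + z * s)
G = sp.simplify(F.subs(z, 1 / w))  # expand in w = 1/z

n = 3
Mn = sp.simplify(sp.series(G, w, 0, n + 1).removeO().coeff(w, n))
print(Mn)  # ((exp(s) - 1)/s)**n

var = sp.limit(sp.diff(sp.log(Mn), s, 2), s, 0)  # Var(X_n) = n * sigma^2
print(sp.simplify(var / n))  # 1/12
```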

pr.probability - Inequality of Partial Taylor Series

Hi, for a given $\theta < 1$ and $N$ a positive integer, I am trying to find an $x > 0$ (preferably the smallest such $x$) such that the following inequality holds:$$\sum_{k=0}^{N} \frac{x^k}{k!} \leq \theta e^{x}.$$In my application, $N$ itself is an integer function of $x$, i.e. $N = N(x)$, but for simplicity's sake let's assume $N$ is given for now. Any ideas? Thanks for reading. Fred...Read more
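
One reformulation that makes this computable: dividing by $e^x$, the inequality reads $\mathbb{P}(\mathrm{Poisson}(x)\le N)\le\theta$, and the left side equals the regularized incomplete gamma function $Q(N+1,x)$, which decreases from $1$ to $0$ in $x$; so the smallest valid $x$ solves $Q(N+1,x)=\theta$. A root-finding sketch with SciPy:

```python
# Smallest x with sum_{k<=N} x^k/k! <= theta * e^x, via
# Q(N+1, x) = gammaincc(N+1, x) = P(Poisson(x) <= N).
from scipy.optimize import brentq
from scipy.special import gammaincc

def smallest_x(theta, N):
    # Q(N+1, 0) = 1 > theta and Q(N+1, x) -> 0, so a root is bracketed
    return brentq(lambda x: gammaincc(N + 1, x) - theta, 1e-12, 10 * (N + 20))

for theta in (0.9, 0.5, 0.1):
    print(theta, smallest_x(theta, 10))
```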