Log-concavity and log-convexity of moments of averages of i.i.d. random variables

We show that the sequence of moments of order less than 1 of averages of i.i.d. positive random variables is log-concave. For moments of order at least 1, we conjecture that the sequence is log-convex and show that this holds eventually for integer moments (after neglecting the first $p^2$ terms of the sequence).


Introduction and results
Suppose $X_1, X_2, \ldots$ are i.i.d. copies of a positive random variable and $f$ is a nonnegative function. This article is concerned with certain combinatorial properties of the sequence
$$a_n = \mathbb{E} f\!\left(\frac{X_1+\cdots+X_n}{n}\right), \qquad n = 1, 2, \ldots. \tag{1}$$
For instance, $f(x) = x^p$ is a fairly natural choice, leading to the sequence of moments of averages of the $X_i$. Since we have the identity
$$\frac{X_1+\cdots+X_{n+1}}{n+1} = \mathbb{E}\left[\frac{X_1+\cdots+X_n}{n} \,\middle|\, X_1+\cdots+X_{n+1}\right],$$
Jensen's inequality gives $a_{n+1} \leq a_n$, so we conclude that the sequence $(a_n)_{n=1}^{\infty}$ is nonincreasing when $f$ is convex. What about inequalities involving more than two terms?
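As a quick numerical sanity check of this monotonicity (not part of the argument), one can compute $a_n$ exactly for a convex $f$; the sketch below takes $f(x) = x^2$ and $X_i$ Bernoulli with parameter $\theta = 3/10$, choices made purely for illustration, for which $a_n = \theta^2 + \theta(1-\theta)/n$.

```python
from fractions import Fraction
from math import comb

def a(n, theta, f):
    # Exact value of E f((X_1+...+X_n)/n) for X_i i.i.d. Bernoulli(theta):
    # the sum S_n is binomial, so we sum over its atoms.
    return sum(comb(n, k) * theta**k * (1 - theta)**(n - k) * f(Fraction(k, n))
               for k in range(n + 1))

theta = Fraction(3, 10)
vals = [a(n, theta, lambda x: x * x) for n in range(1, 9)]

# For convex f the sequence (a_n) should be nonincreasing;
# here a_n = theta^2 + theta(1-theta)/n exactly.
assert all(vals[i] >= vals[i + 1] for i in range(len(vals) - 1))
assert all(v == theta**2 + theta * (1 - theta) / n for n, v in enumerate(vals, start=1))
```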
Recall that a nonnegative sequence $(x_n)_{n=1}^{\infty}$ supported on a set of contiguous integers is called log-convex (resp. log-concave) if $x_n^2 \leq x_{n-1}x_{n+1}$ (resp. $x_n^2 \geq x_{n-1}x_{n+1}$) for all $n \geq 2$ (for background on log-convex/concave sequences, see for instance [8,12]). One of the crucial properties of log-convex sequences is that log-convexity is preserved by taking sums (which follows from the Cauchy-Schwarz inequality, see for instance [8]). Recall that an infinitely differentiable function $f\colon (0,\infty) \to (0,\infty)$ is called completely monotone if we have $(-1)^n f^{(n)}(x) \geq 0$ for all positive $x$ and $n = 1, 2, \ldots$; equivalently, by Bernstein's theorem (see for instance [7]), the function $f$ is the Laplace transform of a nonnegative Borel measure $\mu$ on $[0,+\infty)$, that is
$$f(x) = \int_0^{\infty} e^{-tx}\, \mathrm{d}\mu(t), \qquad x > 0. \tag{3}$$
For example, when $p < 0$, the function $f(x) = x^p$ is completely monotone, with $\mathrm{d}\mu(t) = \frac{t^{-p-1}}{\Gamma(-p)}\,\mathrm{d}t$. Such integral representations are at the heart of our first two results.
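The closure of log-convexity under sums can be illustrated numerically. The sketch below (all parameters are arbitrary choices of ours) generates moment-type sequences $x_n = \sum_j w_j r_j^n$ with $w_j, r_j > 0$, which are log-convex by the Cauchy-Schwarz inequality, and checks that their sum is again log-convex.

```python
import random

random.seed(0)

def is_log_convex(x):
    # x_n^2 <= x_{n-1} x_{n+1} for all interior indices (with a small
    # relative tolerance to absorb floating-point error)
    return all(x[n] ** 2 <= x[n - 1] * x[n + 1] * (1 + 1e-9)
               for n in range(1, len(x) - 1))

def random_log_convex(n_terms=30):
    # Mixture of geometric sequences sum_j w_j r_j^n, i.e. the moment
    # sequence of an atomic measure; log-convex by Cauchy-Schwarz.
    w = [random.uniform(0.1, 1.0) for _ in range(3)]
    r = [random.uniform(0.1, 2.0) for _ in range(3)]
    return [sum(wj * rj ** n for wj, rj in zip(w, r)) for n in range(n_terms)]

u, v = random_log_convex(), random_log_convex()
assert is_log_convex(u) and is_log_convex(v)
assert is_log_convex([a + b for a, b in zip(u, v)])  # closure under sums
```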
Theorem 1. Let $X_1, X_2, \ldots$ be i.i.d. positive random variables and let $f\colon (0,\infty) \to (0,\infty)$ be completely monotone. Then the sequence $(a_n)$ defined in (1) is log-convex.

Theorem 2. Let $X_1, X_2, \ldots$ be i.i.d. positive random variables and let $f\colon [0,\infty) \to [0,\infty)$ be such that $f(0) = 0$ and $f'$ is completely monotone. Then the sequence $(a_n)$ defined in (1) is log-concave.

In particular, applying these to the functions $f(x) = x^p$ with $p < 0$ and $0 < p < 1$ respectively, we obtain the following corollary.

Corollary 3. Let $X_1, X_2, \ldots$ be i.i.d. positive random variables and let
$$b_n = \mathbb{E}\left(\frac{X_1+\cdots+X_n}{n}\right)^p, \qquad n = 1, 2, \ldots. \tag{4}$$
If $p < 0$, the sequence $(b_n)$ is log-convex, and if $0 < p < 1$, it is log-concave.

For $p > 1$, we pose the following conjecture.

Conjecture 1. Let $X_1, X_2, \ldots$ be i.i.d. nonnegative random variables and let $p > 1$. Then the sequence $(b_n)$ defined in (4) is log-convex.

We offer a partial result supporting this conjecture.
Theorem 4. Let $X_1, X_2, \ldots$ be i.i.d. nonnegative random variables, let $p$ be a positive integer and let $b_n$ be defined by (4). Then for every $n \geq p^2$, we have $b_n^2 \leq b_{n-1}b_{n+1}$.

Remark 5. By homogeneity we can assume that $\mu_1 = \mathbb{E}X_1 = 1$; write $\mu_k = \mathbb{E}X_1^k$. For $p = 2$ we then have
$$b_n = \frac{n\mu_2 + n(n-1)}{n^2} = 1 + \frac{\mu_2 - 1}{n},$$
which is clearly a log-convex sequence (as a sum of two log-convex sequences). The following argument for $p = 3$ was kindly communicated to us by Krzysztof Oleszkiewicz: expanding as before,
$$b_n = 1 + 3(\mu_2 - 1)n^{-1} + (\mu_3 - 3\mu_2 + 2)n^{-2} = 1 + (\mu_2 - 1)(3n^{-1} - n^{-2}) + (\mu_3 - 2\mu_2 + 1)n^{-2}.$$
The sequences $(n^{-2})$ and $(3n^{-1} - n^{-2})$ are log-convex. By the Cauchy-Schwarz inequality, $\mu_3 \geq \mu_2^2 \geq 2\mu_2 - 1$, so the factor at $n^{-2}$ is nonnegative, so again $(b_n)$ is log-convex as a sum of three log-convex sequences. It remains elusive how to group terms and proceed along these lines in general. Our proof of Theorem 4 relies on this idea, but uses a straightforward way of rearranging terms.
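Oleszkiewicz's grouping for $p = 3$ can be checked numerically: assuming $\mu_1 = 1$, one has $b_n = 1 + (\mu_2-1)(3n^{-1}-n^{-2}) + (\mu_3-2\mu_2+1)n^{-2}$. The sketch below verifies this identity, and log-convexity, in exact arithmetic for one two-point distribution of our choosing.

```python
from fractions import Fraction

# An arbitrary two-point distribution with mean 1 (our choice for this check):
# X = 1/2 or 3/2, each with probability 1/2.
atoms = [(Fraction(1, 2), Fraction(1, 2)), (Fraction(3, 2), Fraction(1, 2))]
mu2 = sum(p * x**2 for x, p in atoms)
mu3 = sum(p * x**3 for x, p in atoms)

def b(n):
    # E((X_1+...+X_n)/n)^3 via the multinomial expansion (mu1 = 1):
    # E S_n^3 = n mu3 + 3 n(n-1) mu2 + n(n-1)(n-2).
    return (n * mu3 + 3 * n * (n - 1) * mu2 + n * (n - 1) * (n - 2)) / Fraction(n**3)

for n in range(2, 20):
    grouped = 1 + (mu2 - 1) * (Fraction(3, n) - Fraction(1, n**2)) \
                + (mu3 - 2 * mu2 + 1) * Fraction(1, n**2)
    assert b(n) == grouped                   # the p = 3 grouping
    assert b(n) ** 2 <= b(n - 1) * b(n + 1)  # log-convexity
```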
Remark 6. It would be tempting to use the aforementioned result of Boland et al. with $\varphi(x,y) = (xy)^p$ to resolve Conjecture 1. However, this function is neither convex nor concave on $(0,+\infty)^2$ for $p > \frac{1}{2}$. For $0 < p < \frac{1}{2}$, the function is concave and (2) yields the log-concavity inequality only up to a factor of $\frac{n^2-1}{n^2}$; Corollary 3 improves on this by removing that factor.

Concluding this introduction, it is of significant interest to study the log-behaviour of various sequences, particularly those emerging from algebraic, combinatorial, or geometric structures; this study has prompted the development of many deep and interesting methods, often useful beyond the original problems (see, e.g., [3,4,5,6,10,11,12,13]). We propose to consider sequences of moments of averages of i.i.d. random variables, which arise naturally in probabilistic limit theorems. For moments of order less than 1, we employ an analytical approach exploiting integral representations of power functions. For moments of order higher than 1, our Conjecture 1, besides refining the monotonicity property of the sequence $(b_n)$ (resulting from convexity), would furnish new examples of log-convex sequences. For instance, it seems neither trivial nor handled by known techniques to determine whether the sequence obtained by taking the Bernoulli distribution with parameter $\theta \in (0,1)$,
$$b_n = \sum_{k=0}^{n} \binom{n}{k} \left(\frac{k}{n}\right)^p \theta^k (1-\theta)^{n-k},$$
is log-convex. In the case of integral $p$, we have
$$b_n = \sum_{k=0}^{p} S(p,k)\, \frac{n!}{(n-k)!\, n^p}\, \theta^k,$$
where $S(p,k)$ is the Stirling number of the second kind.
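The two expressions for $b_n$ in the Bernoulli case can be cross-checked for integral $p$: the second follows from the falling-factorial moments of the binomial distribution. The sketch below (the value of $\theta$ is an arbitrary choice) verifies the agreement in exact arithmetic.

```python
from fractions import Fraction
from math import comb, factorial

def stirling2(p, k):
    # Stirling numbers of the second kind via inclusion-exclusion:
    # S(p,k) = (1/k!) sum_j (-1)^(k-j) C(k,j) j^p (exact integer arithmetic)
    return sum((-1) ** (k - j) * comb(k, j) * j ** p for j in range(k + 1)) // factorial(k)

def b_direct(n, p, theta):
    # b_n = E (K/n)^p for K ~ Bin(n, theta), summing over the atoms of K
    return sum(comb(n, k) * Fraction(k, n) ** p * theta ** k * (1 - theta) ** (n - k)
               for k in range(n + 1))

def b_stirling(n, p, theta):
    # b_n = sum_k S(p,k) n!/((n-k)! n^p) theta^k; terms with k > n vanish
    # (the falling factorial is 0), so we cap the range at min(p, n).
    return sum(stirling2(p, k) * Fraction(factorial(n), factorial(n - k) * n ** p) * theta ** k
               for k in range(min(p, n) + 1))

theta = Fraction(1, 3)
for p in (2, 3, 4):
    for n in range(1, 10):
        assert b_direct(n, p, theta) == b_stirling(n, p, theta)
```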
The rest of this paper is occupied with the proofs of Theorems 1, 2, 4 (in their order of statement) and then we conclude with additional remarks and conjectures.
Proof of Theorem 1

By Bernstein's theorem, write $f(x) = \int_0^{\infty} e^{-tx}\, \mathrm{d}\mu(t)$ for a nonnegative Borel measure $\mu$, as in (3). By Fubini's theorem,
$$a_n = \mathbb{E} f\!\left(\frac{X_1+\cdots+X_n}{n}\right) = \int_0^{\infty} u_n(t)\, \mathrm{d}\mu(t), \qquad u_n(t) = \mathbb{E}\, e^{-t\frac{X_1+\cdots+X_n}{n}} = \left(\mathbb{E}\, e^{-tX_1/n}\right)^n.$$
It suffices to know that for every $t > 0$ the sequence $(u_n(t))$ is log-convex (because sums/integrals of log-convex sequences are log-convex: the Cauchy-Schwarz inequality applied to the measure $\mu$ yields
$$\left(\int_0^{\infty} \sqrt{u_{n-1}(t)\, u_{n+1}(t)}\, \mathrm{d}\mu(t)\right)^2 \leq \int_0^{\infty} u_{n-1}(t)\, \mathrm{d}\mu(t) \int_0^{\infty} u_{n+1}(t)\, \mathrm{d}\mu(t),$$
which, combined with $u_n(t)^2 \leq u_{n-1}(t)\, u_{n+1}(t)$, gives $a_n^2 \leq a_{n-1}a_{n+1}$). The log-convexity of $(u_n(t))$ follows from Hölder's inequality, applied with the conjugate exponents $\frac{2n}{n-1}$ and $\frac{2n}{n+1}$ to the splitting $\frac{t}{n} = \frac{n-1}{2n}\cdot\frac{t}{n-1} + \frac{n+1}{2n}\cdot\frac{t}{n+1}$, which finishes the proof.

Proof of Theorem 2
Suppose now that $f(0) = 0$ and $f'$ is completely monotone, say $f'(x) = \int_0^{\infty} e^{-tx}\, \mathrm{d}\mu(t)$ for some nonnegative Borel measure $\mu$ on $(0,\infty)$ (by (3)). Integrating against $\mathrm{d}x$ gives
$$f(x) = \int_0^x f'(s)\, \mathrm{d}s = \int_0^{\infty} \frac{1 - e^{-tx}}{t}\, \mathrm{d}\mu(t).$$
Introducing a new measure $\mathrm{d}\nu(t) = t^{-1}\, \mathrm{d}\mu(t)$, we can thus write $f(x) = \int_0^{\infty} (1 - e^{-tx})\, \mathrm{d}\nu(t)$. Let $F$ be the Laplace transform of $X_1$, that is $F(t) = \mathbb{E}\, e^{-tX_1}$. Then
$$a_n = \int_0^{\infty} G(n, t)\, \mathrm{d}\nu(t),$$
where, to shorten the notation, we introduce the following nonnegative function
$$G(\alpha, t) = 1 - F\!\left(\frac{t}{\alpha}\right)^{\alpha}, \qquad \alpha, t > 0.$$
To show the inequality $a_n^2 \geq a_{n-1}a_{n+1}$, it suffices to show that pointwise
$$G(n,s)\, G(n,t) \geq \frac12\Big( G(n-1,s)\, G(n+1,t) + G(n+1,s)\, G(n-1,t) \Big) \qquad \text{for all } s, t > 0.$$
This follows from two properties of the function $G$: 1) for every fixed $t > 0$ the function $\alpha \mapsto G(\alpha, t)$ is nondecreasing, 2) the function $G(\alpha, t)$ is concave on $(0,\infty) \times (0,\infty)$.

Indeed, by 2) we have (in fact we only use concavity in the first argument)
$$G(n,s)\, G(n,t) \geq \frac{1}{4}\big(G(n-1,s) + G(n+1,s)\big)\big(G(n-1,t) + G(n+1,t)\big).$$
It thus suffices to prove that
$$\big(G(n+1,s) - G(n-1,s)\big)\big(G(n+1,t) - G(n-1,t)\big)$$
is nonnegative, which follows by 1).
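The two properties of $G(\alpha,t) = 1 - F(t/\alpha)^{\alpha}$ used here can be probed numerically. The sketch below takes $X_1$ exponential with rate 1, so that $F(t) = 1/(1+t)$ (an assumption made only for this check), and tests monotonicity in $\alpha$ and midpoint concavity at randomly drawn points.

```python
import itertools
import random

def G(alpha, t):
    # G(alpha, t) = 1 - F(t/alpha)^alpha with F(t) = 1/(1+t),
    # the Laplace transform of an Exp(1) random variable.
    return 1.0 - (1.0 + t / alpha) ** (-alpha)

random.seed(1)
pts = [(random.uniform(0.2, 5.0), random.uniform(0.2, 5.0)) for _ in range(200)]

# 1) alpha -> G(alpha, t) is nondecreasing
for alpha, t in pts:
    assert G(alpha + 0.7, t) >= G(alpha, t) - 1e-12

# 2) midpoint concavity of G on (0, inf)^2
for (a1, t1), (a2, t2) in itertools.combinations(pts[:40], 2):
    mid = G((a1 + a2) / 2.0, (t1 + t2) / 2.0)
    assert mid >= (G(a1, t1) + G(a2, t2)) / 2.0 - 1e-9
```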
Proof of Theorem 4

Expanding the multinomial and taking the expectation, we have obtained
$$b_n = \sum_{m=1}^{p} \frac{n(n-1)\cdots(n-m+1)}{n^p} \sum_{q \in Q_m} \beta(q)\, \mu(q), \tag{5}$$
where $Q_m$ denotes the set of multisets $q = \{q_1, \ldots, q_m\}$ of positive integers with $q_1 + \cdots + q_m = p$, $\beta(q) = \frac{p!}{\alpha(q)\cdot q_1! \cdots q_m!}$ with $\alpha(q) = \prod_{j \geq 1} r_j!$, where $r_j$ is the number of indices $i$ with $q_i = j$, and $\mu(q) = \mu_{q_1} \cdots \mu_{q_m}$, $\mu_k = \mathbb{E}X_1^k$. By homogeneity, we can assume that $\mu_1 = \mathbb{E}X_1 = 1$. Note that when $X_1$ is constant equal to $1$, we get from (5) that
$$1 = \sum_{m=1}^{p} \frac{n(n-1)\cdots(n-m+1)}{n^p} \sum_{q \in Q_m} \beta(q).$$
Since $Q_p$ has only one element, namely $\{1, \ldots, 1\}$, and $\mu(\{1, \ldots, 1\}) = 1$, when we subtract the two equations, the terms corresponding to $m = p$ cancel and we get
$$b_n = 1 + \sum_{m=1}^{p-1} \frac{n(n-1)\cdots(n-m+1)}{n^p} \sum_{q \in Q_m} \beta(q)\big(\mu(q) - 1\big).$$
By the monotonicity of moments, $\mu(q) \geq 1$ for every $q$, so $(b_n)$ is a sum of the constant sequence $(1, 1, \ldots)$ and the sequences $(u_n^{(m)})$, $u_n^{(m)} = \frac{n(n-1)\cdots(n-m+1)}{n^p}$, $1 \leq m \leq p-1$, taken with nonnegative coefficients. Theorem 4 thus follows from the next lemma.

Lemma 7. Let $p \geq 2$ and $1 \leq m \leq p-1$. Then the sequence $(u_n^{(m)})_{n \geq p^2}$ is log-convex.

Proof. The statement is clear for $m = 1$, since $u_n^{(1)} = n^{1-p}$. Let $2 \leq m \leq p-1$ and $p \geq 3$. It suffices to show that the function
$$h(x) = \log \frac{x(x-1)\cdots(x-m+1)}{x^p} = \sum_{k=0}^{m-1} \log(x-k) - p\log x$$
is convex on $[p^2-1, \infty)$. We have
$$h''(x) = \frac{p}{x^2} - \sum_{k=0}^{m-1} \frac{1}{(x-k)^2} = \frac{1}{x^2}\left(p - \sum_{k=0}^{m-1}\left(\frac{x}{x-k}\right)^2\right).$$
To see that this is positive for every $x \geq p^2 - 1$ and $2 \leq m \leq p-1$, it suffices to consider $m = p-1$ and $x = p^2-1$ (writing $\frac{x}{x-k} = 1 + \frac{k}{x-k}$, we see that each summand is decreasing in $x$, so the right hand side is increasing in $x$, and it clearly decreases as $m$ grows). Since for $x = p^2-1$ and $0 \leq k \leq p-2$ we have $x - k \geq p^2 - p + 1 > p(p-1)$, we obtain
$$\sum_{k=0}^{p-2}\left(1 + \frac{k}{x-k}\right)^2 \leq \sum_{k=0}^{p-2}\left(1 + \frac{2k}{p(p-1)} + \frac{k^2}{p^2(p-1)^2}\right) = (p-1) + \frac{p-2}{p} + \frac{(p-2)(2p-3)}{6p^2(p-1)},$$
so
$$p - \sum_{k=0}^{p-2}\left(\frac{x}{x-k}\right)^2 \geq \frac{2}{p} - \frac{(p-2)(2p-3)}{6p^2(p-1)},$$
which is clearly positive.
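The conclusion about the sequences $u_n^{(m)} = n(n-1)\cdots(n-m+1)/n^p$ is easy to test numerically; the sketch below checks their log-convexity for $n \geq p^2$ and small $p$ in exact rational arithmetic (the ranges tested are our choice).

```python
from fractions import Fraction
from math import factorial

def u(n, m, p):
    # u_n^{(m)} = n(n-1)...(n-m+1) / n^p, computed exactly
    return Fraction(factorial(n), factorial(n - m) * n ** p)

# Log-convexity b_n^2 <= b_{n-1} b_{n+1} should hold from n = p^2 on.
for p in range(3, 8):
    for m in range(1, p):
        for n in range(p * p, p * p + 40):
            assert u(n, m, p) ** 2 <= u(n - 1, m, p) * u(n + 1, m, p)
```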

Final remarks
Remark 8. Using majorization type arguments (see, e.g., [9]), Conjecture 1 can be verified in a rather standard but lengthy way for every $p > 1$ and $n = 2$. The idea is to establish a pointwise inequality: we conjecture that for nonnegative numbers $x_1, \ldots, x_{2n}$ and a convex function $\varphi\colon [0,\infty) \to [0,\infty)$ we have
$$\frac{1}{\binom{2n}{n}} \sum_{|I| = n} \varphi\!\left(\frac{x_I}{n}\right)\varphi\!\left(\frac{x_{I^c}}{n}\right) \leq \frac{1}{\binom{2n}{n-1}} \sum_{|I| = n-1} \varphi\!\left(\frac{x_I}{n-1}\right)\varphi\!\left(\frac{x_{I^c}}{n+1}\right),$$
where for a subset $I$ of the set $\{1, \ldots, 2n\}$ we denote $x_I = \sum_{i \in I} x_i$ and $I^c = \{1,\ldots,2n\} \setminus I$. We checked that this holds for $n = 2$. Taking the expectation on both sides for $\varphi(x) = x^p$ gives the desired result that $b_n^2 \leq b_{n-1}b_{n+1}$ (since $I$ and $I^c$ are disjoint, each term on the left has expectation $b_n^2$ and each term on the right has expectation $b_{n-1}b_{n+1}$).

Remark 9. It is tempting to ask for generalisations of Conjecture 1 beyond the power functions, say to ask whether the sequence $(a_n)$ defined in (1) is log-convex for every convex function $f$. This is false, as can be seen by taking the function $f$ of the form $f(x) = \max\{x - a, 0\}$ and the $X_i$ to be i.i.d. Bernoulli random variables.
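The counterexample of Remark 9 can be made concrete. Taking $a = 1/2$ and Bernoulli parameter $\theta = 1/2$ (our choice; the remark does not fix parameters), exact computation gives $a_2 = 1/8$, $a_3 = 1/8$, $a_4 = 3/32$, so $a_3^2 = 1/64 > 3/256 = a_2 a_4$ and log-convexity fails at $n = 3$:

```python
from fractions import Fraction
from math import comb

def a(n, theta, shift):
    # a_n = E f(S_n/n) with f(x) = max(x - shift, 0) and
    # X_i i.i.d. Bernoulli(theta); S_n is binomial, so sum over its atoms.
    return sum(comb(n, k) * theta**k * (1 - theta)**(n - k)
               * max(Fraction(k, n) - shift, 0)
               for k in range(n + 1))

theta, shift = Fraction(1, 2), Fraction(1, 2)
a2, a3, a4 = (a(n, theta, shift) for n in (2, 3, 4))

# log-convexity fails: a_3^2 = 1/64 > 3/256 = a_2 a_4
assert a3 ** 2 > a2 * a4
```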