Let $F(\mathbf{z})=\sum_{\mathbf{r}} a_{\mathbf{r}}\mathbf{z}^{\mathbf{r}}$ be a multivariate generating function that is meromorphic in some neighbourhood of the origin of $\mathbb{C}^d$, and let $\mathcal{V}$ be its set of singularities. Effective asymptotic expansions for the coefficients can be obtained by complex contour integration near points of $\mathcal{V}$.
In the first article in this series, we treated the case of smooth points of $\mathcal{V}$. In this article we deal with multiple points of $\mathcal{V}$. Our results show that the central limit (Ornstein–Zernike) behaviour typical of the smooth case does not hold in the multiple point case. For example, when $\mathcal{V}$ has a multiple point singularity at $(1, \ldots, 1)$, rather than $a_{\mathbf{r}}$ decaying as $|\mathbf{r}|^{-1/2}$ as $|\mathbf{r}| \to \infty$, $a_{\mathbf{r}}$ is very nearly polynomial in a cone of directions.
This special issue is devoted to the Analysis of Algorithms (AofA). Most of the papers are from the Eighth Seminar on Analysis of Algorithms, held in Strobl, Austria, June 23–29, 2002.
Heap ordered trees are planted plane trees, labelled in such a way that the labels always increase from the root to a leaf. We study two parameters, assuming that $p$ of the $n$ nodes are selected at random: the size of the ancestor tree of these nodes and the smallest subtree generated by these nodes. We compute expectation, variance, and also the Gaussian limit distribution, the latter as an application of Hwang's quasi-power theorem.
An additive decomposition of a set $I$ of nonnegative integers is an expression of $I$ as the arithmetic sum of two other such sets. If the smaller of these has $p$ elements, we have a $p$-decomposition. If $I$ is obtained by randomly removing $n^{\alpha}$ integers from $\{0,\dots,n-1\}$, decomposability translates into a balls-and-urns problem, which we start to investigate (for large $n$) by first showing that the number of $p$-decompositions exhibits a threshold phenomenon as $\alpha$ crosses a $p$-dependent critical value. We then study in detail the distribution of the number of 2-decompositions. For this last case we show that the threshold is sharp and we establish the threshold function.
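To make the definition concrete, here is a minimal brute-force check of $p$-decomposability (an illustration only, not the paper's method; the function name and the normalization $\min A = 0$ are our own conventions):

```python
from itertools import combinations

def is_p_decomposable(i, p):
    """Check whether I = A + B for some A with |A| = p and |B| >= p.

    Brute force over candidate sets A normalized so that min(A) = 0
    (every decomposition can be shifted into this form). For a fixed A,
    the maximal candidate B consists of all shifts y keeping every
    translate of A inside I; if any B works, the maximal one does.
    """
    i = set(i)
    m = max(i)
    for rest in combinations(range(1, m + 1), p - 1):
        a = {0, *rest}
        # maximal candidate B for this A
        b = {y for y in range(m + 1) if all(x + y in i for x in a)}
        # A + B is a subset of I by construction; equality means success
        if len(b) >= p and {x + y for x in a for y in b} == i:
            return True
    return False
```

For example, {0, 1, 2, 3} = {0, 1} + {0, 2} is 2-decomposable, while {0, 1, 3} is not.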
We consider Boolean functions over $n$ variables. Any such function can be represented (and computed) by a complete binary tree with ‘and’ or ‘or’ in the internal nodes and a literal in the external nodes. Since many different trees can represent the same function, a fundamental question concerns the so-called complexity of a Boolean function: $L(f) :=$ the minimal size of a tree computing $f$.
The existence of a limiting probability distribution $P(\cdot)$ on the set of and/or trees was shown by Lefmann and Savický [8]. We give here an alternative proof, which leads to effective computation in simple cases. We also consider the relationship between the probability $P(f)$ and the complexity $L(f)$ of a Boolean function $f$. A detailed analysis of the functions enumerating some sub-families of trees, and of their radius of convergence, allows us to improve on the upper bound of $P(f)$, established by Lefmann and Savický.
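As an illustration of the objects involved (our own encoding, not taken from the paper), an and/or tree can be represented by nested tuples and evaluated recursively; the size measure behind $L(f)$ counts the literals:

```python
def eval_tree(tree, assignment):
    """Evaluate an and/or tree. Leaves are literals such as 'x1' or '~x2';
    internal nodes are ('and', left, right) or ('or', left, right)."""
    if isinstance(tree, str):
        neg = tree.startswith('~')
        val = assignment[tree.lstrip('~')]
        return val != neg  # XOR with the negation flag
    op, left, right = tree
    l = eval_tree(left, assignment)
    r = eval_tree(right, assignment)
    return (l and r) if op == 'and' else (l or r)

def tree_size(tree):
    """Number of literals (external nodes) of the tree."""
    return 1 if isinstance(tree, str) else tree_size(tree[1]) + tree_size(tree[2])
```

For instance, the tree ('or', ('and', 'x1', 'x2'), '~x1') has size 3 and computes $(x_1 \wedge x_2) \vee \neg x_1$.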
We show that, for a certain class of probabilistic models, the number of internal nodes $S_n$ of a trie built from $n$ independent and identically distributed keys is concentrated around its mean, in the sense that $\operatorname{Var} S_n = O(\mathbb{E} S_n)$. Keys are sequences of symbols which may be taken from varying alphabets, and the choice of the alphabet from which the $k$th symbol is taken, as well as its distribution, may depend on all the preceding symbols. In the construction of the trie we also allow for bucket sizes greater than 1. The property that characterizes our models is the following: there is a constant $C$ such that, for any word $v$ that may occur as a prefix of a key, the size $S^v_n$ of a trie built from the suffixes of $n$ independent keys conditioned to have the common prefix $v$ satisfies $\mathbb{E} S^v_n \leq Cn$. This class of models contains memoryless and Markovian source models as well as the probabilistic dynamical source models that were recently introduced and thoroughly developed by Vallée [Algorithmica 29 (2001) 262–306], in particular the continued fraction source. Furthermore we study the external path length $L_n$, which obeys $\mathbb{E} L_n = O(n\ln n)$ and $\operatorname{Var} L_n = O(n\ln^2 n)$.
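A minimal sketch of the two quantities $S_n$ and $L_n$ on a toy trie (our own construction, assuming prefix-free string keys; the source models treated in the paper are far more general):

```python
def trie_stats(keys, bucket_size=1):
    """Build a trie from distinct, prefix-free string keys and return
    (number of internal nodes, external path length).

    A node holding at most bucket_size keys is an external bucket;
    otherwise the keys are split by their first symbol and the
    construction recurses on the suffixes.
    """
    def build(suffixes, depth):
        if len(suffixes) <= bucket_size:
            # external bucket: each key contributes its depth to L_n
            return 0, depth * len(suffixes)
        internal, path = 1, 0
        groups = {}
        for s in suffixes:
            groups.setdefault(s[0], []).append(s[1:])
        for rest in groups.values():
            i, p = build(rest, depth + 1)
            internal += i
            path += p
        return internal, path
    return build(list(keys), 0)
```

For the keys "aa", "ab", "b" with bucket size 1, the trie has 2 internal nodes and external path length 2 + 2 + 1 = 5.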
In this paper, we present several probabilistic transforms related to classical urn models. These transforms render the dependent random variables describing the urn occupancies into independent random variables with appropriate distributions. This simplifies the analysis of a large number of problems for which a function under investigation depends on the urn occupancies. The approach used for constructing the transforms involves generating functions of combinatorial numbers characterizing the urn distributions. We also show, by using Tauberian theorems derived in this paper, that under certain simple conditions the asymptotic expressions of target functions in the transform domain and in the inverse-transform domain are identical. Therefore, asymptotic information about certain statistics can be obtained without evaluating the inverse transform.
In this paper, we investigate the limit law of the inertial moment of Dyck paths with respect to the $x$-axis, that is, the sum of the squares of the altitudes. We find its Laplace transform using Louchard's methodology, rediscovering a result which was in fact well known by probabilists. We give recurrence relations which enable us to compute the moments of the joint limit law of the area and the inertial moments of both Dyck paths and Grand Dyck paths (bilateral Dyck paths).
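For concreteness, the area and inertial moment of a single Dyck path can be computed directly (one common convention for the altitude sequence; the paper's normalization may differ by lower-order terms):

```python
def area_and_inertia(steps):
    """Given a Dyck path as a string of 'u' (up) and 'd' (down) steps,
    return (sum of altitudes, sum of squared altitudes), the altitudes
    being those of the lattice points visited after each step."""
    h = area = inertia = 0
    for s in steps:
        h += 1 if s == 'u' else -1
        assert h >= 0, "path dips below the x-axis: not a Dyck path"
        area += h
        inertia += h * h
    assert h == 0, "path must return to the x-axis"
    return area, inertia
```

For example, the path "uudd" visits altitudes 1, 2, 1, 0, giving area 4 and inertial moment 1 + 4 + 1 = 6.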
We give an algorithm that, with high probability, recovers a planted $k$-partition in a random graph, where edges within vertex classes occur with probability $p$ and edges between vertex classes occur with probability $r\ge p+c\sqrt{p\log n/n}$. The algorithm can handle vertex classes of different sizes and, for fixed $k$, runs in linear time. We also give variants of the algorithm for partitioning matrices and hypergraphs.
This article proposes a surprisingly simple framework for the random generation of combinatorial configurations based on what we call Boltzmann models. The idea is to perform random generation of possibly complex structured objects by placing an appropriate measure spread over the whole of a combinatorial class – an object receives a probability essentially proportional to an exponential of its size. As demonstrated here, the resulting algorithms based on real-arithmetic operations often operate in linear time. They can be implemented easily, be analysed mathematically with great precision, and, when suitably tuned, tend to be very efficient in practice.
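A minimal sketch of the idea for binary trees counted by leaves, where $B(x) = x + B(x)^2$: drawing a leaf with probability $x/B(x)$ and otherwise recursing on both subtrees gives each tree with $n$ leaves probability $x^n/B(x)$. This is our illustrative choice of combinatorial class and parameter, not the article's tuned samplers:

```python
import random

def boltzmann_binary_tree(x=0.2):
    """Boltzmann sampler for binary trees counted by leaves.

    B = Z + B*B gives B(x) = (1 - sqrt(1 - 4x))/2 for 0 < x <= 1/4.
    For x < 1/4 the recursion is subcritical and terminates quickly;
    tuning x toward 1/4 increases the expected size.
    """
    bx = (1 - (1 - 4 * x) ** 0.5) / 2
    if random.random() < x / bx:
        return "leaf"
    return (boltzmann_binary_tree(x), boltzmann_binary_tree(x))

def num_leaves(t):
    """Size (number of leaves) of a sampled tree."""
    return 1 if t == "leaf" else num_leaves(t[0]) + num_leaves(t[1])
```

Note that the only operations are real-arithmetic comparisons and coin flips, which is the source of the linear-time behaviour discussed in the article.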
The Lehmer–Euclid Algorithm is an improvement of the Euclid Algorithm when applied to large integers. The original Lehmer–Euclid Algorithm replaces divisions on multi-precision integers by divisions on single-precision integers. Here we study a slightly different algorithm that replaces computations on $n$-bit integers by computations on $\mu n$-bit integers. This algorithm depends on the truncation degree $\mu\in ]0, 1[$ and is denoted as the ${\mathcal{LE}}_\mu$ algorithm. The original Lehmer–Euclid Algorithm can be viewed as the limit of the ${\mathcal{LE}}_\mu$ algorithms for $\mu \to 0$. We provide here a precise analysis of the ${\mathcal{LE}}_\mu$ algorithm. For this purpose, we are led to study what we call the Interrupted Euclid Algorithm. This algorithm depends on some parameter $\alpha \in [0, 1]$ and is denoted by ${\mathcal E}_{\alpha}$. When running with an input $(a, b)$, it performs the same steps as the usual Euclid Algorithm, but it stops as soon as the current integer is smaller than $a^\alpha$, so that ${\mathcal E}_{0}$ is the classical Euclid Algorithm. We obtain a very precise analysis of the algorithm ${\mathcal E}_{\alpha}$, and describe the behaviour of the main parameters (number of iterations, bit complexity) as a function of the parameter $\alpha$. Since the Lehmer–Euclid Algorithm ${\mathcal {LE}}_\mu$ when running on $n$-bit integers can be viewed as a sequence of executions of the Interrupted Euclid Algorithm ${\mathcal E}_{1/2}$ on $\mu n$-bit integers, we then come back to the analysis of the ${\mathcal {LE}}_\mu$ algorithm and obtain our results.
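The Interrupted Euclid Algorithm ${\mathcal E}_{\alpha}$ is simple to state in code (a sketch of the stopping rule described above; precision issues of the actual Lehmer–Euclid implementation are ignored):

```python
def interrupted_euclid(a, b, alpha):
    """Interrupted Euclid algorithm E_alpha.

    Perform the usual remainder steps on (a, b) with a >= b > 0, but
    stop as soon as the current integer drops below a**alpha (measured
    against the original input a). Returns the final pair and the
    number of iterations. alpha = 0 recovers the classical algorithm.
    """
    assert a >= b > 0
    bound = a ** alpha
    steps = 0
    while b > 0 and b >= bound:
        a, b = b, a % b
        steps += 1
    return (a, b), steps
```

On input (48, 18), $\alpha = 0$ runs to completion in 3 steps, ending at (6, 0), while $\alpha = 1/2$ stops after 2 steps at (12, 6), since $6 < \sqrt{48}$.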
In this paper, we show for generalized $M$-ary search trees that the Steiner distance of $p$ randomly chosen nodes in random search trees is asymptotically normally distributed. The special case $p=2$ shows, in particular, that the distribution of the distance between two randomly chosen nodes is asymptotically Gaussian. In the presented generating functions approach, we consider first the size of the ancestor-tree of $p$ randomly chosen nodes. From the obtained Gaussian limiting distribution for this parameter, we deduce the result for the Steiner distance. Since the size of the ancestor-tree is essentially the same as the number of passes in the (generalized) Multiple Quickselect algorithm, the limiting distribution result also holds for this parameter.
This chapter covers quite a number of topics, because finite translative arrangements have been investigated rather intensively. One further topic, parametric density, is the subject of Chapter 10.
Given a convex body K in ℝᵈ, Section 9.1 introduces the associated Minkowski space, namely, the normed space induced by K₀ = (K − K)/2. The main body of the chapter starts with density-type problems. We characterize K when some translates of K tile a certain convex body (see Theorem 9.2.1). Then, appealing to methods in Chapter 7, we investigate the asymptotic behavior of optimal packings and coverings of a large number of translates of K in Sections 9.3 and 9.4. We also describe the classical economic periodic packings and coverings by translates of K (see Theorem 9.5.2).
The next topic is the so-called Hadwiger number H(K): the maximal number of nonoverlapping translates of K that touch K. Theorem 9.6.1 says that λᵈ < H(K) ≤ 3ᵈ − 1, where λ > 1 is an absolute constant. The lower bound H(K) ≥ d² + d is verified as well (see Theorem 9.7.1), which is optimal if d = 2, 3. For positive α, Sections 9.8 to 9.10 discuss a natural generalization of the Hadwiger number, that is, the maximal number H_α(K) of nonoverlapping translates of αK touching K. We note that H_∞(K) is related to antipodal sets and equilateral sets (see Section 9.11).
Given n and a convex body K, we consider arrangements of n congruent copies, namely, packings inside convex containers and coverings of compact convex sets. Concerning density, clusters are naturally related to periodic arrangements (see Theorem 7.1.1). In addition, we verify that an asymptotic density exists as n tends to infinity (see Theorem 7.2.2), and we characterize the case when this asymptotic density is one (see Lemma 7.2.3). Example 7.2.4 shows that the asymptotic structure of the optimal arrangement depends very much on K.
Concerning the mean i-dimensional projection for i = 1, …, d − 1, we verify that the convex hull of n nonoverlapping congruent copies of K that minimizes the mean projection is close to being a ball (see Theorem 7.2.2). In contrast, the compact convex set of maximal mean width covered by n congruent copies of K is close to being a segment (see Theorem 7.4.1).
Periodic and Finite Arrangements
Let K be a convex body in ℝᵈ. Given a periodic arrangement of congruent copies of K with respect to the lattice Λ (see Section A.13), if the number of equivalence classes is m, then the density of the arrangement is m · V(K)/det Λ. We define the packing density δ(K) to be the supremum of the densities of periodic packings by congruent copies of K, and we let Δ(K) = V(K)/δ(K).
Let K be a convex domain. According to the classical result of L. Fejes Tóth [FTL1950], the density of a packing of congruent copies of K in a hexagon cannot exceed the density of K inside the circumscribed hexagon of minimal area. Beyond this statement, we verify that the same density estimate holds for any convex container provided the number of copies is high enough. In addition, we show that if K is a centrally symmetric domain then the inradius and circumradius of the optimal convex container cannot be too different. Following L. Fejes Tóth [FTL1950] in the case of coverings, the analogous density estimate is verified under the “noncrossing” assumption, which essentially says that the boundaries of any two congruent copies intersect in at most two points. In the case of both packings and coverings, congruent copies can be replaced by similar copies of not too different sizes. Finally, we verify the hexagon bound for coverings by congruent fat ellipses even without the noncrossing assumption, a result due to A. Heppes.
Concerning the perimeter, we show that the convex domain of minimal perimeter containing n nonoverlapping congruent copies of K gets arbitrarily close to being a circular disc for large n. However, if the perimeter of the compact convex set D covered by n congruent copies of K is maximal then D is close to being a segment for large n.