The aim of this paper is to define the entropy of a finite semi-Markov process. We define the entropy of the finite distributions of the process and obtain its entropy rate explicitly by extending the Shannon–McMillan–Breiman theorem to this class of nonstationary continuous-time processes. The particular cases of pure jump Markov processes and renewal processes are considered. The relative entropy rate between two semi-Markov processes is also defined.
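For orientation, a minimal sketch of the kind of limits involved, assuming a generic continuous-time process whose trajectory law on [0, T] admits a density f_T with respect to a fixed reference measure (an illustrative form, not the paper's precise construction):

$$
H(\mathbb{P}) \;=\; \lim_{T\to\infty} -\frac{1}{T}\,\log f_T\bigl(X_{[0,T]}\bigr) \quad \text{a.s.},
\qquad
H(\mathbb{P}\,\|\,\tilde{\mathbb{P}}) \;=\; \lim_{T\to\infty} \frac{1}{T}\,\mathbb{E}_{\mathbb{P}}\!\left[\log \frac{d\mathbb{P}_{[0,T]}}{d\tilde{\mathbb{P}}_{[0,T]}}\right],
$$

where the first limit is a Shannon–McMillan–Breiman-type entropy rate and the second is the relative entropy rate between the laws $\mathbb{P}$ and $\tilde{\mathbb{P}}$ of two semi-Markov processes.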
We present a stochastic algorithm which generates optimal probabilities for the chaos game to decompress an image represented by the fixed point of an IFS operator. The algorithm can be viewed as a time-inhomogeneous regenerative process. We prove that optimal probabilities exist and, by martingale methods, that the algorithm converges almost surely. The method holds for IFS operators associated with an arbitrary number of possibly overlapping affine contraction maps on the pixel space.
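As a concrete, simplified illustration of the chaos game itself, the following Python sketch iterates a three-map IFS whose attractor is the Sierpinski triangle; the maps and the uniform probabilities are hypothetical choices for illustration, not the optimal probabilities constructed in the paper.

```python
import random

# Illustrative chaos-game decoder for an IFS attractor (Sierpinski triangle).
# The three affine maps and the uniform probabilities below are hypothetical,
# not the optimal probabilities of the paper's algorithm.
MAPS = [
    lambda x, y: (0.5 * x,        0.5 * y),
    lambda x, y: (0.5 * x + 0.5,  0.5 * y),
    lambda x, y: (0.5 * x + 0.25, 0.5 * y + 0.5),
]
PROBS = [1 / 3, 1 / 3, 1 / 3]

def chaos_game(n_points=100_000, burn_in=100):
    """Generate points approximating the attractor (the fixed point of the IFS operator)."""
    x, y = random.random(), random.random()
    points = []
    for i in range(n_points + burn_in):
        f = random.choices(MAPS, weights=PROBS, k=1)[0]  # pick a map at random
        x, y = f(x, y)
        if i >= burn_in:  # discard the transient before the orbit settles near the attractor
            points.append((x, y))
    return points

if __name__ == "__main__":
    pts = chaos_game(10_000)
    print(f"generated {len(pts)} points on the attractor")
```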
We present an entropy conservation principle, applicable to either discrete or continuous variables, which provides a useful tool for aggregating observations. The associated method of modality grouping transforms a variable Z1 into a new variable Z2 such that the mutual information I(Z2, Y) between Z2 and a variable of interest Y is equal to I(Z1, Y).
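A minimal numerical sketch of the grouping idea: under the (sufficient, though simplified) assumption that modalities sharing the same conditional law of Y may be merged without loss of mutual information, the hypothetical joint distribution below gives I(Z2, Y) = I(Z1, Y) after merging.

```python
import math
from collections import defaultdict

def mutual_information(joint):
    """I(Z, Y) in nats from a dict {(z, y): probability}."""
    pz, py = defaultdict(float), defaultdict(float)
    for (z, y), p in joint.items():
        pz[z] += p
        py[y] += p
    return sum(p * math.log(p / (pz[z] * py[y]))
               for (z, y), p in joint.items() if p > 0)

# Hypothetical joint law of (Z1, Y); modalities 1 and 2 of Z1 share the same
# conditional law of Y, so merging them should not change the mutual information.
joint_z1 = {(0, 0): 0.20, (0, 1): 0.20,
            (1, 0): 0.10, (1, 1): 0.30,
            (2, 0): 0.05, (2, 1): 0.15}

# Group modalities: Z2 = 0 if Z1 == 0, else Z2 = 1.
joint_z2 = defaultdict(float)
for (z, y), p in joint_z1.items():
    joint_z2[(0 if z == 0 else 1, y)] += p

print(mutual_information(joint_z1))        # I(Z1, Y)
print(mutual_information(dict(joint_z2)))  # I(Z2, Y) -- equal, up to rounding
```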
We prove a monotonicity property for a function of general pairs of square-integrable martingales which is useful in fractal-based algorithms for the compression of image data.
Intersymbol and cochannel interference in a communications channel can often be modelled as the sum of an infinite series of random variables with weights which decay fairly rapidly. Frequently, this yields a random variable which is singular but non-atomic. The Hausdorff dimension of the distribution is estimated and methods for calculating expectations are studied. A connection is observed between the dimension and the complexity of the calculation of an expectation.
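As a rough illustration of this model (with a hypothetical decay rate and symbol alphabet, not the paper's channel), the sketch below truncates the series X = sum_k lambda^k * eps_k with i.i.d. ±1 symbols and estimates an expectation by plain Monte Carlo; for lambda < 1/2 the limiting law is singular and non-atomic, as described above.

```python
import random

# Illustrative interference model: X = sum_k lambda^k * eps_k with i.i.d. +/-1
# symbols eps_k.  The decay rate LAMBDA = 0.4 and the truncation depth are
# hypothetical choices; for LAMBDA < 1/2 the limiting distribution is a
# singular, non-atomic Bernoulli convolution.
LAMBDA = 0.4
DEPTH = 40  # LAMBDA**40 is negligible, so the truncation error is tiny

def sample_interference():
    """Draw one (truncated) sample of the weighted interference series."""
    return sum((LAMBDA ** k) * random.choice((-1.0, 1.0)) for k in range(DEPTH))

def monte_carlo_expectation(g, n=100_000):
    """Crude Monte Carlo estimate of E[g(X)]; the paper studies more refined methods."""
    return sum(g(sample_interference()) for _ in range(n)) / n

if __name__ == "__main__":
    est = monte_carlo_expectation(lambda x: x * x)
    exact = 1.0 / (1.0 - LAMBDA ** 2)  # E[X^2] = sum_k LAMBDA**(2k) for unit-variance symbols
    print(f"Monte Carlo E[X^2] ~ {est:.4f}  (exact {exact:.4f})")
```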