In this chapter, we address the question of how the channel knowledge needed for various CoMP schemes, referring to both desired channels and the channels towards interferers, can be made available where it is needed. We first investigate channel estimation techniques at the receiver side in Section 9.1, and then discuss in Section 9.2 how the obtained channel knowledge can be efficiently fed back to the transmitter side, which is, for example, a crucial requirement for the downlink CoMP schemes investigated in Sections 6.3 and 6.4. The chapter shows that standard channel estimation and feedback concepts can in principle be extended to enable CoMP. However, it also becomes apparent that large CoMP cooperation sizes may be questionable in practice, because weak links cannot be estimated accurately and the involved pilot and channel state information (CSI) feedback overhead may become prohibitive.
Channel Estimation for CoMP
One of the main challenges for CoMP schemes like joint transmission (JT) is to obtain accurate channel information in a multi-cell mobile radio environment with acceptable overhead for pilot signals.
The section is structured as follows. In Subsection 9.1.1, the main characteristics of the mobile radio channel and state-of-the-art estimation and interpolation techniques such as Wiener filtering are introduced, with a special focus on channel prediction. For CoMP, the analysis then has to be extended to multiple channel components and to multi-cell scenarios, which is done in Subsections 9.1.2 and 9.1.3, respectively.
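As a minimal sketch of the kind of Wiener filtering used for pilot-based channel estimation, the following example interpolates a channel across OFDM subcarriers from noisy pilot observations. All concrete values here (pilot spacing, the exponential correlation model, the SNR) are illustrative assumptions, not parameters from this chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64                        # number of subcarriers (illustrative)
pilots = np.arange(0, N, 4)   # assumed comb-type pilot pattern
noise_var = 10 ** (-20 / 10)  # assumed 20 dB pilot SNR

# Assumed exponential correlation across subcarriers, standing in for
# the true channel power delay profile.
idx = np.arange(N)
R = 0.95 ** np.abs(idx[:, None] - idx[None, :])

# Draw one correlated Rayleigh channel realization h ~ CN(0, R).
L = np.linalg.cholesky(R + 1e-10 * np.eye(N))
h = L @ (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

# Noisy least-squares estimates are available at the pilot positions only.
noise = rng.standard_normal(pilots.size) + 1j * rng.standard_normal(pilots.size)
y_p = h[pilots] + np.sqrt(noise_var / 2) * noise

# Wiener (LMMSE) interpolation to all subcarriers:
#   h_hat = R_hp (R_pp + sigma^2 I)^{-1} y_p
R_hp = R[:, pilots]
R_pp = R[np.ix_(pilots, pilots)]
h_hat = R_hp @ np.linalg.solve(R_pp + noise_var * np.eye(pilots.size), y_p)

mse = np.mean(np.abs(h - h_hat) ** 2)
```

The same structure carries over to interpolation and prediction in time, with the correlation model replaced by, e.g., a Jakes Doppler model.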
Information theory was created by Claude E. Shannon for the study of certain quantitative aspects of information, primarily as an analysis of the impact of coding on information transmission. Research in this field has resulted in several mathematical theories. Our subject is the stochastic theory, often referred to as the Shannon theory, which directly descends from Shannon's pioneering work.
This book is intended for graduate students and research workers in mathematics (probability and statistics), electrical engineering and computer science. It aims to present a well-integrated mathematical discipline, including substantial new developments of the 1970s. Although applications in engineering and science are not covered, we hope to have presented the subject so that a sound basis for applications has also been provided. A heuristic discussion of mathematical models of communication systems is given in the Introduction, which also offers a general outline of the intuitive background for the mathematical problems treated in the book.
As the title indicates, this book deals with discrete memoryless systems. In other words, our mathematical models involve independent random variables with finite range. Idealized as these models are from the point of view of most applications, their study reveals the characteristic phenomena of information theory without burdening the reader with the technicalities needed in the more complex cases. In fact, the reader needs no other prerequisites than elementary probability and a reasonable mathematical maturity. By limiting our scope to the discrete memoryless case, it was possible to use a unified, basically combinatorial approach.
The results of Chapter 15 enable us to solve a number of coding problems for various source and channel networks. Most of the resulting coding theorems are presented as problems which can be solved more or less in the same way. As an illustration of the methods, we shall discuss in detail a channel network and a (normal) source network. In addition, we shall consider a source network with a more general fidelity criterion than probability of error.
Channel networks with a single intermediate vertex are called broadcast channels. The simplest case of a network with two outputs has been studied intensively. Without loss of generality, one can suppose that this network has three inputs. This two-output broadcast channel (BC) is illustrated in Fig. 16.1.
At present, a computable characterization of the capacity region of the two-output broadcast channel is available only in special cases. A model of independent interest is obtained if “either of inputs 1 and 2 of the network is idle.” This corresponds to the new channel network in Fig. 16.2, the asymmetric two-output broadcast channel (ABC), which is treated as our next problem.
In the following, the DMCs corresponding to the two outputs will be denoted by {V : X → Y} resp. {W : X → Z}.
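For orientation (this is not part of the excerpt above): the capacity region of this asymmetric broadcast channel admits a well-known single-letter characterization, due to Körner and Marton (1977). Writing R0 for the rate of the message decoded by both outputs and R1 for that of the private message to the output of V, the region is the closure of the set of pairs (R0, R1) satisfying, for some auxiliary random variable U with U → X → (Y, Z),

```latex
% Capacity region of the asymmetric broadcast channel
% (broadcast channel with degraded message sets), Körner-Marton 1977.
R_0 \le I(U;Z), \qquad
R_0 + R_1 \le I(U;Z) + I(X;Y \mid U), \qquad
R_0 + R_1 \le I(X;Y)
```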
Contemporary techniques of data security are primarily based on computational complexity, typically on the infeasibility of inverting certain functions using currently available mathematical techniques and computing power. Among the various protocols based on such ideas, those approved by the cryptography community appear secure enough, but mathematical and technological progress may render them insecure in the future. Indeed, past experience suggests that this is likely to happen.
Information-theoretic secrecy offers provable security even against an adversary with unlimited computing power. This chapter provides a glimpse into the substantial progress that has been made towards clarifying the theoretical possibilities in this direction. Practical applications can reasonably be expected to arrive within a much shorter time than it took capacity-achieving coding techniques to follow Shannon's discovery of the noisy channel coding theorem.
Two kinds of problems will be addressed: secure transmission over insecure channels and secret key generation taking advantage of public communication. After introducing necessary concepts and tools in Section 17.1, these problems will be treated in Sections 17.2 and 17.3. Let us emphasize that the mathematical models and techniques will be similar to those in previous chapters. These models, however, are now studied from a non-cooperative aspect: a major goal is to keep (at least) one party ignorant of (at least part of) the information exchanged.
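The classical toy instance of secure transmission with information-theoretic (rather than computational) security is Shannon's one-time pad; the sketch below, which is an illustration and not a construction from this chapter, shows both its use and why it is perfectly secret.

```python
import secrets

def one_time_pad(data: bytes, key: bytes) -> bytes:
    """XOR data with a key of equal length; used for both directions."""
    assert len(key) == len(data), "key must be as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))   # fresh uniform key, used once
ct = one_time_pad(msg, key)
pt = one_time_pad(ct, key)            # XOR is its own inverse

# Perfect secrecy in miniature: for either value of a message bit m,
# averaging over a uniform key bit makes the ciphertext bit uniform,
# so the ciphertext distribution is independent of the message.
cipher_values = {m: sorted(m ^ k for k in (0, 1)) for m in (0, 1)}
```

The price, of course, is a secret key as long as the message; generating such keys from public communication is exactly the second problem treated below.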
Information is a fashionable concept with many facets, among which the quantitative one, our subject, is perhaps less striking than fundamental. At the intuitive level, for our purposes, it suffices to say that information is some knowledge of predetermined type contained in certain data or pattern and wanted at some destination. Actually, this concept will not explicitly enter the mathematical theory. However, throughout the book certain functionals of random variables will be conveniently interpreted as measures of the amount of information provided by the phenomena modeled by these variables. Such information measures are characteristic tools of the analysis of optimal performance of codes, and they have turned out to be useful in other branches of mathematics as well.
Intuitive background
The mathematical discipline of information theory, created by C. E. Shannon (1948) on an engineering background, still has a special relation to communication engineering, the latter being its major field of application and the source of its problems and motivation. We believe that some familiarity with the intuitive communication background is necessary for a more than formal understanding of the theory, let alone for doing further research. The heuristics, underlying most of the material in this book, can be best explained on Shannon's idealized model of a communication system (which can also be regarded as a model of an information storage system). The important question of how far the models treated are related to, and the results obtained are relevant for, real systems will not be addressed.
In Chapter 13 we formulated a fairly general model of noiseless communication networks. The absence of noise means that the coders located at the vertices of the network have direct access to the results of coding operations performed at immediately preceding vertices. By dropping this assumption, we now extend the model to cover communication in a noisy environment. We shall suppose that codewords produced at certain vertices are components of a vector input of a noisy channel, and it is the corresponding channel output that can be observed at some other vertex of the network.
The mathematical problems solved in this chapter will relate to the noisy version of the simplest multi-terminal network, the fork. In order to avoid clumsy notation, we give the formal definitions only for the case of two inputs.
Given finite sets X, Y, Z, consider channels with input set X × Y and output set Z. A multiple-access code (MA code) for such channels is a triple of mappings f : M1 → X, g : M2 → Y, φ : Z → M1 × M2, where M1 and M2 are arbitrary finite sets. The mappings f and g are called encoders, with message sets M1 resp. M2, while φ is the decoder. An MA code is also a code in the usual sense, with encoder (f, g) : M1 × M2 → X × Y and decoder φ.
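A standard toy instance of such a two-input channel, not one defined in the excerpt above, is the binary adder multiple-access channel Z = X + Y with X, Y ∈ {0, 1}. The following computation sketches its single-letter rate bounds for independent uniform inputs.

```python
from collections import Counter
from math import log2

def entropy(dist):
    """Shannon entropy in bits of a {value: probability} mapping."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Output distribution of Z = X + Y for independent uniform binary inputs.
p_z = Counter()
for x in (0, 1):
    for y in (0, 1):
        p_z[x + y] += 0.25

# Z is a deterministic function of (X, Y), so H(Z | X, Y) = 0 and the
# sum-rate bound is I(X, Y; Z) = H(Z) = 1.5 bits.
sum_rate = entropy(p_z)

# Given Y, observing Z reveals X completely, so I(X; Z | Y) = H(X) = 1 bit
# (symmetrically for Y): each encoder alone can reach 1 bit per channel
# use, but the two together are limited to 1.5 bits per use.
```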