Book contents
- Frontmatter
- Dedication
- Contents
- Part I Overview of Big Data Applications
- Part II Methodology and Mathematical Background
- Part III Big Data Applications
- 8 Compressive Sensing-Based Big Data Analysis
- 9 Distributed Large-Scale Optimization
- 10 Optimization of Finite Sums
- 11 Big Data Optimization for Communication Networks
- 12 Big Data Optimization for Smart Grid Systems
- 13 Processing Large Data Sets in MapReduce
- 14 Massive Data Collection Using Wireless Sensor Networks
- Bibliography
- Index
8 - Compressive Sensing-Based Big Data Analysis
from Part III - Big Data Applications
Published online by Cambridge University Press: 18 May 2017
Summary
Despite the relatively short history of compressive sensing (CS) theory, pioneered by the work of Candès, Romberg, and Tao [393, 394, 395] and Donoho [396], the number of studies and publications in this area has grown remarkably large. Applications of CS, on the other hand, are only beginning to appear. The inherent property that many signals can be represented by sparse vectors has been recognized in many application areas. Examples in wireless communication include the sparse channel impulse response in the time domain, the sparse utilization of the spectrum, and temporal and spatial sparsity in wireless sensor networks. For each of these sparse signals, there are innovative signal-acquisition schemes that not only satisfy the requirements of CS theory but are also easily realizable in hardware. Efficient signal-recovery algorithms for each system are also available; they guarantee stable signal recovery with high probability.
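To make the recovery step concrete, the following is a minimal sketch (not the book's own algorithm) of recovering a sparse vector from a few random measurements using orthogonal matching pursuit, a standard greedy CS recovery method. The dimensions `n`, `m`, `k` and the Gaussian measurement matrix are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: n-dimensional signal with k nonzeros, m << n measurements.
n, m, k = 256, 80, 5

# A k-sparse signal x and a random Gaussian measurement matrix A.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x  # compressive measurements

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then re-fit by least squares on the
    selected support."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
print(np.linalg.norm(x_hat - x))  # near zero when the support is recovered
```

With a random Gaussian matrix and m well above the order of k·log(n/k), this recovery succeeds with high probability, matching the stability claim above.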
In this chapter, we provide a concise overview of CS basics and some of its extensions; subsequent chapters focus on CS algorithms and specific areas of CS research related to big data. The chapter begins with Section 8.1, which gives the motivation for CS, illustrates the typical steps of CS with an example, summarizes the key components of CS, and discusses how nearly sparse signals and measurement noise are treated in robust CS. Section 8.2 then compares CS with traditional sensing and examines their respective advantages and disadvantages. The foundation of CS, sparse representation, is studied in Section 8.3, which briefly covers sparsifying transforms, analytic dictionaries, learned dictionaries, and a few extensions of sparse modeling. Next, Section 8.4 discusses a variety of conditions under which one can trust CS to encode signals during sensing and recover them faithfully afterward. Finally, Section 8.5 outlines some applications of CS, and two synthetic examples close the chapter.
Background
The Nyquist/Shannon sampling theory has been accepted as the doctrine for signal acquisition and processing ever since it was implied by the work of Nyquist in 1928 [397] and proved by Shannon in 1949 [398]. The theorem states that, to reconstruct an arbitrary band-limited signal exactly from its samples, the sampling rate must be at least twice the signal's bandwidth.
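The theorem can be checked numerically with a small sketch (an illustration, with arbitrary choices of bandwidth and sampling rate): a sine of bandwidth B sampled above the Nyquist rate 2B is rebuilt off-grid via Shannon's sinc-interpolation formula.

```python
import numpy as np

# A band-limited signal: a 3 Hz sine. Nyquist rate is 2*B = 6 Hz; sample at 8 Hz.
B, fs, T = 3.0, 8.0, 50.0
ts = np.arange(0, T, 1.0 / fs)           # sample instants
samples = np.sin(2 * np.pi * B * ts)

def reconstruct(t, samples, fs):
    """Shannon's reconstruction: x(t) = sum_n x[n] * sinc(fs*t - n),
    with numpy's normalized sinc(x) = sin(pi*x)/(pi*x)."""
    n = np.arange(len(samples))
    return float(np.sum(samples * np.sinc(fs * t - n)))

# Evaluate far from both ends of the window to limit truncation error.
t_test = 25.1234
exact = np.sin(2 * np.pi * B * t_test)
approx = reconstruct(t_test, samples, fs)
print(abs(exact - approx))  # small; shrinks as the observation window grows
```

The residual error here comes only from truncating the (infinite) sinc series to a finite window; sampling below 2B would instead alias the signal and make exact reconstruction impossible.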
- Signal Processing and Networking for Big Data Applications, pp. 171–195. Publisher: Cambridge University Press. Print publication year: 2017.