This appendix is a collection of topics that are slightly peripheral to the main flow of the book but still potentially interesting to some readers. Fisher's information matrix (derived here in several forms) and Stein's lemma are both important tools employed in the main parts of the book.
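For reference, the two results just named take their standard forms as follows (a recap of the well-known definitions, not the appendix's own derivations):

```latex
% Fisher information matrix for a density p(x \mid \theta)
% (score outer-product form):
\mathbf{I}(\theta) = \mathbb{E}\!\left[
  \left( \frac{\partial \ln p(x \mid \theta)}{\partial \theta} \right)
  \left( \frac{\partial \ln p(x \mid \theta)}{\partial \theta} \right)^{T}
\right]

% Stein's lemma for x \sim \mathcal{N}(\mu, \sigma^2) and differentiable f:
\mathbb{E}\left[ f(x)\,(x - \mu) \right] = \sigma^{2}\,\mathbb{E}\left[ f'(x) \right]
```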
With both our estimation and Lie group tools from previous chapters in hand, we now begin to bring the two together. We discuss a classic three-dimensional estimation problem in robotics: point-cloud alignment; this gives us our first example of carrying out optimization over the group of rotations by a few different means. We then present the classic problem of localizing a moving robot using point observations of known three-dimensional landmarks; this involves adapting the extended Kalman filter (EKF) to work with the group of poses. Another common problem in robotics is that of pose-graph optimization, which is easily handled using our Lie group tools. We conclude with a presentation of how to carry out trajectory estimation based on an inertial measurement unit (IMU), both recursively via the EKF and in batch form using IMU preintegration for efficiency.
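As a concrete taste of the point-cloud alignment problem, the following is a minimal NumPy sketch of the classic SVD-based closed-form solution for matched point sets; this is one standard approach, not necessarily the sequence of methods used in the chapter:

```python
import numpy as np

def align_pointclouds(P, Q):
    """Find rotation R and translation t minimizing sum ||R @ p_i + t - q_i||^2.

    P, Q: (N, 3) arrays of matched points. Returns (R, t).
    """
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)   # centroids
    W = (Q - q_bar).T @ (P - p_bar)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(W)
    S = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # guard against reflections
    R = U @ S @ Vt
    t = q_bar - R @ p_bar
    return R, t
```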
This chapter is devoted to the classic simultaneous localization and mapping (SLAM) problem and the related problem of bundle adjustment. In these problems we must estimate not only the trajectory of a robot but also the three-dimensional positions of many point landmarks, based on noisy sensor data and (in the case of SLAM) a motion model. We discuss how to adapt the tools presented earlier to include landmarks in the state; the inclusion of landmarks changes the sparsity pattern of the resulting estimation equations, and we discuss strategies for continuing to solve them efficiently. Our approach is carried out entirely in three dimensions using our Lie group tools.
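One common strategy for exploiting that sparsity (offered here as an illustration under the assumption of a standard Gauss-Newton setup, not as the chapter's specific method) is to marginalize out the landmark block of the normal equations with a Schur complement. A dense NumPy sketch of that single linear-algebra step:

```python
import numpy as np

def solve_slam_step(H_xx, H_xl, H_ll, b_x, b_l):
    """Solve [[H_xx, H_xl], [H_xl^T, H_ll]] [dx, dl]^T = [b_x, b_l]^T
    by first eliminating the landmark block H_ll (block-diagonal in
    practice, so its inverse is cheap; dense here for clarity)."""
    H_ll_inv = np.linalg.inv(H_ll)
    # Schur complement: reduced system over the (much smaller) pose block.
    S = H_xx - H_xl @ H_ll_inv @ H_xl.T
    dx = np.linalg.solve(S, b_x - H_xl @ H_ll_inv @ b_l)
    # Back-substitute for the landmark update.
    dl = H_ll_inv @ (b_l - H_xl.T @ dx)
    return dx, dl
```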
As the book attempts to be as stand-alone as possible, this chapter provides an up-front summary of all the results in probability theory that will be needed later on. Probability is key to estimation because we want to know not only, for example, where something is but also how confident we are in that estimate. The first half of the chapter introduces general probability density functions, Bayes' theorem, the notion of independence, and the quantification of uncertainty, among other topics. The second half of the chapter delves into Gaussian probability density functions specifically and establishes the key tools needed in the common estimation algorithms of later chapters. For readers already familiar with the content, this chapter can simply serve as a reference.
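For readers using the chapter as a reference, the two central objects are Bayes' theorem and the multivariate Gaussian density, reproduced here in their usual forms:

```latex
% Bayes' theorem for state x and data z:
p(x \mid z) = \frac{p(z \mid x)\, p(x)}{p(z)}

% Multivariate Gaussian density of dimension N with mean \mu
% and covariance \Sigma:
\mathcal{N}(x; \mu, \Sigma) = \frac{1}{\sqrt{(2\pi)^{N} \det \Sigma}}
  \exp\!\left( -\tfrac{1}{2} (x - \mu)^{T} \Sigma^{-1} (x - \mu) \right)
```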
This appendix serves as a quick summary of the main linear algebra and matrix calculus tools used throughout the book. It was designed primarily as a reference but could be used as a primer or refresher to be read before the main chapters of the book.
This chapter takes a step back and revisits nonlinear estimation through the lens of variational inference, a concept common in the machine-learning world. Estimation is posed as the minimization of a data-likelihood objective: the Kullback-Leibler (KL) divergence between a Gaussian estimate and the true Bayesian posterior. We follow through the consequences of this starting point and show that, with appropriate approximations, we can arrive at many of the algorithms presented earlier, but can also open the door to new possibilities. For example, a derivative-free batch estimator that uses sigma points is discussed. Variational inference also provides a principled approach to learning the parameters of our estimators (i.e., the parameters of our motion and observation models) from training data.
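Concretely, the starting point described above can be written as the following objective, with q(x) the Gaussian estimate and p(x | z) the true posterior:

```latex
\mathrm{KL}\left( q(x) \,\|\, p(x \mid z) \right)
  = \int q(x) \ln \frac{q(x)}{p(x \mid z)} \, dx
  = \mathbb{E}_{q}\!\left[ \ln q(x) \right]
    - \mathbb{E}_{q}\!\left[ \ln p(x, z) \right]
    + \ln p(z)
```

Since ln p(z) does not depend on q, minimizing over q requires only the first two terms, i.e., the entropy of the estimate and the expected log joint likelihood.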
As the popularity of adhesive joints in industry increases, so does the need for tools to support the process of selecting a suitable adhesive. While some such tools already exist, they are either too limited in scope or offer too little flexibility in use. This work presents a more advanced tool that was developed together with a team of adhesive experts. We first extract the experts’ knowledge about this domain and formalize it in a Knowledge Base (KB). The IDP-Z3 reasoning system can then be used to derive the necessary functionality from this KB. Together with a user-friendly interactive interface, this creates an easy-to-use tool capable of assisting the adhesive experts. To validate our approach, we performed user testing in the form of qualitative interviews. The experts are very positive about the tool, stating, among other things, that it will help them save time and find more suitable adhesives.
SLAM benchmarks play a pivotal role in the field by providing a common ground for performance evaluation. In this paper, a novel methodology for simultaneous localization and mapping benchmarking and map accuracy improvement (SLAMB&MAI) is introduced. It can objectively evaluate localization and mapping errors, and further improve map accuracy by using the evaluation results as feedback. The proposed benchmark transforms all elements into a global frame and measures the errors between them. It is comprehensive in that it benchmarks both localization and mapping, and objective in that it accounts for the correlation between localization and mapping by preserving the original pose relations between all reference frames. Map accuracy is improved by first finding the transformation that minimizes the error between the estimated trajectory and the ground-truth trajectory and then applying it to the estimated map. The experimental results showed that map accuracy can be improved by an average of 15%. The transformation that yields minimal localization error is obtained by the proposed Centre Point Registration-Iterative Closest Point (CPR-ICP) method. This Iterative Closest Point (ICP) variant pre-aligns two point clouds by their centroids and least-squares planes and then uses traditional ICP to minimize the error between them. The experimental results showed that CPR-ICP outperformed traditional ICP, especially in large-scale environments. To the best of our knowledge, this is the first work that can not only objectively benchmark both localization and mapping but also revise the estimated map and increase its accuracy, which provides insights into the acquisition of ground-truth maps and robot navigation.
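As a rough illustration of the pre-alignment idea behind CPR-ICP, here is a hypothetical NumPy sketch that matches centroids and least-squares plane normals; the subsequent traditional-ICP refinement, and any details specific to the authors' method, are omitted:

```python
import numpy as np

def plane_normal(points):
    """Unit normal of the least-squares plane through `points` (N, 3):
    the direction of smallest variance. Note the sign is ambiguous; a
    robust version would disambiguate it."""
    _, _, Vt = np.linalg.svd(points - points.mean(axis=0))
    return Vt[-1]

def centroid_plane_prealign(source, target):
    """Pre-align `source` to `target` by matching centroids and
    least-squares plane normals, in the spirit of the CPR step."""
    n_s, n_t = plane_normal(source), plane_normal(target)
    v = np.cross(n_s, n_t)
    c = float(np.dot(n_s, n_t))
    if np.linalg.norm(v) < 1e-12:
        R = np.eye(3)  # normals already aligned
    else:
        # Rodrigues-style rotation taking n_s onto n_t
        # (antiparallel normals, c close to -1, would need special handling).
        K = np.array([[0, -v[2], v[1]],
                      [v[2], 0, -v[0]],
                      [-v[1], v[0], 0]])
        R = np.eye(3) + K + K @ K * (1.0 / (1.0 + c))
    centered = source - source.mean(axis=0)
    return centered @ R.T + target.mean(axis=0)
```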
The Knuth–Morris–Pratt (KMP) algorithm for string search is notoriously difficult to understand. Lost in a sea of index arithmetic, most explanations of KMP obscure its essence. This paper constructs KMP incrementally, using pictures to illustrate each step. The end result is easier to comprehend. Additionally, the derivation uses only elementary functional programming techniques.
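For contrast, the conventional index-based construction of the KMP failure table, the style of presentation the paper is reacting against, looks like this (a standard textbook version in Python, not the paper's functional derivation):

```python
def failure_function(pattern):
    """fail[i] = length of the longest proper prefix of pattern[:i+1]
    that is also a suffix of it (the classic KMP table)."""
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]          # fall back along the border chain
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    return fail

# failure_function("ababaca") == [0, 0, 1, 2, 3, 0, 1]
```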
In this paper, we compare the entropy of an original distribution with that of its corresponding compound distribution. Several results are established based on the convex order and the relative log-concave order. The necessary and sufficient condition for a compound distribution to be log-concave is also discussed, covering the compound geometric, compound negative binomial, and compound binomial distributions.
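For readers unfamiliar with the setting, the compound distribution in question is the law of a random sum under the standard independence assumptions:

```latex
% N a counting random variable, X_1, X_2, \ldots i.i.d. and independent of N;
% the compound distribution is the law of
S = \sum_{i=1}^{N} X_i
% Taking N geometric, negative binomial, or binomial yields the compound
% geometric, compound negative binomial, and compound binomial cases above.
```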
A pioneering book on an important novel subject, Memory Activism and Digital Practices after Conflict presents a thorough and detailed overview of memory activism in Serbia over the last 20 years, across two generations. First of all, the book traces the transition from anti-war and human rights activism to ‘first-generation memory activism’, by examining the trajectory of the NGO Women in Black created in the 1990s to protest against Serbian nationalism, Milošević's war policy and Serb war crimes in the Yugoslav wars. The turn to what could be defined as memory activism grew organically out of this initial anti-nationalist and anti-war struggle, gaining traction following the deep disappointment felt by many in the Serbian alternative with the continuities in Serbia's institutions and social structures after the fall of Milošević in 2000, the official silence and denial of war crimes, and the unfulfilled promises of transitional justice in the region.
The loading and unloading operations of smart logistics robots depend largely on their perception systems. However, there is a paucity of studies evaluating Lidar maps and their SLAM algorithms in complex-environment navigation systems. In the proposed work, the Lidar information is fine-tuned using a binary occupancy grid approach, and an Improved Self-Adaptive Learning Particle Swarm Optimization (ISALPSO) algorithm is implemented for path prediction. The approach makes use of 2D Lidar mapping to determine the most efficient route for a mobile robot in logistics applications. The Hector SLAM method is used in the Robot Operating System (ROS) platform to implement real-time mobile robot localization and map building; the resulting map is subsequently transformed into a binary occupancy grid. To demonstrate the path navigation results of the proposed methodologies, a navigational model has been created in a MATLAB 2D virtual environment using 2D Lidar mapping point data. The ISALPSO algorithm adapts its parameters (inertia weight, acceleration coefficients, learning coefficients, mutation factor, and swarm size) based on the performance of the generated path. In comparison to the five other PSO variants, the ISALPSO algorithm yields a considerably shorter path, converges more quickly, and requires less time to compute the distance between the transporting and unloading locations, based on the simulation results and their validation in a 2D Lidar environment. The efficiency and effectiveness of path planning for mobile robots in logistics applications are validated using Quanser hardware interfaced with a 2D Lidar and operated in environment 3, using the proposed algorithm to produce an optimal path.
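For orientation, the baseline PSO loop that ISALPSO extends with self-adaptive parameters looks roughly as follows; this is a generic illustrative sketch with assumed fixed parameters w, c1, c2, not the authors' algorithm:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-10.0, 10.0)):
    """Minimize f over a box using plain particle swarm optimization.
    ISALPSO would additionally adapt w, c1, c2, the mutation factor,
    and the swarm size from iteration to iteration."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))    # positions
    v = np.zeros((n_particles, dim))               # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()
```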
Invertibility is a fundamental concept in computer science, with various manifestations in software development (serializer/deserializer, parser/printer, redo/undo, compressor/decompressor, and so on). Full invertibility necessarily requires bijectivity, but the direct approach of composing bijective functions to develop invertible programs is too restrictive to be useful. In this paper, we take a different approach by focusing on partially invertible functions: functions that become invertible when some of their arguments are fixed. The simplest such example is addition, which becomes invertible once one of its operands is fixed. More involved examples include entropy-based compression methods (e.g., Huffman coding), which carry the occurrence frequencies of the input symbols (in certain formats, such as a Huffman tree); fixing this frequency information makes the compression methods invertible.
We develop a language, Sparcl, for programming such functions in a natural way, where partial invertibility is the norm and bijectivity is a special case, thereby gaining significant expressiveness without compromising correctness. The challenge in designing such a language is to allow ordinary programming (the “partially” part) to interact freely with the invertible part, and yet guarantee invertibility by construction. Sparcl is linear-typed and has a type constructor to distinguish data that are subject to invertible computation from those that are not. We present the syntax, type system, and semantics of the language and prove that Sparcl correctly guarantees invertibility for its programs. We demonstrate the expressiveness of Sparcl with examples including tree rebuilding from preorder and inorder traversals, Huffman coding, arithmetic coding, and LZ77 compression.
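The addition example mentioned above can be phrased even in a plain, non-linearly-typed language, though, unlike in Sparcl, nothing below checks by construction that the two directions are actually inverses:

```python
def add(n):
    """Fixing the first operand n yields a bijection on the integers:
    a (forward, backward) pair of functions."""
    fwd = lambda x: x + n
    bwd = lambda y: y - n
    return fwd, bwd

fwd, bwd = add(5)
assert bwd(fwd(37)) == 37   # round-trip: invertible once n is fixed
```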
Social network analysis is known to provide a wealth of insights relevant to many aspects of policymaking. Yet the social data needed to construct social networks are not always available. Furthermore, even when they are, interpreting such networks often relies on extraneous knowledge. Here, we propose an approach to infer social networks directly from the texts produced by actors and the terminological similarities these texts exhibit. The approach relies on fitting a topic model to the texts produced by these actors and measuring topic-profile correlations between actors. This reveals what can be called “hidden communities of interest,” that is, groups of actors sharing similar semantic content but whose social relationships with one another may be unknown or latent. Network interpretation follows from the topic model. Diachronic perspectives can also be built by modeling the networks over different time periods and mapping genealogical relationships between communities. As a case study, the approach is deployed over a working corpus of academic articles (domain of philosophy of science; N=16,917).
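A minimal sketch of the pipeline just described, with scikit-learn's LDA standing in (as an assumption) for whatever topic model the authors actually fit, and a simple correlation threshold to form edges:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

def infer_community_network(actor_texts, n_topics=20, threshold=0.5):
    """actor_texts: dict mapping actor name -> concatenated texts.
    Returns (actors, adjacency), where adjacency[i, j] = 1 links actors
    whose topic profiles correlate above `threshold`."""
    actors = list(actor_texts)
    counts = CountVectorizer(stop_words="english").fit_transform(
        actor_texts[a] for a in actors)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    profiles = lda.fit_transform(counts)   # one topic profile per actor
    corr = np.corrcoef(profiles)           # actor-by-actor correlations
    adjacency = (corr > threshold).astype(int)
    np.fill_diagonal(adjacency, 0)          # no self-loops
    return actors, adjacency
```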