Owing to the growing use of atomic force microscope (AFM) nanorobots for moving and manipulating cylindrical nanoparticles (carbon nanotubes and nanowires), and because these processes cannot be observed while they are carried out, computer simulation of the forces involved is essential for predicting the outcome of the process. So far, no dynamic 3D model that shows how these forces change during the process has been presented. In this paper, an algorithm is used to show in 3D how the dynamic forces vary during the process. The presented model can simulate the forces exerted on the probe tip in three directions during manipulation. Because the presented dynamic model is nonlinear, its effective parameters are also studied. To evaluate the results, the parameters of the 3D case (cylindrical model) are gradually reduced until it becomes a 2D model (disk model); the results of the two simulations agree well. The simulation results are then compared with experimental results showing changes in lateral force. With the proposed dynamic model, the cantilever deformation and the interaction forces between the probe tip and the particle can be determined from the moment the probe tip contacts the nanoparticle until the nanoparticle dislodges from the substrate surface.
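One ingredient of such a force model can be illustrated with a minimal sketch: the cantilever's normal stiffness follows from standard beam theory, k = E·w·t³/(4·L³), and a measured deflection then converts to a normal force. The numerical values below are typical figures for a silicon contact-mode cantilever, chosen purely for illustration; they are not taken from the paper.

```python
def cantilever_stiffness(E, w, t, L):
    """Normal spring constant of a rectangular cantilever beam,
    k = E*w*t**3 / (4*L**3), from standard beam theory."""
    return E * w * t**3 / (4 * L**3)

# Illustrative silicon-cantilever dimensions (not from the paper)
E = 169e9    # Young's modulus of silicon, Pa
w = 50e-6    # width, m
t = 2e-6     # thickness, m
L = 450e-6   # length, m

k = cantilever_stiffness(E, w, t, L)   # spring constant, N/m
F = k * 10e-9                          # normal force at 10 nm deflection, N
```

For these dimensions k comes out near 0.2 N/m, so a 10 nm deflection corresponds to a force on the order of a few nanonewtons, the scale relevant to nanoparticle manipulation.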
Aliases play an important role in online environments by facilitating anonymity, but they can also be used to hide the identities of cybercriminals. Previous studies have investigated this alias matching problem in an attempt to identify whether two aliases are shared by one author, which can assist with identifying users. Those studies create their training data by randomly splitting the documents associated with an alias into two sub-aliases. Models have been built that regularly achieve over 90% accuracy in recovering the linkage between these ‘random sub-aliases’. In this paper, random sub-alias generation is shown to enable these high accuracies, and thus it does not adequately model the real-world problem. In contrast, creating sub-aliases using topic-based splitting drastically reduces the accuracy of all authorship methods tested. We then present a methodology that can be applied to non-topic-controlled datasets to produce topic-based sub-aliases that are more difficult to match. Finally, we present an experimental comparison of many authorship methods under these conditions, finding that local n-gram methods outperform the others.
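The n-gram flavour of authorship matching can be sketched in a few lines: represent each alias's text as a character n-gram frequency profile and compare profiles by cosine similarity, so that two sub-aliases written by the same hand score higher than texts by different authors. Everything below (the toy texts, the trigram choice, the function names) is invented for illustration and is not the specific method evaluated in the paper.

```python
from collections import Counter
import math

def char_ngrams(text, n=3):
    """Frequency profile of overlapping character n-grams."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine_similarity(a, b):
    """Cosine similarity between two n-gram frequency profiles."""
    dot = sum(a[g] * b[g] for g in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Two toy sub-aliases sharing a writing style, versus an unrelated text
alias_a = "the quick brown fox jumps over the lazy dog again and again"
alias_b = "the quick red fox jumps over the lazy cat again and again"
other   = "completely unrelated writing style with different vocabulary"

same = cosine_similarity(char_ngrams(alias_a), char_ngrams(alias_b))
diff = cosine_similarity(char_ngrams(alias_a), char_ngrams(other))
print(same > diff)  # the matching pair scores higher
```

The paper's point is precisely that this kind of surface similarity is inflated by shared topic words, which is why topic-based splitting makes the task much harder.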
We give a constructive proof showing that every finitely generated polynomial ideal has a Gröbner basis, provided the ring of coefficients is Noetherian in the sense of Richman and Seidenberg. That is, we give a constructive termination proof for a variant of the well-known algorithm for computing the Gröbner basis. In combination with a purely order-theoretic result we have proved in a separate paper, this yields a unified constructive proof of the Hilbert basis theorem for all Noether classes: if a ring belongs to a Noether class, then so does the polynomial ring. Our proof can be seen as a constructive reworking of one of the classical proofs, in the spirit of the partial realisation of Hilbert's programme in algebra put forward by Coquand and Lombardi. The rings under consideration need not be commutative, but are assumed to be coherent and strongly discrete: that is, they admit a membership test for every finitely generated ideal. As a complement to the proof, we provide a prime decomposition for commutative rings possessing the finite-depth property.
Program transformation is an appealing technique that can improve run-time efficiency and space consumption and, more generally, optimize a given program. Essentially, it consists of a sequence of syntactic program manipulations that preserve some kind of semantic equivalence. Unfolding, one of the basic operations used by most program transformation systems, consists of replacing a procedure call by its definition. While there is a large body of literature on the transformation and unfolding of sequential programs, very few papers have addressed this issue for concurrent languages. This paper defines an unfolding system for Constraint Handling Rules programs. We define an unfolding rule, show its correctness, and discuss conditions under which an unfolded rule can be deleted while preserving the program's meaning. We also prove that, under suitable conditions, confluence and termination are preserved by this transformation.
We consider structured specifications built from flat specifications using union, translation and hiding with their standard model-class semantics in the context of an arbitrary institution. We examine the alternative of sound property-oriented semantics for such specifications, and study their relationship to model-class semantics. An exact correspondence between the two (completeness) is not achievable in general. We show through general results on property-oriented semantics that the semantics arising from the standard proof system is the strongest sound and compositional property-oriented semantics in a wide class of such semantics. We also sharpen one of the conditions that does guarantee completeness and show that it is a necessary condition.
Let G be a string graph (an intersection graph of continuous arcs in the plane) with m edges. Fox and Pach proved that G has a separator consisting of $O(m^{3/4}\sqrt{\log m})$ vertices, and they conjectured that the bound of $O(\sqrt m)$ actually holds. We obtain separators with $O(\sqrt m \,\log m)$ vertices.
Thank you for the support in getting Network Science up and running. We are deeply appreciative of the work our associate editors, authors, and reviewers have put into realizing this vision of an interdisciplinary journal for network science. And of course, the journal would not be possible without the hard work of the editors of Network Science, who act as action editors for the submitted articles.
This paper presents a novel global path planning method for mobile robots. An improved grid map, called a three-dimensional-like map, is developed to represent the global workspace area. The new environment model includes not only contour information of obstacles but also artificial height information. Based on this new model, a simple but efficient obstacle avoidance algorithm is developed to solve robot path planning problems in a static environment. The proposed algorithm requires only simple distance calculations and several comparison operations. In addition, unlike other algorithms, the proposed algorithm only needs to consider a subset of the obstacles rather than all of them. The research results show that this method is computationally efficient and can be used to find an optimal or near-optimal path.
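The idea of planning on a grid augmented with artificial height information can be illustrated with a generic sketch: a Dijkstra-style search in which each free cell carries a height penalty that steers the path away from obstacles. This is only an analogue of the idea, not the authors' distance-comparison algorithm; the grid, costs, and function names are invented.

```python
from heapq import heappush, heappop

def plan_path(grid, start, goal):
    """Dijkstra-style search on a grid whose cells carry an artificial
    'height' cost; obstacle cells are marked with None."""
    rows, cols = len(grid), len(grid[0])
    dist, prev = {start: 0}, {}
    heap = [(0, start)]
    while heap:
        d, (r, c) = heappop(heap)
        if (r, c) == goal:
            break
        if d > dist[(r, c)]:
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] is not None:
                nd = d + 1 + grid[nr][nc]  # step cost + height penalty
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heappush(heap, (nd, (nr, nc)))
    path, cell = [goal], goal              # walk predecessors back to start
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1]

# 0 = free, higher = artificial height near an obstacle, None = obstacle
grid = [
    [0, 0,    0, 0],
    [0, None, 2, 0],
    [0, None, 2, 0],
    [0, 0,    0, 0],
]
path = plan_path(grid, (0, 0), (3, 3))
print(path[0], path[-1])  # (0, 0) (3, 3)
```

Because the height penalty makes cells adjacent to obstacles expensive, the returned path tends to keep a safety margin, which is the role the artificial height information plays in the abstract's model.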
Is Hitler bigger than Napoleon? Washington bigger than Lincoln? Picasso bigger than Einstein? Quantitative analysts are rapidly finding homes in social and cultural domains, from finance to politics. What about history? In this fascinating book, Steve Skiena and Charles Ward bring quantitative analysis to bear on ranking and comparing historical reputations. They evaluate each person by aggregating the traces of millions of opinions, just as Google ranks webpages. The book includes a technical discussion for readers interested in the details of the methods, but no mathematical or computational background is necessary to understand the rankings or conclusions. Along the way, the authors present the rankings of more than one thousand of history's most significant people in science, politics, entertainment, and all areas of human endeavor. Anyone interested in history or biography can see where their favorite figures place in the grand scheme of things.
Mobility of people and goods is essential in the global economy. The ability to track the routes and patterns associated with this mobility offers unprecedented opportunities for developing new, smarter applications in different domains. Much of the current research is devoted to developing concepts, models, and tools to comprehend mobility data and make it manageable for these applications. This book surveys the myriad facets of mobility data, from spatio-temporal data modeling, to data aggregation and warehousing, to data analysis, with a specific focus on monitoring people in motion (drivers, airplane passengers, crowds, and even animals in the wild). Written by a renowned group of worldwide experts, it presents a consistent framework that facilitates understanding of all these different facets, from basic definitions to state-of-the-art concepts and techniques, offering both researchers and professionals a thorough understanding of the applications and opportunities made possible by the development of mobility data.
Numerical algorithms, modern programming techniques, and parallel computing are often taught serially across different courses and different textbooks. The need to integrate concepts and tools usually comes only in employment or in research - after the courses are concluded - forcing the student to synthesise what is perceived to be three independent subfields into one. This book provides a seamless approach to stimulate the student simultaneously through the eyes of multiple disciplines, leading to enhanced understanding of scientific computing as a whole. The book includes both basic as well as advanced topics and places equal emphasis on the discretization of partial differential equations and on solvers. Some of the advanced topics include wavelets, high-order methods, non-symmetric systems, and parallelization of sparse systems. The material covered is suited to students from engineering, computer science, physics and mathematics.
The Seismic Analysis Code (SAC) is one of the most widely used analysis packages for regional and teleseismic seismic data. For the first time, this book provides users at introductory and advanced levels with a complete guide to SAC. It leads new users of SAC through the steps of learning basic commands, describes the SAC processing philosophy, and presents its macro language in full, supported throughout with example inputs and outputs from SAC. For more experienced practitioners, the book describes SAC's many hidden features, including advanced graphics aspects, its file structure, how to write independent programs to access and create files, and much more. Tutorial exercises engage users with newly acquired skills, providing data and code to implement the standard methods of teleseismic shear-wave splitting and receiver function analysis. Methodical and authoritative, this is a key resource for researchers and graduate students in global seismology, earthquake seismology and geophysics.
The idea of interfacing minds with machines has long captured the human imagination. Recent advances in neuroscience and engineering are making this a reality, opening the door to restoration and augmentation of human physical and mental capabilities. Medical applications such as cochlear implants for the deaf and neurally controlled prosthetic limbs for the paralyzed are becoming almost commonplace. Brain-computer interfaces (BCIs) are also increasingly being used in security, lie detection, alertness monitoring, telepresence, gaming, education, art, and human augmentation. This introduction to the field is designed as a textbook for upper-level undergraduate and first-year graduate courses in neural engineering or brain-computer interfacing for students from a wide range of disciplines. It can also be used for self-study and as a reference by neuroscientists, computer scientists, engineers, and medical practitioners. Key features include questions and exercises in each chapter and a supporting website.
The discrepancy method is the glue that binds randomness and complexity. It is the bridge between randomized computation and discrepancy theory, the area of mathematics concerned with irregularities in distributions. The discrepancy method has played a major role in complexity theory; in particular, it has caused a mini-revolution of sorts in computational geometry. This book tells the story of the discrepancy method in a few short independent vignettes. It is a varied tale which includes such topics as communication complexity, pseudo-randomness, rapidly mixing Markov chains, points on the sphere and modular forms, derandomization, convex hulls, Voronoi diagrams, linear programming and extensions, geometric sampling, VC-dimension theory, minimum spanning trees, linear circuit complexity, and multidimensional searching. The mathematical treatment is thorough and self-contained. In particular, background material in discrepancy theory is supplied as needed. Thus the book should appeal to students and researchers in computer science, operations research, pure and applied mathematics, and engineering.
For decades Robotica has had a characteristic look and page layout that has served our readership well. However, recent advances in publishing technology make now an excellent time to experiment with some new ideas. From January 2014 we will be using an exciting new cover design concept that will include unique artwork featuring one or more of the topics covered in each issue. Moreover, we will be changing from our traditional double-column format to a new single-column one that will facilitate larger figures, tables and equations. At the end of this month's issue, we will include a sneak peek at the new page format via an article typeset in this new single-column style. We hope that our readership embraces this change. And we look forward to continuing to publish papers of high quality and broad international appeal.