This paper deals with the problem of navigating semi-autonomous mobile robots without global localization systems in unknown environments. We propose a planning-based obstacle avoidance strategy that relies on local maps and a series of short-time coordinate frames. With this approach, simple odometry and range information are sufficient to enable the robot to follow user commands safely. Unlike reactive obstacle avoidance strategies, the proposed approach selects a smooth, high-quality local path for the robot. The methodology is evaluated on a mobile service robot moving through an unknown corridor environment populated with obstacles and people.
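For intuition only, the sketch below shows one generic way a local, planning-based step can be built from a robot-centred occupancy grid and a user-commanded heading: candidate headings are scored by clearance and by deviation from the command. The grid layout, candidate set, and weights are assumptions for illustration, not the paper's algorithm.

```python
# Illustrative sketch, not the paper's method: score candidate headings against a
# robot-centred occupancy grid built from range data, preferring clear directions
# that stay close to the user's command.
import numpy as np

def score_headings(local_grid, cell_size, user_heading,
                   candidates=np.linspace(-np.pi / 2, np.pi / 2, 31),
                   lookahead=2.0, w_clear=1.0, w_follow=0.5):
    """local_grid: square boolean array, True = occupied, robot at the centre."""
    half = local_grid.shape[0] // 2
    best, best_score = None, -np.inf
    for theta in candidates:
        clearance = lookahead
        # walk along the ray until an occupied cell or the lookahead is reached
        for r in np.arange(cell_size, lookahead, cell_size):
            i = half + int(round(r * np.sin(theta) / cell_size))
            j = half + int(round(r * np.cos(theta) / cell_size))
            if not (0 <= i < local_grid.shape[0] and 0 <= j < local_grid.shape[1]):
                break                      # ray left the local map
            if local_grid[i, j]:           # occupied cell blocks this heading
                clearance = r
                break
        score = w_clear * clearance - w_follow * abs(theta - user_heading)
        if score > best_score:
            best, best_score = theta, score
    return best

# With an empty 4 m x 4 m local map, the chosen heading simply tracks the command.
print(score_headings(np.zeros((41, 41), dtype=bool), 0.1, user_heading=0.0))
```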
This paper presents a fleet model built around a complex load-sharing configuration that accounts for overcapacity and is based on a life cycle cost (LCC) approach for cost-related decision-making. Because the variables needed to optimize the fleet size must be evaluated in combination with the event space method (ESM), solving this problem would normally require high computing performance and long computing times. We therefore propose the combined use of an integer genetic algorithm (GA) and the ant colony optimization (ACO) method to determine the optimal solution. Several empirical simulations were performed to analyze and highlight the added value of this proposal. The results show the strengths of the proposal: its flexibility, its capacity to solve large problems to near-optimal solutions for large fleet sizes, and its potential for real-world applications. Larger problems can be solved this way than with the complete enumeration approach or a non-family fleet approach, allowing for a more realistic fleet design that also considers overcapacity, availability, and LCC. The simulations showed that the model can be solved in much less time than the base model, handling fleets of at least 64 trucks with GA and 130 with ACO. The proposed framework can therefore address real-world problems, such as fleet design for mining companies, with a more realistic approach.
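As a rough illustration of the integer-GA ingredient (not the paper's LCC/ESM model), the sketch below searches for a fleet size that minimizes a stand-in cost made of purchase and operating costs plus a shortfall penalty. The cost function, bounds, and GA settings are all assumptions.

```python
# Minimal integer genetic algorithm sketch for fleet sizing under a toy cost model.
import random

def lcc_cost(n_trucks, demand=50, capacity=1.0, capex=100.0, opex=8.0, penalty=500.0):
    """Stand-in for an LCC-style objective: ownership + operation + shortfall penalty."""
    shortfall = max(0.0, demand - n_trucks * capacity)
    return n_trucks * (capex + opex) + penalty * shortfall

def integer_ga(cost, lo=1, hi=200, pop_size=30, generations=100, p_mut=0.2):
    pop = [random.randint(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        # tournament selection: keep the cheaper of three random candidates
        parents = [min(random.sample(pop, 3), key=cost) for _ in range(pop_size)]
        children = []
        for a, b in zip(parents[::2], parents[1::2]):
            child = (a + b) // 2                           # arithmetic crossover
            if random.random() < p_mut:                    # bounded integer mutation
                child = min(hi, max(lo, child + random.randint(-5, 5)))
            children.extend([child, a if cost(a) < cost(b) else b])
        pop = children
    return min(pop, key=cost)

# For this toy cost the true optimum is 50 trucks; the GA should land on or near it.
print(integer_ga(lcc_cost))
```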
The use of decision-making models in the early stages of the development of complex products and technologies is a well-established practice in industry. Engineers rely on proven statistical and mathematical models to explore the feasible design space and make early decisions on future design configurations. At the same time, researchers in both value-driven design and sustainable product development have stressed the need to expand design space exploration by encompassing value- and sustainability-related considerations. A portfolio of methods and tools for decision support regarding value and sustainability integration has been proposed in the literature, but very few have been integrated into engineering practice. This paper proposes an approach, developed and tested in collaboration with an aerospace subsystem manufacturer, featuring the integration of value-driven design and sustainable product development models into the established practices for design space exploration. The proposed approach uses early simulation results as input for value and sustainability models, automatically computing value and sustainability criteria as an integral part of the design space exploration. Machine learning is applied to deal with the different levels of granularity and maturity of information among early simulations, value models, and sustainability models, as well as to create reliable surrogate models for multidimensional design analysis. The paper describes the logic and rationale of the proposed approach and its application to the case of a turbine rear structure for commercial aircraft engines. Finally, the paper discusses the challenges of implementing the approach and highlights relevant research directions across the value-driven design, sustainable product development, and machine learning research fields.
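To make the surrogate-modelling idea concrete (without claiming anything about the models used in the paper), the sketch below fits a Gaussian-process surrogate to a handful of "simulated" design points and then evaluates a dense grid of candidate designs cheaply. The design variables, the merit score, and the choice of surrogate are placeholders.

```python
# Illustrative surrogate-model sketch: learn a cheap approximation of an expensive
# simulation so value/sustainability criteria can be scanned across the design space.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Pretend "simulations": two design variables mapped to a single merit score.
X_sim = rng.uniform(0.0, 1.0, size=(20, 2))
y_sim = np.sin(3 * X_sim[:, 0]) + 0.5 * X_sim[:, 1] ** 2   # placeholder response

surrogate = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
surrogate.fit(X_sim, y_sim)

# Cheap exploration of a dense grid of candidate designs using the surrogate;
# the predictive std could also drive adaptive sampling of new simulations.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 50),
                            np.linspace(0, 1, 50)), -1).reshape(-1, 2)
pred, std = surrogate.predict(grid, return_std=True)
print("most promising candidate design:", grid[np.argmax(pred)])
```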
The sample squared Sharpe ratio (SSR) is a critical statistic of the risk-return tradeoff. We show that sensitive upper-tail probabilities arise when the sample SSR is employed to test mean-variance efficiency under different test statistics. Assuming normally distributed errors with a nonzero mean, we integrate the sample SSR and the arbitrage regression into a noncentral chi-square (χ2) test. We find that the distribution of the sample SSR based on the regression error lies to the left of the F-distribution obtained when returns are assumed normal. Compared with two benchmarks that use the noncentral F-distribution and the central F-statistic, the χ2-statistic is more effective, competitive, significant, and locally robust for rejecting the upper-tailed mean-variance efficiency test at the usual parameters (sample size, portfolio size, and SSR).
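For intuition (this is the textbook single-asset case, not the paper's multi-asset regression statistic), the following shows why noncentral F and χ2 distributions arise for a squared Sharpe ratio. With $T$ i.i.d. normal excess returns with mean $\mu$ and standard deviation $\sigma$, the sample Sharpe ratio $\widehat{SR}=\hat{\mu}/\hat{\sigma}$ satisfies

$$\sqrt{T}\,\widehat{SR} \sim t_{T-1}(\delta), \qquad \delta = \sqrt{T}\,\frac{\mu}{\sigma},$$

so the sample squared Sharpe ratio obeys

$$T\,\widehat{SR}^{\,2} \sim F_{1,\,T-1}\!\left(\delta^{2}\right) \;\longrightarrow\; \chi^{2}_{1}\!\left(\delta^{2}\right) \quad \text{as } T \to \infty,$$

where a nonzero mean enters only through the noncentrality parameter $\delta^{2}$.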
In order to reduce the time spent on tolerance analysis, it is necessary to correctly identify and prioritize the key characteristics of the product. For multiple-state mechanisms, a systematic procedure for doing this is lacking. We present a new complexity metric for multiple-state mechanisms, based on product behavior, that describes the impact of geometrical variation. The sequence of structural state transitions is linked to the product composition, enabling a clear prioritization of variation-critical states and interfaces. The approach is applied to an industrial case and verified by comparison with the company's prioritized tolerance calculations.
Constrained motion is essential for various robotic tasks, especially in surgical robotics, for instance in minimally invasive interventions. This article proposes generic formulations of the classical bilaterally constrained motion (i.e., when the incision hole has almost the same diameter as the tool) as well as unilaterally constrained motion (i.e., when the incision hole has a larger diameter than the tool). One of these constraints is then combined with another surgical task, such as incision/ablation or suturing a wound (modeled here by 3D geometric paths). The developed control methods, based on the hierarchical task approach, can simultaneously manage the constrained motion (depending on the configuration, i.e., bilateral or unilateral constraint) and 3D path following. In addition, the proposed methods can operate with both straight and curved surgical tools. The proposed methods were successfully validated in various scenarios: a simulation framework was first developed to assess the performance of each proposed controller, and several experimental validations were then carried out. Both the simulation and experimental results demonstrated the relevance of the proposed approach and promising performance in terms of both behavior and accuracy.
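As a rough illustration of the hierarchical task idea the abstract refers to (not necessarily the exact controller used), the classical two-level task-priority resolution computes the joint velocities as

$$\dot{q} \;=\; J_{1}^{+}\,\dot{x}_{1} \;+\; \bigl(J_{2} N_{1}\bigr)^{+}\!\bigl(\dot{x}_{2} - J_{2} J_{1}^{+}\dot{x}_{1}\bigr), \qquad N_{1} = I - J_{1}^{+} J_{1},$$

where task 1 (e.g., the kinematic constraint at the incision point) has strict priority over task 2 (e.g., following the 3D path), $J_{1}$ and $J_{2}$ are the task Jacobians, $N_{1}$ is the null-space projector of task 1, and $(\cdot)^{+}$ denotes the Moore-Penrose pseudoinverse. The second task is thus executed only to the extent that it does not disturb the constraint.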
We shall begin the chapter by explaining what PageRank is and how it is computed efficiently. Yet the war between those who want to make the Web useful and those who would exploit it for their own purposes is never over. When PageRank was established as an essential technique for a search engine, spammers invented ways to manipulate the PageRank of a Web page, often called link spam. That development led to the response of TrustRank and other techniques for preventing spammers from attacking PageRank. We shall discuss TrustRank and other approaches to detecting link spam. Finally, this chapter also covers some variations on PageRank. These techniques include topic-sensitive PageRank (which can also be adapted for combating link spam) and the HITS, or “hubs and authorities” approach to evaluating pages on the Web.
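For readers who want to see the basic computation before the variations, here is a minimal power-iteration sketch of PageRank with teleportation (the "taxation" step); topic-sensitive PageRank and TrustRank modify essentially this mechanism by restricting the teleport set. The tiny example graph is made up.

```python
# Minimal PageRank sketch: power iteration with a random-teleport term.
import numpy as np

def pagerank(links, beta=0.85, tol=1e-10):
    """links[i] = list of pages that page i links to (pages numbered 0..n-1)."""
    n = len(links)
    # Column-stochastic transition matrix M: M[j, i] = 1/deg(i) if i links to j.
    M = np.zeros((n, n))
    for i, outs in enumerate(links):
        if outs:                         # dead ends (no out-links) need extra care
            M[outs, i] = 1.0 / len(outs)
    v = np.full(n, 1.0 / n)              # start from the uniform distribution
    while True:
        v_next = beta * M @ v + (1 - beta) / n   # follow a link or teleport
        if np.abs(v_next - v).sum() < tol:
            return v_next
        v = v_next

# Four-page example: 0 -> 1,2 ; 1 -> 2 ; 2 -> 0 ; 3 -> 0,2
print(pagerank([[1, 2], [2], [0], [0, 2]]))
```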
Many students complete PhDs in functional programming each year. As a service to the community, twice per year the Journal of Functional Programming publishes the abstracts from PhD dissertations completed during the previous year.
We include in this chapter a discussion of generalizations of MapReduce, first to systems that support acyclic workflows and then to systems that implement recursive algorithms. Our last topic for this chapter is the design of good MapReduce algorithms, a subject that often differs significantly from the matter of designing good parallel algorithms to be run on a supercomputer. When designing MapReduce algorithms, we often find that the greatest cost is in the communication. We thus investigate communication cost and what it tells us about the most efficient MapReduce algorithms. For several common applications of MapReduce we are able to give families of algorithms that optimally trade the communication cost against the degree of parallelism.
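To fix ideas, the toy, single-machine sketch below mimics the MapReduce pattern: a Map function emits key-value pairs, the framework groups them by key, and a Reduce function combines each group. The word-count example is only illustrative; the relevant point for this chapter is that everything Map emits must be shipped to reducers, which is the communication cost being analysed.

```python
# Toy MapReduce skeleton run on one machine; the "shuffle" is a dictionary grouping.
from collections import defaultdict

def map_fn(document):
    for word in document.split():
        yield word, 1                      # one key-value pair per word occurrence

def reduce_fn(word, counts):
    return word, sum(counts)

def map_reduce(inputs, map_fn, reduce_fn):
    groups = defaultdict(list)
    for item in inputs:
        for key, value in map_fn(item):    # everything emitted here is "communication"
            groups[key].append(value)
    return [reduce_fn(key, values) for key, values in groups.items()]

docs = ["the cat sat", "the cat ran", "a dog sat"]
print(sorted(map_reduce(docs, map_fn, reduce_fn)))
```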
Most agree that lawyers of the future will need a greater understanding of how technology can be used to design and deliver legal services. The issue for those involved in setting content for any route to qualification is defining the extent to which this must be regulated, as much as identifying the right level of technological capability. The issue is not merely one of content, but the acquisition of competences. Any accreditation must look beyond simply ensuring capability in relation to discrete tools, looking instead to ensure that future solicitors have the ability to adapt to new technologies. Separately, consideration has to be given to the emerging profession of legal technologists. Whilst some technologists may be legally qualified, those that are not must understand the ethical boundaries and regulatory requirements that lawyers work within. The organisation of the legal profession and the regulatory boundaries shared between various stakeholders require us to consider whether accreditation is the right way forward, where responsibility for accreditation should lie and who should take initiative in this space. This chapter explores these issues by contrasting the approach adopted by the Solicitors Regulation Authority in England and Wales with that of the Law Society of Scotland.
We begin our discussion of locality-sensitive hashing (LSH) with an examination of the problem of finding similar documents – those that share a lot of common text. We first show how to convert documents into sets in a way that lets us view textual similarity of documents as sets having a large overlap. A second key trick we need is minhashing, which is a way to convert large sets into much smaller representations, called signatures, that still enable us to estimate closely the Jaccard similarity of the represented sets. Finally, we see how to apply the bucketing idea inherent in LSH to the signatures. In Section 3.5 we begin our study of how to apply LSH to items other than sets. We consider the general notion of a distance measure that tells to what degree items are similar. Then, we consider the general idea of locality-sensitive hashing, and we see how to do LSH for some data types other than sets. We examine in detail several applications of the LSH idea. Finally, we consider some techniques for finding similar sets that can be more efficient than LSH when the degree of similarity we want is very high.
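The minhashing step can be sketched in a few lines: each of k random hash functions contributes the minimum hash value over a set's elements, and the fraction of signature positions on which two sets agree estimates their Jaccard similarity. The particular hash family below (random affine maps modulo a prime) is one common choice, used here purely for illustration.

```python
# Minhash signature sketch: agreement rate between signatures estimates Jaccard similarity.
import random

def make_minhash(k=100, prime=2_147_483_647, seed=1):
    rng = random.Random(seed)
    coeffs = [(rng.randrange(1, prime), rng.randrange(0, prime)) for _ in range(k)]
    def signature(items):
        hashed = [hash(x) % prime for x in items]
        return [min((a * h + b) % prime for h in hashed) for a, b in coeffs]
    return signature

sig = make_minhash()
s1 = {"the", "cat", "sat", "on", "mat"}
s2 = {"the", "cat", "lay", "on", "a", "mat"}
sig1, sig2 = sig(s1), sig(s2)
estimate = sum(a == b for a, b in zip(sig1, sig2)) / len(sig1)
true_jaccard = len(s1 & s2) / len(s1 | s2)
print(f"estimated {estimate:.2f} vs true {true_jaccard:.2f}")
```

In the full LSH scheme, these signatures are then cut into bands and hashed to buckets so that only pairs agreeing on some band become candidate pairs.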