The search space for new thermoelectric oxides has been limited to the alloys of a few known systems, such as ZnO, SrTiO3, and CaMnO3. Notwithstanding their high power factor, their high thermal conductivity is a roadblock to achieving higher efficiency. In this paper, we apply machine learning (ML) models for discovering novel transition metal oxides with low lattice thermal conductivity ($ {k}_L $). A two-step process is proposed to address the problem of small datasets frequently encountered in material informatics. First, a gradient-boosted tree classifier is trained to categorize unknown compounds into three categories of $ {k}_L $: low, medium, and high. In the second step, we fit regression models on the targeted class (i.e., low $ {k}_L $) to estimate $ {k}_L $ with an $ {R}^2>0.9 $. The gradient-boosted tree model was also used to identify the key material properties influencing the classification of $ {k}_L $, namely lattice energy per atom, atom density, band gap, mass density, and the ratio of oxygen to transition-metal atoms. Only fundamental material properties describing the crystal symmetry, compound chemistry, and interatomic bonding were used in the classification process, which can be readily used in the initial phases of materials design. The proposed two-step process addresses the problem of small datasets and improves the predictive accuracy. The ML approach adopted in the present work is generic in nature and can be combined with high-throughput computing for the rapid discovery of new materials for specific applications.
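The classify-then-regress pipeline described above can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the authors' implementation: the five descriptors, the synthetic $ {k}_L $ values, and the tercile binning are all stand-in assumptions.

```python
# Sketch of the two-step pipeline: step 1 bins compounds into
# low/medium/high k_L classes; step 2 regresses k_L on the low class only.
# Features and data are synthetic stand-ins, not the paper's dataset.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 600
# Hypothetical descriptors standing in for: lattice energy per atom,
# atom density, band gap, mass density, O/transition-metal ratio.
X = rng.normal(size=(n, 5))
k_L = np.exp(X[:, 0] + 0.5 * X[:, 1]) + 0.1 * rng.random(n)  # synthetic target

# Step 1: split k_L into three classes (0 = low, 1 = medium, 2 = high).
bins = np.quantile(k_L, [1 / 3, 2 / 3])
y_class = np.digitize(k_L, bins)
clf = GradientBoostingClassifier(random_state=0).fit(X, y_class)

# The fitted trees also rank which descriptors drive the classification.
importances = clf.feature_importances_

# Step 2: fit a regressor only on the targeted (low-k_L) class.
low = y_class == 0
reg = GradientBoostingRegressor(random_state=0).fit(X[low], k_L[low])
```

Restricting the regressor to one class narrows the target range it must model, which is one way a small dataset can still support a high $ {R}^2 $ on the class of interest.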
Privacy is gravely endangered in the digital age, and we, the digital citizens, are its principal threat, willingly surrendering it to avail ourselves of new technology, and granting the government and corporations immense power over us. In this highly original work, Firmin DeBrabander begins with this premise and asks how we can ensure and protect our freedom in the absence of privacy. Can—and should—we rally anew to support this institution? Is privacy so important to political liberty after all? DeBrabander makes the case that privacy is a poor foundation for democracy, that it is a relatively new value that has been rarely enjoyed throughout history—but constantly persecuted—and politically and philosophically suspect. The vitality of the public realm, he argues, is far more significant to the health of our democracy, but is equally endangered—and often overlooked—in the digital age.
For the most part, our spies contend that we know we are spied upon, but accept the surveillance because of the many concrete benefits we receive in return. Marketers want us to know, for example, the more we divulge, the better they can serve us. Indeed, they can help us realize desires and aspirations before they occur to us – desires we never even knew we had. In that sense, they promise to empower us, help us get what we want, and improve our personal lives. In turn, this implies that savvy shoppers expose personal details, even if they seem arcane and unremarkable. Apparently, that is not for us to judge.
I have been arguing that privacy is on life support, and its prognosis looks dim. It is thoroughly besieged in the digital age, and the general population is perhaps its greatest enemy, happily surrendering it to indulge in all manner of conveniences and innovations. Critics and privacy advocates warn that this is a dire development; privacy is necessary for a free and fulfilled life. The digital tidal wave forces us to face a future where privacy may be nonexistent, or at least radically transformed, and diminished. I don’t believe the proper solution is to urge people to start caring about privacy again, build stronger walls around their personal lives, so to speak, and block out spying eyes. This seems utterly impossible, and it is unreasonable or implausible to request this of people who are eager to tap into all that the digital economy has to offer. We must find a way to thrive despite this state of affairs.
In Plato’s dialogue “Meno,” the eponymous speaker, after much frustration, declares that Socrates, his insistent interviewer, is like a “broad torpedo fish” – better known to us as an electric ray – which stuns and paralyzes those who approach it.1 The two men had conversed at length on the topic of virtue, something Meno professed to know much about, and he readily produced several definitions. After Socrates debunked each in turn, Meno was at a loss. Like the torpedo fish, he tells Socrates, “you now seem to have had that effect on me, for both my mind and my tongue are numb, and I have no answer to give you. Yet I have made many speeches about virtue before large audiences on a thousand occasions, very good speeches I thought but now I cannot even say what it is!”2
In the face of rapid technological changes transforming our lives in profound ways, the forecast for privacy looks dark. Its demise is hastened by “an unstoppable arms race in communication tools and data mining capabilities, which in turn are both due to the continued progression of Moore’s Law. The cost of keeping secrets increases inversely to decreases in the cost of computing.”1 Roughly, Moore’s Law holds that computing power will grow exponentially; digital devices will become faster and more powerful in rapid succession. Where technological changes may have once developed gradually, that is precisely not the case in the digital age, where breakneck speed of change is the rule. And digital technology will become cheaper, more accessible, and broadly distributed in the process. Taken together, this means that the cost and hassle of protecting privacy grows exponentially, too. In the digital net that envelops our everyday lives, it will become increasingly difficult and rare to perform any task without revealing ourselves, and opening our lives to spying eyes. And our spies are not content to watch us from without; they will install sentinels in our very bodies, and monitor us from within.
A novel underwater vehicle configuration with an operating principle similar to that of the Sepiida (cuttlefish) is presented and developed in this paper. The mathematical equations describing the movements of the vehicle are obtained using the Newton–Euler approach. An analysis of the dynamic model is carried out for control purposes. A prototype and its embedded system are developed to validate the proposed mathematical representation analytically and experimentally. A real-time characterization of one mass is performed to relate the pitch angle to the radius of displacement of the mass. In addition, a first validation of the closed-loop system is carried out using a linear controller.
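For context, Newton–Euler modeling of marine vehicles is commonly summarized in the following compact form; the specific matrices for this vehicle are not given in the abstract, so the expression below is the generic form, not the paper's model.

```latex
% Generic Newton--Euler equations of motion for a marine vehicle,
% written in the standard compact (Fossen-style) form.
M\dot{\nu} + C(\nu)\,\nu + D(\nu)\,\nu + g(\eta) = \tau
% M        : inertia matrix (rigid body plus added mass)
% C(\nu)   : Coriolis--centripetal matrix
% D(\nu)   : hydrodynamic damping matrix
% g(\eta)  : restoring forces and moments (gravity/buoyancy)
% \tau     : vector of control forces and moments
```

Analyzing the structure of these terms (e.g., which entries of $ g(\eta) $ couple the movable mass to the pitch angle) is the usual route from such a model to control design.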
The current crisis of privacy is, or ought to be, especially surprising in the United States, because privacy concerns, historians and legal scholars attest, were a prime driver in the creation of the nation, and the erection and expansion of our basic freedoms. Our disregard for privacy is surprising for another reason: it defies predictions and expectations of how we are supposed to act under surveillance. Why, if we know we are watched – and we admit as much – is online behavior so shameless, seemingly open and free? Why do so many of us feel compelled to blare intimate details, and share mundane and embarrassing events with the whole world? What does that say about us? Is human nature changing before our very eyes, in the digital age, such that we show no compunction about living an utterly public life, in most all respects? How can we retain any enduring or grudging respect for privacy in this brave new world? Some people muster objections; some admit there is something wrong in privacy invasions – but what? We have a vocabulary of privacy, and a deep historical relationship to it (or so we are told), but hardly know what it means anymore, why it is of value, and worthy of defense. And in the digital age, privacy requires no modest or ordinary defense, but a monumental call to arms, to beat back the tidal wave of surveillance – which we invite, and facilitate.
Before anyone despairs over the demise of privacy, it is helpful to consider the history of this institution, and how it developed into its modern incarnation. In one respect, privacy is a very young value, and humans have long lived – and communities flourished – without it. Privacy has always been embattled. That is nothing new – you might even say that is its native state. When people managed to achieve some degree or form of privacy in the past, it was in much lesser quantities, and far more selectively and rarely enjoyed than advocates and critics say we need and deserve. The amount of privacy we have come to expect or take for granted in contemporary suburban living, by contrast, is almost absurdly generous. It is hard to imagine or conceive of an architecture and landscape that prioritizes privacy better. But appearances are deceiving: on one hand, and as I have been arguing thus far, our lives have never been more transparent within our suburban bubbles. Do we care? Better yet: what does this indicate we value in or about privacy? Does it suggest we esteem privacy at all – or something else altogether?
Many are worried for the fate and future of privacy, and rightly so. It is impossible to get anything done these days without leaving telltale digital trails, which eager spies scoop up. And it turns out you don’t have to divulge much for companies to learn a lot about you. Our digital monitors are busy figuring out how to plumb our intimate depths on the basis of seemingly innocuous and mundane details – details that we hardly give a thought to. What’s more, some companies, like Facebook, aim to compile profiles of you even if you are a relative troglodyte, and engage in little or no digital commerce at all. If you do all you can and should to protect your privacy, even making the ultimate sacrifice of foregoing digital communications altogether, this may not be enough. Facebook will simply learn about you from your neighbors, friends, and family, who invoke you, or imply your existence.
In this note we study the emergence of Hamiltonian Berge cycles in random r-uniform hypergraphs. For $r\geq 3$ we prove an optimal stopping time result that if edges are sequentially added to an initially empty r-graph, then as soon as the minimum degree is at least 2, the hypergraph with high probability has such a cycle. In particular, this determines the threshold probability for Berge Hamiltonicity of the Erdős–Rényi random r-graph, and we also show that the 2-out random r-graph with high probability has such a cycle. We obtain similar results for weak Berge cycles as well, thus resolving a conjecture of Poole.
The political state is a free-willed creation of reflective individual citizens. Such is the directive issued by Modern philosophy, and which we take for granted in liberal democracy. According to Social Contract theory, articulated principally by Hobbes, Locke and Rousseau, humans leave a state of nature to create and enter the political sphere, which they willingly, intelligently inaugurate through a mutual ‘contract.’ Living independently in a state of nature, while perhaps preferable in some respects, is ultimately unsustainable. For humans to effectively pursue and achieve personal ends, and find fulfillment, whatever shape that may take, they must sacrifice absolute freedom in nature, to live together in security. The salient point for the discussion at hand is that, as Social Contract theory has it, individuals conceive of their goals prior to or independently of the political community. The polis is reduced to a mere platform, if you will, a stage that permits or enables us to pursue what we want, in relative peace and harmony.
People have long sought out the public realm because of a desire for transcendence. The ancient Greeks sought it out because they wanted more than the oikos, or the family home, had to offer. Accordingly, the private realm was long deemed ‘privative’ in some essential way – it deprived us of what it means to be uniquely or distinctly human.1 In Classical times, the oikos was the realm of function and hierarchy. It was hierarchical because of the task at hand, the business of survival. But things were otherwise in the public realm, where men were free – for those lucky enough to be citizens, that is.2 When they entered the public realm, the realm of politics where freedom was exercised, people were released somewhat, or temporarily, from the tyranny of necessity, and could entertain higher matters and higher concerns – uniquely human concerns.
The Office for National Statistics (ONS) is currently undertaking a substantial research program into using price information scraped from online retailers in the Consumer Prices Index including occupiers’ housing costs (CPIH). In order to make full use of these data, we must classify them into the product types that make up the basket of goods and services used in the current collection. It is a common problem that the amount of labeled training data is limited and it is either impossible or impractical to manually increase the size of the training data, as is the case with web-scraped price data. We make use of a semi-supervised machine learning (ML) method, Label Propagation, to develop a pipeline to increase the number of labels available for classification. In this work, we use several techniques in succession and in parallel to enable higher confidence in the final, enlarged labeled dataset to be used in training a traditional ML classifier. We find promising results using this method on a test sample of data, achieving good precision and recall values for both the propagated labels and the classifiers trained from these labels. We have shown that by combining several techniques and averaging the results, we are able to increase the usability of a dataset with limited labeled training data, a common problem in using ML in real-world situations. In future work, we will investigate how this method can be scaled up for use in future CPIH calculations and the challenges this brings.
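The core idea, a small labeled seed set propagating labels to unlabeled records before a conventional classifier is trained, can be sketched with scikit-learn's Label Propagation. This is an illustrative sketch on synthetic two-dimensional data, not the ONS pipeline: the features, confidence threshold, and final classifier are all assumptions.

```python
# Sketch of semi-supervised label propagation: a few labeled price records
# spread their product-type labels to unlabeled ones, and only confidently
# propagated labels feed the final classifier. Data here are synthetic.
import numpy as np
from sklearn.semi_supervised import LabelPropagation
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Two synthetic product clusters standing in for real feature vectors.
X = np.vstack([rng.normal(0, 0.5, size=(100, 2)),
               rng.normal(3, 0.5, size=(100, 2))])
y_true = np.array([0] * 100 + [1] * 100)

# Only a handful of records carry labels; -1 marks the unlabeled ones.
y_semi = np.full(200, -1)
labeled_idx = np.concatenate([np.arange(5), 100 + np.arange(5)])
y_semi[labeled_idx] = y_true[labeled_idx]

lp = LabelPropagation(kernel="rbf", gamma=2.0).fit(X, y_semi)
y_propagated = lp.transduction_  # inferred labels for every record

# Keep only confidently propagated labels before training the final model.
conf = lp.label_distributions_.max(axis=1)
keep = conf > 0.9
clf = LogisticRegression().fit(X[keep], y_propagated[keep])
```

Thresholding on the propagation confidence is one simple way to trade dataset size for label quality, in the spirit of the "higher confidence" filtering the abstract describes.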
Health data have enormous potential to transform healthcare, health service design, research, and individual health management. However, health data collected by institutions tend to remain siloed within those institutions, limiting access by other services, individuals, or researchers. Further, health data generated outside health services (e.g., from wearable devices) may not be easily accessible or usable by individuals or connected to other parts of the health system. There are ongoing tensions between data protection and the use of data for the public good (e.g., research). Concurrently, there are a number of data platforms that provide ways to disrupt these traditional health data silos, giving greater control to individuals and communities. Through four case studies, this paper explores platforms providing new ways for health data to be used for personal data sharing, self-health management, research, and clinical care. The case studies cover four data platforms: PatientsLikeMe, Open Humans, Health Record Banks, and unforgettable.me. These are explored with regard to what they mean for data access, data control, and data governance. The case studies provide insight into a shift from institutional to individual data stewardship. Looking at emerging data governance models, such as data trusts and data commons, points to collective control over health data as an emerging approach to issues of data control. These shifts pose challenges as to how “traditional” health services make use of data collected on these platforms. Further, it raises broader policy questions regarding how to decide what public good data should be put towards.