Abstract. We survey arguments in which methods of classical descriptive set theory are used to obtain information about uncountable models of theories in a countable language. Silver's theorem about Borel equivalence relations is used in the computation of the uncountable spectrum of certain theories and the fact that analytic sets have the property of Baire is useful in the analysis of stable, unsuperstable theories.
In the early days of the development of model theory it was considered natural and was certainly beneficial to assume that the theories under investigation were in a countable language. The primary advantage of this assumption was the presence of the Omitting Types Theorem of Grzegorczyk, Mostowski, and Ryll-Nardzewski [1], which generalized arguments of Henkin [3] and Orey [8]. Following this, Vaught [13] gave a very pleasing analysis of the class of countable models of such a theory. This led to Morley's categoricity theorem [7] for certain classes of uncountable models of theories in a countable language.
The landscape was completely altered by the subsequent work of Shelah (see e.g. [11]). He saw that the salient features of Morley's proof did not require the assumption that the language be countable. Indeed, many of the notions that were central to Shelah's work, including instability, the fcp, the independence property and the strict order property, are local. That is, a theory possesses such a property if and only if some formula has the property. Consequently, the total number of formulas in the language is not relevant. Still other notions, such as superstability, are not local but can be described in terms of countable fragments of the theory. That is, a theory of any cardinality is superstable if and only if all of its reducts to countable fragments of the theory are superstable. Using a vast collection of machinery, Shelah was able to answer literally hundreds of questions about the class of uncountable models of certain theories. Most of his arguments do not depend on the cardinality of the underlying language. In particular, he gave a proof of Łoś's conjecture, that the analogue of Morley's theorem holds for theories in languages of any size. Somewhat curiously, whereas Shelah's methods were very good at classifying the uncountable models of a theory, they had considerably less to say about the countable models of a theory.
Abstract. A general model-theoretic theory of approximation is presented which encompasses approximation methods found in analysis in both standard and nonstandard settings. We first give a simple version of the main idea, in the classical metric space setting. This was inspired by work of Anderson and Henson. We inductively define the notions of a closed formula, closed forcing, and the set of approximations of a closed formula. It is shown that given a relatively compact sequence, a closed formula is forced if and only if all its approximations are eventually true, and also if and only if the formula is true at every limit point. Then, in the nonstandard setting, we prove harder analogous results using our theory of neometric spaces, where saturation arguments take the place of compactness arguments. These results shed light on well-known nonstandard constructions that produce new theorems about standard objects.
Introduction. One of the main uses of model theory outside of mathematical logic itself has been the introduction in the early sixties of nonstandard analysis by Abraham Robinson (see [18]). He showed how to apply nonstandard models of the appropriate language to a wide variety of problems in analysis. His construction captured the attention of mathematicians because it made the old idea of infinitesimal quantities available to modern mathematics (for a detailed history of the development of these ideas see the last chapter in [18]).
Robinson's original presentation, which relied heavily on the theory of types, has been “cleaned up”, so that today one does not have to be a logician in order to understand and use nonstandard analysis. Nevertheless there are close ties between model theory and developments that have originated from nonstandard practice. The purpose of this paper is to develop one of these ties: we give a general model theoretic theory of approximation which encompasses approximation methods found in both standard and nonstandard settings.
Edited by
Zoé Chatzidakis, Université de Paris VII (Denis Diderot); Peter Koepke, Rheinische Friedrich-Wilhelms-Universität Bonn; Wolfram Pohlers, Westfälische Wilhelms-Universität Münster, Germany
Abstract. We construct a model in which the first two strongly compact cardinals aren't supercompact yet satisfy significant indestructibility properties for their strong compactness.
Introduction and preliminaries. The study of indestructibility for non-supercompact strongly compact cardinals has been the subject of a great deal of investigation over the last few years, most notably in the papers [1, 6, 7, 3], and [15]. We refer readers to the introductory section of [3] for a thorough discussion of the relevant history. We note, however, that in spite of all of the work done, the basic question of whether it is consistent, relative to anything, for the first two strongly compact cardinals to be non-supercompact yet to satisfy significant indestructibility properties for their strong compactness had heretofore been left unanswered. This should be contrasted with the relative ease with which Laver's forcing of [18] iterates, to produce models such as the one given in [1] in which there is a proper class of supercompact cardinals and every supercompact cardinal κ has its supercompactness indestructible under κ-directed closed forcing.
The purpose of this paper is to provide an affirmative answer to the above question. Specifically, we will prove the following theorem.
THEOREM 1.1. It is consistent, relative to the existence of two supercompact cardinals, for the first two strongly compact cardinals to be non-supercompact yet to satisfy significant indestructibility properties for their strong compactness. Specifically, in our final model V^P, the first cardinal's strong compactness is indestructible under arbitrary directed closed forcing, and the second cardinal's strong compactness is indestructible under either trivial forcing or directed closed forcing that can be written in the form.
We note that Theorem 1.1 is a generalization, of a sort, of Theorem 1 of [3]. In the model constructed for that theorem, the first two strongly compact cardinals aren't supercompact (and in fact are the first two measurable cardinals), the first cardinal's strong compactness is fully indestructible under directed closed forcing, yet the second cardinal's measurability, but not necessarily its strong compactness, is indestructible under arbitrary directed closed forcing.
Abstract. We consider the distinction between abstract computability, in which computation is independent of data representations, and concrete computability, in which computations are dependent on data representations. The distinction is useful for current research in computability theories for continuous data and uncountable structures, including topological algebras and higher types. The distinction is also interesting in the seemingly simple case of discrete data and countable algebras. We give some theorems about equivalences and inequivalences between abstract models (e.g., computation with ‘while’ programs) and concrete models (e.g., computation via numberings) in the countable case.
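The abstract/concrete distinction can be illustrated with a small sketch (not from the paper; all names here are illustrative): an abstract 'while'-style program is parameterized by the structure's operations and never inspects how elements are represented, while a concrete computation works on numerical codes supplied by a numbering.

```python
# Illustrative sketch (not from the paper). An abstract program computes
# using only the operations of a structure, independent of representation;
# a concrete model computes on codes given by a numbering of the elements.

def abstract_double(x, add):
    """Abstract model: uses only the structure's 'add' operation."""
    return add(x, x)

def concrete_double(code):
    """Concrete model: elements of Z/5Z are represented by the codes 0..4
    under the numbering n -> n mod 5, and the operation acts on codes."""
    return (code + code) % 5

# Fixing the numbering and operations specializes the abstract program to
# the concrete structure, and the two computations agree on every element.
add_mod5 = lambda a, b: (a + b) % 5
assert all(abstract_double(c, add_mod5) == concrete_double(c) for c in range(5))
```

The point of the sketch is that `abstract_double` would work unchanged over any structure with a binary operation, whereas `concrete_double` is tied to one particular system of codes.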
Introduction. By a computability theory we mean a theory of functions and sets that are definable using a model of computation. By a model of computation we mean a model of some general method of calculating the value of a function or of deciding, or enumerating, the elements of a set. We allow the functions and sets to be made from any kind of data.
With this terminology, Classical Computability Theory on the set N of natural numbers is made up of many computability theories. The computable functions and computably enumerable sets are definable by scores of models of computation, based on scores of ideas about machines, programs, algorithms, specifications, rewriting systems, and calculi. It was an important early discovery, in 1936, that different models of computation can be shown to define the same classes of functions and sets. The fact that diverse computability models lead to the same classes of functions and sets on N is the main pillar supporting the Church-Turing Thesis, which gives the classical theory on N its extraordinary unity.
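The agreement between diverse models of computation can be seen in miniature in a toy sketch (not from the paper): the same function on N defined once in a 'while'-program style and once by primitive recursion, with the two definitions coinciding on all inputs.

```python
# Toy illustration (not from the paper): two "models of computation"
# defining the same function on the natural numbers.

def add_while(m, n):
    """Addition as a while-program: repeated increment."""
    result = m
    while n > 0:
        result += 1
        n -= 1
    return result

def add_rec(m, n):
    """Addition by primitive recursion: m + 0 = m, m + (n+1) = (m+n) + 1."""
    if n == 0:
        return m
    return add_rec(m, n - 1) + 1

# The two definitions agree everywhere, illustrating (in the smallest
# possible way) how different models can define the same class of functions.
assert all(add_while(m, n) == add_rec(m, n) == m + n
           for m in range(20) for n in range(20))
```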
Starting in the 1940s, computability theories have been created for other special sets of data, including:
Abstract. A set A is Martin-Löf random iff the class {A} does not have measure 0. A set A is PA-complete if one can compute relative to A a consistent and complete extension of Peano Arithmetic. It is shown that every Martin-Löf random set either permits one to solve the halting problem K or is not PA-complete. This result implies a negative answer to the question of Ambos-Spies and Kučera whether there is a Martin-Löf random set not above K which is also PA-complete.
Introduction. Gács [3] and Kučera [7, 8] showed that every set can be computed relative to a Martin-Löf random set. In particular, for every set B there is a Martin-Löf random set A such that, where K is the halting problem. A can even be chosen such that the reduction from B to A is a weak truth-table reduction; Merkle and Mihailović [12] give a simplified proof of this fact.
A natural question is whether it is necessary to go up to the degree of in order to find the random set A. Martin-Löf random sets can be found below every set which is PA-complete, so there are Martin-Löf random sets in low and in hyperimmune-free Turing degrees. A set A is called PA-complete if one can compute relative to A a complete and consistent extension of the set of first-order formulas provable in Peano Arithmetic. An easier and equivalent definition of being PA-complete is to say that given any partial-recursive and {0, 1}-valued function, one can compute relative to A a total extension Ψ of. One can of course choose Ψ such that also Ψ is {0, 1}-valued.
Extending all possible {0, 1}-valued partial-recursive functions is as difficult as computing a {0, 1}-valued DNR function. A diagonally nonrecursive (DNR) function f satisfies whenever is defined.
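For reference, the standard definition of diagonal nonrecursiveness from the literature (stated here with the usual notation, where $\varphi_n$ denotes the $n$-th partial recursive function) is:

```latex
% A function f is diagonally nonrecursive (DNR) if
f(n) \neq \varphi_n(n) \quad \text{whenever } \varphi_n(n) \text{ is defined.}
```

Intuitively, such an f diagonalizes against every program at its own index, which is why producing one is exactly as hard as totally extending every {0, 1}-valued partial-recursive function.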
Epistemology seems to enjoy an unexpectedly sexy reputation these days. A few years ago William Safire wrote a popular novel called The Sleeper Spy. It depicts a distinctly post-cold-war world in which it is no longer easy to tell the good guys – or, rather, the good spies – from the bad ones. To emphasize this sea change, Safire tells us that his Russian protagonist has not been trained in the military or the police, as he would have been during the old days, but as an epistemologist.
Jaakko Hintikka (2003b)
Conceptual Analysis
One often hears that philosophy largely concerns conceptual analysis. Conceptual analysis is enjoying a revival these days after having been put to sleep for a number of years, partly due to the stream of naturalism that has flooded the philosophical landscape for the past 50 years or so.
In contemporary mainstream epistemology, the goal of these new conceptual exercises is to spell out and elucidate some of the epistemologically significant notions, like knowledge, justification and rationality, that ordinary folk use on a daily basis. An integral part of the elucidation process is to stretch the usage of these concepts to the max in order to reveal their limitations and what these limitations in turn reveal about the nature of human cognition. Seen from this perspective, conceptual analysis is focused on clarifying how words are used in everyday epistemic contexts.
The actual ‘stretching’ is performed by applying the method of ‘consulting intuitions about possible cases,’ a method for which Jackson (1994) has recently made a case. Jackson takes conceptual analysis to be an indispensable part of intellectual activity in general.