The ingestion of water in some form is widely recognized as essential for human life. But we usually do not consider water as food because it does not contain any of those substances we regard as nutriments. Yet if its status as a foodstuff remains ambiguous, it is far less so than it has been through much of human history. Water (or more properly “waters,” for it is only in the last two centuries that it can really have been viewed as a singular substance) has been considered as food, a solvent for food, a pharmaceutical substance, a lethal substance, a characteristic physiological state, and a spiritual or quasi-spiritual entity.
This chapter raises questions about what sort of substance water has been conceived to be and what nutritional role it has been held to have. Moreover it also explores what we know of the history of the kinds of waters that were viewed as suitable to drink – with regard to their origins, the means used to determine their potability, and their preparation or purification. It also has a little to say about historical knowledge of drinking-water habits (i.e., how much water did people drink at different times and situations?) and water consumption as a means of disease transmission.
What Water Is
Modern notions of water as a compound chemical substance more or less laden with dissolved or suspended minerals, gases, microorganisms, or organic detritus have been held at best for only the last two centuries. Even earlier ideas of water as one of the four (or five) elements will mislead us, for in many such schemes elements were less fundamental substances than dynamic principles (e.g., in the case of water, the dynamic tendency is to wet things, cool them, and dissolve them) or generic labels for regular combinations of qualities.
An adverse food reaction is defined as any untoward reaction following the ingestion of food (Lifshitz 1988). These reactions generally fall into two categories: food intolerance and food hypersensitivity. Intolerances are nonimmunologic (Sampson and Cooke 1990) and are responsible for most adverse food reactions. They may be idiosyncratic, due to metabolic disorders, or caused by pharmacological substances such as toxins or drugs present in food. Food additives can also be a cause of food intolerance or hypersensitivity and can produce respiratory and gastrointestinal complaints (Lifshitz 1988).
True food hypersensitivity, or allergy, is an adverse food reaction involving immunologic mechanisms. It is initiated by production of specific antibodies, otherwise known as immunoglobulins, in reaction to food constituents. The body manufactures antibodies as part of its regular defense system against foreign invaders such as viruses and bacteria. In certain individuals, the immune system is triggered to elicit a specific antibody, called immunoglobulin E (IgE), against various environmental substances like pollens, pet dander, insect venoms, and foods. The common immunologic mechanisms involved can cause food-allergic reactions to resemble allergic reactions to honeybee stings or penicillin.
Mechanism
In food-allergic individuals, IgE antibodies are produced against food components and circulate in the blood. Upon reaching certain cells, known as mast cells and basophils, the IgE becomes fixed to the cell surface and remains there. These cells contain high quantities of special receptors for IgE; rat mast cells have been found to contain 2–5 × 10⁵ receptors per cell (Mendoza and Metzger 1976). A large portion of the IgE in the body is fixed to these cells, and when they become armed with IgE, they are said to be “sensitized.”
Rye (Secale cereale L.) is closely related to the genus Triticum (which includes bread wheat, durum wheat, spelt, and the like) and has sometimes been included within that genus (Mansfeld 1986: 1447). In fact, it was possible to breed Triticale, a hybrid of Triticum and Secale, which is cultivated today (Mansfeld 1986: 1449).
Cultivated rye (Secale cereale) is also so closely related genetically to the wild rye (Secale montanum) that both species would appear to have had the same ancestors. Yet to say that the cultivated rye plant derived from the wild one is an oversimplification because both plants have been changing their genetic makeup since speciation between the wild and cultivated plants first occurred.
The cultigen Secale cereale was brought to many parts of the world, but wild rye still grows in the area where cultivated rye originated, which embraces the mountains of Turkey, northwestern Iran, Caucasia, and Transcaucasia (Zohary and Hopf 1988: 64–5; Behre 1992: 142).
The distribution area of wild rye is slightly different from the area of origin of other Near Eastern crops. Wild rye is indigenous to areas north of the range of the wild Triticum and Hordeum species; these areas have a more continental climate with dry summers and very cold, dry winters. The environmental requirements of cultivated rye reflect these conditions of coldness and dryness: It has a germination temperature of only 1 to 2 degrees Centigrade, which is lower than that of other crops. Indeed, low temperatures are necessary to trigger sprouting (Behre 1992: 145), and the plant grows even in winter if the temperature exceeds 0 degrees Centigrade, although rye can suffer from a long-lasting snow cover. In spring it grows quickly, so that the green plant with unripe grains reaches full height before the summer drought begins (Hegi 1935: 498–9). Obviously, these characteristics make rye a good winter crop. It is sown in autumn, grows in winter and spring, and ripens and is harvested in summer – a growth cycle that is well adapted to continental and even less favorable climatic conditions. There is also another cultigen of rye – summer rye – which is grown as a summer crop. But because of a low yield and unreliability, it is rather uncommon today (Hegi 1935: 497).
Despite swelling populations around much of the globe, the enormous expansion of agricultural productivity, the rapid development of transport facilities, and the establishment of globally interlinked market networks have made it theoretically possible to provide adequate food for all. Yet famine and hunger still persist and, indeed, proliferate in some parts of the world. Their durable character represents a perplexing and wholly unnecessary tragedy (Drèze and Sen 1989; Watts and Bohle 1993b). Although the extent of hunger in the world will never be known with precision (Millman 1990), it has been estimated that in the early 1990s, more than 500 million adults and children experienced continuous hunger and even more people were deemed vulnerable to hunger, with over 1 billion facing nutritional deficiencies (WIHD 1992).
The community concerned with world hunger is far from unanimous in its understanding of the situation. S. Millman (1990) likens the situation to the parable of the elephant and the blind men, whereby hunger is perceived differently by those encountering different aspects of it. It is significant that these varying perceptions correspond to particular disciplinary or professional orientations, leading to different diagnoses of the nature of the problem and its underlying causes and implying distinct foci for policy interventions.
Problems of food supply, then, are among the most bewildering, diffuse, and frustrating of humankind’s contemporary dilemmas. Within the lifetime of each of us, official views of the world food situation have oscillated from dire predictions of starving hordes to expectations of a nirvana of plentiful food, then back to impending doom. One expert states that famine is imminent, whereas another declares that our ability to adequately feed the world’s people is finally within reach.
The history of the discovery of the B vitamins includes both the recognition that particular diseases can result from dietary inadequacies and the subsequent isolation of specific nutrients from foods that have been found to prevent and to cure those diseases. Most of these dietary deficiencies were first recognized in humans, but in certain instances, the deficiency was produced by feeding restricted diets to experimental animals.
After each of the B vitamins was isolated, extensive biochemical and physiological studies were conducted to define their specific functions. States of B vitamin dependency were also discovered in which the need for a particular vitamin exceeded the physiological level. These vitamin dependencies were found either to have a genetic basis or to be drug-induced.
Moreover, recognition that certain synthetic compounds bearing close chemical resemblance to B vitamins could block the activity of naturally occurring vitamins has led to a better understanding of vitaminic functions and has provided us with certain drugs that are used in cancer chemotherapy and in the treatment of infections and inflammatory diseases.
Vitamin B research, as well as clinical and public health information, is summarized here to emphasize some significant advances in our knowledge of these compounds. But first a note on nomenclature seems appropriate.
The mallard, Anas platyrhynchos, is the most ubiquitous taxon in the subfamily Anatinae of the family Anatidae. It is the ancestor of most domestic ducks, the males of which still sport the ancestral curling feathers of the upper tail (Delacour 1956–64; Gooders 1975; Gooders and Boyer 1986). Because the wild mallard is so widespread in the Northern Hemisphere, it is extremely likely that it was widely utilized by humans and probably domesticated in different areas at different times. The amount of variability in the domestic duck is very small compared with that found in the domestic chicken (Thomson 1964), and it would seem that present-day domestic ducks evolved gradually (Woelfle 1967), in the process becoming larger than the wild type, with much more variety in color, size, and gait (Clayton 1984). Domestic ducks have also lost the ability to fly.
The excellent flavor of duck flesh (as well as the eggs) has been enjoyed from prehistoric times to the present day. An important incentive in breeding ducks for meat has been the fact that they have a fast growth rate and can be killed as young as 6 to 7 weeks of age and still be palatable. A disadvantage, however, is that duck carcasses are very fatty (Clayton 1984).
Ducks are raised in large numbers in many Western countries, such as the Netherlands, Britain, and the United States, although intensive duck production has occurred only in the last 20 years (Clayton 1984). In Britain, most commercial ducks are found in Norfolk (although some are kept in Aberdeen and Dumfries), but these constitute only about 1 percent of all poultry in the country (Urquhart 1983). Ducks are less prone to disease than hens but eat more food. Unfortunately, their eggs are unpopular with British consumers because they are thought to be unclean. Ducks destined for the supermarkets are killed when they are from 7 to 9 weeks old.
The human body requires an adequate supply of ascorbic acid (L-xyloascorbic acid or vitamin C) to enable it to function normally, and a lack of the vitamin results in the emergence of the condition known as scurvy (scorbutus or avitaminosis C). Unlike plants, and the majority of animals thus far studied, humans are unable to produce ascorbic acid endogenously and, thus, are dependent upon dietary sources – mainly fruit and vegetables – for a supply of the vitamin. In the absence of vitamin C, formation of collagen, an essential glycoprotein component of connective tissue, is impaired, which is believed to be the main underlying biochemical lesion in scurvy (Counsell and Hornig 1981; Englard and Seifter 1986).
The earliest signs of scurvy (fatigue and lassitude) may emerge in humans some 12 weeks after removal of dietary vitamin C, and the more overt traditional signs (hemorrhagic spots under the skin [petechiae], softening of the gums, and defective wound healing) after some 17 to 26 weeks of deprivation.
In 1753, James Lind concluded his pioneer study of scurvy with a chronological Bibliotheca Scorbutica, which imparted a mild historical flavor to his text (Stewart and Guthrie 1953). But more than a century was to pass before the first sustained efforts to produce a history of the disease emerged. One of these was J. Maré’s 200-page article in the Dictionnaire Encyclopédique des Sciences Médicales in 1880, and the second was August Hirsch’s 60-page article in his Handbook of Geographical and Historical Pathology (1883–6).
The eminent medical historian Henry E. Sigerist once noted that “[t]here is no sharp borderline between food and drug,” and that both dietetic and pharmacological therapies were “born of instinct” (Sigerist 1951: 114–15). Today we tend to focus our studies of food on its nutritive values in promoting growth and health and in preventing disease, but for many centuries past, food had an additional, specifically medical role – as a remedy for illness.
The United States Food, Drug, and Cosmetic Act, signed into law June 27, 1938, provides no clearer differentiation between “food” and “drug” than Sigerist could. According to the current wording of that legislation, which updated the Pure Food and Drug Act of 1906, “the term ‘food’ means (1) articles used for food or drink for man or other animals, (2) chewing gum, and (3) articles used for components for any other such article,” whereas “the term ‘drug’ means (A) articles recognized in the official United States Pharmacopoeias [and several other compendia]; and (B) articles intended for use in the diagnosis, cure, mitigation, treatment, or prevention of disease in man or other animals; and (C) articles (other than food) intended to affect the structure or any function of the body of man or other animals; and (D) articles intended for use as a component of any articles specified in clause (A), (B), or (C).” Under clause (B) above, many items that have traditionally been considered foods might also be regarded as drugs under federal law, although they seldom are. The Food and Drug Administration (FDA) can intervene in cases involving food only when it judges an item to be misleadingly labeled as a “food”; it specifically excludes vitamins from the category “drugs.”
The complex of clinical disturbances long known as beriberi has been recognized since early in the twentieth century as arising because of a deficiency of thiamine. Like others in the group of B vitamins, thiamine has essential coenzyme functions in intermediary metabolism. The principal role of this water-soluble molecule is that of precursor of thiamine pyrophosphate (thiamine diphosphate), a coenzyme (often referred to as cocarboxylase) in energy generation through oxidative decarboxylation of alpha-ketoacids, such as pyruvic acid and alpha-ketoglutaric acid. Thiamine pyrophosphate serves also as the coenzyme for transketolase, a catalyst in the pentose phosphate pathway of glucose metabolism. Measurement of transketolase activity in erythrocytes and stimulation of activity by added thiamine pyrophosphate is the most convenient and sensitive method for detecting human thiamine deficiency. As the pyrophosphate and/or the triphosphate, thiamine appears also to have a role in facilitating conduction in peripheral nerves.
Like most vitamins, thiamine cannot be synthesized in the body and must be acquired in the diet. It is found in many foods and is most abundant in grains, legumes, nuts, and yeast. All meats and most dairy products contain some thiamine, but the richest sources are pork and eggs. Milk, however, is not a rich source. As a water-soluble vitamin, thiamine is easily lost in cooking water. Fish can supply good amounts, but with fermentation of raw fish, the enzyme thiaminase may alter the thiamine molecule, blocking its biological activity. Dietary thiamine is actively absorbed, mainly in the small intestine. The recommended daily dietary allowance is 1.2 to 1.5 milligrams for adult males and 1.0 to 1.1 milligrams for adult females, depending upon age, with a 50 percent increase in requirements during pregnancy and lactation. Up to about 25 milligrams can be stored by a healthy person, especially in heart muscle, brain, liver, kidney, and skeletal muscle.
In writing the history of culinary practices, there is a tendency to emphasize the ethnic character of diets (González 1988). Yet nowhere are historical entanglements more apparent than in the international character of modern cuisine, even if explicit ethnic territories are strongly defended. Foods are often defined with apparent regard to national origin: Indian corn, Irish potatoes, Italian tomatoes, Dutch chocolate, and Hawaiian pineapple, to name but a few. However, the plants that form the basis of many European cuisines in fact originated in the Americas (Keegan 1992), and American diets were transformed in what Alfred Crosby (1986) has described as the creation of the neo-Europes.
“You call it corn, we call it maize.” Contrary to the American television commercial in which a very Navaho-looking woman makes that statement, the word is actually of Taino origin. Peter Martyr was among the first Europeans to describe this plant that the native West Indians called maíz (Zea mays) (Sauer 1966: 55). Other Taino words for plants and animals have also entered the English lexicon, including cazabi and yuca (Manihot esculenta Crantz), guayaba (Psidium guajava L.), bixa (Bixa orellana L.), iguana, and manati (Trichechus manatus) (Oviedo [1526] 1959: 13–16; Taylor 1977: 20–1).
Cultigens from the circum-Caribbean lowlands have also had a significant effect (Keegan et al. 1992). Tomatoes (Lycopersicon esculentum Mill.) were first encountered in coastal Mexico, where the Spanish were also treated to a drink called chocolatl, a blend of cacao (Theobroma cacao L.), peppers (Capsicum spp.), and other spices (including Bixa orellana L.). Cacao won immediate acceptance; together, it and vanilla (Vanilla spp.), a semidomesticated lowland orchid, have become the most important flavorings in the world. In contrast, the tomatl (tomato) languished under the specter of its membership in the “deadly” nightshade family of plants. First grown as an ornamental, and only much later as food, the tomato eventually reshaped Italian cuisine.
Vitamin A is a fat-soluble substance essential to the health, survival, and reproduction of all vertebrates. As with all vitamins, it is needed in only small amounts in the human diet, about 1 to 1.5 milligrams a day. Vitamin A does not occur in the plant kingdom, but plants supply animals with precursors (or provitamins), such as beta-carotene and other carotene-related compounds (carotenoids), that are converted to vitamin A in the intestinal mucosa of animals and humans. Beta-carotene (and other carotenoids) are abundant in all photosynthesizing parts of plants (green leaves), as well as in yellow and red vegetables. Vitamin A, also known as “retinol,” is itself a precursor of several substances active in the vertebrate organism; these are collectively termed “retinoids.” One retinoid is retinoic acid, an oxidation product of retinol, formed in the liver and other organs and existing in different chemical isomers, such as all-trans retinoic acid and 9-cis-retinoic acid, with different functions. Other retinoids are all-trans-retinaldehyde and its 11-cis-isomer. The latter is active in the retina of the eye, forming the light-sensitive pigment rhodopsin by combination with the protein opsin. In the liver, retinol is stored in the form of its ester (retinyl palmitate).
Retinoids in the animal organism are generally not found in the free state but are bound to specific proteins. Thus, in the blood, retinol is carried by a retinol-binding protein and, within cells, by an intracellular retinol-binding protein. Retinoic acid and retinaldehyde are carried by specific intracellular binding proteins. When carrying out its hormonal function, retinoic acid combines with another set of proteins, called retinoic acid receptors, located in the cell nucleus. The retinoic acid-receptor complex can then interact with specific genes at sites known as retinoic acid response elements, thereby activating these genes and causing them to stimulate (or repress) the expression of specific proteins or enzymes involved in embryonic development, cell differentiation, metabolism, or growth.
Dietary reconstruction for past populations holds significant interest as it relates to biological and cultural adaptation, stability, and change. Although archaeological recovery of floral and faunal remains within a prehistoric or historical context provides some direct evidence of the presence (and sometimes quantity) of potential food resources, indirect evidence for the dietary significance of such foodstuffs frequently must be deduced from other bioarchaeological data.
The types of data with dietary significance range from recovered plant and animal remains through evidence of pathology associated with diet, growth disruption patterns, and coprolite contents. Other traditional approaches involving the people themselves – as represented by skeletal remains – include demographic (Buikstra and Mielke 1985) and metabolic (Gilbert 1985) stress patterns.
In addition to bioanthropological analyses, reconstruction of environmental factors and the availability and limits of food species and their distribution for a population with a particular size, technology, and subsistence base are typical components within an archaeological reconstruction. Although these physical aspects are significant, the distribution, or more likely the restriction, of particular foodstuffs from certain segments of the population (because of sex, age, status, food avoidance, or food taboos) may be important cultural system features. The seasonal availability of food and its procurement, preservation, and preparation may also have influenced group dietary patterns and nutritional status (Wing and Brown 1979).
Analysis of skeletal remains may also provide some direct evidence of diet. Type and adequacy of diet have long been of interest to physical anthropologists, especially osteologists and paleopathologists (Gilbert and Mielke 1985; Larsen 1987). More recently, direct chemical analysis of bones and teeth has been attempted in an effort to assess the body’s metabolism and storage of nutritive minerals and other elements. L. L. Klepinger (1984) has reviewed the potential application of this approach for nutritional assessment and summarized the early findings reported in the anthropological literature. (In addition, see Volume 14 of the Journal of Human Evolution [1985], which contains significant research surveys to that date.)
One of the most important of today’s oil crops, the sunflower is a unique contribution of temperate North America to the world’s major food plants. In addition to its superior oil, the seed of the sunflower is much appreciated as a food. Other parts of the plant were used for a variety of purposes by Native Americans. Today the species is also widely grown as an ornamental for its large showy heads.
Biology
Scientifically, the sunflower crop plant, known as Helianthus annuus var. macrocarpus, is a member of the family Asteraceae. It is an annual, is unbranched, grows from 1 to 3 meters tall, and bears a single large head up to 76 centimeters (cm) in diameter. Each head contains showy yellow sterile ray flowers and up to 8,000 smaller disk flowers. The latter produce the fruits, technically known as achenes, but commonly called seeds. The fruits, from 6 to 16 millimeters (mm) in length, contain a single seed.
In addition to the cultivated variety, the sunflower also includes branched, smaller-headed varieties (Helianthus annuus var. annuus and Helianthus annuus var. lenticularis) that are common as weeds or wild plants in North America from southern Canada to northern Mexico. Forms of the sunflower, particularly those with double flowers or red ray flowers, are cultivated as ornamentals, but more so in Europe than in North America (Heiser 1976).
We began work on the Cambridge History and Culture of Food and Nutrition Project even as we were still reading the page proofs for The Cambridge World History of Human Disease, published in 1993. At some point in that effort we had begun to conceive of continuing our history of human health by moving into food and nutrition – an area that did more than simply focus on the breakdown of that health. For the history of disease we had something of a model provided by August Hirsch in his three-volume Handbook of Geographical and Historical Pathology (London, 1883–6). Yet there was no “Handbook of Geographical and Historical Food and Nutrition” to light the way for the present volumes, and thus they would be unique.
Fortunately, there was no lack of expertise available; it came from some 200 authors and board members, representing a score of disciplines ranging from agronomy to zoology. This undertaking, then, like its predecessor, represents a collective interdisciplinary and international effort, aimed in this case at encapsulating what is known of the history of food and nutrition throughout humankind’s stay on the planet. We hope that, together, these volumes on nutrition and the earlier one on disease will provide scholars of the future – as well as those of the present – a glimpse of what is known (and not known) about human health as the twentieth century comes to a close.
Two of our major themes are embedded in the title. Food, of course, is central to history; without it, there would be no life and thus no history, and we devote considerable space to providing a history of the most important foodstuffs across the globe. To some extent, these treatments are quantitative, whereas Nutrition – the body’s need for foods and the uses it makes of them – has had much to do with shaping the quality of human life. Accordingly, we have placed a considerable array of nutritional topics in longitudinal contexts to illustrate their importance to our past and present and to suggest something of our nutritional future.
In considering the human body’s demand for food and nutrition, the simple need for liquid refreshment is sometimes overlooked. Although this fundamental physiological requirement can be satisfied by drinking an adequate supply of pure water, most people, when given a choice, prefer to achieve the required level of liquid intake with a variety of flavored drinks to stimulate the palate.
Soft drinks are usually defined as nonalcoholic, water-based drinks, although a few may contain alcohol, albeit in quantities too small to warrant their classification as “hard liquor.” Soft drinks are usually sweetened – soda water being an obvious exception – and flavored with food acids, essences, and sometimes fruit juices. They are often carbonated – that is, charged with carbon dioxide gas – and, indeed, in North America are referred to as carbonated beverages. In some countries, including the United Kingdom, there is a significant retail market for concentrated soft drinks intended for dilution at home before consumption. Soft drinks in powdered form are similarly marketed for preparation at home. In addition, uncarbonated, ready-to-drink soft drinks are also found.
The flavors of soft drinks may be derived from fruits, nuts, berries, roots, herbs, and other plants. Moreover, fruit (and to some extent vegetable) juices, as such, have grown in popularity in recent years and have come to be included among the soft drinks. In many countries, soft drinks are distinguished from hard liquor by the higher taxation of stronger drinks, for example through excise duties, and the term “nonalcoholic” can sometimes mean merely “non-excisable.” However, soft drinks are frequently subject to other taxes, though usually at lower levels than those that are levied on alcoholic drinks. Soft drinks are often distinguished from medicines by legislation. In the past, these distinctions were less precise, and a historical study of soft drinks will include products that began with a mainly medicinal purpose but are regarded today as simple refreshment.
The basic ingredients that have historically comprised the southern European diet are well known and have recently received much attention for their health-promoting benefits: These are bread, wine, olive oil, and a wide variety of fruits and vegetables supplemented by fish, dairy products, and a relatively small amount of animal flesh.
Less known, however, are the historical forces that shaped how southern Europeans think about food. Essentially, three rival systems have influenced the culture of food in southern Europe since late antiquity, and in various combinations these systems have informed eating patterns at all levels of society.
The most pervasive of these food systems might be called “Christian,” although its roots are not necessarily found in the teachings of Jesus and his disciples. It encompasses monastic asceticism as well as the calendar of fasts and feasts that have historically regulated food consumption. In all its manifestations, the ideal goal of Christian foodways has been spiritual purity through the control of bodily urges, though this can easily be lost sight of when rules are bent and holidays become occasions for excess.
The second major system is medical in origin and has gained and lost popularity in the past two millennia depending on the state of nutritional science, though it continues to influence common beliefs to this day. The object of this system of “humoral physiology,” of course, is the maintenance or recovery of health by means of dietary regimen.
Lastly, the “courtly” or gastronomic food culture has also profoundly influenced southern Europe, radiating from urban centers of power such as Rome, Naples, Venice, and the courts of Aragon, Castile, and Provence. Its goal is ostensibly pleasure, but this is usually mixed with motives of conscious ostentation in order to impress guests.