Ostrinia furnacalis Guenée (Lepidoptera: Crambidae) is a key lepidopteran pest affecting maize production across Asia. While its general biology has been well studied, the phenomenon of pupal ring formation remains poorly understood. This study examined the factors influencing pupal ring formation under controlled laboratory conditions. Results showed that pupal rings were formed exclusively when larvae were reared on an artificial diet, with no ring formation observed on corn-stalks. Females exhibited a significantly higher tendency to participate in ring formation than males. Additionally, male participation increased proportionally with the number of rings formed, a pattern not observed in females. The size of the rearing arena significantly influenced ring formation, with smaller arenas (6 cm diameter) promoting more frequent pairing, particularly among females. Temperature also played a significant role: lower participation rates were recorded at 22 °C compared to 25 °C and 28 °C, although the number of rings formed did not differ significantly across temperatures. Developmental stage and sex composition further influenced pairing behaviour; pupal rings formed only among individuals of similar maturity, and male participation was significantly reduced in all-male groups compared to mixed-sex groups. These findings suggest that pupal ring formation in O. furnacalis is modulated by dietary substrate, larval sex, environmental conditions, and developmental synchrony, offering new insights into the behavioural ecology of this pest.
Asian corn borer, Ostrinia furnacalis Guenée (Lepidoptera: Crambidae), is a major pest in corn production, and its management remains a significant challenge. Current control methods, which rely heavily on synthetic chemical pesticides, are environmentally detrimental and unsustainable, necessitating the development of eco-friendly alternatives. This study investigates the potential of the entomopathogenic nematode Steinernema carpocapsae as a biological control agent for O. furnacalis pupae, focusing on its infection efficacy and the factors influencing its performance. We conducted a series of laboratory experiments to evaluate the effects of distance, pupal developmental stage, soil depth, and light conditions on nematode attraction, pupal mortality, and sublethal impacts on adult longevity and oviposition. Results demonstrated that S. carpocapsae exhibited the highest attraction to pupae at a 3 cm distance, with infection declining significantly at greater distances. Younger pupae (<12 h old) were more attractive to nematodes than older pupae, and female pupae were preferred over males. Nematode infection was highest on the head and thorax of pupae, with a significant reduction in infection observed after 24 h. Infection caused 100% mortality in pupae within 2 cm soil depth, though efficacy was reduced under light conditions. Sublethal effects included a significant reduction in the longevity of infected adults and a decrease in the number of eggs laid by infected females compared to controls. These findings underscore the potential of S. carpocapsae as an effective biocontrol agent for sustainable pest management in corn production, offering a viable alternative to chemical pesticides.
The Asian corn borer (ACB), Ostrinia furnacalis (Guenée, 1854), is a serious pest of several crops and a particularly destructive pest of maize and other cereals throughout most of Asia, including China, the Philippines, Indonesia, Malaysia, Thailand, Sri Lanka, India, Bangladesh, Japan, Korea, Vietnam, Laos, Myanmar, Afghanistan, Pakistan and Cambodia. It has long been known as a pest in South-east Asia and has since invaded other parts of Asia, the Solomon Islands, parts of Africa and certain regions of Australia and Russia. Consequently, worldwide efforts to develop new control strategies for O. furnacalis management have increased. In this article, we provide a comprehensive review of the ACB covering its (i) distribution (geographic range and seasonal variations), (ii) morphology and ecology (taxonomy, life history, host plants and economic importance) and (iii) management strategies (which include agroecological approaches, mating disruption, integrated genetic approaches, and chemical as well as biological control). We conclude this review with recommendations for improving eco-friendly pest management strategies to enhance the sustainable management of ACB in infested areas.
Validating the theoretical work on Rayleigh–Taylor instability (RTI) through experiments with an exceptionally clean and well-characterized initial condition has been a long-standing challenge. Experiments were conducted to study the three-dimensional RTI of an SF$_6$–air interface at moderate Atwood numbers. A novel soap film technique was developed to create a discontinuous gaseous interface with controllable initial conditions. Spectrum analysis revealed that the initial perturbation of the soap film interface is half the size of an entire single-mode perturbation. The correlation between the initial interface perturbation and Atwood numbers was determined. Owing to the steep and highly curved shape of the initial soap film interface, the early-time evolution of RTI exhibits significant nonlinearity. In the quasi-steady regime, various potential flow models accurately predict the late-time bubble velocities when the channel width is taken as the perturbation wavelength. In contrast, the late-time spike velocities are described by these potential flow models using the wavelength of the entire single-mode perturbation. These findings indicate that the bubble evolution is influenced primarily by the spatial constraint imposed by the walls, while the spike evolution is influenced mainly by the initial curvature of the spike tip. Consequently, a recent potential flow model was adopted to describe the time-varying amplitude growth induced by RTI. Furthermore, the self-similar growth factors for bubbles and spikes were determined from the experiments and compared with existing studies, revealing that a large amplitude in the initial soap film interface promotes spike development.
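As an orientation for how the wavelength enters such quasi-steady predictions, one widely cited potential flow result (Goncharov, 2002) gives the asymptotic bubble velocity as
$$V_b^{\infty} = \sqrt{\frac{2A}{1+A}\,\frac{g}{Ck}}, \qquad k = \frac{2\pi}{\lambda},$$
with $C = 3$ in two dimensions and $C = 1$ in three dimensions, while the self-similar regime is commonly characterised by bubble and spike amplitudes $h_{b,s} = \alpha_{b,s}\,A\,g\,t^{2}$. These are standard forms quoted here for reference only, not necessarily the specific models adopted in the study.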
Evidence for necrotising otitis externa (NOE) diagnosis and management is limited, and outcome reporting is heterogeneous. International best practice guidelines were used to develop consensus diagnostic criteria and a core outcome set (COS).
Methods
The study was pre-registered on the Core Outcome Measures in Effectiveness Trials (COMET) database. Systematic literature review identified candidate items. Patient-centred items were identified via a qualitative study. Items and their definitions were refined by multidisciplinary stakeholders in a two-round Delphi exercise and subsequent consensus meeting.
Results
The final COS incorporates 36 items within 12 themes: Signs and symptoms; Pain; Advanced Disease Indicators; Complications; Survival; Antibiotic regimes and side effects; Patient comorbidities; Non-antibiotic treatments; Patient compliance; Duration and cessation of treatment; Relapse and readmission; Multidisciplinary team management.
Consensus diagnostic criteria include 12 items within 6 themes: Signs and symptoms (oedema, otorrhoea, granulation); Pain (otalgia, nocturnal otalgia); Investigations (microbiology [does not have to be positive], histology [malignancy excluded], positive CT and MRI); Persistent symptoms despite local and/or systemic treatment for at least two weeks; At least one risk factor for impaired immune response; Indicators of advanced disease (not obligatory but must be reported when present at diagnosis). Stakeholders were unanimous that there is no role for secondary, graded, or optional diagnostic items. The consensus meeting identified themes for future research.
Conclusion
The adoption of consensus-defined diagnostic criteria and COS facilitates standardised research reporting and robust data synthesis. Inclusion of patient and professional perspectives ensures best practice stakeholder engagement.
Comprehensive geriatric assessment (CGA) has been one of the cornerstones of geriatric medicine since its introduction by Marjory Warren in 1936. This kind of assessment is defined as a multidimensional and multidisciplinary process of identifying medical, social, and functional needs and developing an integrated care plan designed to meet the patient's needs. The practice and applications of CGA have been used to various degrees in mainstream care for older people in the UK and internationally.
Some limitations still exist around the wider implementation of CGA, as its practice relies on members of the multidisciplinary team (MDT) and on effective communication between them, the patients, and their families. This kind of assessment has been criticised for not adequately acknowledging frailty and for not using patient-reported outcome measures to test its efficacy.
Randomised controlled studies, systematic reviews, and meta-analyses provided considerable evidence for the clinical and financial effectiveness of CGA in various hospital specialties. However, there are still concerns about the generalisability of CGA in community settings. Further research to identify target populations for CGA-led interventions and a consensus on outcome measures are required to realise CGA benefits.
In this chapter we describe required skills and practical tips to deliver CGA across a variety of settings.
Older people are one of the largest populations requiring hospital care, and demand is expected to rise. There is a compelling need to transform hospital environments to meet older people's physical, psychological, and emotional needs. In the UK, certain hospital circumstances such as ward configuration, mealtimes, noise levels, and visiting hours can be detrimental to patients admitted with delirium and to those living with dementia. In rehabilitation settings, lack of meaningful activities, isolation, and boredom are additional key challenges.
Models of good hospital practice catering for older people exist, both in the UK and internationally, and there is strong evidence for their clinical effectiveness. Environmental strategies to maintain orientation and enhance safety in hospital are crucial for a positive experience. Arts-based programmes in acute care settings can improve the experience of a hospital admission.
A cultural shift is warranted to champion the delivery of an elderly-friendly service. Creating the right environment requires a hospital-wide system, a ward-based service, and a specially trained clinical team. In this chapter we will present examples of essential ingredients for hospitals and wards, and desirable qualities in clinicians who work in collaboration to deliver the best outcomes for an older population.
The colluvium and saprolite deposits in the Fran Ali area (Oued Laou, northern Morocco) constitute the main source of raw materials used in traditional pottery. These materials are becoming scarce, however, so alternative materials with the same characteristics are needed; this would ensure the sustainability of pottery activities in the area. The objective of the present study was to examine ten representative samples of clayey materials extracted from the Fran Ali area, i.e. the Ikhadimene, Dar Haddoune, Ihadounene, Aqqbat Ajjoua, and Isalahene sites. The geological materials consist mainly of grayish to brownish phyllites, thin layers of yellowish clay, thicker intervals of reddish-yellow soils ranging in depth from 1 to 4 m, and reddish colluvium soils. The physical properties of these materials were determined using semi-wet sieving and Atterberg limit tests, while chemical, mineralogical, and thermal properties were obtained from the methylene blue test (MBT), the calcimetry test, X-ray fluorescence spectrometry (XRF), X-ray diffraction (XRD), and thermogravimetric and differential thermal (TGA/DTGA) analysis. The results suggest that the soils contain 21–35% clay, 28–34% silt, and 37–52% sand. They are moderately plastic, with methylene blue adsorption capacities ranging from 3 to 7% and minimal carbonate (CaCO3) contents (1–4%). Samples are dominated by SiO2 (51–57%), Al2O3 (17–21%), and Fe2O3 (8–10%). Mineralogically, they are composed of illite (19–27%), chlorite (0–22%), kaolinite (5–9%), and quartz (29–32%). Thermal analysis showed a relatively large mass loss of ~10%. The results indicate that this raw material is acceptable for pottery fabrication, given the small proportion of irregular interlayer content and its average geotechnical properties. However, extraction of the colluvium material is not sustainable because of its relative scarcity. Given the mineralogical similarity between the weathered layers (colluvium) and their parent rock (shales), the present results suggest that the latter is a suitable alternative to the former.
This paper outlines the ways in which the project is addressing the colonial legacy of Henry Wellcome and presents data from the first three field seasons at Jebel Moya, south-central Sudan. These data have substantially revised our chronological and socio-economic understanding of the site. Our excavations, initiated in 2017 and continued in 2019 and 2022, show a longer, more continuous occupation of the site than has previously been recognised. The faunal and botanical remains have implications for the spread of early domesticates in the eastern Sahel and for climate changes, and raise issues of resilience. There is confirmed human burial activity from at least the third millennium BC onwards, while the pottery continues to yield information about the variety of decoration and, for the final Assemblage 3, data on its usage. Overall, the continued importance of the site for the eastern Sahel is re-emphasised.
The recommended daily dose of vitamin D, 2000 IU, has been found to be insufficient in many patients. The objective of the present study was to determine whether the daily dose of vitamin D should be based on BMI. Two hundred and thirty patients with established vitamin D deficiency (serum level of 25-hydroxyvitamin D3 (25OHD3) ≤20 ng/ml) and a BMI ≥30 kg/m2 were included in the study. Demographic data, comorbidities and BMI were recorded. Pre-treatment and post-treatment serum 25OHD3, calcium, phosphorus and parathyroid hormone (PTH) were tested at 0, 3 and 6 months. Patients were treated with a standard dose of 50 000 IU of vitamin D weekly and 600/1200 mg of calcium a day. Once their level of 25OHD3 reached ≥30 ng/ml, patients were randomised into two groups. Group A received the standard recommended maintenance dose of 2000 IU daily, and Group B received 125 IU/kg/m2 of vitamin D3. The data were entered into the database and analysed. The mean age of Group A was 50⋅74 ± 7⋅64 years compared to 52⋅32 ± 7⋅21 years in Group B. In both groups, the pre-treatment vitamin D level was ≤15 ng/ml and increased to 34⋅6 ± 2⋅6 and 33⋅7 ± 2⋅4 ng/ml in Group A and Group B, respectively, at the end of 3 months of treatment with a dose of 50 000 IU of vitamin D3 weekly and calcium 600/1200 mg once a day. At 6 months, the 25OHD3 level was 22⋅8 ± 3⋅80 ng/ml in Group A and 34⋅0 ± 1⋅85 ng/ml in Group B (P < 0⋅001). This preliminary study suggests that obese patients need a higher dose of vitamin D than the recommended dose. It is prudent that dosing be based on BMI to maintain normal levels for a healthy musculoskeletal system.
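To make the BMI-based arm concrete, and assuming the 125 IU/kg/m2 regimen scales linearly with BMI (i.e. 125 IU per unit of BMI; the BMI value below is illustrative, not taken from the study), a patient with a BMI of 32 kg/m2 would receive
$$125~\text{IU} \times 32 = 4000~\text{IU/day},$$
twice the fixed 2000 IU/day maintenance dose given to Group A.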
The prenatal period represents a critical time for brain growth and development. These rapid neurological advances render the fetus susceptible to various influences with life-long implications for mental health. Maternal distress signals are a dominant early life influence, contributing to birth outcomes and risk for offspring psychopathology. This prospective longitudinal study evaluated the association between prenatal maternal distress and infant white matter microstructure. Participants included a racially and socioeconomically diverse sample of 85 mother–infant dyads. Prenatal distress was assessed at 17 and 29 weeks’ gestational age (GA). Infant structural data were collected via diffusion tensor imaging (DTI) at 42–45 weeks’ postconceptional age. Findings demonstrated that higher prenatal maternal distress at 29 weeks’ GA was associated with increased fractional anisotropy, b = .283, t(64) = 2.319, p = .024, and with increased axial diffusivity, b = .254, t(64) = 2.067, p = .043, within the right anterior cingulate white matter tract. No other significant associations were found between prenatal distress exposure and tract fractional anisotropy or axial diffusivity at 29 weeks’ GA, or earlier in gestation.
Cardiovascular disease (CVD) is the most common non-communicable disease globally. Although previous literature has provided useful insights into the important role that diet plays in CVD prevention and treatment, establishing the causal role of diet is difficult given the inherent and introduced weaknesses of observational designs (e.g. confounders and mediators not properly addressed) and experimental designs (e.g. inappropriate or poorly designed). In this narrative review, we organised current evidence linking diet, as well as conventional and emerging physiological risk factors, with CVD risk, incidence and mortality in a series of diagrams. The diagrams presented can aid causal inference studies as they provide a visual representation of the types of studies underlying the associations between potential risk markers/factors for CVD. This may facilitate the selection of variables to be considered and the creation of analytical models. Evidence depicted in the diagrams was systematically collected from studies included in the British Nutrition Task Force report on diet and CVD and from database searches, including Medline and Embase. Although several markers and disorders linked to conventional and emerging risk factors for CVD were identified, the causal links between many remain unknown. There is a need to address the multifactorial nature of CVD and the complex interplay between conventional and emerging risk factors and natural and built environments, while bringing the life course into the spotlight.
Suicide is one of the leading mental health crises and takes one life every 40 seconds. Four out of every five suicides occur in low- and middle-income countries. Despite religion being a protective factor against suicide, the estimated number of suicides is rapidly increasing in Pakistan.
Aims
Our review focuses on the trends of suicide and means of self-poisoning in the past three decades, and the management of commonly used poisons.
Method
We searched two electronic databases (PubMed and PakMediNet) for published English-language studies describing agents used for suicide in different regions of Pakistan. A total of 46 out of 85 papers (N = 54 747 cases) met our inclusion criteria.
Results
Suicidal behaviour was more common among individuals younger than 30 years. Females comprised 60% of those who attempted suicide in our study sample, although the ratio of completed suicides favoured males. There were regional trends in the choice of agent for overdose. Organophosphate poisoning was reported across the nation, with a predominance of cases from the agricultural belt of South Punjab and interior Sindh. Aluminium phosphide (‘wheat pills’) was a preferred agent in North Punjab, whereas paraphenylenediamine (‘kala pathar’) was implicated in deaths by suicide from South Punjab. Urban areas had other means for suicide, including household chemicals, benzodiazepines, kerosene oil and rat poison.
Conclusions
Urgent steps are needed, including psychoeducational campaigns on mental health and suicide, staff training, medical resources for prompt treatment of self-poisoning and updated governmental policy to regulate pesticide sales.
Urban density is erroneously regarded as the main factor in the spread of COVID-19 in cities. A review of the extant literature and findings from our case study of Karachi, Pakistan indicate that inequalities in income, healthcare, and living conditions, along with government responsiveness to the pandemic, play a key role in the spread of contagions. Moving forward, urban policies need to address these inequalities through changes in housing policies and decentralized governance systems. Cities must adapt to sustainable modes of travel, reduce digital inequalities, and encourage people-friendly urban planning to become resilient in the face of pandemics.
Technical summary
COVID-19 has changed how urban residents relate to their cities. Urban centers have become epicenters of disease, which has raised questions about the long-term sustainability of high-density settlements and public transport usage. However, the spread of COVID-19 in cities is incorrectly attributed to urban density.
Using the case study of Karachi, Pakistan, we find that inequality of income, healthcare, and living conditions is a major contributing factor to the spread of COVID-19. Data on positive COVID-19 cases, density, and socioeconomic status were obtained at the Union Council level from the administrative districts of Karachi, Pakistan between March 2020 and July 2020. Despite low population densities, low-to-middle income neighborhoods in Karachi had a higher proportion of positive cases. Further, dense cities such as Hanoi in Vietnam and New York in the US differ in their experience of the spread of COVID-19. Hence, the government's response to the pandemic is also a major factor in containing the outbreak.
Our findings suggest that a crisis in a city is exacerbated by its inability to take advantage of its density, inequality in the distribution of resources, lack of inclusiveness, and centralized governance mechanisms that make it difficult to respond quickly to situations. Thus, urban planning scholarship and practice should take an interdisciplinary approach to make cities equitable, inclusive, and adaptive.
Social media summary
Cities in the developing world have an opportunity for more resilient renewal in the post-COVID world.
This article reports the establishment of an isolated, fully functional field intensive care unit (FICU) equipped with all necessary critical care facilities as part of the national pre-emptive preparedness to treat an unexpected surge of coronavirus disease 2019 (COVID-19) patients in Bahrain. One floor of an existing car parking structure was converted into a 130-bed FICU by an in-house project implementation team comprising multidisciplinary departments. The setting was a military hospital in the Kingdom of Bahrain, and the car park was on the hospital premises. The FICU contained a 112-bed fully equipped ICU and an 18-bed step-down ICU, and was built in 7 days to cater to the intensive care of COVID-19 patients in Bahrain.
Short-term peripheral venous catheter–related bloodstream infection (PVCR-BSI) rates have not been systematically studied in resource-limited countries, and data on their incidence by number of device days are not available.
Methods:
A prospective surveillance study of PVCR-BSI was conducted from September 1, 2013, to May 31, 2019, in 727 intensive care units (ICUs) by members of the International Nosocomial Infection Control Consortium (INICC), from 268 hospitals in 141 cities in 42 countries across the Africa, Americas, Eastern Mediterranean, Europe, South East Asia, and Western Pacific regions. We applied the definitions and criteria of the CDC NHSN, the methodology of the INICC, and the INICC Surveillance Online System software.
Results:
We followed 149,609 ICU patients for 731,135 bed days and 743,508 short-term peripheral venous catheter (PVC) days. We identified 1,789 PVCR-BSIs, for an overall rate of 2.41 per 1,000 PVC days. Mortality in patients with a PVC but without PVCR-BSI was 6.67%, compared with 18% in patients with a PVC and PVCR-BSI. The length of stay of patients with a PVC but without PVCR-BSI was 4.83 days, compared with 9.85 days in patients with a PVC and PVCR-BSI. Among these infections, the microorganism profile showed 58% gram-negative bacteria: Escherichia coli (16%), Klebsiella spp (11%), Pseudomonas aeruginosa (6%), Enterobacter spp (4%), and others (20%) including Serratia marcescens. Staphylococcus aureus was the predominant gram-positive organism (12%).
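As a check on the headline figure, the overall rate follows directly from the counts reported above:
$$\frac{1{,}789~\text{PVCR-BSIs}}{743{,}508~\text{PVC days}} \times 1{,}000 \approx 2.41~\text{per 1,000 PVC days}.$$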
Conclusions:
PVCR-BSI rates in INICC ICUs were much higher than rates published from industrialized countries. Infection prevention programs must be implemented to reduce the incidence of PVCR-BSIs in resource-limited countries.
This study provides an overview of the extent, nature and quality of reporting on mental health compared with physical health in Qatari newspapers. We analysed 1274 news reports from daily newspapers in Qatar. The majority of the articles provided general information and were either positive or neutral in tone, reporting purely on physical health matters. A small proportion made associations with violence or reported on suicide or substance use. Our results highlight the underrepresentation of mental health in Qatari newspapers. A collaboration between media and health professionals is recommended to improve reporting on mental health.
To investigate a Middle East respiratory syndrome coronavirus (MERS-CoV) outbreak event involving multiple healthcare facilities in Riyadh, Saudi Arabia; to characterize transmission; and to explore infection control implications.
Design
Outbreak investigation.
Setting
Cases presented in 4 healthcare facilities in Riyadh, Saudi Arabia: a tertiary-care hospital, a specialty pulmonary hospital, an outpatient clinic, and an outpatient dialysis unit.
Methods
Contact tracing and testing were performed following reports of cases at 2 hospitals. Laboratory results were confirmed by real-time reverse transcription polymerase chain reaction (rRT-PCR) and/or genome sequencing. We assessed exposures and determined seropositivity among available healthcare personnel (HCP) cases and HCP contacts of cases.
Results
In total, 48 cases were identified, involving patients, HCP, and family members across 2 hospitals, an outpatient clinic, and a dialysis clinic. At each hospital, transmission was linked to a unique index case. Moreover, 4 cases were associated with superspreading events (any interaction where a case patient transmitted to ≥5 subsequent case patients). All 4 of these patients were severely ill, were initially not recognized as MERS-CoV cases, and subsequently died. Genomic sequences clustered separately, suggesting 2 distinct outbreaks. Overall, 4 (24%) of 17 HCP cases and 3 (3%) of 114 HCP contacts of cases were seropositive.
Conclusions
We describe 2 distinct healthcare-associated outbreaks, each initiated by a unique index case and characterized by multiple superspreading events. Delays in recognition and in subsequent implementation of control measures contributed to secondary transmission. Prompt contact tracing, repeated testing, HCP furloughing, and implementation of recommended transmission-based precautions for suspected cases ultimately halted transmission.