This brief discussion of United States economic aid to Africa has been compiled as a factual summary rather than an analysis or critical evaluation of present programs. A general description of economic aid, and of the activities of the Agency for International Development, is followed by a more detailed note on a country program in operation, that in the Sudan.
Although not the largest single donor in Africa, the U.S. Government in fiscal 1962 and again in fiscal 1963 provided about $500 million in economic assistance to 34 African countries. Surplus food and fiber under the Food for Peace Program accounted for nearly half of this aid. In contrast, other Free World government sources provided about $1,200 million, and the Sino-Soviet Bloc extended $200 to $250 million in credits. Thus, the U.S. contributed roughly one-fourth of Africa's $2 billion in annual external economic assistance from government sources. In addition, important contributions were made by U.S. private foundations, religious organizations, and other nongovernment groups. U.S. direct investments in “AID Africa” rose from about $100 million in 1950 to about $800 million in 1962.
Background: Nipocalimab, a fully human, effectorless anti-neonatal Fc receptor (FcRn) monoclonal antibody, may ameliorate disease manifestations of generalized myasthenia gravis (gMG) by selectively blocking FcRn-mediated IgG recycling and lowering IgG, including pathogenic autoantibodies. The objective was to evaluate the effectiveness and safety of intravenous nipocalimab added to background standard-of-care therapy in adolescents with gMG. Methods: Seropositive patients (12 to <18 years) with gMG (MGFA Class II–IV) who were on stable therapy but inadequately controlled were enrolled in a 24-week open-label study. Nipocalimab was administered as a 30 mg/kg IV loading dose followed by 15 mg/kg IV every 2 weeks. Results: Seven adolescents were enrolled; 5 completed 24 weeks of dosing. The mean (SD) age was 14.1 (1.86) years; seven were anti-AChR+, and six were female. Mean (SD) baseline MG-ADL/QMG scores were 4.29 (2.430)/12.50 (3.708). Nipocalimab produced a significant reduction in total serum IgG at week 24; the mean (SD) change from baseline was -68.98% (7.561). The mean (SD) change in MG-ADL/QMG scores at week 24 was -2.40 (0.418)/-3.80 (2.683); 4 of 5 patients achieved minimal symptom expression (MG-ADL score 0–1) by week 24. Nipocalimab was well tolerated; there were no serious adverse events and no clinically meaningful laboratory changes. Conclusions: Nipocalimab demonstrated efficacy and safety in this 6-month trial in seropositive adolescents with gMG.
The colonisation of Australia around 250 years ago resulted in significant disruptive changes to the lifestyle and diet of Aboriginal and Torres Strait Islander peoples. Traditional foods high in micronutrients, including vitamin D, have been largely replaced with energy-dense foods(1). Sun exposure—a primary source of vitamin D—may be reduced due to changes in clothing and housing structure(2). Consequently, there is a high prevalence of vitamin D deficiency (serum 25-hydroxyvitamin D concentration < 50 nmol/L) and low vitamin D intake among Aboriginal and Torres Strait Islander peoples(2,3). There is a need for a public health strategy to improve vitamin D status. Since few foods naturally contain vitamin D (e.g., fish, eggs, and meat), food fortification could be a suitable public health strategy to increase vitamin D intake without changing consumption behaviour. In Australia, besides foods mandated for fortification (e.g., edible oil spreads), few foods permitted for voluntary fortification are routinely fortified. We aimed to model vitamin D food fortification scenarios among Aboriginal and Torres Strait Islander peoples. We used nationally representative food consumption data from the 2012–2013 National Aboriginal and Torres Strait Islander Nutrition and Physical Activity Survey (n = 4,109) and analytical vitamin D food composition data(4) to model four food fortification scenarios. Scenario 1 modelled the addition of the maximum permitted amount of vitamin D to all foods permitted for fortification in Australia: i) dairy products and alternatives, ii) butter/margarine/oil spreads, iii) formulated beverages (e.g., water with added sugar, vitamins and minerals), and iv) selected ready-to-eat breakfast cereals. Scenarios 2a–c included vitamin D concentrations higher than permitted in fluid milks/alternatives (1 μg/100 mL) and butter/margarine/oil spreads (20 μg/100 g). Scenario 2a: i) dairy products and alternatives, ii) butter/margarine/oil spreads, iii) formulated beverages. Scenario 2b: as per Scenario 2a plus selected ready-to-eat breakfast cereals. Scenario 2c: as per Scenario 2b plus bread (not permitted for vitamin D fortification in Australia). Vitamin D fortification of a range of staple foods could potentially increase vitamin D intake among Aboriginal and Torres Strait Islander peoples by ~3–6 μg/day. Scenario 2c showed the highest potential increase in median vitamin D intake, from a baseline of 2 μg/day to ~8 μg/day. Across all scenarios, the vitamin D intake of all participants remained below the Australian Tolerable Upper Intake Level of 80 μg/day. Our findings demonstrated that vitamin D fortification of a range of staple foods could potentially increase vitamin D intake among Aboriginal and Torres Strait Islander peoples in Australia. However, the most impactful fortification strategy (Scenario 2c) would require a revision of the Australia New Zealand Food Standards Code to permit the addition of higher amounts of vitamin D than currently allowed and to include bread as a food vehicle for fortification.
Low vitamin D status (circulating 25-hydroxyvitamin D [25(OH)D] concentration < 50 nmol/L) affects nearly one in four Australian adults(1). The primary source of vitamin D is sun exposure; however, a safe level of sun exposure for optimal vitamin D production has not been established. As supplement use is uneven, increasing vitamin D in food is the logical option for improving vitamin D status at a population level. The dietary supply of vitamin D is low since few foods are naturally rich in vitamin D. While there is no Australia-specific estimated average requirement (EAR) for vitamin D, the Institute of Medicine recommends an EAR of 10 μg/day for all ages. Vitamin D intake is low in Australia, with mean usual intake ranging from 1.8–3.2 μg/day across sex/age groups(2), suggesting a need for data-driven nutrition policy to improve the dietary supply of vitamin D. Food fortification has proven effective in other countries. We aimed to model four potential vitamin D fortification scenarios to determine an optimal strategy for Australia. We used food consumption data for people aged ≥ 2 years (n = 12,153) from the 2011–2012 National Nutrition and Physical Activity Survey, and analytical food composition data for vitamin D3, 25(OH)D3, vitamin D2 and 25(OH)D2(3). Certain foods are permitted for mandatory or voluntary fortification in Australia. As industry uptake of the voluntary option is low, Scenario 1 simulated addition of the maximum permitted amount of vitamin D to all foods permitted under the Australia New Zealand Food Standards Code (dairy products/plant-based alternatives, edible oil spreads, formulated beverages and permitted ready-to-eat breakfast cereals (RTEBC)). Scenarios 2–4 modelled higher concentrations than those permitted for fluid milk/alternatives (1 μg/100 mL) and edible oil spreads (20 μg/100 g) within an expanding list of food vehicles: Scenario 2—dairy products/alternatives, edible oil spreads, formulated beverages; Scenario 3—Scenario 2 plus RTEBC; Scenario 4—Scenario 3 plus bread (which is not permitted for vitamin D fortification in Australia). Usual intake was modelled for the four scenarios across sex and age groups using the National Cancer Institute Method(4). Assuming equal bioactivity of the D vitamers, the range of mean usual vitamin D intake across age groups for males for Scenarios 1 to 4, respectively, was 7.2–8.8, 6.9–8.3, 8.0–9.7 and 9.3–11.3 μg/day; the respective values for females were 5.8–7.5, 5.8–7.2, 6.4–8.3 and 7.5–9.5 μg/day. No participant exceeded the upper level of intake (80 μg/day) under any scenario. Systematic fortification of all foods permitted for vitamin D fortification could substantially improve vitamin D intake across the population. However, the optimal strategy would require permissions for bread as a food vehicle, and addition of higher than permitted concentrations of vitamin D to fluid milks/alternatives and edible oil spreads.
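The usual-intake modelling itself relies on the National Cancer Institute Method, but the core fortification step in the scenarios described above is simple: each eligible food consumed contributes its modelled fortificant level scaled by the amount eaten. The sketch below illustrates that step only; the food groups, fortification levels and the example respondent-day are illustrative assumptions, not values from the surveys or scenarios above.

```python
# Hypothetical sketch of a fortification-scenario calculation: per-person vitamin D
# intake is re-estimated by adding the modelled fortificant level to each eligible
# food group. Food names, consumption amounts and fortification levels below are
# illustrative only, not values from the surveys cited above.

# Modelled fortification levels (micrograms of vitamin D per 100 g or 100 mL of food),
# loosely following the widest scenario: milk, spreads, cereals, plus bread.
FORTIFICATION_UG_PER_100G = {
    "fluid_milk": 1.0,
    "edible_oil_spread": 20.0,
    "rte_breakfast_cereal": 1.7,
    "bread": 1.1,
}

def fortified_intake_ug(day_of_eating, baseline_intake_ug):
    """Return total vitamin D intake (ug/day) after adding the modelled fortificant.

    day_of_eating: list of (food_group, grams_consumed) tuples for one survey day.
    baseline_intake_ug: the person's unfortified vitamin D intake for that day.
    """
    added = 0.0
    for food_group, grams in day_of_eating:
        level = FORTIFICATION_UG_PER_100G.get(food_group, 0.0)  # 0 if not a vehicle
        added += grams / 100.0 * level
    return baseline_intake_ug + added

if __name__ == "__main__":
    # One illustrative respondent-day: 250 mL milk, 10 g spread, 30 g cereal, 70 g bread.
    example_day = [("fluid_milk", 250), ("edible_oil_spread", 10),
                   ("rte_breakfast_cereal", 30), ("bread", 70)]
    print(f"{fortified_intake_ug(example_day, baseline_intake_ug=2.0):.1f} ug/day")
```

In a full analysis this per-day calculation would be repeated across survey days and respondents before usual-intake estimation; here it simply shows how a fortification scenario changes a single day's intake.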
Disease-modifying therapies (DMTs) for Alzheimer’s disease (AD) are emerging following successful clinical trials of therapies targeting amyloid beta (Aβ) protofibrils or plaques. Determining patient eligibility and monitoring treatment efficacy and adverse events, such as amyloid-related imaging abnormalities (ARIA), necessitates imaging with MRI and PET. The Canadian Consortium on Neurodegeneration in Aging (CCNA) Imaging Workgroup aimed to synthesize evidence and provide recommendations on implementing imaging protocols for AD DMTs in Canada.
Methods:
The workgroup employed a Delphi process to develop these recommendations. Experts from radiology, neurology, biomedical engineering, nuclear medicine, MRI and medical physics were recruited. Surveys and meetings were conducted to achieve consensus on key issues, including protocol standardization, scanner strength, monitoring protocols based on risk profiles and optimal protocol lengths. Draft recommendations were refined through multiple iterations and expert discussions.
Results:
The recommendations emphasize standardized acquisition protocols across manufacturers and scanner field strengths to ensure consistent and reliable clinical treatment decisions; monitoring protocols tailored to each DMT’s safety and efficacy profile; consistent monitoring regardless of perceived treatment efficacy; and MRI screening on 1.5T or 3T scanners with adapted protocols. An optimal protocol length of 20–30 minutes was deemed feasible; specific sequences are suggested.
Conclusion:
The guidelines aim to enhance imaging data quality and consistency, facilitating better clinical decision-making and improving patient outcomes. Further research is needed to refine these protocols and address evolving challenges with new DMTs. It is recognized that the administrative, financial and logistical capacity to deliver additional MRI and PET scans requires careful planning.
Despite high UVB radiation from the sun in Australia (the primary source of vitamin D), vitamin D deficiency (serum 25-hydroxyvitamin D [25(OH)D] concentration <50 nmol/L) is prevalent among Aboriginal and Torres Strait Islander peoples (27% of adults nationally; 39% of adults living in remote areas)(1). Vitamin D deficiency affects musculoskeletal health and may be associated with non-communicable diseases, such as type 2 diabetes and cardiovascular disease, that are prevalent in Aboriginal and Torres Strait Islander peoples(2,3). As an alternative to UVB radiation, vitamin D can also be obtained from foods (e.g., fish, eggs, and meat) and supplements. However, vitamin D intake in Aboriginal and Torres Strait Islander peoples is currently unknown. Hence, we aimed to provide the first estimate of absolute vitamin D intake in Aboriginal and Torres Strait Islander peoples. We used food consumption data from the 2012–2013 National Aboriginal and Torres Strait Islander Nutrition and Physical Activity Survey and vitamin D food composition data for vitamin D3, 25(OH)D3, vitamin D2, and 25(OH)D2. Absolute vitamin D intake was estimated by sex and by remote and non-remote areas using bioactivity factors (BF) of 1 and 5 for 25(OH)D, which may be up to five times more bioactive than vitamin D. The estimated average requirement for vitamin D intake recommended by the Institute of Medicine is 10 μg/day(4). The estimated absolute vitamin D intake from food and beverages was low for Aboriginal and Torres Strait Islander peoples: the mean was 2.9 μg/day and 5.3 μg/day for BF 1 and BF 5, respectively. Males had a higher mean intake (3.2 μg/day, BF 1; 5.9 μg/day, BF 5) than females (2.6 μg/day, BF 1; 4.7 μg/day, BF 5). Vitamin D intake was 2.9 μg/day (BF 1) and 5.2 μg/day (BF 5) in non-remote areas and 2.8 μg/day (BF 1) and 5.4 μg/day (BF 5) in remote areas. The high prevalence of vitamin D deficiency and low vitamin D intake highlight a need to promote vitamin D sufficiency through public health policies. The results from this study can be used to model food fortification strategies to provide evidence for the development of nutrition policies to improve the vitamin D status of the Aboriginal and Torres Strait Islander population.
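The bioactivity-factor calculation described above can be written explicitly; assuming the conventional weighting in which only the 25(OH)D vitamers are multiplied by the factor BF (1 or 5), total daily intake is:

```latex
% Bioactivity-weighted vitamin D intake (\mu g/day), with BF = 1 or 5
\mathrm{VitD}_{\text{total}} \;=\; \mathrm{D}_3 + \mathrm{D}_2 \;+\; \mathrm{BF}\,\bigl[\,25(\mathrm{OH})\mathrm{D}_3 + 25(\mathrm{OH})\mathrm{D}_2\,\bigr]
```

With BF = 1 the hydroxylated vitamers count the same as the parent vitamins; with BF = 5 they are weighted fivefold, which is why the two intake estimates reported above differ by roughly a factor of two.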
The objective of this study was to explore barriers and enablers to improving the management of bacteriuria in hospitalized adults.
Design:
Qualitative study.
Setting:
Nova Scotia, Canada.
Participants:
Nurses, physicians, and pharmacists involved in the assessment, diagnosis, and treatment of bacteriuria in hospitalized patients.
Methods:
Focus groups (FGs) were conducted between May and July 2019. FG discussions were facilitated using an interview guide consisting of open-ended questions coded to the Theoretical Domains Framework (TDF) v2. Discussions were transcribed verbatim and then independently coded to the TDFv2 by two members of the research team and compared. Thematic analysis was used to identify themes.
Results:
Thirty-three healthcare providers from five hospitals participated (15 pharmacists, 11 nurses, and 7 physicians). The use of antibiotics for the treatment of asymptomatic bacteriuria (ASB) was the main issue identified. Subthemes related to the management of ASB included: “diagnostic uncertainty,” difficulty “ignoring positive urine cultures,” “organizational challenges,” and “how people learn.” Barriers and/or enablers to improving the management of bacteriuria were mapped to 12 theoretical domains within these subthemes. The most extensively discussed barriers and enablers related to the domains of environmental context and resources, beliefs about capabilities, social/professional role and identity, and social influences.
Conclusions:
Healthcare providers highlighted barriers and recognized enablers that may improve the delivery of care to patients with bacteriuria. Improving the management of bacteriuria will require addressing a wide range of barriers at the individual and organizational levels, including diagnostic challenges and workload.
Background: Saccade and pupil responses are potential neurodegenerative disease biomarkers due to overlap between oculomotor circuitry and disease-affected areas. Instruction-based tasks have previously been examined as biomarker sources, but they are arduous for patients with limited cognitive abilities; additionally, few studies have evaluated multiple neurodegenerative pathologies concurrently. Methods: The Ontario Neurodegenerative Disease Research Initiative recruited individuals with Alzheimer’s disease (AD), mild cognitive impairment (MCI), amyotrophic lateral sclerosis (ALS), frontotemporal dementia, progressive supranuclear palsy, or Parkinson’s disease (PD). Patients (n=274, age 40-86) and healthy controls (n=101, age 55-86) viewed 10 minutes of frequently changing video clips without instruction while their eyes were tracked. We evaluated differences in saccade and pupil parameters (e.g. saccade frequency and amplitude, pupil size, responses to clip changes) between groups. Results: Preliminary data indicate low-level behavioural alterations in multiple disease cohorts: increased centre bias, lower overall saccade rate, and reduced saccade amplitude. After clip changes, patient groups generally demonstrated a lower saccade rate but a higher microsaccade rate, to varying degrees. Additionally, pupil responses were blunted (AD, MCI, ALS) or exaggerated (PD). Conclusions: This task may generate behavioural biomarkers even in cognitively impaired populations. Future work should explore the possible effects of factors such as medication and disease stage.
Quantifying the marine radiocarbon reservoir effect, offsets (ΔR), and ΔR variability over time is critical to improving dating estimates of marine samples while also providing a proxy of water mass dynamics. In the northeastern Pacific, where no high-resolution time series of ΔR has yet been established, we sampled radiocarbon (14C) from exactly dated growth increments in a multicentennial chronology of the long-lived bivalve Pacific geoduck (Panopea generosa) at the Tree Nob site, coastal British Columbia, Canada. Samples were taken at approximately decadal time intervals from 1725 CE to 1920 CE and indicate an average ΔR of 256 ± 22 years (1σ), consistent with existing discrete estimates. Temporal variability in ΔR is small relative to analogous Atlantic records, except for an unusually old-water event in 1802–1812. The correlation between ΔR and sea surface temperature (SST) reconstructed from geoduck increment width is weakly significant (r2 = .29, p = .03) when the 1802–1812 interval is excluded, indicating that warm water is generally old. This interval contains the oldest (–2.1σ) ΔR anomaly, coincident with the coldest (–2.7σ) anomalies of the temperature reconstruction. An additional 32 14C values spanning 1952–1980 were detrended using a northeastern Pacific bomb pulse curve. Significant positive correlations were identified between the detrended 14C data and annual El Niño Southern Oscillation (ENSO) conditions and summer SST, such that cooler conditions are associated with older water. Thus, 14C is generally relatively stable, with weak and potentially inconsistent associations with climate variables, but is capable of infrequent excursions, as illustrated by the unusually cold, old-water 1802–1812 interval.
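As a rough illustration of the calculations described above (not the authors' code), ΔR for each increment is conventionally the difference between the measured marine 14C age and the marine calibration-curve 14C age at the same, increment-dated calendar year, and the relationship with reconstructed SST is an ordinary correlation. All numbers in the sketch below are invented; a real analysis would interpolate a published marine curve (e.g., Marine20) at each sample's calendar year.

```python
# Illustrative sketch of the reservoir-offset and correlation calculations described
# above. Values are made up; the marine calibration ages would normally come from a
# marine calibration curve interpolated at each sample's calendar year.
import numpy as np

# Hypothetical paired data: measured shell 14C ages and the marine-curve 14C age
# expected at the same (increment-dated) calendar year.
measured_14c_age = np.array([1180, 1205, 1150, 1320, 1210], dtype=float)
marine_curve_age = np.array([930, 945, 900, 1010, 955], dtype=float)

delta_r = measured_14c_age - marine_curve_age  # Delta-R: sample age minus curve age
print("Delta-R (yr):", delta_r, "mean:", delta_r.mean(), "sd:", delta_r.std(ddof=1))

# Correlation of Delta-R with a reconstructed SST series (hypothetical values).
sst_recon = np.array([9.8, 10.1, 9.5, 8.7, 10.0])
r = np.corrcoef(delta_r, sst_recon)[0, 1]
print(f"r = {r:.2f}, r^2 = {r**2:.2f}")
```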
Background: Eye movements reveal neurodegenerative disease processes due to overlap between oculomotor circuitry and disease-affected areas. Characterizing oculomotor behaviour in the context of cognitive function may enhance disease diagnosis and monitoring. We therefore aimed to quantify cognitive impairment in neurodegenerative disease using saccade behaviour and neuropsychological testing. Methods: The Ontario Neurodegenerative Disease Research Initiative recruited individuals with neurodegenerative disease: Alzheimer’s disease, mild cognitive impairment, amyotrophic lateral sclerosis, frontotemporal dementia, Parkinson’s disease, or cerebrovascular disease. Patients (n=450, age 40-87) and healthy controls (n=149, age 42-87) completed a randomly interleaved pro- and anti-saccade task (IPAST) while their eyes were tracked. We explored the relationships of saccade parameters (e.g. task errors, reaction times) to one another and to cognitive domain-specific neuropsychological test scores (e.g. executive function, memory). Results: Task performance worsened with cognitive impairment across multiple diseases. Subsets of saccade parameters were interrelated and also differentially related to neuropsychology-based cognitive domain scores (e.g. antisaccade errors and reaction time associated with executive function). Conclusions: IPAST detects global cognitive impairment across neurodegenerative diseases. Distinct subsets of parameters associate with one another, suggesting disparate underlying circuitry, and with different cognitive domains. This may have implications for the use of IPAST as a cognitive screening tool in neurodegenerative disease.
Background: Candida auris is an emerging multidrug-resistant yeast that is transmitted in healthcare facilities and is associated with substantial morbidity and mortality. Environmental contamination is suspected to play an important role in transmission, but additional information is needed to inform environmental cleaning recommendations to prevent spread. Methods: We conducted a multiregional (Chicago, IL; Irvine, CA) prospective study of environmental contamination associated with C. auris colonization of patients and residents of 4 long-term care facilities and 1 acute-care hospital. Participants were identified by screening or clinical cultures. Samples were collected from participants’ body sites (eg, nares, axillae, inguinal creases, palms and fingertips, and perianal skin) and their environment before room cleaning. Daily room cleaning and disinfection by facility environmental service workers was followed by targeted cleaning of high-touch surfaces by research staff using hydrogen peroxide wipes (an EPA-approved product for C. auris; List P). Samples were collected immediately after cleaning from high-touch surfaces and repeated at 4-hour intervals up to 12 hours. A pilot phase (n = 12 patients) was conducted to identify the value of testing specific high-touch surfaces to assess environmental contamination. High-yield surfaces were included in the full evaluation phase (n = 20 patients) (Fig. 1). Samples were submitted for semiquantitative culture of C. auris and other multidrug-resistant organisms (MDROs), including methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), extended-spectrum β-lactamase–producing Enterobacterales (ESBLs), and carbapenem-resistant Enterobacterales (CRE). Times to room surface contamination with C. auris and other MDROs after effective cleaning were analyzed. Results: Candida auris colonization was most frequently detected in the nares (72%) and on the palms and fingertips (72%). Cocolonization of body sites with other MDROs was common (Fig. 2). Surfaces located close to the patient were commonly recontaminated with C. auris by 4 hours after cleaning, including the overbed table (24%), bed handrail (24%), and TV remote or call button (19%). Environmental cocontamination was more common with resistant gram-positive organisms (MRSA and VRE) than resistant gram-negative organisms (Fig. 3). C. auris was rarely detected on surfaces located outside a patient’s room (1 of 120 swabs; <1%). Conclusions: Environmental surfaces near C. auris–colonized patients were rapidly recontaminated after cleaning and disinfection. Cocolonization of skin and environment with other MDROs was common, with resistant gram-positive organisms predominating over gram-negative organisms on environmental surfaces. Limitations include the lack of organism sequencing or typing to confirm that environmental contamination came from the room resident. Rapid recontamination of environmental surfaces after manual cleaning and disinfection suggests that alternative mitigation strategies should be evaluated.
Recent excavations by the Ancient Southwest Texas Project of Texas State University sampled a previously undocumented Younger Dryas component from Eagle Cave in the Lower Pecos Canyonlands of Texas. This stratified assemblage consists of bison (Bison antiquus) bones in association with lithic artifacts and a hearth. Bayesian modeling yields an age of 12,660–12,480 cal BP, and analyses indicate behaviors associated with the processing of a juvenile bison and the manufacture and maintenance of lithic tools. This article presents spatial, faunal, macrobotanical, chronometric, geoarchaeological, and lithic analyses relating to the Younger Dryas component within Eagle Cave. The identification of the Younger Dryas occupation in Eagle Cave should encourage archaeologists to revisit previously excavated rockshelter sites in the Lower Pecos and beyond to evaluate deposits for unrecognized, older occupations.
Two sentiments governed the postwar world: fear and hope. These two feelings dominated the debates that gave birth to both the Charter of the United Nations and the Universal Declaration of Human Rights. The League of Nations had failed. Leaders had expressed the desire for a world grounded in human rights but could not agree on what that meant or whether individual rights trumped the sovereign rights of nations. The UN Charter reflected these concerns, recognizing human rights but leaving their scope undefined. No precedents existed to guide the work. A committee of eighteen nations, chaired by Eleanor Roosevelt, accepted the unprecedented assignment of defining basic rights for all people everywhere. After consulting with noted jurists, philosophers, and social justice organizations, the committee set out to draft a document that would recognize the horrors of war and engender a commitment to peace. They envisioned a world governed more by hope than by fear. It was hard work. The debate was punctuated by escalating Cold War politics. A legally binding document seemed out of reach. All efforts turned instead to securing a declaration of human rights, which ultimately paved the way for legally binding commitments and energized a budding human rights movement.
State courts of last resort are vital components of the American judicial system, disposing of many important legal matters. The chief justices of these courts play consequential roles in these institutions. Although scholars have examined the selection and duties of states’ chief justices, their interactions with the elected branches are understudied. We focus on how chief justices of state high courts use their roles to encourage judicial reform. Specifically, we examine the determinants of chief justices’ successes or failures as advocates for their justice systems. To analyze why chief justices succeed or fail as reform advocates, we examine the fate of reform proposals offered in state of the judiciary addresses. Our results indicate that greater ideological similarity between the state legislature and the chief justice or state supreme court median increases the odds of an agenda item being enacted. We also find that the scope of a policy request influences the likelihood that it will be granted.
Large prospective observational studies have cast doubt on the common assumption that endovascular thrombectomy (EVT) is superior to intravenous thrombolysis for patients with acute basilar artery occlusion (BAO). The purpose of this study was to retrospectively review our experience for patients with BAO undergoing EVT with modern endovascular devices.
Methods:
All consecutive patients undergoing EVT with either a second-generation stent retriever or direct aspiration thrombectomy for BAO at our regional stroke center from January 1, 2013 to March 1, 2019 were included. The primary outcome measure was functional outcome at 1 month using the modified Rankin Scale (mRS) score. Multivariable logistic regression was used to assess the association between patient characteristics and dichotomized mRS.
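A minimal sketch of the kind of analysis described above (multivariable logistic regression on a dichotomized mRS outcome) is shown below in Python with statsmodels. The covariates, the mRS 0–2 definition of a good outcome, and all data are assumptions for illustration, not the study's dataset or code.

```python
# Minimal sketch (not the authors' code) of a multivariable logistic regression on a
# dichotomized 1-month mRS outcome. Predictor names, the good-outcome definition
# (assumed mRS 0-2) and all values are hypothetical, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 43  # same order of magnitude as the cohort described above
df = pd.DataFrame({
    "age": rng.normal(67, 12, n),
    "onset_to_treatment_min": rng.normal(420, 180, n),
    "successful_reperfusion": rng.integers(0, 2, n),
    "pc_aspects": rng.integers(5, 11, n),
})
# Hypothetical outcome: 1 = good functional outcome at 1 month.
df["good_outcome"] = (rng.random(n) < 0.37).astype(int)

X = sm.add_constant(df[["age", "onset_to_treatment_min",
                        "successful_reperfusion", "pc_aspects"]])
fit = sm.Logit(df["good_outcome"], X).fit(disp=False)
print(np.exp(fit.params).rename("odds_ratio"))  # odds ratios per unit change
```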
Results:
A total of 43 consecutive patients underwent EVT for BAO. The average age was 67 years, and 61% of patients were male. Overall, 37% (16/43) of patients achieved a good functional outcome. Successful reperfusion was achieved in 72% (31/43) of cases. The median (interquartile range) stroke onset-to-treatment time was 420 (270–639) minutes (7 hours) for all patients. The procedure-related complication rate was 9% (4/43). On multivariable analysis, the posterior circulation Alberta Stroke Program Early CT Score and the Basilar Artery on Computed Tomography Angiography score were associated with improved functional outcome.
Conclusion:
EVT appears to be safe and feasible in patients with BAO. Our finding that time to treatment and successful reperfusion were not associated with improved outcome likely reflects the inclusion of patients with established infarcts. Given the variability of collaterals in the posterior circulation, a tissue-window paradigm may assist in patient selection for EVT. Magnetic resonance imaging may be a reasonable option to determine the extent of ischemia in certain situations.
Three-dimensional printing is a revolutionary technology that is disrupting the status quo in surgery. It has been rapidly adopted by otolaryngology as a tool in surgical simulation for high-risk, low-frequency procedures. This systematic review comprehensively evaluates the contemporary usage of three-dimensional printed otolaryngology simulators.
Method
A systematic review of the literature was performed with narrative synthesis.
Results
Twenty-two articles were identified for inclusion, describing models that span a range of surgical tasks (temporal bone dissection, airway procedures, functional endoscopic sinus surgery and endoscopic ear surgery). Thirty-six per cent of articles assessed construct validity (objective measures); the remaining 64 per cent assessed only face and content validity (subjective measures). Most studies reported positive feedback and high confidence in the models’ value as additions to the curriculum.
Conclusion
Whilst further studies supported with objective metrics are merited, the role of three-dimensional printed otolaryngology simulators is poised to expand in surgical training given the enthusiastic reception from trainees and experts alike.
Apolipoprotein E (APOE) E4 is the main genetic risk factor for Alzheimer’s disease (AD). Given this consistent association, there is interest in whether E4 also influences the risk of other neurodegenerative diseases. There is likewise an ongoing search for other genetic biomarkers contributing to these phenotypes, such as microtubule-associated protein tau (MAPT) haplotypes. Here, participants from the Ontario Neurodegenerative Disease Research Initiative were genotyped to investigate whether the APOE E4 allele or MAPT H1 haplotype is associated with five neurodegenerative diseases: (1) AD and mild cognitive impairment (MCI), (2) amyotrophic lateral sclerosis, (3) frontotemporal dementia (FTD), (4) Parkinson’s disease, and (5) vascular cognitive impairment.
Methods:
Genotypes were defined for their respective APOE allele and MAPT haplotype calls for each participant, and logistic regression analyses were performed to identify the associations with the presentations of neurodegenerative diseases.
Results:
Our work confirmed the association of the E4 allele with a dose-dependent increase in the presentation of AD, and an association between the E4 allele alone and MCI; however, the other four diseases were not associated with E4. Further, the APOE E2 allele was associated with decreased presentation of both AD and MCI. No associations were identified between MAPT haplotype and the neurodegenerative disease cohorts; however, following subtyping of the FTD cohort, the H1 haplotype was significantly associated with progressive supranuclear palsy.
Conclusion:
This is the first study to concurrently analyze the association of APOE isoforms and MAPT haplotypes with five neurodegenerative diseases using consistent enrollment criteria and broad phenotypic analysis.
Essential variables to consider for an efficient control strategy for invasive plants include the dispersion pattern (i.e., satellite or invasion front) and the patch expansion rate. These variables were quantified for buffelgrass [Pennisetum ciliare (L.) Link], a C4 perennial grass introduced from Africa that has invaded broadly around the world. The study site was along a roadway in southern Arizona (USA). The P. ciliare distributions show the clumping associated with a satellite (nascent foci) colonization pattern (average nearest neighbor test, z-score −47.2, P < 0.01). The distance between patches ranged from 0.743 to 12.8 km, with an average of 5.6 km. The median patch expansion rate was 271% over the 3-yr monitoring period, versus 136% found in other studies of established P. ciliare patches. Targeting P. ciliare satellite patches as a control strategy may therefore substantially slow areal expansion, whereas targeting the largest patches may have less effect on invasion speed.
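For readers who want to relate the reported expansion rate to an areal doubling time, the arithmetic below assumes simple exponential patch growth and treats the 271% figure under two readings (final area equal to 271% of initial, or growth by 271%), since the text does not specify which is meant. This is an illustrative back-of-envelope calculation, not part of the original analysis.

```python
# Back-of-envelope doubling-time calculation under an exponential-growth assumption.
# The 271% figure above is treated two ways because "expansion rate" could mean the
# final area is 271% of the initial area, or that the area grew BY 271%.
import math

YEARS = 3.0  # length of the monitoring period reported above

def doubling_time(growth_factor, years):
    """Years for area to double, assuming constant exponential growth."""
    return years * math.log(2) / math.log(growth_factor)

for label, factor in [("final = 271% of initial", 2.71),
                      ("grew by 271% (final = 371%)", 3.71)]:
    print(f"{label}: doubling time ~ {doubling_time(factor, YEARS):.1f} yr")
```

Either reading gives a doubling time on the order of 1.5–2 years for satellite patches, consistent with the argument that these fast-growing foci are the priority control targets.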
The epidemic of prescription and non-prescription opioid misuse is of particular importance in pregnancy. The Society of Obstetricians and Gynaecologists of Canada currently recommends opioid replacement therapy with methadone or buprenorphine for opioid-dependent women during pregnancy. This vulnerable segment of the population has been shown to be at increased risk of blood-borne infectious diseases, nutritional insecurity and stress. The objective of this study was to describe an urban cohort of pregnant women on opioid replacement therapy and to evaluate potential effects on the fetus. A retrospective chart review was conducted of all women on opioid replacement therapy, and their infants, who delivered at The Ottawa Hospital General and Civic campuses between January 1, 2013 and March 24, 2017. Data were collected on maternal characteristics, pregnancy outcomes, neonatal outcomes and corresponding placental pathology. Maternal comorbidities identified included high rates of infection, tobacco use and illicit substance use, as well as increased rates of placental abruption compared with national averages. Compared with national baseline averages, the mean neonatal birth weight was low, and the incidence of small-for-gestational-age infants and congenital anomalies was high. The incidence of neonatal abstinence syndrome (NAS) was comparable with estimates from other studies of similar cohorts. These findings support the existing literature calling for a comprehensive, interdisciplinary risk-reduction approach, including dietary, social, domestic, psychological and other supports, to care for opioid-dependent women in pregnancy.