Underrepresentation of diverse populations in medical research undermines generalizability, exacerbates health disparities, and erodes trust in research institutions. This study aimed to identify a suitable survey instrument to measure trust in medical research among Black and Latino communities in Baltimore, Maryland.
Methods:
Based on a literature review, a committee selected two validated instruments for community evaluation: Perceptions of Research Trustworthiness (PoRT) and Trust in Medical Researchers (TiMR). Both were translated into Spanish through a standardized process. Thirty-four individuals participated in four focus groups (two in English, two in Spanish). Participants reviewed and provided feedback on the instruments’ relevance and clarity. Discussions were recorded, transcribed, and analyzed thematically.
Results:
Initial reactions to the instruments were mixed. While 68% found TiMR easier to complete, 74% preferred PoRT. Key discussion themes included the relevance of the instrument for measuring trust, clarity of the questions, and concerns about reinforcing negative perceptions of research. Participants felt that PoRT better aligned with the research goal of measuring community trust in research, though TiMR was seen as easier to understand. Despite PoRT’s lower reading level, some items were found to be more confusing than TiMR items.
Conclusion:
Community feedback highlighted the need to differentiate trust in medical research, researchers, and institutions. While PoRT and TiMR are acceptable instruments for measuring trust in medical research, refinement of both may be beneficial. Development and validation of instruments in multiple languages are needed to assess community trust in research and inform strategies to improve diverse participation in research.
Black and Latino individuals are underrepresented in COVID-19 treatment and vaccine clinical trials, calling for an examination of factors that may predict willingness to participate in trials.
Methods:
We administered the Common Survey 2.0 developed by the Community Engagement Alliance (CEAL) Against COVID-19 Disparities to 600 Black and Latino adults in Baltimore City, Prince George’s County, Maryland, Montgomery County, Maryland, and Washington, DC, between October and December 2021. We examined the relationship between awareness of clinical trials, social determinants of health challenges, trust in COVID-19 clinical trial information sources, and willingness to participate in COVID-19 treatment and vaccine trials using multinomial regression analysis.
Results:
Approximately half of Black and Latino respondents were unwilling to participate in COVID-19 treatment or vaccine clinical trials. Results showed that increased trust in COVID-19 clinical trial information sources and trial awareness were associated with greater willingness to participate in COVID-19 treatment and vaccine trials among Black and Latino individuals. For Latino respondents, having recently experienced more challenges related to social determinants of health was associated with a decreased likelihood of willingness to participate in COVID-19 vaccine trials.
Conclusions:
The willingness of Black and Latino adults to participate in COVID-19 treatment and vaccine clinical trials is influenced by trial awareness and trust in trial information sources. Ensuring the inclusion of these communities in clinical trials will require approaches that build greater awareness and trust.
The goals of this investigation were to 1) identify and measure exposures inside homes of individuals with chemical intolerance (CI), 2) provide guidance for reducing these exposures, and 3) determine whether our environmental house calls (EHCs) intervention could reduce both symptoms and measured levels of indoor air contaminants.
Background:
CI is an international public health and clinical concern, but few resources are available to address patients’ often disabling symptoms. Numerous studies show that levels of indoor air pollutants can be two to five (or more) times higher than outdoor levels. Fragranced consumer products, including cleaning supplies, air fresheners, and personal care products, are symptom triggers commonly reported by susceptible individuals.
Methods:
A team of professionals trained and led by a physician/industrial hygienist and a certified indoor air quality specialist conducted a series of 5 structured EHCs in 37 homes of patients reporting CI.
Results:
We report three case studies demonstrating that an appropriately structured home intervention can teach occupants how to reduce indoor air exposures and associated symptoms. Symptom improvement, documented using the Quick Environmental Exposure and Sensitivity Inventory Symptom Star, corresponded with the reduction of indoor air volatile organic compounds, most notably fragrances. These results provide a more detailed examination of 3 of the 37 cases described previously in Perales et al. (2022).
Discussion:
We address the long-standing dilemma that worldwide reports of fragrance sensitivity have not previously been confirmed by human or animal challenge studies. Our ancient immune systems’ ‘first responders’, mast cells, which evolved 500 million years ago, can be sensitized by synthetic organic chemicals whose production and use have grown exponentially since World War II. We propose that these chemicals, which include now-ubiquitous fragrances, trigger mast cell degranulation and inflammatory mediator release in the olfactory-limbic tract, thus altering cerebral blood flow and impairing mood, memory, and concentration (often referred to as ‘brain fog’). The time has come to translate these research findings into clinical and public health practice.
Foramina of bones are beginning to yield more information about metabolic rates and activity levels of living and extinct species. This study investigates the relationship between estimated blood flow rate to the femur and body mass among cursorial birds extending back to the Late Cretaceous. Data from fossil foramina are compared with those of extant species, revealing similar scaling relationships for all cursorial birds and supporting crown bird–like terrestrial locomotor activity. Because the perfusion rate in long bones of birds is related to the metabolic cost of microfracture repair due to stresses applied during locomotion, as it is in mammals, this study estimates absolute blood flow rates from sizes of nutrient foramina located on the femur shafts. After differences in body mass and locomotor behaviors are accounted for, femoral bone blood flow rates in extinct species are similar to those of extant cursorial birds. Femoral robustness is generally greater in aquatic flightless birds than in terrestrial flightless and ground-dwelling flighted birds, suggesting that the morphology is shaped by life-history demands. Femoral robustness also increases in larger cursorial bird taxa, probably associated with their weight redistribution following evolutionary loss of the tail, which purportedly constrains femur length, aligns it more horizontally, and necessitates increased robustness in larger species.
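The scaling analysis described above amounts to fitting a power law, Q ∝ M^b, by linear regression on log-transformed data. A minimal sketch follows; the prefactor 0.02 and exponent 0.75 are placeholder values for illustration, not the study's estimates.

```python
# Illustrative log-log fit of an allometric scaling relationship,
# e.g. blood flow rate (Q) versus body mass (M). Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
mass = np.logspace(0, 2, 40)                        # body mass, kg (synthetic)
q = 0.02 * mass**0.75 * rng.lognormal(0, 0.1, 40)   # hypothetical flow rates

# Fitting a line in log-log space recovers the scaling exponent b
slope, intercept = np.polyfit(np.log10(mass), np.log10(q), 1)
print(round(slope, 2))  # estimate of b in Q ∝ M^b
```

Comparing such fitted exponents (and residuals) between extant and fossil taxa is how similar scaling relationships can be demonstrated across groups.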
To determine whether environmental house calls that improve indoor air quality (IAQ) are effective in reducing symptoms of chemical intolerance (CI).
Background:
Prevalence of CI is increasing worldwide. Those affected typically report symptoms such as headaches, fatigue, ‘brain fog’, and gastrointestinal problems – common primary care complaints. Substantial evidence suggests that improving IAQ may be helpful in reducing symptoms associated with CI.
Methods:
Primary care clinic patients were invited to participate in a series of structured environmental house calls (EHCs). To qualify, participants were assessed for CI with the Quick Environmental Exposure and Sensitivity Inventory (QEESI). Those with CI volunteered to allow the EHC team to visit their homes to collect air samples for volatile organic compounds (VOCs). Initial and post-intervention IAQ sampling was analyzed by an independent lab to determine VOC levels (ng/L). The team discussed indoor air exposures, their health effects, and provided guidance for reducing exposures.
Findings:
Homes where recommendations were followed showed the greatest improvements in IAQ. These improvements reflected decreased airborne VOCs associated with reduced use of cleaning chemicals, personal care products, and fragrances, and were accompanied by reduction in the index patients’ symptoms. Symptom improvement generally was not reported among those whose homes showed no VOC improvement.
Conclusion:
Improvements in both IAQ and patients’ symptoms occur when families implement an action plan developed and shared with them by a trained EHC team. Indoor air problems simply are not part of most doctors’ differential diagnoses, despite relatively high prevalence rates of CI in primary care clinics. Our three-question screening questionnaire – the BREESI – can help physicians identify which patients should complete the QEESI. After identifying patients with CI, the practitioner can help by counseling them regarding their home exposures to VOCs. The future of clinical medicine could include environmental house calls as standard of practice for susceptible patients.
Despite considerable achievements in the field of conservation, biodiversity continues to decline and conservation initiatives face numerous barriers. Although many of these barriers are well known, for example insufficient funding and capacity, there has been no systematic attempt to catalogue and categorize them into a typology. Because risks compromise the conservation mission, any barrier to success is a risk. Here we present the first attempt at identifying key barriers. We analyse extensive interviews with 74 conservationists, primarily from Africa but with international experience, to identify potential risks to their projects and use that information to create a typology of barriers to conservation success. We draw on the literature to explain the prevalence of some of the barriers identified. We suggest that this typology could form the basis of heuristic tools that conservationists can use to identify and manage potential risks to their projects, thereby improving decision-making, strategic planning and, ultimately, overall impact. The typology is also useful for the conservation community (comprising conservationists and funders) to help implement better practices and improve the likelihood of success. We present examples of such work already underway and suggest more can be done to continue to improve.
To compare the impact on child diet and growth of a multisectoral community intervention v. nutrition education and livestock management training alone.
Design:
Longitudinal community-based randomized trial involving three groups of villages assigned to receive: (i) Full Package community development activities, delivered via women’s groups; (ii) livestock training and nutrition education alone (Partial Package); or (iii) no intervention (Control). Household surveys, child growth measurements, and child and household diet quality measures (diet diversity (DD) and animal-source food (ASF) consumption) were collected at five visits over 36 months. Mixed-effect linear regression and Poisson models used survey round, treatment group and group-by-round interaction to predict outcomes of interest, adjusted for household- and child-specific characteristics.
Setting:
Banke, Nepal.
Participants:
Households (n 974) with children aged 1–60 months (n 1333).
Results:
Children in Full Package households had better endline anthropometry (weight-for-age, weight-for-height, and mid-upper-arm-circumference Z-scores), higher DD, and greater consumption of ASF, after adjusting for household- and child-specific characteristics. By endline, compared with Partial Package or Control groups, Full Package households demonstrated preferential child feeding practices and had significantly more improvement in household wealth and hygiene habits.
Conclusions:
In this longitudinal study, a comprehensive multisectoral intervention was more successful in improving key growth indicators as well as diet quality in young children. Provision of training in livestock management and nutrition education alone had limited effect on these outcomes. Although more time-consuming and costly to administer, incorporating nutrition training with community social capital development was associated with better child growth and nutrition outcomes than isolated training programmes alone.
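The mixed-effect models described in the Design section, with a group-by-round interaction and village-level clustering, could be sketched as follows. Everything here is synthetic and illustrative: the outcome `waz` (a weight-for-age Z-score stand-in), the number of villages, and the effect sizes are assumptions, not the study's data.

```python
# Illustrative mixed-effect linear model: outcome ~ round * group, with a
# random intercept per village. All data are simulated for the sketch.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_villages, n_per = 30, 20
df = pd.DataFrame({
    "village": np.repeat(np.arange(n_villages), n_per),
    "round": rng.integers(0, 5, n_villages * n_per),  # survey round 0-4
    "group": np.repeat(rng.choice(["control", "partial", "full"],
                                  n_villages), n_per),
})
village_effect = rng.normal(0, 0.3, n_villages)[df["village"]]
df["waz"] = (-1.0 + 0.05 * df["round"]
             + 0.1 * (df["group"] == "full") * df["round"]  # synthetic benefit
             + village_effect + rng.normal(0, 0.5, len(df)))

# Fixed effects for round, group, and their interaction;
# random intercept for village captures within-village correlation.
model = smf.mixedlm("waz ~ round * C(group)", df, groups=df["village"])
fit = model.fit()
print(fit.params["round"])
```

The group-by-round interaction terms are what carry the treatment effect of interest: they test whether outcomes changed faster over survey rounds in intervention villages than in control villages.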
Hollow microneedle arrays are important for both drug delivery and wearable sensor applications; however, their fabrication poses many challenges. Hollow metal microneedle arrays residing on a flexible metal foil substrate were created by combining additive manufacturing, micromolding, and electroplating approaches in a process we refer to as electromolding. A solid microneedle with an inward-facing ledge was fabricated with a two-photon polymerization (2PP) system utilizing laser direct write (LDW) and then molded with polydimethylsiloxane (PDMS). These molds were then coated with a seed layer of Ti/Au and subsequently electroplated with pulsed deposition to create hollow microneedles. The inward-facing ledge provided a physical blocking platform that restricted deposition of the metal seed layer for creation of the microneedle bore. Various ledge sizes were tested and showed that the resulting seed layer void could be controlled via the ledge length. The mechanical properties of the PDMS mold were adjusted via the precursor ratio to create a more ductile mold that eliminated tip damage to the microneedles upon removal from the molds. Master structures were capable of being molded numerous times, and molds were able to be reused. SEM/EDX analysis showed that trace amounts of the PDMS mold were transferred to the metal microneedle upon removal. The microneedle substrate showed a degree of flexibility, withstanding over 100 cycles of side-to-side bending without damage. Microneedles were tested for their fracture strength and were capable of puncturing porcine skin and injecting a dye.
Dietary fatty acid (FA) composition may influence metabolism, possibly affecting weight management. The purpose of this study was to compare the effects of a 5-d diet rich in PUFA v. MUFA. A total of fifteen normal-weight men participated in a randomised cross-over design with two feeding trials (3 d lead-in diet, pre-diet visit, 5-d PUFA- or MUFA-rich diet, post-diet visit). The 5-d diets (50 % fat) were rich in either PUFA (25 % of energy) or MUFA (25 % of energy). At pre- and post-diet visits, subjects consumed breakfast and lunch test meals, rich in the FA for that 5-d diet. Indirect calorimetry was used for 4 h after each meal. There were no treatment differences in fasting metabolism acutely or after the 5-d diet. For acute meal responses before diet, RER was higher for PUFA v. MUFA (0·86 (sem 0·01) v. 0·84 (sem 0·01), P<0·05), whereas diet-induced thermogenesis (DIT) was lower for PUFA v. MUFA (18·91 (sem 1·46) v. 21·46 (sem 1·34) kJ, P<0·05). After the 5-d diets, the change in RER was different for PUFA v. MUFA (−0·02 (sem 0·01) v. 0·00 (sem 0·01), P<0·05). Similarly, the change in fat oxidation was greater for PUFA v. MUFA (0·18 (sem 0·07) v. 0·04 (sem 0·06) g, P<0·05). In conclusion, acutely, a MUFA-rich meal results in lower RER and greater DIT. However, after a 5-d high-fat diet, the change in metabolic responses was greater in the PUFA diet, showing the metabolic adaptability of a PUFA-rich diet.
In this investigation, we report the increase in emergency department and inpatient admission cases during November 2012, following Hurricane Sandy, as compared with baseline (November 2010, 2011, and 2013) for elderly patients aged 65 and older.
Methods
Medical claims data for patients aged 65 and over treated at emergency department and inpatient health care facilities in New Jersey were analyzed to examine the surge in frequencies of diagnoses treated immediately following Hurricane Sandy. The differences were quantified using gap analysis for 2 years before and 1 year after the event.
Results
There was an average increase of 1700 cases for the month of November 2012 relative to baseline for the top 15 most frequently diagnosed emergency department medical conditions. On a daily basis, an average volume increase of 57 cases could be expected, including significant numbers of limb fractures and other trauma cases among these most frequently encountered medical conditions.
Conclusions
Understanding the surge level in medical services needed in emergency departments and inpatient facilities during a natural disaster aftermath is critical for effective emergency preparation and response for the elderly population. (Disaster Med Public Health Preparedness. 2018;12:730-738)
In this study, we analyzed the patterns of socioeconomic and demographic factors along with health services provider availability for the current Zika outbreak in Miami-Dade County, South Florida. We used Center for Consumer Information & Insurance Oversight (CCIIO) Machine-Readable Public Use Files (MR-PUFs) to examine provider availability in combination with socioeconomic and demographic factors that could potentially lead to healthcare disparities between any underserved population of the Wynwood neighborhood and the broader population of Miami-Dade County. MR-PUFs contain public provider-level data from states that are participating in the Federally Facilitated Marketplace. According to CCIIO, an issuer of a Qualified Health Plan that uses a provider network must maintain a network that is sufficient in the number and types of providers, including providers that specialize in mental-health and substance-use disorder services, to assure that all services will be accessible to enrollees without unreasonable delay. (Disaster Med Public Health Preparedness. 2018;12:455–459)
Coapplication of herbicides and insecticides affords growers an opportunity to control multiple pests with one application, provided that efficacy is not compromised. Glufosinate was applied at 470 g ai/ha both alone and in combination with the insecticides acephate, acetamiprid, bifenthrin, cyfluthrin, dicrotophos, emamectin benzoate, imidacloprid, indoxacarb, lambda-cyhalothrin, methoxyfenozide, spinosad, or thiamethoxam to determine coapplication effects on control of some of the more common and/or troublesome broadleaf weeds infesting cotton. Hemp sesbania, pitted morningglory, prickly sida, redroot pigweed, and sicklepod were treated at the three- to four- or the seven- to eight-leaf growth stage. When applied at the earlier application timing, glufosinate applied alone provided complete control at 14 d after treatment, and control was unaffected by coapplication with insecticides. When glufosinate application was delayed to the later application timing, visual weed control was unaffected by insecticide coapplication. Fresh-weight reduction from the herbicide applied to larger weeds was negatively impacted by addition of the insecticides dicrotophos and imidacloprid with respect to redroot pigweed and prickly sida, but only in one of two experiments. In most cases, delaying application of glufosinate to larger weeds resulted in reduced control compared to that from a three- to four-leaf application, with the extent of reduction varying by species. Results indicate that when applied according to the herbicide label (three- to four-leaf stage), glufosinate/insecticide coapplications offer producers the ability to integrate pest management strategies and to limit application costs without sacrificing control of the broadleaf weeds evaluated.
Field studies were conducted to evaluate weed control with combinations of glyphosate at 750 g ae/ha and the insecticides acephate (370 g ai/ha), dicrotophos (370 g ai/ha), dimethoate (220 g ai/ha), fipronil (56 g ai/ha), imidacloprid (53 g ai/ha), lambda-cyhalothrin (37 g ai/ha), oxamyl (280 g ai/ha), or endosulfan (420 g ai/ha) and insect control with coapplication of the herbicide with insecticides acephate, dicrotophos, dimethoate, and imidacloprid. Applying lambda-cyhalothrin or fipronil with glyphosate reduced control of hemp sesbania by 19 and 9 percentage points, respectively, compared with glyphosate alone. Acephate, dicrotophos, dimethoate, imidacloprid, lambda-cyhalothrin, oxamyl, and endosulfan did not affect hemp sesbania, pitted morningglory, prickly sida, and redweed control by glyphosate. Lambda-cyhalothrin and fipronil did not affect glyphosate control of weeds other than hemp sesbania. Addition of glyphosate to dicrotophos improved cotton aphid control 4 d after treatment compared with dicrotophos alone. Thrips control was improved with addition of glyphosate to imidacloprid. Insect control was not reduced by glyphosate regardless of insecticide.
Field trials were conducted in 2005 and 2006 to evaluate application of glyphosate alone or plus the plant growth regulator mepiquat chloride with 20 different insecticides to second-generation glyphosate-resistant cotton at the pinhead square or first bloom growth stages. At 7 DAT, averaged across cotton growth stages and herbicide treatments, combination with insecticides profenofos and methomyl resulted in 5 and 9% plant injury, respectively, and were the only insecticide combinations that resulted in injury greater than glyphosate or glyphosate plus mepiquat chloride applied alone. By 14 DAT, cotton injury was less than 2% for all treatments. Averaged across cotton growth stages and insecticides, addition of mepiquat chloride to glyphosate resulted in a 4 and 6 cm height reduction at 7 and 28 DAT, respectively. Seed cotton yield and percent first harvest were similar for all treatments, indicating that cotton injury and height reductions observed after application did not result in yield reductions or maturity delays. Glyphosate combined with insecticides and mepiquat chloride, in accordance with herbicide labeling for second-generation glyphosate-resistant cotton, offers producers the ability to integrate pest and crop management strategies and reduce application costs with minimal effect on the crop.
Field research was conducted for 2 yr to determine the effects of reduced rates of bromoxynil on growth and yield of non–bromoxynil-resistant cotton. Rates of 4.5, 9, 17, 35, 70, and 140 g ha−1, representing 0.008, 0.016, 0.031, 0.063, 0.125, and 0.25 fractions of the maximum labeled use rate per application (560 g ha−1), were applied to cotton at the two-, five-, or nine-node growth stage. Visual injury was reduced as application timing was delayed from the two- to five-node stage in all experiments and from the five- to nine-node stage in two of three experiments. Although negatively affected at all application timings, plant height reduction response decreased with increasing cotton maturity. Plant dry weight was most negatively affected after application at the two-node stage. Bromoxynil application, based on the node above white flower number, did not result in maturity delays but did promote earlier maturity when applied at 140 g ha−1 to two- and five-node stage cotton in one of the three experiments. Final plant population was reduced only at the two- and five-node timings, with response more pronounced at the initial timing. Seedcotton yield after bromoxynil application at the highest rate to two-leaf cotton was reduced 34% compared with other rates and the nontreated control. Bromoxynil applied to five- or nine-node cotton did not significantly reduce yield.
On 4 July 1986, dye was injected at a point slightly above the equilibrium line on Storglaciären, a small valley glacier in northern Sweden. Just below the equilibrium line, the glacier bed is overdeepened. The dye re-appeared in a stream at the glacier terminus over the next 35 d. This stream normally carries relatively little sediment, in contrast to the situation in another nearby stream that also emerges from the glacier. This suggests that the dye traveled in englacial rather than subglacial conduits. Tracer tests utilizing salt in bore holes in the overdeepening support this interpretation, as the bore holes were draining well above the bed. The dye appeared during three distinct events, suggesting that it became divided into at least three separate parcels shortly after injection. This probably occurred in the crevassed area in the vicinity of the injection point.
The englacial location of the drainage may be explained by the fact that, in order to remain at the pressure melting-point, water in subglacial conduits coming out of the overdeepening may have had to warm up faster than would be possible by viscous heating alone. Such conduits would thus tend to freeze closed.
Field research was conducted in 1999 and 2000 to determine the effect of reduced glyphosate rates on growth and yield of nonglyphosate-resistant cotton. Rates of 9, 18, 35, 70, 140, and 280 g ha−1, representing 0.008, 0.016, 0.031, 0.063, 0.125, and 0.25, respectively, of the maximum use rate per application (1,120 g ha−1), were applied to cotton at the two-, five-, or nine-node growth stage. On the basis of visual injury estimates, cotton was more tolerant to glyphosate at the nine-node than at earlier growth stages. Plant dry weight was reduced with 70 g ha−1 of glyphosate or higher, when applied at the two- and five-node growth stages in two of three experiments. Dry weight was not affected by glyphosate at the nine-node stage. Plant height also was unaffected by glyphosate rates below 70 g ha−1, but height reduction was noted for all growth stage-by-experiment combinations, with the exception of the nine-node application for both experiments in 2000, with herbicide rates of 70 g ha−1 or higher. Cotton maturity delay, as noted by an increase in node above white flower number, was observed only at the highest glyphosate rate applied to two- and five-node cotton in one of three experiments. Percent open boll data analysis indicated a decreased opportunity of observing an open boll with increasing glyphosate rate, and this effect was greater at the five-node compared with the two- and nine-node stages in two of three experiments. Seedcotton yield after all glyphosate applications was equivalent to that for the nontreated control.
Field studies investigated possible interactions associated with early-season coapplication of the herbicide pyrithiobac and various insecticides. Pyrithiobac at 70 g ai/ha, in combination with the insecticides acephate or dicrotophos at 370 g ai/ha, fipronil at 56 g ai/ha, imidacloprid at 52 g ai/ha, lambda-cyhalothrin at 37 g ai/ha, or oxamyl, carbofuran, or dimethoate at 280 g ai/ha did not reduce cotton leaf area, height, main stem node number, main stem nodes to first square, days to first square or flower, main stem nodes above white flower, or seed cotton yield compared with pyrithiobac alone. Pyrithiobac alone reduced dry weight of pitted morningglory, hemp sesbania, prickly sida, velvetleaf, and entireleaf–ivyleaf morningglory 28 d after treatment (DAT) 86, 98, 51, 94, and 91%, respectively, and weed control was not affected by the coapplication of insecticides. Control of thrips (adults plus larvae) 5 DAT with insecticides was unaffected by pyrithiobac addition at the P = 0.05 level of significance. At the P = 0.1 level, however, addition of pyrithiobac to dimethoate resulted in a reduction in insecticide efficacy in one of three experiments. Efficacy of other insecticides was unaffected.
Field research was conducted for 2 yr to determine the effect of reduced rates of glufosinate on growth and yield of non–glufosinate-resistant cotton. Rates of 3.4, 6.7, 13, 26.5, 52.5, and 105 g ha−1, representing 0.008, 0.016, 0.031, 0.063, 0.125, and 0.25 of an effective use rate (420 g ha−1), were applied to cotton at the two-, five-, or nine-node growth stage. Based on analysis of visual injury, cotton response decreased as application timing was delayed in one of the three experiments. Injury response was increased slightly with application at the five- compared with the two-node growth stage and was not significant for the latest application timing (nine-node stage) in two of three experiments. In two of the three experiments, plant height reduction response was lowest at the five-node stage and greatest at the nine-node stage. Regardless of application timing, plant dry weight was negatively affected only with the highest rate of glufosinate. Glufosinate application, based on node above white flower number and percent open boll, did not result in a delay in maturity. Final plant population was reduced in all experiments at the two-node application and in one of the three experiments at the five-node stage. Glufosinate application did not adversely affect final plant population when applied to nine-node cotton. Negative effects on cotton growth were not manifested in seedcotton yield reduction after glufosinate application.