Governments are increasingly implementing policies to improve population diets, despite food industry resistance to regulation that may reduce its profits from sales of unhealthy foods. Retail food environments remain an important target for policy action. This study analysed publicly available responses of industry actors to two public consultations on regulatory options for restricting price and placement promotions of unhealthy food in retail outlets in Scotland.
Design:
We conducted a qualitative content analysis guided by the Policy Dystopia Model to identify the discursive (argument-based) and instrumental (tactic-based) strategies used by industry actors to counter the proposed food retail policies.
Setting:
Scotland, UK, 2017–2019.
Participants:
N/A
Results:
Most food and retail industry responses opposed the policy proposals. Discursive strategies employed by these actors commonly highlighted the potential costs to the economy, their industries and the public in the context of a financial crisis, and disputed the potential health benefits of the proposals. They claimed that existing efforts to improve population diets, such as nutritional reformulation, would be undermined. Instrumental strategies included using unsubstantiated and misleading claims, building a coordinated narrative focused on key opposing arguments and seeking further involvement in policy decision-making.
Conclusions:
These findings can be used by public health actors to anticipate and prepare for industry opposition when developing policies targeted at reducing the promotion of unhealthy food in retail settings. Government action should ensure robust management of conflicts of interest and establishment of guidance for the use of supporting evidence as part of the public health policy process.
Inadequate recruitment and retention impede clinical trial goals. Emerging decentralized clinical trials (DCTs) leveraging digital health technologies (DHTs) for remote recruitment and data collection aim to address barriers to participation in traditional trials. The ACTIV-6 trial is a DCT using DHTs, but participants’ experiences of such trials remain largely unknown. This study explored participants’ perspectives of the ACTIV-6 DCT that tested outpatient COVID-19 therapeutics.
Methods:
Participants in the ACTIV-6 study were recruited via email to share their day-to-day trial experiences during 1-hour virtual focus groups. Two human factors researchers guided group discussions through a semi-structured script that probed expectations and perceptions of study activities. Qualitative data analysis was conducted using a grounded theory approach with open coding to identify key themes.
Results:
Twenty-eight ACTIV-6 study participants aged 30 years and older completed virtual focus groups of 1–4 participants each. Analysis yielded three major themes: perceptions of the DCT experience, study activity engagement, and trust. Participants perceived the use of remote DCT procedures supported by DHTs as an acceptable and efficient method of organizing and tracking study activities, communicating with study personnel, and managing study medications at home. Use of social media was effective in supporting geographically dispersed participant recruitment but also raised issues of trust and study legitimacy.
Conclusions:
While participants in this qualitative study viewed the DCT-with-DHT approach as reasonably efficient and engaging, they also identified challenges to address. Understanding facilitators and barriers to DCT participation and DHT interaction can help improve future research design.
We present a 1000 km transect of phase-sensitive radar measurements of ice thickness, basal reflection strength, basal melting and ice-column deformation across the Ross Ice Shelf (RIS). Measurements were gathered at varying intervals in austral summer between 2015 and 2020, connecting the grounding line with the distant ice shelf front. We identified changing basal reflection strengths revealing a variety of basal conditions influenced by ice flow and by ice–ocean interaction at the ice base. Reflection strength is lower across the central RIS, while strong reflections in the near-front and near-grounding line regions correspond with higher basal melt rates, up to 0.47 ± 0.02 m a−1 in the north. Melting from atmospherically warmed surface water extends 150–170 km south of the RIS front. Melt rates up to 0.29 ± 0.03 m a−1 and 0.15 ± 0.03 m a−1 are observed near the grounding lines of the Whillans and Kamb Ice Streams, respectively. Although difficult to compare directly, our surface-based observations generally agree with the basal melt pattern provided by satellite-based methods but show a distinctly smoother pattern. Our work delivers a precise measurement of basal melt rates across the RIS, a rare insight that also provides an early 21st-century baseline.
To investigate the concordance between Australian government guidelines for classifying the healthiness of foods across various public settings.
Design:
Commonly available products in Australian food service settings across eight food categories were classified according to each of the seventeen Australian state and territory food classification guidelines applying to public schools, workplaces and healthcare settings. Product nutrition information was retrieved from online sources. The level of concordance between each pair of guidelines was determined by the proportion of products rated at the same level of healthiness.
Setting:
Australia.
Participants:
No human participants.
Results:
Approximately half (56 %) of the 967 food and drink products assessed were classified at the same level of healthiness across all fifteen ‘traffic light’-based systems. Within each setting type (e.g. schools), pairwise concordance in product classifications between guidelines ranged from 74 % to 100 %. ‘Vegetables’ (100 %) and ‘sweet snacks and desserts’ (78 %) had the highest concordance across guidelines, while ‘cold ready-to-eat foods’ (0 %) and ‘savoury snacks’ (23 %) had the lowest concordance. In addition to differences in classification criteria, discrepancies between guidelines arose from different approaches to the grouping of products. The largest proportion of discrepancies (58 %) was attributed to whether products were classified as ‘Red’ (least healthy) or ‘Amber’ (moderately healthy).
Conclusions:
The results indicate only moderate concordance between all guidelines. National coordination to create evidence-based consistency between guidelines would help provide clarity for food businesses, which are often national, on how to better support community health through product development and reformulation.
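As an illustration of the concordance metric described in the Design section (the proportion of products rated at the same level of healthiness by each pair of guidelines), the sketch below computes pairwise agreement for a handful of hypothetical ratings. It is not the study's code, and the guideline names and data are invented.

```python
# Hedged sketch: pairwise concordance between classification guidelines,
# computed as the % of products assigned the same healthiness level by both.
import pandas as pd
from itertools import combinations

# Each column holds one (hypothetical) guideline's rating per product.
ratings = pd.DataFrame({
    "guideline_A": ["Green", "Amber", "Red", "Red", "Amber"],
    "guideline_B": ["Green", "Amber", "Amber", "Red", "Amber"],
    "guideline_C": ["Green", "Red", "Amber", "Red", "Green"],
})

def pairwise_concordance(df: pd.DataFrame) -> dict:
    """Return the % of products rated identically for every pair of guidelines."""
    results = {}
    for a, b in combinations(df.columns, 2):
        results[(a, b)] = round((df[a] == df[b]).mean() * 100, 1)
    return results

print(pairwise_concordance(ratings))
# e.g. {('guideline_A', 'guideline_B'): 80.0, ('guideline_A', 'guideline_C'): 60.0, ...}
```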
Increased ultra-processed food (UPF) intake is associated with adverse health outcomes. However, with limitations in UPF evidence, and partial overlap between UK front-of-package labelling (FOPL) and degree of food processing, the value of food processing within dietary guidance is unclear. This study compared food and drink from the UK National Diet and Nutrition Survey (NDNS) database based on micronutrient content, Nova classification and FOPL. The aim was to examine the micronutrient contributions of UK food and drink to UK government dietary micronutrient recommendations for adult females and males, aged 19–64 years, based on the degree of food processing and FOPL. NDNS items were coded into minimally processed food (MPF), processed culinary ingredients, processed food (PF) and UPF, and assigned FOPL traffic lights. MPF, PF and UPF provided similar average contributions per 100 g to micronutrient recommendations. Per 100 kcal, MPF provided the greatest average contribution (14·4 % (interquartile range (IQR): 8·2–28·1)), followed by PF (7·7 % (IQR: 4·6–10·9)) and then UPF (5·8 % (IQR: 3·1–9·7)). After adjusting for healthy/unhealthy items (presence of 1+ red FOPL), MPF had higher odds of an above-average micronutrient contribution per 100 kcal than UPF (OR: 5·9 (95 % CI 4·9–7·2)) and PF (OR: 3·2 (95 % CI 2·4–4·2)). MPF were more likely to provide greater contributions to micronutrient recommendations than PF or UPF per 100 kcal. These findings suggest that UPF or PF diets are less likely to meet micronutrient recommendations than an energy-matched MPF diet. The results are important for understanding how consumers perceive the healthiness of products based on FOPL.
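The adjusted odds ratios reported above imply a logistic model of the kind sketched below: a hedged illustration, on simulated data with hypothetical variable names (nova_group, any_red_fopl, above_avg_contribution), of how the odds of an above-average micronutrient contribution per 100 kcal can be compared across Nova groups while adjusting for the presence of red FOPL. It is not the authors' analysis code or the NDNS dataset.

```python
# Minimal sketch of an adjusted logistic regression; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "nova_group": rng.choice(["MPF", "PF", "UPF"], size=n),
    "any_red_fopl": rng.integers(0, 2, size=n),  # 1 if one or more red FOPL
})
# Simulated outcome: an above-average contribution is more likely for MPF.
base_prob = df["nova_group"].map({"MPF": 0.7, "PF": 0.45, "UPF": 0.3})
df["above_avg_contribution"] = rng.binomial(1, base_prob)

model = smf.logit(
    "above_avg_contribution ~ C(nova_group, Treatment(reference='UPF')) + any_red_fopl",
    data=df,
).fit(disp=False)

print(np.exp(model.params))  # exponentiated coefficients = odds ratios vs UPF
```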
Research on nutraceutical and dietary interventions in psychiatry has grown substantially, but progress is hindered by methodological inconsistencies and limited reporting standards. To address this, the International Society for Nutritional Psychiatry Research presents the first guidelines on clinical trial design, conduct, and reporting for future clinical trials in this area. Recommendations were developed using a Delphi process including eighteen researchers with considerable clinical trial expertise and experience in methodology, nutraceutical interventions, or dietary interventions in psychiatry. These guidelines provide forty-nine recommendations for clinical trial design and outcomes, five for trial reporting, and seven for future research priorities. The recommendations included in these guidelines are designed to inform both nutraceutical and dietary clinical trial interventions in Nutritional Psychiatry. Common themes include an emphasis on the importance of a multidisciplinary research team and integration of co-design processes into the design and conduct of clinical research, methods to improve transparency and replicability of trial outcomes, and measures to address common biases in nutrition trials. Furthermore, we provide recommendations for future research, including examining a greater variety of nutraceutical and dietary interventions, scalable delivery models, effectiveness and implementation studies, and the need to investigate these interventions in the prevention and management of less studied psychiatric conditions (e.g. schizophrenia and bipolar disorder). Recommendations included within these guidelines are intended to improve the rigor and clinical relevance of ongoing and future clinical trials in Nutritional Psychiatry.
Leader exemplification involves implicit and explicit claims of high moral values made by a leader. We employed a 2 × 3 experimental design with samples of 265 students in Study 1 and 142 working adults in Study 2 to examine the effects of leader exemplification (exemplification versus no exemplification) and ethical conduct (self-serving, self-sacrificial, and self-other focus) on perceived leader authenticity, trust in leader, and organizational advocacy. In Study 1, we found that exemplification produced elevated levels of perceived authenticity, trust, and advocacy in the form of employment and investment recommendations. We also showed that leader ethical conduct moderated this effect, as ratings were highest following a leader’s self-sacrificial conduct, lowest for self-serving conduct, and moderate for conduct reflecting self-other concerns. In Study 2, we replicated these findings for perceived authenticity and trust, but not organizational advocacy, which yielded mixed results. The leadership implications and future research directions are discussed.
To understand the characteristics of food environments in the Pacific region, and the broader economic, policy and sociocultural surroundings that influence food choices and interventions to improve food environments for Pacific communities.
Design:
Systematic searches were conducted for articles related to food environments or factors influencing food choices from 1993 to 2024 in five academic databases, Google, Google Scholar and relevant organisations’ websites. Studies were included if they met the eligibility criteria. Two authors independently reviewed the titles and abstracts of identified articles. Full-text screening was conducted before data were extracted from eligible studies. A narrative analysis was informed by an existing food environments framework.
Setting:
Pacific Island countries or territories that are a member of the Pacific Community (SPC).
Participants:
Not applicable.
Results:
From the sixty-six included studies (of 2520 records screened), it was clear that food environments in the Pacific region are characterised by high availability and promotion of ultra-processed unhealthy foods. These foods were reported to be cheaper than healthier alternatives and have poor nutritional labelling. Food trade and investment, together with sociocultural and political factors, were found to contribute to unhealthy food choices. Policy interventions have been implemented to address food environments; however, the development and implementation of food environment policies could be strengthened through stronger leadership, effective multisectoral collaboration and clear lines of responsibility.
Conclusions:
Interventions focused on improving physical, economic, policy and sociocultural influences on food choices should be prioritised in the Pacific region to improve the food environment and mitigate barriers to healthy eating.
The 17-subunit RNA polymerase III (RNAP III) synthesizes essential untranslated RNAs such as tRNAs and 5S rRNA. In yeast and vertebrates, subunit C82 forms a stable subcomplex with C34 and C31 that is necessary for promoter-specific transcription initiation. Little is known about RNAP III transcription in trypanosomatid parasites. To narrow this knowledge gap, we characterized the C82 subunit in Trypanosoma brucei and Leishmania major. Bioinformatic analyses showed that the 4 distinctive extended winged-helix (eWH) domains and the coiled-coil motif are present in C82 in these microorganisms. Nevertheless, C82 in trypanosomatids presents certain unique traits, including an exclusive loop within the eWH1 domain. We found that C82 localizes to the nucleus and binds to RNAP III-dependent genes in the insect stages of both parasites. Knock-down of C82 by RNA interference significantly reduced the levels of tRNAs and 5S rRNA and led to the death of procyclic forms of T. brucei. Tandem affinity purifications with both parasites allowed the identification of several C82-interacting partners, including C34 and some genus-specific putative regulators of transcription. However, the orthologue of C31 was not found in trypanosomatids. Interestingly, our data suggest a strong association of C82 with TFIIIC subunits in T. brucei, but not in L. major.
Sulfidated nanoscale zerovalent iron (S-nZVI) materials show enhanced reactivity and selectivity towards chlorinated solvents compared to non-sulfidated nZVI, and thus have high potential for subsurface chlorinated solvent remediation. However, little is known about the possible toxic effects of S-nZVI towards microbial communities, which is of particular concern with regard to combined abiotic–biotic chlorinated solvent treatment scenarios. In this study, the toxicity of two different S-nZVI materials towards Shewanella oneidensis MR-1 (S. MR-1) was examined under anaerobic and aerobic conditions using colony forming units (CFU) and adenosine triphosphate (ATP) measurements, and the results were compared to identical exposures performed with non-sulfidated nZVI. In a second step, the toxicity of S-nZVI and nZVI materials was tested on the commercial bioremediation culture KB-1® and on an in-house trichloroethylene enrichment culture. Under aerobic conditions, S. MR-1 viability was less affected by S-nZVI than by non-sulfidated nZVI materials (up to three times higher viability), and overall viability was lower than under anaerobic conditions, where little difference between the tested materials was observed. The two dechlorinating cultures exhibited significantly higher ATP viability during anaerobic exposures to S-nZVI and nZVI materials; KB-1® in particular retained ATP viability after ~60 hours of exposure comparable to that of S. MR-1 after only two hours. Moreover, the ATP viability of the mixed cultures was generally higher in S-nZVI exposures than in nZVI exposures (up to three times higher viability). The observed viability patterns are explained by differences in the shell structure, chemistry and stability of the tested S-nZVI and nZVI materials towards corrosion, while the substantially enhanced resilience of KB-1® is argued to stem from its year-long cultivation in the presence of reduced FeS particulates.
Rift propagation, rather than basal melt, drives the destabilization and disintegration of the Thwaites Eastern Ice Shelf. Since 2016, rifts have episodically advanced throughout the central ice-shelf area, with rapid propagation events occurring during austral spring. The ice shelf's speed has increased by ~70% during this period, transitioning from a rate of 1.65 m d−1 in 2019 to 2.85 m d−1 by early 2023 in the central area. The increase in longitudinal strain rates near the grounding zone has led to full-thickness rifts and melange-filled gaps since 2020. A recent sea-ice break out has accelerated retreat at the western calving front, effectively separating the ice shelf from what remained of its northwestern pinning point. Meanwhile, a distributed set of phase-sensitive radar measurements indicates that the basal melting rate is generally small, likely due to a widespread robust ocean stratification beneath the ice–ocean interface that suppresses basal melt despite the presence of substantial oceanic heat at depth. These observations in combination with damage modeling show that, while ocean forcing is responsible for triggering the current West Antarctic ice retreat, the Thwaites Eastern Ice Shelf is experiencing dynamic feedbacks over decadal timescales that are driving ice-shelf disintegration, now independent of basal melt.
A glyphosate-resistant (GR) Palmer amaranth biotype (CT-Res) was recently confirmed in a pumpkin field in Connecticut. However, the underlying mechanisms conferring glyphosate resistance in this biotype are not known. The main objectives of this research were (1) to determine the effect of plant height (10, 20, and 30 cm) on glyphosate resistance levels in the CT-Res Palmer amaranth biotype, and (2) to investigate whether target site–based mechanisms confer glyphosate resistance. To achieve these objectives, progeny seeds of the CT-Res biotype after two generations of recurrent selection with glyphosate (6,720 g ae ha−1) were used. Known glyphosate-susceptible Palmer amaranth biotypes from Kansas (KS-Sus) and Alabama (AL-Sus) were included for comparison. Results from greenhouse dose-response studies revealed that the CT-Res Palmer amaranth biotype had 69-, 64-, and 54-fold resistance to glyphosate compared with the KS-Sus biotype when treated at heights of 10, 20, and 30 cm, respectively. Sequence analysis of the EPSPS gene revealed no point mutations at the Pro106 and Thr102 residues in the CT-Res Palmer amaranth biotype. Quantitative polymerase chain reaction analysis revealed that the CT-Res biotype had 33 to 111 relative copies of the EPSPS gene compared with the AL-Sus biotype. Together, these results suggest that EPSPS gene amplification endows a high level of glyphosate resistance in the GR Palmer amaranth biotype from Connecticut. Because of the lack of control with glyphosate, growers should adopt effective alternative preemergence and postemergence herbicides in conjunction with other cultural and mechanical tactics to mitigate the further spread of GR Palmer amaranth in Connecticut.
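The abstract does not state how relative EPSPS copy number was calculated, but a common way to derive "relative copies compared with a susceptible biotype" from qPCR data is the 2^-ΔΔCt method, normalising the target gene to a single-copy reference gene and to a susceptible calibrator. The sketch below illustrates that calculation only; the Ct values are invented, not the study's data.

```python
# Hedged sketch of the 2^-ddCt relative copy number calculation (illustrative only).

def relative_copy_number(ct_target_res, ct_reference_res, ct_target_sus, ct_reference_sus):
    """Relative copy number of the target gene in a resistant sample vs a
    susceptible calibrator, normalised to a single-copy reference gene."""
    d_ct_res = ct_target_res - ct_reference_res   # delta-Ct, resistant sample
    d_ct_sus = ct_target_sus - ct_reference_sus   # delta-Ct, susceptible calibrator
    dd_ct = d_ct_res - d_ct_sus
    return 2 ** (-dd_ct)

# Example: EPSPS amplifies ~6 cycles earlier (relative to the reference gene)
# in the resistant biotype, implying ~64 relative copies.
print(relative_copy_number(18.0, 20.0, 24.0, 20.0))  # 64.0
```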
In the UK and Republic of Ireland, the European badger (Meles meles) is considered the most significant wildlife reservoir of the bacterium Mycobacterium bovis, the cause of bovine tuberculosis (bTB). To expand options for bTB surveillance and disease control, the Animal and Plant Health Agency developed a bespoke physical restraint cage to facilitate collection of a small blood sample from a restrained, conscious badger in the field. A key step, prior to pursuing operational deployment of the novel restraint cage, was an assessment of the relative welfare impacts of the approach. We used an established welfare assessment model to elicit expert opinion during two workshops to compare the impacts of the restraint cage approach with the only current alternative for obtaining blood samples from badgers in the field, which involves administration of a general anaesthetic. Eleven panellists participated in the workshops, comprising experts in the fields of wildlife biology, animal welfare science, badger capture and sampling, and veterinary science. Both approaches were assessed to have negative welfare impacts, although in neither case were overall welfare scores higher than intermediate, never exceeding 5–6 out of a possible 8. Based on our assessments, the restraint cage approach is no worse for welfare compared to using general anaesthesia and possibly has a lower overall negative impact on badger welfare. Our results can be used to integrate consideration of badger welfare alongside other factors, including financial cost and efficiency, when selecting a field method for blood sampling free-living badgers.
Cardiac vagal tone is an indicator of parasympathetic nervous system functioning, and there is increasing interest in its relation to antisocial behavior. It is unclear, however, whether antisocial individuals are characterized by increased or decreased vagal tone, and whether increased vagal tone is the source of the low heart rate frequently reported in antisocial populations.
Methods
Participants consisted of four groups of community-dwelling adolescent boys with a mean age of 15.7 years: (1) controls, (2) childhood-only antisocial, (3) adolescent-only antisocial, and (4) persistently antisocial. Heart rate and vagal tone were assessed in three different conditions: rest, cognitive stressor, and social stressor.
Results
All three antisocial groups had both lower resting heart rates and increased vagal tone compared to the low antisocial controls across all three conditions. Low heart rate partially mediated the relationship between vagal tone and antisocial behavior.
Conclusions
Results indicate that increased vagal tone and reduced heart rate are relatively broad risk factors for different developmental forms of antisocial behavior. Findings are the first to implicate vagal tone as an explanatory factor in understanding heart rate–antisocial behavior relationships. Future experimental work using non-invasive vagus nerve stimulation or heart rate variability biofeedback is needed to more systematically evaluate this conclusion.
UK front-of-package labelling (FOPL) informs consumers about the nutrient content of food. However, FOPL does not consider food processing, and with the UK government being urged to act on ultra-processed food (UPF), it is unclear whether UPF should be added to FOPL. This study compared food and drink in the UK National Diet and Nutrition Survey (NDNS) Intake24 database based on FOPL, nutrient content and NOVA classification, to understand whether UPF are covered by dietary recommendations for foods high in fat, salt and sugar. NDNS items were coded into minimally processed food (MPF), processed culinary ingredients, processed food and UPF according to the NOVA classification, and assigned FOPL traffic lights. UPF contained greater energy, fat, saturated fat (SF), total sugar (TS) and salt than MPF. UPF had greater odds of containing red FOPL and an unhealthier overall FOPL score (OR: 4·59 (95 % CI: 3·79, 5·57); OR: 7·0 (95 % CI: 6·1, 8·2), respectively) and lower odds of containing green FOPL (OR: 0·05 (95 % CI: 0·03, 0·10)), compared with MPF. For items with no red FOPL, UPF still contained greater energy, fat, SF, TS and salt than MPF. However, several UPF had healthier FOPL scores. UPF had an unhealthier nutritional profile and FOPL score than MPF. For items with no red FOPL, UPF still had an unhealthier profile than MPF, with a higher energy density. Importantly, not all UPF were unhealthy according to FOPL. These results indicate partial overlap between the FOPL, nutrient content and NOVA classification of UK food and drink products, with implications for UK food and drink labelling.
Energy consumption in buildings, both residential and commercial, accounts for approximately 40% of all energy usage in the United States, and similar figures are reported from countries around the world. This energy is used to maintain a comfortable, secure, and productive environment for occupants. It is therefore crucial that energy consumption in buildings be optimized while maintaining satisfactory levels of occupant comfort, health, and safety. Machine learning (ML) has proven to be an invaluable tool for deriving important insights from data and optimizing various systems. In this work, we review some of the most promising ways in which ML has been leveraged to make buildings smart and energy-efficient. For the convenience of readers, we provide a brief introduction to the relevant ML paradigms and to the components and functioning of each smart building system we cover. Finally, we discuss the challenges faced when implementing machine learning algorithms in smart buildings and outline future avenues for research in this field.
Children who sustain a mild traumatic brain injury (mTBI) are at increased odds of additive injury and continue to show altered motor performance relative to never-injured peers after being medically cleared (MC) to return to normal activities. There is a critical need to determine when children can return to activities without risk of short- and long-term adverse effects, with research showing high reinjury rates for 3-12 months after return to play. The Physical and Neurological Examination for Subtle Signs (PANESS) measures subtle signs of motor impairment during gait, balance, and timed motor functions. Recent literature has demonstrated that PANESS timed motor function can distinguish children medically cleared post-mTBI from never-injured controls. The present study examined performance on timed motor tasks in youth medically cleared from mTBI, shortly after clearance and 3 months later, compared with never-injured peers.
Participants and Methods:
25 children (Mage=14.16, SD=2.46; Male=68%) were enrolled within 6 weeks of medical clearance from mTBI (Mdays post MC=33, SD=13.4, Range=2-59) along with 66 typically developing, never-injured controls (Mage=13.9, SD=2.22; Male=50%). Group differences were evaluated for the Timed Motor section of the PANESS at enrollment and at a 3-month follow-up (Mdays from enrollment to follow-up=95.90, SD=12.69, Range=62-129). This 3-month follow-up occurred on average 4 months after medical clearance (Mdays from MC to follow-up=130.08, SD=17.58, Range=92-164). The Timed Motor section includes Repetitive (foot tapping, hand patting, and finger tapping) and Sequential (heel-toe rocking, hand pronate/supinate, finger sequencing) raw time scores, measured in seconds. The Total Timed Motor Speed score is the combination of Repetitive and Sequential Movement and the side-to-side tongue item.
Results:
At 3-month follow-up, mTBI participants (M=67.55, SD=8.26, Range=53.66-83.88) performed worse than controls (M=63.09, SD=10.23, Range=39.86-100.51) on Total Timed Motor Speed, t(89)=1.95, p<0.05, including when controlling for age and sex, F(1, 87)=4.67, p<0.05. At the same time point, mTBI participants (M=36.54, SD=5.47, Range=28.74-49.17) performed worse on Sequential Speed than controls (M=32.93, SD=6.1, Range=21.49-56.76), t(89)=2.59, p<0.01, including when controlling for age and sex, F(1, 87)=7.687, p<0.01. Although groups performed similarly on Sequential Speed at the initial time point, mTBI participants exhibited a trend of less improvement from initial to follow-up (MmTBI=-1.69, Mcontrol=-3.68, t(90)=1.445, p=0.076).
Conclusions:
Although groups did not significantly differ on Timed Motor Speed items at the initial time point, the mTBI group showed consistently lower scores than controls at both time points and less improvement over time. Results indicate that Total Timed Motor Speed, specifically Sequential Speed, may be a sensitive marker of persisting differences in high-level motor and cognitive learning/control in children who have been medically cleared after mTBI. More data are needed to evaluate these findings over a longer time period, and future studies should examine behavioral markers concurrently with physiologic brain recovery over time.
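The comparisons reported in this abstract (independent-samples t-tests on timed motor scores, followed by models controlling for age and sex) follow a standard pattern. The sketch below illustrates that pattern on simulated data with hypothetical variable names; it is not the study's dataset or analysis code.

```python
# Hedged sketch: group comparison with and without covariate adjustment.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_mtbi, n_ctrl = 25, 66
df = pd.DataFrame({
    "group": ["mTBI"] * n_mtbi + ["control"] * n_ctrl,
    "age": rng.normal(14, 2.3, n_mtbi + n_ctrl),
    "male": rng.integers(0, 2, n_mtbi + n_ctrl),
})
# Simulated outcome (seconds): lower is better; controls made slightly faster.
df["timed_motor_speed"] = (
    rng.normal(67.5, 8.3, n_mtbi + n_ctrl)
    - np.where(df["group"] == "control", 4.5, 0.0)
)

# Independent-samples t-test
t, p = stats.ttest_ind(
    df.loc[df["group"] == "mTBI", "timed_motor_speed"],
    df.loc[df["group"] == "control", "timed_motor_speed"],
)
print(f"t = {t:.2f}, p = {p:.3f}")

# Group effect controlling for age and sex (ANCOVA expressed as a linear model)
print(smf.ols("timed_motor_speed ~ C(group) + age + male", data=df).fit().summary())
```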
Virtual testing can reduce cost and burdens, as well as increase access to clinical care. Few studies have examined the equivalency of virtual and in-person administration of standardized measures of executive functioning in children. During the COVID-19 pandemic, we utilized virtual administration of the Delis-Kaplan Executive Function System, Color-Word Interference Test (DKEFS-CW) in our ongoing longitudinal research study exploring outcomes in children clinically recovered from concussion compared to never-concussed peers. In the current study, we explore the equivalence of scores obtained via in-person and virtual administration of the DKEFS-CW in youth recovered from concussion and never-concussed controls.
Participants and Methods:
Participants included 112 youth ages 10-18 (Mage=14.05 years, SD=2.296; 53.5 % Male) who completed the DKEFS-CW in-person (n=63) or virtually (n=49) as part of their involvement in the parent study. Of these, 38 were recovered from concussion (Mdays since injury=91.21, SD=88.91), and 74 were never-injured controls. Virtual administration was done via Zoom by presenting digital scans of the DKEFS stimulus book using the screen-sharing function. Participants set up and joined the Zoom call from a secondary device (cell phone) that was set in a stable position to provide a view of their screen, mouse and keyboard setup. Group (in-person vs remote) differences in DKEFS-CW scores were examined using independent-samples t-tests for all subtest conditions (color naming, word reading, inhibition, and inhibition/switching). T-tests/chi-square tests were used to examine between-group differences in demographic variables (i.e., age, sex, maternal education, IQ, concussion history). Demographic variables that differed significantly by group were then included as covariates in ANCOVA models examining the effect of administration context on performance.
Results:
There were no significant differences in DKEFS-CW scaled scores between those who were administered the measure in-person or virtually (Color Naming: Min-person=10.78, Mvirtual=10.08, t(110)=1.634, p=.105; Word Reading: Min-person=11.25, Mvirtual=10.92, t(110)=.877, p=.382; Inhibition: Min-person=11.70, Mvirtual=11.24, t(110)=1.182, p=.240; Inhibition/Switching: Min-person=11.29, Mvirtual=10.82, t(110)=1.114, p=.268). There were no significant between-group differences in concussion history, sex, maternal education or IQ. However, those who were administered the DKEFS-CW in-person (Mage=13.55) were significantly younger than those who were administered the measure virtually (Mage=14.69), t(110)=-2.777, p=.006. After controlling for age, there remained no significant relationship between administration context (in-person vs. virtual) and DKEFS-CW performance for any subtest condition (Color Naming: F(1,30)=.016, p=.889; Word Reading: F(1,76)=.655, p=.421; Inhibition: F(1,30)=.038, p=.847; Inhibition/Switching: F(1,30)=.015, p=.902).
Conclusions:
The recommended practice for remote administration of DKEFS-CW is to have test stimuli presented flat on a table by a trained facilitator present with the examinees. Here, we provide preliminary evidence of equivalence between DKEFS-CW scores from tests completed in-person and those completed virtually with stimuli presented on a computer screen. Future studies are needed to replicate these findings in clinical populations with greater variability in executive function. Some clinical populations may also require more in-person support. Likewise, future studies may examine the role of trained facilitators or caregivers in the virtual testing process.
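The age-adjusted analysis described above (testing whether administration context predicts a DKEFS-CW scaled score after controlling for age) can be expressed as a simple linear model. The short sketch below is an illustration with simulated scores and hypothetical variable names, not the study's data or code.

```python
# Hedged sketch: ANCOVA for administration context with age as a covariate.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
n = 112
df = pd.DataFrame({
    "context": rng.choice(["in_person", "virtual"], size=n),
    "age": rng.uniform(10, 18, size=n),
})
df["inhibition_ss"] = rng.normal(11, 3, size=n).round()  # simulated scaled scores

model = smf.ols("inhibition_ss ~ C(context) + age", data=df).fit()
print(anova_lm(model, typ=2))  # F and p for context, adjusted for age
```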
Protected areas safeguard biodiversity of global ecological importance, even throughout armed conflicts. The International Law Commission's Principles on Protection of the Environment in Relation to Armed Conflicts propose that certain ecologically important areas could be designated as protected zones during armed conflicts. This article uses a geospatial analysis of armed conflicts and Key Biodiversity Areas and three case studies to inform recommendations on how the protection of ecologically important areas could be enhanced through visibility, local actors and international stakeholders as part of a broader interpretation of a protected zone.
Despite the critical role that quantitative scientists play in biomedical research, graduate programs in quantitative fields often focus on technical and methodological skills, not on collaborative and leadership skills. In this study, we evaluate the importance of team science skills among collaborative biostatisticians for the purpose of identifying training opportunities to build a skilled workforce of quantitative team scientists.
Methods:
Our workgroup described 16 essential skills for collaborative biostatisticians. Collaborative biostatisticians were surveyed to assess the relative importance of these skills in their current work. The importance of each skill is summarized overall and compared across career stages, highest degrees earned, and job sectors.
Results:
Survey respondents were 343 collaborative biostatisticians spanning career stages (early: 24.2%, mid: 33.8%, late: 42.0%) and job sectors (academia: 69.4%, industry: 22.2%, government: 4.4%, self-employed: 4.1%). All 16 skills were rated as at least somewhat important by > 89.0% of respondents. Significant heterogeneity in importance by career stage and by highest degree earned was identified for several skills. Two skills (“regulatory requirements” and “databases, data sources, and data collection tools”) were more likely to be rated as absolutely essential by those working in industry (36.5%, 65.8%, respectively) than by those in academia (19.6%, 51.3%, respectively). Three additional skills were identified as important by survey respondents, for a total of 19 collaborative skills.
Conclusions:
We identified 19 team science skills that are important to the work of collaborative biostatisticians, laying the groundwork for enhancing graduate programs and establishing effective on-the-job training initiatives to meet workforce needs.
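Sector comparisons of the kind reported in the Results (e.g. whether the proportion rating a skill "absolutely essential" differs between industry and academia) can be checked with a chi-square test of a 2 × 2 table, as sketched below. The counts are rough back-calculations from the reported percentages and respondent numbers, for illustration only; they are not the study's raw data.

```python
# Hedged sketch: chi-square test comparing the proportion rating
# "regulatory requirements" as absolutely essential, industry vs academia.
from scipy.stats import chi2_contingency

# Rows: industry (~76 respondents), academia (~238 respondents).
# Columns: rated absolutely essential, not rated absolutely essential.
table = [
    [28, 48],    # industry: ~36.5% of ~76
    [47, 191],   # academia: ~19.6% of ~238
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```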