We conducted a pilot study of implementing community health workers (CHWs) to assist patients with hypertension and social needs. As part of clinical care, patients identified as having an unmet need were referred to a CHW. We evaluated changes in blood pressure and social needs among 35 patients and conducted interviews to understand participants’ experiences. Participants had a mean age of 54.1 years, and 29 were Black. Twenty-six completed follow-up. Blood pressure and social needs improved from baseline to 6 months. Participants were accepting of CHWs but described challenges in establishing a relationship with a CHW and uncertainty about the CHW’s role.
North Carolina growers have long struggled to control Italian ryegrass, and recent research has confirmed that some Italian ryegrass biotypes have become resistant to nicosulfuron, glyphosate, clethodim, and paraquat. Integrating alternative management strategies is crucial to effectively control such biotypes. The objectives of this study were to evaluate Italian ryegrass control with cover crops and fall-applied residual herbicides and to investigate cover crop injury from residual herbicides. This study was conducted during the fall/winter of 2021–22 in Salisbury, NC, and the fall/winter of 2021–22 and 2022–23 in Clayton, NC. The study was designed as a 3 × 5 split-plot in which the main plots consisted of three cover crop treatments (no cover, cereal rye (Secale cereale L.) at 80 kg ha−1, and crimson clover (Trifolium incarnatum L.) at 18 kg ha−1), and the subplots consisted of five residual herbicide treatments (S-metolachlor, flumioxazin, metribuzin, pyroxasulfone, and nontreated). In the 2021–22 season at Clayton, metribuzin injured cereal rye and crimson clover 65% and 55%, respectively; however, metribuzin injured both cover crops ≤6% in 2022–23. Flumioxazin caused unacceptable crimson clover injury of 50% and 38% in 2021–22 and 2022–23 at Clayton, respectively, and 40% in Salisbury. Without preemergence herbicides, cereal rye controlled Italian ryegrass 85% and 61% at 24 wk after planting in 2021–22 and 2022–23 at Clayton, respectively, and 82% in Salisbury. In 2021–22, Italian ryegrass seed production was lowest in cereal rye plots at both locations, except when cereal rye was treated with metribuzin. For example, in Salisbury, cereal rye plus metribuzin resulted in 39,324 seeds m−2, compared with ≤4,386 seeds m−2 from all other cereal rye treatments. In 2022–23, Italian ryegrass seed production in cereal rye was lower when either metribuzin or pyroxasulfone was used preemergence (2,670 and 1,299 seeds m−2, respectively) than in cereal rye that did not receive a residual herbicide (5,600 seeds m−2).
In December 2018, an outbreak of Salmonella Enteritidis infections was identified in Canada by whole-genome sequencing (WGS). An investigation was initiated to identify the source of the illnesses, which proved challenging and complex. Microbiological hypothesis generation methods included comparisons of Salmonella isolate sequence data to historical domestic outbreaks and international repositories. Epidemiological hypothesis generation methods included routine case interviews, open-ended centralized re-interviewing, thematic analysis of open-ended interview data, collection of purchase records, a grocery store site visit, analytic comparison to healthy control groups, and case–case analyses. Food safety hypothesis testing methods included food sample collection and analysis, and traceback investigations. Overall, 83 cases were identified across seven provinces, with onset dates from 6 November 2018 to 7 May 2019. Case ages ranged from 1 to 88 years; 60% (50/83) were female; 39% (22/56) were hospitalized; and three deaths were reported. Brand X profiteroles and eclairs imported from Thailand were identified as the source of the outbreak, and eggs from an unregistered facility were hypothesized as the likely cause of contamination. This study aims to describe the outbreak investigation and highlight the multiple hypothesis generation methods that were employed to identify the source.
An investigation into an outbreak of Salmonella Newport infections in Canada was initiated in July 2020. Cases were identified across several provinces through whole-genome sequencing (WGS). Exposure data were gathered through case interviews. Traceback investigations were conducted using receipts, invoices, import documentation, and menus. A total of 515 cases were identified in seven provinces, related by 0–6 whole-genome multi-locus sequence typing (wgMLST) allele differences. The median age of cases was 40 years (range 1–100); 54% were female; 19% were hospitalized; and three deaths were reported. Forty-eight location-specific case sub-clusters were identified in restaurants, grocery stores, and congregate living facilities. Of the 414 cases with exposure information available, 71% (295) reported eating onions in the week prior to becoming ill, and 80% of those who reported eating onions specified red onions. The traceback investigation identified red onions from Grower A in California, USA, as the likely source of the outbreak, and the first of many food recall warnings was issued on 30 July 2020. Salmonella was not detected in any tested food or environmental samples. This paper summarizes the collaborative efforts undertaken to investigate and control the largest Salmonella outbreak in Canada in over 20 years.
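For readers unfamiliar with the relatedness criterion quoted above (cases related by 0–6 wgMLST allele differences), the sketch below shows how pairwise allele distances are typically computed. The profiles, locus names, and missing-locus convention are invented for illustration and are not the investigation's actual pipeline.

```python
from itertools import combinations

def allele_distance(a: dict[str, int], b: dict[str, int]) -> int:
    """Count loci at which two wgMLST allele profiles differ.

    Loci missing from either profile are skipped, one common
    (though not universal) convention for handling missing data.
    """
    shared = a.keys() & b.keys()
    return sum(1 for locus in shared if a[locus] != b[locus])

# Hypothetical profiles keyed by isolate ID (locus names invented).
profiles = {
    "iso1": {"locus_0001": 5, "locus_0002": 12, "locus_0003": 7},
    "iso2": {"locus_0001": 5, "locus_0002": 12, "locus_0003": 9},
    "iso3": {"locus_0001": 8, "locus_0002": 12, "locus_0003": 7},
}

# Flag isolate pairs within 6 allele differences, the range
# reported for this outbreak cluster.
for x, y in combinations(profiles, 2):
    d = allele_distance(profiles[x], profiles[y])
    if d <= 6:
        print(f"{x} ~ {y}: {d} allele difference(s)")
```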
Emergency psychiatric care, unplanned hospital admissions, and inpatient health care are the costliest forms of mental health care. According to Statistics Canada (2018), almost 18% (5.3 million) of Canadians reported needing mental health support; however, only just over half (56.2%) reported that their needs were fully met. To further expand capacity and access to mental health care in the province, Nova Scotia Health has launched a novel mental health initiative, the Rapid Access and Stabilization Program (RASP).
Objectives
This study evaluates the effectiveness and impact of the RASP on high-cost health services utilization (e.g. ED visits, mobile crisis visits, and inpatient treatments) and related costs. It also assesses healthcare partners’ (e.g. healthcare providers, policymakers, community leaders) perceptions of the program and patient experiences and satisfaction, and identifies sociodemographic characteristics, psychological conditions, recovery, well-being, and risk measures in the assisted population.
Methods
This is a hypothesis-driven program evaluation study that employs a mixed methods approach. A within-subject comparison will examine health services utilization data from patients attending the RASP, one year before and one year after their psychiatry assessment at the program. A controlled between-subject comparison using historical data from a control population will examine whether any changes in high-cost health services utilization are associated with the intervention (RASP). The primary analysis involves extracting secondary data from provincial information systems, electronic medical records, and regular self-reported clinical assessments. Additionally, a qualitative sub-study will examine patient experience and satisfaction and healthcare partners’ impressions.
Results
Results for the primary, secondary, and qualitative outcome measures are expected to be available within 6 months of study completion. We expect the RASP evaluation findings to demonstrate a minimum 10% reduction in high-cost health services utilization, a corresponding 10% cost saving, and a reduction in wait times for patient consultations with psychiatrists to less than 30 calendar days. In addition, we anticipate that patients, healthcare providers, and healthcare partners will express high levels of satisfaction with the new service.
Conclusions
This study will demonstrate the results of the Mental Health and Addictions Program’s (MHAP) efforts to provide stepped care, particularly community-based support, to individuals with mental illness. The results will provide new insights into a novel community-based approach to mental health service delivery and contribute to knowledge of how to implement mental health programs across varying contexts.
Obesity is a significant health issue in Aotearoa; effective and pragmatic strategies to facilitate weight loss are urgently required. Growing recognition of the circadian rhythm’s impact on metabolism has popularised diets like time-restricted eating (TRE)(1). The 16:8 TRE method involves limiting food intake to an 8-hour daily eating window and can lead to weight loss without other substantial changes to diet(2). Nonetheless, TRE requires accountability and tolerating hunger for short periods. Continuous glucose monitors (CGM) are small wearable biofeedback devices that measure interstitial glucose levels, scanned via smartphone. By providing immediate feedback on the physiological effects of eating and fasting, CGM use may promote adherence to TRE(3). This pilot study aimed to 1) investigate how CGM affects adherence to TRE and 2) assess the feasibility of CGM use while undertaking TRE. This two-arm randomised controlled trial enrolled healthy adults from Dunedin, assigning them to TRE-only or TRE+CGM groups for 14 days. Successful adherence to TRE was defined a priori as maintaining an 8-hour eating window on 80% of days. CGM feasibility was defined a priori as scanning the glucose monitor thrice daily on 80% of days. Secondary outcomes included well-being, anthropometry, glucose levels, and overall TRE and CGM experiences via semi-structured interviews. Twenty-two participants were randomised into two groups: TRE-only (n = 11) and TRE+CGM (n = 11, with n = 2 excluded from analysis post-randomisation for medical reasons). Participants had a diverse range of ethnicities, a mean age of 32 (±14.9) years, and 55% were female. The TRE+CGM group adhered to the 8-hour eating window for an average of 10.0 days (range 2–14) compared with 8.6 days (range 2–14) in the TRE-only group. Both groups had similar mean eating window durations of 8.1 hours. Five (56%) participants in the TRE+CGM group met the a priori criterion for TRE adherence, compared with 3 (27%) in the TRE-only group. Participants in the TRE+CGM group performed an average of 8.2 (±5.6) daily scans, with 7 (78%) meeting the a priori CGM feasibility criterion. Neither group reported consistent adverse psychological impacts on DASS-21 and WHO-5 scores. Interviews highlighted that CGM increased hunger tolerance during fasting, as participants felt reassured by their normal glucose levels. CGM aided TRE accountability by acting as a biological tracker of food intake. Participants reported that TRE led to improved energy and self-efficacy, a more productive daily routine, and healthier food choices. Promisingly, 72% of participants would use CGM and undertake TRE in the future. This study demonstrates that using CGM while undertaking TRE is feasible and can improve adherence by enhancing hunger tolerance and accountability. Overall, participants experienced increased awareness of eating habits and physiological mechanisms. Over the longer term, this simple and synergistic approach may be a helpful weight loss strategy.
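To make the a priori criteria above concrete, here is a minimal sketch of how adherence could be computed from daily logs, assuming each day records the first and last food intake times and the number of CGM scans; the log values are hypothetical, not study data.

```python
from datetime import datetime

# Hypothetical daily logs: (first intake, last intake, CGM scans that day).
days = [
    ("07:55", "15:40", 4),
    ("08:10", "15:50", 3),
    ("07:30", "16:45", 2),  # eating window > 8 h and < 3 scans
]

def window_hours(first: str, last: str) -> float:
    """Length of the daily eating window in hours."""
    fmt = "%H:%M"
    delta = datetime.strptime(last, fmt) - datetime.strptime(first, fmt)
    return delta.total_seconds() / 3600

tre_ok = sum(window_hours(first, last) <= 8.0 for first, last, _ in days)
cgm_ok = sum(scans >= 3 for _, _, scans in days)

print(f"TRE-adherent days: {tre_ok / len(days):.0%} (criterion: >= 80%)")
print(f"CGM-feasible days: {cgm_ok / len(days):.0%} (criterion: >= 80%)")
```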
The complementary feeding period (6–23 months of age), when solid foods are introduced alongside breastmilk or infant formula, involves the most significant dietary change a person will experience. The introduction of complementary foods is important to meet changing nutritional requirements(1). Despite the rising Asian population in New Zealand, and the importance of nutrition during the complementary feeding period, there is currently no research on Asian New Zealand (NZ) infants’ micronutrient intakes from complementary foods. Complementary foods are a more easily modifiable component of the diet than breastmilk or other infant milk intake. This study aimed to compare the dietary intake of micronutrients from complementary foods of Asian and non-Asian infants in NZ. This study reported a secondary analysis of the First Foods New Zealand cross-sectional study of infants (aged 7.0–9.9 months) in Dunedin and Auckland. 24-hour recall data were analysed using FoodFiles 10 software with the NZ food composition database FOODfiles 2018, and additional data for commercial complementary foods(2). The multiple source method was used to estimate usual dietary intake. Ethnicity was collected from the main questionnaire of the study, answered by the respondents (the infant’s parent/caregiver). Within the Asian NZ group, three Asian subgroups were identified: South East Asian, East Asian, and South Asian. The non-Asian group included all remaining participants of non-Asian ethnicities. Most nutrient reference values (NRVs)(3) available for the 7–12 month age group are for total intake from complementary foods and infant milks, so adequacy of micronutrient intakes from complementary foods alone could not be determined. Vitamin A was the only micronutrient investigated in this analysis with an NRV available for complementary foods only, allowing conclusions about adequacy to be made. The Asian NZ group (n = 99) had lower mean group intakes than the non-Asian group (n = 526) for vitamin A (274 µg vs. 329 µg) and vitamin B12 (0.49 µg vs. 0.65 µg), and similar intakes for vitamin C (27.8 mg vs. 28.5 mg) and zinc (1.7 mg vs. 1.9 mg). Mean group iron intakes were the same for both groups (3.0 mg). The adequate intake (AI) for vitamin A from complementary foods (244 µg) was exceeded by the mean intakes of both groups, suggesting that vitamin A intakes were adequate. The complementary feeding period is a critical time for obtaining nutrients essential for development and growth. The results from this study indicate that Asian NZ infants have lower intakes of two of the micronutrients of interest than non-Asian infants in NZ. However, future research that includes infant milk intake in these groups is needed to understand total micronutrient intakes. Vitamin A intakes do appear to be adequate in NZ infants.
The prevalence of food allergies in New Zealand infants is unknown; however, it is thought to be similar to Australia, where the prevalence is over 10% of 1-year-olds(1). Current New Zealand recommendations for reducing the risk of food allergies are to: offer all infants the major food allergens (in an age-appropriate texture) at the start of complementary feeding (around 6 months); ensure major allergens are given to all infants before 1 year; once a major allergen is tolerated, maintain tolerance by regularly (approximately twice a week) offering the allergen food; and continue breastfeeding while introducing complementary foods(2). To our knowledge, there is no research investigating whether parents follow these recommendations. Therefore, this study aimed to explore parental offering of major food allergens to infants during complementary feeding and parental-reported food allergies. The cross-sectional study included 625 parent-infant dyads from the multi-centred (Auckland and Dunedin) First Foods New Zealand study. Infants were 7-10 months of age, and participants were recruited in 2020-2022. This secondary analysis used a study questionnaire and 24-hour diet recall data. The questionnaire determined whether the infant was currently breastfed, whether major food allergens were offered to the infant, whether parents intended to avoid any foods during the first year of life, whether the infant had any known food allergies, and, if so, how they were diagnosed. Consumption of major food allergens was assessed using 24-hour diet recall data (2 days per infant). The questionnaire indicated that all major food allergens had been offered to only 17% of infants aged 9-10 months. On the diet recall days, dairy (94.4%) and wheat (91.2%) were the most commonly consumed major food allergens. Breastfed infants (n = 414) were more likely to consume sesame than non-breastfed infants (n = 211) (48.8% vs 33.7%, p≤0.001). Overall, 12.6% of infants had a parental-reported food allergy, with egg allergy the most common (45.6% of parents who reported a food allergy). A symptomatic response after exposure was the most common diagnostic tool. In conclusion, only 17% of infants had been offered all major food allergens by 9-10 months of age. More guidance may be required to ensure current recommendations are followed and that all major food allergens are introduced by 1 year of age. These results provide critical insight into parents’ current practices, which is essential in determining whether more targeted advice regarding allergy prevention and diagnosis is required.
Although food insecurity affects a significant proportion of young children in New Zealand (NZ)(1), evidence of its association with dietary intake and sociodemographic characteristics in this population is lacking. This study aims to assess the household food security status of young NZ children and its association with energy and nutrient intake and sociodemographic factors. This study included 289 caregiver and child (1-3 years old) dyads from the same household in either Auckland, Wellington, or Dunedin, NZ. Household food security status was determined using a validated, NZ-specific eight-item questionnaire(2). Usual dietary intake was determined from two 24-hour food recalls, using the multiple source method(3). The prevalence of inadequate nutrient intake was assessed using the Estimated Average Requirement (EAR) cut-point method and the full probability approach. Sociodemographic factors (i.e., socioeconomic status, ethnicity, caregiver education, employment status, household size and structure) were collected from questionnaires. Linear regression models were used to estimate associations, with statistical significance set at p < 0.05. Over 30% of participants had experienced food insecurity in the past 12 months. Of all eight indicator statements, “the variety of foods we are able to eat is limited by a lack of money” had the highest proportion of participants responding “often” or “sometimes” (35.8%). Moderately food-insecure children exhibited higher fat and saturated fat intakes, consuming 3.0 (0.2, 5.8) g/day more fat and 2.0 (0.6, 3.5) g/day more saturated fat than food-secure children (p < 0.05). Severely food-insecure children had lower protein intake (g/kg/day) than food-secure children (p < 0.05). Compared with food-secure children, moderately and severely food-insecure children had lower fibre intakes, consuming 1.6 (2.8, 0.3) g/day and 2.6 (4.0, 1.2) g/day less fibre, respectively. Severely food-insecure children had the highest prevalence of inadequate calcium (7.0%) and vitamin C (9.3%) intakes, compared with food-secure children [prevalence of inadequate intakes: calcium (2.3%) and vitamin C (2.8%)]. Household food insecurity was more common among those of Māori or Pacific ethnicity; living in areas of high deprivation; having a caregiver who was younger, not in paid employment, or had low educational attainment; living with ≥2 other children in the household; and living in a sole-parent household. Food-insecure young NZ children consume a diet of lower nutritional quality in certain measures than their food-secure counterparts. Food insecurity was associated with various sociodemographic factors that are closely linked with poverty or low income. As such, there is an urgent need for poverty mitigation initiatives to safeguard vulnerable young children from the adverse consequences of food insecurity.
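As a concrete illustration of the EAR cut-point method named above, the sketch below estimates the prevalence of inadequacy as the proportion of children whose usual intake falls below the EAR; the intake values and the EAR shown are illustrative assumptions, not study data.

```python
import numpy as np

# Hypothetical usual calcium intakes (mg/day) for a group of toddlers,
# as estimated by a method such as the multiple source method.
usual_intake = np.array([420, 310, 505, 280, 615, 390, 450, 335])

EAR_CALCIUM = 360  # mg/day -- illustrative value only; check current NRVs

# EAR cut-point method: the prevalence of inadequacy is simply the
# proportion of the group whose usual intake falls below the EAR.
prevalence = np.mean(usual_intake < EAR_CALCIUM)
print(f"Estimated prevalence of inadequate intake: {prevalence:.1%}")
```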
We demonstrate the importance of radio selection in probing heavily obscured galaxy populations. We combine Evolutionary Map of the Universe (EMU) Early Science data in the Galaxy and Mass Assembly (GAMA) G23 field with GAMA data providing optical photometry and spectral line measurements, together with Wide-field Infrared Survey Explorer (WISE) infrared (IR) photometry providing IR luminosities and colours. We investigate the degree of obscuration in star-forming galaxies, based on the Balmer decrement (BD), and explore how obscuration varies over the redshift range $0<z<0.345$. We demonstrate that the radio-detected population has, on average, higher levels of obscuration than the parent optical sample, arising through missing the lowest-BD and lowest-mass galaxies, which are also the lower star formation rate (SFR) and lower metallicity systems. We discuss possible explanations for this result, including speculation about whether it might arise from steeper stellar initial mass functions in low-mass, low-SFR galaxies.
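For readers unfamiliar with the Balmer decrement: it is the observed flux ratio of the first two Balmer lines, and under the standard Case B recombination assumption (intrinsic ratio $\approx 2.86$) it maps to a colour excess for an adopted attenuation curve. The abstract does not state which curve was used, so the generic form of the relation is

\[
\mathrm{BD} \equiv \frac{F_{\rm obs}(\mathrm{H}\alpha)}{F_{\rm obs}(\mathrm{H}\beta)},
\qquad
E(B-V) = \frac{2.5}{k(\mathrm{H}\beta) - k(\mathrm{H}\alpha)}\,\log_{10}\!\left(\frac{\mathrm{BD}}{2.86}\right),
\]

so $\mathrm{BD} = 2.86$ corresponds to no attenuation and larger BD to heavier obscuration; for the Calzetti et al. (2000) curve, for example, $k(\mathrm{H}\beta) \approx 3.61$ and $k(\mathrm{H}\alpha) \approx 2.53$.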
Our institution sought to evaluate our antimicrobial stewardship program’s empiric treatment recommendations for Salmonella. Results from 36 isolates demonstrated reduced susceptibility to fluoroquinolones, with 1 isolate susceptible only to ceftriaxone. The analysis supports the current recommendation of empiric ceftriaxone therapy for severe infection and an updated recommendation of sulfamethoxazole-trimethoprim for non-severe infections.
Mild cognitive impairment (MCI) is an etiologically nonspecific diagnosis including a broad spectrum of cognitive decline between normal aging and dementia. Several large-scale cohort studies have found sex effects on neuropsychological test performance in MCI. The primary aim of the current project was to examine sex differences in neuropsychological profiles in a clinically diagnosed MCI sample using clinical and research diagnostic criteria.
Method:
The current study includes archival data from 349 patients (age M = 74.7, SD = 7.7) who underwent an outpatient neuropsychological evaluation and were diagnosed with MCI. Raw scores were converted to z-scores using normative datasets. Sex differences in neurocognitive profiles, including severity, domain-specific composites (memory, executive functioning/information processing speed, and language), and modality-specific learning curves (verbal, visual), were examined using analysis of variance, chi-square analyses, and linear mixed models. Post hoc analyses examined whether sex effects were uniform across age and education brackets.
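A minimal sketch of the z-score conversion described above, assuming normative means and standard deviations drawn from published datasets; the test names and normative values here are invented for illustration.

```python
# Raw test scores are converted to z-scores against normative data:
# z = (raw - norm_mean) / norm_sd. Values below are illustrative only;
# real norms are stratified by age and education.
norms = {
    "verbal_memory":    {"mean": 45.0, "sd": 9.0},
    "processing_speed": {"mean": 32.0, "sd": 6.5},
}

def to_z(test: str, raw_score: float) -> float:
    n = norms[test]
    return (raw_score - n["mean"]) / n["sd"]

print(f"verbal memory z    = {to_z('verbal_memory', 36.0):+.2f}")    # -1.00
print(f"processing speed z = {to_z('processing_speed', 39.0):+.2f}")  # +1.08
```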
Results:
Females exhibited worse non-memory domain and test-specific cognitive performance than males with otherwise comparable categorical MCI criteria and global cognition, as measured via screening and composite scores. Analysis of learning curves showed additional sex-specific advantages (visual: males > females; verbal: females > males) not captured by MCI subtypes.
Conclusions:
Our results highlight sex differences in a clinical sample with MCI. The emphasis of verbal memory in the diagnosis of MCI may result in diagnosis at more advanced stages for females. Additional investigation is needed to determine whether these profiles confer greater risk for progressing to dementia or are confounded by other factors (e.g., delayed referral, medical comorbidities).
Objective:
To evaluate the clinical impact of the BioFire FilmArray Pneumonia Panel (PNA panel) in critically ill patients.
Design:
Single-center, preintervention and postintervention retrospective cohort study.
Setting:
Tertiary-care academic medical center.
Patients:
Adult ICU patients.
Methods:
Patients with quantitative bacterial cultures obtained by bronchoalveolar lavage or tracheal aspirate either before (January–March 2021, preintervention period) or after (January–March 2022, postintervention period) implementation of the PNA panel were randomly screened until 25 patients per study month (75 in each cohort) who met the study criteria were included. Antibiotic use from the day of culture collection through day 5 was compared.
Results:
The primary outcome of median time to first antibiotic change based on microbiologic data was 50 hours before the intervention versus 21 hours after the intervention (P = .0006). Also, 56 postintervention regimens (75%) were eligible for change based on PNA panel results; actual change occurred in 30 regimens (54%). Median antibiotic days of therapy (DOTs) were 8 before the intervention versus 6 after the intervention (P = .07). For the patients with antibiotic changes made based on PNA panel results, the median time to first antibiotic change was 10 hours. For patients who were initially on inadequate therapy, time to adequate therapy was 67 hours before the intervention versus 37 hours after the intervention (P = .27).
Conclusions:
The PNA panel was associated with decreased time to first antibiotic change and fewer antibiotic DOTs. Its impact may have been larger if a higher percentage of potential antibiotic changes had been implemented. The PNA panel is a promising tool to enhance antibiotic stewardship.
We assessed patterns of enteric infections caused by 14 pathogens in a longitudinal cohort study of sequelae in British Columbia (BC), Canada, 2005–2014. Our population cohort of 5.8 million individuals was followed for an average of 7.5 years per person; during this time, 40 523 individuals experienced 42 308 incident laboratory-confirmed, provincially reported enteric infections (96.4 incident infections per 100 000 person-years). Most individuals (38 882/40 523; 96%) had only one infection, but 4% had multiple concurrent infections or more than one infection across the study period. Among individuals with more than one infection, the pathogens and combinations occurring most frequently per individual matched the pathogens occurring most frequently in the BC population. An additional 298 557 new fee-for-service physician visits and hospitalisations for enteric infections that did not coincide with a reported enteric infection also occurred; some of these may be unreported enteric infections. Our findings demonstrate that sequelae risk analyses should explore the possible impacts of multiple infections, and that estimating risk for individuals who may have had an unreported enteric infection is warranted.
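A quick back-of-envelope check of the reported rate, using the rounded cohort figures from the abstract; the small difference from the published 96.4 presumably reflects exact, unrounded person-time.

```python
# Incidence rate = incident infections / total person-time at risk.
infections = 42_308
person_years = 5.8e6 * 7.5        # ~43.5 million person-years

rate = infections / person_years * 100_000
print(f"{rate:.1f} incident infections per 100 000 person-years")
# Prints ~97.3 with these rounded inputs, versus the published 96.4.
```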
The diagnosis of neurodegenerative and psychiatric disorders (NPDs) in primary care can suffer from inefficiencies, resulting in misdiagnoses and delayed diagnosis and limiting effective treatment options. The development of speech- and language-based profiling biomarkers could aid in achieving earlier motor diagnosis of Parkinson’s disease (PD), for instance, or more accurate diagnosis of clinically similar or late-presenting NPDs.
Objectives
RHAPSODY aims to investigate the feasibility of remotely administering a battery of speech tasks to elicit continuous narrative speech across a range of NPDs. The project also aims to determine the feasibility of using acoustic and linguistic biomarkers from speech data to support the clinical assessment and disambiguation of common NPDs.
Methods
All participants (n=250) will take part in a single virtual telemedicine video conference with a researcher in which they are screened and complete a battery of speech tasks, in addition to cohort-specific screening measures. Over the following month, participants will be asked to complete a series of short, self-administered speech assessments via a smartphone application.
Results
The speech tasks will be audio-recorded and analysed on Novoic’s technology platform. The study objectives will be assessed using measures including the average length of speech elicited per task, intra- and inter-subject variance, differences in linguistic patterns, and response rates to the speech assessments.
Conclusions
The analyses could help to identify and validate speech-derived clinical biomarkers to support clinicians in detecting and disambiguating between NPDs with heterogeneous presentations. This should further support earlier intervention, improved treatment options and improved quality of life.
Disclosure
In terms of significant financial interest and relationships, it is emphasised that the private organisation Novoic, which aims to develop speech algorithms for diagnostic use, is the study’s sponsor, and employees or former employees of this company comprise part of the study team.
Traditionally, primate cognition research has been conducted by independent teams on small populations of a few species. Such limited variation and small sample sizes pose problems that prevent us from reconstructing the evolutionary history of primate cognition. In this chapter, we discuss how large-scale collaboration, a research model successfully implemented in other fields, makes it possible to obtain the large and diverse datasets needed to conduct robust comparative analyses of primate cognitive abilities. We discuss the advantages and challenges of large-scale collaborations and argue for the need for more open science practices in the field. We describe existing collaborative projects in psychology and primatology and introduce ManyPrimates as the first successful collaboration to have established an infrastructure for large-scale, inclusive research in primate cognition. Considering examples of large-scale collaborations in both primatology and psychology, we conclude that this research model is feasible and has the potential to address otherwise unattainable questions in primate cognition.
Background: Visual impairment can affect 70% of individuals who have experienced a stroke. Identification and remediation of visual impairments can improve overall function and perceived quality of life. Our project aimed to improve visual assessment and timely intervention for patients with post-stroke visual impairment (PSVI). Methods: We conducted a quality improvement initiative to create a standardized screening and referral process for patients with PSVI to access an orthoptist. Post-stroke visual impairment was identified using the Visual Screen Assessment (VISA) tool. Patients filled out the VFQ-25 questionnaire before and after orthoptic assessment, and differences between scores were evaluated. Results: Eighteen patients completed the VFQ-25 both before and after orthoptic assessment. Of the vision-related constructs, there were significant improvements in reported outcomes for general vision (M=56.9, SD=30.7; M=48.6, SD=16.0), p=0.002; peripheral vision (M=88.3, SD=16.0; M=75.0, SD=23.1), p=0.027; ocular pain (M=97.2, SD=6.9; M=87.5, SD=21.4), p=0.022; near activities (M=82.4, SD=24.1; M=67.8, SD=25.6), p<0.001; social functioning (M=90.2, SD=19.0; M=78.5, SD=29.3), p=0.019; mental health (M=84.0, SD=25.9; M=70.5, SD=31.2), p=0.017; and role difficulties (M=84.7, SD=26.3; M=67.4, SD=37.9), p=0.005. Conclusions: Orthoptic assessment for those with PSVI significantly improved perceived quality of life in numerous vision-related constructs, suggesting it is a valuable part of a patient’s post-stroke recovery.
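For illustration, a paired comparison of pre- and post-assessment VFQ-25 subscale scores of the kind reported above can be run as follows. The scores are hypothetical, and the abstract does not state which paired test was used, so a paired t-test is assumed here.

```python
from scipy import stats

# Hypothetical VFQ-25 subscale scores (0-100, higher = better) for the
# same patients before and after orthoptic assessment; the study's
# actual patient-level data are not reproduced here.
before = [55, 60, 48, 70, 52, 66, 58, 61]
after  = [68, 72, 55, 80, 60, 75, 64, 70]

# Paired t-test: each patient serves as their own control.
t_stat, p_value = stats.ttest_rel(after, before)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```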
Researchers have begun to change their approach to training in the biomedical sciences through the development of communities of practice (CoPs). CoPs share knowledge across clinical and laboratory contexts to promote the progress of clinical and translational science. The Congressionally Directed Medical Research Programs’ (CDMRP) Ovarian Cancer Academy (OCA) was designed as a virtual CoP to promote interactions among early career investigators (ECIs) and their mentors with the goal of eliminating ovarian cancer.
Methods:
A mixed-methods approach (surveys and interviews) was used to evaluate the effectiveness of the OCA for the eight ECIs and five mentors. Quantitative analysis included internal reliability of the scales and descriptive statistics for each measure, as well as paired-sample t-tests comparing Time 1 and Time 2. Qualitative data were analyzed for themes to discern which aspects of the program were useful and where more attention is needed.
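The abstract does not name the reliability statistic used; Cronbach's alpha is the most common measure of internal reliability for multi-item scales, so the sketch below shows that computation on invented Likert-type responses.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability of a multi-item scale.

    items: 2-D array with one row per respondent, one column per item.
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 4-item survey answered by 6 respondents (1-5 Likert).
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```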
Results:
Preliminary analyses reveal several trends, including the importance of training in grant writing to ECIs’ productivity, as well as the value of peer mentorship.
Conclusion:
The results show that the OCA was an innovative and effective way to create a CoP with broad implications for the field of ovarian cancer research, as well as for the future of biomedical research training.