Average lifespans for people with physical disabilities are increasing, yet there is limited knowledge about their perceptions of what it means to age well. The criteria for Rowe and Kahn’s influential model of successful ageing effectively preclude people ageing with a long-term disability. Several authors have attempted to develop more-inclusive models of successful ageing. The aim of this study was to explore what successful ageing means for people ageing with either spinal cord injury (SCI) or post-polio syndrome (PPS). We used an emic-based methodology and recruited 17 participants aged 40–78 years from Australia. Nine participants (one male, eight female) had acquired poliomyelitis in childhood and experienced PPS, and eight participants (seven male, one female) had acquired an SCI 15 or more years ago. We used semi-structured interviews to elicit participants’ views on the dimensions important to ageing successfully with a disability, and analysed the transcripts using inductive thematic analysis. We identified eight themes, which related to: (1) maintaining physical health, (2) retaining cognitive abilities, (3) a sense of safety and security, (4) being treated with fairness and respect, (5) positive psychological resources, (6) independence and autonomy, (7) social engagement and participation in community and (8) a sense of purpose. We used the findings to construct a multi-dimensional successful ageing model for those ageing with SCI or PPS. The model includes insights from lay perspectives that further illustrate the role broader society plays in supporting or hindering individuals to age successfully, and has implications for health-care and government services.
The phoretic mite assemblage of the Douglas-fir beetle, Dendroctonus pseudotsugae Hopkins (Coleoptera: Curculionidae), has not been thoroughly documented. Phoretic mites can impact fitness and population dynamics of hosts; documenting a mite assemblage may provide information on their ecological roles. We caught Douglas-fir beetles in central British Columbia, Canada, and sorted associated mites into morphospecies. Representatives of the morphospecies were DNA barcoded (CO1 barcode region), indicating at least nine operational taxonomic units (OTUs). Representatives of all OTUs were slide-mounted and morphologically identified. There was a mean of 50.5 ± 4.7 mites per beetle, with both females and males carrying similar numbers of most mite species, except for OTU B1, which was found in higher numbers on females. OTU B1, Parawinterschmidtia furnissi (Woodring) (Astigmata: Winterschmidtiidae), was found in substantially higher numbers than all other OTUs and was always clustered in large aggregations in an anterior pocket on the beetles’ subelytral surface. When this OTU was removed from the calculation, the mean number dropped to 1.3 ± 0.2 mites per beetle. The consistent high numbers of OTU B1, in conjunction with its consistent anatomical aggregation, suggest an important interaction between this particular mite species and the Douglas-fir beetle.
Research study complexity refers to variables that contribute to the difficulty of a clinical trial or study. These variables include intervention type, design, sample, and data management. High complexity often requires more resources, advanced planning, and specialized expertise to execute studies effectively. However, few instruments exist that scale study complexity across research designs. The purpose of this study was to develop and establish initial psychometric properties of an instrument that scales research study complexity.
Methods:
Technical and grammatical principles were followed to produce clear, concise items using language familiar to researchers. Items underwent face, content, and cognitive validity testing through quantitative surveys and qualitative interviews. Content validity indices were calculated, and iterative scale revision was performed. The instrument underwent pilot testing using 2 exemplar protocols, asking participants (n = 31) to score 25 items (e.g., study arms, data collection procedures).
Results:
The instrument (Research Complexity Index) demonstrated face, content, and cognitive validity. Item means and standard deviations ranged from 1.0 to 2.75 (Protocol 1) and 1.31 to 2.86 (Protocol 2). Corrected item-total correlations ranged from .030 to .618. Eight items appeared to be only weakly correlated with the other items. Cronbach’s alpha was 0.586 (Protocol 1) and 0.764 (Protocol 2). Inter-rater reliability was fair (kappa = 0.338).
Conclusion:
Initial pilot testing demonstrates face, content, and cognitive validity; moderate internal consistency reliability; and fair inter-rater reliability. Further refinement of the instrument may increase reliability, thus providing a comprehensive method to assess study complexity and related resource quantification (e.g., staffing requirements).
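For readers who want to reproduce internal-consistency figures like the Cronbach’s alpha values reported above, a minimal sketch in plain Python follows; the item scores are invented purely for illustration and are not the study’s data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of equal-length lists, one list of scores per item;
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals).
    """
    k = len(items)
    n = len(items[0])

    def variance(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Hypothetical scores from five raters on three items (1-3 scale)
scores = [
    [1, 2, 2, 3, 2],
    [1, 2, 3, 3, 2],
    [2, 2, 2, 3, 1],
]
print(round(cronbach_alpha(scores), 3))  # → 0.811
```

Values below roughly 0.7, like the Protocol 1 alpha of 0.586 above, are conventionally read as modest internal consistency.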
OBJECTIVES/GOALS: The Community Research Liaison Model (CRLM) is a novel model to facilitate community-engaged research (CEnR) and community–academic research partnerships focused on health priorities identified by the community. We describe the CRLM development process and how it is operationalized today. METHODS/STUDY POPULATION: The CRLM, informed by the Principles of Community Engagement, builds trust among rural communities and expands capacity for community and investigator-initiated research. We followed a multi-phase process to design and implement a community engagement model that could be replicated. The resulting CRLM moves community–academic research collaborations from objectives to outputs using a conceptual framework that specifies our guiding principles, objectives, and actions to facilitate the objectives (i.e., capacity, motivations, and partners), and outputs. RESULTS/ANTICIPATED RESULTS: The CRLM has been fully implemented across Oregon. Six Community Research Liaisons collectively support 18 predominantly rural Oregon counties. Since 2017, the liaison team has engaged with communities on nearly 300 community projects. The CRLM has been successful in facilitating CEnR and community–academic research partnerships. The model has always existed on a dynamic foundation and continues to be responsive to the lessons learned by the community and researchers. The model is expanding across Oregon as an equitable approach to addressing health disparities across the state. DISCUSSION/SIGNIFICANCE: Our CRLM is based on the idea that community partnerships build research capacity at the community level and are the backbone for pursuing equitable solutions and better health for communities we serve. Our model is unique in its use of CRLs to facilitate community–academic partnerships; this model has brought successes and challenges over the years.
Birthweight has been associated with diabetes in a reverse J-shape (highest risk at low birthweight and moderately high risk at high birthweight) and inversely associated with hypertension in adulthood with inconsistent evidence for cardiovascular disease. There is a lack of population-based studies examining the incidence of cardiometabolic outcomes in young adults born with low and high birthweights. To evaluate the association between birthweight and diabetes, hypertension, and ischemic heart disease (IHD) in young adulthood, we conducted a retrospective cohort study of 874,904 singletons born in Ontario, Canada, from 1994 to 2002, identified from population-based health administrative data. Separate Cox regression models examined birthweight in association with diabetes, hypertension, and IHD adjusting for confounders. Among adults 18–26 years, the diabetes incidence rate was 18.15 per 100,000 person-years, hypertension was 15.80 per 100,000 person-years, and IHD was 1.85 per 100,000 person-years. Adjusted hazard ratios (AHR) for the hazard of diabetes with low (<2500g) and high (>4000g), compared with normal (2500–4000g) birthweight, were 1.46 (95% CI 1.28, 1.68) and 1.09 (0.99, 1.21), respectively. AHR for hypertension with low and high birthweight were 1.34 (1.15, 1.56) and 0.86 (0.77, 0.97), respectively. AHR for IHD with low and high birthweight were 1.28 (0.80, 2.05) and 0.97 (0.71, 1.33), respectively. Overall, birthweight was associated with diabetes in young adults in a reverse J-shape and inversely with hypertension. There was insufficient evidence of an association with IHD. Further evidence is needed to understand the causal mechanisms between birthweight and cardiometabolic diseases in young adults.
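The incidence-rate and hazard-ratio arithmetic reported above is straightforward to reproduce; the sketch below uses invented counts and a hypothetical standard error (none of these inputs come from the study, which fitted adjusted Cox models):

```python
import math

def incidence_rate(cases, person_years, per=100_000):
    """Crude incidence rate per `per` person-years."""
    return cases * per / person_years

def hazard_ratio_ci(beta, se, z=1.96):
    """Point estimate and Wald 95% CI for a Cox log-hazard coefficient."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Invented example: 400 incident diabetes cases over 2.2 million
# person-years of follow-up.
print(round(incidence_rate(400, 2_200_000), 2))  # → 18.18

# Invented log-hazard coefficient and SE for low vs normal birthweight.
hr, lo, hi = hazard_ratio_ci(math.log(1.46), 0.07)
print(round(hr, 2), round(lo, 2), round(hi, 2))  # → 1.46 1.27 1.67
```

Note the asymmetry of the interval around the point estimate: the CI is symmetric on the log-hazard scale, not the hazard-ratio scale.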
We examined the association between multidrug resistance and socioeconomic status (SES), analyzing microbiological and ZIP-code–level socioeconomic data. Using generalized linear models, we determined that multidrug resistance is significantly and persistently more prevalent in samples taken from patients residing in low-income ZIP codes versus high-income ZIP codes in North Carolina.
This chapter explores the ways in which students have experienced the Professional Qualification in Probation (PQiP) during the COVID-19 pandemic. It includes comments drawn from focus groups with students in a range of cohorts at various stages of the qualification when the pandemic occurred. The findings lead to reflections on the implications for probation education moving forward. The impact of the COVID-19 pandemic on probation has been documented by the House of Commons Justice Committee (2020) in their report Coronavirus (COVID-19): The impact on probation systems, which details the move to the exceptional delivery model that changed the way probation services were delivered. Unfortunately, there was no mention of PQiP students within the Justice Committee's report. We are interested in learning more about the students who transitioned from the workplace to working from home, and those students who started the qualification when already working from home. Given that the report states that staff ‘[m]orale and wellbeing are being affected’ (House of Commons Justice Select Committee, 2020) by the current circumstances, there is further support for research that explores the ‘wellness’ of probation trainees.
The aspiring probation officers were entering the service at a time of tumultuous change. During this period, the Justice Select Committee (2020) reported that probation practice adapted by using Skype, phone and messaging services for supervision, while people assessed as high risk received doorstep visits. Further to this, sentence requirements could not be completed where they involved unpaid work or offending behaviour programme interventions. In addition, an already-stretched workforce, with high caseloads and not enough staff, suffered the absence of 2,000 staff per day due to COVID-19. Moreover, Phillips (2020) flags a request from HM Prison & Probation Service (HMPPS) for the frequency of contact with people on probation at this time to be doubled. With 224,174 individuals on probation in the community in March 2021 (GOV.UK Justice Data, 2021), this was also a difficult time for people on probation and in prison experiencing supervision, exacerbating the ‘pains’ of supervision (McNeill 2019, 2020). This has all come at a time when probation services in England and Wales are undergoing significant reforms following the failure of Transforming Rehabilitation.
Insulin-like growth factor-1 (IGF-1) is a critical fetal growth hormone that has been proposed as a therapy for intrauterine growth restriction. We previously demonstrated that a 1-week IGF-1 LR3 infusion into fetal sheep reduces in vivo and in vitro insulin secretion, suggesting an intrinsic islet defect. Our objective herein was to determine whether this intrinsic islet defect was related to chronicity of exposure. We therefore tested the effects of a 90-min IGF-1 LR3 infusion on fetal glucose-stimulated insulin secretion (GSIS) and insulin secretion from isolated fetal islets. We first infused late gestation fetal sheep (n = 10) with either IGF-1 LR3 (IGF-1) or vehicle control (CON) and measured basal insulin secretion and in vivo GSIS utilizing a hyperglycemic clamp. We then isolated fetal islets immediately following a 90-min IGF-1 or CON in vivo infusion and exposed them to glucose or potassium chloride to measure in vitro insulin secretion (IGF-1, n = 6; CON, n = 6). Fetal plasma insulin concentrations decreased with IGF-1 LR3 infusion (P < 0.05), and insulin concentrations during the hyperglycemic clamp were 66% lower with IGF-1 LR3 infusion compared to CON (P < 0.0001). Insulin secretion in isolated fetal islets was not different based on infusion at the time of islet collection. Therefore, we speculate that while acute IGF-1 LR3 infusion may directly suppress insulin secretion, the fetal β-cell in vitro retains the ability to recover GSIS. This may have important implications when considering the long-term effects of treatment modalities for fetal growth restriction.
The quenching of cluster satellite galaxies is inextricably linked to the suppression of their cold interstellar medium (ISM) by environmental mechanisms. While the removal of neutral atomic hydrogen (H i) at large radii is well studied, how the environment impacts the remaining gas in the centres of galaxies, which are dominated by molecular gas, is less clear. Using new observations from the Virgo Environment traced in CO survey (VERTICO) and archival H i data, we study the H i and molecular gas within the optical discs of Virgo cluster galaxies on 1.2-kpc scales with spatially resolved scaling relations between stellar ($\Sigma_{\star}$), H i ($\Sigma_{\text{H}\,{\small\text{I}}}$), and molecular gas ($\Sigma_{\text{mol}}$) surface densities. Adopting H i deficiency as a measure of environmental impact, we find evidence that, in addition to removing the H i at large radii, the cluster processes also lower the average $\Sigma_{\text{H}\,{\small\text{I}}}$ of the remaining gas even in the central $1.2\,$kpc. The impact on molecular gas is comparatively weaker than on the H i, and we show that the lower $\Sigma_{\text{mol}}$ gas is removed first. In the most H i-deficient galaxies, however, we find evidence that environmental processes reduce the typical $\Sigma_{\text{mol}}$ of the remaining gas by nearly a factor of 3. We find no evidence for environment-driven elevation of $\Sigma_{\text{H}\,{\small\text{I}}}$ or $\Sigma_{\text{mol}}$ in H i-deficient galaxies. Using the ratio of $\Sigma_{\text{mol}}$-to-$\Sigma_{\text{H}\,{\small\text{I}}}$ in individual regions, we show that changes in the ISM physical conditions, estimated using the total gas surface density and midplane hydrostatic pressure, cannot explain the observed reduction in molecular gas content. Instead, we suggest that direct stripping of the molecular gas is required to explain our results.
This study examined relationships between foodborne outbreak investigation characteristics, such as the epidemiological methods used, and the success of the investigation, as determined by whether the investigation identified an outbreak agent (i.e. pathogen), food item and contributing factor. This study used data from the Centers for Disease Control and Prevention's (CDC) National Outbreak Reporting System and National Environmental Assessment Reporting System to identify outbreak investigation characteristics associated with outbreak investigation success. We identified investigation characteristics that increase the probability of successful outbreak investigations: a rigorous epidemiology investigation method; a thorough environmental assessment, as measured by number of visits to complete the assessment; and the collection of clinical samples. This research highlights the importance of a comprehensive outbreak investigation, which includes epidemiology, environmental health and laboratory personnel working together to solve the outbreak.
To determine whether primary school children’s weight status and dietary behaviours vary by remoteness as defined by the Australian Modified Monash Model (MMM).
Design:
A cross-sectional study design was used to conduct secondary analysis of baseline data from primary school students participating in a community-based childhood obesity trial. Logistic mixed models estimated associations between remoteness, measured weight status and self-reported dietary intake.
Setting:
Twelve regional and rural Local Government Areas in North-East Victoria, Australia.
Participants:
Data were collected from 2456 grade 4 (approximately 9–10 years) and grade 6 (approximately 11–12 years) students.
Results:
The final sample included students living in regional centres (17·4 %), large rural towns (25·6 %), medium rural towns (15·1 %) and small rural towns (41·9 %). Weight status did not vary by remoteness. Compared to children in regional centres, those in small rural towns were more likely to meet fruit consumption guidelines (OR: 1·75, 95 % CI (1·24, 2·47)) and had higher odds of consuming fewer takeaway meals (OR: 1·37, 95 % CI (1·08, 1·74)) and unhealthy snacks (OR: 1·58, 95 % CI (1·15, 2·16)).
Conclusions:
Living further from regional centres was associated with some healthier self-reported dietary behaviours. This study improves understanding of how dietary behaviours may differ across remoteness levels and highlights that public health initiatives may need to take into account heterogeneity across communities.
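Odds ratios like those in the Results above can be illustrated from a 2×2 table; the sketch below uses invented counts, and note that the study itself used logistic mixed models, which additionally adjust for clustering and covariates:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    exposed cases=a, exposed non-cases=b,
    unexposed cases=c, unexposed non-cases=d.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 120 of 300 small-rural-town children meet fruit
# guidelines vs 80 of 280 regional-centre children.
or_, lo, hi = odds_ratio(120, 180, 80, 200)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 1.67 1.18 2.36
```

An interval whose lower bound stays above 1, as here, is what marks an association like the fruit-consumption finding as statistically significant at the 5% level.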
The number of people over the age of 65 attending Emergency Departments (ED) in the United Kingdom (UK) is increasing. Those who attend with a mental health related problem may be referred to liaison psychiatry for assessment. Improving responsiveness and integration of liaison psychiatry in general hospital settings is a national priority. To do this, psychiatry teams must be adequately resourced and organised. However, it is unknown how trends in the number of referrals of older people to liaison psychiatry teams by EDs are changing, making such planning difficult.
Method
We performed a national multi-centre retrospective service evaluation, analysing existing psychiatry referral data from EDs of people over 65. Sites were selected from a convenience sample of older people's liaison psychiatry departments. Departments from all regions of the UK were invited to participate via the RCPsych liaison and older people's faculty email distribution lists. From departments who returned data, we combined the data and described trends in the number and rate of referrals over a 7-year period.
Result
Referral data from up to 28 EDs across England and Scotland over a 7-year period were analysed (n = 18828 referrals). There is a general trend towards increasing numbers of older people referred to liaison psychiatry year on year. Rates rose from 1.4 referrals per 1000 ED attenders (>65 years) in 2011 to 4.5 in 2019. There is inter- and intra-site variability in referral rates per 1000 ED attendances between departments, ranging from 0.1 to 24.3.
Conclusion
To plan an effective healthcare system, we need to understand the population it serves and have appropriate structures and processes within it. The overarching message of this study is clear: older people's mental health emergencies presenting in ED are common and appear to be increasingly so. Without appropriate investment either in EDs or community mental health services, this is unlikely to improve.
The data also suggest very variable inter-departmental referral rates. It is not possible to establish why rates from one department to another are so different, or whether outcomes for the populations they serve are better or worse. The data do, however, highlight the importance of asking further questions about why the departments are different, and what impact that has on the patients they serve.
Protein-energy malnutrition, or undernutrition, arising from a deficiency of energy and protein intake, can occur in developed countries both in hospitalised patients and in the primary care/community setting. Oral nutritional supplements (ONS) are an effective method of managing malnutrition if prescribed for patients who are malnourished or at risk of malnutrition. Pooled data of older adults at risk of malnutrition indicate that ONS combined with dietary counselling is the most effective intervention. Previous Irish research has demonstrated that management of patients ‘at risk’ of malnutrition in the primary care/community setting is sub-optimal, with low awareness of the condition and its management among non-dietetic health care professionals. Therefore, the aim of this qualitative study is to explore community nurses’ and dietitians’ experiences and opinions on the management of malnutrition and the prescription of ONS in the primary care/community setting in Ireland. Three focus groups were conducted with primary care dietitians (n = 17) and one focus group with community nurses (n = 5); one of the nurses had prescribing rights. The focus groups explored the following domains: the term malnutrition and the patient population presenting as malnourished or at risk of malnutrition; barriers and facilitators in the management of malnutrition; ONS prescribing in the primary care/community setting; and future directions in the management of malnutrition and ONS prescribing. Recorded focus groups were transcribed and analysed using inductive thematic analysis. Both professional groups showed similar perspectives, and three preliminary main themes were identified: i) Malnutrition is a misunderstood term, ii) Delayed treatment of malnutrition, iii) Challenges with ONS prescription in the primary care/community setting. Both dietitians and community nurses agreed that the term malnutrition had negative connotations for patients and preferred not to use it with them.
Dietitians identified the need for a multidisciplinary approach to manage patients at risk of malnutrition in the community, and community nurses agreed on their pivotal role in identifying the risk of malnutrition and providing first-line advice to clients. However, community nurses expressed an urgent need for training to provide first-line advice that helps patients improve their nutritional status and prevent malnutrition. Both groups also agreed on the need for access to more dietitians in the community, and suggested that giving dietitians prescribing rights would improve appropriate ONS prescribing. Community nurses identified a gap in their knowledge of the different ONS products, and the need to receive independent generic education on nutritional supplements.
Commonly used measures of instrumental activities of daily living (IADL) do not capture activities for a technologically advancing society. This study aimed to adapt the proxy/informant-based Amsterdam IADL Questionnaire (A-IADL-Q) for use in the UK and develop a self-report version.
Design:
An iterative mixed method cross-cultural adaptation of the A-IADL-Q and the development of a self-report version involving a three-step design: (1) interviews and focus groups with lay and professional stakeholders to assess face and content validity; (2) a questionnaire to measure item relevance to older adults in the UK; (3) a pilot of the adapted questionnaire in people with cognitive impairment.
Setting:
Community settings in the UK.
Participants:
One hundred and forty-eight participants took part across the three steps: (1) 14 dementia professionals; 8 people with subjective cognitive decline (SCD), mild cognitive impairment (MCI), or dementia due to Alzheimer’s disease; and 6 relatives of people with MCI or dementia; (2) 92 older adults without cognitive impairment; and (3) 28 people with SCD or MCI.
Measurements:
The cultural relevance and applicability of the A-IADL-Q scale items were assessed using a 6-point Likert scale. Cognitive and functional performance was measured using a battery of cognitive and functional measures.
Results:
Iterative modifications to the scale resulted in a 55-item adapted version appropriate for UK use (A-IADL-Q-UK). Pilot data revealed that the new and revised items performed well. Four new items correlated with the weighted average score (Kendall’s Tau −.388, −.445, −.497, −.569). An exploratory analysis of convergent validity found correlations in the expected direction with cognitive and functional measures.
Conclusion:
The A-IADL-Q-UK provides a measurement of functional decline for use in the UK that captures culturally relevant activities. A new self-report version has been developed and is ready for testing. Further evaluation of the A-IADL-Q-UK for construct validity is now needed.
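The Kendall’s tau correlations reported in the Results can be computed, in the no-ties case, with a few lines of Python. The scores below are invented solely to illustrate a negative tau like those between the new items and the weighted average score (the study’s analysis would have used a ties-corrected variant such as tau-b):

```python
def kendall_tau_a(x, y):
    """Kendall's tau-a: (concordant - discordant) / (n(n-1)/2).
    Assumes no tied values (tau-b would adjust for ties)."""
    n = len(x)
    s = 0
    for i in range(n):
        for j in range(i + 1, n):
            prod = (x[i] - x[j]) * (y[i] - y[j])
            if prod > 0:      # pair ordered the same way in x and y
                s += 1
            elif prod < 0:    # pair ordered oppositely
                s -= 1
    return 2 * s / (n * (n - 1))

# Hypothetical item scores vs weighted average scores: a negative
# tau indicates higher item scores accompany lower averages.
item = [1, 2, 3, 4, 5]
avg = [3.0, 2.5, 2.8, 1.2, 1.0]
print(round(kendall_tau_a(item, avg), 3))  # → -0.8
```

Tau ranges from −1 (perfectly opposite orderings) to +1 (identical orderings), so the moderate negative values reported above indicate fairly consistent inverse ordering.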
Researchers now commonly collect biospecimens for genomic analysis together with information from mobile devices and electronic health records. This rich combination of data creates new opportunities for understanding and addressing important health issues, but also intensifies challenges to privacy and confidentiality. Here, we elucidate the “web” of legal protections for precision medicine research by integrating findings from qualitative interviews with structured legal research and applying them to realistic research scenarios involving various privacy threats.
We present a detailed overview of the cosmological surveys that we aim to carry out with Phase 1 of the Square Kilometre Array (SKA1) and the science that they will enable. We highlight three main surveys: a medium-deep continuum weak lensing and low-redshift spectroscopic HI galaxy survey over 5 000 deg$^2$; a wide and deep continuum galaxy and HI intensity mapping (IM) survey over 20 000 deg$^2$ from $z = 0.35$ to 3; and a deep, high-redshift HI IM survey over 100 deg$^2$ from $z = 3$ to 6. Taken together, these surveys will achieve an array of important scientific goals: measuring the equation of state of dark energy out to $z \sim 3$ with percent-level precision measurements of the cosmic expansion rate; constraining possible deviations from General Relativity on cosmological scales by measuring the growth rate of structure through multiple independent methods; mapping the structure of the Universe on the largest accessible scales, thus constraining fundamental properties such as isotropy, homogeneity, and non-Gaussianity; and measuring the HI density and bias out to $z = 6$. These surveys will also provide highly complementary clustering and weak lensing measurements whose systematic uncertainties are independent of those of optical and near-infrared (NIR) surveys like Euclid, LSST, and WFIRST, leading to a multitude of synergies that can improve constraints significantly beyond what optical or radio surveys can achieve on their own. This document, the 2018 Red Book, provides reference technical specifications, cosmological parameter forecasts, and an overview of relevant systematic effects for the three key surveys, and will be regularly updated by the Cosmology Science Working Group in the run-up to the start of operations and the Key Science Programme of SKA1.
The number of people growing older with severe mental illness (SMI) is rising, reflecting societal trends towards an ageing population. Evidence suggests that older people are less likely to seek help, be referred for and receive psychological therapy compared with younger people, but past research has focused on those with mild to moderate mental health needs.
Aims:
This research aims to identify the specific barriers faced by older people with SMI.
Method:
We interviewed 53 participants (22 service users with SMI aged over 50 years, 11 carers of people with SMI, and 20 health care professionals) about their views and experiences of accessing therapy for SMI in later life.
Results:
Thematic analysis revealed five themes: organizational and resource issues; myths about therapy and attitudinal barriers; stigma; encouraging access to therapy; and meeting age-specific needs.
Conclusions:
Barriers faced by older people with SMI are not only age-related, but also reflect specific issues associated with having an SMI over many years. Improving awareness of the benefits of psychological therapies is important not only for older people with SMI themselves, but also for their carers and staff who work with them.
Almost all living organisms on Earth utilize the same 20 amino acids to build their millions of different proteins, even though there are hundreds of amino acids naturally occurring on Earth. Although it is likely that both the prebiotic and the current environment of Earth shaped the selection of these 20 proteinogenic amino acids, environmental conditions on extraterrestrial planets and moons are known to be quite different than those on Earth. In particular, the surfaces of planets and moons such as Mars, Europa and Enceladus have a much greater flux of UV and gamma radiation impacting their surface than that of Earth. Thus, if life were to have evolved extraterrestrially, a different lexicon of amino acids may have been selected due to different environmental pressures, such as higher radiation exposure. One fundamental property an amino acid must have in order to be of use to the evolution of life is relative stability. Therefore, we studied the stability of three different proteinogenic amino acids (tyrosine, phenylalanine and tryptophan) as compared with 20 non-proteinogenic amino acids that were structurally similar to the aromatic proteinogenic amino acids, following ultraviolet (UV) light (254, 302, or 365 nm) and gamma-ray irradiation. The degree of degradation of the amino acids was quantified using an ultra-high performance liquid chromatography-mass spectrometer (UPLC-MS). The results showed that many non-proteinogenic amino acids were either as stable as, or more stable than, their proteinogenic counterparts under certain radiation wavelengths, with fluorinated phenylalanine and tryptophan derivatives, in particular, exhibiting enhanced stability compared with proteinogenic phenylalanine and tryptophan following gamma and select UV irradiation.