Mental ill-health has a major impact on young people, with pain often co-occurring. We estimated the prevalence and impact of pain in young people with mental ill-health.
Methods
Longitudinal data (baseline and three-month follow-up) were collected from 1,107 Australian young people (aged 12–25 years) attending one of five youth mental health services. Multi-level linear mixed models estimated associations between pain characteristics (frequency, intensity, and limitations) and outcomes, with false discovery rate (FDR) adjustment. Pain characteristics were baseline-centered to estimate whether the baseline score (between-participant effect) and/or the change from baseline (within-participant effect) was associated with outcomes.
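The FDR adjustment referred to above is usually the Benjamini-Hochberg step-up procedure; the abstract does not name the exact variant, so the following pure-Python sketch is illustrative only:

```python
def bh_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values (step-up FDR procedure)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices sorted by p-value
    adjusted = [0.0] * m
    running_min = 1.0
    # Walk from the largest p-value down, enforcing monotonicity of adjusted values
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        running_min = min(running_min, pvals[i] * m / rank)
        adjusted[i] = running_min
    return adjusted

raw = [0.001, 0.020, 0.002, 0.450]
print(bh_adjust(raw))
```

Each raw p-value is inflated by m/rank, so the reported FDR-p values remain directly comparable to a conventional 0.05 threshold.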
Results
At baseline, 16% reported serious pain on more than 3 days, 51% reported at least moderate pain, and 25% reported pain-related activity limitations in the last week. Between participants, higher serious pain frequency was associated with greater anxiety symptoms (β[95%CI]: 0.90 [0.45, 1.35], FDR-p=0.001), higher pain intensity was associated with greater symptoms of depression (1.50 [0.71, 2.28], FDR-p=0.001), anxiety (1.22 [0.56, 1.89], FDR-p=0.002), and suicidal ideation (3.47 [0.98, 5.96], FDR-p=0.020), and higher pain limitations were associated with greater depressive symptoms (1.13 [0.63, 1.63], FDR-p<0.001). Within participants, increases in pain intensity were associated with increases in tobacco use risk (1.09 [0.48, 1.70], FDR-p=0.002), and increases in pain limitations were associated with increases in depressive symptoms (0.99 [0.54, 1.43], FDR-p<0.001) and decreases in social and occupational functioning (−1.08 [−1.78, −0.38], FDR-p=0.009).
Conclusions
One-in-two young people seeking support for mental ill-health report pain. Youth mental health services should consider integrating pain management.
Patients with posttraumatic stress disorder (PTSD) exhibit smaller regional brain volumes in commonly reported regions including the amygdala and hippocampus, regions associated with fear and memory processing. In the current study, we have conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
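The group comparisons below are summarized as Hedges' g, which is Cohen's d with a small-sample correction; a minimal illustration of the statistic itself (not the ENIGMA-VBM pipeline, and with hypothetical regional volume values):

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g: pooled-SD standardized mean difference with small-sample correction."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd            # Cohen's d
    correction = 1 - 3 / (4 * (n1 + n2) - 9)   # Hedges' correction factor J
    return d * correction

# Hypothetical regional gray-matter volumes (arbitrary units) at the study's sample sizes
print(round(hedges_g(4.8, 1.0, 1309, 5.0, 1.0, 2198), 3))
```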
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes, and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, corrected p = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, corrected p = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (corrected p = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (corrected p = .001).
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
Background: Nipocalimab is a human IgG1 monoclonal antibody targeting FcRn that selectively reduces IgG levels without impacting antigen presentation or T- and B-cell functions. This study describes the effect of nipocalimab on vaccine response. Methods: This open-label, parallel, interventional study randomized participants 1:1 to receive intravenous nipocalimab 30 mg/kg at Week 0 and 15 mg/kg at Weeks 2 and 4 (active) or no drug (control). On Day 3, participants received Tdap and PPSV®23 vaccinations and were followed through Week 16. Results: Twenty-nine participants completed the study and are included (active, n=15; control, n=14). The proportion of participants with a positive anti-tetanus IgG response was comparable between groups at Weeks 2 and 16, but lower in the active group at Week 4 (nipocalimab 3/15 [20%] vs control 7/14 [50%]; P=0.089). All participants maintained anti-tetanus IgG above the protective threshold (0.16 IU/mL) through Week 16. While anti-pneumococcal-capsular-polysaccharide (PCP) IgG levels were lower during nipocalimab treatment, the percent increase from baseline at Weeks 2 and 16 was comparable between groups. Post-vaccination, anti-PCP IgG remained above 50 mg/L and showed a 2-fold increase from baseline throughout the study in both groups. Nipocalimab co-administration with vaccines was safe and well tolerated. Conclusions: These findings suggest that nipocalimab does not impact the development of an adequate IgG response to T-cell–dependent/independent vaccines and that nipocalimab-treated patients can follow recommended vaccination schedules.
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.
The field of Quaternary entomology has focused primarily on the study of beetles (Coleoptera) and, to a lesser degree, nonbiting midges (Diptera: Chironomidae). Beetles typically predominate because they have heavily sclerotised exoskeletons, and they are abundant in a great variety of habitats. Because of taphonomy and scarcity, other Quaternary invertebrates have been less studied. Only a few records of fleas (Siphonaptera) and mites (Acari) are reported from Pleistocene deposits that span the Seward Peninsula in Alaska, United States of America, to the Klondike goldfields in central Yukon Territory, Canada. Grasshoppers (Orthoptera) and thrips (Thysanoptera) have not been reported previously from Quaternary deposits across the Arctic’s Beringia region. However, recent extensive sampling of Arctic ground squirrel, Urocitellus parryii Richardson (Rodentia: Sciuridae), middens from permafrost deposits of the Klondike goldfields has yielded specimens from each of these underrepresented invertebrate groups. Here, we present records of fleas (Oropsylla alaskensis Baker (Ceratophyllidae)), mites (including Fusacarus Michael (Astigmata: Glycyphagidae) and cf. Haemogamasus Berlese (Mesostigmata: Laelapidae)), and the first records of grasshoppers (Acrididae: Gomphocerinae) and thrips (Thysanoptera: Thripidae) from Beringia from six middens spanning approximately 80 000–13 500 years BP. We also provide brief reviews of the fossil history of each major taxon.
Objectives/Goals: We describe the prevalence of individuals with household exposure to SARS-CoV-2 who subsequently report symptoms consistent with COVID-19 while having persistently negative PCR results for SARS-CoV-2 (S[+]/P[-]). We assess whether paired serology can assist in identifying the true infection status of such individuals. Methods/Study Population: In a multicenter household transmission study, index patients with SARS-CoV-2 were identified and enrolled together with their household contacts within 1 week of the index patient’s illness onset. For 10 consecutive days, enrolled individuals provided daily symptom diaries and nasal specimens for polymerase chain reaction (PCR) testing. Contacts were categorized into 4 groups based on presence of symptoms (S[+/-]) and PCR positivity (P[+/-]). Acute and convalescent blood specimens from these individuals (30 days apart) were subjected to quantitative serologic analysis for SARS-CoV-2 anti-nucleocapsid, spike, and receptor-binding domain antibodies. The antibody change in S[+]/P[-] individuals was assessed by thresholds derived from receiver operating characteristic (ROC) analysis of S[+]/P[+] (infected) versus S[-]/P[-] (uninfected) individuals. Results/Anticipated Results: Among 1,433 contacts, 67% had ≥1 SARS-CoV-2 PCR[+] result, while 33% remained PCR[-]. Among the latter, 55% (n = 263) reported symptoms for at least 1 day, most commonly congestion (63%), fatigue (63%), headache (62%), cough (59%), and sore throat (50%). A history of both previous infection and vaccination was present in 37% of S[+]/P[-] individuals, 38% of S[-]/P[-], and 21% of S[+]/P[+] (P<0.05). Vaccination alone was present in 37%, 41%, and 52%, respectively. ROC analyses of paired serologic testing of S[+]/P[+] (n = 354) vs. S[-]/P[-] (n = 103) individuals found anti-nucleocapsid data had the highest area under the curve (0.87).
Based on the 30-day antibody change, 6.9% of S[+]/P[-] individuals demonstrated an increased convalescent antibody signal, although a similar seroresponse in 7.8% of the S[-]/P[-] group was observed. Discussion/Significance of Impact: Reporting respiratory symptoms was common among household contacts with persistent PCR[-] results. Paired serology analyses found similar seroresponses between S[+]/P[-] and S[-]/P[-] individuals. The symptomatic-but-PCR-negative phenomenon, while frequent, is unlikely attributable to true SARS-CoV-2 infections that go missed by PCR.
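The ROC analysis used to derive seroresponse thresholds rests on the area under the curve, which equals the probability that a randomly chosen infected contact shows a larger antibody change than a randomly chosen uninfected one. A minimal pure-Python sketch with hypothetical fold-change values:

```python
def roc_auc(case_scores, control_scores):
    """AUC as the Mann-Whitney probability that a case outranks a control."""
    wins = 0.0
    for c in case_scores:
        for u in control_scores:
            if c > u:
                wins += 1.0
            elif c == u:
                wins += 0.5  # ties count half
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical 30-day anti-nucleocapsid fold-changes
infected = [8.1, 5.3, 9.7, 2.0]    # S[+]/P[+]
uninfected = [1.1, 0.9, 2.4, 1.0]  # S[-]/P[-]
print(roc_auc(infected, uninfected))
```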
To improve early intervention and personalise treatment for individuals early on the psychosis continuum, a greater understanding of symptom dynamics is required. We address this by identifying and evaluating the movement between empirically derived attenuated psychotic symptomatic substates—clusters of symptoms that occur within individuals over time.
Methods
Data came from a 90-day daily diary study evaluating attenuated psychotic and affective symptoms. The sample included 96 individuals aged 18–35 on the psychosis continuum, divided into four subgroups of increasing severity based on their psychometric risk of psychosis, with the fourth meeting ultra-high risk (UHR) criteria. A multilevel hidden Markov modelling (HMM) approach was used to characterise and determine the probability of switching between symptomatic substates. Individual substate trajectories and time spent in each substate were subsequently assessed.
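In an HMM of this kind, each diary day is assigned to a latent symptomatic substate and transition probabilities capture switching between substates. A toy two-substate Viterbi decoding illustrates the idea (the parameters are invented, not the fitted multilevel model):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state path for an observation sequence."""
    path_prob = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    back = []
    for o in obs[1:]:
        prev, path_prob, step = path_prob, {}, {}
        for s in states:
            best = max(states, key=lambda r: prev[r] * trans_p[r][s])
            path_prob[s] = prev[best] * trans_p[best][s] * emit_p[s][o]
            step[s] = best
        back.append(step)
    last = max(states, key=lambda s: path_prob[s])
    path = [last]
    for step in reversed(back):  # backtrack through stored predecessors
        path.append(step[path[-1]])
    return list(reversed(path))

states = ("low", "high")
start = {"low": 0.6, "high": 0.4}
trans = {"low": {"low": 0.8, "high": 0.2}, "high": {"low": 0.3, "high": 0.7}}
emit = {"low": {"calm": 0.9, "symptomatic": 0.1},
        "high": {"calm": 0.2, "symptomatic": 0.8}}
print(viterbi(["calm", "symptomatic", "symptomatic"], states, start, trans, emit))
```

The persistence of the high substate in this toy decoding mirrors the study's finding that UHR individuals were less likely to switch out of severe substates.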
Results
Four substates of increasing psychopathological severity were identified: (1) low-grade affective symptoms with negligible psychotic symptoms; (2) low levels of nonbizarre ideas with moderate affective symptoms; (3) low levels of nonbizarre ideas and unusual thought content, with moderate affective symptoms; and (4) moderate levels of nonbizarre ideas, unusual thought content, and affective symptoms. Perceptual disturbances predominantly occurred within the third and fourth substates. UHR individuals had a reduced probability of switching out of the two most severe substates.
Conclusions
Findings suggest that individuals reporting unusual thought content, rather than nonbizarre ideas in isolation, may exhibit symptom dynamics with greater psychopathological severity. Individuals at a higher risk of psychosis exhibited persistently severe symptom dynamics, indicating a potential reduction in psychological flexibility.
It is well established that there is a substantial genetic component to eating disorders (EDs). Polygenic risk scores (PRSs) can be used to quantify cumulative genetic risk for a trait at an individual level. Recent studies suggest PRSs for anorexia nervosa (AN) may also predict risk for other disordered eating behaviors, but no study has examined if PRS for AN can predict disordered eating as a global continuous measure. This study aimed to investigate whether PRS for AN predicted overall levels of disordered eating, or specific lifetime disordered eating behaviors, in an Australian adolescent female population.
Methods
PRSs were calculated based on summary statistics from the largest Psychiatric Genomics Consortium AN genome-wide association study to date. Analyses were performed using genome-wide complex trait analysis to test the associations between AN PRS and disordered eating global scores, avoidance of eating, objective bulimic episodes, self-induced vomiting, and driven exercise in a sample of Australian adolescent female twins recruited from the Australian Twin Registry (N = 383).
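At its core, a PRS is a weighted sum of an individual's risk-allele dosages, with weights taken from the GWAS summary statistics; a minimal sketch with made-up SNP effects (the study's actual analyses used genome-wide complex trait analysis):

```python
def polygenic_score(effect_sizes, dosages):
    """PRS = sum over SNPs of (GWAS effect size x risk-allele dosage)."""
    assert len(effect_sizes) == len(dosages)
    return sum(beta * d for beta, d in zip(effect_sizes, dosages))

# Hypothetical AN GWAS betas and one individual's allele dosages (0-2 copies per SNP)
betas = [0.12, -0.05, 0.08]
dosages = [2, 1, 0]
print(polygenic_score(betas, dosages))
```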
Results
After applying the false-discovery rate correction, the AN PRS was significantly associated with all disordered eating outcomes.
Conclusions
Findings suggest shared genetic etiology across disordered eating presentations and provide insight into the utility of AN PRS for predicting disordered eating behaviors in the general population. In the future, PRSs for EDs may have clinical utility in early disordered eating risk identification, prevention, and intervention.
Suicide prevention strategies have shifted in many countries, from a national approach to one that is regionally tailored and responsive to local community needs. Previous Australian studies support this approach. However, most studies have focused on suicide deaths which may not fully capture a complete understanding of prevention needs, and few have focused on the priority population of youth. This was the first nationwide study to examine regional variability of self-harm prevalence and related factors in Australian young people.
Methods
A random sample of Australian adolescents (12–17 years old) was recruited as part of the Young Minds Matter (YMM) survey. Participants completed self-report questions on self-harm (i.e., non-suicidal self-harm and suicide attempts) in the previous 12 months. Using mixed effects regressions, an area-level model was built with YMM and Census data to produce out-of-sample small area predictions for self-harm prevalence. The spatial unit of analysis was Statistical Area Level 1 (average population 400 people), and all prevalence estimates were updated to 2019.
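Out-of-sample small-area prediction from such a model amounts to applying the fitted coefficients (plus any area random effect) to each area's census predictors on the logit scale. A schematic with invented coefficients loosely mirroring the predictors discussed in the results:

```python
import math

def inv_logit(x):
    """Map a logit-scale linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def predict_area_prevalence(intercept, coefs, area_predictors, area_effect=0.0):
    """Predicted self-harm prevalence for one small area from a fitted logistic model."""
    linear = intercept + sum(b * x for b, x in zip(coefs, area_predictors)) + area_effect
    return inv_logit(linear)

# Invented coefficients: % single unemployed parents (+), % parents born overseas (-)
prev = predict_area_prevalence(-2.0, [0.04, -0.01], [10.0, 30.0])
print(round(prev, 4))
```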
Results
Across Australia, there was large variability in youth self-harm prevalence estimates. Northern Territory, Western Australia, and South Australia had the highest estimated state prevalence. Psychological distress and depression were the factors that best predicted self-harm at an individual level. At the area level, the strongest predictor was a high percentage of single unemployed parents, while living in an area where ≥30% of parents were born overseas was associated with reduced odds of self-harm.
Conclusions
This study identified characteristics of regions with lower and higher youth self-harm risk. These findings should assist governments and communities with developing and implementing regionally appropriate youth suicide prevention interventions and initiatives.
Clinical outcomes of repetitive transcranial magnetic stimulation (rTMS) for treatment-resistant depression (TRD) vary widely, and no mood rating scale is standard for assessing rTMS outcome. It remains unclear whether rTMS is as efficacious in older adults with late-life depression (LLD) as in younger adults with major depressive disorder (MDD). This study examined the effect of age on outcomes of rTMS treatment of adults with TRD. Self-report and observer mood ratings were measured weekly in 687 subjects ages 16–100 years undergoing rTMS treatment using the Inventory of Depressive Symptomatology 30-item Self-Report (IDS-SR), Patient Health Questionnaire 9-item (PHQ), Profile of Mood States 30-item, and Hamilton Depression Rating Scale 17-item (HDRS). All rating scales detected significant improvement with treatment; response and remission rates varied by scale but not by age (response/remission ≥ 60: 38%–57%/25%–33%; <60: 32%–49%/18%–25%). Proportional hazards models showed early improvement predicted later improvement across ages, though early improvements in PHQ and HDRS were more predictive of remission in those < 60 years (relative to those ≥ 60) and greater baseline IDS symptom burden was more predictive of non-remission in those ≥ 60 years (relative to those < 60). These results indicate there is no significant effect of age on treatment outcomes in rTMS for TRD, though rating instruments may differ in assessment of symptom burden between younger and older adults during treatment.
Loss of control eating is more likely to occur in the evening and is uniquely associated with distress. No studies have examined the effect of treatment on within-day timing of loss of control eating severity. We examined whether time of day differentially predicted loss of control eating severity at baseline (i.e. pretreatment), end-of-treatment, and 6-month follow-up for individuals with binge-eating disorder (BED), hypothesizing that loss of control eating severity would increase throughout the day pretreatment and that this pattern would be less pronounced following treatment. We explored differential treatment effects of cognitive-behavioral guided self-help (CBTgsh) and Integrative Cognitive-Affective Therapy (ICAT).
Methods
Individuals with BED (N = 112) were randomized to receive CBTgsh or ICAT and completed a 1-week ecological momentary assessment protocol at baseline, end-of-treatment, and 6-month follow-up to assess loss of control eating severity. We used multilevel models to assess within-day slope trajectories of loss of control eating severity across assessment periods and treatment type.
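The within-day trajectories modeled here reduce to slopes of momentary loss of control severity on time of day; the slope component for a single person-day is ordinary least squares, sketched below with hypothetical EMA ratings:

```python
def within_day_slope(hours, severity):
    """Least-squares slope of momentary LOC severity on time of day (one person-day)."""
    n = len(hours)
    mean_t = sum(hours) / n
    mean_y = sum(severity) / n
    cov = sum((t - mean_t) * (y - mean_y) for t, y in zip(hours, severity))
    var = sum((t - mean_t) ** 2 for t in hours)
    return cov / var

# Hypothetical EMA prompts: severity ratings rising toward evening
hours = [9, 13, 17, 21]
ratings = [1.0, 1.5, 2.5, 4.0]
print(within_day_slope(hours, ratings))
```

A flatter post-treatment slope than this pretreatment-style acceleration is the pattern the study hypothesized.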
Results
Within-day increases in loss of control eating severity were reduced at end-of-treatment and 6-month follow-up relative to baseline. Evening acceleration of loss of control eating severity was greater at 6-month follow-up relative to end-of-treatment. Within-day increases in loss of control severity did not differ between treatments at end-of-treatment; however, evening loss of control severity intensified for individuals who received CBTgsh relative to those who received ICAT at 6-month follow-up.
Conclusions
Findings suggest that treatment reduces evening-shifted loss of control eating severity, and that this effect may be more durable following ICAT relative to CBTgsh.
Patients tested for Clostridioides difficile infection (CDI) using a 2-step algorithm with a nucleic acid amplification test (NAAT) followed by toxin assay are not reported to the National Healthcare Safety Network as a laboratory-identified CDI event if they are NAAT positive (+)/toxin negative (−). We compared NAAT+/toxin− and NAAT+/toxin+ patients and identified factors associated with CDI treatment among NAAT+/toxin− patients.
Design:
Retrospective observational study.
Setting:
The study was conducted across 36 laboratories at 5 Emerging Infections Program sites.
Patients:
We defined a CDI case as a positive test detected by this 2-step algorithm during 2018–2020 in a patient aged ≥1 year with no positive test in the previous 8 weeks.
Methods:
We used multivariable logistic regression to compare CDI-related complications and recurrence between NAAT+/toxin− and NAAT+/toxin+ cases. We used a mixed-effects logistic model to identify factors associated with treatment in NAAT+/toxin− cases.
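Each adjusted odds ratio in such a model is the exponential of a fitted logistic coefficient, with confidence limits from the coefficient's standard error. A small sketch (the coefficient and standard error are invented, chosen to reproduce the reported WBC odds ratio of 1.87 [1.28–2.74]):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Adjusted OR and 95% CI from a logistic regression coefficient."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Invented coefficient: WBC >= 15,000/uL predicting CDI treatment
or_, lo, hi = odds_ratio_ci(0.626, 0.195)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```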
Results:
Of 1,801 cases, 1,252 were NAAT+/toxin−, and 549 were NAAT+/toxin+. CDI treatment was given to 866 (71.5%) of 1,212 NAAT+/toxin− cases versus 510 (95.9%) of 532 NAAT+/toxin+ cases (P < .0001). NAAT+/toxin− status was protective for recurrence (adjusted odds ratio [aOR], 0.65; 95% CI, 0.55–0.77) but not CDI-related complications (aOR, 1.05; 95% CI, 0.87–1.28). Among NAAT+/toxin− cases, white blood cell count ≥15,000/µL (aOR, 1.87; 95% CI, 1.28–2.74), ≥3 unformed stools for ≥1 day (aOR, 1.90; 95% CI, 1.40–2.59), and diagnosis by a laboratory that provided no or neutral interpretive comments (aOR, 3.23; 95% CI, 2.23–4.68) were predictors of CDI treatment.
Conclusion:
Use of this 2-step algorithm likely results in underreporting of some NAAT+/toxin− cases with clinically relevant CDI. Disease severity and laboratory interpretive comments influence treatment decisions for NAAT+/toxin− cases.
Female fertility is a complex trait with age-specific changes in spontaneous dizygotic (DZ) twinning and fertility. To elucidate factors regulating female fertility and infertility, we conducted a genome-wide association study (GWAS) on mothers of spontaneous DZ twins (MoDZT) versus controls (3273 cases, 24,009 controls). This is a follow-up study to the Australia/New Zealand (ANZ) component of the study previously reported (Mbarek et al., 2016), with a sample size almost twice that of the entire discovery sample meta-analysed in the previous article (and five times the ANZ contribution to that), resulting from newly available additional genotyping and representing a significant increase in power. We compare analyses with and without male controls and show unequivocally that it is better to include male controls who have been screened for recent family history, than to use only female controls. Results from the SNP based GWAS identified four genome-wide significant signals, including one novel region, ZFPM1 (Zinc Finger Protein, FOG Family Member 1), on chromosome 16. Previous signals near FSHB (Follicle Stimulating Hormone beta subunit) and SMAD3 (SMAD Family Member 3) were also replicated (Mbarek et al., 2016). We also ran the GWAS with a dominance model that identified a further locus, ADRB2, on chromosome 5. These results have been contributed to the International Twinning Genetics Consortium for inclusion in the next GWAS meta-analysis (Mbarek et al., in press).
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Research into the responses of hens on perches is important in order to assess the welfare impact of alternative systems for egg production which incorporate perches in their design. Previous studies suggest that many flight and landing accidents occur in such systems as birds attempt to move between perches and facilities, resulting in a high incidence of bone breakage. In this study, three horizontal perches were set with a gradient between them of 0, 30, 45 or 60 degrees, according to treatment. Four groups of 15 ISA Brown laying hens were individually exposed to each treatment, being placed on the uppermost perch (Perch 1) with a food reward available at the lowest perch (Perch 3). Behaviours performed before reaching Perch 3 were recorded over time. More birds failed to move to Perch 3 within 10 minutes when perches were separated by 45 or 60 degrees. In birds which stayed on the perches for the full 10 minutes without reaching Perch 3, downward head movements, calling, intended jump behaviours, side-stepping and wing-flapping decreased significantly with time spent on the perches. Motivation to complete the task, in order to gain the food reward, was high in all treatments. However, when birds found perches difficult to negotiate, behaviours indicating intention to move to the food decreased with time and the incidence of behaviours indicating frustration and thwarting increased. In non-cage systems such frustration could reduce bird welfare.
Data from neurocognitive assessments may not be accurate in the context of factors impacting validity, such as disengagement, unmotivated responding, or intentional underperformance. Performance validity tests (PVTs) were developed to address these phenomena and assess underperformance on neurocognitive tests. However, PVTs can be burdensome, rely on cutoff scores that reduce information, do not examine potential variations in task engagement across a battery, and are typically not well-suited to acquisition of large cognitive datasets. Here we describe the development of novel performance validity measures that could address some of these limitations by leveraging psychometric concepts using data embedded within the Penn Computerized Neurocognitive Battery (PennCNB).
Methods:
We first developed these validity measures using simulations of invalid response patterns with parameters drawn from real data. Next, we examined their application in two large, independent samples: 1) children and adolescents from the Philadelphia Neurodevelopmental Cohort (n = 9498); and 2) adult servicemembers from the Marine Resiliency Study-II (n = 1444).
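Two simple examples of embedded validity signals of the kind simulated here are a below-chance accuracy check and the longest run of identical responses; the actual PennCNB metrics are more elaborate, so this is only a schematic:

```python
def longest_run(responses):
    """Length of the longest streak of identical consecutive responses."""
    best = cur = 1
    for prev, nxt in zip(responses, responses[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

def below_chance(n_correct, n_items, n_choices):
    """Flag accuracy clearly below random guessing on a forced-choice test."""
    return n_correct / n_items < 1.0 / n_choices

# A respondent who pressed the same key for most of a 12-item, 4-choice test
pattern = list("AAAABAAAAAAC")
print(longest_run(pattern), below_chance(2, 12, 4))
```

Long identical-response runs and below-chance accuracy are both unlikely under engaged responding, which is why such patterns flag invalid data.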
Results:
Our performance validity metrics detected patterns of invalid responding in simulated data, even at subtle levels. Furthermore, a combination of these metrics significantly predicted previously established validity rules for these tests in both developmental and adult datasets. Moreover, most clinical diagnostic groups did not show reduced validity estimates.
Conclusions:
These results provide proof-of-concept evidence for multivariate, data-driven performance validity metrics. These metrics offer a novel method for determining the performance validity for individual neurocognitive tests that is scalable, applicable across different tests, less burdensome, and dimensional. However, more research is needed into their application.
Laws regulating patient care are an essential component of protecting patients and doctors alike. No studies have previously examined what laws exist regarding pelvic examinations in the United States (U.S.). This study systematically reviews and compares regulation and legislation of pelvic examinations in the U.S. and provides a comprehensive resource to educate clinicians, patients, and lawmakers. Each of the fifty U.S. states was included. The primary outcome was the existence of any pelvic or rectal exam laws. Data were obtained for the type of examination defined within the law, exceptions to the law, to whom the law applied, the type of consent required, and to whom the consent applied. Laws were identified from each of the individual state legislative websites. All sections of each law pertaining to pelvic examination were reviewed and organized by state. Descriptive statistics were performed for each of the variables, including frequencies of each amongst the fifty states. State regulation for pelvic examinations varied from no law or regulation to laws pertaining to pelvic, rectal, prostate, and breast examination performed in any context. As of November 22, 2022, twenty states (40%) have pelvic examination laws applying to anesthetized or unconscious patients. Thirteen additional states (26%) have proposed pelvic exam laws. Seventeen states (34%) do not have any laws regarding pelvic examinations. Regulation of pelvic examinations has become an increasingly important issue over the past few years in response to growing concerns about patient autonomy and the ethical issues raised by such sensitive examinations.
While pelvic examination laws that balance protection for patient autonomy and the needs of caregivers and educators exist in much of the U.S., more work needs to continue in consultation with physicians and health care providers to ensure that all states have reasonable laws protecting the autonomy of patients while also maintaining quality of care.
We present the Widefield ASKAP L-band Legacy All-sky Blind surveY (WALLABY) Pilot Phase I Hi kinematic models. This first data release consists of Hi observations of three fields in the direction of the Hydra and Norma clusters, and the NGC 4636 galaxy group. In this paper, we describe how we generate and publicly release flat-disk tilted-ring kinematic models for 109/592 unique Hi detections in these fields. The modelling method adopted here—which we call the WALLABY Kinematic Analysis Proto-Pipeline (WKAPP) and for which the corresponding scripts are also publicly available—consists of combining results from the homogeneous application of the FAT and 3DBarolo algorithms to the subset of 209 detections with sufficient resolution and $S/N$ in order to generate optimised model parameters and uncertainties. The 109 models presented here tend to be gas rich detections resolved by at least 3–4 synthesised beams across their major axes, but there is no obvious environmental bias in the modelling. The data release described here is the first step towards the derivation of similar products for thousands of spatially resolved WALLABY detections via a dedicated kinematic pipeline. Such a large publicly available and homogeneously analysed dataset will be a powerful legacy product that will enable a wide range of scientific studies.
Trace fossils record foraging behaviors, the search for resources in patchy environments, of animals in the rock record. Quantification of the strength, density, and nature of foraging behaviors enables the investigation of how these may have changed through time. Here, we present a novel approach to explore such patterns using spatial point process analyses to quantify the scale and strength of ichnofossil spatial distributions on horizontal bedding planes. To demonstrate the utility of this approach, we use two samples from the terminal Ediacaran Shibantan Member in South China (between 551 and 543 Ma) and the early Cambrian Nagaur Sandstone in northwestern India (between 539 and 509 Ma). We find that ichnotaxa on both surfaces exhibited significant nonhomogeneous lateral patterns, with distinct levels of heterogeneity exhibited by different types of trace fossils. In the Shibantan, two ichnotaxa show evidence for mutual positive aggregation over a shared resource, suggesting the ability to focus on optimal resource areas. Trace fossils from the Nagaur Sandstone exhibit more sophisticated foraging behavior, with greater niche differentiation. Critically, mark correlation functions highlight significant spatial autocorrelation of trace fossil orientations, demonstrating the greater ability of these Cambrian tracemakers to focus on optimal patches. Despite potential limitations, these analyses hint at changes in the development and optimization of foraging at the Ediacaran/Cambrian transition and highlight the potential of spatial point process analysis to tease apart subtle differences in behavior in the trace fossil record.
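A simple entry point to such spatial analyses is the Clark-Evans nearest-neighbour ratio R, the observed mean nearest-neighbour distance divided by its expectation under complete spatial randomness (R < 1 indicates clustering, R > 1 regularity). The study used richer point process and mark correlation methods; this sketch only illustrates the underlying logic:

```python
import math

def clark_evans(points, area):
    """Clark-Evans R for (x, y) points observed over a region of the given area."""
    nn = []
    for i, (xi, yi) in enumerate(points):
        # Distance to this point's nearest neighbour
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nn.append(d)
    observed = sum(nn) / len(nn)
    expected = 0.5 / math.sqrt(len(points) / area)  # mean NN distance under CSR
    return observed / expected

# Four perfectly regular 'traces' on a unit bedding plane
grid = [(0.25, 0.25), (0.25, 0.75), (0.75, 0.25), (0.75, 0.75)]
print(clark_evans(grid, 1.0))
```

An aggregated ichnotaxon, such as those sharing an optimal resource patch, would instead yield R well below 1.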