Older adults with treatment-resistant depression (TRD) benefit more from augmentation of their current antidepressant than from switching antidepressants. Identifying moderators that influence the relative benefit of these two strategies would support personalised treatment selection.
Aims
Our objective was to test whether age, executive dysfunction, comorbid medical burden, comorbid anxiety or the number of previous adequate antidepressant trials could moderate the superiority of augmentation over switching. A significant moderator would influence the differential effect of augmentation versus switching on treatment outcomes.
Method
We performed a preplanned moderation analysis of data from the Optimizing Outcomes of Treatment-Resistant Depression in Older Adults (OPTIMUM) randomised controlled trial (N = 742). Participants were 60 years old or older with TRD. Participants were randomised either (a) to antidepressant augmentation with aripiprazole (2.5–15 mg), bupropion (150–450 mg) or lithium (target serum drug level 0.6 mmol/L) or (b) to a switch to bupropion (150–450 mg) or nortriptyline (target serum drug level 80–120 ng/mL). Treatment duration was 10 weeks. The two main outcomes of this analysis were (a) symptom improvement, defined as change in Montgomery–Åsberg Depression Rating Scale (MADRS) scores from baseline to week 10, and (b) remission, defined as a MADRS score of 10 or less at week 10.
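As a rough illustration (not the trial's actual code), a moderation test of this kind can be specified as a treatment-by-moderator interaction in a regression model. The Python sketch below uses synthetic data and illustrative variable names; the simulated effect size mirrors the pattern reported in the Results.

```python
# Minimal sketch of a moderation analysis: the putative moderator enters
# as an interaction with treatment arm, so the interaction coefficient
# (the b in the Results) captures how the augment-vs-switch contrast
# changes per additional prior trial. All data here are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "augment":        rng.integers(0, 2, n),  # 1 = augmentation, 0 = switch
    "n_prior_trials": rng.integers(1, 6, n),  # adequate previous trials
})
# Simulate the reported pattern: the advantage of augmentation shrinks
# by ~1.6 MADRS points per additional previous adequate trial.
df["improvement"] = (8 + 5 * df["augment"]
                     - 1.6 * df["augment"] * df["n_prior_trials"]
                     + rng.normal(0, 5, n))

model = smf.ols("improvement ~ augment * n_prior_trials", data=df).fit()
print(model.params["augment:n_prior_trials"])  # the moderation coefficient b
```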
Results
Of the 742 participants, 480 were randomised to augmentation and 262 to switching. The number of adequate previous antidepressant trials was a significant moderator of depression symptom improvement (b = −1.6, t = −2.1, P = 0.033, 95% CI [−3.0, −0.1], where b is the coefficient of the relationship (i.e. effect size), and t is the t-statistic for that coefficient associated with the P-value). The effect was similar across all augmentation strategies. No other putative moderators were significant.
Conclusions
Augmenting was superior to switching antidepressants only in older patients with fewer than three previous antidepressant trials. This suggests that other intervention strategies should be considered following three or more trials.
Suicide accounts for a proportion of the early mortality in people affected by psychotic disorders. The early phase of illness can represent a particularly high-risk time for suicide. Therefore, in a cohort of young people presenting with first-episode psychosis, this study aimed to determine: (i) the prevalence of suicidal ideation, intent with plan and self-harm and any associated demographic or clinical factors and (ii) the prevalence of depressive symptoms and any associated demographic or clinical factors.
Methods:
Young people with a first episode of psychosis attending the Early Psychosis Prevention and Intervention Centre in Melbourne were included. Suicidal behaviours were recorded using a structured risk assessment – ‘Clinical Risk Assessment and Management in the Community’, and depressive symptoms were measured using the PHQ-9.
Results:
A total of 355 young people were included in the study; 57.2% were male, 95.4% were single and over one quarter were migrants. At the time of presentation, 34.6% had suicidal ideation, 6.2% had suicidal intent with a plan, and 21.4% had engaged in self-harm before their presentation. Combined, 39.7% (n = 141) presented with suicidal ideation, intent with plan or self-harm. A total of 71.5% (n = 118) of those assessed had moderately severe or severe depressive symptoms, which were strongly associated with suicidal ideation or behaviours at the time of presentation (OR = 4.21, 95% CI 2.10–8.44).
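For illustration, the snippet below shows how an odds ratio of this kind and a 95% Wald CI are computed from a 2×2 table. The counts are hypothetical, chosen only to give an OR near the reported 4.21; they will not reproduce the study's exact interval.

```python
# Illustrative 2x2-table odds-ratio computation (hypothetical counts).
import numpy as np

# rows: moderately severe/severe depression (yes, no)
# cols: suicidal ideation or behaviour at presentation (yes, no)
a, b = 80, 38    # depressed: with / without suicidality
c, d = 61, 122   # not depressed: with / without suicidality

or_hat = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
lo, hi = np.exp(np.log(or_hat) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {or_hat:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```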
Conclusions:
Depressive symptoms, self-harm and suicidal behaviours are commonly present in the early phases of a psychotic disorder, which has important clinical implications for assessment and management.
Mixed-layer clays of variable composition and structure occur in core samples from two drillholes (WK207 and WK210) in the Te Mihi sector of the Wairakei geothermal field. These were identified by X-ray diffraction analysis of oriented, glycolated sample fractions of less than 2 μm and less than 0.2 μm.
Low-permeability lacustrine sediments encountered by drillhole WK207 contain a well-developed sequence of mixed-layer clays. The shallowest downhole appearance of mixed-layer illite/smectite (I0.6/Sm) occurs at 146 m depth, where the temperature is only 100°C. Discrete illite is present only below 297 m (200°C) in the finer size fraction (less than 0.2 μm). Chlorite first appears downhole, in association with illite-smectite, at 177 m depth (110°C).
Drillhole WK210 encountered predominantly ignimbrites and rhyolites, and fluid flow here is mainly in channels. Within these rocks, a sequence of interlayered clays is poorly developed. Discrete illite and chlorite are present only in core from 244 m (180°C), but the measured temperatures where interlayer clays occur range from 140 to 209°C.
Differences in the identity of the clay minerals present in the Wairakei reservoir, where conditions are otherwise the same, demonstrate the strong control that the type of fluid flow has on their formation. In poorly permeable sediments, where diffuse fluid flow prevails, a clearly defined sequence of mixed-layer clays occurs. These sequences are absent where channel flow dominates; there, discrete chlorite and illite deposit directly from solution.
Our aim was to investigate the symptoms of SARS-CoV-2 infection, their dynamics and their discriminatory power for the disease, using longitudinally, prospectively collected information reported at the time of occurrence. We analysed data from a large phase 3 UK COVID-19 vaccine clinical trial conducted while the alpha variant was the predominant strain. Participants were assessed for SARS-CoV-2 infection via nasal/throat PCR at recruitment, at vaccination appointments, and when symptomatic. Statistical techniques were implemented to infer estimates representative of the UK population, accounting for multiple symptomatic episodes per individual. An optimal diagnostic model for SARS-CoV-2 infection was derived. The 4-month prevalence of SARS-CoV-2 was 2.1%, increasing to 19.4% (16.0%–22.7%) in participants reporting loss of appetite and 31.9% (27.1%–36.8%) in those with anosmia/ageusia. The model identified anosmia and/or ageusia, fever, congestion, and cough as significantly associated with SARS-CoV-2 infection. Symptom dynamics differed markedly between the two groups: in PCR-positive participants symptoms started slowly, peaked later and lasted longer, whereas in PCR-negative participants they declined consistently, with fewer than 3 days of symptoms reported on average. Anosmia/ageusia peaked late in confirmed SARS-CoV-2 infection (day 12), indicating low discriminatory power for early disease diagnosis.
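The abstract describes the diagnostic model only at a high level. A minimal sketch of such a symptom-based model, assuming a logistic-regression form and using synthetic data (the trial's weighting for population representativeness and repeated episodes is not reproduced), might look as follows.

```python
# Sketch of a symptom-based diagnostic model: logistic regression of
# PCR result on reported symptoms. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
# columns: anosmia/ageusia, fever, congestion, cough (0/1 indicators)
X = rng.integers(0, 2, size=(n, 4))
logit = -3 + X @ np.array([2.5, 1.2, 0.8, 0.9])   # assumed effect sizes
y = rng.random(n) < 1 / (1 + np.exp(-logit))       # simulated PCR+ label

clf = LogisticRegression().fit(X, y)
for name, coef in zip(["anosmia/ageusia", "fever", "congestion", "cough"],
                      clf.coef_[0]):
    print(f"{name}: OR ~ {np.exp(coef):.2f}")
```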
The potential of a cold-contact approach to research recruitment, in which members of the research team are unknown to the patient, has grown with the expanded use of electronic health records (EHRs) and affiliated patient portals. Institutions that permit this strategy vary in how they implement and manage it, but tend towards more conservative approaches. This process paper describes the Medical University of South Carolina’s transition to an opt-out model of “cold-contact” recruitment (known as patient outreach recruitment, or POR), wherein patients can be contacted so long as they do not express an unwillingness to receive such communication. The work highlights the benefits of this model by explaining how it, in many ways, supports and protects autonomy, beneficence, and justice for patients. The paper then describes the process of establishing the recruitment strategy, communicating the change to patients and the community, and documenting study team contact and patient research preferences. Data supporting increased access to potentially eligible patients of greater diversity, as well as initial researcher feedback on the perceived success of POR, are also shared. The paper ends with a discussion of next steps to enhance the POR process via more detailed data collection and reengagement with community stakeholders.
This book explores issues central to contemporary theoretical debates around the nature of trust, linking abstract concerns to empirical analysis based on interviews with service-users, practitioners and managers.
Through a series of interdisciplinary case studies, this topical collection is the first to focus on protest camps as unique organisational forms that transcend particular social movements’ contexts. The book offers a critical understanding of current protest events and will help to better understand new global forms of democracy in action.
Approximately 70% of patients with bipolar disorder (BPD) are initially misdiagnosed, resulting in diagnostic delays of 7–10 years on average. Misdiagnosis and diagnostic delay adversely affect health outcomes and lead to the use of inappropriate treatments. As depressive episodes and symptoms are the predominant presentation in BPD, misdiagnosis as major depressive disorder (MDD) is common. Self-rated screening instruments for BPD exist, but their length and reliance on past manic symptoms are barriers to implementation, especially in primary care settings where many of these patients initially present. We developed a brief, pragmatic bipolar I disorder (BPD-I) screening tool that not only screens for manic symptoms but also includes risk factors for BPD-I (eg, age of depression onset) to help clinicians reduce the misdiagnosis of BPD-I as MDD.
Methods
Existing questionnaires and risk factors were identified through a targeted literature search; a multidisciplinary group of experts participated in 2 modified Delphi panels to select concepts thought to differentiate BPD-I from MDD. Individuals with self-reported BPD-I or MDD participated in cognitive debriefing interviews (N=12) to test and refine item wording. A multisite, cross-sectional, observational study was conducted to evaluate the screening tool’s predictive validity. Participants with clinical interview-confirmed diagnoses of BPD-I or MDD completed a draft 10-item screening tool and additional questionnaires/questions. Different combinations of item sets with various item permutations (eg, number of depressive episodes, age of onset) were tested simultaneously. The final combination of items and thresholds was selected based on multiple considerations, including clinical validity, optimization of sensitivity and specificity, and pragmatism.
Results
A total of 160 clinical interviews were conducted; 139 patients had clinical interview-confirmed BPD-I (n=67) or MDD (n=72). The screening tool was reduced from 10 to 6 items based on item-level analysis. When 4 items or more were endorsed (yes) in this analysis sample, the sensitivity of this tool for identifying patients with BPD-I was 0.88 and specificity was 0.80; positive and negative predictive values were 0.80 and 0.88, respectively. These properties represent an improvement over the Mood Disorder Questionnaire, while using >50% fewer items.
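A minimal sketch of how operating characteristics like these follow from the "4 or more of 6 items" threshold is given below; the scores and diagnoses are simulated, not the study's data, so the printed values will only approximate the reported ones.

```python
# Screening-tool evaluation sketch: score = number of items endorsed
# (0-6); screen positive when >= 4. Counts are synthetic.
import numpy as np

def screen_metrics(scores, truth, threshold=4):
    pred = scores >= threshold
    tp = np.sum(pred & truth)
    tn = np.sum(~pred & ~truth)
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

rng = np.random.default_rng(2)
truth = rng.random(139) < 67 / 139                    # BPD-I vs MDD labels
scores = np.where(truth,
                  rng.binomial(6, 0.75, 139),         # BPD-I endorse more items
                  rng.binomial(6, 0.35, 139))         # MDD endorse fewer
print(screen_metrics(scores, truth))
```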
Conclusion
This new 6-item BPD-I screening tool serves to differentiate BPD-I from MDD in patients with depressive symptoms. Use of this tool can provide real-world guidance to primary care practitioners on whether more comprehensive assessment for BPD-I is warranted. Use of a brief and valid tool provides an opportunity to reduce misdiagnosis, improve treatment selection, and enhance health outcomes in busy clinical practices.
The Black-capped Petrel or Diablotin Pterodroma hasitata has a fragmented and declining population estimated at c.1,000 breeding pairs. On land, the species nests underground in steep ravines with dense understorey vegetation. The only confirmed breeding sites are located in the mountain ranges of Hispaniola in the Caribbean, where habitat loss and degradation are continuing threats. Other nesting populations may remain undiscovered but, to locate them, laborious in situ nest searches must be conducted over expansive geographical areas. To focus nest-search efforts more efficiently, we analysed the environmental characteristics of Black-capped Petrel nesting habitat and modelled suitable habitat on Hispaniola using openly available environmental datasets. We used univariate generalized linear models to compare the habitat characteristics of active Black-capped Petrel nest sites with those of potentially available sites (i.e. random pseudo-absences). Elevation, distance to coast, and tree cover and density emerged as important environmental variables. We then applied multivariate generalized linear models to the environmental variables that showed a significant relationship with petrel nesting activity, and used the top-performing habitat suitability model to create maps of predicted suitability for Hispaniola. In addition to areas of known petrel activity, the model identified possible nesting areas for Black-capped Petrels in habitats not previously considered suitable. Based on the model results, we estimated the total area of predicted suitable nesting habitat on Hispaniola and found that forest loss due to hurricanes, forest fires, and encroachment from agriculture had severely decreased the availability of predicted suitable habitat between 2000 and 2018.
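An illustrative sketch of the presence/pseudo-absence modelling described above is shown below: a binomial GLM of nest presence on environmental covariates. All values are synthetic stand-ins for the real GIS-derived layers, and the variable names are assumptions.

```python
# Presence/pseudo-absence habitat model sketch (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 400
sites = pd.DataFrame({
    "present":        rng.integers(0, 2, n),   # nest site vs pseudo-absence
    "elevation_m":    rng.uniform(0, 2500, n),
    "dist_coast_km":  rng.uniform(0, 80, n),
    "tree_cover_pct": rng.uniform(0, 100, n),
})

model = smf.glm("present ~ elevation_m + dist_coast_km + tree_cover_pct",
                data=sites, family=sm.families.Binomial()).fit()
print(model.summary())
# model.predict(new_sites) yields suitability probabilities in [0, 1],
# which can be rasterised into prediction maps like those described.
```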
Glyphosate is an important component of herbicide programs in orchard crops in California. It can be applied alone or in tank-mix combinations, under the crop rows or to the entire field, and often is used multiple times each year. There has been speculation about the potential impacts of repeated glyphosate use in perennial crop systems, whether directly through uptake by shallow root systems or indirectly through effects on nutrient availability in soil. To address these concerns, research was conducted from 2013 to 2020 on key orchard crops to evaluate tree response to glyphosate regimens. Almond, cherry, and prune were evaluated in separate experiments. In each crop, the experimental design was a factorial arrangement of two soil types, four glyphosate rates (0, 1.1, 2.2, and 4.4 kg ae ha−1, applied three times annually), and two post-glyphosate application irrigation treatments. In the first 2 yr of the study, there was no clear impact of the glyphosate regimens on shikimate accumulation or leaf chlorophyll content, suggesting no direct effect on the crop. In the seventh year of the study, after six consecutive years of glyphosate application to the orchard floors, there were no negative impacts of glyphosate application on leaf nutrient concentration or on cumulative trunk growth in any of the three orchard crops. The lack of a negative growth impact even at the highest treatment rate, which comprised 18 applications of glyphosate totaling nearly 80 kg ae ha−1 over the course of the experiment, suggests that judicious use of the herbicide poses little risk to tree health in these production systems. Given the economic importance of orchard crops in California, and grower and industry concerns about pesticides generally and glyphosate specifically, these findings are timely contributions to weed management in perennial specialty crops.
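The cumulative exposure figure follows directly from the design (the highest rate applied three times annually for six years); the snippet below simply checks that arithmetic.

```python
# Check of the cumulative-exposure figure: 4.4 kg ae/ha applied
# three times per year for six consecutive years.
rate_kg_ae_ha = 4.4
applications = 3 * 6                       # 18 applications
total = rate_kg_ae_ha * applications       # 79.2 kg ae/ha, i.e. "nearly 80"
print(applications, total)
```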
As it has been written, the history of humanitarian intervention is all too Whiggish and all too white. By conceptualising humanitarian intervention in the way that they do, orthodox histories should be seen as entangled in debates about the origins of human rights but also, perhaps more crucially, debates about the various formations and reinventions of human rights. Alternative codifications of rights reveal the historical possibility of a Southern practice of what we would almost certainly call ‘humanitarian intervention’. The record of a radical Third World practice to save strangers from the atrocities of colonialism and extreme racism is also a record of Western states playing staunchly sovereigntist roles, of the West's late devotion to Westphalia. To sketch out such a counterhistory is to argue the following: at a threshold moment in the international-political life of the Responsibility to Protect, it is the terms, range, and domain of the intervention debate that must be re-formulated and re-evaluated.
Southern crabgrass [Digitaria ciliaris (Retz.) Koeler] is an annual grass weed that commonly infests turfgrass, roadsides, wastelands, and cropping systems throughout the southeastern United States. Two biotypes of D. ciliaris (R1 and R2) with known resistance to cyclohexanediones (DIMs) and aryloxyphenoxypropionates (FOPs), previously collected from sod production fields in Georgia, were compared with a separate susceptible biotype (S) collected from Alabama for their responses to pinoxaden, and possible mechanisms of resistance were explored. Increasing rates of pinoxaden (0.1 to 23.5 kg ha−1) were evaluated for control of R1, R2, and S. Both R1 and R2 were resistant to pinoxaden relative to S; the S biotype was completely controlled at rates of 11.8 and 23.5 kg ha−1, with no aboveground biomass remaining at 14 d after treatment. Pinoxaden rates reducing tiller length by 50% (I50) and 90% (I90) ranged from 7.2 to 13.2 kg ha−1 for R1, 6.9 to 8.6 kg ha−1 for R2, and 0.7 to 2.1 kg ha−1 for S; for aboveground biomass, the corresponding ranges were 7.7 to 10.2 kg ha−1, 7.2 to 7.9 kg ha−1, and 1.6 to 2.3 kg ha−1. Prior selection pressure from DIM and FOP herbicides could thus drive the evolution of D. ciliaris cross-resistance to pinoxaden. Amplification of the carboxyl-transferase domain of the plastidic ACCase by standard PCR identified a point mutation resulting in an Ile-1781-Leu amino acid substitution in only one resistant biotype, R1. Further cloning of the PCR product surrounding the 1781 region yielded two distinct ACCase gene sequences, Ile-1781 and Leu-1781. Next-generation sequencing of RNA on the Illumina platform, however, revealed the Ile-1781-Leu substitution in both resistant biotypes (R1 and R2). A point mutation in the Ile-1781 codon leading to herbicide insensitivity in the ACCase enzyme has previously been reported in other grass species. Our research confirms that the Ile-1781-Leu substitution is present in pinoxaden-resistant D. ciliaris.
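I50/I90 values of this kind are typically estimated by fitting a log-logistic dose-response curve and inverting it; the sketch below shows that general approach with synthetic data (the study's own curve-fitting details are not given in the abstract).

```python
# Log-logistic dose-response fit and derived I50/I90 (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, upper, i50, slope):
    # Response declines from `upper` toward zero as dose increases.
    return upper / (1 + (dose / i50) ** slope)

dose = np.array([0.1, 0.4, 1.5, 2.9, 5.9, 11.8, 23.5])   # kg/ha
biomass = np.array([98, 95, 80, 55, 30, 12, 4])          # % of untreated

(upper, i50, slope), _ = curve_fit(log_logistic, dose, biomass,
                                   p0=[100, 3, 1])
# Solving upper / (1 + (d/i50)^slope) = 0.1 * upper for d gives I90:
i90 = i50 * 9 ** (1 / slope)
print(f"I50 = {i50:.2f} kg/ha, I90 = {i90:.2f} kg/ha")
```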
Now in its second edition, Managing Employee Performance and Reward continues to offer comprehensive coverage of employee performance and reward, presenting the material in a conceptually integrated way. This new edition has been substantially updated and revised by a team of specialist contributors, and includes:
- An increased focus on employee engagement and the alignment between the organisation’s goals and the personal goals of employees
- Expanded coverage of coaching, now a leading-edge performance enhancement practice
- Extensive updates reflecting the major changes in employee benefits in recent years, as organisations strive to attract and retain talent
- Updated coverage of executive salaries and incentives in the contemporary post-GFC environment
This popular text is an indispensable resource for students and managers alike. Written for a global readership, the book will continue to have particular appeal to those studying and practising people management in the Asia-Pacific region.
Waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] and Palmer amaranth (Amaranthus palmeri S. Watson) are troublesome weeds of row-crop production in the United States. Their dioecious reproductive systems ensure outcrossing, facilitating rapid evolution and distribution of resistances to multiple herbicides. Little is known, however, about the genetic basis of dioecy in Amaranthus species. In this work, we use restriction site–associated DNA sequencing (RAD-Seq) to investigate the genetic basis of sex determination in A. tuberculatus and A. palmeri. For each species, approximately 200 plants of each sex were sampled and used to create RAD-Seq libraries. The resulting libraries were separately bar-coded and then pooled for sequencing with the Illumina platform, yielding millions of 64-bp reads. These reads were analyzed to identify sex-specific and sex-biased sequences. We identified 345 male-specific sequences from the A. palmeri data set and 2,754 male-specific sequences in A. tuberculatus. An unexpected 723 female-specific sequences were identified in a subset of the A. tuberculatus females; subsequent research, however, indicated female specificity of these markers was limited to the population from which they were identified. Primer sets designed to specifically amplify male-specific sequences were tested for accuracy on multiple, geographically distinct populations of A. tuberculatus and A. palmeri, as well as other Amaranthus species. Two primer sets for A. palmeri and four primer sets for A. tuberculatus were each able to distinguish between male and female plants with at least 95% accuracy. In the near term, sex-specific markers will be useful to the A. tuberculatus and A. palmeri research communities (e.g., to predict sex for crossing experiments). In the long term, this research will provide the foundational tools for detailed investigations into the molecular biology and evolution of dioecy in weedy Amaranthus species.
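As a toy illustration of the presence/absence logic behind a "male-specific" sequence (observed in essentially all males, absent from all females), consider the sketch below; the tag sets, sample names, and 90% threshold are hypothetical stand-ins for real RAD-Seq genotype calls.

```python
# Presence/absence screen for sex-specific RAD-Seq tags: keep tags seen
# in at least min_male_frac of males and in no females.
def sex_specific_tags(tags_by_sample, sex_by_sample, min_male_frac=0.9):
    males = [s for s, sex in sex_by_sample.items() if sex == "M"]
    females = [s for s, sex in sex_by_sample.items() if sex == "F"]
    female_tags = set().union(*(tags_by_sample[s] for s in females))
    counts = {}
    for s in males:
        for t in tags_by_sample[s]:
            counts[t] = counts.get(t, 0) + 1
    return {t for t, n in counts.items()
            if n >= min_male_frac * len(males) and t not in female_tags}

samples = {"m1": {"tagA", "tagB"}, "m2": {"tagA", "tagC"},
           "f1": {"tagB", "tagC"}, "f2": {"tagC"}}
sexes = {"m1": "M", "m2": "M", "f1": "F", "f2": "F"}
print(sex_specific_tags(samples, sexes))  # {'tagA'}
```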
Mass gatherings are growing in frequency. Religious, or in this case ‘Mass’, mass gatherings are also growing in complexity, requiring considerable effort from nations hosting a Papal Mass. Ireland hosted a Papal Mass in 1979, when the prospect of terrorism at such events was significantly lower. Large high-profile events such as a Papal Mass offer a platform, via the media and social media, for widespread coverage of adverse events. In 2018, an estimated 500,000 guests were scheduled to attend a Papal Mass in Phoenix Park, Dublin, a bounded 1,700-hectare park in the center of Dublin.
Aim:
To develop a medical plan estimating the numbers of people requiring medical attention at a Papal Mass held in Ireland in late August 2018, and to compare these estimates with the actual numbers treated post-event. This study aims to reduce the medical impact of such an event on local receiving hospitals through plans that effectively manage medical- and trauma-related presentations on site.
Methods:
A literature review of reports on medical care at Papal Mass gatherings worldwide found a range of predicted medical attendance of 21–61 per 10,000 attendees. On that basis, on-site facilities, facilities along travel routes, and an access-point system for medical care were prepared for a crowd of 500,000.
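Scaling these literature rates to the anticipated crowd gives the planning range of expected presentations; the snippet below makes the arithmetic explicit.

```python
# Literature rates (presentations per 10,000 attendees) scaled to the
# anticipated crowd of 500,000.
expected_crowd = 500_000
for rate_per_10k in (21, 61):
    expected = rate_per_10k * expected_crowd / 10_000
    print(f"{rate_per_10k}/10,000 -> {expected:.0f} presentations")
# -> 1050 and 3050 presentations across the predicted range
```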
Results:
Only one of the six receiving hospitals in Dublin saw an increase over its average presentations on the day. Attendance was significantly reduced by the weather. A total of 261 patients were treated on site, in line with the lower predicted rate, with 31 patients treated in hospital and 17 transported off-site.
Discussion:
A predictable number of patients presented for medical care, and on-site medical services reduced transports to hospital. Reduced attendance ensured facilities were sufficient, but they could have come under pressure at the predicted attendance of 500,000.
The emergence of and reaction to policy scandals has been usefully studied through comparative case studies. Far less attention has been devoted, however, to the study of such scandals in long-term historical context. With the aim of illuminating longer-term social processes which shape the likelihood that (health)care scandals emerge, we delineate three areas where such changes are visible: a) changing formats of social relations and emotions within and around care provision, and thereby understandings of and demands for compassionate care; b) heightened organisational and political sensitivity to failings; and c) changes in media reporting on healthcare failings, as well as in policy-makers’ responsiveness to and manipulation of media. We consider the 2013 Mid Staffordshire scandal in the English National Health Service and the extant policy literature on this scandal to help illuminate the added analytical value of our long-term approach. In the final section we explore the interconnection of the three processes and how longer-term approaches open up new vistas for policy analysis.
Patient expectancy is an important source of placebo effects in antidepressant clinical trials, but all previous studies measured expectancy before the initiation of medication treatment. Little is known about how expectancy changes during the course of treatment and how such changes influence clinical outcome. Consequently, we undertook the first analysis to date of in-treatment expectancy during antidepressant treatment, to identify its clinical and demographic correlates, typical trajectories, and associations with treatment outcome.
Methods
Data were combined from two randomized controlled trials of antidepressant medication for major depressive disorder in which baseline and in-treatment expectancy assessments were available. Machine learning methods were used to identify pre-treatment clinical and demographic predictors of expectancy. Multilevel models were implemented to test the effects of expectancy on subsequent treatment outcome, disentangling within- and between-patient effects.
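The within- versus between-patient decomposition is commonly achieved by person-mean centring the time-varying predictor before fitting the multilevel model. A minimal sketch of that approach, with synthetic data and illustrative variable names (not the trials' data), is shown below.

```python
# Person-mean centring to disentangle within- vs between-patient
# expectancy effects in a multilevel model (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_pat, n_visits = 60, 5
df = pd.DataFrame({
    "patient":    np.repeat(np.arange(n_pat), n_visits),
    "expectancy": rng.normal(5, 1.5, n_pat * n_visits),
    "depression": rng.normal(20, 6, n_pat * n_visits),
})
# Between-patient component: each patient's mean expectancy.
df["exp_between"] = df.groupby("patient")["expectancy"].transform("mean")
# Within-patient component: visit-level deviation from that mean.
df["exp_within"] = df["expectancy"] - df["exp_between"]

m = smf.mixedlm("depression ~ exp_within + exp_between",
                df, groups=df["patient"]).fit()
print(m.summary())
```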
Results
Random forest analyses demonstrated that whereas more severe depressive symptoms predicted lower pre-treatment expectancy, in-treatment expectancy was unrelated to symptom severity. At each measurement point, increased in-treatment patient expectancy significantly predicted decreased depressive symptoms at the following measurement (B = −0.45, t = −3.04, p = 0.003). The greater the gap between expected treatment outcomes and actual depressive severity, the greater the subsequent symptom reductions were (B = 0.49, t = 2.33, p = 0.02).
Conclusions
Greater in-treatment patient expectancy is associated with greater subsequent depressive symptom reduction. These findings suggest that clinicians may benefit from monitoring and optimizing patient expectancy during antidepressant treatment. Expectancy may represent another treatment parameter, similar to medication compliance and side effects, to be regularly monitored during antidepressant clinical management.
Critical reflections on professional regulation have rarely taken a long-term perspective. In this chapter we draw on insights from process sociology, following in a tradition shaped chiefly by the works of Norbert Elias, in order to make sense of changes in professional–patient interactions and the implications of these changes for societal expectations of health care and the regulation of doctors. We focus this discussion on the regulatory apparatus of medical practice within England, where a shift towards increasing state involvement in regulation has taken place. This widening of ‘who regulates’ has been accompanied by a broadened understanding of quality clinical practice, with implications for ‘what is regulated’. Tensions have become apparent here between the nature of good practice as set out by the regulator and the state, and what is being evaluated in practice by current formats of regulatory assessment. To understand the emergence of these tensions as well as their impact, a longer-term perspective provides especially valuable analytical purchase, as we aim to show in this chapter.
In the section on informalisation and functional democratisation we describe various longer-term tendencies in professional–patient power dynamics – especially the development of more informal, less asymmetric relations and interactions. We then proceed in subsequent sections to consider three key implications and challenges of such informalisation, referring to changes in the practices and regulation of doctors in the United Kingdom (UK) by way of illustration. First, we argue that performances of compassion and care have become more central to understandings of ‘quality’ practice, as reflected in recent regulatory policies, but suggest that less asymmetric and structured interactions are also less stable – posing problems for quality assurance/regulation. Second, we consider that while regulators commonly seek to reflect and uphold norms and expectations regarding standards of care, the ‘softer’ less formalised features of care are harder to capture within the inevitably bureaucratic features of health care regulation and revalidation – for example, whereby professionals are required to show evidence of patient feedback, compliments and complaints. Third, we move on to explore how informalisation processes are also bound up with moves away from a blind, blanket, profession-based trust, underpinned by classic professional regulation, towards a more critical, interaction-won trust.
Recent modelling estimates that up to two-thirds of new HIV infections among men who have sex with men occur within partnerships, indicating the importance of dyadic HIV prevention efforts. Although new interventions are available to promote dyadic health-enhancing behaviours, minimal research has examined which factors influence partners’ mutual engagement in these behaviours, a critical component of intervention success. Actor-partner interdependence modelling was used to examine associations between relationship characteristics and several dyadic outcomes theorised as antecedents to health-enhancing behaviours: planning and decision making, communication, and joint effort. Among 270 male-male partnerships, relationship satisfaction was significantly associated with all three outcomes for actors (p = .02, .02, .06, respectively). Latino men reported poorer planning and decision making (actor p = .032) and communication (partner p = .044). Alcohol use was significantly and negatively associated with all outcomes except actors’ planning and decision making (actors: p = .11, .038, .004, respectively; partners: p = .03, .056, .02, respectively). Having a sexual agreement was significantly associated with actors’ planning and decision making (p = .007) and communication (p = .008). Focusing on interactions between partners produces a more comprehensive understanding of male couples’ ability to engage in health-enhancing behaviours. This knowledge identifies new and important foci for tailoring dyadic HIV prevention and care interventions.
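As a simplified schematic of actor-partner interdependence modelling (the study's exact estimation approach is not given in the abstract), the sketch below regresses each partner's outcome on his own (actor) and his partner's (partner) relationship satisfaction, with the two outcomes per couple clustered within dyads; all data and effect sizes are synthetic.

```python
# APIM sketch for indistinguishable dyads: mixed model with a random
# intercept per dyad, actor and partner predictors (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_dyads = 270
sat = rng.normal(5, 1, (n_dyads, 2))   # satisfaction for both partners
rows = []
for d in range(n_dyads):
    for i in (0, 1):
        rows.append({
            "dyad": d,
            "actor_sat": sat[d, i],
            "partner_sat": sat[d, 1 - i],
            # assumed actor (0.4) and partner (0.2) effects plus noise
            "communication": 3 + 0.4 * sat[d, i] + 0.2 * sat[d, 1 - i]
                             + rng.normal(0, 1),
        })
df = pd.DataFrame(rows)

m = smf.mixedlm("communication ~ actor_sat + partner_sat",
                df, groups=df["dyad"]).fit()
print(m.summary())
```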