Psychiatric comorbidity is common in children and adolescents with CHDs. Early recognition and evidence-based treatments are crucial to prevent long-term consequences. To support early identification and reduce stigma, we 1) developed and 2) tested the usability and acceptability of online information material on common mental health disorders targeted at healthcare professionals and affected families. Website content was shaped by insights from interviews with healthcare professionals across sectors, parents, and adolescents. Evaluations demonstrated promising acceptability and usability of the first prototype but indicated the need for improvements in specific aspects of content, navigation, and overall aesthetics.
Guidelines recommend screening for psychiatric co-morbidities in patients with congenital heart defects alongside cardiac outpatient follow-ups. These recommendations are not implemented in Denmark. This study aimed to investigate the psychiatric co-morbidities in children and adolescents with Fontan circulation in Denmark and to evaluate the feasibility of an online screening measure for psychiatric disorders.
Methods:
Children, adolescents, and their families completed the Development and Well-Being Assessment questionnaire and a questionnaire about received help online. Development and Well-Being Assessment ratings present psychiatric diagnoses in accordance with ICD-10 and DSM-5. Parent-reported received psychiatric help is also presented. Feasibility data are reported as participation rate (completed Development and Well-Being Assessments) and parental/adolescent acceptability from the feasibility questionnaire.
Results:
The participation rate was 27%. Of the participating children and adolescents, 53% (ICD-10)/59% (DSM-5) met full diagnostic criteria for at least one psychiatric diagnosis. Of these, 50% had not received any psychiatric or psychological help. Only 12% of participants had an a priori psychiatric diagnosis.
Conclusions:
We found that a large proportion of children and adolescents with Fontan circulation are underdiagnosed and undertreated for psychiatric disorders. The results from our study emphasise the need for psychiatric screening in this patient group. Development and Well-Being Assessment may be too comprehensive for online electronic screening in children and adolescents with CHD.
Adverse factors in the psychosocial work environment are associated with the onset of depression among those without a personal history of depression. However, the evidence is sparse regarding whether adverse work factors can also play a role in depression recurrence. This study aimed to prospectively examine whether factors in the psychosocial work environment are associated with first-time and recurrent treatment for depression.
Methods
The study included 24,226 participants from the Danish Well-being in Hospital Employees study. We measured ten individual psychosocial work factors and three theoretical constructs (effort–reward imbalance, job strain and workplace social capital). We ascertained treatment for depression through registrations of hospital contacts for depression (International Statistical Classification of Diseases and Related Health Problems version 10 [ICD-10]: F32 and F33) and redeemed prescriptions of antidepressant medication (Anatomical Therapeutic Chemical [ATC]: N06A) in Danish national registries. We estimated the associations between work factors and treatment for depression for up to 2 years after baseline among those without (first-time treatment) and with (recurrent treatment) a personal history of treatment for depression before baseline. We excluded participants registered with treatment within 6 months before baseline. In supplementary analyses, we extended this washout period to up to 2 years. We applied logistic regression analyses with adjustment for confounding.
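The reported associations are odds ratios from logistic regression. For a single binary work factor, the crude (unadjusted) odds ratio and its Wald 95% confidence interval can be sketched from a 2×2 table as below; this is our own illustration with made-up counts, not the study's code or data:

```python
import math

def odds_ratio_wald_ci(exp_cases, exp_noncases, unexp_cases, unexp_noncases, z=1.96):
    """Crude odds ratio for a binary exposure vs. a binary outcome,
    with a Wald confidence interval computed on the log-odds scale."""
    or_ = (exp_cases * unexp_noncases) / (exp_noncases * unexp_cases)
    # standard error of log(OR) from the four cell counts
    se = math.sqrt(1 / exp_cases + 1 / exp_noncases + 1 / unexp_cases + 1 / unexp_noncases)
    log_or = math.log(or_)
    return or_, math.exp(log_or - z * se), math.exp(log_or + z * se)

# illustrative counts only (not the study's data)
or_, lo, hi = odds_ratio_wald_ci(10, 90, 5, 95)
```

The study's reported estimates additionally adjust for confounders, which requires a full logistic regression rather than this 2×2 shortcut.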
Results
Among 21,156 (87%) participants without a history of treatment for depression, 350 (1.7%) had first-time treatment during follow-up. Among the 3070 (13%) participants with treatment history, 353 (11%) had recurrent treatment during follow-up. Those with a history of depression generally reported a more adverse work environment than those without such a history. Baseline exposure to bullying (odds ratio [OR] = 1.72, 95% confidence interval [95% CI]: 1.30–2.32), and to some extent also low influence on work schedule (OR = 1.27, 95% CI: 0.97–1.66) and job strain (OR = 1.24, 95% CI: 0.97–1.57), was associated with first-time treatment for depression during follow-up. Baseline exposure to bullying (OR = 1.40, 95% CI: 1.04–1.88), lack of collaboration (OR = 1.31, 95% CI: 1.03–1.67) and low job control (OR = 1.27, 95% CI: 1.00–1.62) were associated with recurrent treatment for depression during follow-up. However, most work factors were not associated with treatment for depression. Using a 2-year washout period resulted in similar or stronger associations.
Conclusions
Depression constitutes a substantial morbidity burden in the working-age population. Specific adverse working conditions were associated with first-time and recurrent treatment for depression and improving these may contribute to reducing the onset and recurrence of depression.
The population of long-term survivors with CHDs is increasing due to better diagnostics and treatment. This has revealed many co-morbidities including different neurocognitive difficulties. However, the prevalence of psychiatric disorders among children and adolescents and the specific types of disorders they may experience are unclear. We systematically reviewed the existing literature in which psychiatric diagnoses or psychiatric symptoms were investigated in children and adolescents (aged 2–18 years) with CHDs and compared with a heart-healthy control group or normative data. The searches were conducted in three databases: PubMed, PsycINFO, and Embase. We included 20 articles reporting on 8035 unique patients with CHDs. Fourteen articles reported on psychological symptoms, four reported on psychiatric diagnoses, and two reported on both symptoms and diagnoses. We found that children and adolescents with a CHD had a higher prevalence of attention deficit hyperactivity disorder (ranging between 1.4 and 9 times higher) and autism (ranging between 1.8 and 5 times higher) than controls, but inconsistent results regarding depression and anxiety.
On average, thirty percent of patients in internet-based treatments do not complete the treatment program. The majority of studies predicting adherence have focused on baseline variables. While some consistent predictors have emerged (e.g. gender, education), they are insufficient for guiding clinicians in identifying patients at risk of dropout. More precise predictors are needed. More recently, prediction studies have started to explore process variables, such as early response to treatment or program usage.
Objectives
To investigate:
i) How much variance in adherence is explained by baseline symptoms and sociodemographic variables?
ii) Can we improve the model by including early response and program usage as predictors?
iii) What is the predictive accuracy of the most parsimonious regression model?
Methods
Data will be extracted from the Danish ‘Internetpsychiatry’ clinic, which delivers guided internet-based cognitive behavioural therapy for depression. Sociodemographic data are collected upon application, and symptoms of depression and anxiety are measured at the start of treatment. Further, symptoms of depression are measured between each session of the online treatment program. Early response to treatment will be conceptualized as the individual regression slope of depression scores for each patient during the first four weeks of treatment. Program usage data will be collected from the online treatment platform (e.g. number of words per message to therapists, time spent on each session during the first four weeks, number of logins during the first four weeks).
Predictors of adherence will be examined in a hierarchical logistic regression. Models will be compared using ANOVA. The most parsimonious model will be determined using the Akaike Information Criterion. Receiver operating characteristic curve analyses will be used to assess the classification accuracy of the model.
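For equally spaced weekly measurements, the individual regression slope used as the early-response predictor reduces to an ordinary least-squares slope of score on week index. A minimal sketch (our own illustration, not the clinic's code):

```python
def early_response_slope(weekly_scores):
    """OLS slope of depression scores regressed on week index (0, 1, 2, ...).
    A negative slope indicates symptom improvement during early treatment."""
    n = len(weekly_scores)
    xbar = (n - 1) / 2                      # mean of the week indices 0..n-1
    ybar = sum(weekly_scores) / n
    num = sum((x - xbar) * (y - ybar) for x, y in enumerate(weekly_scores))
    den = sum((x - xbar) ** 2 for x in range(n))
    return num / den

# a patient improving by two points per week
print(early_response_slope([20, 18, 16, 14]))  # → -2.0
```

Each patient's slope would then enter the hierarchical logistic regression as one process-variable predictor alongside the program-usage measures.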
Results
Analyses have not yet been conducted. Results will be available for presentation at the conference.
Conclusions
Determining more accurate predictors of adherence in internet-based treatments is the first step towards improving adherence. Research findings need to be translated into clinically useful guidelines that may inform clinical decision making. Findings from this study could potentially be implemented as a system that monitors patients’ program usage and symptom development and alerts therapists if a patient is at risk of dropout.
While effective project planning is crucial for the success of a clinical research project, being able to execute the plan is even more important. In Denmark, approval for health research projects is applied for at regional or national committees on health research ethics, which have been reluctant to approve clinical research projects involving forensic psychiatric in-patients, due to the admission usually being pursuant to treatment sanctions. However, recently we received approval for a clinical research project exclusively targeted towards inpatients at a large medium secure forensic psychiatric facility in Denmark.
Objectives
To describe the process of project execution, from planning to manuscript submission, which is inherently multi-faceted and beset with stress factors, and to explore how theory, knowledge, and the project itself connect with clinical practice and clinical research.
Methods
Qualitative data collection while undertaking an exploratory, open-label, non-randomised weight-reducing trial with a glucagon-like peptide-1 receptor agonist.
Results
Challenges will be presented in finding, screening, motivating, and recruiting potential study participants; in obtaining valid informed consent from participants and other stakeholders; in team communication and the allocation of responsibilities and accountabilities within the team; and in relation to the Pareto Principle, scope creep, building project reports manually, real-time data gathering, and other, at times unpredictable, project deliverables.
Conclusions
Experiences of the hospital staff (psychiatrists, doctors, and nurses) in the execution of the project will be presented; the investigation was performed and made possible through the participation of their forensic psychiatric in-patients.
Improved survival has led to a growing population of adults with congenital heart disease (CHD), followed by numerous reports of late complications. Liver disease is a known complication in some patients, with most studies focusing on Fontan-associated liver disease. Whether liver disease also occurs in other patients with CHD has not been fully investigated. Elevated central venous pressure is considered pivotal in the development of Fontan-associated liver disease, and other patients with alterations in central venous pressure may also be at risk of developing liver fibrosis. Many patients with tetralogy of Fallot have severe pulmonary regurgitation, which can lead to elevated central venous pressure, and may therefore be at risk of developing liver fibrosis. We investigated whether liver fibrosis is present in these patients.
Materials and methods:
Ten patients (24–56 years) with tetralogy of Fallot and pulmonary regurgitation were investigated for liver fibrosis. All patients were examined with magnetic resonance elastography of liver, hepatobiliary iminodiacetic acid scan, indocyanine green elimination by pulse spectrophotometry, elastography via FibroScan, abdominal ultrasound including liver elastography, and blood samples including liver markers.
Results:
Three out of ten patients had findings indicating possible liver fibrosis. Two of these had a liver biopsy performed, which revealed fibrosis stage 1 and 2, respectively. The same three patients had an estimated elevated central venous pressure in previous echocardiograms.
Conclusions:
Mild liver fibrosis was present in selected patients with tetralogy of Fallot and may be related to elevated central venous pressure.
In January of 2010, North Carolina (NC) USA implemented state-wide Trauma Triage Destination Plans (TTDPs) to provide standardized guidelines for Emergency Medical Services (EMS) decision making. No study exists to evaluate whether triage behavior has changed for geriatric trauma patients.
Hypothesis/Problem:
We investigated the impact of the NC TTDPs on EMS triage of geriatric trauma patients meeting physiologic criteria for serious injury, primarily based on whether these patients were transported to a trauma center.
Methods:
This is a retrospective cohort study of geriatric trauma patients transported by EMS from March 1, 2009 through September 30, 2009 (pre-TTDP) and March 1, 2010 through September 30, 2010 (post-TTDP) meeting the following inclusion criteria: (1) age 50 years or older; (2) transported to a hospital by NC EMS; (3) experienced an injury; and (4) meeting one or more of the NC TTDP’s physiologic criteria for trauma (n = 5,345). Data were obtained from the Prehospital Medical Information System (PreMIS). Data collected included proportions of patients transported to a trauma center categorized by specific physiologic criteria, age category, and distance from a trauma center.
Results:
The proportion of patients transported to a trauma center pre-TTDP (24.4% [95% CI 22.7%-26.1%]; n = 604) was similar to the proportion post-TTDP (24.4% [95% CI 22.9%-26.0%]; n = 700). For patients meeting specific physiologic triage criteria, the proportions of patients transported to a trauma center were also similar pre- and post-TTDP: systolic blood pressure <90 mmHg (22.5% versus 23.5%); respiratory rate <10 or >29 (23.2% versus 22.6%); and Glasgow Coma Scale (GCS) score <13 (26.0% versus 26.4%). Patients aged 80 years or older were less likely to be transported to a trauma center than younger patients in both the pre- and post-TTDP periods.
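Proportions with 95% confidence intervals of the kind reported above can be reproduced with a standard Wald interval for a binomial proportion. A sketch follows; the total denominator is inferred from the reported count and percentage, so treat it as approximate:

```python
import math

def wald_proportion_ci(successes, total, z=1.96):
    """Point estimate and Wald 95% confidence interval for a binomial proportion."""
    p = successes / total
    se = math.sqrt(p * (1 - p) / total)
    return p, p - z * se, p + z * se

# pre-TTDP: 604 transported out of roughly 2475 eligible patients (~24.4%)
p, lo, hi = wald_proportion_ci(604, 2475)
```

With these inputs the interval comes out near the reported 22.7%-26.1%; exact agreement depends on the true denominator and the interval method the authors used.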
Conclusions:
State-wide implementation of a TTDP had no discernible effect on the proportion of patients 50 years and older transported to a trauma center. Under-triage remained common and became increasingly prevalent among the oldest adults. Research to understand the uptake of guidelines and protocols into EMS practice is critical to improving care for older adults in the prehospital environment.
Climate and weather conditions may have substantial effects on the ecology of both parasites and hosts in natural populations. The strength and shape of the effects of weather on parasites and hosts are likely to change as global warming affects local climate. These changes may in turn alter fundamental elements of parasite–host dynamics. We explored the influence of temperature and precipitation on parasite prevalence in a metapopulation of avian hosts in northern Norway. We also investigated if annual change in parasite prevalence was related to winter climate, as described by the North Atlantic Oscillation (NAO). We found that parasite prevalence increased with temperature within-years and decreased slightly with increasing precipitation. We also found that a mild winter (positive winter NAO index) was associated with higher mean parasite prevalence the following year. Our results indicate that both local and large scale weather conditions may affect the proportion of hosts that become infected by parasites in natural populations. Understanding the effect of climate and weather on parasite–host relationships in natural populations is vital in order to predict the full consequence of global warming.
An insect trap constructed using three-dimensional (3D) printing technology was tested in potato (Solanum tuberosum Linnaeus; Solanaceae) fields to determine whether it could substitute for the standard yellow sticky card used to monitor Bactericera cockerelli (Šulc) (Hemiptera: Psylloidea: Triozidae). Sticky cards have shortcomings that prompted a search for a replacement: cards are messy, require weekly replacement, are expensive to purchase, and accumulate large numbers of nontarget insects. Bactericera cockerelli on sticky cards also deteriorate enough that specimens cannot be tested reliably for the presence of vectored plant pathogens. A prototype trap constructed using 3D printing technology for monitoring Diaphorina citri Kuwayama (Hemiptera: Psylloidea: Liviidae) was tested for monitoring B. cockerelli. The trap was designed to attract B. cockerelli visually to the trap and then funnel specimens into preservative-filled vials at the trap bottom. Prototype traps were paired against yellow sticky cards at multiple fields to compare the captures of B. cockerelli between cards and traps. The prototype trap was competitive with sticky cards early in the growing season when B. cockerelli numbers were low. We estimated that two or three prototype traps would collect as many B. cockerelli as one sticky card under these conditions. Efficacy of the prototype declined as B. cockerelli numbers increased seasonally. The prototype trap accumulated nontarget taxa that are common on sticky cards (especially Thysanoptera and Diptera), and was also found to capture taxa of possible interest in integrated pest management research, including predatory insects, parasitic Hymenoptera, and winged Aphididae (Hemiptera), suggesting that the traps could be useful outside of the purpose targeted here. We believe that 3D printing technology has substantial promise for developing monitoring tools that exploit behavioural traits of the targeted insect.
Ongoing work includes the use of this technology to modify the prototype, with a focus on making it more effective at capturing psyllids and less susceptible to capture of nontarget species.
Herbicide-resistant crops (HRCs), particularly glyphosate-resistant soybean, are increasingly important in Latin America. Prior to commercial release of these crops, their short- and long-term risks should be thoroughly assessed. A risk assessment should include the identification and characterization of potential hazards and an estimation of the likelihood of these hazards occurring. For HRCs the agro-ecological hazards are mostly related to the occurrence of herbicide-resistant (HR) weeds and crop volunteers and the adverse effects from the use of pesticides within the agricultural area. Herbicide-resistant rice is used as a case study to visualize the key components of such a risk assessment. For this purpose, a model that simulates a typical rain-fed rice production system in Central America was developed. The model was used to investigate the selection of HR weedy rice populations under various scenarios. Scenarios included contrasting weed management practices, hybridization levels between the commercial HR cultivated and weedy rice, and seed predation rates. Because risks may become apparent only after long-term cultivation of HR rice, simulations were run for a 10-yr period. In a cropping system relying on glufosinate-resistant rice for weed control, the model predicted that resistance to glufosinate would occur after 3 to 8 yr of monoculture. Increasing the hybridization level from 1 to 5% decreased the time for resistance to occur by 1 to 3 yr. Increasing annual rate of weedy rice seed predation at the soil surface delayed the development of resistance. Tillage as a weed control tactic also delayed the occurrence of resistance when compared with a no-till situation.
Mutants of Bacillus subtilis can be developed to overproduce Val in vitro. It was hypothesized that addition of Bacillus subtilis mutants to pig diets can be a strategy to supply the animal with Val. The objective was to investigate the effect of Bacillus subtilis mutants on growth performance and blood amino acid (AA) concentrations when fed to piglets. Experiment 1 included 18 pigs (15.0±1.1 kg) fed one of three diets containing either 0.63 or 0.69 standardized ileal digestible (SID) Val : Lys, or 0.63 SID Val : Lys supplemented with a Bacillus subtilis mutant (mutant 1). Blood samples were obtained 0.5 h before feeding and at 1, 2, 3, 4, 5 and 6 h after feeding and analyzed for AAs. In Experiment 2, 80 piglets (9.1±1.1 kg) were fed one of four diets containing 0.63 or 0.67 SID Val : Lys, or 0.63 SID Val : Lys supplemented with another Bacillus subtilis mutant (mutant 2) or its parent wild type. Average daily feed intake, daily weight gain and feed conversion ratio were measured on days 7, 14 and 21. On day 17, blood samples were taken and analyzed for AAs. On days 24 to 26, six pigs from each dietary treatment were fitted with a permanent jugular vein catheter, and blood samples were taken for AA analysis 0.5 h before feeding and at 1, 2, 3, 4, 5 and 6 h after feeding. In Experiment 1, Bacillus subtilis mutant 1 tended (P<0.10) to increase the plasma levels of Val at 2 and 3 h post-feeding, but this was not confirmed in Experiment 2. In Experiment 2, Bacillus subtilis mutant 2 and the wild type did not result in a growth performance different from the negative and positive controls. In conclusion, results obtained with the mutant strains of Bacillus subtilis were not better than results obtained with the wild-type strain, and for both strains, the results were not different from those of the negative control.
To limit tail biting incidence, most pig producers in Europe tail dock their piglets. This is despite EU Council Directive 2008/120/EC banning routine tail docking and allowing it only as a last resort. The paper aims to understand what it takes to fulfil the intentions of the Directive by examining economic results of four management and housing scenarios, and by discussing their consequences for animal welfare in the light of legal and ethical considerations. The four scenarios compared are: ‘Standard Docked’, a conventional housing scenario with tail docking meeting the recommendations for Danish production (0.7 m2/pig); ‘Standard Undocked’, which is the same as ‘Standard Docked’ but with no tail docking; and ‘Efficient Undocked’ and ‘Enhanced Undocked’, which have increased solid floor area (0.9 and 1.0 m2/pig, respectively), provision of loose manipulable materials (100 and 200 g straw per pig per day), and no tail docking. A decision tree model based on data from Danish and Finnish pig production suggests that Standard Docked provides the highest economic gross margin with the least tail biting. Given our assumptions, Enhanced Undocked is the least economic, although Efficient Undocked is better economically and both result in a lower incidence of tail biting than Standard Undocked but higher than Standard Docked. For a pig, being bitten is worse for welfare (repeated pain, risk of infections) than being docked, but to compare welfare consequences at a farm level means considering the number of affected pigs. Because of the high levels of biting in Standard Undocked, it has on average inferior welfare to Standard Docked, whereas the comparison of Standard Docked and Enhanced (or Efficient) Undocked is more difficult. In Enhanced (or Efficient) Undocked, more pigs than in Standard Docked suffer from being tail bitten, whereas all the pigs avoid the acute pain of docking endured by the pigs in Standard Docked.
We illustrate and discuss this ethical balance using numbers derived from the above-mentioned data. We discuss our results in the light of the EU Directive and its adoption and enforcement by Member States. Widespread use of tail docking seems to be accepted, mainly because the alternative steps that producers are required to take before resorting to it are not specified in detail. By tail docking, producers are acting in their own best interests. We suggest that for the practice of tail docking to be terminated in a way that benefits animal welfare, changes in the way pigs are housed and managed may first be required.
Societal aging is expected to impact the use of emergency medical services (EMS). Older adults are known to be high users of EMS. Our primary objective was to quantify the rate of EMS use by older adults in a Canadian provincial EMS system. Our secondary objective was to compare those transported with those not transported.
Methods
We analysed data from a provincial EMS database for emergency responses between January 1, 2010 and December 31, 2010 and included all older adults (≥65 years) requesting EMS for an emergency call. We described EMS use in relation to age, sex, and resources.
Results
There were 30,653 emergency responses for older adults in 2010, representing close to 50% of the emergency call volume and an overall response rate of 202.8 responses per 1,000 population 65 years and older. The mean age was 79.9±8.5 years, and 57.3% were female. The median paramedic-determined Canadian Triage and Acuity Scale (CTAS) score was 3 and the mean on-scene time was 24.2 minutes. Non-transported calls (12.3%) for the elderly involved predominantly (54.9%) female patients of similar mean age (78.3 years) but lower acuity (CTAS 5) and longer average on-scene times (32.6 minutes).
Conclusions
We confirmed the increasingly high rate of EMS use with age to be consistent with other industrialized populations. The low-priority and non-transport calls by older adults consumed considerable resources in this provincial system and might be the areas most malleable to meet the challenges facing EMS systems.
The objective of the current experiment was to compare the effects of supplementing mid-lactation dairy cows with all-rac-α-tocopheryl acetate (SyntvE), RRR-α-tocopheryl acetate (NatvE) or seaweed meal (Seaweed) in the presence of a Control group (no supplemental vitamin E or seaweed) on the concentration of α-tocopherol in plasma and milk, and antibody response following immunization. The hypothesis was that supplementation of dairy cows with vitamin E, regardless of its form, would increase plasma and milk α-tocopherol compared to the control diet and that this incremental response would be greater with NatvE than with SyntvE. Furthermore, it was hypothesized that vitamin E, regardless of its form, would provide an improved adaptive immune response to immunization compared with the Control diet, and that cows supplemented with Seaweed meal would produce a better adaptive immune response following immunization than cows in the Control group. Twenty-four Norwegian Red (NR) dairy cows in their mid-lactation were allocated randomly to the four treatments in a replicated Latin square design. The cows were fed on a basal diet of silage and concentrate on top of which the experimental supplements were provided. Plasma and milk α-tocopherol concentrations were higher in NatvE and SyntvE groups than in the other two groups. The RRR-α-tocopherol stereoisomer was the predominant form (>0·86), in both plasma and milk, whereas the remaining part was largely made up of the other three 2R stereoisomers (RRS, RSR and RSS). In cows fed the Control, Seaweed and NatvE, the proportion of the RRR-α-tocopherol stereoisomer in plasma and milk constituted >0·97 of the total α-tocopherol. Mid-lactation NR dairy cows had higher than adequate levels of plasma α-tocopherol (9·99 mg/l) even when not supplemented with an external source of vitamin E, suggesting that with a good quality silage these cows may not be at risk of vitamin E deficiency.
Furthermore, the present study shows that dairy cows in mid to late lactation have preferential uptake of RRR stereoisomer of α-tocopherol compared with other stereoisomers. All cows responded well to immunization with different antigens, but there were no significant group effects of the diet on the immune response measured.
In pig production, piglets are tail docked at birth in order to prevent tail biting later in life. In order to examine the effects of tail docking and docking length on the formation of neuromas, we used 65 pigs and the following four treatments: intact tails (n=18); leaving 75% (n=17); leaving 50% (n=19); or leaving 25% (n=11) of the tail length on the pigs. The piglets were docked between day 2 and 4 after birth using a gas-heated apparatus, and were kept under conventional conditions until slaughter at 22 weeks of age, where tails were removed and examined macroscopically and histologically. The tail lengths and diameters differed at slaughter (lengths: 30.6±0.6; 24.9±0.4; 19.8±0.6; 8.7±0.6 cm; P<0.001; tail diameter: 0.5±0.03; 0.8±0.02; 1.0±0.03; 1.4±0.04 cm; P<0.001, respectively). Docking resulted in a higher proportion of tails with neuromas (64 v. 0%; P<0.001), number of neuromas per tail (1.0±0.2 v. 0; P<0.001) and size of neuromas (1023±592 v. 0 μm; P<0.001). The results show that tail docking piglets using hot-iron cautery causes formation of neuromas in the outermost part of the tail tip. The presence of neuromas might lead to altered nociceptive thresholds, which need to be confirmed in future studies.
We consider two-dimensional one-sided convection of a solute in a fluid-saturated porous medium, where the solute decays via a first-order reaction. Fully nonlinear convection is investigated using high-resolution numerical simulations and a low-order model that couples the dynamic boundary layer immediately beneath the distributed solute source to the slender vertical plumes that form beneath. A transient-growth analysis of the boundary layer is used to characterise its excitability. Three asymptotic regimes are investigated in the limit of high Rayleigh number $\mathit{Ra}$, in which the domain is considered deep, shallow or of intermediate depth, and for which the Damköhler number $\mathit{Da}$ is respectively large, small or of order unity. Scaling properties of the flow are identified numerically and rationalised via the analytic model. For fully established high-$\mathit{Ra}$ convection, analysis and simulation suggest that the time-averaged solute transfer rate scales with $\mathit{Ra}$ and the plume horizontal wavenumber with $\mathit{Ra}^{1/2}$, with coefficients modulated by $\mathit{Da}$ in each case. For large $\mathit{Da}$, the rapid reaction rate limits the plume depth and the boundary layer restricts the rate of solute transfer to the bulk, whereas for small $\mathit{Da}$ the average solute transfer rate is ultimately limited by the domain depth and the convection is correspondingly weaker.
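The stated high-$\mathit{Ra}$ scalings can be summarised symbolically; the symbols $\overline{F}$ (time-averaged solute transfer rate) and $k$ (plume horizontal wavenumber) are our notation for the quantities named in the text, not symbols taken from it:

```latex
\[
  \overline{F} \sim c_{F}(\mathit{Da})\,\mathit{Ra},
  \qquad
  k \sim c_{k}(\mathit{Da})\,\mathit{Ra}^{1/2},
\]
% c_F and c_k are the Da-dependent prefactors that modulate each scaling
```

These relations encode the statement that the transfer rate grows linearly in $\mathit{Ra}$ and the plume spacing narrows as $\mathit{Ra}^{-1/2}$, with the reaction strength entering only through the prefactors.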