The proportion of physician-investigators involved in biomedical research is shrinking even as the need for high-quality, interdisciplinary research is growing. Building the physician-investigator workforce is thus a pressing concern. Flexible, “light-weight” training modalities can help busy physician-investigators prepare for key stages of the research life cycle and personalize their learning to their own needs. Such training can also support researchers from diverse backgrounds and lighten the work of mentors.
Materials and Methods:
The University of Pittsburgh’s Institute for Clinical Research Education designed the Stackables Microcredentials in Clinical and Translational Research (Stackables) program to provide flexible, online training to supplement and enhance formal training programs. This training utilizes a self-paced, just-in-time format along with an interactive, storytelling approach to sustain learner engagement. Learners earn badges for completing modules and certificates for completing “stacks” in key competency areas. In this paper, we describe the genesis and development of the Stackables program and report the results of a pilot study in which we evaluated changes in confidence in key skill areas from pretest to posttest, as well as engagement and perceived effectiveness.
Results:
Our Stackables pilot study showed statistically significant gains in learner confidence in all skill areas from pretest to posttest. Pilot participants reported that the module generated high levels of engagement and enhanced their skills, knowledge, and interest in the subject.
Conclusions:
Stackables provide an important complement to formal coursework by focusing on discrete skill areas and allowing learners to access the training they need when they need it.
Most farmland in the US Corn Belt is used to grow row crops (e.g., corn, soybean) at large scales, which are highly processed before entering the human food stream, rather than specialty crops grown in smaller areas and meant for direct human consumption (table food). Bolstering local table food production close to urban populations in this region through peri-urban agriculture (PUA) could enhance sustainability and resilience. Understanding factors influencing PUA producers' preferences and willingness to produce table food would enable supportive planning and policy efforts. This study combined land use visualization and survey data to examine the potential for increased local table food production for the US Corn Belt. We developed a spatial visualization of current agricultural land use and a future scenario with increased table food production designed to meet 50% of dietary requirements for a metropolitan population in 2050. A survey was administered to row crop (1360) and specialty crop (55) producers near Des Moines, Iowa, US to understand current and intended agricultural land use and factors influencing production. Responses from 316 row crop and 25 specialty crop producers were eligible for this analysis. A future scenario with increased table food production would require less than 3% of available agricultural land and some additional producers (approximately 130, primarily for grain production). Survey responses indicated PUA producers planned small increases in table food production in the next three to five years. Producer plans, including land rental for table food production, could provide approximately 25% of residents' fruit, vegetables, and grains, an increase from the baseline of 2%. Row crop producers ranked food safety regulations, and specialty crop producers ranked labor concerns, as strong influences on their decision-making. Both groups indicated that crop insurance and processing facilities were also important. Increasing table food production by clustering mid-scale operations to increase economies of scale and strengthening supply chains and production infrastructure could provide new profitable opportunities for farmers and more resilient food systems for growing urban regions in the US Corn Belt. Continuing to address producer factors and landscape-scale environmental impacts will be critical in considering food system sustainability challenges holistically.
Alzheimer’s disease (AD), a leading cause of dementia worldwide, affected an estimated 47 million people in 2015, placing a burden of over $1 trillion on health systems. Subclinical markers of AD pathology are seen many years before the clinical onset of dementia, suggesting that steps could be taken to prevent progression to disease in healthy individuals. Sleep optimizes cognition by creating a window of opportunity to consolidate memories, prune synaptic networks, and clear waste products. Studies that characterize the relationship between sleep and cognitive function prior to the onset of clinical AD could guide research into effective methods of delaying AD onset or preventing it altogether. The objective of our study is to describe how sleep quality and quantity correlate with performance on cognitive assessments within a healthy, aging population.
Participants and Methods:
Seventeen participants between 62 and 82 years of age, enrolled in an ongoing clinical trial assessing the effects of melatonin (5 mg daily) versus placebo, were included in our study. Participants were observed over a 2-month period, during which no experimental interventions were administered. At study entry, participants underwent a comprehensive neuropsychological evaluation covering the cognitive domains of attention, memory, speed of information processing, language, and executive functioning, as well as mood. Afterwards, all participants wore an actigraphy and light-monitoring watch (Philips Actiwatch Spectrum Pro) for 8 weeks to evaluate their sleep habits. Pearson and Spearman partial correlations were used to evaluate relationships between objective sleep parameters and baseline cognitive function test scores.
Results:
Aberrations in sleep length, sleep fragmentation, and daytime activity measures correlated significantly with cognitive performance on memory, language, visuospatial skills, and speed of processing tests (p < 0.05). Greater variability of awakenings at nighttime was associated with better scores on memory tests but worse scores on language tests. Longer sleep times were associated with worse language scores, while greater variability in daily activity correlated with poorer scores on visuospatial skills and speed of processing tests.
Conclusions:
This study establishes a framework for obtaining longitudinal sleep data in conjunction with serial cognitive function testing, encouraging further exploration into how sleep metrics affect specific domains of cognitive function. Findings suggest that having a less consistent sleep routine correlates with poorer cognitive function across multiple domains. The authors recommend broader analysis of actigraphy and cognitive function testing as objective measures of sleep and cognition in research and clinical practice.
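As an illustrative aside to the abstract above: the analysis it describes pairs actigraphy-derived sleep parameters with baseline neuropsychological scores using Pearson and Spearman partial correlations. The short Python sketch below shows one such Spearman partial correlation using the pingouin package; the data file, column names, and the choice of age as the covariate are hypothetical stand-ins, not details taken from the study.

```python
import pandas as pd
import pingouin as pg

# Hypothetical dataset: one row per participant, with actigraphy-derived sleep
# parameters, baseline cognitive test scores, and age.
df = pd.read_csv("sleep_cognition.csv")

# Spearman partial correlation between one sleep measure and one cognitive
# score, controlling for age (all names are illustrative placeholders).
result = pg.partial_corr(
    data=df,
    x="sleep_fragmentation_index",
    y="delayed_recall_score",
    covar=["age"],
    method="spearman",
)
print(result[["n", "r", "p-val"]])
```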
Illicit substance use is dangerous in both acute and chronic forms, frequently resulting in lethal poisoning, addiction, and other negative consequences. As in research on other psychiatric conditions, whose ultimate goal is to enable effective prevention and treatment, studies of substance use have focused on factors elevating the risk for the disorder. The rapid growth of the substance use problem despite the effort invested in fighting it, however, suggests the need to change the research approach. Instead of attempting to identify risk factors, whose neutralization is often infeasible if not impossible, it may be more promising to reverse the perspective and systematically examine the factors enhancing the aspect of liability that shares the same dimension as risk but is opposite to it: resistance to substance use. Resistance factors, which enable the majority of the population to remain unaffected despite the ubiquity of psychoactive substances, may be more amenable to translation. While the resistance aspect of liability is symmetric to risk, the resistance approach requires substantial changes in sampling (high-resistance rather than high-risk) and the use of quantitative indices of liability. This article provides an overview of, and a practical approach to, research on resistance to substance use/addiction, currently implemented in an NIH-funded project. The project benefits from unique opportunities afforded by the data originating from two longitudinal twin studies, the Virginia Twin Study of Adolescent Behavioral Development and the Minnesota Twin Family Study. The methodology described is also applicable to other psychiatric disorders.
Prevention programs that are ‘transdiagnostic’ may be more cost-effective and beneficial, in terms of reducing levels of psychopathology in the general population, than those focused on a specific disorder. This randomized controlled study evaluated the efficacy of one such intervention program called Resilience Training (RT).
Methods
College students who reported mildly elevated depressive or subclinical psychotic symptoms (‘psychotic experiences’ (PEs)) (n = 107) were randomized to receive RT (n = 54) or to a waitlist control condition (n = 53). RT consists of a four-session intervention focused on improving resilience through the acquisition of mindfulness, self-compassion, and mentalization skills. Measures of symptoms and of these resilience-enhancing skills were collected before and after the 4-week RT/waitlist period, with a follow-up assessment 12 months later.
Results
Compared to the waitlist control group, RT participants reported significantly greater reductions in PEs, distress associated with PEs, depression, and anxiety, as well as significantly greater improvements in resilience, mindfulness, self-compassion, and positive affect, following the 4-week RT/waitlist period (all p < 0.03). Moreover, improvements in resilience-promoting skills were significantly correlated with symptom reductions (all p < 0.05). Lastly, the RT-related reductions in PEs and associated distress were maintained at the 12-month follow-up assessment.
Conclusions
RT is a brief, group-based intervention associated with improved resilience and reduced symptoms of psychopathology, with sustained effects on PEs, in transdiagnostically at-risk young adults. Follow-up studies can further assess the efficacy of RT relative to other interventions and test whether it can reduce the likelihood of developing a serious mental illness.
To investigate a cluster of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections in employees working on 1 floor of a hospital administration building.
Methods:
Contact tracing was performed to identify potential exposures and all employees were tested for SARS-CoV-2. Whole-genome sequencing was performed to determine the relatedness of SARS-CoV-2 samples from infected personnel and from control cases in the healthcare system with coronavirus disease 2019 (COVID-19) during the same period. Carbon dioxide levels were measured during a workday to assess adequacy of ventilation; readings >800 parts per million (ppm) were considered an indication of suboptimal ventilation. To assess the potential for airborne transmission, DNA-barcoded aerosols were released, and real-time polymerase chain reaction was used to quantify particles recovered from air samples in multiple locations.
Results:
Between December 22, 2020, and January 8, 2021, 17 coworkers tested positive for SARS-CoV-2, including 13 symptomatic and 4 asymptomatic individuals. Of the 5 cluster SARS-CoV-2 samples sequenced, 3 were genetically related, but these employees denied higher-risk contacts with one another. None of the sequences from the cluster were genetically related to the 17 control sequences of SARS-CoV-2. Carbon dioxide levels increased during a workday but never exceeded 800 ppm. DNA-barcoded aerosol particles were dispersed from the sites of release to locations throughout the floor; 20% of air samples had >1 log10 particles.
Conclusions:
In a hospital administration building outbreak, sequencing of SARS-CoV-2 confirmed transmission among coworkers. Transmission occurred despite the absence of higher-risk exposures and in a setting with adequate ventilation based on monitoring of carbon dioxide levels.
Hypoplastic left heart syndrome and single ventricle variants with aortic hypoplasia are commonly classified as severe forms of CHD. We hypothesised that patients with these severe defects and reported genetic abnormalities have increased morbidity and mortality during the interstage period.
Methods and Results:
This was a retrospective review of the National Pediatric Cardiology Quality Improvement Collaborative Phase I registry. Three patient groups were identified: major syndromes, other genetic abnormalities, and no reported genetic abnormality. The Tukey post hoc test was applied for pairwise group comparisons of length of stay, death, and the combined outcome of death, not a candidate for stage 2 palliation, and heart transplant. Participating centres received a survey to establish genetic testing and reporting practices. Of the 2182 patients, 110 (5%) had major genetic syndromes, 126 (6%) had other genetic abnormalities, and 1946 (89%) had no reported genetic abnormality. Those with major genetic syndromes weighed less at birth and at stage 1 palliation. Patients with no reported genetic abnormalities reached full oral feeds sooner and were discharged earlier. The combined outcome of death, not a candidate for stage 2 palliation, and heart transplant was more common in those with major syndromes. Survey response was low (n = 23, 38%), with only 14 (61%) routinely performing and reporting genetic testing.
Conclusions:
Patients with genetic abnormalities experienced greater morbidity and mortality during the interstage period than those with no reported genetic abnormalities. Genetic testing and reporting practices vary significantly between participating centres.
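As an illustrative aside to the abstract above: it reports Tukey post hoc testing for pairwise comparisons across the three genetic groups. A minimal Python sketch of that kind of comparison, applied here to interstage length of stay, is shown below; the registry extract, file name, and column names are hypothetical.

```python
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical registry extract: one row per patient, with interstage length of
# stay (days) and a genetic-group label ('major syndrome', 'other', 'none').
df = pd.read_csv("npcqic_phase1_extract.csv")

# Tukey HSD for all pairwise comparisons of length of stay across the groups.
tukey = pairwise_tukeyhsd(
    endog=df["length_of_stay_days"],
    groups=df["genetic_group"],
    alpha=0.05,
)
print(tukey.summary())
```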
Numerous theories posit different core features of borderline personality disorder (BPD). Recent advances in network analysis provide a method of examining the relative centrality of BPD symptoms, as well as the replicability of findings across samples. Additionally, despite the increase in research supporting the validity of BPD in adolescents, clinicians remain reluctant to diagnose BPD in this age group. Establishing the replicability of the syndrome across adolescents and adults informs clinical practice and research. This study examined the stability of BPD symptom networks and the centrality of symptoms across samples varying in age and clinical characteristics.
Methods
Cross-sectional analyses were conducted on BPD symptoms assessed with semi-structured diagnostic interviews in the Collaborative Longitudinal Study of Personality Disorders (CLPS), the Methods to Improve Diagnostic Assessment and Services (MIDAS) study, and an adolescent clinical sample. Network attributes, including edge (partial association) strength and node (symptom) expected influence, were compared.
Results
The three networks were largely similar and strongly correlated. Affective instability and identity disturbance emerged as relatively central symptoms across the three samples, and relationship difficulties were relatively central across the adult networks. Differences in network attributes were more evident between networks varying both in age and in BPD symptom severity level.
Conclusions
Findings highlight the relative importance of affective, identity, and relationship symptoms, consistent with several leading theories of BPD. The network structure of BPD symptoms appears generally replicable across multiple large samples including adolescents and adults, providing further support for the validity of the diagnosis across these developmental phases.
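As an illustrative aside to the abstract above: the network attributes it compares are edge weights (partial associations) and node expected influence. The sketch below computes both quantities from an unregularised precision matrix for a hypothetical symptom dataset; published BPD network studies typically use regularised estimators (e.g., a graphical lasso), which are not reproduced here.

```python
import numpy as np
import pandas as pd

# Hypothetical dataset: one row per patient, one column per BPD criterion score.
df = pd.read_csv("bpd_symptoms.csv")
X = (df - df.mean()) / df.std(ddof=0)          # standardise the items

# Edge weights as partial correlations, obtained from the precision matrix
# (unregularised; this only illustrates the quantities being compared).
prec = np.linalg.inv(np.cov(X.T))
d = np.sqrt(np.diag(prec))
pcor = -prec / np.outer(d, d)
np.fill_diagonal(pcor, 0.0)

# One-step expected influence of each node: sum of its signed edge weights.
expected_influence = pd.Series(pcor.sum(axis=1), index=df.columns)
print(expected_influence.sort_values(ascending=False))
```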
This research is motivated by the desire to control the solids distribution during the drying of a film containing particles of two different sizes. A variety of particle arrangements in dried films has been seen experimentally, including a thin layer of small particles at the top surface. However, it is not fully understood why this would occur. This work formulates and solves a colloidal hydrodynamics model for (i) diffusion alone and (ii) diffusion plus excluded volume diffusiophoresis, to determine their relative importance in affecting the particle arrangement. The methodology followed is to derive partial differential equations (PDEs) describing the motion of the two components in a drying film. The diffusive fluxes are predicted by generalising the Stokes–Einstein diffusion coefficient, with the dispersion compressibility used to produce equations valid up to close packing. A further set of novel equations incorporating diffusiophoresis is derived. The diffusiophoretic mechanism investigated in this work is the exclusion of the small particles from a volume around the large particles. The resulting PDEs are scaled and solved numerically using a finite volume method. The model includes the chemical potentials of the particles, allowing for the incorporation of any interaction term. The relative magnitudes of the fluxes of the differently sized particles are compared using scaling arguments and via numerical results. The diffusion results, without any inter-particle interactions, predict stratification of large particles to the top surface. The addition of excluded volume diffusiophoresis introduces a downward flux on the large particles that can result in small-on-top stratification, thus providing a potential explanation of the experimental observations.
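As an illustrative aside to the abstract above: the model is a set of PDEs for particle volume fractions solved with a finite volume method. The sketch below shows the finite-volume discretisation style for a single nonlinear diffusion equation only; the paper's two-component equations, compressibility-based diffusion coefficients, excluded-volume diffusiophoretic flux, and receding free surface are not reproduced, and the diffusion coefficient used here is an arbitrary placeholder.

```python
import numpy as np

# Minimal explicit finite-volume sketch for
#     d(phi)/dt = d/dz [ D(phi) d(phi)/dz ],
# with no-flux boundaries on a fixed domain.
n, L, dt, steps = 100, 1.0, 1.0e-5, 20_000
dz = L / n
z = (np.arange(n) + 0.5) * dz
phi = 0.2 + 0.1 * np.exp(-((z - 0.5) / 0.1) ** 2)   # initial volume fraction
mass0 = phi.sum() * dz

def D(phi):                                          # illustrative D(phi), not the paper's
    return 1.0 + 2.0 * phi

for _ in range(steps):
    D_face = 0.5 * (D(phi[:-1]) + D(phi[1:]))        # D at interior cell faces
    flux = -D_face * np.diff(phi) / dz               # Fickian flux at interior faces
    flux = np.concatenate(([0.0], flux, [0.0]))      # no-flux boundary faces
    phi -= dt * np.diff(flux) / dz                   # conservative update

print("mass conserved:", np.isclose(phi.sum() * dz, mass0))
```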
Hypotension is an adverse event that may be related to systemic exposure to milrinone; however, the true exposure–safety relationship is unknown.
Methods:
Using the Pediatric Trials Network multicentre repository, we identified children ≤17 years treated with milrinone. Hypotension was defined according to age, using the Pediatric Advanced Life Support guidelines. Clinically significant hypotension was defined as hypotension with concomitant lactate >3 mg/dl. A prior population pharmacokinetic model was used to simulate milrinone exposures to evaluate exposure–safety relationships.
Results:
We included 399 children with a median (quartile 1, quartile 3) age of 1 year (0, 5) who received 428 intravenous doses of milrinone (median infusion rate 0.31 mcg/kg/min [0.29, 0.5]). The median maximum plasma milrinone concentration was 110.7 ng/ml (48.4, 206.2). Median lowest systolic and diastolic blood pressures were 74 mmHg (60, 85) and 35 mmHg (25, 42), respectively. At least 1 episode of hypotension occurred in 178 (45%) subjects; clinically significant hypotension occurred in 10 (2%). Maximum simulated milrinone plasma concentrations were higher in subjects with clinically significant hypotension (251 ng/ml [129, 329]) than in those with hypotension alone (86 ng/ml [44, 173]) or without hypotension (122 ng/ml [57, 208], p = 0.002); however, this relationship was not retained on multivariable analysis (odds ratio 1.01; 95% confidence interval 0.998, 1.01).
Conclusions:
We successfully leveraged a population pharmacokinetic model and electronic health record data to evaluate the relationship between simulated plasma concentrations of milrinone and the occurrence of systemic hypotension, supporting the broader applicability of our novel, efficient, and cost-effective study design for examining drug exposure–response and exposure–safety relationships.
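As an illustrative aside to the abstract above: exposures were simulated from a previously published population pharmacokinetic model. The sketch below shows the general idea with a simple one-compartment, constant-rate-infusion model and illustrative parameter values; it is not the study's model, and the clearance and volume values are assumptions.

```python
import numpy as np

# One-compartment, constant-rate-infusion sketch (illustrative parameters only).
def conc_during_infusion(t_h, rate_mcg_kg_min, cl_l_h_kg=0.3, v_l_kg=0.9):
    """Plasma concentration (ng/mL) at time t_h hours into the infusion."""
    r0 = rate_mcg_kg_min * 60.0 * 1000.0       # mcg/kg/min -> ng/kg/h
    ke = cl_l_h_kg / v_l_kg                    # elimination rate constant, 1/h
    css = r0 / (cl_l_h_kg * 1000.0)            # steady-state concentration, ng/mL
    return css * (1.0 - np.exp(-ke * t_h))

t = np.linspace(0.0, 24.0, 97)                 # first 24 h of infusion
c = conc_during_infusion(t, rate_mcg_kg_min=0.31)
print(f"simulated maximum concentration over 24 h: {c.max():.1f} ng/mL")
```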
Several recent reports have raised concern that infected coworkers may be an important source of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) acquisition by healthcare personnel. In a suspected outbreak among emergency department personnel, sequencing of SARS-CoV-2 confirmed transmission among coworkers. The suspected 6-person outbreak included 2 distinct transmission clusters and 1 unrelated infection.
The goal of this study was to assess the utility of participatory needs assessment processes for continuous improvement of developing clinical and translational research (CTR) networks. Our approach expanded on evaluation strategies for CTR networks, centers, and institutes, which often survey stakeholders to identify infrastructure or resource needs, using the case example of the Great Plains IDeA-CTR Network. Our 4-stage approach (i.e., pre-assessment, data collection, implementation of needs-assessment-derived actions, and monitoring of the action plan) included a member survey (n = 357) and five subsequent small group sessions (n = 75 participants) to better characterize needs identified in the survey and to provide actionable recommendations. This participatory, mixed-methods needs assessment and strategic action planning process yielded 11 interrelated recommendations. These recommendations were presented to the CTR steering committee as inputs for developing detailed, prioritized action plans. Preliminary evaluation shows progress towards improved program capacity and greater effectiveness of the network in responding to member needs. The participatory, mixed-methods needs assessment and strategic planning process allowed a wide range of stakeholders to contribute to the development of actionable recommendations for network improvement, in line with the principles of team science.
Healthcare employees were tested for antibodies against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Among 734 employees, the prevalence of SARS-CoV-2 antibodies was 1.6%. Employees with heavy coronavirus disease 2019 (COVID-19) exposure had a similar antibody prevalence to those with limited or no exposure. Guidelines for personal protective equipment (PPE) use appear effective for preventing COVID-19 infection in healthcare workers.
Preoperative mechanical ventilation is associated with morbidity and mortality following CHD surgery, but prior studies lack a comprehensive analysis of how preoperative respiratory support mode and timing affect outcomes.
Methods:
We retrospectively collected data on children <18 years of age undergoing cardiac surgery at an academic tertiary care medical centre. Using multivariable regression, we examined the association between modes of preoperative respiratory support (nasal cannula, high-flow nasal cannula/noninvasive ventilation, or invasive mechanical ventilation), escalation of preoperative respiratory support, and invasive mechanical ventilation on the day of surgery for three outcomes: operative mortality, postoperative length of stay, and postoperative complications. We repeated our analysis in a subcohort of neonates.
Results:
A total of 701 children underwent 800 surgical procedures, and 40% received preoperative respiratory support. Among neonates, 243 patients underwent 253 surgical procedures, and 79% received preoperative respiratory support. In multivariable analysis, all modes of preoperative respiratory support, escalation in preoperative respiratory support, and invasive mechanical ventilation on the day of surgery were associated with increased odds of prolonged length of stay in children and neonates. Children (odds ratio = 3.69, 95% CI 1.2–11.4) and neonates (odds ratio = 8.97, 95% CI 1.31–61.14) on high-flow nasal cannula/noninvasive ventilation had increased odds of operative mortality compared to those on room air.
Conclusion:
Preoperative respiratory support is associated with prolonged length of stay and mortality following CHD surgery. Knowing how preoperative respiratory support affects outcomes may help guide surgical timing, inform prognostic conversations, and improve risk stratification models.
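As an illustrative aside to the abstract above: the association between preoperative respiratory support mode and operative mortality was examined with multivariable regression. The sketch below shows a generic multivariable logistic regression of that form in Python; the data file, variable names, and covariates are hypothetical and do not reflect the study's actual model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort extract: one row per surgical procedure, with the
# preoperative respiratory support mode, a few covariates, and the outcome.
df = pd.read_csv("chd_surgery_cohort.csv")

# Multivariable logistic regression of operative mortality on respiratory
# support mode (room air as reference), with illustrative covariates.
model = smf.logit(
    "operative_mortality ~ C(resp_support, Treatment('room_air'))"
    " + age_days + C(stat_category)",
    data=df,
).fit()
print(np.exp(model.params).round(2))           # odds ratios
print(model.summary())
```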
Streamwise velocity and wall-shear stress are acquired simultaneously with a hot-wire and an array of azimuthal/spanwise-spaced skin friction sensors in large-scale pipe and boundary layer flow facilities at high Reynolds numbers. These measurements allow for a correlation analysis on a per-scale basis between the velocity and reference skin friction signals to reveal which velocity-based turbulent motions are stochastically coherent with turbulent skin friction. In the logarithmic region, the wall-attached structures in both the pipe and boundary layers show evidence of self-similarity, and the range of scales over which the self-similarity is observed decreases with an increasing azimuthal/spanwise offset between the velocity and the reference skin friction signals. The present empirical observations support the existence of a self-similar range of wall-attached turbulence and are used to extend the model of Baars et al. (J. Fluid Mech., vol. 823, 2017, p. R2) to include the azimuthal/spanwise trends. Furthermore, the region where the self-similarity is observed corresponds with the wall height where the mean momentum equation formally admits a self-similar invariant form, and simultaneously where the mean and variance profiles of the streamwise velocity exhibit logarithmic dependence. The experimental observations suggest that the self-similar wall-attached structures follow an aspect ratio of $7:1:1$ in the streamwise, spanwise and wall-normal directions, respectively.
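As an illustrative aside to the abstract above: the per-scale correlation between velocity and reference skin friction signals is naturally expressed as a spectral coherence. The sketch below computes such a coherence spectrum for synthetic signals; the sampling frequency, record length, and signal construction are placeholders rather than experimental values.

```python
import numpy as np
from scipy.signal import coherence

# Synthetic stand-ins for the simultaneously sampled signals: u(t) from a
# hot-wire at some wall height and tau_w(t) from a reference skin friction
# sensor.  Sampling rate, record length and signal construction are assumed.
fs = 20_000                                    # Hz (assumed)
rng = np.random.default_rng(1)
n = 10 * fs                                    # 10 s of data
common = rng.standard_normal(n)                # shared large-scale content
u = common + 0.5 * rng.standard_normal(n)
tau_w = np.convolve(common, np.ones(50) / 50, mode="same") + rng.standard_normal(n)

# Magnitude-squared coherence: a per-scale (per-frequency) measure of how much
# of the velocity signal is stochastically coherent with wall shear stress.
f, gamma2 = coherence(u, tau_w, fs=fs, nperseg=4096)
low = (f > 0) & (f < 100.0)                    # an illustrative 'large-scale' band
print("mean coherence below 100 Hz:", round(float(gamma2[low].mean()), 3))
```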
This study presents findings from a first-of-its-kind measurement campaign that includes simultaneous measurements of the full velocity and vorticity vectors in both pipe and boundary layer flows under matched spatial resolution and Reynolds number conditions. Comparison of canonical turbulent flows offers insight into the role(s) played by features that are unique to one or the other. Pipe and zero pressure gradient boundary layer flows are often compared with the goal of elucidating the roles of geometry and a free boundary condition on turbulent wall flows. Prior experimental efforts towards this end have focused primarily on the streamwise component of velocity, while direct numerical simulations are at relatively low Reynolds numbers. In contrast, this study presents experimental measurements of all three components of both velocity and vorticity for friction Reynolds numbers $Re_\tau$ ranging from 5000 to 10 000. Differences in the two transverse Reynolds normal stresses are shown to exist throughout the log layer and wake layer at Reynolds numbers that exceed those of existing numerical data sets. The turbulence enstrophy profiles are also shown to exhibit differences spanning from the outer edge of the log layer to the outer flow boundary. Skewness and kurtosis profiles of the velocity and vorticity components imply the existence of a ‘quiescent core’ in pipe flow, as described by Kwon et al. (J. Fluid Mech., vol. 751, 2014, pp. 228–254) for channel flow at lower $Re_\tau$, and characterize the extent of its influence in the pipe. Observed differences between statistical profiles of velocity and vorticity are then discussed in the context of a structural difference between free-stream intermittency in the boundary layer and ‘quiescent core’ intermittency in the pipe that is detectable to wall distances as small as 5 % of the layer thickness.
Projects that aim to control invasive species often assume that a reduction of the target species will increase native species abundance. However, reports of the responses of native species following exotic species control are relatively rare. We assessed the recovery of the native community in five tidal wetland locations in which we attempted to eradicate the invasive common reed [Phragmites australis (Cav.) Trin. ex Steud.]. We tested whether 3 yr of treatment were able to eradicate Phragmites and promote recovery of the native plant community. After 3 yr of treatment, Phragmites density declined sharply in all treated stands, though it was not eradicated in any of them. Native plant cover increased significantly in treated areas, and community composition, particularly in smaller stands, converged toward that of uninvaded habitat. Thus, even within the relatively short timescale of the treatments and monitoring, significant progress was made toward achieving the goals of controlling Phragmites infestations and promoting native biodiversity. There was a trend toward greater promise for success in smaller stands than larger stands, as has been observed in other studies. A greater emphasis on monitoring whole-community responses to exotic plant control, across a range of conditions, would enhance our ability to plan and design successful management strategies.
Documenting leads and lags in terrestrial records of past climate change is critical to understanding the behavior of Earth’s natural climate system and making reliable predictions of future climate conditions. However, uncertainties of several hundred years in age models make it difficult to distinguish synchronicity and feedbacks in paleo archives. In lakes this is often due to the lack of terrestrial macrofossils in climate-sensitive locations, such as high alpine or dryland settings. The potential of radiocarbon (14C) dating of pollen has long been recognized, but the difficulty of cleanly separating pollen from other kinds of organic carbon has limited its usefulness. Here we report 14C ages on pollen separated by flow cytometry, from a set of closely spaced samples from Mono Lake, California. The accuracy of the pollen ages is tested using well-dated bracketing tephras, the South Mono and North Mono-Inyo tephras. In spite of the purity of the sorted samples, the pollen dates are older than the bounding tephras by ~400 yr, similar to some other pollen-dating studies. While improvements in sample preparation protocols are planned, understanding the geological processes involved in the production, preservation, and deposition of pollen at each site will be critical to developing robust high-resolution age models.
Fluid flow through a two-dimensional fracture network has been simulated using a discrete fracture model. The computed field-scale permeabilities were then compared to those obtained using an equivalent continuum approach in which the permeability of each grid block is first obtained by performing fine-scale simulations of flow through the fracture network within that region. In the equivalent continuum simulations, different grid-sizes were used, corresponding to N by N grids with N = 10, 40, 100 and 400. The field-scale permeabilities found from the equivalent continuum simulations were generally within 10% of the values found from the discrete fracture simulations. The discrepancies between the two approaches seemed to be randomly related to the grid size, as no convergence was observed as N increased. An interesting finding was that the equivalent continuum approach gave accurate results in cases where the grid block size was clearly smaller than the 'representative elementary volume'.
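As an illustrative aside to the abstract above: in the equivalent continuum approach, each grid block carries a permeability (there upscaled from fine-scale fracture-network simulations) and the field-scale permeability follows from a coarse Darcy flow solve. The sketch below assembles and solves such a block-permeability flow problem under a unit pressure drop; the lognormal block permeabilities are random placeholders, not values from the study.

```python
import numpy as np
from scipy.sparse import lil_matrix, csr_matrix
from scipy.sparse.linalg import spsolve

# N x N grid of block permeabilities (lognormal placeholders standing in for
# values upscaled from fine-scale fracture-network simulations).
N = 40
rng = np.random.default_rng(0)
k = np.exp(rng.normal(size=(N, N)))

def harm(a, b):                                # face transmissibility for unit cells
    return 2.0 * a * b / (a + b)

idx = lambda i, j: i * N + j
A = lil_matrix((N * N, N * N))
b = np.zeros(N * N)
for i in range(N):                             # i: row, j: column (flow along j)
    for j in range(N):
        diag = 0.0
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ii, jj = i + di, j + dj
            if 0 <= ii < N and 0 <= jj < N:
                T = harm(k[i, j], k[ii, jj])
                A[idx(i, j), idx(ii, jj)] = -T
                diag += T
        if j == 0:                             # Dirichlet p = 1 on the left edge
            diag += 2.0 * k[i, j]              # half-cell transmissibility
            b[idx(i, j)] += 2.0 * k[i, j]
        if j == N - 1:                         # Dirichlet p = 0 on the right edge
            diag += 2.0 * k[i, j]
        A[idx(i, j), idx(i, j)] = diag
p = spsolve(csr_matrix(A), b).reshape(N, N)

# Total inflow across the left boundary; with unit cells, a unit pressure drop,
# domain length N and cross-section N, the effective permeability equals Q.
Q = float(np.sum(2.0 * k[:, 0] * (1.0 - p[:, 0])))
print(f"field-scale effective permeability: {Q:.3f}")
```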
Advective–diffusive transport of passive or reactive scalars in confined environments (e.g. tubes and channels) is often accompanied by diffusive losses/gains through the confining walls. We present analytical solutions for transport of a reactive solute in a tube, whose walls are impermeable to flow but allow for solute diffusion into the surrounding medium. The solute undergoes advection, diffusion and first-order chemical reaction inside the tube, while diffusing and being consumed in the surrounding medium. These solutions represent a leading-order (in the radius-to-length ratio) approximation, which neglects the longitudinal variability of solute concentration in the surrounding medium. A numerical solution of the full problem is used to demonstrate the accuracy of this approximation for a physically relevant range of model parameters. Our analysis indicates that the solute delivery rate can be quantified by a dimensionless parameter, the ratio of a solute’s residence time in a tube to the rate of diffusive losses through the tube’s wall.
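As an illustrative aside to the abstract above: the in-tube processes it lists (advection, diffusion, first-order reaction, and diffusive loss through the wall) can be written in the following generic axisymmetric form; the paper's non-dimensionalisation, the coupling to the surrounding medium, and the leading-order averaged equations are not reproduced here, and the symbols below are generic placeholders.

```latex
% Generic transport of a reactive solute c(x, r, t) in a tube of radius R:
% advection by u(r), diffusion with coefficient D, first-order reaction rate k,
% and a diffusive flux through the wall into the surrounding medium.
\[
\frac{\partial c}{\partial t}
  + u(r)\,\frac{\partial c}{\partial x}
  = D\!\left(\frac{\partial^{2} c}{\partial x^{2}}
  + \frac{1}{r}\,\frac{\partial}{\partial r}\!\left(r\,\frac{\partial c}{\partial r}\right)\right)
  - k\,c,
\qquad
\left.-D\,\frac{\partial c}{\partial r}\right|_{r=R}
  = \text{diffusive loss into the surrounding medium}.
\]
```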