Herbaceous perennials must annually rebuild their aboveground photosynthetic architecture from carbohydrates stored in crowns, rhizomes, and roots. Knowledge of carbohydrate utilization and storage can inform management decisions and improve control outcomes for invasive perennials. We monitored the nonstructural carbohydrates in a population of the hybrid Bohemian knotweed [Polygonum ×bohemicum (J. Chrtek & Chrtková) Zika & Jacobson [cuspidatum × sachalinense]; syn.: Fallopia ×bohemica (Chrtek and Chrtková) J.P. Bailey] and in Japanese knotweed [Polygonum cuspidatum Siebold & Zucc.; syn.: Fallopia japonica (Houtt.) Ronse Decr.]. Carbohydrate storage in crowns followed seasonal patterns typical of perennial herbaceous dicots, corresponding to key phenological events. Starch was consistently the most abundant nonstructural carbohydrate. Sucrose levels did not show a consistent inverse relationship with starch levels. Starch and, more broadly, total nonstructural carbohydrates sampled before dormancy break were higher in rhizomes than in crowns. Total nonstructural carbohydrate levels in crowns reached a seasonal low at an estimated 22.6% of crown dry weight after 1,453.8 growing degree days (GDD) had accumulated by the end of June, mainly due to depletion of stored starch, which reached an estimated minimum of 12.3% by 1,220.3 GDD in mid-June. Depletion corresponded to rapid development of the vegetative canopy before the reproductive phase began in August. Maximum starch accumulation in crowns followed complete senescence of aboveground tissues by mid- to late October. Removal of aboveground shoot biomass in late June to early July, followed by removal of regrowth in early September before senescence, would optimize the use of time and labor to deplete carbohydrate reserves.
Additionally, foliar-applied systemic herbicide translocation to belowground tissue should be maximized with applications in late August through early fall to optimize downward translocation with assimilate movement to rebuild underground storage reserves. Fall applications should be made before loss of healthy leaf tissue, with the window for control typically ending by late September in Minnesota.
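The GDD accumulations reported above can be understood through the standard simple-averaging degree-day formula. The sketch below is illustrative only: the base temperature and the example readings are assumptions for demonstration, not values reported in the study.

```python
# Simple-average growing degree days (GDD): max(0, (Tmax + Tmin)/2 - Tbase),
# summed over days. Base temperature of 10 C is an illustrative assumption.

def daily_gdd(t_max, t_min, t_base=10.0):
    """GDD for one day under the simple-averaging method."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

def accumulate_gdd(daily_temps, t_base=10.0):
    """Sum daily GDD over a season given (Tmax, Tmin) pairs in Celsius."""
    return sum(daily_gdd(tmax, tmin, t_base) for tmax, tmin in daily_temps)

# Example: three hypothetical warm June days
season = [(28.0, 16.0), (30.0, 18.0), (25.0, 13.0)]
print(accumulate_gdd(season))  # 12 + 14 + 9 = 35.0
```

Seasonal totals such as the 1,453.8 GDD cited above would accumulate day by day in this fashion from local weather-station records.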
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center's design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Health numeracy is the understanding and application of information conveyed with numbers, tables and graphs, and probabilities in order to effectively manage one's own healthcare. Health numeracy is a vital aspect of communicating with healthcare providers and participating in one's own medical decision making, which is especially important in aging populations. Current literature indicates that assessing and establishing one's health numeracy abilities is among the first steps in providing necessary resources and accommodating patients' individual needs. Additionally, older adults with diffuse cognitive impairment often have issues with facets of executive functioning; however, the extant literature does not discuss the role of executive functioning in relation to health numeracy in this population. The purpose of this study was to explore the relationship between performance on tasks of executive functioning and objectively measured health numeracy abilities in older adult patients.
Participants and Methods:
This study included a sample of 42 older adult patients referred for neuropsychological evaluation for memory complaints who were administered the Test of Premorbid Functioning (TOPF), Trail Making Test - Part B (TMT-B), and Stroop Color and Word Test (SCWT Color Word Interference [CWI]) as part of a larger standardized battery. Patients were also administered the Numeracy Understanding in Medicine Instrument - Short Form (NUMI-SF). All included patients had <2 performance validity test failures. The sample was racially diverse (47.6% Black, 35.7% White, 14.3% Hispanic, 2.4% Asian) and 54.8% female. Average age was 62.95 years (SD = 8.6) and average education was 14.1 years (SD = 2.7). Diagnostically, 47.6% of the sample were cognitively normal, 33.3% had mild cognitive impairment, and 19.0% had dementia. Average NUMI-SF score was 4.79 (SD = 1.7). Two multiple regressions were conducted to evaluate the extent to which executive functioning, as measured by the TMT-B and SCWT CWI, predicted NUMI-SF scores, and the additive predictive power of premorbid IQ and demographics, via the TOPF, on the relationship between executive functioning and the NUMI-SF.
Results:
The first regression, which evaluated the relationship of the TMT-B and SCWT CWI to NUMI-SF scores, was not significant (p = .616). The model became significant with the addition of the TOPF (β = .595, p < .001), and the TOPF alone predicted ∼60% of the variance in NUMI-SF scores, while the TMT-B and SCWT CWI remained non-significant.
Conclusions:
These results indicate that common measures of executive functioning are not reliable predictors of health numeracy, with or without premorbid intellectual functioning taken into consideration. This suggests that health numeracy is likely minimally affected by deficits in executive functioning and may instead be better accounted for by premorbid intellectual functioning and/or other sociodemographic factors (e.g., socioeconomic status, education quality, occupation). Future studies would benefit from elucidating the contributions of other social determinants to health numeracy.
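The two-step (hierarchical) regression logic described in the Methods, entering executive functioning first and then adding premorbid IQ and comparing explained variance, can be sketched numerically. Everything below is synthetic and illustrative: the data are randomly generated, and the variable names merely mirror the measures named above (TMT-B, SCWT CWI, TOPF, NUMI-SF); no study data are reproduced.

```python
import numpy as np

# Synthetic data: the outcome is constructed to depend on premorbid IQ,
# not on the executive-functioning scores, mimicking the reported pattern.
rng = np.random.default_rng(0)
n = 42
topf = rng.normal(100, 10, n)              # premorbid IQ estimate
tmt_b = rng.normal(75, 20, n)              # executive functioning (set shifting)
cwi = rng.normal(45, 10, n)                # executive functioning (inhibition)
numi = 0.15 * topf + rng.normal(0, 1, n)   # outcome driven by IQ in this sketch

def r_squared(X, y):
    """R^2 from ordinary least squares with an intercept column."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

# Step 1: executive functioning only; Step 2: add premorbid IQ.
r2_step1 = r_squared(np.column_stack([tmt_b, cwi]), numi)
r2_step2 = r_squared(np.column_stack([tmt_b, cwi, topf]), numi)
print(f"EF only R^2 = {r2_step1:.3f}; EF + TOPF R^2 = {r2_step2:.3f}")
```

The change in R-squared between the nested models is the quantity of interest; a large jump when the TOPF is added, with the executive measures contributing little, is the pattern the abstract reports.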
Awareness of risk factors associated with any form of impairment is critical for formulating optimal prevention and treatment planning. Millions worldwide suffer from some form of cognitive impairment, with the highest rates among Black and Hispanic populations. These populations have also been found to score lower on standardized neurocognitive testing than other racial/ethnic groups. Understanding the sociodemographic risk factors underlying this discrepancy in neurocognitive functioning across racial groups is crucial. Adverse childhood experiences (ACEs) are one aspect of social determinants of health. ACEs have been linked to a greater risk of later memory impairment, including dementia. Moreover, higher rates of ACEs have been found among racial minorities. Considering the current literature, the purpose of this exploratory research is to better understand how social determinants, and ACEs specifically, may play a role in the development of cognitive impairment.
Participants and Methods:
This cross-sectional study included data from an urban, public Midwestern academic medical center. A total of 64 adult clinical patients were referred for a neuropsychological evaluation. All patients were administered a standardized neurocognitive battery that included the Montreal Cognitive Assessment (MoCA) as well as a 10-item ACE questionnaire, which measures levels of adverse childhood experiences. The sample was 73% Black and 27% White. The average age was 66 years (SD = 8.6) and average education was 12.6 years (SD = 3.4). A two-way ANOVA was conducted to evaluate the interaction of racial identity (White; Black) and ACE score on MoCA total score. An ACE score >4 was categorized as “high”; an ACE score <4 was categorized as “low.”
Results:
There was no significant interaction of race and ACE group on MoCA score (p = .929), nor a significant main effect of ACE score (p = .541). There was, however, a significant main effect of race on MoCA score (p = .029): White patients had an average MoCA score of 21.82 (SD = 4.77), while Black patients averaged 17.54 (SD = 5.91).
Conclusions:
Overall, Black patients scored significantly lower on the MoCA than White patients, and this difference did not vary by ACE level (the race × ACE interaction was not significant). Given this study’s findings, one’s level of adverse childhood experiences does not appear to affect cognitive ability later in life. The significant difference in cognitive performance between Black and White patients suggests there may be social determinants other than childhood experiences, yet to be explored, that influence cognitive impairment.
Understanding healthcare information is an important aspect in managing one’s own needs and navigating a complex healthcare system. Health numeracy and literacy reflect the ability to understand and apply information conveyed numerically (i.e., graphs, statistics, proportions, etc.) and written/verbally (i.e., treatment instructions, appointments, diagnostic results) to communicate with healthcare providers, understand one’s medical condition(s) and treatment plan, and participate in informed medical decision-making. Cognitive impairment has been shown to impact one’s ability to understand complex medical information. The purpose of this study is to explore the relationship between the degree of cognitive impairment and one’s ability to perform on measures of health numeracy and literacy.
Participants and Methods:
This cross-sectional study included data from 38 adult clinical patients referred for neuropsychological evaluation for primary memory complaints at an urban, public Midwestern academic medical center. All patients were administered a standardized neurocognitive battery that included the Montreal Cognitive Assessment (MoCA), as well as measures of both health numeracy (Numeracy Understanding in Medicine Instrument-Short Form [NUMI-SF]) and health literacy (Short Assessment of Health Literacy-English [SAHL-E]). The sample was 58% female and 60% Black/40% White. Mean age was 65 years (SD = 9.4) and mean education was 14.4 years (SD = 2.5). The sample was further split into three groups based on cognitive diagnosis determined by comprehensive neuropsychological assessment (i.e., No Diagnosis [34%]; Mild Cognitive Impairment [MCI; 29%]; Dementia [34%]). Groups were well matched and did not statistically differ in premorbid intellectual functioning (F = 1.96, p = .157; No Diagnosis, M = 100, SD = 7.92; MCI, M = 99, SD = 8.87; Dementia, M = 94, SD = 7.72). ANOVAs were conducted to evaluate differences between clinical groups on the MoCA, NUMI-SF, and SAHL-E. Multiple regressions were then conducted to determine the association of MoCA scores with NUMI-SF and SAHL-E performance.
Results:
As expected, the Dementia group performed significantly below both the No Diagnosis and MCI groups on the MoCA (F = 19.92, p < .001) with a large effect (ηp2 = .540). Significant differences were also found on the NUMI-SF (F = 5.90, p < .05) and the SAHL-E (F = 6.20, p < .05), with large effects (ηp2 = .258 and .267, respectively). Regression analyses found that MoCA performance did not predict performance on the NUMI-SF and SAHL-E in the No Diagnosis group (F = 2.30, p = .809) or the MCI group (F = 1.31, p = .321). Conversely, the MoCA significantly predicted performance on the NUMI-SF and SAHL-E in the Dementia group (F = 15.59, p = .001).
Conclusions:
Degree of cognitive impairment is associated with understanding of health numeracy and literacy information, with patients diagnosed with dementia performing most poorly on these measures. Patients with normal cognitive functioning demonstrated a significantly better understanding of health numeracy and health literacy. This study supports the notion that as cognitive functioning diminishes, incremental support is necessary for patients to understand medical information pertaining to their continued care and medical decision-making, particularly as it relates to both numerical and written information.
Area-based conservation is a widely used approach for maintaining biodiversity, and there are ongoing discussions over what is an appropriate global conservation area coverage target. To inform such debates, it is necessary to know the extent and ecological representativeness of the current conservation area network, but this is hampered by gaps in existing global datasets. In particular, although data on privately and community-governed protected areas and other effective area-based conservation measures are often available at the national level, it can take many years to incorporate these into official datasets. This suggests a complementary approach is needed based on selecting a sample of countries and using their national-scale datasets to produce more accurate metrics. However, every country added to the sample increases the costs of data collection, collation and analysis. To address this, here we present a data collection framework underpinned by a spatial prioritization algorithm, which identifies a minimum set of countries that are also representative of 10 factors that influence conservation area establishment and biodiversity patterns. We then illustrate this approach by identifying a representative set of sampling units that cover 10% of the terrestrial realm, which included areas in only 25 countries. In contrast, selecting 10% of the terrestrial realm at random included areas across a mean of 162 countries. These sampling units could be the focus of future data collation on different types of conservation area. Analysing these data could produce more rapid and accurate estimates of global conservation area coverage and ecological representativeness, complementing existing international reporting systems.
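The minimum-set selection the framework describes is, at its core, a covering problem: choose as few sampling units as possible while representing every class of the factors of interest. The toy sketch below illustrates one common greedy heuristic for such problems; the unit names, factor classes, and the heuristic itself are illustrative assumptions, not the authors' actual spatial prioritization algorithm or their 10 factors.

```python
# Greedy minimum-set cover: repeatedly pick the sampling unit that represents
# the most still-unrepresented factor classes. Dedicated conservation planning
# tools solve richer, spatially explicit versions of this problem.

def greedy_representative_set(units):
    """units: dict mapping unit name -> set of factor classes it represents.
    Returns a small list of units that jointly cover all classes."""
    uncovered = set().union(*units.values())
    chosen = []
    while uncovered:
        # unit covering the most still-uncovered classes
        best = max(units, key=lambda u: len(units[u] & uncovered))
        if not units[best] & uncovered:
            break  # remaining classes cannot be covered by any unit
        chosen.append(best)
        uncovered -= units[best]
    return chosen

# Hypothetical units tagged with biome and governance-type classes
units = {
    "A": {"biome1", "gov_private"},
    "B": {"biome2"},
    "C": {"biome1", "biome2", "gov_state"},
    "D": {"gov_community"},
}
print(greedy_representative_set(units))
```

The contrast reported above (25 countries for a representative 10% sample versus a mean of 162 under random selection) reflects exactly this effect: optimizing for joint coverage concentrates the sample in far fewer administrative units than chance would.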
Until recently, the influence of basal liquid water on the evolution of buried glaciers in Mars' mid latitudes was assumed to be negligible, because the latter stages of Mars' Amazonian period (3 Ga to present) have long been thought to have been similarly cold and dry to the present day. Recent identifications of several landforms interpreted as eskers associated with these young (hundreds of millions of years old) glaciers call this assumption into question, as they indicate basal melting (at least locally and transiently) of their parent glaciers. Although rare, these landforms demonstrate a more complex mid-to-late Amazonian environment than was previously understood. Here, we discuss several open questions posed by the existence of glacier-linked eskers on Mars, including their global-scale abundance and distribution, the drivers and dynamics of melting and drainage, and the fate of meltwater upon reaching the ice margin. Such questions provide rich opportunities for collaboration between the Mars and Earth cryosphere research communities.
When participants in psychophysical experiments are asked to estimate or identify stimuli which differ on a single physical dimension, their judgments are influenced by the local experimental context — the item presented and judgment made on the previous trial. It has been suggested that similar sequential effects occur in more naturalistic, real-world judgments. In three experiments we asked participants to judge the prices of a sequence of items. In Experiment 1, judgments were biased towards the previous response (assimilation) but away from the true value of the previous item (contrast), a pattern which matches that found in psychophysical research. In Experiments 2A and 2B, we manipulated the provision of feedback and the expertise of the participants, and found that feedback reduced the effect of the previous judgment and shifted the effect of the previous item's true price from contrast to assimilation. Finally, in all three experiments we found that judgments were biased towards the centre of the range, a phenomenon known as the “regression effect” in psychophysics. These results suggest that the most recently-presented item is a point of reference for the current judgment. The findings inform our understanding of the judgment process, constrain the explanations for local context effects put forward by psychophysicists, and carry practical importance for real-world situations in which contextual bias may degrade the accuracy of judgments.
We examine temporal and spatial variation in morphology of the ammonoid cephalopod Discoscaphites iris using a large dataset from multiple localities in the Late Cretaceous (Maastrichtian) of the U.S. Gulf and Atlantic Coastal Plains, spanning a distance of 2000 km along the paleoshoreline. Our results suggest that the fossil record of D. iris is consistent with no net within-species accumulation of phyletic evolutionary change in morphological traits over the lifetime of this species. Correlations between some traits and paleoenvironmental conditions, as well as changes in the coefficient of variation, may support limited population-scale ecophenotypic plasticity; however, where stratigraphic data are available, no directional changes in morphology occur before the Cretaceous/Paleogene (K/Pg) boundary. This is consistent with models of “dynamic” evolutionary stasis. Combined with knowledge of life-history traits and paleoecology of scaphitid ammonoids, specifically a short planktonic phase after hatching followed by transition to a nektobenthic adult stage, these data suggest that scaphitids had significant potential for rapid morphological change in conjunction with limited dispersal capacity. It is therefore likely that evolutionary mode in the Scaphitidae (and potentially across the broader ammonoid clade) follows a model of cladogenesis wherein a dynamic morphological stasis is periodically interrupted by more substantial evolutionary change at speciation events. Finally, the lack of temporal changes in our data suggests that global environmental changes had a limited effect on the morphology of ammonoid faunas during the latest Cretaceous.
Knotweed (Fallopia spp.) is an herbaceous perennial from East Asia that was brought to Europe and North America and, despite control efforts, subsequently spread aggressively on both continents. Data are available on knotweed’s modes of sexual and asexual spread, historical spread, preferred habitat, and ploidy levels. Incomplete information is available on knotweed’s current global geographic distribution and genetic diversity. The chemical composition of knotweed leaves and rhizomes has been partially discovered as related to its ability to inhibit growth and germination of neighboring plant communities via phytochemicals. There is still critical information missing. There are currently no studies detailing knotweed male and female fertility. Specifically, information on pollen viability would be important for further understanding sexual reproduction as a vector of spread in knotweed. This information would help managers determine the potential magnitude of knotweed sexual reproduction and the continued spread of diverse hybrid swarms. The potential range of knotweed and its ability to spread into diverse habitats makes studies on knotweed seed and rhizome cold tolerance of utmost importance, yet to date no such studies have been conducted. There is also a lack of genetic information available on knotweed in the upper Midwest. Detailed genetic information, such as ploidy levels and levels of genetic diversity, would answer many questions about knotweed in Minnesota, including understanding its means of spread, what species are present in what densities, and current levels of hybridization. This literature review summarizes current literature on knotweed to better understand its invasiveness and to highlight necessary future research that would benefit and inform knotweed management in the upper Midwest.
In 2019, a 42-year-old African man who works as an Ebola virus disease (EVD) researcher traveled from the Democratic Republic of Congo (DRC), near an ongoing EVD epidemic, to Philadelphia and presented to the Hospital of the University of Pennsylvania Emergency Department with altered mental status, vomiting, diarrhea, and fever. He was classified as a “wet” person under investigation for EVD, and his arrival activated our hospital emergency management command center and bioresponse teams. He was found to be in septic shock with multisystem organ dysfunction, including circulatory dysfunction, encephalopathy, metabolic lactic acidosis, acute kidney injury, acute liver injury, and disseminated intravascular coagulation. Critical care was delivered within high-risk pathogen isolation in the ED and in our Special Treatment Unit until a diagnosis of severe cerebral malaria was confirmed and EVD was definitively excluded.
This report discusses our experience activating a longitudinal preparedness program designed for rare, resource-intensive events at hospitals physically remote from any active epidemic but serving a high-volume international air travel port-of-entry.
The solutions must be context-sensitive; for example, even within Sweden it was clear from this survey (Sang & Ode-Sang, 2015) that there was considerable variance in the financial resources, skills and requirements of different local authorities. This said, there are also certain issues in common which we believe will have some general relevance elsewhere, at least in so far as some hierarchy of government exists whereby local government has responsibility for granting or denying permission to specific developments of smaller scale and national authorities are involved in decisions over key infrastructure and strategic planning as well as regulatory and financial frameworks. Not all the suggestions will fit equally to all domains, such as land versus marine, but in bringing together the views of all authors here we hope to provide some general principles by which greater capacity for scenario modelling may be built. Of course, these represent the views of modellers and academics, albeit ones with considerable collective experience of working with practitioners, policy makers, citizens and other stakeholder groups. Some of these recommendations may be ‘easier said than done’, and few are exclusively the responsibility of any one party; efforts must be coordinated. We nonetheless believe it is of value to set out as plainly as possible the tasks at hand and where these might be most fruitfully begun.
The models discussed in other chapters in this book relate (generally) to some form of simulation or representation in a formal modelling language. The range of computational or technological complexity involved is variable, but in most cases a very high degree of domain knowledge is also required with respect to the system under investigation. This presupposes that such expertise is available, and indeed that it is sufficient to understand and represent a particular system. For large coupled systems with a wide range of socio-economic, ecological and biophysical systems interacting, this may not be feasible. As with nature itself, nature-based solutions (NBS) are often part of a complex web of interdependent systems, so this chapter explores data mining as a pragmatic alternative/complementary approach when systems are insufficiently well described by current theory or where domain expertise is in short supply. Examples are provided in Table 8.1.
Hospital evacuations of patients with special needs are extremely challenging, and it is difficult to train hospital workers for this rare event.
Hypothesis/Problem:
Researchers developed an in-situ simulation study investigating the effect of standardized checklists on the evacuation of a patient under general anesthesia from the operating room (OR) and hypothesized that checklists would improve the completion rate of critical actions and decrease evacuation time.
Methods:
A vertical evacuation of the high-fidelity manikin (SimMan3G; Laerdal Inc.; Norway) was performed and participants were asked to lead the team and evacuate the manikin to the ground floor after a mock fire alarm. Participants were randomized to two groups: one was given an evacuation checklist (checklist group [CG]) and the other was not (non-checklist group [NCG]). A total of 19 scenarios were run with 28 participants.
Results:
Mean scenario time, preparation phase of evacuation, and time to transport the manikin down the stairs did not differ significantly between groups (P = .369, .462, and .935, respectively). The CG group showed significantly better performance of critical actions, including securing the airway, taking additional drug supplies, and taking additional equipment supplies (P = .047, .001, and .001, respectively). In the post-evacuation surveys, 27 out of 28 participants agreed that checklists would improve the evacuation process in a real event.
Conclusion:
Standardized checklists increase the completion rate of pre-defined critical actions in evacuations out of the OR, which likely improves patient safety. Checklist use did not have a significant effect on total evacuation time.
This article argues that post-conflict consociational arrangements in ethnically divided societies incentivize moderation by political parties, but not policy differentiation outside the main conflict. This results in little policy-driven voting. Analysing party manifestos and voter survey data, we examine the evolution of party policy and cleavage voting under power-sharing in Northern Ireland 1998–2016. We find a reduction in ethno-national policy differences between parties and that ethno-nationalism has become less important in predicting vote choice for Protestants, but not Catholics. We also find little party differentiation in other policy areas and show that vote choices are largely independent of people's policy stances on economic or social issues. Our findings are thus largely consistent with a ‘top-down’ interpretation of political dynamics.
OBJECTIVES/SPECIFIC AIMS: Objective: to apply checkpoint inhibitors specific to the exhaustion markers expressed on tumor CD8+ T-cells ex vivo in order to improve cytokine release and cytotoxic function in comparison to two control groups: (1) T-cells that receive no antibodies; (2) T-cells that receive standard inhibition with PD-1 and CTLA-4 antibodies only. Long-term objective: to provide personalized medicine in the treatment of HCC by using checkpoint inhibitors specific to the receptors expressed by an individual tumor. METHODS/STUDY POPULATION: The study population includes patients undergoing liver transplantation or surgical resection for HCC. Two grams of tumor, two grams of healthy liver tissue at least one centimeter from the tumor margin, and 50 milliliters of blood will be obtained. Solid tissue will be mechanically and enzymatically disrupted, and CD8+ T-cells will be isolated from all sites. Using flow cytometry, the expression of surface receptors PD-1, CTLA-4, LAG-3, TIM-3, BTLA, CD244, and CD160 will be categorized in each tissue to identify which receptors are upregulated in the tumor microenvironment. Up to three antibodies specific to the upregulated receptor(s) on the tumor T-cells will be applied per specimen. The experimental arm will receive these antibodies and co-stimulation with CD3/CD28 and will be compared to two controls. One control will receive only CD3/CD28, and the other will receive CD3/CD28 in addition to the standard combination of PD-1 and CTLA-4 inhibitors. From each condition, flow cytometry will be used to assess the mean production of interleukin-2, tumor necrosis factor-α, interferon-γ, granzyme B, and perforin as an assessment of T-cell function. RESULTS/ANTICIPATED RESULTS: Preliminary data from the peripheral blood of healthy controls confirm that the developed flow cytometry panels effectively identify the surface receptors and cytokine production of CD8+ T-cells.
Two patients have successfully been enrolled in this study. It is predicted that T-cells extracted from the tumor will express more inhibitory receptors than those from normal liver or peripheral blood and will have increased function after they are targeted with checkpoint inhibitors specific to the inhibitory surface receptors they express. DISCUSSION/SIGNIFICANCE OF IMPACT: HCC is the second leading cause of cancer-related death worldwide, and therapeutic options are limited for patients who are not surgical candidates. T-cells are a critical component of the anti-tumor response to HCC. However, T-cells can develop an exhausted phenotype characterized by upregulated inhibitory receptors (PD-1, CTLA-4, LAG-3, TIM-3, CD244, CD160, BTLA) and decreased function, allowing for immune escape. Clinical trials using combined checkpoint inhibition with PD-L1 and CTLA-4 antibodies have been considered a breakthrough for patients with advanced HCC, as up to 25% show an objective tumor response. The explanation for the varied susceptibility to checkpoint inhibition remains unknown and is hypothesized to be secondary to inconsistencies in the expression of surface inhibitory receptors. Although inhibitory receptor expression has been shown to be upregulated under conditions of hepatitis and/or HCC, no single study has investigated the expression of all known inhibitory receptors in order to better explore the interplay between them. It will be of great academic interest and clinical purpose to evaluate individual receptor expression and engage the correlating antibodies, given the possibility of synergism between receptors and the need for a more profound anti-tumor T-cell response in HCC.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
We carry out a numerical study of the quantum walk search algorithm of Shenvi, Kempe, and Whaley (2003) and the factors that affect its efficiency in finding an individual state from an unsorted set. Previous work has focused purely on the effects of the dimensionality of the dataset to be searched. In the current paper we consider the effects of interpolating between dimensions, the connectivity of the dataset, and the possibility of disorder in the underlying substrate: all these factors affect the efficiency of the search algorithm. We show that in addition to the strong dependence on the spatial dimension of the structure to be searched, there are also secondary dependencies on the connectivity and symmetry of the lattice, with greater connectivity providing a more efficient algorithm. We also show that the algorithm can tolerate a non-trivial level of disorder in the underlying substrate.
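As a hedged illustration of the kind of numerical experiment described, the sketch below simulates a discrete-time coined quantum walk with a Grover coin and a marked vertex on a small (n = 3) hypercube, in the spirit of the Shenvi-Kempe-Whaley construction. The dimension, step count, and choice of marked vertex are arbitrary here, not parameters from the paper; the point is that probability at the marked vertex rises above the uniform value of 1/N as the walk proceeds.

```python
import numpy as np

n = 3                      # hypercube dimension (N = 2**n vertices, degree d = n)
N, d = 2**n, n
marked = 0                 # arbitrary choice of marked vertex

# Grover diffusion coin on the d coin directions; the marked vertex instead
# gets the "perturbed" coin -I, which singles it out for the walk.
G = 2.0 / d * np.ones((d, d)) - np.eye(d)

# Uniform superposition over all (vertex, direction) pairs
psi = np.full((N, d), 1.0 / np.sqrt(N * d))

def step(psi):
    """One walk step: coin flip at every vertex, then shift along edges."""
    out = psi @ G.T            # Grover coin at unmarked vertices
    out[marked] = -psi[marked]  # -I coin at the marked vertex
    shifted = np.empty_like(out)
    for c in range(d):
        for x in range(N):
            # moving along direction c flips bit c of the vertex label
            shifted[x ^ (1 << c), c] = out[x, c]
    return shifted

probs = []
for t in range(20):
    psi = step(psi)
    probs.append(np.sum(np.abs(psi[marked]) ** 2))
print(f"peak marked-vertex probability: {max(probs):.3f} (uniform = {1/N:.3f})")
```

Larger dimensions, altered connectivity, or disorder in the coin or shift operators, as studied in the paper, would be explored by varying this basic loop.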