
Can Research Contribute to Improve Educational Practice?

Published online by Cambridge University Press:  17 September 2020

Marta Ferrero*
Affiliation: Universidad Complutense (Spain)

*Correspondence concerning this article should be addressed to Marta Ferrero. Departamento de Investigación y Psicología en Educación de la Universidad Complutense. Rector Royo Villanova, s/n, Ciudad Universitaria. 28040 Madrid. E-mail: ferreromarta@gmail.com

Abstract

Teaching a diverse classroom is a challenging task. Educators face every day the difficult task of making many decisions about how to educate each of their students. To do this, they rely mainly on their own experience and that of their colleagues, their values, and their thoughts. Although these resources are inherent and important to the teaching profession, they may not always suffice to make the best decisions, particularly when teachers are continuously bombarded with numerous fads and poorly grounded ideas about education. In this context, research-informed practice emerges as a promising approach. It involves integrating the professional expertise of teachers with the best available research evidence to make better decisions and improve education. However, for this approach to be successfully implemented, the gap between researchers and practitioners must first be bridged. Possible solutions to this challenge involve acting in three contexts: research production, research communication, and research use. Specific measures in each of these contexts are described.

Type: Research Article
Copyright: © Universidad Complutense de Madrid and Colegio Oficial de Psicólogos de Madrid 2020


During the past decade, the commercial program Brain Gym® (Footnote 1), also known as “educational kinesiology”, grew in popularity in several countries around the world. After many years of investment of time and resources by hundreds of schools, the whistle was blown on the lack of a research-supported theoretical foundation for this program (Hyatt, 2007; Goldacre, 2006). Educators who adopted Brain Gym®, as well as those who still embrace the theory of learning styles or the theory of left- and right-brain learners, among others, most likely do so with the commendable aim of offering a better education to all their students. Although well intended, these decisions sometimes might do more harm than good (Chalmers, 2003).

Teachers around the world are continuously exposed to diverse claims by publishers and advocates of different approaches (Sharples, 2013). At the same time, they have to make countless, diverse, rapid-fire decisions every day in their classrooms (for instance, what font type to adopt to teach reading, or how to group students to optimize learning). To this end, they draw mainly on their own values, thoughts, and experience, and those of their colleagues and other professionals; but they rarely make use of scientific research (Nelson & Campbell, 2017). Why? Is it possible, or even desirable, for research to play a role in educational practice? If so, what should that contribution look like? What barriers are hindering progress in this matter? What can be done? In the following paragraphs I will try to answer these questions.

From Evidence-Based Education to Research-Informed Practice

The idea of improving connections between research and practice is not new (Levin, 2013). In fact, there have been diverse positions on the potential contribution that research might make to the daily work of educators. On the one hand, some experts have raised concerns about whether research will ever be in a position to inform teachers about how to improve education (Cain, 2015). On the other hand, other authors have stated that the alternative to evidence is just unfounded opinion (Coe, 1999). Around these conflicting positions, there have been some attempts to conceptualize the different roles research might play in education (Cain & Allan, 2017; Godfrey, 2017).

The top-down evidence-to-practice approach views research findings, especially those obtained from randomized controlled trials, as the only ones with sufficient quality to inform teachers about what works in the classroom (Hammersley, 2005). Under this approach, frequently linked to the term “evidence-based education”, research is aimed at producing prescriptions about what works (Biesta, 2007) and educators are essentially technicians who apply what research dictates. In the words of Godfrey (2017), under this approach “academics provide the evidence and practitioners work out how to implement or use it” (p. 437). Some experts have pointed out that this model is based on false premises because it assumes that research provides unmistakable links between causes and effects (Godfrey, 2017) or that scientific findings are infallible (Hammersley, 2001, 2005) and unalterable over time (Biesta, 2007, 2010). Moreover, it is argued that this approach might lead to undesirable consequences: The complexities of the education system may be oversimplified; policy makers may fund only “evidence-based” ideas; and researchers might be forced to focus on “what works” in the classroom, seriously narrowing the rationale of educational research and reducing the funds for studies that are not aimed at answering this question (Godfrey, 2017; Hammersley, 2005).

Faced with the “what works” model of evidence to practice, an increasing number of authors are advocating an alternative approach in which teachers have a more critical and active role during decision making and theories are not employed as prescriptions but as another source of information alongside the experiential knowledge and judgment of practitioners themselves (Godfrey, 2017; Hammersley, 2005). The term most frequently employed to refer to this approach is “research-informed practice”.

What Is or Should Be Research-Informed Practice?

Research-informed practice entails integrating the professional expertise of teachers with the best evidence provided by research to make more precise decisions and improve the quality of teaching (Sharples, 2013). It should not be about prescribing to teachers what to do through a set of recipes or tips (Goldacre, 2013; Wiliam et al., 2004). In fact, given the complexity of daily practice in the classroom, some authors consider this idea impossible to achieve (Wiliam et al., 2004). Instead, this approach lies in making informed judgments not only in the light of practitioners’ needs, resources, priorities, and preferences, but also on the basis of research evidence (Chalmers, 2005). Ideally, along with situated understanding and critical reflection, educational research should provide teachers with technical knowledge and new theoretical frameworks that can enhance reflection and help them discriminate between good sense and common sense (Winch et al., 2015).

Although there is some debate about what counts as evidence (National Research Council, 2002), practitioners and policy makers should ideally direct their efforts to be informed especially towards systematic reviews and meta-analyses of primary research, which emphasize the cumulative character of science and attempt to minimize the effects of bias and chance (Chalmers, 2003), not as an educational panacea but as a valuable instrument. As long as teachers are equipped with detailed information about which methods, resources, or programs are more effective under which circumstances, their professional skill and judgment will most likely improve (Levin, 2010), and so will their professional independence to make their own decisions in the face of external interference (Goldacre, 2013).

Efforts made over time to synthesize the main research findings in the field of education have allowed the identification of a significant number of effective practices. For reasons of space, I will focus on some of the practices that have returned better results across different samples and subjects. For instance, retrieval practice (or testing), which consists of challenging students to retrieve some piece of information from memory, plays a critical role in consolidating learning (Rohrer & Pashler, 2010; Weinstein et al., 2018). In addition, providing students with opportunities to independently practice what is learned in the classroom allows them to become fluent and automatic in that skill (Rosenshine, 2012). If this practice is spaced across time, with rest periods between learning sessions, both the acquisition and the retention of learning will be enhanced (Bjork, 1999; Hattie, 2009). Similarly, providing students with effective feedback on their performance can enhance the processing of the information to be learnt. In this vein, a correct form of giving feedback involves a twofold approach: Students have to receive clear information on correct responses, and this information has to be connected to their prior knowledge, among other things (EEF, 2018; Hattie, 2009; Marzano, 1998). Finally, another well-grounded practice is teaching learners meta-cognitive and self-regulation strategies, such as activating prior knowledge or self-evaluating progress and final performance. Such an active role for students over the cognitive processes involved in their own learning can significantly contribute to improving performance (EEF, 2018; Hattie, 2009).

Which Barriers Does Research-Informed Practice Face?

To advance towards research-informed practice, the cooperation of the actors involved seems clearly necessary. However, the gap between researchers and practitioners is well documented (Broekkamp & van Hout-Wolters, 2007). On the one hand, although practitioners are interested in the contribution research can make to inform their work (Cordingley, 2008; Penuel et al., 2016), they rarely consult the scientific literature (Cain, 2016; Williams & Coles, 2007). Why is this the case? It might be due to their lack of training and time to search for, access, read, and interpret original research (Hammersley-Fletcher et al., 2015; Levin, 2013; LaPointe-McEwan et al., 2017). At the same time, although scientific studies in education have grown over the years, there are still few concise and practical results from research that teachers can directly apply to enhance learning (Broekkamp & van Hout-Wolters, 2007; Cook & Cook, 2004). On the other hand, researchers may lack the skills, interest, or incentives from their workplaces to adapt their work to, or collaborate with, non-academics (Campbell & Levin, 2012). In addition, the different languages employed by researchers and practitioners can seriously compromise mutual understanding (Borg, 2010; Goswami, 2006; Procter, 2015). A combination of these elements might partially explain the negative attitudes towards research findings among some educators (Burkhardt & Schoenfeld, 2003; Gore & Gitlin, 2004), who consider other sources of information, such as the experience of colleagues, to be more trustworthy and practical than research results (Cook & Schirmer, 2003; Landrum et al., 2002).

Faced with these barriers, some authors have pointed to the need for appropriate knowledge mobilization (KMb) from researchers to practitioners and vice versa in order to strengthen the relationship between them (Levin, 2013; Nelson & O’Beirne, 2014; Sharples, 2013). Under this scenario, researchers would ideally be, to some extent, inspired and challenged by the daily concerns and questions of in-service teachers. And, at the same time, educators would be engaged with and informed by research. The question that remains is how this KMb must be carried out to be successful, that is, what the key ingredients are to create a fertile common ground for both researchers and practitioners.

Ways to Move towards Research-Informed Practice

For a long time, there existed the belief that the use of research was a unidirectional process in which researchers would accumulate knowledge and this would be easily adopted by policy makers and practitioners (Levin, 2013). However, we now know that research dissemination is not enough (Campbell et al., 2017; Coe et al., 2000; Levin, 2011). In fact, there is abundant evidence about the complexity involved in making regular use of research to improve education (Davies, 2004; Nelson et al., 2017; Taylor, 2013). In this context, KMb implies an interactive process of co-creating knowledge between researchers, decision makers, and teachers to improve the education system (Campbell et al., 2017; Cooper, 2014), which, in turn, requires social and behavioral change on all sides (Campbell & Levin, 2012; Nelson & O’Beirne, 2014). For this purpose, different authors have highlighted the need to invest effort in at least three contexts that interact with each other: Research production, research transformation and communication, and research use or implementation (Gough et al., 2011; Levin, 2004, 2013; Nelson & O’Beirne, 2014; Sharples, 2013), all shaped by the political and social context (Levin, 2011).

The production of research on educational interventions has increased over the years (Cook & Schirmer, 2003; Jones, 2009; Levin, 2011). Similarly, a growing number of organizations are promoting evidence syntheses to communicate these advances to education professionals (for instance, the What Works Clearinghouse in the US, the Education Endowment Foundation in the UK, or the Bofill Foundation in Spain). In spite of these steps, more high-quality studies are needed to offer robust evidence about effective interventions (Levin, 2013). In addition, there is a need to create an organization that centralizes and systematizes all the efforts to produce solid evidence to inform decisions in education (Nelson & O’Beirne, 2014).

Effective communication and implementation are just as important as knowledge production. In this sense, the elaboration of practical and accessible guidelines about how to implement evidence in schools is a promising way forward. These guidelines should include detailed information about several aspects, such as a detailed description of the context and the intervention, management considerations, costs, or training requirements (Cordingley, 2008; Nelson & O’Beirne, 2014). Closely linked, the promotion of intermediaries, or mediators, to bridge the gap between researchers and practitioners has also been frequently emphasized (Campbell & Levin, 2012; Cooper et al., 2009; Sin, 2008). This role has traditionally been performed by a variety of bodies, such as the media, charitable organizations, government agencies, research centers, professional organizations, or private companies (Sharples, 2013), and consists basically of interpreting, packaging, and distributing research evidence for policy makers and practitioners (Tseng et al., 2007). Despite their potential benefits, in the case of institutions fully or partially funded by the private sector, it would be advisable to pay attention to potential conflicts of interest that may compromise their role as mediators (Honig, 2004).

Although research on evidence implementation is still scarce (Nelson & O’Beirne, 2014; Nelson & Campbell, 2017), there are several promising proposals to foster the use of evidence-based practices in schools, such as: Research-engaged schools, which investigate key issues in education, use enquiry, promote learning communities, and turn data and experience into knowledge for decision making (Sharp et al., 2006); teaching schools, in which practitioners and researchers work together on the design of innovative education, professional development, and/or research (Broekkamp & van Hout-Wolters, 2007); teaching school alliances, aimed at developing capability and capacity in evidence-based teaching through different initiatives, such as research journal clubs, teach meets with higher education institutions, research networks, or dissemination events (Hammersley-Fletcher et al., 2015); or conferences and online conversations with academics to connect teachers to research, like the researchED movement (Footnote 2). Ultimately, the aim is to build a culture of research use among practitioners, so that evidence is fully embedded in decision making. Considering the current gap between researchers and practitioners (teachers and frontline professionals), it is not surprising that many of the initiatives listed involve the creation of meeting spaces where both groups can actively collaborate with each other. Alongside this, it would also be desirable to enrich both initial preparation in universities and continuing professional development with training in research-related skills (Cook & Cook, 2004).

Conclusion

Every day, hundreds of teachers face the challenging task of providing education to their hugely varied students. In this endeavor, tacit knowledge, reflection on their own practice, and values about education are not only inherent but also essential to their profession. However, these elements may not be enough to guarantee the use of effective instructional techniques for all students, from first to last (Chalmers, 2003, 2005). This is particularly relevant if we consider that schools are continuously bombarded with many fads with little or no supporting evidence, and with widespread myths that pave the way to the adoption of unfounded methods (Dekker et al., 2012; Ferrero et al., 2016), such as the above-mentioned Brain Gym®. One way to protect professionals from these threats while increasing their autonomy is to incorporate research into educational administration and school decision making.

The way towards research-informed practice is a daunting challenge for teachers and schools, for policy makers, and for researchers and universities. In the case of practitioners, it implies a change in professional culture, so that both teachers and policy makers also turn to scientific findings to underpin educational practice (Coe, 1999; Godfrey, 2017). In the case of academics, it involves approaching the real interests and needs of teachers (Cordingley, 2008). The promotion of different measures to encourage collaboration between teachers and researchers, such as the creation of common spaces to share and discuss research or the boosting of intermediaries, can play an important role to this end. In turn, the leadership of administrations, universities, and schools in valuing and supporting the use of research through the provision of incentives and resources might be essential (Campbell & Levin, 2012).

The factors that explain why a method or tool produces good learning outcomes are very diverse (for instance, the socio-economic status of the family, or the motivation level and prior knowledge of students), so that a small change in some of them might notably alter its effectiveness (Coe, 1999). However, this does not mean that the results accumulated so far by research are not in a position to inform the educational community on some important issues. The academic success of many individuals, especially students with learning disabilities, relies largely on the use of educational techniques that have been systematically proven to be effective (Cook & Cook, 2004). For this reason alone, because education is an undeniable right of every child, it is worthwhile for researchers and practitioners to do their best to bridge the gap that separates them and thus contribute to a better education for all.

Footnotes

Conflicts of Interest. None.

This research received no specific grant from any funding agency, commercial or not-for-profit sectors.

1 Official Brain Gym® Web Site (2018). Retrieved from https://breakthroughsinternational.org/programs/the-brain-gym-program/

References

Biesta, G. (2010). Why “what works” still won’t work: From evidence-based education to value-based education. Studies in Philosophy and Education, 29, 491–503. http://doi.org/10.1007/s11217-010-9191-x
Biesta, G. (2007). Why “what works” won’t work: Evidence-based practice and the democratic deficit in educational research. Educational Theory, 57, 1–22. https://doi.org/10.1111/j.1741-5446.2006.00241.x
Bjork, R. A. (1999). Assessing our own competence: Heuristics and illusions. In Gopher, D. & Koriat, A. (Eds.), Attention and performance XVII. Cognitive regulation of performance: Interaction of theory and application (pp. 435–459). Cambridge, MA: MIT Press.
Borg, S. (2010). Language teacher research engagement. Language Teaching, 43, 391–429. https://doi.org/10.1017/S0261444810000170
Broekkamp, H., & van Hout-Wolters, B. (2007). The gap between educational research and practice: A literature review, symposium, and questionnaire. Educational Research and Evaluation, 13, 203–220. https://doi.org/10.1080/13803610701626127
Burkhardt, H., & Schoenfeld, A. H. (2003). Improving educational research: Toward a more useful, more influential, and better-funded enterprise. Educational Researcher, 32, 3–14. https://doi.org/10.3102/0013189X032009003
Cain, T. (2016). Research utilization and the struggle for the teacher’s soul: A narrative review. European Journal of Teacher Education, 39, 616–629. https://doi.org/10.1080/02619768.2016.1252912
Cain, T. (2015). Teachers’ engagement with research texts: Beyond instrumental, conceptual or strategic use. Journal of Education for Teaching, 41, 478–492. https://doi.org/10.1080/02607476.2015.1105536
Cain, T., & Allan, D. (2017). The invisible impact of educational research. Oxford Review of Education, 43, 718–732. https://doi.org/10.1080/03054985.2017.1316252
Campbell, C., & Levin, B. (2012, November). Developing knowledge mobilizations to challenge educational disadvantage and inform effective practices in England [Paper presentation]. Evidence in Action seminars organized by the EEF. Toronto, Canada.
Campbell, C., Pollock, K., Briscoe, P., Car-Harris, S., & Tuters, S. (2017). Developing a knowledge network for applied education research to mobilize evidence in and for educational practice. Educational Research, 59, 209–227. http://doi.org/10.1080/00131881.2017.1310364
Chalmers, I. (2005). If evidence-informed policy works in practice, does it matter if it doesn’t work in theory? Evidence and Policy, 1, 227–242. https://doi.org/10.1332/1744264053730806
Chalmers, I. (2003). Trying to do more good than harm in policy and practice: The role of rigorous, transparent, up-to-date evaluations. The ANNALS of the American Academy, 589, 22–40. https://doi.org/10.1177/0002716203254762
Coe, R. (1999). A manifesto for evidence-based education. Centre for Evaluation & Monitoring. http://www.cem.org/attachments/ebe/manifesto-for-ebe.pdf
Coe, R., Fitz-Gibbon, C., & Tymms, P. (2000, September). Promoting evidence-based education: The role of practitioners [Paper presentation]. British Educational Research Association Annual Conference, Cardiff, Wales.
Cook, B. G., & Cook, L. (2004). Bringing science into the classroom by basing craft on research. Journal of Learning Disabilities, 37, 240–247. https://doi.org/10.1177/00222194040370030901
Cook, B. G., & Schirmer, B. R. (2003). What’s special about special education: Overview and analysis. Journal of Special Education, 37, 200–205. https://doi.org/10.1177/00224669030370031001
Cooper, A. (2014). Knowledge mobilization in education across Canada: A cross-case analysis of 44 research brokering organizations. Evidence & Policy, 10, 29–59. http://doi.org/10.1332/174426413X662806
Cooper, A., Levin, B., & Campbell, C. (2009). The growing (but still limited) importance of evidence in education policy and practice. Journal of Educational Change, 10, 159–171. https://doi.org/10.1007/s10833-009-9107-0
Cordingley, P. (2008). Research and evidence-informed practice: Focusing on practice and practitioners. Cambridge Journal of Education, 38, 37–52. https://doi.org/10.1080/03057640801889964
Davies, P. (2004, September). Is evidence-based government possible? [Paper presentation]. Jerry Lee Lecture, Fourth Annual International Campbell Collaboration Colloquium, Washington, DC. http://www.policyhub.gov.uk/downloads/JerryLeeLecture1202041.pdf
Dekker, S., Lee, N. C., Howard-Jones, P., & Jolles, J. (2012). Neuromyths in education: Prevalence and predictors of misconceptions among teachers. Frontiers in Psychology, 3, Article 429. https://doi.org/10.3389/fpsyg.2012.00429
Ferrero, M., Garaizar, P., & Vadillo, M. A. (2016). Neuromyths in education: Prevalence among Spanish teachers and an exploration of cross-cultural variation. Frontiers in Human Neuroscience, 10, Article 496. https://doi.org/10.3389/fnhum.2016.00496
Godfrey, D. (2017). What is the proposed role of research evidence in England’s “self-improving” school system? Oxford Review of Education, 43, 433–446. https://doi.org/10.1080/03054985.2017.1329718
Goldacre, B. (2006, March 18). Brain Gym - Name & shame. Bad Science. https://www.badscience.net/2006/03/the-brain-drain/
Goldacre, B. (2013). Building evidence into education. CORE. https://core.ac.uk/download/pdf/9983746.pdf
Gore, J. M., & Gitlin, A. D. (2004). [RE]Visioning the academic-teacher divide: Power and knowledge in the educational community. Teachers and Teaching: Theory and Practice, 10, 35–58. http://doi.org/10.1080/13540600320000170918
Goswami, U. (2006). Neuroscience and education: From research to practice? Nature Reviews Neuroscience, 7, 406–413. https://doi.org/10.1038/nrn1907
Gough, D., Tripney, J., Kenny, C., & Buk-Berge, E. (2011). Evidence informed policy in education in Europe: EIPEE final project report. EPPI-Centre, Social Science Research Unit, Institute of Education, University of London.
Hammersley, M. (2001, September). Some questions about evidence-based practice in education [Paper presentation]. Annual Conference of the British Educational Research Association, University of Leeds, England. http://www.leeds.ac.uk/educol/documents/00001819.htm
Hammersley, M. (2005). The myth of research-based practice: The critical case of educational inquiry. International Journal of Social Research Methodology, 8, 317–330. http://doi.org/10.1080/1364557042000232844
Hammersley-Fletcher, L., Lewin, C., with Davies, C., Duggan, J., Rowley, H., & Spink, E. (2015). Evidence-based teaching: Advancing capability and capacity for enquiry in schools: Interim report.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. Routledge.
Honig, M. I. (2004). The new middle management: Intermediary organizations in education policy implementation. Educational Evaluation and Policy Analysis, 26, 65–87. https://doi.org/10.3102/01623737026001065
Hyatt, K. J. (2007). Brain Gym: Building stronger brains or wishful thinking? Remedial and Special Education, 28, 117–124. https://doi.org/10.1177/07419325070280020201
Jones, M. L. (2009). A study of novice special educators’ views of evidence-based practices. Teacher Education and Special Education, 32, 101–120. https://doi.org/10.1177/0888406409333777
Landrum, T. J., Cook, B. G., Tankersley, M., & Fitzgerald, S. (2002). Teacher perceptions of the trustworthiness, usability, and accessibility of information from different sources. Remedial and Special Education, 23, 42–48. https://doi.org/10.1177/074193250202300106
LaPointe-McEwan, D., DeLuca, C., & Klinger, D. A. (2017). Supporting evidence use in networked professional learning: The role of the middle leader. Educational Research, 59, 136–153. http://doi.org/10.1080/00131881.2017.1304346
Levin, B. (2004). Making research matter more. Education Policy Analysis Archives, 12, Article 56. https://epaa.asu.edu/ojs/article/view/211
Levin, B. (2010). Leadership for evidence-informed education. School Leadership & Management, 30, 303–315. https://doi.org/10.1080/13632434.2010.497483
Levin, B. (2011). Mobilizing research knowledge in education. London Review of Education, 9, 15–26. https://doi.org/10.1080/14748460.2011.550431
Levin, B. (2013). To know is not enough: Research knowledge and its use. Review of Education, 1, 2–31. https://doi.org/10.1002/rev3.3001
Marzano, R. J. (1998). A theory-based meta-analysis of research on instruction. Mid-Continent Research for Education and Learning.
National Research Council (2002). Scientific research in education (Shavelson, R. J. & Towne, L., Eds.). The National Academies Press. https://doi.org/10.17226/10236
Nelson, J., & Campbell, C. (2017). Evidence-informed practice in education: Meanings and applications. Educational Research, 59, 127–135. https://doi.org/10.1080/00131881.2017.1314115
Nelson, J., Mehta, P., Sharples, J., & Davey, C. (2017). Measuring teachers’ research engagement: Findings from a pilot study. Education Endowment Foundation & NFER. https://educationendowmentfoundation.org.uk/public/files/Evaluation/Research_Use/NFER_Research_Use_pilot_report_-_March_2017_for_publication.pdf
Nelson, J., & O’Beirne, C. (2014). Using evidence in the classroom: What works and why? National Foundation for Educational Research.
Penuel, W. R., Briggs, D. C., Davidson, K. L., Herlihy, C., Sherer, D., Hill, H. C., Farrell, C. C., & Allen, A.-R. (2016). Findings from a national survey of research use among school and district leaders. National Center for Research in Policy and Practice.
Procter, R. (2015). Teachers and school research practices: The gaps between the values and practices of teachers. Journal of Education for Teaching, 41, 464–477. https://doi.org/10.1080/02607476.2015.1105535
Rohrer, D., & Pashler, H. (2010). Reviews/Essays: Recent research on human learning challenges conventional instructional strategies. Educational Researcher, 39, 406–412. https://doi.org/10.3102/0013189X10374770
Rosenshine, B. (2012). Principles of instruction: Research-based strategies that all teachers should know. American Educator, 36, 12–39.
Sharp, C., Handscomb, G., Eames, A., Sanders, D., & Tomlinson, K. (2006). Advising research-engaged schools: A role for local authorities. National Foundation for Educational Research. https://www.nfer.ac.uk/media/1907/itr03.pdf
Sharples, J. (2013). Evidence for the frontline: A report for the Alliance for Useful Evidence. Alliance for Useful Evidence. http://www.alliance4usefulevidence.org/assets/EVIDENCE-FOR-THE-FRONTLINE-FINAL-5-June-2013.pdf
Sin, C. H. (2008). The role of intermediaries in getting evidence into policy and practice: Some useful lessons from examining consultancy-client relationships. Evidence & Policy, 4, 85–103. https://doi.org/10.1332/174426408783477828
Taylor, M. (2013). Social science teachers’ utilization of best evidence synthesis research. New Zealand Journal of Educational Studies, 48, 34–50.
The Education Endowment Foundation (2018, September). Teaching and learning toolkit: Feedback guidance report. https://educationendowmentfoundation.org.uk/evidence-summaries/teaching-learning-toolkit/feedback/
Tseng, V., Granger, R., Seidman, E., Maynard, R., Weisner, T., & Wilcox, B. (2007). Studying the use of research evidence in policy and practice. The William T. Grant Foundation.
Weinstein, Y., Madan, C. R., & Sumeracki, M. A. (2018). Teaching the science of learning. Cognitive Research: Principles and Implications, 3, Article 2. http://doi.org/10.1186/s41235-017-0087-y
Wiliam, D., Lee, C., Harrison, C., & Black, P. J. (2004). Teachers developing assessment for learning: Impact on student achievement. Assessment in Education: Principles, Policy & Practice, 11, 49–65. https://doi.org/10.1080/0969594042000208994
Williams, D., & Coles, L. (2007). Teachers’ approaches to finding and using research evidence: An information literacy perspective. Educational Research, 49, 185–206. https://doi.org/10.1080/00131880701369719
Winch, C., Oancea, A., & Orchard, J. (2015). The contribution of educational research to teachers’ professional learning: Philosophical understandings. Oxford Review of Education, 41, 202–216. https://doi.org/10.1080/03054985.2015.1017406