
Research engagement - more distance to travel
Despite recent policies to support evidence-informed teaching, and a number of important practical developments including the launch of the Chartered College of Teaching, the development of the Research Schools network, and the expansion of the researchED initiative, we still don’t know a great deal about the current extent or depth of evidence-informed practice across schools in England. In this article we present findings from a survey co-developed by the National Foundation for Educational Research (NFER) and the Education Endowment Foundation (EEF), which captured information about this issue in late 2014. It suggests that at this point, research was having only a small to moderate influence on decision making relative to other sources, despite teachers generally reporting a positive disposition towards research. Additionally, it suggests that this positive disposition was not necessarily translating into an increased conceptual understanding of “what the research says”. This article discusses implications of these findings in the context of current developments towards evidence-informed practice, including the EEF’s approaches to supporting research engagement and use.
The rise of evidence-informed practice
Anyone with an eye on recent educational developments will be aware of an increase in discussion and activity around “evidence-informed practice” (EIP). A number of research studies have suggested that evidence-informed schools have an important role to play in effective education systems (Brown, 2017; CUREE, 2015; Greany, 2015; Mincu, 2014; Schleicher, 2011), with a clearer understanding emerging of how research evidence can feed into those systems (Sharples, 2013) and of how schools themselves can support effective research engagement (Brown, 2015; Brown and Zhang, 2016; Brown, 2017). Concurrently, there has been a surge in teacher demand for evidence, illustrated by the rapid rise of grass-roots initiatives like researchED1, the Research Schools network2 and the launch of the Chartered College of Teaching3. This differentiates the current decade from previous years in which calls for EIP came predominantly, although not exclusively, from university academics and researchers (see, for example, Hargreaves, 1996; Weiss, 1979). Notable examples of historic developments not driven by universities or academia include the work of Handscomb and MacBeath (2003) on The Research Engaged School; the National Teacher Research Panel (NTRP), an independent group of teachers and headteachers supported by the Centre for the Use of Research and Evidence in Education (CUREE)4; and the Collaborative Action Research Network (CARN), a network of teacher action researchers hosted by Manchester Metropolitan University’s Education and Social Research Institute5, among others.
How much EIP is there?
Despite policies to support evidence-informed teaching, and a number of important practical developments, we still don’t know a great deal about the current extent or depth of EIP across schools in England. The NFER and EEF survey6 was developed to provide a measure of research engagement that could be applied across a series of projects, funded by the EEF, which aim to increase schools’ awareness, and use, of research evidence7. It was intended also to inform the EEF’s overall approach to scaling-up and mobilising evidence – a key priority for the organisation in the second five years of its life.
The following points are important in interpreting the results:
- Evidence is a broad term. There are many forms of evidence at a teacher’s disposal including classroom data, pupil performance data, information from research, management data and, of course, professional judgment. It is the combined application of these different forms of evidence that creates EIP.
- Information from research tends to be used less frequently as a source of evidence. For this reason, we developed our survey to focus specifically on the extent and nature of teachers’ uses of research evidence as an important component of EIP.
- Research evidence was defined in the survey as: “paper or web-based articles, reports, books or summaries based on academic research”. We used the term “academic research” to clearly distinguish between research carried out in universities or professional research organisations, and other sources such as comment pieces; books written by practitioners; or practice information shared at teacher gatherings.
The data were collected in late 2014, and should therefore be seen as a snapshot of research engagement at that time. Elements of the survey will be repeated in the academic year 2017-18, hopefully providing an indication of how research engagement has changed over this period.
Research method
Our survey was piloted in November 2014 with a sample of 1,200 secondary and 900 primary schools. Each school was provided with five copies of the questionnaire to be completed by up to five members of staff, equating to samples of 4,500 primary and 6,000 secondary teachers respectively (10,500 teachers in total). We offered a £5 incentive to the first 350 responding teachers. We quickly achieved 509 responses across 256 schools (an average response of two teachers per school and an achieved response rate of just under five per cent of sampled teachers).
The main aims of the pilot were to check the functioning of survey items and to create reliable factors for analysis (we explain what factor analysis is later in the article). For these purposes, it was appropriate to draw a random sample of schools from the population of schools in England and it was sufficient to have 300 responses at respondent (teacher) level. We did not need to achieve a representative sample of teachers; just a sample that covered the range of responses required to undertake factor analysis.
On analysis of the data, we realised that the pilot had yielded some interesting results, which were worth exploring further. The EEF therefore commissioned the NFER to explore the findings from the pilot study in order to understand the levels of current teacher research engagement. Recognising that we could not be certain that the teachers who responded to this survey were representative of the entire teaching workforce (the sample was not designed in this way), we calculated confidence intervals. As a result we can say that, assuming no sampling bias, we are 95 per cent confident that if we were to collect results from all teachers in England, the results would be within six percentage points of the results presented in this report8.
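The margin of error quoted above can be illustrated with the standard formula for a sample proportion. The sketch below is a simplification that assumes a simple random sample; the survey’s reported six-point interval is wider than this naive figure, as would be expected once the clustering of teachers within schools is accounted for.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95 per cent margin of error for a sample proportion,
    assuming simple random sampling (z = 1.96 for 95 per cent)."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for the 509 responding teachers:
moe = margin_of_error(0.5, 509)
print(f"{moe:.3f}")  # roughly 0.043, i.e. about four percentage points
```

The gap between this simple-random-sampling figure and the reported six points reflects the design effect of sampling up to five teachers per school rather than teachers independently.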
Our findings
Teachers had a positive view of research, and generally saw themselves as research engaged
When we asked teachers a series of questions about academic research, and how they felt it influenced their practice, the majority (typically two thirds) said they valued it, engaged with it, and used it to change classroom practice. Figure 1 provides an illustration of a series of questionnaire items that reflect this finding.

Figure 1 shows that much larger proportions of respondents ‘strongly agreed/agreed’ with the statements than ‘disagreed/strongly disagreed’ with them. For example, the following proportions of teachers ‘agreed’ or ‘strongly agreed’ that:
- I am able to relate information from research to my context (77 per cent).
- I know where to find relevant research that may help to inform teaching methods/practice (70 per cent).
- Information from research plays an important role in informing teaching practice (69 per cent).
- I use information from research to help me decide how to implement new approaches in the classroom (68 per cent).
- I feel confident about analysing information from research (66 per cent).
When compared to other sources of information, research had a relatively small impact in informing teachers’ decision making
Teachers were also asked questions about the sources of information they consulted when developing approaches to teaching and learning. In these questions, options referencing research evidence were presented alongside a variety of other potential influences, such as teachers’ own ideas, information from training/continuing professional development (CPD), ideas from other schools, and guidance from official bodies. The questions were framed so that they were not biased towards answers that referenced research. When the use of research information was considered in relation to other sources, rather than viewed in isolation, we found that teachers consulted it relatively rarely. This is illustrated in Figure 2, which summarises teachers’ responses to a question asking them to identify up to three sources that were most important to them when deciding on an approach to improve pupil progress.

Figure 2 – Sources consulted by teachers when developing approaches to support pupil progress
Figure 2 shows that:
- sources generated by teachers/schools had a large influence on decision making: own ideas (67 per cent); ideas from other schools (33 per cent); and action research conducted by respondents or their colleagues (17 per cent)
- sources generated by external education professionals also had a large influence: information gathered through training/CPD (43 per cent); literature based on teacher experience (14 per cent)
- academic evidence had only a moderate influence on decision making: literature based on academic research (16 per cent); and online evidence platforms, such as the Sutton Trust/EEF Teaching and Learning Toolkit9 (eight per cent)
- sources generated by policy and examinations organisations had a small influence: guidance from official bodies such as the DfE or Ofsted (10 per cent); advice from academy chains or local authorities (eight per cent); and guidance from examination boards (five per cent).
Additional survey data showed that the sources that teachers found easiest to understand were: colleagues in their own schools; pupil performance data; CPD information; and colleagues in other schools. Teachers considered information based on academic research less easy to understand.
How we interpret these findings:
- These findings illustrate the different impressions we can form about teachers’ research engagement when we ask explicit questions about the use of research, compared to exploring the use of research evidence relative to other influences. It is perhaps not surprising that research emerges as a less prominent influence when it is measured alongside other important sources.
- The survey indicates that teachers tend to listen to other teachers, and that schools tend to draw on the support of other schools. This suggests that an important mechanism for embedding EIP is via peer-to-peer support and school networks. Drawing on these findings, the EEF is increasingly collaborating with practice partners to help disseminate and apply research knowledge, through initiatives like the Research School network.
Teachers had mixed levels of research knowledge
In addition to capturing teachers’ self-reported levels of research engagement, the survey contained two sets of objective questions, designed to provide an indication of teachers’ research knowledge. The results showed a variable range of knowledge by question, but overall a relatively low level of knowledge of the evidence on effective strategies for teaching and learning.
Teachers found questions requiring scientific or specialist research knowledge more difficult to answer correctly than questions relating to general teaching and learning. For example, over two thirds (67 per cent) of teachers knew that the statement “setting pupils by ability improves learning outcomes for all pupils” was incorrect, but less than one eighth (13 per cent) recognised that the statement “drinking six to eight glasses of water per day improves pupil learning outcomes” was incorrect.
How we interpret these findings:
These findings illustrate the extent to which ideas can become embedded in the teaching profession when they resonate with messages outside of research. It is not surprising that teachers were unsure about statements regarding the impact of water on learning, given the widespread advertising on the benefits of drinking water and an active market in propagating educational “neuromyths”10. Take “learning styles”, for example: there is no research evidence to support the statement that “individual pupils learn best when they receive information in their preferred learning style (e.g. auditory, visual, kinaesthetic)”, despite the government previously endorsing this technique11. These findings illustrate the need for more widespread training and awareness of research evidence, through resources such as the Teaching and Learning Toolkit.
Levels of research engagement differed by school and by teacher
We used a statistical technique called factor analysis to summarise and analyse information from the survey. Factor analysis identifies patterns in the variability of responses: items that were answered in correlated ways across the survey were grouped together into single ‘factors’. These factors, which formed our outcome measures, had greater reliability than individual responses. The measures that emerged through our analysis were:
Outcome 1: Positive disposition to academic research in informing teaching practice.
Outcome 2: Use of academic research to inform selection of teaching approaches.
Outcome 3: Perception that academic research is not useful to teaching.
Outcome 4: Perception that own school does not encourage use of academic research.
Outcome 5: Active engagement with online evidence platforms.
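The reliability gain from grouping correlated responses into factors, as described above, can be illustrated with a small sketch. The article does not name the reliability statistic it used; Cronbach’s alpha is a common choice for survey scales, shown here on a hypothetical set of Likert-style responses.

```python
from statistics import pvariance

def cronbach_alpha(item_scores: list[list[float]]) -> float:
    """Cronbach's alpha for a scale: rows are respondents, columns are items.
    Alpha rises towards 1 when items vary together (i.e. measure one construct)."""
    k = len(item_scores[0])
    items = list(zip(*item_scores))  # transpose to per-item columns
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical 1-5 Likert responses from five teachers to three related items
# (e.g. three statements probing disposition towards research):
responses = [[4, 5, 4], [3, 3, 3], [5, 5, 5], [2, 2, 3], [4, 4, 4]]
alpha = cronbach_alpha(responses)
print(round(alpha, 2))  # close to 1: the items move together, so the combined
                        # scale is more reliable than any single item
```

The same logic underlies the factors above: a composite of several correlated items smooths out the noise in any one respondent’s answer to any one question.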
We used these factors, and the knowledge score, to explore differences in responses between groups of teachers and schools. The analysis showed that:
- senior and middle leaders were significantly12 more likely to be research engaged than classroom teachers (on all factors, with the exception of outcome 4, and across all school phases)
- teachers in secondary schools were significantly more likely to be research engaged than primary school teachers (on two of our measures: outcome 2 – the use of academic research to inform the selection of teaching approaches; and the research knowledge score)
- teachers in schools with Ofsted ratings of ‘requires improvement’ or ‘inadequate’ were significantly more likely to use online evidence platforms (outcome 5) than teachers in schools rated ‘good’ or ‘outstanding’ (across all school phases).
Our survey also contained a question that explored what the term ‘evidence-based teaching’ (EBT) meant to teachers by offering them a range of definitions. Interestingly, we discovered that an individual teacher’s definition of EBT was a good indicator of their wider research engagement and knowledge. Teachers who selected answers containing a reference to academic research in their definition scored more highly on almost all our factors than teachers who did not.
How we interpret these findings:
There appears to be a higher level of research engagement among school senior leaders than classroom teachers. This suggests senior leaders should aim to model their own enthusiasm for research, encourage leadership teams to take an evidence-informed approach and for them, in turn, to support their colleagues. It is unclear why teachers in primary schools were less likely than their secondary peers to use research to inform their teaching, or to have knowledge of research. Nevertheless, this finding suggests that researchers, and intermediaries between research and practice, need to put effort into raising awareness and understanding of research in the primary phase. The EEF’s current ‘campaign’ to develop evidence-based Primary Literacy in the North East of England is an example of such support13.
Conclusions and further work
Our survey presents some novel ideas about how we can investigate, quantify and analyse teachers’ research engagement but it cannot tell us everything we might like to know. It is designed to be used alongside qualitative research (for example, interviews, observations and case studies), which can capture deeper, and more nuanced, aspects of teachers’ research engagement. The survey has also raised some questions. These include:
- How do teachers become research engaged and research literate? Our survey focused on teachers’ explicit awareness of the sources they were using, but there are many other, often implicit, ways by which research-based information can become embedded in teachers’ professional practice.
- How is research used in practice? We were able to ascertain whether or not teachers said they used research evidence to inform or change classroom practice, but we were unable to explore how the evidence was implemented, adapted and evaluated.
- What is the role of school culture, trust and relationships in supporting research use?
- What is the relationship between teacher research and academic research? How do, and can, these different forms of research evidence interact?
We also need better understanding about how teachers ‘blend’ research evidence with other sources of information to create an evidence-informed approach. Our survey considers how teachers’ use of research evidence compares with use of other sources (such as CPD information or colleagues’ expertise), but in an evidence-informed school we would expect these sources of information to complement and support each other. This is an explicit objective of EEF-funded scale-up activities; hence, ongoing evaluations should provide useful insights on the interplay between these different forms of evidence.
Finally, our research suggests that many teachers have a natural interest in, and inclination to use, research evidence, although this interest is not necessarily translating to their decision-making processes or knowledge of research. This relationship needs investigating in more detail; nevertheless, this study suggests a number of avenues are worth exploring in terms of integrating research into the day-to-day work of schools:
- Research producers can build on teachers’ natural tendency to draw on their colleagues and other schools for support, by working through ‘practice partners’ – educators with good understanding of research and practice, who can both ‘translate’ research for school improvement and classroom practice and act as ‘advocates’ for the evidence. They should also ensure that research findings are presented clearly and accessibly, with linked guidelines for implementation.
- Training and professional-development providers can help teachers develop a working level of ‘research literacy’, enabling them to understand and critique research literature so that they can successfully expand their knowledge base. CPD providers should be explicit in identifying where research evidence has informed the development of resources and training.
- School senior leaders can model their tendency towards higher levels of research engagement with their colleagues, creating opportunities for the wider staff in their schools to develop their understanding and use of research evidence, and matching this with necessary resources. This is particularly important in the primary phase.
Prof. Jonathan Sharples, Education Endowment Foundation and EPPI-Centre, University College London.
Dr. Julie Nelson, National Foundation for Educational Research.
NOTES
1. http://www.workingoutwhatworks.com/
2. https://researchschool.org.uk
3. https://www.collegeofteaching.ac.uk/
5. http://www.esri.mmu.ac.uk/carnnew/
6. The full findings are available at https://educationendowmentfoundation.org.uk/our-work/resources-centre/research-use-survey/
8. This measure of uncertainty applies to a random sample of schools; and of teachers within schools. With bias in the sampling, the true population percentage could lie outside this range.
9. https://educationendowmentfoundation.org.uk/resources/teaching-learning-toolkit
10. See the work of the Organisation for Economic Cooperation and Development (OECD) Centre for Educational Research and Innovation on this topic: http://www.oecd.org/edu/ceri/neuromyths.htm
11. See Pedagogy and Practice: Teaching and Learning in Secondary Schools from the Department for Education and Skills, 2004: http://dera.ioe.ac.uk/5706/7/DfES%200442-2004G%20PDF_Redacted.pdf
12. Statistical significance was set to p<0.05. This means there is a less than five per cent possibility that a reported difference could have arisen by chance if there was really no true difference.
13. https://educationendowmentfoundation.org.uk/our-work/campaigns/north-east-literacy-campaign
References
Brown, C. (2017). ‘Growing research capital for school improvement’, Professional Development Today, 19, 1, 52-58.
Brown, C. and Zhang, D. (2016). ‘How can school leaders establish evidence-informed schools: an analysis of the effectiveness of potential school policy levers’, Educational Management Administration and Leadership, 45, 3, 382-401.
Brown, C. (Ed) (2015). Leading the Use of Research and Evidence in Schools. London: IOE Press.
Centre for the Use of Research & Evidence in Education (2011). Report of Professional Practitioner Use of Research Review: Practitioner Engagement in and/or with Research. Coventry: CUREE [online]. Available: http://www.curee.co.uk/files/publication/[site-timestamp]/Practitioner%20Use%20of%20Research%20Review%20-%20FINAL%2011_02_11.pdf [29 June, 2015].
Greany, T. (2015). ‘How can evidence inform teaching and decision making across 21,000 autonomous schools? Learning from the journey in England.’ In Brown, C. (Ed) Leading the Use of Research and Evidence in Schools. London: Institute of Education Press.
Handscomb, G. and MacBeath, J. (2003). The Research Engaged School. Chelmsford: FLARE, Essex County Council.
Hargreaves, D. (1996). ‘Teaching as a research-based profession: possibilities and prospects.’ Paper presented at the Teacher Training Agency Annual Lecture, April [online]. Available: http://eppi.ioe.ac.uk/cms/Portals/0/PDF%20reviews%20and%20summaries/TTA%20Hargreaves%20lecture.pdf [23 January, 2017].
Mincu, M. (2014). ‘Inquiry paper 6: teacher quality and school improvement – what is the role of research?’ In British Educational Research Association/the Royal Society for the encouragement of Arts, Manufactures and Commerce (Eds), The Role of Research in Teacher Education: Reviewing the Evidence [online]. Available: https://www.bera.ac.uk/wp-content/uploads/2014/02/BERA-RSA-Interim-Report.pdf [23 January, 2017].
Schleicher, A. (2011). Building a High-Quality Teaching Profession: Lessons from Around the World. Paris: OECD Publishing [online]. Available: http://www.oei.es/formaciondocente/materiales/INFORMES/2011_OCDE.pdf [29 June, 2015].
Sharples, J.M. (2013). Evidence for the Frontline. London: Alliance for Useful Evidence [online]. Available: http://www.alliance4usefulevidence.org/assets/EVIDENCE-FOR-THE-FRONTLINE-FINAL-5-June-2013.pdf [23 January, 2017].
Weiss, C. (1979). ‘The meanings of research utilization’, Public Administration Review, 39, 5, 426-431.