Research shows that academic research has a relatively small impact on teachers’ decision-making – well what a surprise that is!

Recent research undertaken by Walker, Nelson and Bradshaw, with Brown (2019), has found that academic research has a relatively small impact on teachers’ decision-making, with teachers more likely to draw ideas and support from their own experiences (60 per cent) or the experiences of other teachers/schools (42 per cent). Walker et al go on to note that this finding is consistent with previous research, and argue that it suggests that those with an interest in supporting research-informed practice should consider working with and through schools, and those that support them, to explore their potential for brokering research knowledge for other schools and teachers.

In many ways we should not be surprised by these findings, as similar findings about research use have emerged from ethnographic research in UK general practice (Gabbay and le May, 2004), which showed that clinicians very rarely accessed research findings and other sources of formal knowledge, preferring instead to rely on ‘mindlines’, which the authors defined as ‘collectively reinforced, internalised, tacit guidelines’. These were informed by brief reading but mainly by clinicians’ own and their colleagues’ experience, their interactions with each other and with opinion leaders, patients and pharmaceutical representatives, and other sources of largely tacit knowledge that built on their early training.

Now, in this short blog I cannot do full justice to the concept of ‘mindlines’; if you would like to find out more, I suggest you have a look at Gabbay and le May (2011; 2016). For the rest of this blog, though, I’m going to draw on the work of Wieringa and Greenhalgh (2015), who conducted a systematic review on mindlines, and draw out some of mindlines’ key characteristics:

• Mindlines are consistent with the notion that knowledge is not a set of external facts waiting to be ‘translated’ or ‘disseminated’ but instead knowledge is fluid and multi-directional – and constantly being recreated in different settings by different people, and on an on-going basis.

• Mindlines involve a shared but not necessarily homogeneous reality, made up of the multiple individual and temporary realities of clinicians, researchers, guideline makers and patients.

• Mindlines incorporate tacit knowledge and knowledge-in-practice-in-context.

• Mindlines involve the construction of knowledge through social processes. This takes the form of discussions influenced by cultural and historic forces, and is validated through a process of ‘reality’ pushing back in a local context.

• Mindlines are consistent with the view that anyone, including patients, is capable of creating valid knowledge and can be an expert in consultations.

• Mindlines may not be manageable through direct interventions; however, they may be self-organising, as the best solution for a particular problem in a defined situation is sought out.

What are the implications of ‘mindlines’ for those interested in brokering research knowledge in schools?

Wieringa and Greenhalgh go on to make a number of observations about the implications for practitioners, academics and policymakers of embracing the mindlines paradigm, which are equally applicable to schools.

1. We need to examine how to go about integrating various sources of knowledge, and whether convincing information leads to improved decision-making.

2. In doing so, we need to think more widely about what counts as evidence in schools, and how these different types of evidence can best be used by teachers and school leaders in the decision-making process.

3. We need to examine how mindlines are created and validated by teachers, school leaders and other school stakeholders and how they subsequently develop over time.

In other words, research which focusses on how researchers can better ‘translate’ or ‘disseminate’ research is unlikely to have much impact on the guidelines teachers and school leaders use to make decisions.

And finally

I’d just like to make a couple of observations about the report by Walker et al (2019). First, I don’t like reports where reference is made to the ‘majority of respondents’ and no supporting percentage figure is given. Second, the terms climate and culture seem to be used interchangeably, although they refer to quite different things. Third, significant differences between groups of teachers are highlighted, yet no supporting data is provided, nor any explanation of what is meant in this context by ‘significant’.

References

Gabbay, J. and le May, A. (2004) ‘Evidence based guidelines or collectively constructed “mindlines”? Ethnographic study of knowledge management in primary care’, BMJ, 329(7473), p. 1013.

Gabbay, J. and le May, A. (2016) ‘Mindlines: making sense of evidence in practice’, British Journal of General Practice, 66(649), pp. 402–403. doi: 10.3399/bjgp16X686221.

Gabbay, J. and le May, A. (2011) Organisational Innovation in Health Services: Lessons from the NHS Treatment Centres. Bristol: Policy Press.

Walker, M., Nelson, J., Bradshaw, S. with Brown, C. (2019) Teachers’ Engagement with Research: What Do We Know? A Research Briefing. London: Education Endowment Foundation.

Wieringa, S. and Greenhalgh, T. (2015) ‘10 years of mindlines: a systematic review and commentary’, Implementation Science, 10(1), p. 45.

The school research lead, data-literacy and competitive storytelling in schools

A major challenge for school leaders and research champions wishing to make the most of research evidence in their school is to make sure not only that they understand the relevant research evidence but also that they understand their school context. In particular, they need to be able to analyse and interpret their own school data. This matters because, when discussing and evaluating data from within your school, you are in all likelihood taking part in some form of competitive storytelling (Barends and Rousseau, 2018). The same data will be used by different individuals to tell different stories about what is happening within the school. Those stories will be used to inform decisions, and if we want to improve decision-making in schools it will help if decision-makers have a sound understanding of the quality of the data on which those stories are based.

To help you get a better understanding of the data-informed stories being told in your school, in this post I’m going to look at some of the fundamental challenges in trying to understand school data, and in particular some of the inherent problems and limitations of that data. This will involve a discussion of the following: measurement error; the small number problem; confounding; and range restriction. In a future post, I will look at some of the challenges of trying to accurately interpret the data, with special reference to how that data is presented.

The problems and limitations of school data

Measurement error

Measurement error presents a real difficulty when trying to interpret quantitative school data. These errors occur when the response given differs from the real value. They may be the result of, say: the respondent not understanding what is being asked of them (e.g. an NQT not knowing what is being measured or how to measure it); how and when the data is collected, say at 5pm on a Friday or on the last day of term; or how missing data is treated, where somehow it just gets filled in. These errors may be random, but if they are not random they can lead to systematic bias.
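To see why the distinction between random and systematic error matters, here is a minimal Python sketch (all numbers are invented for illustration): random errors largely cancel out when responses are averaged, whereas a systematic bias survives any amount of averaging.

```python
import random

random.seed(42)

TRUE_VALUE = 100  # the real value we are trying to measure

# Random error: responses scatter around the true value and tend to cancel.
random_responses = [TRUE_VALUE + random.gauss(0, 5) for _ in range(1000)]

# Systematic error: e.g. every respondent under-reports by 4 units,
# so no amount of averaging recovers the true value.
biased_responses = [TRUE_VALUE - 4 + random.gauss(0, 5) for _ in range(1000)]

mean_random = sum(random_responses) / len(random_responses)
mean_biased = sum(biased_responses) / len(biased_responses)

print(f"True value:                {TRUE_VALUE}")
print(f"Mean with random error:    {mean_random:.1f}")  # close to 100
print(f"Mean with systematic bias: {mean_biased:.1f}")  # close to 96
```

The averaged random-error responses land very close to the true value, while the averaged biased responses remain about 4 units adrift, however many responses you collect.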

The small number problem

When school data are based on a small number of observations, any statistic calculated from them will contain random error. In schools, for example, small departments are more likely to report data which deviates from the true value than larger departments. Let’s say we have a school where staff turnover is 20%: a small department is likely to show a greater deviation from this 20% than a larger department. As such, you would need to be extremely careful about drawing any conclusions about the quality of leadership and management within these departments based on this data (that said, there may be genuine issues, and other sources of data may need to be looked at).
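The turnover example can be simulated directly. In this sketch (department sizes are hypothetical), every member of staff leaves with the same 20% probability; the only difference between departments is headcount, yet the small department's reported turnover routinely strays much further from the true 20%.

```python
import random

random.seed(1)

TRUE_TURNOVER = 0.20  # assume the whole-school turnover rate really is 20%

def typical_deviation(dept_size: int, trials: int = 10_000) -> float:
    """Average absolute gap between a department's observed turnover
    rate and the true 20%, across many simulated years."""
    total = 0.0
    for _ in range(trials):
        leavers = sum(random.random() < TRUE_TURNOVER for _ in range(dept_size))
        total += abs(leavers / dept_size - TRUE_TURNOVER)
    return total / trials

for size in (4, 10, 40):
    print(f"Department of {size:>2}: typical deviation "
          f"{typical_deviation(size):.1%} from the true 20%")
```

A department of four typically reports a turnover figure more than fifteen percentage points away from the true rate, purely through chance, which is why small-department statistics are such shaky ground for judgements about leadership.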

Confounding 

A confound occurs when the true relationship between two variables is hidden by the influence of a third variable. For example, the senior leadership team of a school may assume that there is a direct and positive relationship between teaching expertise and pupils’ results, and may interpret any decline in results as being the result of ‘poor’ teaching. However, it might not be the teacher’s expertise which is the major contributory factor in determining results. It may be that a number of pupils, for reasons completely beyond the control of the teacher, just did not ‘perform on the day’ and made a number of quite unexpected errors. Indeed, as Crawford and Benton (2017) show, pupils not performing on the day for one reason or another is a major factor in explaining differences in results between year groups.
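The ‘on the day’ point can be made concrete with a toy simulation (the grade scale and effect sizes below are invented): the same teacher, with unchanged expertise, teaches two cohorts, and pupil-level noise alone produces a visible year-on-year swing in the cohort mean.

```python
import random

random.seed(7)

def cohort_mean(teaching_quality: float, cohort_size: int = 30) -> float:
    """Mean grade for one cohort: teaching contributes a little, but each
    pupil also has a large 'on the day' component the teacher cannot control."""
    grades = [50 + 5 * teaching_quality + random.gauss(0, 15)
              for _ in range(cohort_size)]
    return sum(grades) / cohort_size

teaching = 1.0  # identical teaching expertise in both years
year_1 = cohort_mean(teaching)
year_2 = cohort_mean(teaching)
print(f"Year 1 cohort mean: {year_1:.1f}")
print(f"Year 2 cohort mean: {year_2:.1f}")
print(f"Swing with identical teaching: {year_2 - year_1:+.1f}")
```

A leadership team that reads every such swing as a change in teaching quality is mistaking pupil-level noise for a teacher-level signal.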

Range restriction

Range restriction occurs when a variable in the data has less than the range it possesses in the population as a whole, and is often seen when schools use A-level examination results for marketing purposes. On many occasions, schools or sixth-form colleges publicise A-level pass rates of 98, 99 or 100%. However, what this information does not disclose is how many pupils/students started A levels and subsequently either did not complete their programme of study or were not entered for the examination. Nor does it state how many pupils gained the equivalent of three A levels. So, if attention is focused on the number of pupils gaining three A levels or their equivalent, a completely different picture of pupil success may emerge.
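The arithmetic of a marketed pass rate versus a per-starter measure is worth spelling out. The cohort numbers below are entirely hypothetical, but they show how a 99% headline can coexist with far more modest outcomes for the cohort that actually started.

```python
# Hypothetical sixth-form cohort (all numbers invented for illustration)
started = 120        # pupils who began A-level programmes
completed = 100      # still enrolled and entered for the examinations
passed = 99          # of those entered, how many passed
three_a_levels = 70  # gained three full A levels (or equivalent)

headline_pass_rate = passed / completed   # the restricted-range figure
rate_per_starter = three_a_levels / started

print(f"Marketed pass rate:              {headline_pass_rate:.0%}")  # 99%
print(f"Starters gaining three A levels: {rate_per_starter:.0%}")    # 58%
```

The denominator does all the work: restricting attention to pupils who survived to the exam hall (and to any pass at all) produces the glossy figure, while measuring against everyone who started tells a quite different story.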

Implications

If you want to make sure you don’t draw the wrong conclusions from school data it will make sense to:

·     Aggregate data where appropriate so that you have larger sample sizes

·     Use a range of different indicators to try and get around the problem of measurement error with any one indicator

·     Actively look for data which challenges your existing preconceptions: make sure all the relevant data is captured and made available, not just the data which supports your biases.

·     Avoid jumping to conclusions; more often than not there will be more than one explanation of what happened.

And finally 

Remember, even if you have ‘accurate’ data, it can still be misrepresented through the misuse of graphs, percentages, p-values and confidence intervals.

References and further reading

Barends, E and Rousseau, D (2018) Evidence-Based Management: How to make better organizational decisions, London, Kogan-Page, 

Crawford, C. and Benton, T. (2017) Volatility Happens: Understanding Variation in Schools’ GCSE Results. Cambridge Assessment Research Report. Cambridge, UK: Cambridge Assessment.

Jones, G (2018) Evidence-based School Leadership and Management: A practical guide, London, Sage Publishing

Selfridge, R (2018) Databusting for Schools. How to use and interpret education data. London, Sage Publishing

Promoting data-literacy in schools

Just last week I hosted a #UkEdResChat on how and when to go about supporting teachers to develop research literacy. However, the more I thought about the issue, the more it seemed to me that maybe we were focussing on the wrong type of ‘literacy’, and that instead we should be discussing and developing teacher ‘data literacy’. For it seems to me that if we want teachers to improve their day-to-day practice by engaging in some form of professional inquiry, then having the skills to effectively interpret data is essential, and arguably far more relevant to classroom teachers than research literacy. So, to help develop my thinking on what data literacy could and should look like, I did a quick search on Google Scholar using the term ‘teacher data literacy’ and came across Mandinach and Gummer (2016) and their conceptual framework for data literacy for teaching (DLFT), which seems worth sharing. Accordingly, this post will discuss Mandinach and Gummer’s definition of data literacy and associated conceptual framework; the dispositions and habits of mind that influence data use; and the implications for school leaders.

Data literacy for teaching (DLFT)

Using data is a key component of the teacher’s role. In England, the 2011 Teachers’ Standards – updated in 2013 – state that teachers should use ‘relevant data to monitor progress, set targets, and plan subsequent lessons’. However, as Mandinach and Gummer point out, this type of statement is extremely general and does not provide any guidance about the knowledge and skills necessary to be data literate. To address this issue, Mandinach and Gummer offer the following definition of data literacy for teaching.

Data literacy for teaching is the ability to transform information into actionable instructional (teaching) knowledge and practices by collecting, analysing, and interpreting all types of data (assessment, school climate, behavioural, snapshot, longitudinal, moment-to-moment, etc.) to help determine instructional steps. It combines an understanding of data with standards, disciplinary knowledge and practices, curricular knowledge, pedagogical content knowledge and an understanding of how children learn. (p. 2)

Mandinach and Gummer then go on to state three propositions on which their work is grounded.

1.    There is no question that educators must be armed with data.

2.    Assessment literacy is not the same as data literacy.

3.    Simply relying on professional development to develop teacher data literacy is not enough and a conceptual framework is required.

The Data Literacy for Teaching (DLFT) Framework

The DLFT framework was the product of a number of activities – including the convening of experts, a review of published materials and resources dealing with data literacy, and a review of relevant teaching standards – which Mandinach and Gummer then synthesised with what is viewed as good teaching. This resulted in five domains of data literacy: identify problems and frame questions; use data; transform data into information; transform information into a decision; and evaluate outcomes. The knowledge and skills associated with each domain are summarised in Table 1; more detail can be found in Mandinach and Gummer’s article.

Table 1 – Knowledge and skills associated with each DLFT domain (image)

Dispositions, habits of mind, or factors that influence data use

Mandinach and Gummer then go on to identify another category of components which they view as necessary for data use, which they call dispositions or habits of mind, and which are linked to teaching in general rather than being specific to data use. Nevertheless, it is argued that the six dispositions identified influence how teachers approach the use of data. The six dispositions are:

·     Belief that all students can learn

·     Belief in data/think critically

·     Belief that improvement in education requires a continuous inquiry cycle

·     Ethical uses of data, including the protection of privacy and confidentiality of data

·     Collaborations

·     Communication skills with multiple audiences

Implications for school leaders, CPD co-ordinators and school research leads

First, it’s important to remember that this conceptual framework needs to be ‘tested’ in the field and should not be seen as the definitive statement of what data literacy looks like. As Mandinach and Gummer acknowledge, more research is required into: how data literacy may change over time (just think social media and GDPR); how best to meet the data literacy needs of teachers at different stages of their careers, and what implications this has for the support made available to them; the possible continuum of data literacy ranging from novice to expert; and how data literacy may differ for a newly qualified teacher compared with, say, a senior leader in a multi-academy trust. In other words, this conceptual framework is not the finished article, is a work in progress, and should not be ‘oversold’ to colleagues.

Second, what are the implications of the DLFT for schools wishing to incorporate ‘disciplined inquiry’ either as part of performance management or continuous professional development? (see https://www.garyrjones.com/blog/2019/3/23/performance-management-and-disciplined-inquiry). Initial reflection would suggest that if you accept that teachers in a school will have varied levels of data literacy, then a single model of disciplined inquiry, or a single type of inquiry question, is unlikely to meet the development needs of all teachers. Indeed, before embarking on ‘disciplined inquiry’ it may be wise to undertake some form of skills audit, so that you start off where colleagues are rather than where you would like them to be. This can also help you put in place different packages of support relevant for teachers at different stages of the development of their data literacy.

Third, although the tide appears to be turning against the importance of internal school data, especially in the inspection of schools (https://www.tes.com/news/ofsted-inspections-wont-examine-internal-school-data), that does not make this framework any less relevant. Indeed, what this framework does is emphasise that data literacy is not just about assessment or progress data but is far more comprehensive.

Finally, should attention switch from developing teacher research literacy to the development of data literacy? Well, given the lack of supporting research evidence which compares and contrasts data and research literacy in action in schools, the answer to that question has to be no. Instead, the two skill sets should be seen as overlapping and complementary. For example, teachers seeking to identify problems and frame questions are more likely to be able to do so if they are research literate and can use research to better understand an issue. On the other hand, a research-literate teacher who subsequently makes a ‘research-informed’ change in practice will need the capability to evaluate the outcomes of that change.

And finally

If you are interested in becoming more data literate, then I recommend that you have a look at Richard Selfridge’s book - Databusting for Schools: How to use and interpret education data. 

References

DfE (2013) Teachers’ Standards: Guidance for school leaders, school staff and governing bodies, July 2011 (introduction updated June 2013). London: DfE

Mandinach, E. B. and Gummer, E. S. (2016) ‘What does it mean for teachers to be data literate: Laying out the skills, knowledge, and dispositions’, Teaching and Teacher Education. doi: 10.1016/j.tate.2016.07.011.

Selfridge, R (2018) Databusting for Schools. How to use and interpret education data. London, Sage Publishing

Promoting inquiry-based working within your school

Just a few weeks ago I gave a presentation at ResearchED Blackpool in which I explored the nature of disciplined inquiry and its role in performance management. I also examined whether there was any support for the claims being made about how inquiry contributes to teacher professionalism and school improvement. So, with that in mind, I was delighted to come across the work of Uiterwijk-Luijk, Krüger and Volman (2019) examining the promotion of inquiry-based working and the inter-relationships between school boards, school leaders and teachers, as little appears to be known about how schools can create a culture of inquiry. Furthermore, given the Education Endowment Foundation’s efforts to support governing bodies/trusts to become more evidence-informed, the publication of Uiterwijk-Luijk et al’s work is extremely timely.

Promoting inquiry-based working 

Working with three Dutch primary schools, Uiterwijk-Luijk et al set out to answer the following research question: how can the interplay between school boards, school leaders and teachers regarding inquiry-based working be characterised? In addition, there were two subquestions. One, how can the interplay between school boards and school leaders regarding inquiry-based leadership be characterised? Two, how can the interplay between school leaders and teachers regarding, respectively, inquiry-based leadership and inquiry-based working be characterised?

A multiple case-study design was adopted, with data collected through interviews with school boards, school leaders and teachers. Meetings were also observed, alongside documentary analysis. Data was transcribed and coded using MAXQDA software. The coding scheme that emerged from the interviews then informed the analysis of the data from observations and documents. Subsequently, cross-case analysis was undertaken along with in-case analysis.

The results of the analysis suggest that both school boards and school leaders adopted a number of strategies to promote inquiry-based working, with each of these strategies incorporating a range of sub-approaches, which are listed below.

Stimulating school leaders’ and teachers’ inquiry habits of mind by

  • Discussing student results with school leaders/teachers

  • Encouraging school leaders to discuss student results with teachers, and teachers to discuss results with each other

  • Sharing knowledge

  • Modelling behaviour

  • Making demands 

  • Having high expectations

  • Encouraging school leaders to co-operate and discuss research results with school leaders from other schools

Stimulating school leader and teacher data literacy by

  • Involving external organisations to support school leaders in conducting research

  • Developing internal expertise to support inquiry

Communicating the vision for inquiry-based working by

  • Communicating orally about the vision for inquiry-based working

Sharing leadership by

  • Sharing leadership responsibilities with teachers

Supporting inquiry-based working by

  • Providing money, time and space

  • Trusting and believing 

  • Being open to new ideas concerning research

  • Creating a safe environment

The interplay between the school board and school leaders 

It is worth noting that although these different approaches were seen in more than one school, the accomplishment and impact of the approaches differed. Uiterwijk-Luijk et al note that in the school where there was a focus on student results, the demands made by the school board led to ‘inquiry’ being experienced as part of a performativity agenda, whereas in the other two schools the boards’ demands regarding inquiry-based working were seen as part of an attempt to raise educational quality. In addition, it was noted that none of the schools had a clear written-down vision for inquiry-based working.

The interplay between school leaders and teachers

Interestingly, all three schools adopted the same approach to stimulating inquiry: providing time, money and space; being open to new ideas concerning research; and creating a safe environment. That said, there were differences between the schools. One principal talked about an implicit rule that all decisions be based on data, a rule which was not recognised by the teachers in the school. Furthermore, in two of the schools, teachers demonstrated the inquiry culture by being critical, i.e. asking critical questions, and by modelling behaviour, i.e. investigating and improving their own actions by comparing them with the work of others.

Implications for schools in England wishing to promote an inquiry-based culture

The research was conducted in another education system, where the relationships between governing bodies, school leaders and teachers are different. Nevertheless, it prompts a number of questions which governing bodies/trusts, school leaders and teachers might wish to reflect on within the context of their own schools/multi-academy trusts.

  • To what extent are the various strategies and approaches adopted by school boards and school leaders seen in your context?

  • Is there a gap between the rhetoric and reality of inquiry-based working within your school?

  • To what extent is the data on which decisions are based being made explicit?

  • How are school governing bodies/trusts supporting the use of inquiry-based methods?

  • Is inquiry-working within the school perceived as a genuine attempt to bring about educational improvement or is it viewed as part of a performativity agenda?

  • Does your school have an ‘inquiry-based working’ vision and mission statement?

  • How would you describe ‘inquiry-based working’ within your school – bottom-up, top-down or multi-directional?

  • If your school is engaged in inquiry-based working, do there appear to be any negative consequences?

And finally

Writing this blogpost has made me realise that I have been paying insufficient attention to what school leader or teacher data literacy looks like. If colleagues are going to be encouraged to undertake some form of disciplined inquiry within their schools, then what knowledge and skills do they need in order to draw meaningful and well-informed conclusions from their work?

References

Uiterwijk-Luijk, L., Krüger, M., & Volman, M. (2019). Promoting inquiry-based working: Exploring the interplay between school boards, school leaders and teachers. Educational Management Administration & Leadership, 47(3), 475–497. https://doi.org/10.1177/1741143217739357

All the school research lead needs is a COM-B to bring about behaviour change

Despite the increased interest in the use of evidence-informed practice within schools, research suggests that the majority of teachers are not using research to inform their teaching practice. Research from the Sutton Trust indicates that a minority of teachers (45%) use research to inform their teaching, with only a small minority (23%) using the Education Endowment Foundation’s Teaching and Learning Toolkit. Moreover, other research undertaken by the NFER (Nelson, Mehta, et al., 2017) suggests that research evidence is not playing a major role in teachers’ decision-making when developing their classroom practice, relative to other sources.

So, given the potential magnitude of the challenge, it makes sense to look at the science of behaviour change – and the work of Michie, Van Stralen, et al. (2011) and Michie, Atkins, et al. (2016) on the Capability-Opportunity-Motivation-Behaviour (COM-B) model of behaviour change – to see what help it can provide the evidence-informed practitioner interested in bringing about changes in teachers’ practice within their school. Indeed, the Education Endowment Foundation is already showing interest in the COM-B model and how it can be used to think about how interventions could be designed to increase their chances of bringing about behaviour change (Sharples, 2017).

The COM-B Model of Behaviour Change

Michie et al (2011) undertook a systematic review of behaviour change methods and identified 19 different frameworks for behaviour change. They then synthesised these 19 frameworks into a single framework with two levels, one representing intervention functions and the other high-level policy. This led to the development of the Behaviour Change Wheel, which has the Capability-Opportunity-Motivation-Behaviour model at its hub. The COM-B model has three core components:

·      Capability - the individual’s psychological and physical capacity to engage in the activity concerned. It includes having the necessary knowledge and skills.

·      Motivation - those brain processes that energize and direct behaviour, not just goals and conscious decision-making. It includes habitual processes, emotional responding, as well as analytical decision-making. 

·      Opportunity - the factors that lie outside the individual that make the behaviour possible or prompt it. (Michie et al., 2011)

Each of the components can influence behaviour: for example, capability can influence motivation, as can opportunity, just as enacting a behaviour can change capability, motivation and opportunity, as illustrated in Figure 1. So, in the context of research use within schools, having access to research evidence (opportunity) or being able to understand research evidence (capability) might increase the motivation to use research evidence to plan teaching and learning. However, having the capability, motivation and opportunity is not enough, as the individual teacher still needs to act on the research evidence to improve teaching and learning.

Figure 1 – The COM-B model (image)

However, this provides only a very brief overview of the model and its components. Table 1 provides definitions and examples of the components of the COM-B model as applied to the use of research evidence in schools.

Table 1 – Description of COM-B components with associated examples of evidence use in schools

(image)

Adapted from Michie, Atkins, et al. (2014), p. 63.

However, it needs to be emphasised that implementing those examples alone may not be enough to bring about the desired behaviour change; multiple activities or changes in each component may be required. Indeed, research by Langer, Tripney, et al. (2016) on the science of using science in decision-making suggests that interventions which focus on only one or two of the three components are highly unlikely to bring about greater use of research evidence. Future blogs will use the COM-B model to examine the various actions which can be taken to support behaviour change. In doing so, we will be looking at the outer rings of the Behaviour Change Wheel, i.e. intervention functions and higher-level policy.

And finally

We need to remember that getting teachers to make use of research is not in itself enough to bring about improved outcomes for pupils. Instead, we need research-literate teachers to be using research, and current research suggests that teachers have: a weak, but variable, understanding of the evidence base relating to teaching and learning strategies; a weak, but variable, understanding of different research methods and their relative strengths; and a particularly poor understanding of the evidence base that requires scientific or specialist research knowledge (e.g. the validity of ‘neuromyths’) (Nelson et al., 2017).

References

Langer, L., Tripney, J. and Gough, D. (2016). The Science of Using Science: Researching the Use of Research Evidence in Decision-Making. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.

Michie, S., Van Stralen, M. M. and West, R. (2011). The Behaviour Change Wheel: A New Method for Characterising and Designing Behaviour Change Interventions. Implementation Science. 6. 1. 42.

Michie, S., Atkins, L. and West, R. (2014). The Behaviour Change Wheel. A guide to designing interventions. 1st ed. Great Britain: Silverback Publishing. 

Michie, S., Atkins, L. and Gainforth, H. L. (2016). Changing Behaviour to Improve Clinical Practice and Policy. In   Axioma-Publicações da Faculdade de Filosofia. 

Nelson, J., Mehta, P., Sharples, J. and Davey, C. (2017). Measuring Teachers’ Research Engagement: Findings from a Pilot Study: Report and Executive Summary. London. Education Endowment Foundation/NFER

Sharples, J. (2017). Untangling the ‘Literacy Octopus’ – Three Crucial Lessons from the Latest Eef Evaluation. Education Endowment Foundation Blog. https://educationendowmentfoundation.org.uk/news/untangling-the-literacy-octopus/