The School Research Lead - and a PIT-B theory of change

Recently I’ve been giving a lot of thought to the related ideas of theories of change, theories of action and logic models.  Fortunately, there are lots of really useful resources available online which provide practical advice on how to make these ideas ‘work’.  In this post I am going to look at the work of Long et al. (2018) and their PIT-B model for developing a theory of change (TOC); full details can be found in the reference at the end of this post.

Let’s start by stating what is meant by the term ‘theory of change’.  Put simply, a theory of change explains ‘how and why’ a particular intervention will, in a specific context, bring about the desired or hoped-for change.

Long et al. recommend that when developing a theory of change, you begin with the objective or problem you are trying to solve or address.  Second, you explain what aspect of the intervention/innovation you are proposing allows it to solve the problem or achieve the objective.  Next, describe how the intervention facilitates and carries out this causal mechanism.  Finally, complete your causal chain by returning to the problem or objective you started with.

Long et al. suggest that this can be done using the PIT-B model.

[Images: the PIT-B model for developing a theory of change (Long et al., 2018)]

We can now use this model to help illustrate theories of change which could be developed for use in your work as a school research lead.

Example 1

Teachers in school do not see the practical relevance of educational research when planning lessons and schemes of work, so it is not unexpected that they make little use of it.  Our theory is that if teachers have more opportunities to read and discuss research, they will begin to make greater use of educational research in their teaching.  The school’s Journal Club promotes the use of educational research by providing a setting in which research-literate colleagues can support others in thinking about how research could be applied in the classroom.  We believe that this will encourage greater engagement with educational research, and increase its use in lesson plans and schemes of work.

Example 2

Teachers in school are often demotivated by existing schemes of performance management and objective setting for accountability purposes.  Our theory is that if teachers have opportunities to focus on ‘development’ objectives rather than ‘accountability’ objectives, this will increase teacher motivation and engagement in the performance management process.  The school’s programme of ‘disciplined inquiry’ encourages teachers to think about how they can bring about improvements in both teaching and pupil outcomes.  We believe that this will increase the motivation of teachers, and eliminate the demotivating impact of accountability-based performance management.

However, it is important to realise that coming up with a theory of change is the ‘easy bit’.  In order to create a ‘good’ theory of change, Long et al. recommend that, first, the underpinning assumptions and hypotheses are stated. Second, the appropriate resources are available. Third, the language used is clear and unambiguous. Fourth, there should be agreement from all the relevant stakeholders.  Last but not least, the focus should be on one particular intervention.

And finally

Developing a robust theory of change will not guarantee that whatever intervention you are introducing will be a success. However, it will increase your chances of success, as you will be able to articulate the how and the why of what you are trying to achieve. This allows you to then focus on what you are going to do to make it happen - but that is another story.

Reference

Long M, Macdonald A and Duncan T. (2018) Practical Tips for Developing and Using Theories of Change and Logic Models. 2018 Virginia AmeriCorps Annual Program Directors and Staff Meeting, Richmond, VA: ICF.

Want to know more

If you are interested in finding out more about theories of change, theories of action and logic models, have a look at the following:

https://www.betterevaluation.org/en/rainbow_framework/define/develop_programme_theory


The school research lead and making the most of supporting evidence-based practice in schools

School research leads across the country are trying to encourage the use of evidence-based practice. No doubt lots of different interventions - be it lesson study, joint practice development, journal clubs, conferences, seminars or disciplined inquiry - have been introduced.  Alternatively, the school may be involved in research studies looking at ways of developing evidence use, such as Hammersley-Fletcher et al. (2015), Griggs et al. (2016), Speight et al. (2016) and Brown (2017).  So to make the most of all this activity, and to ensure that colleagues learn from both the successes and failures of others, it is sensible to use a basic common structure to report on educational interventions designed to support evidence-informed/based practice within schools.

The GREET checklist

The GREET checklist - Phillips, Lewis, McEvoy, Galipeau, Glasziou, Moher, et al. (2016) - was developed to provide guidance on the reporting of educational interventions for evidence-based practice within medicine.  The checklist was the product of a systematic review, a Delphi survey and three consensus discussions, with the result being a 17-item checklist. Guidance on how to complete the GREET checklist has been provided by Phillips, Lewis, McEvoy, Galipeau, Glasziou, Hammick, et al. (2016), and this guidance has been used to develop an exemplar report of an evidence-based educational intervention – journal clubs.

Journal Clubs

1. INTERVENTION: Provide a brief description of the educational intervention for all groups involved [e.g. control and comparator(s)].

The introduction of a journal club – facilitated by the school research lead – for teaching and other staff who wished to attend the sessions.

2. THEORY: Describe the educational theory(ies), concept or approach used in the intervention.

If teachers are ‘exposed’ to research, this will ultimately bring about changes in teaching practice, resulting in improved learning outcomes for pupils.

3. LEARNING OBJECTIVES: Describe the learning objectives for all groups involved in the educational intervention.

  • To develop the reading habits of the participants

  • To improve participants’ knowledge of relevant educational research

  • To help develop participants’ skills in critically appraising research and applying it to teaching

4. EBP CONTENT: List the foundation steps of EBP (ask, acquire, appraise, apply, assess) included in the educational intervention.

The core content focused on appraising educational research

5. MATERIALS: Describe the specific educational materials used in the educational intervention. Include materials provided to the learners and those used in the training of educational intervention providers.

Attendees were directed towards Chartered College of Teaching resources designed to give brief summaries of different types of research reports, including how to go about reading research.

6. EDUCATIONAL STRATEGIES: Describe the teaching / learning strategies (e.g. tutorials, lectures, online modules) used in the educational intervention.

Seminars led by the school research lead

7. INCENTIVES: Describe any incentives or reimbursements provided to the learners.

A group of ten staff – out of a possible 100 eligible staff – decided to attend the journal club.

Attendees at the sessions were provided with light refreshments – tea, coffee and biscuits.

8. INSTRUCTORS: For each instructor(s) involved in the educational intervention describe their professional discipline, teaching experience / expertise. Include any specific training related to the educational intervention provided for the instructor(s).

The sessions were facilitated by the school research lead, who had recently completed an MA in Education.

9. DELIVERY: Describe the modes of delivery (e.g. face-to-face, internet or independent study package) of the educational intervention. Include whether the intervention was provided individually or in a group and the ratio of learners to instructors.

In October 2018 the school research lead conducted an introductory session on how to appraise educational research.  In subsequent sessions the school research lead facilitated a structured discussion on the reading scheduled for that session.

10. ENVIRONMENT: Describe the relevant physical learning spaces (e.g. conference, university lecture theatre, hospital ward, community) where the teaching / learning occurred. 

The sessions were held in the seminar room – located within the school library. 

11. SCHEDULE: Describe the scheduling of the educational intervention including the number of sessions, their frequency, timing and duration

A total of six sessions were held, with one session every half-term.  Each session took place on a Wednesday at 4.00 pm and lasted approximately 45 minutes.  The intervention was implemented over the course of the 2018-19 academic year.

12. Describe the amount of time learners spent in face-to-face contact with instructors and any designated time spent in self-directed learning activities.

Participants spent approximately 4 ½ hours in the sessions, with another 4 ½ hours spent reading materials prior to the sessions. 

13. Did the educational intervention require specific adaptation for the learners? If yes, please describe the adaptations made for the learner(s) or group(s).

Some participants had little or no knowledge of educational research and they were paired with other participants who had recently participated in post-graduate study.

14. Was the educational intervention modified during the course of the study? If yes, describe the changes (what, why, when, and how).

It had been intended to look at a key text during each session.  It soon became apparent that participants were unable to do the required reading, so the texts subsequently used were primarily articles from the Chartered College of Teaching journal Impact.

15. ATTENDANCE: Describe the learner attendance, including how this was assessed and by whom. Describe any strategies that were used to facilitate attendance.

On average only six out of ten staff attended the sessions.  Two participants attended all six sessions – with two participants attending only two sessions.  Records of attendance were kept by the school research lead.

16. Describe any processes used to determine whether the materials  and the educational strategies used in the educational intervention were delivered as originally planned.

The school research lead undertook research into how journal clubs had been successfully run in both medicine and schools, and devised the sessions based on this reading.

17. Describe the extent to which the number of sessions, their frequency, timing and duration for the educational intervention was delivered as scheduled

All the sessions were delivered as scheduled

Limitations

Whilst in medicine there is some consensus on the competences associated with evidence-based practice - Dawes, Summerskill, et al. (2005) - this does not appear to be the case in education.  As such, the checklist may or may not be relevant to education. And of course, it provides only the sketchiest of outlines of how the intervention was implemented, and no data on the impact of the intervention on pupil outcomes. Nevertheless, the checklist does provide a time-efficient way of capturing the essence of what was done, and we should never let the perfect be the enemy of the good.

And finally

If you are interested in the use of checklists, may I suggest you have a look at the work of both Atul Gawande and Harry Fletcher-Wood (see http://evidencebasededucationalleadership.blogspot.com/2015/10/the-school-research-lead-can-research.html for references).


References

Brown, C. (2017). Research Learning Communities: How the RLC Approach Enables Teachers to Use Research to Improve Their Practice and the Benefits for Students That Occur as a Result. Research for All. 1. 2. 387-405.

Dawes, M., Summerskill, W., Glasziou, P., Cartabellotta, A., Martin, J., Hopayian, K., Porzsolt, F., Burls, A. and Osborne, J. (2005). Sicily Statement on Evidence-Based Practice. BMC medical education. 5. 1. 1.

Griggs, J., Speight, S. and Farias, J. C. (2016). Ashford Teaching Alliance Research Champion: Evaluation Report and Executive Summary. Education Endowment Foundation.

Hammersley-Fletcher, L., Lewin, C., Davies, C., Duggan, J., Rowley, H. and Spink, E. (2015). Evidence-Based Teaching: Advancing Capability and Capacity for Enquiry in Schools: Interim Report. London. National College for Teaching and Leadership.

Phillips, A., Lewis, L., McEvoy, M., Galipeau, J., Glasziou, P., Hammick, M., Moher, D., Tilson, J. and Williams, M. (2016). Explanation and Elaboration Paper (E&E) for the Guideline for Reporting Evidence-Based Practice Educational Interventions and Teaching (GREET). University of South Australia.

Phillips, A. C., Lewis, L. K., McEvoy, M. P., Galipeau, J., Glasziou, P., Moher, D., Tilson, J. K. and Williams, M. T. (2016). Development and Validation of the Guideline for Reporting Evidence-Based Practice Educational Interventions and Teaching (GREET). BMC medical education. 16. 1. 237.

Speight, S., Callanan, M., Griggs, J., Farias, J. C. and Fry, A. (2016). Rochdale Research into Practice: Evaluation Report and Executive Summary. Education Endowment Foundation.


The school research lead and being a bit TIDieR - making school inquiries more rigorous and useful

Over the last seven days several research-related articles in the TES have caught my eye.  First, there was Joe Nutt saying ‘Good research is good - but experience is better’, with research often so indigestible as to be of little use to teachers. Then there was an article by Martin George asking whether ‘edtech’ is immune from rigorous research, given that the pace of technological change makes the usual evidence-gathering on effectiveness redundant. Finally, we have Professor Barbara Oakley saying that too many education researchers ‘do not do research that is founded on the scientific method’, resulting in a crisis of replicability.  In other words, when teachers and school leaders wish to use research evidence, the evidence doesn’t exist, or if it does, it’s neither comprehensible nor replicable.

Now it’s fair to say that there are no simple or easy answers to the questions these articles raise.  However, at the level of the school, when teachers report on a disciplined inquiry or some form of collaborative practitioner inquiry, there is something they can do - use a reporting checklist - to improve the quality of reporting and, in doing so, make the research more accessible and useful to both themselves and their colleagues.

One such checklist is the TIDieR (Template for Intervention Description and Replication) checklist - Hoffmann, Glasziou, et al. (2014) - which I’ve adapted to report on a school-based intervention providing one-to-one support for pupils studying GCSE English.

TIDieR Checklist – One-to-one support for pupils studying GCSE English

1.     NAME – Provide the name or the phrase which describes the intervention.

  • Additional one-to-one support for pupils studying GCSE English.

2.     WHY – Describe any rationale, theory or goals of the essential elements of the intervention

  • The provision of additional support may lead to an improvement in individuals’ performance in GCSE English examinations, with more pupils gaining grade 4 or better.

3.     WHO – Describe the participants and how they were selected for the intervention

  • The participants were Y11 pupils in a mixed-sex comprehensive school, where examination results are consistent with national averages and where below-average numbers of pupils receive the pupil premium.

  • Twenty pupils - out of a total of 150 pupils studying GCSE English - were identified by English teachers as being on the grade 4/5 borderline and were asked to attend the activities associated with the intervention.  The twenty pupils included 12 boys and 8 girls.

4.     WHAT - Materials: Describe any physical or informational materials used in the intervention, including those provided to participants or used in intervention delivery or in training of intervention providers. Provide information on where the materials can be accessed (e.g. online appendix, URL).

  • Existing teaching resources were used, with teachers pooling resources.  Additional resources were also created to respond to specific teaching problems as they emerged.  These were also shared.

5.     WHAT - Procedures: Describe each of the procedures, activities and processes used in the intervention, including any enabling or support activities.

  • The night before their scheduled session all pupils involved in the intervention received a text message reminding them of the time and place of their support session.

6.     WHO PROVIDED: For each category of intervention provider – teachers, pastoral support, teaching assistants, etc. – describe their expertise, backgrounds and any specific training given.

  • All five teachers within the English Department provided the one-to-one support to pupils.  No additional training was provided.

7.     HOW: Describe the mode of delivery of the intervention – large group teaching, small group teaching, one to one, online etc.

  • Additional support was provided on a one-to-one basis to individual pupils.

8.     WHERE: Describe the location where it occurred, any necessary infrastructure or other features

  • The sessions were provided each day (Monday – Thursday) after school between 4.00 pm and 4.45 pm, and were held in individual teachers’ base rooms.

9.     WHEN and HOW MUCH – Describe the number of times the intervention was delivered and over what period of time including the number of sessions, their schedule, and their duration

  • Each pupil was scheduled to receive 10 sessions, spread over 12 weeks, commencing in February and ending in the middle of May.  Each session was expected to last 45 minutes.

10.  TAILORING: Was the intervention planned to be personalised or adapted for the needs of a particular group?  If so, describe what, why, when and how.

  • Those pupils allocated to the programme were provided with a personalised programme of work – which was devised after discussion between the pupil, the class teacher and the teacher providing additional support. 

11.  MODIFICATIONS: If the intervention was modified during the course of the study, describe the changes – what, why, how and when.

  • Due to staff absence – one member of staff was absent for the period of the intervention – those sessions were delivered by a teaching assistant.

12.  HOW WELL – Planned: If intervention adherence or fidelity was assessed, describe how, and whether any strategies were used to maintain or improve fidelity.

  • It was hoped that pupils would attend on average 8 sessions.  Where sessions were missed, emails were sent to both the GCSE English teacher and the group tutor asking them to remind pupils to attend future sessions.  Where teachers were not available to take the planned sessions, support was provided by a teaching assistant.

13.  HOW WELL – Actual: If intervention adherence or fidelity was assessed, describe the extent to which the intervention was delivered as planned.

  • The mean number of sessions attended was seven.  Seven pupils (35%) attended ten sessions, six pupils (30%) attended nine sessions, and three pupils attended one session or fewer.  The remaining four pupils attended between six and eight sessions.

14.  OUTCOMES – Actual: What outcomes were obtained?

  • 19 out of 20 pupils gained at least a grade 4 in GCSE English.

15.  DISCUSSION: What has been learnt that is relevant internally and externally?

  • The provision appeared to have made an impact, as in the previous academic year only 50% of a similar group of pupils gained at least a grade 4.

  • Each member of staff involved had to commit around 30 hours of additional time; the intervention was only possible due to their commitment.

  • Other activities which could have taken place in after-school meetings had to be delayed until later in the year.

  • All the staff involved were experienced and effective practitioners – the model may need to be adjusted for a different profile of staff

  • Consideration needs to be given to whether small-group support should be provided for pupils.

Limitations

Of course, when you use a checklist there are drawbacks. Although the checklist will help you to report on the intervention, it still might not capture all the complexity of what has happened. Adopting a checklist may lead to a reduction in creativity in the ways in which teachers report on interventions, and may also be perceived as increasing the workload of teachers.

And finally

There are no easy answers when it comes to addressing some of the problems with using research evidence.  That said, regardless of whether you are someone who is producing research evidence or using it to help bring about improvements for pupils, if you are conscientious, judicious and explicit in your use of evidence, you will not go far wrong.

References

Hoffmann, T. C., Glasziou, P. P., Boutron, I., Milne, R., Perera, R., Moher, D., Altman, D. G., Barbour, V., Macdonald, H., Johnston, M., Lamb, S. E., Dixon-Woods, M., McCulloch, P., Wyatt, J. C., Chan, A.-W. and Michie, S. (2014). Better Reporting of Interventions: Template for Intervention Description and Replication (TIDieR) Checklist and Guide. BMJ: British Medical Journal. 348.

Humphrey, N., Lendrum, A., Ashworth, E., Frearson, K., Buck, R. and Kerr, K. (2016). Implementation and Process Evaluation (Ipe) for Interventions in Education Settings: A Synthesis of the Literature. Education Endowment Foundation, London.

The school research lead and PDSA cycles - what's the evidence?

One of the challenges in commenting on evidence-based practice, particularly if you make suggestions as to how to tackle a particular issue, is making sure you have ‘research evidence’ sitting behind whatever advice you may be giving.  In my recently published book – Evidence-Based School Leadership: A practical guide – I suggest that when acting on evidence it makes sense to use a succession of Plan-Do-Study-Act (PDSA) cycles.  As such, I was delighted when I came across some research by Tichnor-Wagner et al. (2017) which examines how educators in the US responded to the use of PDSA cycles.  So the rest of the post will:

·      Briefly describe the characteristics of a PDSA cycle.

·      Review Tichnor-Wagner et al.’s research on PDSA cycles.

·      Consider the implications for evidence-based practitioners – of whatever level – within schools

The PDSA Cycle

PDSA cycles have their origins in both quality assurance and improvement science (Deming, 1993; Langley et al., 2009; Bryk et al., 2015).  Put simply, a PDSA cycle is a tool for planning, implementing, refining and improving an intervention or change, and is designed to help you answer three questions:

  • What are we trying to accomplish?

  • How will we know whether the change is an improvement?

  • What changes can we make that will result in improvement? (Langley et al., 2009)

There are four steps which are designed to be carried out repeatedly to help answer new questions as the intervention unfolds and develops:

·      Plan – plan a small intervention, or small test of change, to learn from, making predictions about the outcome of the intervention;

·      Do – execute the plan in practice, implementing the change as planned. Collect data and document problems and unexpected observations. You also begin analysing the data in this stage;

·      Study – complete the analysis of the data and compare it with the predictions and expected outcomes. What are the lessons? Were there any unintended consequences, surprises, successes or failures?

·      Act – reflect and act upon what was learnt in the first three phases.

[Image: the Plan-Do-Study-Act cycle]

This process is repeated as many times as required.

Tichnor-Wagner et al.’s research on PDSA cycles

Tichnor-Wagner et al.’s study drew upon a multi-year research project and involved a comparative case study of innovation design teams in two large, urban school districts engaged in improvement work. In the study, innovation design teams were introduced to PDSA cycles to help them: first, further develop, refine, implement and scale the designed components of an innovation; and second, build the capacity of schools and the districts to engage in continuous improvement for innovations they might implement in the future.

Data was collected through semi-structured interviews with members of the innovation design teams who participated in PDSA training and implementation, surveys of participants after training, and documents and field notes from observations of those trainings.  Data analysis involved a first round of descriptive coding that captured examples of the local context and learning opportunities.  From this first round a subset of codes emerged, which were then used for a second round of coding. Processes were put in place to ensure the inter-rater reliability of the coding undertaken by the researchers (90% reliability).  Subsequently, in-depth memos were produced for each of the school districts.  Additional analysis then examined themes related to the will and capacity of innovation design team members to implement PDSA.

Findings

In both districts, participants’ perceptions of PDSA revealed that school and district practitioners saw value in PDSA. 

The PDSA cycles built on work that participants were already doing, suggesting that PDSA may be an incremental rather than radical change to current school practices.

However, although participants thought PDSA cycles were similar to what they already do, they also felt the activity was disconnected from their daily work.

Although practitioners valued both the PDSA cycles and the innovations they were testing through them, they resisted the specific (researcher-designed) forms they had to fill out for each phase and the scheduling of when the cycles would take place, which caused frustration with PDSA.

There were problems in finding time and expertise for PDSA work, which indicates that additional resources may need to be made available to support innovation and development.

Implications for evidence-based practitioners

Drawing on Tichnor-Wagner et al.’s discussion of the findings, the following may be useful for you to consider when using PDSA cycles.

There is a lot to be said for implementing interventions which practitioners see value in, as this will contribute to their motivation to engage with and use the intervention.  With that in mind, I’d recommend having a look at the work of Michie et al. (2011) and the role of motivation in their Behaviour Change Wheel.

Even though the PDSA cycle, or a version of it, may be familiar to colleagues within your school, don’t assume there is the capacity or capability to make best use of the innovation.  This suggests that you need a sense of colleagues’ current levels of expertise prior to implementation.

If you are trying to get colleagues to use PDSA cycles, try to make sure they replace some other activity and become part of the day-to-day work of the school, rather than something that is ‘bolted on’.  This also applies to anyone providing training on the use of PDSAs.

If you decide to use disciplined inquiry incorporating PDSAs to replace some element of your existing performance management (PM) processes, the problems with PM won’t disappear; in all likelihood they’ll just change.

You might want to give consideration to whether PDSA cycles are applicable to all forms of intervention within your school.  For example, how useful is the PDSA cycle if you are looking to develop higher levels of trust within your school?

And finally 

It’s always worth keeping your eye out for research that makes you think about things you take for granted or think are relatively obvious.  Even relatively simple ideas may need high levels of support to be implemented well.

References

Bryk AS, Gomez LM, Grunow A, et al. (2015) Learning to improve: How America's schools can get better at getting better, Cambridge, MA: Harvard Education Press.

Deming W. (1993) The new economics for industry, education, government. Cambridge, MA: Massachusetts Institute of Technology. Center for Advanced Engineering Study.

Langley GJ, Moen R, Nolan KM, et al. (2009) The improvement guide: a practical approach to enhancing organizational performance, San Francisco: John Wiley & Sons.

Michie S, Van Stralen MM and West R. (2011) The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implementation Science 6: 42.

Tichnor-Wagner A, Wachen J, Cannata M, et al. (2017) Continuous improvement in the public school context: Understanding how educators respond to plan–do–study–act cycles. Journal of educational change 18: 465-494.