Guest Post: Unleashing Great Teaching by David Weston and Bridget Clay


This week's post is a contribution from David Weston and Bridget Clay who are the authors of Unleashing Great Teaching: the secrets to the most effective teacher development, published May 2018 by Routledge. David (@informed_edu) is CEO of the Teacher Development Trust and former Chair of the Department for Education (England) CPD Expert Group. Bridget (@bridge89ec) is Head of Programme for Leading Together at Teach First and formerly Director of School Programmes at the Teacher Development Trust.


Unleashing Great Teaching 

What if we were to put as much effort into developing teachers as we do into developing students? How do we find a way to put the collective expertise of our profession at every teacher’s fingertips? Why can’t we make every school a place where teachers thrive and students succeed? Well, we can, and we wrote Unleashing Great Teaching: the secrets to the most effective teacher development to share what we’ve discovered in five years of working with schools to make it happen.

Quality professional learning needs quality ideas underpinning it. But, by default, we are anything but logical in the way we select the ideas we use. A number of psychological biases and challenges cause us to reject the unfamiliar.

We all have existing mental models which we use to explain and predict. To challenge one of these models is to imply that much of what we have thought and done has been wrong. We all need to guard against this, because it can lead us to reject new ideas and approaches. This is nothing new.
In 1846, a young doctor, Ignaz Semmelweis, suspected that a 16% maternal mortality rate in one clinic might be caused by doctors failing to wash their hands. When he ran an experiment and insisted that doctors wash their hands between each patient, deaths from fever plummeted. However, his findings ran so strongly against established practice and norms that they were not only rejected but widely mocked, despite the strength of the evidence. This reactionary short-sightedness gave rise to the term the Semmelweis Reflex: ‘the reflex-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs or paradigms.’

An idea that contradicts what we already think, which comes from a source we don’t feel aligned to, or which makes us feel uneasy, is highly likely to be rejected for a whole range of reasons, even if there is a huge amount of evidence that it is far better than our current approach.

Reasons for rejection

Confirmation bias is really a description of how our brains work. When we encounter new ideas, we can only make sense of them based on what’s already in our heads, adding or amending existing thinking. This means that anything we encounter that is totally unfamiliar is less likely to stick than something partially familiar. Similarly, an idea that is mostly aligned with our existing thinking is more likely to stick than something completely at odds – the latter is a bit like a weirdly-shaped puzzle-piece: it’s very hard to find a place for it to go.  The effect of all of this is that when we hear an explanation, we remember the ideas that confirm or support our existing thinking and tend to reject or forget the ideas that don’t.

But it’s not just the nature of the ideas that affects our ability to learn. If an existing idea is associated with the memory of lots of effort and hard work, it becomes harder to change. This sunk cost bias means that we excessively value things we’ve worked hard on, whether or not they’re actually any good. This bias is also known as the Ikea Effect – everyone is rather more proud of their flatpack furniture than this cheap and ubiquitous item perhaps deserves, owing to the effort (and anger!) that went into its construction.

We also see a number of social effects that mean we don’t listen to other people’s ideas in a neutral way. The Halo Effect is the way we tend to want to believe ideas from people we like and discount ideas from people we don’t. Of course, none of that bears any relation to whether the ideas are good. Public speakers smile a lot and make us laugh in order to make the audience feel good and thus more likely to believe them. Two politicians of different parties can suggest the exact same idea, but supporters of the red party are much more likely to hate the idea if they hear it from the blue politician, and vice versa. In the same way, a teacher from a very different type of school is much less likely to be believed than someone you can relate to more – regardless of whether their ideas are any good.

If someone does present an idea that conflicts with our current thinking and beliefs, they run the risk of the Fundamental Attribution Error. When we come into conflict with others, we rush to assume that the other person is of bad character. Any driver who cuts you up is assumed to be a terrible driver and a selfish person, but if you cut someone else up and they hoot, you generally get annoyed with them for not letting you in. A speaker or teacher who tells you something you don’t like is easily dismissed as ignorant, annoying or patronising.

Using evidence to support professional learning 

So how do we ensure that we’re using quality ideas to underpin professional learning? In our book we lay out some tools to help you overcome your inevitable biases.

Firstly, it’s very useful to look out for systematic reviews. These are research papers where academics have carefully scoured the world’s literature for anything relevant to a topic, then categorised it by type, size and quality of study, putting more weight on findings from bigger, higher-quality studies and less on smaller, poorly-executed research. They bring all of the ideas together, summarising what we appear to know with confidence, what is more tentative, and where the evidence is conflicting or simply lacking.

If you are interested in a topic, such as ‘behaviour management’ or ‘reading instruction’, it’s a really good idea to tap it into a search engine and add the words ‘systematic review’. Look for any reviews conducted in this area to get a much more balanced view of what is known.

Secondly, raise a big red flag when you can feel yourself getting excited and enthusiastic about an idea. That’s your cue to be extra careful about confirmation bias and to actively seek out opposing views. It’s very helpful to take any new idea and tap it into a search engine with the word ‘criticism’ after it – e.g. ‘reading recovery criticism’ or ‘knowledge curriculum criticism’.

Thirdly, be a little more cautious when people cite lists of single studies to prove their point. You don’t know what studies they’ve left out or why they’ve only chosen these. Perhaps there are lots of other studies with a different conclusion – only a good systematic review can find this out.

Finally, be cautious of single-study enthusiasm, where newspapers or bloggers get over-excited about one new study which they claim changes everything. It may well be confirmation bias at work – and if they are rushing to criticise the study, that too could be confirmation bias.

To conclude

Of course, good-quality ideas are only one ingredient. In our book we also explore the design of the professional learning process, offer a new framework for thinking about the outcomes you need in order to evaluate effectively, and discuss the leadership, culture and processes needed to bring the whole thing together. There are many moving parts, but if schools can pay the same attention to teachers’ learning as they do to students’ learning, we can truly transform our schools and unleash the best in teachers.
