Why isn’t Engelmann in our library?

Following the little kerfuffle over the new BPP University PGCE, there has been a bit of discussion about whether or not Theory of Instruction: Principles and Applications, by Engelmann and Carnine, should be available in university libraries for PGCE or other Education students.

It isn’t in my university’s library. I’m open to persuasion that it should be but here is what I think at the moment. I would be interested in any corrections or constructive comments.

Firstly I think it is important to be clear that Direct Instruction (DI, with capitals) refers to specific programmes of scripted lesson sequences. The original DISTAR programmes were designed for EYFS and KS1. More recently, programmes for older primary children and an intervention for struggling readers have been developed. The approach is very specific, to the point where there is a rubric for deciding whether a programme is genuinely Direct Instruction or not. It is worth quickly skimming this just to see its length (133 pp.) and level of detail. However, if you want to understand DI you might be a lot better off looking at this really excellent interactive tutorial for psychology students at Athabasca University.

By contrast, direct instruction (without capitals) tends to refer to any teaching that involves the teacher setting out the expected learning clearly, as opposed to providing learning activities that help the learner to construct their own learning. Typically, this is narrowed further to a situation in which a teacher is engaged in whole-class interactive teaching: laying out clear explanations, asking questions, modelling, and then getting children to practise independently, perhaps initially with scaffolding that is gradually withdrawn.

DI is clearly a good example of direct instruction, but the overwhelming majority of direct instruction is definitely not DI. Use that 133-page rubric to check if you’re not convinced!

Project Follow Through provided pretty conclusive evidence that the DISTAR programmes were very effective for disadvantaged young children in the USA in the ’70s. Arguably that’s a bit dated, but given the strength of the evidence I think it is highly likely that a similar evaluation now would produce a similarly positive result. Is that also evidence in support of direct instruction generally? Yes, I think so, but only weakly. Why? Because it’s not clear which elements of DISTAR made it so effective.

Here are the features of DI:

  • Attention focused on the teacher.
  • Active, cued responding, as a group and individually.
  • Frequent feedback and correction.
  • High pace.
  • Children taught in small groups; grouped by ability.
  • Scripted presentation.
  • Meticulously designed instruction.
  • Faultless communication achieved through application of logic.
  • Sequence of examples and non-examples, leading to generalisation.

It is these last four, and arguably the grouping, that mean DI is not conclusive evidence in favour of direct instruction generally. Firstly, there is other evidence suggesting that expertly designed, scripted lessons may be more effective than lessons planned by individual teachers. Secondly, most direct instruction does not use the logical system of examples and non-examples that is central to DI.

Much better evidence in favour of direct instruction is available. The two main sources I make use of are the ubiquitous Kirschner, Sweller & Clark (2006) (or the lighter professional version of this paper, Clark, Kirschner & Sweller (2012)) and Muijs & Reynolds (2011). The latter is a very good overview of the evidence on effective teaching and learning from two important figures in the Educational Effectiveness Research movement. I should state that Daniel is a colleague and David used to be, but some time on Google will probably establish their credentials and some of their thinking.

But if DI and direct instruction are both effective, why are my students not reading about both? Well, I think that DI is really about curriculum design, and with its incredibly meticulous approach it is at completely the wrong level for anyone planning sequences of lessons as a trainee or early-career teacher, or for a HoD squeezing out a new SoW in their evenings and weekends. There are things that have worked for me in the past that are similar to some features of DI, and I share these as part of the PGCE I work on. There has been work done elsewhere (I’m particularly aware of Kris Boulton) but I haven’t come across anything fully developed, nor have I come across any of the NIFDI programmes in use over here. If I’m missing something then do please let me know. I think there should probably be more comparison of examples and categorisation exercises in teaching, and therefore in ITE, but until I’ve read an accessible overview that my trainee teachers can properly make use of, I don’t think there is very much they can usefully take from reading Theory of Instruction.

There is another argument I’m aware of. As a Russell Group university with one of the leading schools of education in the country, perhaps we should make this book available for students engaging with the traditional/progressive debate; part of that debate is being aware of the way in which DISTAR was sidelined despite the quality of evidence in its favour from Project Follow Through. But Theory of Instruction isn’t going to tell them much about that. Possibly a later publication deals with it well – if anyone thinks there is a book specifically on this topic that beats the journal articles the students can already access, then let me know and I’ll pass on the recommendation to colleagues whose modules head in that direction. However, as far as the PGCE goes, I’m going to upset some people by saying that I don’t think the progressive/traditional debate is very useful, and I don’t think I’m the first person to suggest that focusing on evidence-informed ideas about effective teaching is more helpful to trainee teachers than getting into the sociology of education. I therefore intend to stick with direct instruction, and will park Direct Instruction until I am convinced of its utility: convince me!

References:

Clark R., Kirschner P. & Sweller J. (2012) Putting Students on the Path to Learning: The case for fully guided instruction. American Educator, Spring 2012

Engelmann S. and Carnine D. (1991) Theory of Instruction: Principles and Applications. Eugene, OR: ADI Press

Kirschner P., Sweller J. and Clark R. (2006) Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching. Educational Psychologist. 41(2) pp. 75-86

Muijs D. and Reynolds D. (2011) Effective Teaching: Evidence and Practice. London: Sage Publications Ltd.


NKOTB – the BPP PGCE

The announcement of the new BPP PGCE in the TES yesterday generated a lot of twittering, most of which struck me as knee-jerk in nature, unduly negative, and rather uninformed. Now, given the nature of Robert Peal’s polemic against the educational establishment – which ended by describing all those people who (whether right or wrong) had dedicated their working lives to trying to provide a good education for our children as a persistent national embarrassment – I’m not overly bothered that the criticisms of him (that he only trained 6 years ago, hasn’t even been in the classroom for all of that time, and therefore doesn’t have the necessary experience to run the course) are a bit personal. However, I do think trying to tar the subject tutor team with the same brush is unfair, since some of them can count their teaching experience in decades and all those I know anything about have clearly been effective teachers and have thought long and hard about teaching within their subject specialisms.

However, my main issue with a lot of the reaction is the misunderstanding of the nature of the PGCE in relation to QTS and the role of the university and the SCITT. I can forgive anyone for not being able to understand the current complexity of ITE in England – I regularly fail to explain it clearly to prospective trainee teachers myself – but a lot of the negative commentary and questioning clearly assumes that the BPP PGCE is all the training that is provided and I very much doubt that’s accurate.

Now, I don’t know exactly how this new PGCE integrates with the Pimlico-London SCITT so instead of speculating, this is what I do know, based on working with various SCITTs and HEIs.

A SCITT is an accredited ITT provider, based around a school or group of schools and approved by the NCTL to recruit trainee teachers and recommend the award of QTS. As such, it is their responsibility to ensure that all the Teachers’ Standards are met, that trainee teachers get suitable training (including things like professional expectations, subject knowledge, planning, behaviour training, understanding of SEND issues, assessment and so on) and good support, and that the quality of NQTs at the end of the course is appropriate. Ofsted’s view is that all NQTs should be exceeding the Teachers’ Standards at the point QTS is awarded, and if Ofsted inspect and conclude that the training has omissions or the NQTs are not good enough, then a Grade 3 (or 4) effectively shuts them down.

All of this relates only to QTS; a SCITT is not a university and cannot award a PGCE, which is a postgraduate qualification, usually at Level 7 with associated Masters credits. Typically the PGCE consists of two or three assignments totalling 12,000 words. These will usually include relating practice to theory and may include collecting data from lessons but, importantly, the PGCE is an academic qualification and the grading has absolutely no link to the actual quality of the trainee teacher’s classroom practice.

Some SCITTs therefore just do QTS, but most have a contract with a university to provide the additional PGCE. This gives the training more kudos and makes it more portable, e.g. to Australia. How much input the university has depends entirely on the nature of that contract. Here are some examples that I know exist:

  • The university train the SCITT tutors to deliver the PGCE and never work directly with the trainee teachers. The university set the PGCE assignments but the SCITT tutors do the input and mark the work; the university just moderate the marking and make the award.

  • The university provide all the teaching and assessment of the PGCE assignments. This might involve the trainee teachers going to the university, or university tutors coming to the SCITT.

  • The university provide all the teaching and assessment of the PGCE assignments and also some subject-specific training. This is what we do at Southampton. The advantage for the SCITT is just related to economies of scale. A small SCITT may only have 20-30 trainee teachers. Pulling an experienced teacher out of the classroom for every subject, for a dozen days a year, to do subject-specific training, is difficult, whereas a university has enough trainee teachers for there to be someone (like me) in each subject for whom this is the core of a full-time post.

Set against this background, the BPP PGCE model looks perfectly acceptable to me. The university are supplying some training, which is likely to be more subject-specific than the SCITT can provide on their own, and the PGCE assignments, which the SCITT can’t provide at all. The only thing that is a bit unusual is that BPP itself is not an accredited ITE provider and has no track record of, or any intention of, producing education research. I’m not convinced that’s a problem. In some universities, the QTS and PGCE work is done by tutors who have a research element to their contract. In others, like Southampton, there is some overlap but most of the tutors are on teaching-only contracts. I don’t think that reduces the quality of what we do at all.

Finally, PGCE tutors are typically qualified at Masters level, and I think that’s not entirely the case for the BPP PGCE team. On the other hand, the intellectual capability of the team is clearly strong, and it is the university’s job to ensure standards are appropriate, which presumably includes having an external examiner from another university.

So, it’s different in flavour, and it’s sort of starting from scratch. I think the content of ITE generally, and this course in particular, is a legitimate, and very important, area for debate (and helpfully some clear indication of this is already available for English and history). I think knee-jerk responses of “this must be rubbish”, because of who it is and how it is set up, are unhelpful. We train 40,000 teachers each year in this country. In that context BPP is a tiny but potentially interesting innovation. Let’s engage with what they are actually doing (or ignore it, if preferred) and see how it goes.

KS1 SATs: A parent’s perspective

Last night was the final parents’ evening before my son moves from Infant to Junior school. This is a big change for us all; we will be turning right instead of left when we get through the school gate in the morning, and there will be NO AFTERNOON BREAK!!!

Possibly there will be other differences, but it’s the afternoon break my son is focusing on for now.

As a secondary teacher, I have a passing idea of what ‘working towards’, ‘working at’, and ‘working at greater depth’ might mean in relative terms, but nothing more. I have a peripheral awareness of the somewhat traumatic events around the changes to KS1 and KS2 assessment. Beyond that, I’m not so different to other primary school parents (perhaps a bit more forgiving – good grief, some parents expect miracles of perfection for £5k/year) so I thought I would jot down my experience of the end-of-key-stage reporting from the other side of the school gates.

I have to say, the school have been absolutely brilliant about how they have handled the SATs. There was one meeting on a Tuesday afternoon that neither my wife nor I could get to (amazingly, we are sometimes both working – I have very few criticisms of my son’s school, but a bit more awareness that we are not always available to attend meetings, or build castles/sew Robin Hood outfits/bake cakes/construct papier mâché dinosaurs, would be one). In the absence of a helpful video recording or something, I tentatively probed to see if there was any sense of pressure coming on the children; my son had no inkling at all. The closest he could come to understanding what I was asking was the weekly spelling test. The other parents said the same thing. Now, given this is an Infant School, I think that shows exceptional professionalism and possibly low cunning on the part of his teachers. Not only did they avoid putting any pressure on the kids but they seem to have actually managed to do all the assessments without the children even noticing. That is really impressive.

At the parents’ evening we got the results. So, as a parent, what matters to me? Well, the teacher in me is slightly appalled by the parent in me but, hey, suspendisse discimus – I largely don’t care about the detail of what he can and can’t do; I do care how he compares to other children in Y2. That is almost completely the opposite of what I want as a teacher from any assessment process. As a teacher, it doesn’t matter to me where someone is in the pecking order because you can only start from where you’re at. As a teacher I want specific detail that helps me to find the gaps and fix them.

But, as a parent, I’m middle-class enough to think that academic success matters, and have read enough to know that this is actually true, so I want to know my son is doing well enough to be in the top part of the heap. I don’t care whether or not he is right at the top (the local GP’s son seems to have that spot sewn up anyway). It was exactly the same when the children were grouped by achievement for maths and guided reading – I know how much better it is likely to be in the long term to be in the top groups rather than the middle ones, even where there is plenty of movement between groups for those working hard to catch up. I just wanted to know he was doing well, rather than struggling – that there wasn’t a chance that some drawbridge would be raised and he would be left to swim the moat.

So, from that perspective “working at greater depth at the expected standard” is just fine (yesss – get in!), however arbitrary the assessment process. I’m sure as a teacher I would be tearing my hair out, and I really have very little idea of what this standard is, but it tells me he is doing pretty well, and that’s actually all I need to know. I do hope his Y2 and Y3 teachers do a good job of communicating the specifics as part of the school transition but he is only moving to a classroom 50m across the playground.

As a further bonus, we found out at parents’ evening that, although his maths was always going to be ‘at greater depth’, his reading and writing were not as secure. As a consequence he has been in a small group doing some carefully targeted extra work, particularly on writing. In other words, his progress has been of greater value to the school than the progress of other children, and so he has had more than his fair share of attention. As a parent, that’s great; from my professional perspective it doesn’t seem very fair. Also, those are points in the bag now for the Infant School, but it’s going to be tougher to add the expected value for the Junior School (as Education Datalab has pointed out, although that analysis shows additional subtlety). I just hope that in Y6 the teachers are as good at keeping the pressure off the children and he doesn’t get flogged to make some unreasonable amount of progress.

To finish, I have to say his teachers have been absolutely brilliant across all three years, and I am incredibly grateful for the way they have guided him, and his classmates, through the amazing complexity of learning to read and write, and the basics of maths. I am painfully aware of the obtuseness of the English language in a way I never was before (just trying to help him untangle tough, though, through, thought, and thorough, at the moment – good grief!). I am more aware than ever of how critical an extensive vocabulary and cultural literacy have been for my son at school, and what some children miss out on in this respect. I’ve been astonished to see the way successful, professional adults – some holding down jobs paying more than twice as much as mine – can be pushed around by their own children, and I’ve discovered that I like KS1 kids a lot!

I hope Junior School is going to be just as good.

And you should read Tom Sherrington’s blog on assessment standards and the bell curve – same idea, I think, but a different perspective.

Practical: The Fallacy of Induction

This is the third and final post in a series about the value of practical work in science. In the first post I suggested that science trainee teachers (and possibly some qualified teachers too) have a tendency to make assumptions about the value, and the learning, associated with practical work in science. In the second post I illustrated this with an example and briefly tackled two questions I think are important: whether or not children enjoying practical work is sufficient justification, and whether or not just doing practical work will make them better at it. I left a third question hanging and ask it again now.

Do children learn important science ideas and/or develop their understanding from seeing the theory ‘in the flesh’? Often trainee teachers think that this is self-evident. I’m not convinced.

Some of the most useful work on children’s ideas and misconceptions in science was completed by Rosalind Driver and colleagues in the 1990s. I think this is an essential resource for all science teachers because of the evidence that knowledge of children’s misconceptions is an important distinguishing feature between more and less effective teachers. Some may find elements of the suggestions for classroom practice overly constructivist but for me, as well as the identification of a whole range of misconceptions, the other really useful idea I have taken from this work is the ‘fallacy of induction’.

The fallacy of induction is the mistaken belief that children, when presented with relevant evidence, for example from practical work, will tend to work out (induce) the appropriate scientific theory.

The problem is that correct scientific theories are often simple when you know them, but are tremendously hard to generate directly from evidence. After all, it took a lot of very skilled scientific thinkers hundreds of years to do this the first time. What’s worse, children inevitably develop naïve theories as they grow up, so in secondary school they are sometimes not just trying to learn correct scientific thinking but are trying to un-learn naïve thinking that serves them perfectly well outside the classroom.

As teachers, we can of course select learning experiences, including practical work, that provide far more scaffolding and direction than Galileo, Copernicus, Newton, Darwin, Lavoisier, Faraday or Wegener were working with but, however well we do this, I think that induction from practical work, or other activities, is doomed to failure.

For conceptually straightforward science, where there are no misconceptions to overcome, we as science teachers can clearly see how the correct scientific principle follows from the practical observations, and it is very easy to assume this will be apparent to the learners too. For the teacher, the scientific principle already exists as a complete and correct schema (like a mind map) in their long-term memory, and they know which features of the practical are relevant, so making this match is relatively easy. For the learner this is not the case. They just don’t have enough of the necessary knowledge chunked in long-term memory to manage the cognitive load – they can’t see the wood for the trees. Like many cognitive load problems, it may be possible to scaffold or adapt the activity sufficiently to allow children to see the wood, but you have to question whether a forest is the right starting place, or whether a nice piece of rough-sawn timber from B&Q might be a better option.

Where there are misconceptions, Driver and others have suggested that cognitive conflict, created by exposure to direct evidence that the existing ideas are untenable, will help to resolve the problem. That was certainly my thinking for many years. It seems obvious that, when presented with evidence that is in conflict with their misconceptions, learners will tend to respond by correcting their ideas (their mental representations or schemas). What actually seems to happen a lot of the time is that they ignore, fail to focus on, or distort the evidence, so that their naïve theory survives and may even be reinforced. This explains why so many intelligent people stuck with Aristotle’s ideas about force and motion for a thousand years despite blatant evidence to the contrary.

The ideas of Daniel Kahneman and others help to explain why people have an overwhelming tendency to respond in this way. David Didau in his #WrongBook is also very good on the reasons why our response to contradictory evidence tends to be irrational.

My personal experience is that I have eventually learned the situations where my quick thinking will be wrong and I need to over-write it with the correct scientific idea. For something like weight and mass I can pretty much do this automatically, but with something more taxing – like the tendency to focus on speed and mistakenly reason as if force and acceleration behave in the same way – the best I can do is stop myself and know that I need to think very hard and apply Newton’s Laws with great care.

I don’t think typical practical work ever produces enough clarity in either the results or the conclusions to even begin to address these stubborn misconceptions. I love asking hinge questions, like the Veritasium videos, that throw up misconceptions, but the next step is to tackle the problem head on. I don’t think there are many situations where children can discover scientific principles directly through practical work and I think it even less likely that misconceptions can be effectively challenged and addressed.

So, what role does that leave for practical work in teaching science? I think, if you’ve read this far, you might be thinking there isn’t much practical work in my science teaching, and that perhaps the children taught by my @SotonEd trainee teachers aren’t getting much either, and what little they are getting is restricted to training in purely practical skills – accurate measuring and manipulation of equipment. Not so! For me, practical work is terrific for the stage beyond basic theoretical knowledge, for three reasons:

Firstly, science is stuffed with abstract concepts and there is good evidence that concrete representations help children to understand these abstract concepts. I think sometimes physical models are more useful, but practical work can often play this role. For example, you can find a good, clear explanation (with diagrams and perhaps photographs) of chromatography in any textbook, but I think the actual physical process of separating out ‘black’ ink colours makes a big difference to children’s grasp of what this really looks like, and the time scale – that painfully slow diffusion – over which it happens.

Secondly, when new knowledge is acquired it will be very fixed to the original context. Deeper understanding comes from making this knowledge more flexible and filtering out the key points from the peripheral detail. Practical work provides an excellent additional level of complexity through which the scientific principle can be seen. Another way to think of this is that children often need to encounter the same idea in several different ways before it sticks; again, a practical can provide this.

Finally, there is something joyful about seeing abstract theory writ large (or often actually quite small) in the fabric of the universe. Science differs from other subjects because it is humankind’s ultimate attempt to describe, and perhaps even understand, the physical world around us. As science teachers, we need to be careful not to think that children see practical work the way we do, but if we ever lose the joy then it’s time to do something else.

Driver R. (1994). The fallacy of induction in science teaching. Chapter 3 in Levinson, R. ed. (1994) Teaching Science. London: Routledge

Nuthall G. (2007) The Hidden Lives of Learners. Wellington: NZCER Press

Pashler H., Bain P.M., Bottge B.A., Graesser A., Koedinger K., McDaniel M. and Metcalfe J. (2007) Organizing Instruction and Study to Improve Student Learning: IES Practice Guide. Washington, DC: National Center for Education Research, Institute of Education Sciences, U.S. Department of Education

Sadler P.M. and Sonnert G. (2016) Understanding Misconceptions: Teaching and Learning in Middle School Physical Science. American Educator. 2016 (Spring)

Shtulman A. and Valcarcel J. (2012) Scientific knowledge suppresses but does not supplant earlier intuitions. Cognition. 124(2) pp. 209-215

Thorn C.J., Bissinger K., Thorn S. and Bogner F.X. (2016) “Trees Live on Soil and Sunshine!”: Coexistence of Scientific and Alternative Conception of Tree Assimilation. PLoS ONE. 11(1)

Willingham D. (2002) Inflexible Knowledge: The First Step to Expertise. American Educator. 2002 (Winter)

Practical: Grinding Frustration

I’ve seen a lot of trainee teachers knacker lessons up with a well run, but ultimately pointless, practical. Whole-class practicals, in particular, are massively time-consuming with many filling an entire lesson. If all that has been learned in that time is “It went blue, sir” then I don’t think that’s good enough.

The problem is usually a confusion over learning objectives. My previous blog set out the way I see learning objectives in relation to practical work, but I’ll recount an example. I went to visit a promising trainee teacher a few years ago; the lesson was part of a unit on separation techniques, I think, and certainly part of a sequence on chromatography. When I looked at the lesson plan and saw that it was mainly going to be chromatography of pigments from leaves, and that the learning objective was “To separate leaf pigments by chromatography”, I tried to help by asking her what she actually wanted the children to learn. I just couldn’t get past “I want them to separate the yellow pigment from the green chlorophyll”. It’s not fair to pull the rug just before an observation so I let it go and waited to see. Bless those lovely Y8s; they chopped and crushed and ground their little hearts out. They followed the instructions as well as they could, set up their chromatography paper (several submerged the spot in the propanone), and then did a little write-up whilst they waited for the chromatograms to be ready. Some got a bit of green and yellow differentiation and the rest didn’t. Whilst they were working I went round and asked a few questions, such as “Can you tell me why you’re doing this?” and “What’s the point of chromatography?” I didn’t even get half answers, just a pretty much universal “Don’t know”.

In the feedback session I didn’t get any further, really. The trainee teacher was very disappointed with the lesson. She carefully evaluated the quality of the practical work and made some perceptive comments about maybe splitting the practical into sections and briefing more closely to ensure the leaves were finely chopped, the amount of propanone was reduced, and the papers were set up correctly. But she completely and stubbornly failed to identify the problem, which was that her ‘learning objective’ wasn’t about learning at all; it was about getting the practical to work. Had the chromatograms come out well, she would have been satisfied with the lesson. Even when I directly asked the questions “Did the children understand the process?” and “Did the children understand what chromatography was?”, and pointed out that they had been unable to tell me anything about these things, she couldn’t really see that this was a much bigger problem than the poor results.

There are plenty more examples where that came from. Some worked nicely as expected. Some didn’t. All suffered irrevocably from a sense at the planning stage that the practical somehow justified its own existence just by being practical. Often, I find a defensiveness about practical work that I don’t see when pointing out other misaligned learning objectives. That sense that practical work is self-justifying can be difficult to change. Why is this so difficult?

In the end this boils down to the questions of whether or not (a) children enjoying practical work is sufficient justification, (b) just doing practical will make them better at it, and (c) children will learn important science ideas and/or develop their understanding from seeing the theory ‘in the flesh’. Often I think trainee teachers think, perhaps sub-consciously, that some or all of these are self-evident. I’ll tackle (a) and (b) here and leave (c) to another blog.

For me, enjoyment can’t ever be an end in itself for what happens in science lessons; that just reflects my personal belief in what school is for – no evidence presented. On the other hand, if enjoyment leads to better learning, higher motivation, more time doing science outside lessons, improved post-16 take-up, and so on, then the judgement is maybe about balance between enjoyment and learning. I don’t have the expertise to offer a definitive review of the evidence but I’ve certainly been influenced by Abrahams (2009) Does Practical Work Really Motivate? and I’m not convinced practical work is as critical to motivation as is often assumed. The ASPIRES final report makes a brief reference to reduced interest in science after Y9, which might or might not correlate with reduced practical work; personally I think it is the GCSE curriculum content, and looming exams, that are to blame, but I can’t offer more than a hunch.

Is it good teaching to explain how to do something tricky and complicated, and then get the children to try lots of examples with very general (that one’s good, that one’s bad) feedback? No, of course not. So why would practical skills be any different? Most of us have had years and years of experience through school, university, and maybe in the classroom, to hone our practical skills. Many of us have probably also taken things to bits and rebuilt them, and developed fine motor and problem-solving skills through art and craft and cooking and all sorts. We tend to massively underestimate how difficult it is to extract chlorophyll from leaves, prepare a cheek cell slide, or connect up lamps in parallel. The cognitive load of these things, for children, is very high. In the lesson described above, the instruction sheet and the teacher were both clear about the level of the propanone on the chromatography paper, but at least a third of the class submerged the spot. There was just too much new information for them. These things need breaking down, step by step, with practice or at least immediate feedback at each stage. Without this, children just get used to practicals not working half the time (and working more often for the ‘smart’ kids and more rarely for the others) and accept this is the way of the world. Sometimes there is value in unexpected results, but not if a shrug of the shoulders is the typical response. If we are trying to teach practical skills then we need to plan carefully for those skills, and get precise and accurate work from the children.

Which takes me back to that chromatography lesson. I would have been very happy if the learning objective had been something like “To improve practical skills: use of mortar and pestle to extract plant material; setting up chromatograms; precise working” and then the trainee teacher’s reflection would have been at least a useful starting point. That was an aspect of the intention, but actually, if I’m being generous and assuming the practical wasn’t just picked because it was on the SoW, the stronger intention was something vague about understanding chromatography better by doing a practical example. Failure to separate learning practical skills from developing understanding is a big problem but this idea that doing a practical will improve understanding is, I think, the worst mistake.

Next blog coming up…

Practical: Young people’s views on science education from the Wellcome Trust Science Education Tracker

This morning the TES published a confusing article on key findings from the Wellcome Trust Science Education Tracker. This is a survey of over 4,000 young people in Y10-Y13, asking about their views on their science education and careers. The TES don’t even seem to have managed a link, but the tracker, including a breakdown of the questions and responses, is at https://wellcome.ac.uk/what-we-do/our-work/young-peoples-views-science-education

Hopefully readers from the science education community will have quickly got past the ‘hands thrown up in horror’ headline and be asking whether the survey tells us anything useful about the quantity or quality of practical work in schools and colleges. Actually, there are 144 questions and only 3 are about practical work. There is a mine of useful data here for questions around post-16 STEM participation, science capital, and availability of and participation in triple GCSE, which has been a problematic issue, but that’s probably best seen through the lens of the ASPIRES2 work. Hopefully they’ll blog about the survey results at some point.

I’ve only had a quick look but these are my first impressions of the 3 questions (T66-T68) directly asking about practical work.

Firstly, some caution is always required when dealing with self-report measures, and also with the way the responses are reported. For example (T66), young people might well have different views on what constitutes “Designing and carrying out an experiment / investigation” and “A practical project lasting more than one lesson”, but I can’t see how any of last year’s Y10 or Y11 could not have completed an ISA across multiple lessons. The responses to these two questions were about 75% and 55% respectively, with 10% responding “None of these”. What were the 10% doing? Did at least 35% squeeze an ISA into one lesson, or do their ISAs in only one of the two KS4 years? How many didn’t think an ISA was an investigation (justifiably, perhaps)? My take on this is that we need the responses to the same question for the current Y10 to see the impact of the new GCSEs, otherwise we are discussing history, but I’m not convinced about the merits of practical projects and multiple-lesson investigations anyway.
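To spell out the arithmetic behind that ‘at least 35%’ (my own inference from the reported figures, not a breakdown the survey provides): the 10% who answered “None of these” cannot overlap with the 55% who reported a practical project lasting more than one lesson, so at least

100% − 55% − 10% = 35%

of respondents did some of the listed practical work but nothing spanning more than one lesson – which is hard to square with everyone having done an ISA properly, across multiple lessons.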

Secondly, it’s important to interpret the findings critically. About 1/3 were happy with the amount of practical work and nearly 2/3 would have liked more. As pointed out in @alomshaha’s excellent video, this might be because practical is an easy option, not because it is the best way to improve learning. Even children have a keen awareness of this issue; in the Student Review of the Science Curriculum (Murray & Reiss 2003), about 70% had “Doing an experiment in class” in the top 3 most enjoyable activities (along with watching a video and going on a trip) but only about 40% thought it top 3 for “Most useful and effective activities”.

However, there is one thing in the data we ought to be thinking about. These are the figures for “When doing practical work, how often would you say that you just followed the instructions without understanding the purpose of the work?”

[Chart: breakdown of responses to this question]

That suggests this statement is true for maybe 1/3 of practicals; this concurs with a lot of practice I see out in schools (from trainee teachers, mostly, but I have a suspicion it’s quite widespread). I think this is a problem.

It’s really good to see a School Science Review article by Millar & Abrahams (2009) here on the AQA website. This is a summary of a significant bit of work they, and some others, did looking at the effectiveness of practical work. Essentially, the problem they identify is confusion over learning objectives. Just like all lesson planning, the objectives need to drive the activities, not the other way round. Whole-class practicals form such a big and obvious chunk of a lesson that it’s really easy to start planning from the activity. The trouble is that you then lose sight of the wood for the trees, so that a successful practical outcome becomes the real objective – the one you focus on – although the lesson actually has an objective related to knowledge and application of science content. You then emphasise the procedure and just hope the children understand how it relates to the science content. And the children then just follow your instructions (hence the survey response) and, as Millar and Abrahams put it, the emphasis becomes “producing the phenomenon”.

Millar and Abrahams go on to suggest there are three broad categories of learning objectives that are served by practical work and, based on a related article, I’ve broken these down further. I find this really helpful in getting a clearer focus on what purpose the practical serves in the lesson, and therefore on the best way to approach it.

[Table: the three broad categories of learning objectives for practical work, broken down further]

If conceptual understanding is what you want, then the children need to spend time thinking about the practical in relation to the relevant content. There are maybe three options here:

  • Whole-class practical with lots of time afterwards to work on how the practical demonstrates the content.
  • Whole-class practical with a very high level of practical competence, so children have the capacity to think about the content.
  • Demo or video (maybe simulation), so children don’t have to think about manipulating equipment and the teacher can direct their attention with questions and explanations.

The second of these could come from prior learning, but could also be a result of very careful briefing. This is, I think, what @oliviaparisdyer is describing in her blog post about practical work. It is certainly how I remember my excellent O-Grade Chemistry teacher doing it, several decades back into the last century.

If investigative skills are what you want, then don’t try to teach conceptual understanding at the same time, and remember that, as science graduates, we tend to massively underestimate the complexity of designing and conducting a full investigation. That’s why the ISAs were such an unpleasant exercise in trying to temporarily get children to remember enough to hit whatever ridiculous coursework target grade they had. I’ve had the good fortune to work with A-Level students on some terrific independent projects (for A-Level Physics and EPQ) but even post-16 they are barely ready for high-quality work. In my experience, either very high levels of scaffolding, or acceptance of interesting but very rickety work, are needed for 11-16 classes, though that may not be true for all teachers.

Finally, if practical skills are what you want, then again you need to focus on them. Something like the reaction of copper(II) oxide with sulfuric acid, then filtering and evaporating to get copper(II) sulfate, involves a stack of excellent practical skills to do well. This would be a great practical for improving these skills; I think it’s a massive waste of time for learning the chemistry of metal oxide + acid reactions. By all means combine the two, so do the practical in that unit, and start or finish with the chemistry, but don’t expect the children to learn anything about the chemistry content whilst trying not to scald or gas themselves or – more hopefully – produce nice blue crystals.

This blog is already a bit long; in the next post I’ll try to use an example to explore these ideas about confused objectives a bit further, and then I’ll try to write another post on why children don’t automatically develop understanding from seeing a scientific principle ‘in the flesh’, and about Driver’s excellent fallacy of induction.

Linking ITT and workforce data: a step in the right direction

I had the great pleasure of meeting Becky Allen back at the beginning of the year for a bit of a discussion about the work Education Datalab were doing on matching teacher training records to the School Workforce Census. I suspect a pretty monumental amount of effort has gone into nailing down the final details since then but two of the three linked reports are now published. I suggest you start here to either have a quick look at the key findings, or to access the full reports. So far I’ve just read the NCTL one.

It is immediately apparent that this is something the DfE ought to have done years ago. There is a lot of talk of evidence-based policy-making, but any kind of genuine commitment to such a thing would have seen this sort of data analysis set up prior to the seismic changes to ITT that have been implemented since 2010. Hey-ho; better late than never.

In theory this methodology could be used for a much longer-term project that might start generating some really useful data on the impact of various approaches to training teachers. It is easy to pick up this work and think it is limited to evaluating structural issues about ITT routes but if you consider the richness of a data set that can pretty much link every teacher in the maintained sector back to their ITT experiences, there is almost unlimited potential. Inevitably, for ITT providers, there is a pretty steady (and self-selecting) drift out of contact over the years after qualification. This work potentially solves that problem for research on any aspect of ‘what works’ in ITT. That’s something for the future; what of the findings here?

It would be tremendously easy for a lot of people in ITE to say “I told you so” in regard to the Teach First retention figures. Actually, I think the useful questions are more subtle than that, but figures first. Using the lower-bound numbers, traditional HEI-led routes have about 60% of those initially recruited working as teachers in the maintained sector in their third year after qualifying. SCITTs are higher at 70% (but these would have been the early adopters). School Direct hasn’t been running long enough to have figures. Teach First is under 50%.

[Graph: Education Datalab retention rates by ITT route]

However, there are several things to remember about Teach First. Their qualifying year involves teaching potentially difficult classes, mostly in schools with more challenging behaviour, with variable levels of in-school/in-class support, whereas university-led trainee teachers are supernumerary, on lower timetables, and working in a wider range of schools – rarely those in a category or at Grade 3. Teach First participants are also possibly more likely to continue to work in more challenging schools, although that is an assumption I would want to see data on, because certainly some participants move from TF schools to schools at the opposite end of the socio-economic spectrum.

There are also a few things to remember about HEI-led courses. Financial survival, and the need to make up the numbers across all the shortage subjects, probably mean that in these subjects the HEI-led cohort has a longer tail than for any other route. SCITTs may have some of these pressures too but, particularly in the years covered by this report, are likely to have had the opportunity to be more selective. I suspect it’s the other way round for subjects like PE, English and history, where the larger scale of HEIs generates a larger pool of applicants compared to SCITTs. Since shortage subjects make up the bulk of an HEI cohort, you would expect a lower qualification rate, and also some marginal Grade 2s where support (or lack of it) in their employing school might determine success in their NQT year. As pointed out right at the beginning, the report can’t tell us anything about what would happen to the same trainee teachers if they were trained via a different route.

Teach First recruitment has been astonishingly successful. Having seen the marketing machine in action, and with access to funding that very few providers can match, that is perhaps not completely surprising, but it has been terrific nonetheless. This means they probably have the strongest cohort of all at the start of training. For me, the critical question to ask is: if Teach First training were more like the HEI-led route, or a SCITT, would there be hundreds more high-quality teachers still in the classroom? There is no way to tell from this report but, anecdotally, the Teach First participants I have worked with would all have had excellent outcomes on the HEI-led course or School Direct programmes I mainly work on. What I don’t know is whether they would have gone into teacher training at all.

If Teach First is mainly putting people who would never have tried teaching into struggling schools with teacher recruitment problems, to do a decent job for two or three years, then that is probably a justifiable use of public money; if they are putting potentially high quality, long-career teachers through training in a way that knocks an additional 10-20% off retention, that doesn’t look so good. I suppose there might be other benefits; I’m unconvinced by these but make up your own mind. Sam Freedman sets out the most positive case here.

What about the other findings?

  • Three regions of England – North East, North West and South West – appear to have large numbers of new qualified teachers who do not join a state-sector school immediately after achieving QTS.
    • This is pretty good evidence that the NCTL need to sort out the Teacher Supply Model, but that was already very apparent. We are waiting on tenterhooks for the announcement on allocation methodology (so presumably they are desperately trying to invent something at the moment; let’s hope they don’t make another almighty cock-up!).
  • Those studying on undergraduate with QTS courses have low initial retention rates in the profession, though we cannot know whether this results from subsequent choices made by the individual or recruitment decisions made by schools.
    • They do, but the data also shows they catch up later. I suspect that if you have a B.Ed., sooner or later teaching becomes the best option for a professional career, whereas PGCE-holders have their undergraduate degree as an alternative (depending a bit on subject).
  • Teach First has very high two year retention rates, but thereafter their retention is poorer than other graduate routes.
    • I’m hoping, perhaps in vain, that the move away from QTS might link teacher development across from ITT into the first year(s) of post-qualification employment, and get a bit of the 2-year TF programme effect into other routes.
  • Ethnic minority teacher trainees have very low retention rates.
    • I suspect this is because they are much more likely to have limited experience of the UK education system if educated abroad, and are also more likely to be EAL, both of which, in my experience, can affect classroom relationships. It would be enormously useful to have data that separates UK- and non-UK-educated teachers, and to drill down a bit. In my part of the world, UK-educated BME applicants are thin on the ground, but I don’t notice anything that would lower their retention rate.
  • Individuals who train part-time or who are older have much poorer retention rates, which may simply reflect other family commitments that interfere with continuous employment records.
    • UoS doesn’t do part-time. I have a hunch that retention might actually be better for older trainee teachers on our Science PGCE – they mostly need a proper job to pay mortgages, whereas younger trainees often don’t have that commitment. On the other hand, whilst they are nearly all tremendous people to work with, developing into a good teacher is partly about developing habits that are effective in the classroom, and I think changing habits gets harder as you get older. It’s also a very fast-moving environment when you are a novice, and again I think adapting to this gets harder with age. They are quite often particularly good at developing relationships with teenagers, though, so it’s swings and roundabouts, maybe.

So those are my first thoughts. I think we have some way to go to get stable and effective initial teacher education that is structurally sound and therefore has the potential for continuous improvement. NCTL have tried quite hard to break what we had; now we need to take the best of the many pieces and put them back together again, hopefully ending up with something better than before. High-quality evidence is a key part of this process, as are people in high places who are prepared to pay attention to it. This report is a very important step in the right direction.