A year ago I wrote about the conference that proved to be the beginning of the researchED movement, researchED13. Over the past year Tom Bennett has turned a Twitter discussion into something that has legs.
So we met again in London for a second national conference, researchED14. Many of those who believe in evidence-informed practice gathered together to share ideas. To get some idea of the breadth of discussion, you can download the programme from the shiny new researchED website – which has the catchy address www.workingoutwhatworks.com
I had the opportunity to speak at the conference again this year (thanks, Tom), talking about York Science as a research-informed project that is developing questions and tasks to support the teaching of science at key stage 3. The presentation, Measuring levels or monitoring understanding, is available here.
After last year’s conference I expressed the hope that “before we meet again in 2014 we will have made progress in building the networks.” That has certainly happened – there were two regional researchED conferences earlier this year – one in Birmingham, and one here in York. More are in the pipeline – keep your eye on the researchED website for dates over the coming months.
And now we need to move beyond scepticism to sharing how people are becoming involved in research in their schools and using the good research evidence that is out there. There was some of that at this conference; I hope there will be more in a year’s time.
There should be – researchED is helping to make it happen, including by supporting a network of Research Champions: teachers who have taken on the role of Director of Research in their schools. That these roles are beginning to exist is a sign that many head teachers and governors recognise that research does matter. I look forward to hearing from these Research Champions at researchED15.
For more from the conference keep an eye on the website, where there are videos of some of the talks, with more being added over the next week or so.
At the recent NTEN ResearchED meeting at York (3 May 2014), I attended a very interesting talk entitled ‘The Learning Brain: a new science of learning’ given by Jonathan Sharples of the Institute for Effective Education and the Education Endowment Foundation. One aspect of his talk really caught my interest with the York Science project in mind. Jonathan presented some research on the effect of repeated testing on learning. Four groups of participants had to learn 40 words in Swahili; none had any prior knowledge of the language. Initially, all students studied the list of Swahili–English word pairs (e.g. mashua – boat) and were then tested (e.g. mashua – ?). However, once a word pair was recalled correctly, each group treated it differently, as follows:
- continued to be studied and tested throughout
- studied but no further testing
- no further study but still tested
- no further study or testing of the words.
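The four conditions can be summarised in a short sketch. This is purely illustrative and is not the researchers’ code – the group numbers follow the list above, but the function and dictionary names are ours:

```python
# Illustrative sketch only: what happens to a Swahili-English word pair
# in each group once it has been recalled correctly.
# Group numbers follow the list above.

CONDITIONS = {
    1: {"study": True,  "test": True},   # continued study and testing
    2: {"study": True,  "test": False},  # further study, no further testing
    3: {"study": False, "test": True},   # no further study, still tested
    4: {"study": False, "test": False},  # dropped from both
}

def later_trial_activities(group, recalled_correctly):
    """Return the activities a word pair still receives on later trials."""
    if not recalled_correctly:
        # Before the first correct recall, every group studies and tests.
        return {"study": True, "test": True}
    return CONDITIONS[group]
```

Group 3 is the one to watch: after the first correct recall these words get no further study at all, only repeated testing.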
The results for the four groups are shown in the Figure below. Before reading on, decide which of the results you think matches each of the four groups.
Result A is group 1 (continued study and testing), B is group 3 (no further study, testing only), C is group 2 (further study, no testing) and D is group 4 (no further study or testing). The results provide evidence that testing does not merely measure learning but contributes to the process of learning, with repeated testing enhancing it. Jonathan suggested that, in addition to Assessment for Learning and Assessment of Learning, maybe we should be thinking about Assessment as Learning: the repeated recruitment of neural pathways strengthens the retrieval networks involved in the learning. Could it be that the formative assessment tasks of York Science, designed to allow teachers to obtain evidence of learning, also provide an additional learning opportunity for students?
Read more in: Jeffrey D. Karpicke and Henry L. Roediger (2008) The Critical Importance of Retrieval for Learning. Science 319, 966–968.
The paper can be downloaded from: http://learninglab.psych.purdue.edu/publications/
Anne Scott is a member of the York Science team.
It was researchED York 2014 on Saturday, a collaboration between the researchED team and NTEN. I was at Huntington School with 300 other educators who had given up their Bank Holiday Saturday to think and talk about some of the things that matter to them in the world of education – in particular, how we can use research to improve teaching and learning. The buzz was just as good as it had been at the first researchED last September. You can read more about the excitement of the day from Tom Bennett and Alex Quigley. This blog post covers the material I talked about on Saturday.
Measuring levels or monitoring understanding?
There is anxiety amongst some teachers and senior managers about what to do in the absence of levels. This article looks briefly at some of the reasons why levels were not fit for purpose – and at some better ways of checking progress in science learning.
What are levels?
Levels of attainment were introduced into the National Curriculum nearly 30 years ago with a report from the National Curriculum Task Group on Assessment and Testing – generally known as the TGAT report. There, levels were described in this way:
the scaling system should be directly related to the development of pupils’ competences as described by the attainment targets of the national curriculum. … We shall use the word level to define one of a sequence of points on a scale to be used in describing the progress of attainment in the profile component.
How are levels used?
Formatively - to provide information about a student’s learning that contributes directly to the learning process by providing feedback to the learner.
Summatively - to describe a student’s attainment at a point in time (e.g. the beginning of the lesson, the end of the lesson, the end of a term, or year).
For accountability - to provide evidence of the effectiveness of teachers and schools.
These are all important purposes for assessment, but is describing a student by a level the best way to do all these things? In his excellent book Testing Times: The Uses and Abuses of Assessment (2008), Gordon Stobart warns that
in accountability cultures with frequent high-stakes testing, making headway with formative assessment will be more difficult (p159)
If levels are to be used for reporting, how easy is it to use the same assessment formatively?
But there is another concern too. I have often heard people say something like, “well, we know levels are not perfect, but at least we know where we are with them”. But do you? In a useful report from NFER, Where have all the levels gone? (Brill & Twist, 2013), the authors state: “There are well documented discussions about whether it is possible to make reliable and consistent judgements confirming that a pupil is working at a certain level.” So are the judgements that teachers have to make, and that management use as an accountability measure, valid? I suggest not.
But what to do?
York Science does not have the answer to how schools should monitor overall progress of a cohort, or report to parents on the progress of individuals, but we can help with monitoring progress in understanding within a lesson.
What do we need to do to describe progress?
Let’s begin by looking at what does not describe progress.
The two paragraphs above are both about the particle model of matter. One is from the new Key Stage 3 Programme of Study. The other is from the new GCSE criteria for GCSEs in Science and Chemistry from 2016.
Can you tell which is which?
So statements like this are not enough; a teacher would want to know what a student needs to do to show learning. The programme of study gives no information about that – the statement is set out just as it appears here. The GCSE criteria say that specifications should require students to: “recall and explain the main features of the particle model in terms of the states of matter and change of state, distinguishing between physical and chemical changes”.
Teachers will want to know what the questions will look like by the time students take their exams in 2018. They will have to wait until later this year or early next year for that: the awarding organisations are working on it right now. In the meantime, those students (currently in year 7) are already learning about the particle model. What sort of things would it be reasonable to expect a key stage 3 student to do to show that they can recall and explain the particle model?
In York Science we begin by stating the Learning Intention, in this case:
- Understand a basic particle model of matter that can explain states of matter and changes of state.
Then we write some Evidence of Learning Statements that describe the sort of things we think a student will be able to do as they develop their understanding of the particle model.
Here they are, in no particular order:
You could sort them into an order that provides some differentiation:
But these could still be outcomes in a KS3 or KS4 scheme of work. What you need are the questions and tasks that will provide evidence of learning at the stage the student has currently reached.
Let’s take one statement from that set:
“Understand a basic particle model of matter that can explain states of matter and changes of state”
You want to know if students understand how the particle model explains that liquids are runny. Here is a question written by Phil Johnson (2011) for his Assessing Students’ Concept of a Substance project at Durham University.
This is a great diagnostic question to use at the start of a lesson – it will identify not only those who get it right, but also provide information about what those who get it wrong are thinking.
So, after some teaching to develop the students’ understanding, the teacher wants to check that they have made progress. Another question is called for (not the same one again – students might remember which answer is correct without understanding why).
Another source of questions that check understanding is the Evidence-based Practice in Science Education project (Millar, R. et al, 2002), which was based here at the University of York. This question comes from that bank:
This is more demanding than the previous question – all the statements are correct, so the student now has to pick out the relevant ones, showing greater understanding than the first question required. For a weaker group of students, the teacher could say how many of the statements to choose.
Evidence of Learning – not evidence of levels
Individual questions and tasks can provide evidence of progress in understanding a particular idea at a granular level. But one question or task cannot provide hard evidence of a student’s overall level of attainment. Generic level descriptors should be reserved for summative judgements – made perhaps once a term, and certainly not during the course of an individual lesson.
Planning for progress in a lesson
And so we come back to the idea of planning a lesson using the backward design approach:
- Think about the Learning Intention – what learning do you want to take place in the lesson?
- Identify the questions and tasks that provide evidence of where students are at the start of the lesson and whether there is progression in their understanding during the lesson.
- Only then think about the teaching activities that will (hopefully) deliver that progression.
Thinking first about how to obtain evidence of progress helps to ensure that teaching activities are purposeful.
The presentation can be downloaded: YS Measuring levels or monitoring understanding.
Brill, F. and Twist, L. (2013) Where have all the levels gone? The importance of a shared understanding of assessment at a time of major policy change. NFER. https://www.nfer.ac.uk/publications/99940/99940.pdf
DES/WO (1988) National Curriculum Task Group on Assessment and Testing: A Report.
Johnson, P. and Tymms, P. (2011) Assessing Students’ Concept of a Substance project, Durham University.
Millar, R. et al. (2002) Evidence-based Practice in Science Education project, University of York.
Stobart, G. (2008) Testing Times: The Uses and Abuses of Assessment. Routledge.
If you attended the Association for Science Education Annual Conference in Birmingham last week, and if you’re anything like me, you probably spent Sunday in a state of zombie-like exhaustion. But hopefully you also – like me – feel that it was well worth the effort: you learned some things you didn’t know before, you made useful contacts, and you’re excited about putting some new ideas into practice.
The mostly dry and sunny weather in Birmingham was a pleasant change from the snow and howling winds of conferences past, and the beautiful University of Birmingham campus was a superb setting for a science education get-together. I joined the York Science project group in September 2013 after a number of years at OCR running the assessment and professional development programmes for GCSE Twenty First Century Science, so this was one of my first opportunities to talk about our resources with teachers.
What Would Your Students Say?
On our stand in the exhibition marquee we debuted our snazzy new York Science flier and challenged visitors to think about how their students would answer some of the York Science evidence of learning items (ELIs).
The bolt in the beaker provoked a lot of animated discussion, showing how a practical and very simple POE (predict-observe-explain) activity can be used to probe students’ (and teachers’!) knowledge – and misunderstandings – about forces. Play-Doh® cell models showed that formative assessment activities can be as fun for the students as they are informative for the teacher.
Many of the visitors to our stand on Friday were PGCE trainees. Although activities such as making cell models have been done in classrooms by many teachers for many years, to the trainees this was new and it was clear from their reactions that the power of using such an engaging activity as a way of testing knowledge was an exciting revelation. Dylan Wiliam (2011) wrote that “sharing high quality questions may be the most significant thing we can do to improve the quality of student learning”, and one of the aims of York Science is to share high-quality formative assessment items with teachers new and old alike.
Developing formative assessment in practice
While the stand enabled us to teach new dogs some old tricks, our workshop on Saturday was all about exploring the potential of effective formative assessment with some more experienced teachers. Run by Professor Robin Millar & Mary Whitehouse, the workshop – entitled ‘Developing formative assessment in practice’ – guided teachers through the principles of ‘backward design’ curriculum development, emphasising the benefits of starting by defining what you want students to be able to do and identifying how you will assess that before deciding what and how to teach to help them do it. During the workshop teachers worked in groups to create formative assessment items of their own using York Science materials as templates. The session was so well attended that we had to move to a larger room to accommodate everybody, and feedback from the attendees was positive and encouraging.
The introductory presentation from the workshop is now available to download, and keep an eye on the Inspired by York Science section of this website where we will publish some of the ‘Evidence of Learning Items’ created by participants.
If you visited our stand or any of our events during the conference please leave a comment below to let us know what you thought, or send us an email at firstname.lastname@example.org. You can also follow York Science, Mary Whitehouse and me on Twitter.
See you at ASE in Reading in 2015, if not before!
If you’d like to meet the York Science team – and other members of the University of York Science Education Group (UYSEG) – then come along to the University of Birmingham in the next few days.
It’s the ASE Annual Conference – an annual jamboree for science educators from across the world. Today (Wednesday) is the International Conference, which is followed by three more days of talks, workshops, and cutting edge science lectures.
I have written before about the ASE, and this conference is its most visible manifestation. Open to all, not just members, it is an opportunity for science teachers to meet each other and to get some of the best-value professional development possible. There is a wide range of talks and workshops to meet every need.
York Science will be presenting two sessions:
Thursday 9th January at 14.00 Getting to grips with lesson planning – we have been using our work in York Science to inform our PGCE Science programme at York. Anne Scott and Mary Whitehouse will be talking about how we have introduced the trainees to the backward design approach to lesson planning. LR8 in the Arts Building
Saturday 11th January at 11.00 Developing formative assessment in practice – I have written about how teachers have been inspired by York Science to take some of the question types and ‘make another one like it’. In this workshop Robin Millar and Mary Whitehouse will present a range of question styles, with suggestions for how to adapt them to other contexts. Participants will develop new assessment items, which we plan to put on this website for everyone to share. Room 12, Muirhead Tower.
UYSEG also has a stand in the exhibition marquee – stand BS18. Come along: Predict, Observe, Explain, collect a sticker for the prize draw, and chat to the team.
If you haven’t booked, it doesn’t matter, just come along for the day – see you there!
You know you are doing something right when teachers tell you that they used one of your resources and now they have adapted it for other purposes.
I wrote a blog post about diagnostic questions and confidence grids, and soon after that @DoctorACook tweeted about how she had used the PLC3.1 Dark room Evidence of Learning Item. Then she made some more confidence grids of her own and wrote about them on this blog: Using diagnostic questions.
As time went on I heard about other people who were using the Dark Room question and making more confidence grids. And I began to think about how we might share the ones people had made.
Other people picked up on my post about Making the best use of examination questions, including @NeedhamL56, who developed it for use in training for the Science Learning Centres. @hrogerson used the examiner reports for OCR Gateway Science to make some questions for her students and wrote about it on her blog.
So people are picking up and running with our suggestions, which was exactly what we hoped would happen when the project began two years ago, and, even better, they are willing to share their items with everyone else.
Take a look at the new Inspired by York Science tab at the top and dip into the resources these lovely people have shared. I hope you will be inspired to make some more and share them too. There are even templates to download to get you on your way.
The Association for Science Education (ASE) celebrated its 50th Anniversary in 2013. For as long as I have been working in science education the ASE has been part of my life – as a young teacher Education in Science was my chief source of news about what was happening in science education and how I could get involved – of course there was no internet in those distant times.
Soon I was taking part in my first curriculum development project, spending my weekends meeting with teachers from across the country to write and trial Science in Society, led by John Lewis of Malvern College. Later, some of us from the Science in Society team went on to work with John Holman and Andrew Hunt on the development of SATIS. The SATIS project began in 1984 as the result of the report from an ASE working party convened to consider how the relationship between science, technology and society could be integrated into 11–16 school science. If you don’t know about SATIS, take a look at the resources, which are now all available in the National STEM Centre eLibrary. Although some may seem somewhat dated, there are many great ideas there for hooking students into science and developing ideas about how science works and ‘thinking scientifically’.
ASE has been publishing resources for teachers for many years, and continues to do so. The ASE bookshop publishes and sells a whole range of books that should be in your department library. There are also journals and magazines for science teachers, including School Science Review, a peer-reviewed journal for secondary science teachers. I must declare an interest here, as I edited SSR for a while, but it really is a great journal. Arriving through the door four times a year and aimed specifically at the classroom teacher, it contains an eclectic mix of ideas for new experiments, book reviews, and longer articles about science teaching and science education research.
And if you are not yet convinced this is an organisation you should join, I must mention the role ASE has in representing all science teachers to the rest of the world. ASE is a key partner with the learned societies in SCORE, which enables the science community and the science education community to speak with one voice to government.
I could go on to list many other ways in which science teachers benefit from the work of the ASE, whether or not they are members – #ASEChat, #ASEteachmeets, the Annual Conference (which will be at the University of Birmingham in January – see the side panel to the right) and so on… But ASE is a membership organisation that largely operates through the goodwill of its members, many of whom put in countless hours to ensure that the classroom voice is heard in discussions at the highest levels. So if you want to continue to benefit from its work, why not join now?
Oh, and the reason I started this paean to the ASE? There is an article about York Science, based on my presentation at the ASE Summer Celebration Conference in June this year, in the latest edition of School Science Review.
You can download the article here Embedding assessment to improve learning. To read about other sessions at the conference you will need to become a member of the ASE.
Edited to add a link to SATIS Revisited mentioned by Nick Swift in the comments.
A few months ago there was a conversation on Twitter about research-informed practice in education. One thing led to another, and on the first weekend in September 500 teachers, researchers and others with an interest in education gathered at researchED 2013 to discuss how we could use research to improve learning in schools. I was invited to speak about how we use research evidence in curriculum development here at York.
There was a real buzz to the conference – after all, these folk had all given up their Saturday to be there. Ben Goldacre started us off with an amusing yet thought-provoking talk about the connections between practitioners and researchers, drawing parallels with the ways in which medical researchers and practitioners share information through formal and informal networks, such as databases of GPs willing to be involved in trials and in-house journal clubs. Of course, we already have our own Science Teaching Journal Club, started by Alby Reid (@alby) and Alom Shaha (@alomshaha). The club meets periodically on Tuesday evenings on Twitter.
An education journal club is a place to discuss a published research paper on a topic of interest related to education, concluding with some thoughts about whether those in the group might try out the ideas in their teaching. A school where there is an interest in using evidence-based practice to develop teaching and learning might have a group of teachers interested in running such a club. King Edward VI Grammar School, Chelmsford is such a school. KEGS is a Leading Edge school where there are “teachers who are passionate about teaching, and leadership committed to research evidenced innovation to raise achievement and improve learning outcomes”. At researchED the headteacher, Tom Sherrington (@headguruteacher), spoke passionately about some of the many and varied ways he and his staff are carrying out research. Tom blogs regularly about his thinking about teaching, learning and school leadership. My take-home messages:
- it is possible for teachers to engage in small scale action research in school
- I’d love to be back teaching in a school like KEGS
Professor Robert Coe (@ProfCoe), Director of the Centre for Evaluation and Monitoring (CEM) at Durham University, is always an engaging speaker; he contributed two sessions: Practice & Research in Education: How can we make both better, & better aligned? (his PowerPoint is here) and Effect size in educational research (his PowerPoint is here). My take-home messages:
- if you are looking for how to get best value for money from spare development cash in school, take a look at the Teacher Toolkit, developed by CEM with the Education Endowment Foundation
- effective CPD is
- Intense: at least 15 contact hours, preferably 50
- Sustained: over at least two terms
- Content focused: on teachers’ knowledge of subject content & how students learn it
- Active: opportunities to try it out & discuss
- Supported: external feedback and networks to improve and sustain
- Evidence-based: promotes strategies supported by robust evaluation evidence
(Coe, 2013 Practice and research in education Slide 22)
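For anyone unfamiliar with the term from Rob Coe’s second session: an effect size is usually Cohen’s d, the difference between two group means expressed in units of their pooled standard deviation. The formula below is the standard textbook definition, not taken from his slides:

```latex
d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{control}}}{s_{\text{pooled}}}
```

So an effect size of d = 0.5 means the average student in the intervention group scored half a standard deviation above the average student in the control group.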
I spoke about the way in which we are using research evidence to inform the development of York Science. Much of what I said has been covered in blog posts here – Evidence-based practice, Backward design and Diagnostic questions. The session was videoed; you can watch it on the researchED YouTube channel.
My presentation: YS Research-informed curriculum development ResearchED (pdf)
It was clear from many conversations at the conference that there is an appetite for greater engagement between researchers and practitioners and I hope that before we meet again in 2014 we will have made progress in building the networks.
My previous post about the assessment of practical work provided the stimulus for a discussion on #ASEChat. #ASEChat is the hashtag used on Twitter for any comments of interest to science teachers, and every Monday evening there is a focused discussion on a chosen topic.
On Monday 22nd July the questions raised in my blog post were discussed. There were interesting suggestions about ways that assessment could work and also some concern that if practical work is not assessed it will not be taught. A full summary of the discussion can be found on the ASE website here.
Twitter is not the only place for such conversations: members of the Institute of Physics Education Forum met in August to discuss the draft GCSE criteria. There was much lively discussion about practical work at the meetings, which was followed up in the forums later. The TalkPhysics forums can be found here. The Institute of Physics response to the draft GCSE criteria forms part of the response from SCORE – you can download the responses from the SCORE website.
The Department for Education consultation on the draft GCSE criteria has now closed, but it is not too late to respond to the Ofqual consultation, which covers the regulatory aspects of the GCSE reform; it includes questions about the assessment of practical work and about tiering that will be of interest to science teachers. The consultation closes on 3rd September, so if you care about these matters make sure you respond.