The Learning Brain – does testing make you smarter?

At the recent NTEN researchED meeting at York (3 May 2014), I attended a very interesting talk entitled ‘The Learning Brain: a new science of learning’ given by Jonathan Sharples of the Institute for Effective Education and Education Endowment Foundation. One aspect of his talk really caught my interest with the York Science project in mind. Jonathan presented some research on the effect of repeated testing on learning. Four groups of research participants had to learn 40 words in Swahili; none had any prior knowledge of the language. Initially, all participants studied the list of Swahili-English word pairs (e.g. mashua – boat) and were then tested (e.g. mashua – ?). However, once a word pair was recalled correctly, each group treated it differently, as follows:

  1. continued to be studied and tested throughout
  2. studied but no further testing
  3. no further study but still tested
  4. no further study or testing of the words.

The results for the four groups in the study are shown in Figure 1 below. Before reading on, decide which result you think matches each of the four groups.


Bar chart showing proportion recalled. Bars A and B are at about 0.8, C at about 0.35 and D at about 0.3

Figure 1 Proportion recalled on the final test one week after learning.
Error bars represent standard errors of the mean.

Result A is group 1 (studied and tested), B is group 3 (no further study, only testing), C is group 2 (study but no testing) and D is group 4 (no study and no testing). The results provide evidence that testing does not merely measure learning but contributes to the process of learning, with repeated testing enhancing learning. Jonathan advocated that in addition to Assessment for Learning and Assessment of Learning, maybe we should be thinking about Assessment as Learning: the repeated recruitment of neural pathways strengthens the retrieval networks involved in the learning. Could it be that the formative assessment tasks of York Science, designed to let teachers obtain evidence of learning, also provide an additional learning opportunity for students?
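If you want to play with the numbers, here is a minimal Python sketch that redraws Figure 1 from the approximate values read off the chart (the exact figures and standard errors are in the paper cited below):

```python
import matplotlib.pyplot as plt

# Approximate proportions read from Figure 1; see Karpicke & Roediger (2008)
# for the exact values and the standard errors shown as error bars.
conditions = ["A: study + test", "B: test only", "C: study only", "D: neither"]
proportion_recalled = [0.80, 0.80, 0.35, 0.30]

plt.bar(conditions, proportion_recalled)
plt.ylabel("Proportion recalled after one week")
plt.ylim(0, 1)
plt.title("Repeated testing, not repeated study, drives retention")
plt.show()
```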

Read more in Jeffrey D. Karpicke and Henry L. Roediger (2008) The Critical Importance of Retrieval for Learning. Science 319, 966-968

The paper can be downloaded from: http://learninglab.psych.purdue.edu/publications/

Anne Scott is a member of the York Science team.

researchED York 2014


Saturday was researchED York 2014, a collaboration between the researchED team and NTEN. I was at Huntington School with 300 other educators who had given up their Bank Holiday Saturday to think and talk about some of the things that matter to them in the world of education, in particular how we can use research to improve teaching and learning. The buzz was just as good as it had been at the first researchED last September. You can read more about the excitement of the day from Tom Bennett and Alex Quigley. This blog post covers the material I talked about on Saturday.

Measuring levels or monitoring understanding?

There is anxiety amongst some teachers and senior managers about what to do in the absence of levels. This article looks briefly at some of the reasons why levels were not fit for purpose, and at some better ways of checking progress in science learning.

What are levels?

Levels of attainment were introduced into the National Curriculum in 1988 with a report from the National Curriculum Task Group on Assessment and Testing, generally known as the TGAT report. There, levels were described in this way:

the scaling system should be directly related to the development of pupils’ competences as described by the attainment targets of the national curriculum. … We shall use the word level to define one of a sequence of points on a scale to be used in describing the progress of attainment in the profile component.
(DES/WO, 1988)

How are levels used?

Formatively - to provide information about a student’s learning that contributes directly to the learning process by providing feedback to the learner.

Summatively - to describe a student’s attainment at a point in time (e.g. the beginning of the lesson, the end of the lesson, the end of a term, or year).

For accountability - to provide evidence of the effectiveness of teachers and schools.

These are all important purposes for assessment, but is describing a student by a level the best way to do all these things? In his excellent book Testing Times: The uses and abuses of assessment (2008), Gordon Stobart warns that

in accountability cultures with frequent high-stakes testing, making headway with formative assessment will be more difficult (p159)

If levels are to be used for reporting, how easy is it to use the same assessment formatively?

But there is another concern too; I have often heard people say something like, “well, we know levels are not perfect, but at least we know where we are with them”. But do you? In a useful report from NFER, Where have all the levels gone? (Brill & Twist, 2013), the authors state: “There are well documented discussions about whether it is possible to make reliable and consistent judgements confirming that a pupil is working at a certain level.” So are the judgements that teachers have to make, and that management use as an accountability measure, valid? I suggest not.

But what to do?

York Science does not have the answer to how schools should monitor overall progress of a cohort, or report to parents on the progress of individuals, but we can help with monitoring progress in understanding within a lesson.

What do we need to do to describe progress?

Let’s begin by looking at what does not describe progress.

Particle model LIs

The two paragraphs above are both about the particle model of matter. One is from the new Key Stage 3 Programme of Study. The other is from the new GCSE criteria for GCSEs in Science and Chemistry from 2016.
Can you tell which is which?
So statements like this are not enough; a teacher would want to know what a student needs to do to show learning. The programme of study doesn’t give any information about that; the statement is set out just as it appears here. The GCSE criteria say that specifications should require students to: “recall and explain the main features of the particle model in terms of the states of matter and change of state, distinguishing between physical and chemical changes”.
Teachers will want to know what the questions will look like by the time students take their exams in 2018. They will have to wait until later this year or early next year for that; awarding organisations are working on it right now. In the meantime those students (currently in Year 7) are already learning about the particle model. What sort of things would it be reasonable to imagine a key stage 3 student doing to show that they can recall and explain the particle model?
In York Science we begin by stating the Learning Intention, in this case:

  • Understand a basic particle model of matter that can explain states of matter and changes of state.

Then we write some Evidence of Learning Statements that describe the sort of things we think a student will be able to do as they develop their understanding of the particle model.
Here they are, in no particular order:

 ELS

You could sort them into an order that provides some differentiation:

ELS sorted

But these could still be outcomes in a KS3 or KS4 scheme of work. What you need are the questions and tasks that will provide evidence of learning at the stage the student has currently reached.

Let’s take one statement from that set:

“Understand a basic particle model of matter that can explain states of matter and changes of state”

You want to know if students understand how the particle model explains that liquids are runny. Here is a question written by Phil Johnson (2011) for his Assessing Students’ Concept of a Substance project at Durham University.

Johnson Q

This is a great diagnostic question to use at the start of a lesson – it will not only identify those who get it right, but also provide information about what those who get it wrong are thinking.
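That diagnostic power comes from the options: each wrong answer corresponds to a known way of thinking. Purely as an illustration (the options, diagnoses and names here are invented, not Phil Johnson’s actual item), a teacher’s record of a class’s answers might look like this:

```python
# Each distractor maps to the misconception it reveals (invented examples).
DISTRACTOR_DIAGNOSIS = {
    "A": None,  # the accepted answer
    "B": "thinks the particles themselves become soft or runny",
    "C": "thinks a liquid is continuous 'stuff' with no particles",
}

class_answers = {"Asha": "A", "Ben": "B", "Caro": "B", "Dev": "C"}  # hypothetical

for student, choice in class_answers.items():
    diagnosis = DISTRACTOR_DIAGNOSIS[choice]
    print(student, "-", "secure" if diagnosis is None else diagnosis)
```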

So after some teaching to develop the students’ understanding, the teacher wants to check that they have made progress. Another question is called for (not the same one again: students might remember which answer is correct without understanding why).

Another source of questions that check understanding is the Evidence-based Practice in Science Education project (Millar, R. et al, 2002), which was based here at the University of York. This question comes from that bank:

EPSE Q

This is more demanding than the last question – all the statements are correct, so now the student has to pick out the relevant statements, showing greater understanding than was needed for the first question. For a weaker group of students the teacher could tell them how many of the statements to choose.

Evidence of Learning – not evidence of levels

Individual questions and tasks can provide evidence of progress in understanding a particular idea at a granular level. But one question or task cannot provide hard evidence of the overall level of attainment of a student. Generic level descriptors should be reserved for summative use, perhaps once a term, and certainly not during the course of an individual lesson.

Planning for progress in a lesson

And so we come back to the idea of planning a lesson using the backward design approach:

  • Think about the Learning Intention – what learning do you want to take place in the lesson?
  • Identify the questions and tasks that provide evidence of where students are at the start of the lesson and whether there is progression in their understanding during the lesson.
  • Only then think about the teaching activities that will (hopefully) deliver that progression.

Thinking about obtaining evidence of progress first helps to ensure that teaching activities are purposeful.
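To make the order of those decisions concrete, here is a purely illustrative Python sketch – my own, not a York Science artefact – of a lesson plan structured the backward-design way:

```python
from dataclasses import dataclass, field

@dataclass
class EvidenceOfLearningItem:
    """A question or task that shows where a student's understanding is."""
    prompt: str

@dataclass
class LessonPlan:
    # The order of the fields mirrors the order of the planning decisions.
    learning_intention: str                        # 1. what learning should take place
    evidence_items: list[EvidenceOfLearningItem]   # 2. how you will see that learning
    teaching_activities: list[str] = field(default_factory=list)  # 3. decided last

plan = LessonPlan(
    learning_intention="Understand a basic particle model of matter",
    evidence_items=[
        EvidenceOfLearningItem("Use the particle model to explain why liquids are runny"),
    ],
    teaching_activities=["Group discussion of a diagnostic question", "Particle role-play"],
)
```

The point is simply that the evidence items are a required part of the plan, settled before any activity is chosen.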

The presentation can be downloaded: YS Measuring levels or monitoring understanding

References

Brill, F. and Twist, L. (2013) Where have all the levels gone? The importance of a shared understanding of assessment at a time of major policy change. NFER. https://www.nfer.ac.uk/publications/99940/99940.pdf

DES/WO (1988) National Curriculum Task Group on Assessment and Testing: a report.

Johnson, P. and Tymms, P. (2011) Assessing Students’ Concept of a Substance project, Durham University.

Millar, R. et al. (2002) Evidence-based Practice in Science Education project, University of York.

Stobart, G. (2008) Testing Times: The uses and abuses of assessment. Routledge.


Reflections on ASE Annual Conference 2014

Alistair mans the stand at ASE 2014

If you attended the Association for Science Education Annual Conference in Birmingham last week, and if you’re anything like me, you probably spent Sunday in a state of zombie-like exhaustion. But hopefully you also – like me – feel that it was well worth the effort: you learned some things you didn’t know before, you made useful contacts, and you’re excited about putting some new ideas into practice.

The mostly dry and sunny weather in Birmingham was a pleasant change from the snow and howling winds of conferences past, and the beautiful University of Birmingham campus was a superb setting for a science education get-together. I joined the York Science project group in September 2013 after a number of years at OCR running the assessment and professional development programmes for GCSE Twenty First Century Science, so this was one of my first opportunities to talk about our resources with teachers.

What Would Your Students Say?

Practical ‘Evidence of Learning Items’ on the York Science stand

On our stand in the exhibition marquee we debuted our snazzy new York Science flier and challenged visitors to think about how their students would answer some of the York Science evidence of learning items (ELIs).

The bolt in the beaker provoked a lot of animated discussion, showing how a practical and very simple POE (predict-observe-explain) activity can be used to probe students’ (and teachers’!) knowledge – and misunderstandings – about forces. Play-Doh® cell models showed that formative assessment activities can be as fun for the students as they are informative for the teacher.

Many of the visitors to our stand on Friday were PGCE trainees. Although activities such as making cell models have been done in classrooms by many teachers for many years, to the trainees this was new and it was clear from their reactions that the power of using such an engaging activity as a way of testing knowledge was an exciting revelation. Dylan Wiliam (2011) wrote that “sharing high quality questions may be the most significant thing we can do to improve the quality of student learning”, and one of the aims of York Science is to share high-quality formative assessment items with teachers new and old alike.

Developing formative assessment in practice

‘Evidence of Learning Items’ created by teachers at our workshop

While the stand enabled us to teach new dogs some old tricks, our workshop on Saturday was all about exploring the potential of effective formative assessment with some more experienced teachers. Run by Professor Robin Millar & Mary Whitehouse, the workshop – entitled ‘Developing formative assessment in practice’ – guided teachers through the principles of ‘backward design’ curriculum development, emphasising the benefits of starting by defining what you want students to be able to do and identifying how you will assess that before deciding what and how to teach to help them do it. During the workshop teachers worked in groups to create formative assessment items of their own using York Science materials as templates. The session was so well attended that we had to move to a larger room to accommodate everybody, and feedback from the attendees was positive and encouraging.

The introductory presentation from the workshop is now available to download, and keep an eye on the Inspired by York Science section of this website where we will publish some of the ‘Evidence of Learning Items’ created by participants.

If you visited our stand or any of our events during the conference please leave a comment below to let us know what you thought, or send us an email at uyseg-yorkscience@york.ac.uk. You can also follow York Science, Mary Whitehouse and me on Twitter.

See you at ASE in Reading in 2015, if not before!

Great value professional development for science teachers

Discussion at a York Science workshop at the ASE Conference 2013

If you’d like to meet the York Science team – and other members of the University of York Science Education Group (UYSEG) – then come along to the University of Birmingham in the next few days.

It’s the ASE Annual Conference – an annual jamboree for science educators from across the world. Today (Wednesday) is the International Conference, which is followed by three more days of talks, workshops, and cutting edge science lectures.

I have written before about the ASE, and this conference is its most visible manifestation. Open to all, not just members, it is an opportunity for science teachers to meet each other and to get some of the best-value professional development possible. There is a wide range of talks and workshops to meet every need.

York Science will be presenting two sessions:

Thursday 9th January at 14.00: Getting to grips with lesson planning – we have been using our work in York Science to inform our PGCE Science programme at York. Anne Scott and Mary Whitehouse will be talking about how we have introduced the trainees to the backward design approach to lesson planning. LR8 in the Arts Building.

Saturday 11th January at 11.00: Developing formative assessment in practice – I have written about how teachers have been inspired by York Science to take some of the question types and ‘make another one like it’. In this workshop Robin Millar and Mary Whitehouse will present examples of a range of question styles, with suggestions for how to adapt them to other contexts. Participants will develop new assessment items, which we plan to put on this website for everyone to share. Room 12, Muirhead Tower.

UYSEG also has a stand in the exhibition marquee – stand BS18. Come along: Predict, Observe, Explain, collect a sticker for the prize draw, and chat to the team.

If you haven’t booked, it doesn’t matter: just come along for the day – see you there!

Inspired to write more

You know you are doing something right when teachers tell you that they used one of your resources and now they have adapted it for other purposes.

It all started with the cupboard under the stairs.

I wrote a blog post about diagnostic questions and confidence grids, and soon after that @DoctorACook tweeted about how she had used the PLC3.1 Dark room Evidence of Learning Item. Then she made some more confidence grids of her own and wrote about them on this blog in Using diagnostic questions.

As time went on I heard about other people who were using the Dark Room question and making more confidence grids. And I began to think about how we might share the ones people had made.

Other people picked up on my post about Making the best use of examination questions, including @NeedhamL56, who developed it to use in training for the Science Learning Centres. @hrogerson used the examiner reports for OCR Gateway Science to make some questions for her students and wrote about it on her blog.

So people are picking up and running with our suggestions, which is exactly what we hoped would happen when the project began two years ago, and, even better, they are willing to share their items with everyone else.

Take a look at the new Inspired by York Science tab at the top and dip into the resources these lovely people have shared. I hope you will be inspired to make some more and share them too. There are even templates to download to get you on your way.


In praise of the ASE and School Science Review

The Association for Science Education (ASE) celebrated its 50th Anniversary in 2013. For as long as I have been working in science education the ASE has been part of my life – as a young teacher, Education in Science was my chief source of news about what was happening in science education and how I could get involved. Of course there was no internet in those distant times.

Soon I was taking part in my first curriculum development project, spending my weekends meeting with teachers from across the country to write and trial Science in Society, led by John Lewis of Malvern College. Later, some of us from the Science in Society team went on to work with John Holman and Andrew Hunt on the development of SATIS. The SATIS project began in 1984 as the result of the report from an ASE working party which was convened to consider how the relationship between science, technology and society could be integrated into 11-16 school science. If you don’t know about SATIS, take a look at the resources, which are now all available on the National STEM Centre elibrary. Although some may seem somewhat dated, there are many great ideas there for hooking students into science and developing ideas about how science works and ‘thinking scientifically’.

ASE has been publishing resources for teachers over many years, and continues to do so. The ASE bookshop publishes and sells a whole range of books that should be in your department library. There are also journals and magazines for science teachers, including School Science Review, a peer-reviewed journal for secondary science teachers. I must declare an interest here as I edited SSR for a while, but it really is a great journal. Arriving through the door four times a year, and aimed specifically at the classroom teacher, it contains an eclectic mix of ideas for new experiments, book reviews, and longer articles about science teaching and science education research.

And if you are not yet convinced this is an organisation you should join, I must mention the role ASE has in representing all science teachers to the rest of the world. ASE is a key partner with the learned societies in SCORE, which enables the science community and the science education community to speak with one voice to government.

I could go on to list many other ways in which science teachers benefit from the work of the ASE, whether or not they are members – #ASEChat, #ASEteachmeets, the Annual Conference (which will be at the University of Birmingham in January – see the side panel to the right) and so on… But ASE is a membership organisation that largely operates through the goodwill of its members, many of whom put in countless hours to ensure that the classroom voice is heard in discussions at the highest levels. So if you want to continue to benefit from its work, why not join now?

Oh, and the reason I started this paean to the ASE? There is an article about York Science, based on my presentation at the ASE Summer Celebration Conference in June this year, in the latest edition of School Science Review.

You can download the article here: Embedding assessment to improve learning. To read about other sessions at the conference you will need to become a member of the ASE.

Edited to add a link to SATIS Revisited mentioned by Nick Swift in the comments.

From the National Curriculum to a Scheme of Work

So you have downloaded the Programme of Study for Key Stage 3 Science and you are wondering where to start in making that into a scheme of work for early secondary school? You might start by reading about the principles behind backward design. This suggests that you begin by thinking about what you want learners to know, understand, and be able to do by the end of your teaching programme. That question might be better asked as ‘what do you want your students to know, understand and be able to do as they begin their examination programme?’ – which might be GCSE Sciences (separate sciences or double award?), BTEC Science, OCR Nationals, or …

To help you with this you will want to know what the new GCSEs will look like. We don’t really know yet, but the (now closed) Consultation on the subject content for GCSE Sciences will give you some clues. It is clear from that document that there is an intention to make the content at GCSE more demanding. So you will want your students to come to those courses with a solid understanding of the science ideas you have taught in the early years of secondary school.

In fact it may be better to think of the first five years of secondary school as a continuum, with a diverse range of assessments at the end of year 11. So in the early years you will be teaching a fairly common curriculum, and at the same time thinking about what course will be most suitable for each student. At some point, probably during year 9, decisions will be made about the curriculum pathway most suitable for each student.

How can York Science help with your planning?

Let’s look at an example. In the Key stage 3 programme of study for Chemistry there is this:

Pupils should be taught about:
The particulate nature of matter

  • the properties of the different states of matter (solid, liquid and gas) in terms of the particle model, including gas pressure
  • changes of state in terms of the particle model

(DfE (2013) Science programmes of study: key stage 3. p. 8)

In the resources area of this website is a section about the particle model of matter, including a list of some of the things we think a student will be able to do if they understand a basic particle model of matter that can explain states of matter and changes of state. For instance, we think they should be able to:

  • describe the main features of the particle model
  • identify limitations in representations of the particle model
  • explain how the particle model distinguishes between pure and impure samples of substances
  • use the model to explain characteristics of substances in the solid, liquid and gas states
  • use the model to explain changes of state

This is not an exhaustive list, but if students could do all these things successfully they would be well prepared for starting a course in GCSE Chemistry – where the content includes this statement:

  • recall and explain the main features of the particle model in terms of the states of matter and change of state, distinguishing between physical and chemical changes

(DfE. (2013) Science: GCSE subject content and assessment objectives. p.17)

So, if you were writing a scheme of work for the early years of secondary school, you might use the list above as the intended learning outcomes for a section of work on the particle model.

And then of course you need a set of questions and tasks that will provide the evidence for each of those items – this is a crucial part of the planning: making sure you have items that you can use in the course of your teaching to provide evidence that learning has taken place. This is what York Science is doing: developing questions and tasks for each of our stated learning intentions (we call these Evidence of Learning Items, or ELIs).
For instance, a student who understands the particle model should be able to use it to explain characteristics of substances in the solid state. Here is a question you might project on the whiteboard and ask students to think about in groups:

CSU2.4aP Particles and the solid state

This question is taken from a large bank of questions written for the Evidence-based Practice in Science Education (EPSE) Research Network.

The question will not tell you whether students have simply memorised the particle model statements – but learning by rote without understanding is not enough. Listening to the conversations of the students as they discuss their answer will help you find out whether they understand the particle model. So much of what follows in studying all the sciences depends on a good understanding of the particle model, so it is worth spending time in the early years to ensure students have a good grounding.
This question is one of a set that checks students’ understanding of the particle model of solids. You can download the set from the resources page.
And if you haven’t tried the confidence grid approach, here is an opportunity:

CSU2.4aP Particles and the solid state grid
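A confidence grid asks students to respond to each statement on a scale such as ‘sure it’s right / think it’s right / think it’s wrong / sure it’s wrong’, so a correct answer given hesitantly looks different from one given with confidence. Here is a purely illustrative Python sketch (invented class data and paraphrased response labels, not the item above) of how a tally can flag confidently held misconceptions:

```python
from collections import Counter

# Hypothetical responses: statement -> one confidence-grid response per student.
# Both statements below are true, so "sure it's wrong" marks a confidently
# held misconception - the most valuable thing for a teacher to surface.
responses = {
    "The particles in a solid are moving":
        ["sure it's wrong", "sure it's wrong", "think it's wrong", "sure it's right"],
    "The particles in a solid are close together":
        ["sure it's right", "sure it's right", "think it's right", "sure it's right"],
}

for statement, answers in responses.items():
    counts = Counter(answers)
    confidently_wrong = counts["sure it's wrong"]
    if confidently_wrong >= len(answers) / 2:
        print(f"Revisit: {statement} ({confidently_wrong}/{len(answers)} confidently wrong)")
```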

So please download the frameworks that are currently available on the resources pages and use them in your planning. Let us know if you think they will be useful, and whether you would like more of them to become available over the coming months.

researchED 2013

A few months ago there was a conversation on Twitter about research-informed practice in education. One thing led to another, and on the first weekend in September 500 teachers, researchers and others with an interest in education gathered together at researchED 2013 to discuss how we could use research to improve learning in schools. I was invited to speak about how we use research evidence in curriculum development here at York.

There was a real buzz to the conference – after all, these folk had given up their Saturday to be there. Ben Goldacre started us off with an amusing, yet thought-provoking, talk about the connections between practitioners and researchers, drawing parallels with the ways in which medical researchers and practitioners share information through formal and informal networks, such as databases of GPs willing to be involved in trials and in-house journal clubs. Of course we already have our own Science Teaching Journal Club, started by Alby Reid (@alby) and Alom Shaha (@alomshaha). The club meets periodically on Tuesday evenings on Twitter.

An education journal club is a place to discuss a published research paper on a topic of interest related to education, concluding with some thoughts about whether those in the group might try out the ideas in their teaching. A school where there is an interest in using evidence-based practice to develop teaching and learning might have a group of teachers interested in running such a club. King Edward VI Grammar School, Chelmsford is such a school. KEGS is a Leading Edge school where there are “teachers who are passionate about teaching, and leadership committed to research evidenced innovation to raise achievement and improve learning outcomes”. At researchED the headteacher, Tom Sherrington (@headguruteacher), spoke passionately about some of the many and varied ways he and his staff are carrying out research. Tom blogs regularly about his thinking about teaching, learning and school leadership. My take-home messages:

  • it is possible for teachers to engage in small scale action research in school
  • I’d love to be back teaching in a school like KEGS

Professor Robert Coe (@ProfCoe), Director of the Centre for Evaluation and Monitoring (CEM) at Durham University, is always an engaging speaker; he contributed two sessions: Practice & Research in Education: How can we make both better, & better aligned? (his PowerPoint is here) and Effect size in educational research (his PowerPoint is here). My take-home messages:

  • if you are looking for how to get best value for money from spare development cash in school, take a look at the Teacher Toolkit, developed by CEM with the Education Endowment Foundation
  • effective CPD is
    •  Intense: at least 15 contact hours, preferably 50
    • Sustained: over at least two terms
    • Content focused: on teachers’ knowledge of subject content & how students learn it
    • Active: opportunities to try it out & discuss
    • Supported: external feedback and networks to improve and sustain
    • Evidence-based: promotes strategies supported by robust evaluation evidence

(Coe, 2013 Practice and research in education Slide 22)
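As a reminder of what an effect size is (my gloss, not a slide from the session): it expresses the difference between group means in units of spread, so results from different interventions can be compared. A minimal Python sketch of the usual Cohen’s d calculation:

```python
from statistics import mean, stdev

def cohens_d(treatment: list[float], control: list[float]) -> float:
    """Standardised mean difference: (mean_t - mean_c) / pooled standard deviation."""
    n_t, n_c = len(treatment), len(control)
    s_t, s_c = stdev(treatment), stdev(control)
    pooled_sd = (((n_t - 1) * s_t**2 + (n_c - 1) * s_c**2) / (n_t + n_c - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Made-up test scores from a small trial, for illustration only.
print(round(cohens_d([72, 75, 68, 80, 77], [65, 70, 62, 68, 66]), 2))
```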

I spoke about the way in which we are using research evidence to inform the development of York Science. Much of what I said has been covered in blog posts here – Evidence-based practice, Backward design and Diagnostic questions. The session was videoed; you can watch it on the researchED YouTube channel.

My presentation: YS Research-informed curriculum development ResearchED (pdf)

It was clear from many conversations at the conference that there is an appetite for greater engagement between researchers and practitioners and I hope that before we meet again in 2014 we will have made progress in building the networks.

Congratulations to all those who helped make the conference a great success, particularly Tom Bennett, @tombennett71 and Helene Galdin-O’Shea @hgaldinoshea

 

Assessment of practical work – part 2

My previous posting about assessment of practical work provided the stimulus for a discussion on #ASEChat. #ASEChat is the hashtag used on Twitter for any comments that would be of interest to science teachers, but every Monday evening there is a focused discussion on a chosen topic.

On Monday 22nd July the questions raised in my blog post were discussed. There were interesting suggestions about ways that assessment could work and also some concern that if practical work is not assessed it will not be taught. A full summary of the discussion can be found on the ASE website here.

Twitter is not the only place for such conversations: members of the Institute of Physics Education Forum met in August to discuss the draft GCSE criteria. There was much lively discussion about practical work at the meetings, which was followed up in the forums later. The TalkPhysics forums can be found here. The Institute of Physics response to the draft GCSE criteria forms part of the response from SCORE – you can download the responses from the SCORE website.

The Department for Education consultation on the draft GCSE criteria has now closed. But it is not too late to respond to the Ofqual consultation, which is about the regulatory aspects of the GCSE reform; the consultation includes questions about practical work assessment and tiering that will be of interest to science teachers. The consultation closes on 3rd September. So if you care about these matters make sure you respond to the consultation.

Assessment of practical work


It is clear that teachers, and others, think that practical work is an important part of science education. Recently #ASEChat discussed ‘The challenges and benefits of practical work in science teaching’. The discussion was led by Alom Shaha (@alomshaha), who has made a video for Nuffield about how to get the most out of practical work. There was much heated debate, and you can read a summary written by Richard Needham (@viciascience) here. At the end of the summary is a useful list of links for those interested in getting more out of their practical work. In particular, for those who have not come across it, take a look at Getting Practical and also Robin Millar’s work. Stephen Taylor (@iBiologyStephen), an IB teacher in Japan, has also responded to Alom’s video in this post on his i-Biology blog.

But this post isn’t about practical work in the classroom; in this post I want to raise the issue of assessment of practical work. Michael Reiss, Ian Abrahams, and Rachael Sharpe carried out a study for Gatsby of how practical work is assessed in other places. It makes interesting reading.

Recently the DfE published a consultation document in which they lay out their proposals for the subject content and assessment objectives (AOs) (pp47-48) for GCSE Science for examinations from 2017. Teachers often don’t spend long looking at the AOs in a specification; more important to them is how the AOs are exemplified in the assessment materials. However, in this consultation document the AOs contain many more statements and give far more detail than previous subject criteria, and these AOs will drive changes in the assessment.

So, wearing my York Science hat, let’s take a look at the AOs that relate directly to teaching practical work:

AO1 Recognise, recall and show understanding of:

  • uses of scientific instrumentation and apparatus
  • scientific quantities and their determination
  • working safely in a scientific context.

AO3 The experimental skills and abilities to:

  • select or formulate propositions amenable to experimental test
  • devise procedures and select apparatus and materials suitable for synthesising substances or producing or checking the validity of data, conclusions, generalizations and hypotheses
  • recognise and explain variability and unreliability in experimental measurements
  • evaluate quantitative and qualitative data acquired through practical work, the design of experiments and experimental observations, draw conclusions and suggest improvements where appropriate.

AO4 The ability to:

  • follow instructions accurately
  • use scientific instrumentation, apparatus and materials appropriately
  • work with due regard for safety, managing risks
  • observe, measure and record accurately and systematically
  • carry out and report on investigations or parts of investigations.

The statements listed under AO1 are a subset of the full AO1 list, which all together is worth 30% of the marks. AO3 is worth 10% and AO4 is worth 10%. So AO3 and AO4 alone account for 20%, and the practical statements within AO1 take some share of its 30%: more than 20% of the GCSE marks are related to practical work.

AO1 and AO3 are to be assessed through examination papers. AO4 is to be assessed ‘directly’ by the teacher.

So my first question:

Are the objectives in that list important things for students to show they can do at the end of KS4?

If we want to develop assessments that assess what we think is important and should be accredited as part of a GCSE qualification – rather than what is easy to set and mark – we need to think about what the questions and tasks should look like.

More questions:

What sort of questions could legitimately be asked in an exam that would address AO1 and AO3?

What sort of evidence of performance would be acceptable for those things described by AO4?

Alongside the consultation from the DfE, Ofqual are also consulting on the GCSE reforms; there is a discussion of assessment of science practical skills on page 42. They are inviting views on the tensions between authentic assessment of practical skills (such as the first four bullets under AO4) and the validity of the assessment when teachers are under such pressure to give students maximum marks.

There is a discussion about assessment of practical work amongst physics teachers and others in the TalkPhysics forum. Are there similar discussions amongst chemistry teachers and biology teachers anywhere?

Amongst the suggestions from the TalkPhysics group:

  • include in the specification a list of the competencies that are expected, e.g. take readings from a circuit with a voltmeter and ammeter, use a thermometer, measure pH …
  • include in the specification a requirement for candidates to complete a list of core practicals; evidence of the activity could be, for example, a table of results or photo evidence, supported by a signature confirming that the student has done the work themselves
  • remove investigations from the AO4 list; all the components are in AO3, so doing investigations will build up the skills, which are then assessed in an exam, and teachers don’t have to mark investigations

What do you think?

Discuss on Twitter using #ASEChat, alternatively, add your comments below to get the discussion going.

Here are some starter questions for the discussion:

  • do you support the idea of a list of required practicals in the specification that could form the context for questions in written papers? If you do, what practicals would you include?
  • do you support the retention of investigations to be marked by teachers? If you do, what sort of investigation is worthwhile? If not, why not?
  • how could the sort of practical competencies listed in AO4 be assessed? Would you include specific competencies in the list or keep it generic? Would it matter if everyone scored full marks?
