Design in Practice

ISDDE 14 logo

The 2014 conference of the International Society for Design and Development in Education, titled Design in Practice, was held at the Centre for Mathematical Sciences at the University of Cambridge at the end of September. It brought together people working in curriculum design from the UK, Australia, the USA, the Netherlands, Germany and Israel – a great opportunity to meet other curriculum designers, share ideas and learn what other people are working on.

There was a varied programme, and here I mention just a couple of things that piqued my interest.

Working memory

Prior to the conference I had been reading Daniel Willingham’s Why Don’t Students Like School? (2009), in which Willingham, a cognitive psychologist, shows how what is known about learning can be used to improve teaching in the classroom. So I looked forward to hearing Susan Gathercole, another cognitive psychologist and Director of the MRC Cognition and Brain Sciences Unit at Cambridge, talk about Cognitive challenges in educational design. She focused on the challenges faced by children who have poor working memories. The problem starts at the functional level of not being able to follow a string of instructions, because they can’t remember them all, and of course goes on to affect all aspects of learning: “More than 80% of children with poor working memory fail to achieve expected levels of attainment in either reading or maths, typically both” (Gathercole & Alloway, 2008).

I’m sure there were times when I was teaching when I thought that a student who had failed to carry out a series of instructions ‘hadn’t listened to a word I had said’, when in fact he was listening but couldn’t hold it all in his head to carry out the task. I wish I had had the useful booklet for teachers, Understanding Working Memory: A Classroom Guide, and the book Working Memory and Learning: A Practical Guide for Teachers, both by Gathercole and Alloway.

The big question that this talk raised for me was, “how much can curriculum designers take these issues into account in designing teaching resources?”

Successful curriculum design

Here at York we make every effort to use the available research evidence about teaching and learning to support our curriculum development work, and I presented a poster showing a little of how we do this in our curriculum design.

As well as using evidence about teaching and learning and children’s understanding of science, we also use our experience of earlier curriculum development projects to inform how we approach a new project. So I was particularly interested to find out more about the Design Dimensions project, which is in the early stages of investigating the question: “Across phases of design (analysis, development, and evaluation), what processes and strategies are critical to successfully obtain large-scale implementation with significant impacts on learners?” The project is a good example of ‘working out what works’; we will be using messages from their research to inform our future curriculum development work here. This page includes their posters and ‘abstracts’. I shall follow this project with interest.




Setting the standard – international benchmarks?

A few separate calls on my attention in the last week or so have set me thinking about attempts to make comparisons between the standards achieved by students in various parts of the world – and in particular the UK government’s desire to peg standards here to those achieved by students in ‘high performing jurisdictions’ (HPJ) (this seems to mean those parts of the world whose students do well in the PISA tests).

Reform of GCSE Sciences in the UK

Here in the Science Education Group at the University of York we are working with OCR, the awarding body, to develop the fourth iteration of the Twenty First Century Science course, to be examined for the first time in June 2018. All GCSE specifications must meet the criteria laid down by Ofqual; for this latest development the Department for Education (DfE) determined the content for the new GCSEs. The criteria can be found on the DfE website:

Separate sciences

Combined sciences

When these documents were published the Secretary of State said that they were intended to “…make these qualifications more ambitious, with greater stretch for the most able… These changes will increase the rigour of qualifications, strengthening the respect in which they are held by employers and universities alike. Young people in England deserve world-class qualifications and a world-class education – and that is what our reforms will deliver.” (Gove, 2014)

So that is the ambition of this change – but how will we know if it is achieved? If the measure of success is a higher ranking in the PISA league tables, these new criteria do not seem to be driving in the right direction.

The OECD states that “Rather than examine mastery of specific school curricula, PISA looks at students’ ability to apply what they learn in school to real-life situations.” (OECD, 2014) Although the OECD does not release the papers used for the ranking tests, they do release some sample questions, such as those here.

Looking at some of these questions, it is clear that to be successful students need to be able to read about science and to interpret what they read; as well as science knowledge they need an understanding of how science works.

The new GCSE criteria for 2016 are more content-laden, with LESS emphasis on how science works than the previous versions we have worked with; as a result the constructs to be examined by the new GCSEs will differ from those assessed by PISA. Nor is there any evidence that the new GCSE criteria have been written with consistent reference to any other international benchmark.

Setting the standards

Ofqual (Office of Qualifications and Examinations Regulation) has had the task of determining how standards will be set for the new GCSEs. Currently GCSEs are graded from G to A*; some people, including the former Secretary of State, have expressed concerns that too many students were getting the top grade of A* (this has been ascribed to ‘grade inflation’ rather than to better teaching and hard work by students). The new grading system will run from 1 to 9, presumably allowing for the opportunity to add a grade 10 if ‘inflation’ takes over again.

In a letter to Ofqual the Secretary of State wrote, “At the level of what is widely considered to be a pass (currently indicated by a grade C), there must be an increase in demand, to reflect that of high-performing jurisdictions.” (quoted by Ofqual in their recent report). However, this is easier said than done. TIMSS (Trends in International Mathematics and Science Study) tests are taken by students at an earlier stage in their schooling (grades 4 and 8), and the PISA science tests, taken by 15-year-olds, do not measure exactly the same constructs as GCSE Sciences.

In September 2014 Ofqual published the report mentioned above, in which they set out how the new grades would be determined. Essentially the new numbering system will be tied to the statistical measures currently used, with a more differentiated top end. The current grade C will become the new grade 4, and grade 5 will become the new ‘pass’ mark.

There will be some post hoc checking to see whether their estimate of where to pitch the new ‘pass’ is right:

“49. The DfE analysis indicates that if students presently achieving grade Cs were to achieve grade 5s, that is broadly in line with what would be required to match the average performance of 16 year olds in England with the PISA mathematics performances of countries such as Finland, Canada, the Netherlands and Switzerland. Once the new grades have been awarded, we propose carrying out analyses to see how students with a grade 5 perform on international surveys such as PISA. We can then consider, in the context of Ofqual’s qualifications standards objective, whether our expectations of the grade 5 standard are being met.” (Ofqual, 2014, p. 11)

So the new grading will be linked to the current system to ensure continuity, but there is no ‘setting of standards’: no description of what a grade 5 means in terms of what a student can do.

What is happening elsewhere?

I have been aware for some time that there has been a project to write a new ‘national curriculum’ (see note below) for science in the US. Each state, and indeed each district, has a lot of control over education, but the Next Generation Science Standards (NGSS) have been developed in an attempt “to provide all students an internationally benchmarked science education” (sic). The process of writing these standards began with the development of A Framework for K-12 Science Education, which was put together following much consultation within both the science community and the science education community. Altogether a much more measured approach than the recent development of the National Curriculum here in England.

These Standards are now being used in schools and colleges across the US, and the Twitter discussion on #iteachphysics this weekend was about standards-based grading. The discussion was led by @BlackPhysicists, who asked me, “Where can we get an articulation of the GSCE standards for physics that we can compare with NGSS & state stnds here?” This question drew me into thinking about whether it really is possible to make such comparisons.

In our current specifications (such as the Twenty First Century Science Physics specification) there are ‘Grade descriptors’, which were originally written to be used when setting grade boundaries after examination papers had been taken. But as Ofqual said in their recent report:

“When GCSEs were first being developed in the mid-1980s the Government’s intention was that criteria-related grades would be introduced as soon as practicable with candidates who reached the required standard being awarded those grades. Despite heroic efforts, it proved impossible in practice to meet that intention. So GCSEs have never been criterion referenced.” (Ofqual, 2014, pp. 5-6)

So the ‘standards’ are not laid down anywhere: the specifications describe the content to be learned and assessed, and the Assessment Objectives describe the kinds of things candidates are expected to do in examinations, but nowhere is the expected standard described.

So how can we make comparisons?

It comes back to the same place that I always end up: it’s the questions we ask, and the work of students in responding to those questions and tasks, that show what they can do. So maybe that’s what we need to do – share questions and examples of students’ work.

NOTE (added 1st October 2014): @BlackPhysicists has pointed out that the Common Core and National Standards in the US are not a ‘curriculum’ in the sense of a suggested route through the teaching – but then neither is the National Curriculum in the UK.

Evidence-informed practice

A year ago I wrote about the conference that proved to be the beginning of the researchED movement, researchED13. Over the past year Tom Bennett has turned a Twitter discussion into something that has legs.

So we met again in London for a second national conference, researchED14. Many of those who believe in evidence-informed practice gathered together to share ideas. To get some idea of the breadth of discussion, you can download the programme from the shiny new researchED website – which has the catchy address

I had the opportunity to speak at the conference again this year (thanks, Tom) and talked about York Science as a research-informed project that is developing questions and tasks to support the teaching of science at key stage 3. The presentation, Measuring levels or monitoring understanding, is available here.

After last year’s conference I expressed the hope that “before we meet again in 2014 we will have made progress in building the networks.” That has certainly happened – there were two regional researchED conferences earlier this year, one in Birmingham and one here in York. More are in the pipeline – keep your eye on the researchED website for dates over the coming months.

And now we need to move beyond scepticism to sharing how people are becoming involved in research in their schools and using the good research evidence that is out there. There was some of that at this conference; I hope there will be more in a year’s time.

There should be – researchED is helping to make it happen, including supporting a network of Research Champions, teachers in schools who have taken on the role of Director of Research. The fact that these roles are beginning to exist is a sign of recognition by many head teachers and governors that research does matter. I look forward to hearing from these Research Champions at researchED15.

For more from the conference keep an eye on the website, where there are videos of some of the talks, with more being added over the next week or so.

The Learning Brain – does testing make you smarter?

At the recent NTEN researchED meeting in York (3 May 2014), I attended a very interesting talk entitled ‘The Learning Brain: a new science of learning’, given by Jonathan Sharples of the Institute for Effective Education and the Education Endowment Foundation. One aspect of his talk really caught my interest with the York Science project in mind. Jonathan presented some research on the effect of repeated testing on learning. Four groups of participants, none with any prior knowledge of the language, had to learn 40 words in Swahili. Initially, all participants studied the list of Swahili–English word pairs (e.g. mashua – boat) and were then tested (e.g. mashua – ?). However, once a word pair was recalled correctly, it was treated differently by each of the groups, as follows:

  1. continued to be studied and tested throughout
  2. studied but no further testing
  3. no further study but still tested
  4. no further study or testing of the words.

The results for the four groups in the study are shown in the figure below. However, before reading on, decide which of the results you think matches each of the four groups.


Bar chart showing proportion recalled: bars A and B are at about 0.8, C at about 0.35 and D at about 0.3.

Figure 1 Proportion recalled on the final test one week after learning.
Error bars represent standard errors of the mean.

Result A is group 1 (studied and tested), B is group 3 (tested but no further study), C is group 2 (studied but no further testing) and D is group 4 (no further study or testing). The results provide evidence that testing does not merely measure learning but contributes to the process of learning, with repeated testing enhancing learning. Jonathan advocated that, in addition to Assessment for Learning and Assessment of Learning, maybe we should be thinking about Assessment as Learning: the repeated recruitment of neural pathways strengthens the retrieval networks involved in the learning. Could it be that the formative assessment tasks of York Science, designed to allow teachers to obtain evidence of learning, also provide an additional learning opportunity for students?

Read more in Jeffrey D. Karpicke and Henry L. Roediger (2008) The Critical Importance of Retrieval for Learning. Science 319, 966-968

The paper can be downloaded from:

Anne Scott is a member of the York Science team.

researchED York 2014

researchED north logo

Saturday was researchED York 2014, a collaboration between the researchED team and NTEN. I was at Huntington School with 300 other educators who had given up their Bank Holiday Saturday to think and talk about some of the things that matter to them in the world of education, in particular how we can use research to improve teaching and learning. The buzz was just as good as it had been at the first researchED last September. You can read more about the excitement of the day from Tom Bennett and Alex Quigley. This blogpost covers the material I talked about on Saturday.

Measuring levels or monitoring understanding?

There is anxiety amongst some teachers and senior managers about what to do in the absence of levels. This article looks briefly at some of the reasons why levels were not fit for purpose, and at some better ways of checking progress in science learning.

What are levels?

Levels of attainment were introduced into the National Curriculum nearly 30 years ago with a report from the National Curriculum Task Group on Assessment and Testing – generally known as the TGAT report. There, levels were described in this way:

the scaling system should be directly related to the development of pupils’ competences as described by the attainment targets of the national curriculum. … We shall use the word level to define one of a sequence of points on a scale to be used in describing the progress of attainment in the profile component.
(DES/WO, 1988)

How are levels used?

Formatively - to provide information about a student’s learning that contributes directly to the learning process by providing feedback to the learner.

Summatively -  to describe a student’s attainment at a point in time (e.g. the beginning of the lesson, the end of the lesson, the end of a term, or year).

For accountability - to provide evidence of the effectiveness of teachers and schools.

These are all important purposes for assessment, but is describing a student by a level the best way to do all these things? In his excellent book Testing Times: The Uses and Abuses of Assessment (2008), Gordon Stobart warns that

in accountability cultures with frequent high-stakes testing, making headway with formative assessment will be more difficult (p. 159)

If levels are to be used for reporting, how easy is it to use the same assessment formatively?

But there is another concern too. I have often heard people say something like, “Well, we know levels are not perfect, but at least we know where we are with them.” But do you? In a useful report from NFER, Where Have All the Levels Gone? (Brill & Twist, 2013), the authors state: “There are well documented discussions about whether it is possible to make reliable and consistent judgements confirming that a pupil is working at a certain level.” So are the judgements that teachers have to make, and that management use as an accountability measure, valid? I suggest not.

But what to do?

York Science does not have the answer to how schools should monitor overall progress of a cohort, or report to parents on the progress of individuals, but we can help with monitoring progress in understanding within a lesson.

What do we need to do to describe progress?

Let’s begin by looking at what does not describe progress.

Particle model LIs

The two paragraphs above are both about the particle model of matter. One is from the new Key Stage 3 Programme of Study; the other is from the new GCSE criteria for GCSEs in Science and Chemistry from 2016. Can you tell which is which?

So statements like this are not enough: a teacher would want to know what a student needs to do to show learning. The programme of study doesn’t give any information about that; the statement is set out just as it is here. The GCSE criteria say that specifications should require students to “recall and explain the main features of the particle model in terms of the states of matter and change of state, distinguishing between physical and chemical changes”.

Teachers will want to know what the questions will look like by the time students take their exams in 2018, but they will have to wait until later this year or early next year for that; awarding organisations are working on this right now. In the meantime those students (currently in year 7) are already learning about the particle model. What sort of things would it be reasonable to imagine a key stage 3 student doing to show that they can recall and explain the particle model?

In York Science we begin by stating the Learning Intention, in this case:

  • Understand a basic particle model of matter that can explain states of matter and changes of state.

Then we write some Evidence of Learning Statements that describe the sort of things we think a student will be able to do as they develop their understanding of the particle model.
Here they are, in no particular order:


You could sort them into an order that provides some differentiation:

ELS sorted

But these could still be outcomes in a KS3 or KS4 scheme of work. What you need are the questions and tasks that will demonstrate evidence of learning at the stage the student currently is.

Let’s take one statement from that set:

“Understand a basic particle model of matter that can explain states of matter and changes of state”

You want to know if students understand how the particle model explains that liquids are runny. Here is a question written by Phil Johnson (2011) for his Assessing Students’ Concept of a Substance project at Durham University.

Johnson Q

This is a great diagnostic question to use at the start of a lesson – it will identify not only those that get it right, but also provide information about what those who get it wrong are thinking.

So after some teaching to develop the students’ understanding, the teacher wants to check that the students have made progress. Another question is called for (not the same one again – students might remember which answer is correct, but not understand why).

Another source of questions that check understanding is the Evidence-based Practice in Science Education project (Millar et al., 2002), which was based here at the University of York. This question comes from that bank:


This is more demanding than the last question – all the statements are correct, so now the student has to pick out the relevant statements, showing greater understanding than was needed for the first question. For a weaker group of students the teacher could tell them how many of the statements to choose.

Evidence of Learning – not evidence of levels

Individual questions and tasks can provide evidence of progress in understanding a particular idea at a granular level. But one question or task cannot provide hard evidence of the overall level of attainment of a student. Generic level descriptors should be used summatively, maybe once a term, and certainly not during the course of an individual lesson.

Planning for progress in a lesson

And so we come back to the idea of planning a lesson using the backward design approach:

  • Think about the Learning Intention – what learning do you want to take place in the lesson?
  • Identify the questions and tasks that provide evidence of where students are at the start of the lesson and whether there is progression in their understanding during the lesson.
  • Only then think about the teaching activities that will (hopefully) deliver that progression.

Thinking about obtaining evidence of progress first helps
to ensure that teaching activities are purposeful.

The presentation can be downloaded: YS Measuring levels or monitoring understanding


Brill, F. and Twist, L. (2013) Where Have All the Levels Gone? The Importance of a Shared Understanding of Assessment at a Time of Major Policy Change. NFER

DES/WO (1988) National Curriculum Task Group on Assessment and Testing: A Report

Johnson, P. and Tymms, P. (2011) Assessing Students’ Concept of a Substance project, Durham University

Millar, R. et al. (2002) Evidence-based Practice in Science Education project, University of York

Stobart, G. (2008) Testing Times: The Uses and Abuses of Assessment. Routledge


Reflections on ASE Annual Conference 2014

Alistair mans the stand at ASE 2014


If you attended the Association for Science Education Annual Conference in Birmingham last week, and if you’re anything like me, you probably spent Sunday in a state of zombie-like exhaustion. But hopefully you also – like me – feel that it was well worth the effort: you learned some things you didn’t know before, you made useful contacts, and you’re excited about putting some new ideas into practice.

The mostly dry and sunny weather in Birmingham was a pleasant change from the snow and howling winds of conferences past, and the beautiful University of Birmingham campus was a superb setting for a science education get-together. I joined the York Science project group in September 2013 after a number of years at OCR running the assessment and professional development programmes for GCSE Twenty First Century Science, so this was one of my first opportunities to talk about our resources with teachers.

What Would Your Students Say?

Practical 'Evidence of Learning Items' on the York Science stand


On our stand in the exhibition marquee we debuted our snazzy new York Science flier and challenged visitors to think about how their students would answer some of the York Science evidence of learning items (ELIs).

The bolt in the beaker provoked a lot of animated discussion, showing how a practical and very simple POE (predict-observe-explain) activity can be used to probe students’ (and teachers’!) knowledge – and misunderstandings – about forces. Play-Doh® cell models showed that formative assessment activities can be as fun for the students as they are informative for the teacher.

Many of the visitors to our stand on Friday were PGCE trainees. Although activities such as making cell models have been done in classrooms by many teachers for many years, to the trainees this was new and it was clear from their reactions that the power of using such an engaging activity as a way of testing knowledge was an exciting revelation. Dylan Wiliam (2011) wrote that “sharing high quality questions may be the most significant thing we can do to improve the quality of student learning”, and one of the aims of York Science is to share high-quality formative assessment items with teachers new and old alike.

Developing formative assessment in practice

'Evidence of Learning Items' created by teachers at our workshop


While the stand enabled us to teach new dogs some old tricks, our workshop on Saturday was all about exploring the potential of effective formative assessment with some more experienced teachers. Run by Professor Robin Millar & Mary Whitehouse, the workshop – entitled ‘Developing formative assessment in practice’ – guided teachers through the principles of ‘backward design’ curriculum development, emphasising the benefits of starting by defining what you want students to be able to do and identifying how you will assess that before deciding what and how to teach to help them do it. During the workshop teachers worked in groups to create formative assessment items of their own using York Science materials as templates. The session was so well attended that we had to move to a larger room to accommodate everybody, and feedback from the attendees was positive and encouraging.

The introductory presentation from the workshop is now available to download, and keep an eye on the Inspired by York Science section of this website where we will publish some of the ‘Evidence of Learning Items’ created by participants.

If you visited our stand or any of our events during the conference, please leave a comment below to let us know what you thought, or send us an email. You can also follow York Science, Mary Whitehouse and me on Twitter.

See you at ASE in Reading in 2015, if not before!

Great value professional development for science teachers

Discussion at a York Science workshop at the ASE Conference 2013


If you’d like to meet the York Science team – and other members of the University of York Science Education Group (UYSEG) – then come along to the University of Birmingham in the next few days.

It’s the ASE Annual Conference – an annual jamboree for science educators from across the world. Today (Wednesday) is the International Conference, which is followed by three more days of talks, workshops, and cutting edge science lectures.

I have written before about the ASE, and this conference is its most visible manifestation. Open to all, not just members, it is an opportunity for science teachers to meet with each other and also get some of the best value professional development possible. There is a wide range of talks and workshops to meet every need.

York Science will be presenting two sessions:

Thursday 9th January at 14.00  Getting to grips with lesson planning – we have been using our work in York Science to inform our PGCE Science programme at York. Anne Scott and Mary Whitehouse will be talking about how we have introduced the trainees to the backward design approach to lesson planning.  LR8 in the Arts Building

Saturday 11th January at 11.00 Developing formative assessment in practice – I have written about how teachers have been inspired by York Science to take some of the question types and ‘make another one like it’. In this workshop Robin Millar and Mary Whitehouse will present a range of question styles, with suggestions for how to adapt them to other contexts. Participants will develop new assessment items, which we plan to put on this website for everyone to share. Room 12, Muirhead Tower.

UYSEG also has a stand in the exhibition marquee – stand BS18. Come along: Predict, Observe, Explain, collect a sticker for the prize draw, and chat to the team.

If you haven’t booked, it doesn’t matter, just come along for the day – see you there!

Inspired to write more

You know you are doing something right when teachers tell you that they used one of your resources and now they have adapted it for other purposes.

It all started with the cupboard under the stairs.

I wrote a blog post about diagnostic questions and confidence grids, and soon after that @DoctorACook tweeted about how she had used the PLC3.1 Dark room Evidence of Learning Item. Then she made some more confidence grids of her own and wrote about them on this blog: Using diagnostic questions.

As time went on I heard about other people who were using the Dark Room question and making  more confidence grids. And I began to think about how we might share the ones people had made.

Other people picked up on my post about Making the best use of examination questions, including  @NeedhamL56 , who developed it to use in training for the Science Learning Centres. @hrogerson  used the examiner reports for OCR Gateway Science to make some questions for her students and wrote about it on her blog.

So people are picking up and running with our suggestions, which is exactly what we hoped would happen when the project began two years ago, and, even better, they are willing to share their items with everyone else.

Take a look at the new  Inspired by York Science tab at the top and dip into the resources these lovely people have shared. I hope you will be inspired to make some more and share them too. There are  even templates to download to get you on your way.





In praise of the ASE and School Science Review

The Association for Science Education (ASE) celebrated its 50th Anniversary in 2013. For as long as I have been working in science education the ASE has been part of my life – as a young teacher Education in Science  was my chief source of news about what was happening in science education and how I could get involved – of course there was no internet in those distant times.

Soon I was taking part in my first curriculum development project, spending my weekends meeting with teachers from across the country to write and trial Science in Society, led by John Lewis of Malvern College. Later, some of us from the Science in Society team went on to work with John Holman and Andrew Hunt on the development of SATIS. The SATIS project began in 1984 as the result of the report from an ASE working party convened to consider how the relationship between science, technology and society could be integrated into 11-16 school science. If you don’t know about SATIS, take a look at the resources, which are now all available in the National STEM Centre eLibrary. Although some may seem somewhat dated, there are many great ideas there for hooking students into science and developing ideas about how science works and ‘thinking scientifically’.

ASE has been publishing resources for teachers over many years, and continues to do so. The ASE bookshop publishes and sells a whole range of books that should be in your department library. There are also journals and magazines for science teachers, including School Science Review, a peer-reviewed journal for secondary science teachers. I must declare an interest here as I edited SSR for a while, but it really is a great journal: it arrives through the door four times a year and, aimed specifically at the classroom teacher, contains an eclectic mix of ideas for new experiments, book reviews, and longer articles about science teaching and science education research.

And if you are not yet convinced this is an organisation you should join, I must mention the role ASE has in representing all science teachers to the rest of the world. ASE is a key partner with the learned societies in SCORE, which enables the science community and the science education community to speak with one voice to government.

I could go on to list many other ways in which science teachers benefit from the work of the ASE, whether or not they are members – #ASEChat, #ASEteachmeets, the Annual Conference (which will be at the University of Birmingham in January) and so on… But ASE is a membership organisation that largely operates through the goodwill of its members, many of whom put in countless hours to ensure that the classroom voice is heard in discussions at the highest levels. So if you want to continue to benefit from its work, why not join now?

Oh, and the reason I started this paean to the ASE? There is an article about York Science, based on my presentation at the ASE Summer Celebration Conference in June this year, in the latest edition of School Science Review.

You can download the article here Embedding assessment to improve learning. To read about other sessions at the conference you will need to become a member of the ASE.

Edited to add a link to SATIS Revisited mentioned by Nick Swift in the comments.

From the National Curriculum to a Scheme of Work

So you have downloaded the Programme of Study for key stage 3 Science and you are wondering where to start in making that into a scheme of work for early secondary school? You might start by reading about the principles behind backward design. This suggests that you begin by thinking about what you want learners to know, understand, and be able to do by the end of your teaching programme. That question might be better asked as ‘what do you want your students to know, understand and be able to do as they begin their examination programme?’ – which might be GCSE Sciences (separate sciences or double award?), BTEC Science, OCR Nationals, or …

To help you with this you will want to know what the new GCSEs will look like. We don’t really know yet, but the (now closed) Consultation on the subject content for GCSE Sciences will give you some clues. It is clear from that document that there is an intention to make the content at GCSE more demanding. So you will want your students to come to those courses with a solid understanding of the science ideas you have taught in the early years of secondary school.

In fact it may be better to think of the first five years of secondary school as a continuum, with a diverse range of assessments at the end of year 11. So in the early years you will be teaching a fairly common curriculum, and at the same time thinking about what course will be most suitable for each student. At some point, probably during year 9, decisions will be made about the curriculum pathway most suitable for each student.

How can York Science help with your planning?

Let’s look at an example. In the Key stage 3 programme of study for Chemistry there is this:

Pupils should be taught about:
The particulate nature of matter

  • the properties of the different states of matter (solid, liquid and gas) in terms of the particle model, including gas pressure
  • changes of state in terms of the particle model

(DfE (2013) Science programmes of study: key stage 3. p. 8)

In the resources area of this website is a section about the particle model of matter, including a list of some of the things we think a student will be able to do if they understand a basic particle model of matter that can explain states of matter and changes of state. For instance, we think they should be able to:

  • describe the main features of the particle model
  • identify limitations in representations of the particle model
  • explain how the particle model distinguishes between pure and impure samples of substances
  • use the model to explain characteristics of substances in the solid, liquid and gas states
  • use the model to explain changes of state

This is not an exhaustive list, but if students could do all these things successfully they would be well prepared for starting a course in GCSE Chemistry – where the content includes this statement:

  • recall and explain the main features of the particle model in terms of the states of matter and change of state, distinguishing between physical and chemical changes

(DfE. (2013) Science: GCSE subject content and assessment objectives. p.17)

So, if you were writing a scheme of work for the early years of secondary school, you might use the list above as the intended learning outcomes for a section of work on the particle model.

And then of course you need a set of questions and tasks that will provide the evidence for each of those items. This is a crucial part of the planning: making sure you have items that you can use in the course of your teaching to provide evidence that learning has taken place. This is what York Science is doing: developing questions and tasks for each of our stated learning intentions. (We call these Evidence of Learning Items, or ELIs.)
For instance, a student who understands the particle model should be able to use it to explain characteristics of substances in the solid state. Here is a question you might project on the whiteboard and ask students to think about in groups:

CSU2.4aP Particles and the solid state

This question is taken from a large bank of questions written for the
Evidence-based Practice in Science Education (EPSE) Research Network

The question alone will not tell you whether they have learned the particle model statements – and learning by rote without understanding is not enough. Listening to the conversations of the students as they discuss their answer will help you find out whether they understand the particle model. So much of what follows in studying all the sciences depends on a good understanding of the particle model that it is worth spending time in the early years to ensure students have a good grounding.
This question is one of a set checking students’ understanding of the particle model of solids. You can download the set from the resources page.
And if you haven’t tried the confidence grid approach, here is an opportunity:

CSU2.4aP Particles and the solid state grid

So please download the frameworks that are currently available on the resources pages and use them in your planning. Let us know if you think they will be useful, and whether you would like more of them to become available over the coming months.

