
Post date: Thursday, 8 September 2016

Argumentation: another candidate for measuring progress?

When I reflect on educational aims at the school level, I think of the key competencies in the curriculum (Ministry of Education, 2007). And these key competencies (thinking; using language, symbols and texts; managing self; relating to others; and, participating and contributing) all seem to be subsumed under thinking if thinking is interpreted broadly. (Begg, 2016)

Andy Begg is a widely respected mathematics educator. His comment above introduces an article in the latest edition of Teachers and Curriculum. I’ve added the emphasis at the end to highlight just how complex thinking is – and to underscore that we can frame it narrowly or more broadly, and broadly is surely better for educative purposes. Andy describes nine broad and partially overlapping forms of thinking, one of which is critical thinking.

I’ve come to similar conclusions about the centrality of different forms of thinking to key competency development and deployment. I recently developed a framework to capture the many different aspects of critical thinking (when interpreted broadly), to be used in a retrospective exploration of how key competencies were assessed in the first round of NMSSA – the National Monitoring Study of Student Achievement at Years 4 and 8. I’ll come to what NMSSA has to say about primary school students’ progress in learning science in an upcoming post. Here I want to pick up from my last blog post to explore a second aspect of critical thinking.

In the previous blog I introduced causal thinking as a potential candidate for measuring progress in learning science.  This week I discuss argumentation. These are overlapping but distinct aspects of critical thinking. I see both of them as important additions to the ‘citizenship’ toolkit that students take away from their science learning at school. The developers of the PISA assessment framework apparently think so too. The image below shows how the PISA scale models progress in this aspect of science learning.  


Note: this comes from a slide, and I did some abbreviating in the interests of readability for the large audience at SCICON.* Where you see dots, add the phrase “explanations, models, interpretations of data, experimental designs”.

* The goalpost clip art is from Shutterstock, and the increasing size is meant to signal that the goal is getting bigger and harder.

Level 6 ups the ante from Level 5 by specifying that students can do these things in a range of contexts. But notice that the sense of progress turns on how the qualifiers are understood. What is a partial argument? What is a simple argument (and how does it differ from a more complex argument – which seems to be the implied contrast)? Our experiences with NCEA have demonstrated all too clearly that semantics can only take us so far when it comes to exemplifying the ‘standard’ in statements like these. Fortunately, help is at hand – argumentation is a hot research topic in science education.

The team at Stanford University, led by prominent science educator Jonathan Osborne, have been researching progression in relation to the structure of arguments. In this web page, Jonathan explains why argumentation capabilities are vitally important as an outcome of science for citizenship. That helps establish the ‘so what’ link to NZC’s stated purpose for all students to learn science “so that they can participate as critical, informed, responsible citizens in a society in which science plays a significant role” (NZC, p.17).

Plus, this research is very helpful for answering the questions I just posed. The team at Stanford have developed a carefully researched progression that shows how claims (C), evidence (E) and reasoning (R) gradually become linked in more complex ways.

To illustrate, I’ve clipped Level 1c from their progression:   

So if this is a ‘complete’ (simple?) argument, what might a more complex one look like? Notice how the ante is really upped at the following two levels:

You can access the full progression on the web page, and there is a video that explains it in more detail. The research team worked with middle school students but, looking at the top levels of the progression, I wonder how many of our students in their NCEA years can show this level of argumentation ability. Again, as for causal thinking, the inclusion of partial and very simple arguments at the lower levels of the PISA scale suggests that a sizeable number of students are missing out on something they could, and should, be taught to do.

Other researchers have also demonstrated how much harder it is for students to make counter-arguments. Deanna Kuhn, for example, has been researching this topic for many years, and not just in science education. Some of her recent research has shown how carefully structured group experiences help students learn to make carefully reasoned counter-arguments – even if, as individuals, they don’t do this when writing formal essays (see the 2015 Educational Researcher article listed on her webpage but not hyperlinked).

In summary, I think there are a number of important reasons why we might want to focus on assessing students’ progress in argumentation:

  • Argumentation is an important capability for citizenship
  • A sound research basis for how and why progress occurs has been established 
  • Knowing this, students can get better at argumentation if they experience explicit teaching, and practice ‘next steps’ that are appropriate for them
  • All the key competencies are likely to be in play (a topic we could expand on if this was our agenda)
  • An argument is ‘about’ something – the judicious selection of science topics has the potential to integrate argumentation with ‘content’ learning

So there it is. I’d be really interested to hear from anyone who has been developing and assessing argumentation as an explicit science learning focus. I’m also wondering now if this might be a useful framework for revisiting some of the NCEA achievement standards that are still content-heavy. If we did this at a national level, we could generate explicit support material, relevant to NZC, that all science teachers could access.

Meanwhile, next week I’ll take a look at Science Thinking with Evidence – NZCER’s assessment tool for Years 7-10 that incorporates some elements of both argumentation and causal reasoning.     
