The July edition of Modern English Teacher was, as usual, an exceptional read with a strong focus on classroom research. I have already shared my thoughts on the benefits of teacher-led research in previous posts (see here for the first in the series) so in this post I will comment on two articles from the special issue that focus on preparing students for language assessments (along with another idea I was reminded of).

These are of particular interest to me as I will be starting a new job in September, where I will be expected to overhaul the assessment system for young learners. The challenge is how to ensure preparation lessons encourage active student involvement and do not simply end up as a never-ending series of past papers and practice activities.

The first article to catch my eye was by Alison Crooks: Learner-generated material – is it worth the effort? This contribution focuses on preparing teenage and adult learners for the PET and FCE tests by setting them the task of creating their own practice questions. It caught my attention in particular because it echoes my first article for MET, Making exam preparation child’s play, from the July 2015 edition.

Despite the different age groups (I wrote about getting primary learners ready for the Starters, Movers, and Flyers tests by having them create their own questions and test-style tasks), the effects were very similar. Students gain a much better understanding of the demands of each exam task when given the opportunity to construct a task themselves. It also allows them to work autonomously and collaboratively instead of in isolated lockstep, ploughing through sample question after sample question. There is also a strong opportunity for personalisation, as students can select themes that are important and relevant to them.

Crooks gives a wonderful example of a student who lacked confidence becoming engaged and motivated by creating her own open cloze test based on an article that interested her personally. In my experience, my primary learners loved how we exploited their own pictures for ‘describe the differences’ and true/false tasks, as well as their short stories for comprehension and picture-matching activities.

I was particularly impressed by the support Crooks gave her students to help them create their own listening tasks. They drafted and revised scripts, made recordings (something I was unable to do, as access to quality audio recording equipment was not so prevalent back in the dark ages of about five years ago!), and wrote multiple-choice questions to test their peers. Crooks finds, as I did when getting my learners to make their own reading tasks, that this results in much deeper engagement with the source material and the task type than presenting the class with yet another practice test. They also love taking on the teacher role and quizzing their classmates!

Another article in the current edition of MET with a focus on test preparation is Miranda Hamilton’s Classroom research: rethinking testing. The context this time is quite different from my own, as Hamilton describes how she helped adult learners prepare for a weekly progress test. However, the idea is one I have also used with young learners at both primary and secondary level – scaffolded assessment. This involves preparing for a test by allowing students to work collaboratively on practice questions in an open-class format.

As mentioned above, one issue with exam practice questions is that, if taken at face value, they can become solitary activities performed in lockstep, with whole-class feedback afterwards. Allowing students to collaborate and refer to notes and materials makes the process more engaging and supportive. In my own experience, even young learners are happy to help each other determine the best answer, and as they work together, I have time to circulate and provide support in a way that would be impossible if the students were working on the task silently.

Hamilton also notes that using this open-class format results in students paying more attention to task instructions and rubrics. Under the stress of exam conditions, learners often dive straight into the questions, missing key elements of the instructions (especially a problem for secondary students when years of true/false at primary level are suddenly replaced by true/false/not given). Working collaboratively, they are more likely to process these instructions fully and drop fewer marks through careless errors.

Hamilton’s approach reminded me of a task I have used in the past with secondary students aged 11-13 – ‘Cheat’. I came across this idea in Luke Meddings and Lindsay Clandfield’s book 52, and the premise is simple – inform your students that you will let them bring one page of handwritten notes into the test (what would otherwise be referred to as a ‘cheat sheet’). The trick is that, as the notes have been prepared by the students themselves, the class often end up better prepared than they normally would be. Of course, when I used this myself, it was for an informal in-house test that I had prepared – this approach is not to be recommended for high-stakes external assessments!

Collectively, all of the above also brought to mind an ‘official’ test I prepared for the same group of learners (this time, one that was used with the entire year group). For this test, while no cheat sheets or collaboration were allowed, I made it an open-book assessment. The students were being tested on set texts they had read in class, and an open-book format allowed me to pose questions that demanded they infer and process information instead of simply searching for factual answers to right/wrong questions.

There was also a vocabulary section testing ‘key words’ from the texts, and for this I got the students to write their own questions. Across the year group, over 100 questions were generated, and ten of the best (as in, not ridiculously easy or incredibly difficult) were included in the actual test, which really motivated the students to contribute and engage.

(You can read more about the approach I used and thinking behind it in this post from my personal blog.)

One interesting point to close on – testing is a results-based business, so does any of the above actually result in improved test scores? Well, no, not really. With my own students preparing for Starters, Movers, and Flyers, the average scores were similar to those of their peers who had gone through the practice materials in the ‘normal’ way. Hamilton also notes that having a scaffolded practice test ahead of the actual test did not result in an increase in average weekly scores.

So why bother promoting this increased student involvement? The answer lies in how you view the concept of testing. If you are looking for a simple measurement of what has or hasn’t been learned, then you may want to stick with standard practice activities. If, however, you view assessment as part of the learning process, and you want your students to be more engaged and less stressed about upcoming tests, then these approaches are the answer. They encourage students to get inside the exam question format, gain a better understanding of the task requirements, and engage with the texts and task types on a more personal, student-centred level.

After all, we learn more by asking questions than answering them!