Letting Your Students Do the Work: Student-Generated Exam Questions – Veronica Kitchen

My first year at Waterloo was also my first year teaching my own courses. In mid-November I found myself overwhelmed by the task of writing exams for my large second-year political science course. Since I’d never taught the course before, I had no question bank, no old exams to adapt, and not a whole lot of spare time. I struggled to write enough questions to populate my exam, plus a make-up or two. I mentioned this demoralizing state of affairs to Nicola Simmons at a CTE event.

“Why don’t you get your students to write the exam?” she suggested. Surely this was wrong, I thought. I’m the teacher, and writing the exam is my job. Surely letting my students write the exam would be tantamount to giving them the answers beforehand. But then I thought about it a little more, and decided it was worth a try. At the very least, writing exam questions would be a good review for students.

This year, when teaching the same course again, I offered students the opportunity to write multiple-choice and short-answer questions (and answers!), and promised to consider using them on the final exam. I set up discussion pages in UW-ACE to receive the questions so all students in the class could use them as a review tool. I offered a bonus of up to 2% on the final mark to students who submitted questions I used: 1% for a question I used more or less intact, and half a percent for a question I had to modify significantly (for instance, by re-writing the distractors on a multiple-choice question). I pointed students to two resources on how to write multiple-choice exam questions, and asked them to consider how short-answer questions would fit into Bloom’s Taxonomy.

The initial response to my announcement that I would let my students create the final exam was disbelief, followed by silence on the discussion boards for most of the term. I was sure my experiment was a bust. But as the exam period drew near, students finally began to submit questions. I was pleased with the results: many students wrote questions similar to ones I had drafted myself, suggesting that they were grasping the important points of the course. Others were, sometimes with a little editing, certainly worthy of a place in my question bank, and I used several on the final exam (without telling students beforehand which ones I had used, of course). Students used the questions for review, although some of the student-written questions made them a little panicky. I had to reassure them that I would not be asking questions of the form “On page 67 of the textbook, which of the following arguments does Author X make?”, and that I would only ask short-answer questions that really did have short answers.

The class average on the final exam was fairly close to the previous year’s average, so my concern about giving away the answers was obviously unfounded. Writing the exam(s) and answer key was still tedious, but having students write the rough draft of questions helped by pulling together important concepts from each lecture and the readings, and by generating examples and applications I didn’t have to dream up on my own. For the students, the exercise resulted in a set of review questions everyone in the class could use. There is probably a larger discussion here on students writing their own assessments—and I’d be interested in hearing how other instructors have used this technique to manage their own time and to benefit their students.


2 thoughts on “Letting Your Students Do the Work: Student-Generated Exam Questions – Veronica Kitchen”

  1. Sounds like a great class activity, Veronica.

    I just wanted to bring attention to a similar learning activity in the UW-ACE Instructor Resource Repository. In this activity, an online drop box is used to gather multiple-choice questions from students on topics that will be assessed on their midterm or exam. The instructor then uses these questions to create an online self-assessment quiz that students can access before the main assessment. It is called the “Multiple Choice Exam Preparation” activity, and only the students who submit questions to the drop box are “allowed” to benefit from practising with the self-assessment quiz (providing some incentive to be a contributor!). This activity is easy to import into any UW-ACE course and has been used successfully as both an individual and a group activity on our campus.

  2. This is a very interesting article. With professors as overworked as they are, this technique for writing exams could prove useful for gauging students’ understanding of the course material, and could alleviate a little of professors’ workload.

    However, from the point of view of a student, I believe that educators have become so focused on administering tests, assignments, and exams to evaluate students that the spirit of learning is being eroded from the classrooms of Canadian universities. This trend of increased focus on evaluation can be seen in the proposal made to the University Senate to allow tests on Saturdays.

    As students, we are seeing an increase in the number of midterms in various classes, from none or one to one or two per term. For many of my colleagues, it is no longer a question of learning the material taught in class, much less going beyond what is taught. It has become a matter of passing the next evaluation.

    This increased focus on evaluating students is costly to both course staff and students. In courses where the number of evaluations is increasing, professors and teaching assistants must spend more of their time correcting evaluations when they could be mentoring students or performing research, tasks that are also expected of university professors.

    For students, the impact of the increase in evaluations is twofold. Firstly, due to the increased workload, students must spend more time working on evaluations. As such, they must reduce the amount of time they have available for other activities. For some, much-needed leisure time will be sacrificed “for the good of their grades”. Unfortunately, most will cut into the time they have allotted to themselves for studying. As many will acknowledge, a student spends much more energy working on assignments (in terms of time per amount of knowledge gained) than they do when studying, because of document formatting and content-exactness issues. Thus, students who have more evaluations will retain less knowledge than their counterparts who have fewer.** Secondly, students may (and a significant number will) decide that learning the subject matter is less important than passing the evaluations and the course. As a result, they will limit their studies exclusively to the materials essential to passing the evaluations, inherently “dumbing them down”.

    Another issue that has come to the attention of students is the level of bureaucracy that has inserted itself into the academic curriculum, but that is a rant for another blog post.

    ** It should be noted that fields where application of the subject matter is essential may not follow the same trends. There are always exceptions.
