Advanced Assessment Considerations

We spend a fair bit of time in the CTE talking about assessment, whether in workshops, one-on-one consultations about teaching, curriculum meetings, teaching observations, or just in passing conversation among colleagues. In these instances, discussion often focuses on the function of the assessment – is it being used for diagnostic (to ascertain prior knowledge), formative (to provide feedback and improve performance during the learning process), or summative (to evaluate the learner’s knowledge, skills, or values) purposes? Discussion naturally revolves around the type of assessment being used, as well as questions regarding grading, but there are aspects of assessment that go beyond this and begin to explore the role of the student in the assessment process.

When thinking about assessment and how we position it in our teaching, it can be helpful to think about it conceptually while considering what purpose we as instructors assign to it. To do so, we might turn to three approaches to assessment: assessment of learning, assessment for learning, and assessment as learning.

Assessment of learning suggests that assessment is being done for the primary purpose of determining what students have learnt in the class; it is typically done in a summative manner so that students can receive a grade.

Assessment for learning views assessment more as a process, whereby students learn from the assessment; feedback is of great importance as it assists the student in the learning process.

Assessment as learning takes this even further, situating students themselves as assessors who therefore take greater responsibility in the learning and assessment process.

None of these approaches to assessment is inherently better than the others, but I would encourage you to reflect on your assessments from a different perspective, adopting what I call Advanced Assessment Considerations. These considerations are intended to allow you to reflect on and evaluate your assessment during the design, facilitation, and grading & feedback stages. They do not suggest distinct or varied methods of assessment, but rather attempt to provide options as to how to modify an existing assessment to make it even better. And by better, I mean better for the students – assessment should motivate and empower students to demonstrate what they have learnt, and the more we can have students become invested in the assessment process, the more meaningful it will be to them.

So what are these advanced assessment considerations? I’ll run through each briefly, providing some initial insight into what each consideration entails and leave you with some questions to consider.

Design considerations apply to how the assessment itself is constructed before the students begin to actively work on the assessment. They are intended to provide learners with agency to be contributors to the assessment and determine what their role will be in the assessment.

To incorporate these into our assessments, we should look at things such as:

  • Student choice
    • Develop student responsibility for learning with controlled options for assessment; students can choose the type of assessment they feel most suits their learning needs, but must reflect on why they chose it/didn’t choose others (Weimer, 2011)
  • Low-stakes and high-stakes assessments
    • Implement a variety of low-stakes and high-stakes assessments that allow for students to build confidence and motivation and alleviate stress or anxiety; create a culture of routine assessment, and don’t be afraid to allow strategies (such as cheat sheets or open-book resources) to alleviate the stress of high-stakes assessments such as exams; good students do well, and poor students do poorly, regardless of the exam type (Gharib & Phillips, 2013)

Facilitation considerations apply to the process whereby students complete the assessment, and the ways in which the instructor can help manage this process. They are intended to ensure that assessments are structured logistically and rationally so as to promote student learning and alleviate difficulties that are external to the objectives of the assignment.

These can be implemented when considering the process underlying the assessment by thinking about aspects such as:

  • Spacing effect
    • Ensure sufficient space between assessments so as to continually reinforce understanding before information is forgotten; the repetition of course content and the process of retrieving content as students begin to forget it helps to reinforce and solidify understanding (Cepeda, Pashler, Vul, Wixted & Rohrer, 2006; Kornell, Castel, Eich & Bjork, 2010)
  • Testing effect
    • Ensure that the formative assessment in a course matches the type of summative assessment; students learn better by testing course content as opposed to simply studying from a textbook, and they learn especially well if testing is accompanied by some form of feedback (Agarwal et al., 2007)

Finally, grading & feedback considerations apply to the means by which feedback is provided and received by students as a means to close the learning cycle. These ensure that sufficient and meaningful feedback is provided to each student multiple times throughout the course, and that the provided feedback is reflected upon by the student.

In some ways, these are the most important, as they directly encourage the learner to be involved as an assessor, to a degree that can vary depending on the assessment. To involve the learner in this capacity, consider the following:

  • Peer feedback
    • Peer feedback allows for rich and meaningful insight that can directly support and scaffold learning (Dochy, Segers & Sluijsmans, 1999)
  • Self-assessment
    • Students need the opportunity to examine feedback and determine how to improve through structured means; this can be especially powerful if part of the peer review process (Reinholz, 2015)
  • Student-led feedback
    • Support individual student development by encouraging reflection on their own learning goals before receiving feedback; have students take ownership of what they want to receive feedback on
  • Assessment guideline creation
    • Involving students directly in the creation of assessment guidelines can result in sustained motivation; students can create their own rubrics for the assessments they are completing in consultation with the instructor

Ultimately, depending on your assessment, some of these considerations may simply not be applicable, and attempting to adapt many of these to a single assessment may prove equally challenging. I would however encourage you to think about how some of these considerations could find their way into your own assessments to make the experience all the more meaningful to students and instructors alike.


Agarwal, P.K., Karpicke, J.D., Kang, S.H.K., Roediger III, H.L. & McDermott, K.B. (2007). Examining the testing effect with open- and closed-book tests. Applied Cognitive Psychology, 22, 861-876.

Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132, 354–380.

Dochy F., Segers, M. & Sluijsmans, D. (1999) The use of self-, peer and coassessment in higher education: A review. Studies in Higher Education, 24(3), 331-350.

Gharib, A. & Phillips, W. (2013). Test anxiety, student preferences and performance on different exam types in introductory psychology. International Journal of e-Education, e-Business, e-Management and e-Learning, 3(1), 1-6.

Kornell, N., Castel, A.D., Eich, T.S., & Bjork, R.A. (2010). Spacing as the friend of both memory and induction in young and older adults. Psychology and Aging, 25(2), 498-503.

Reinholz, D. (2015). The assessment cycle: A model for learning through peer assessment. Assessment & Evaluation in Higher Education, 41(2), 1-15.

Weimer, M. (2011, June 21). A rose for student choice in assessment? Retrieved from


Kyle Scholz

Faculty of Arts and University Colleges Liaison

More Posts - Website

Thank you, Jane Holbrook, and all the best!

Observers of educational development at Waterloo will know that we’ve had a teaching centre onsite for 40 years. Christopher Knapper was the founder of the Teaching Resources and Continuing Education (TRACE) unit in 1976, which kept the same name until the 2006-2007 academic year, at which point a merger with Learning and Teaching Through Technology (LT3) and Learning Resources and Innovation (LRI) led to the formal creation of the Centre for Teaching Excellence pretty much as we know it today.

I think I’m feeling rather wistful and nostalgic at this point because our Senior Instructional Developer for Blended Learning, Jane Holbrook, retires this week. We can hardly believe this to be true, but true it is. I’d like to take a moment to acknowledge Jane’s work with LT3 and CTE. It’s difficult for me to accept that this marks nearly 10 years since CTE’s inception, and the occasion of Jane’s retirement is cause for reflection about where we’ve come and where we’re going. Mainly, though, it’s an opportunity to appreciate Jane’s contributions to scholarship in the areas of blended learning and educational development, as well as her commitment to supporting our Waterloo teaching community over many years.


Jane back in the day

Jane started teaching courses in Biology here around 1989, but in or around 2001, prepared a report for Tom Carey in LT3 about a new model of support for educators in Waterloo’s six Faculties. The result? Our much-praised and oft-copied Faculty Liaison model. Jane took up one such role, for Science, and others followed soon thereafter. I can remember looking at LT3 first from my vantage point at Trent’s Interactive Learning Centre, and later from Guelph’s Teaching Support Services, with a certain amount of envy — in large part because of this model.

I was very happy to join CTE, then, and to work directly with people whose efforts and processes I’d admired from afar. I was not disappointed. In the 9 years I have worked here as a Senior Instructional Developer, I have relied on Jane as a source of wisdom, especially as I learned the ropes of managing other people and managing multiple projects.


Jane in 2016

Jane is a model of honest, astute, intelligent leadership. She never shies away from difficult conversations, always provides incisive input on university-wide and CTE committees or as a personal mentor, and pulls more than her share of administrative weight at one of Canada’s largest teaching centres. I aspire to emulate her level-headed, savvy, and caring approach towards both people and projects.


Jane blends stuff

Working on blended learning initiatives, Jane has applied her considerable creativity and scholarly approach in ways that have helped many professors to think differently about their practice, and indeed change that practice in ways that increase learning for many generations of students to date, and many more to come.

Thank you, Jane, and all the best in your own next steps. I am thrilled to be working alongside Mary Power, your replacement in the SID role, and will also miss you enormously.


As Senior Instructional Developer, Curriculum and Programming, Trevor Holmes plans and delivers workshops and events in support of faculty across the career span. Prior to joining the Centre for Teaching Excellence, Trevor worked at a variety of universities teaching courses, supporting faculty and teaching assistants through educational development offices, and advising undergraduates. Trevor’s PhD is from York University in English Literature, with a focus on gothic literature, queer theory, and goth identities. A popular workshop facilitator at the national and international levels, Trevor is also interested in questions of identity in teaching and teaching development.


CTE’s 2015-2016 Annual Report — Mark Morton

CTE’s 2015-2016 Annual Report is nearing completion and will soon be sent to the printers. It’s hard work creating the report, but also revealing and affirming: it gives us a chance to look back over the past year and discern what we have accomplished from a “big picture” perspective. And of course it also helps us reorient ourselves for a new year of activity.

As a preview of our 26-page report, I’ll paste below some of the achievements that our Director, Dr. Donna Ellis, highlights in her preamble to the report:

  • Thanks to strategic plan funding, we hired a new Instructional Developer to assist with the development of our students’ communication skills. This staff member helps instructors at all levels learn strategies for teaching and assessing writing across the curriculum, as well as supports our instructional programs for graduate students.
  • We contributed to two university-wide committees on large-scale change projects to assist with teaching quality: one on student evaluations of teaching and another on teaching and learning spaces. We bring research evidence and best practices to bear on these important and complex initiatives.
  • In conjunction with the Graduate Studies Office, we launched a two-day Graduate Student Supervision series to ensure high-quality graduate instruction and assist new faculty members in attaining supervision status.
  • With colleagues from Western University and Queen’s University, we developed two of six new online modules on university teaching for use in our instructional programs.
  • We increased participation in our instructional development programming: since 2013, the number of unique participants in our workshops has increased by 19 per cent, with total workshop completions increasing by 37 per cent. This increase reflects an improved uptake, as our total number of workshops increased by only 23 per cent in the same timeframe.
  • We added more instructor profiles to our high-traffic website to help promote public awareness of Waterloo’s teaching excellence.
  • We started three projects to encourage innovative methods of course delivery using learning technologies. One project involves developing a new process for soliciting information from instructors about their use of learning technologies (beyond LEARN) so we can report on their usage and facilitate the sharing of best practices.

If you’re interested in receiving a copy of CTE’s 2015-2016 Annual Report, just let me know. We’ll also be adding a link on our website to an accessible PDF version.

Mark Morton


As Senior Instructional Developer, Mark Morton helps instructors implement new educational technologies such as clickers, wikis, concept mapping tools, question facilitation tools, screencasting, and more. Prior to joining the Centre for Teaching Excellence, Mark taught for twelve years in the English Department at the University of Winnipeg. He received his PhD in 1992 from the University of Toronto, and is the author of four books: Cupboard Love; The End; The Lover's Tongue; and Cooking with Shakespeare.


SAM vs. ADDIE and 5 other takeaways from Madison — Tonya Elliott

The Distance Teaching and Learning Conference held in Madison, Wisconsin, has been running for 32 years and is the largest and longest-running distance education conference in the USA.

I’m writing this blog from Frank Lloyd Wright’s gorgeous Monona Terrace after 3 full days of #UWdtl keynotes, presentations, demo booths, ePosters, and discussions.  I was one of 4 Canadians who attended and presented to a room of 45 people about the online STEM/Math work we’ve been doing at Waterloo.

It will take me some time to fully digest everything from the conference, but here are 6 takeaways that immediately stood out.

1. Most instructional design models are some derivation of ADDIE (analyze, design, develop, implement, and evaluate), but ADDIE’s applicability to digital environments has been under scrutiny for some time. Other instructional design models are emerging, such as Allen’s Successive Approximation Model (SAM) and McKenney and Reeves’ problem-focused Education Design Research (EDR). Their books are referenced below, and I now have two new books on my Amazon wish list!

2. Learning analytics are becoming more prevalent and the potential to better understand our learners with concrete data is awesome. Learning analytics happens at three levels, and each level involves both understanding what’s happening and sharing the information with students in a way that’s useful for them. Here are the levels:

  • what students are doing today/have been doing in the past,
  • where students are likely going (using predictive modelling), and
  • where students have the potential to go/what is their optimal path.

Unfortunately, nobody at the conference had concrete examples about implementing levels 2 and 3 in any great depth, but I enjoy thinking about analytics in these three parts.

3. Some institutions spend a lot of money on remote proctoring services like Examity. Math or other courses that can’t easily require students to complete their timed work electronically are trying out things like requiring students to position their web cameras downwards towards their papers and hands. Unsurprisingly, privacy issues are surfacing. For example, same-sex options are now required at some institutions after female students reported being uncomfortable having unknown male proctors watching them work in their bedrooms.

4. You’re more likely to get buy-in for new initiatives if you start small. People are almost always willing to let you pilot something, and pilots can quickly and easily turn into beta versions.  If a beta version works out for a project, it’s almost always seamless to fully implement (and find funding for) it.  This path is much more efficient than trying to find approval for or fund something “big”.

5. Hooks are necessary: courses and classes should start with stories, problem questions, or other “hooks” instead of a bulleted list of outcomes.   Similarly, rather than nicely wrapping up a class, they should end with another “hook” to get students thinking about the next class.  Cognitive psychologists refer to this process as an open-loop.

6. Wisconsin’s recently launched Online Teaching Experiences site has been very well received, and their site analytics reveal that the most popular part of the site is the instructor videos. I wonder if, in addition to our Instructor Community of Practice, CEL should investigate/create (digital) resources and videos for our fully online instructors. Would this kind of resource be valuable at Waterloo? I’d love to hear our online instructors’ thoughts about this (so please email me your thoughts).

References and resources

Tonya Elliott


In her role as an Online Learning Consultant (OLC) with the Centre for Extended Learning (CEL), Tonya Elliott provides instructional design and project management support to faculty and staff who wish to design, develop, and/or deliver fully online courses, programs, and resources. The majority of her projects are with members of the Faculty of Mathematics; however, she really enjoys working on a variety of online projects from faculty and staff from all areas of campus.


Course Design Broke my Brain – Crystal Tse


I took Course Design Fundamentals a few weeks ago, and it broke my brain – in a good way! I have taught before, but this was a great opportunity for me to revisit the course that I’ve been teaching for the past few years from a fresh perspective.

Here are some of my take-aways from this workshop that lays out the best practices for course design:

  • Alignment, alignment, alignment – between the intended learning outcomes for your students in the course, the course activities, and the assessment of students’ learning. It was great to have this connection made explicit. However, it was also a jarring experience as some of the concepts I wanted my students to learn were not made explicit in the activities the students engaged in. Time to remedy that!
  • Concept maps for your course are tough to make! I had never created one before for my course and was at a loss at first as to how to structure it and what main concepts I wanted my students to get out of my course. A bit of brainstorming and lots of sticky notes later, I finally fleshed out the main concepts. Two of them were actually not about course content. One was about helping first year students transition to university life (e.g., coping with stress effectively, how to study and take tests). I spend my first lecture telling students about my own experiences as a first year student – that it’s difficult and stressful, but that the stress is temporary and can be overcome. I revisit this point by telling stories of my own failures and successes, talking about healthy living, and checking in with students throughout the term. Another way to help with students’ transition is to build community in your classroom so students have support networks they can draw on in times of stress and uncertainty.
  • The other concept was to encourage metacognitive skills (i.e., how to encourage students to reflect and think about their own learning). I do different lecture wrappers (e.g., one minute summaries where students spend a minute writing about the main take-away from the class and what questions they still have that can be addressed in the next class). CTE has a great tipsheet on strategies you can use to encourage self-regulation in students’ learning that can be quick and don’t require a complete overhaul of your course. There are also many evidence-based strategies based on psychological research that can help students study more effectively and engage in more critical thinking.
  • Thinking more about incorporating students’ own experiences into the course in addition to my own perspective. Students come with a wealth of prior knowledge and life experiences that can be drawn on. In the past I have solicited students’ anonymous comments about a topic in the course (especially one that can be particularly controversial or sensitive) prior to class so they are ready for discussion. I’m excited to do this more!


Image provided by Aaron Silvers under the Creative Commons “Attribution” license.

Crystal Tse


Crystal is the Educational Research Associate at the CTE where she contributes to the Scholarship of Teaching and Learning work and to program evaluation. She received her PhD in social psychology in the Department of Psychology at the University of Waterloo, where her research involved applying psychological theory to inform evidence-based interventions that address different social issues.


Crowdmark – Online grading for large courses

This Fall 2016, the University of Waterloo will have 25 courses with a class size of between 500 and 1000 students and 10 courses of between 1000 and 2000 students.

The amount of paper handling to administer the potential 33,000 final exam papers from these large courses will be monumental. (For fun, guesstimate the volume of paper this amounts to.)
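Taking up that invitation, here is a back-of-envelope sketch: the class-size figures come from the post, while pages per exam and ream thickness are assumed round numbers.

```python
# Back-of-envelope estimate of the final-exam paper volume.
large_courses = 25 * 750    # 25 courses, midpoint of 500-1000 students
huge_courses = 10 * 1500    # 10 courses, midpoint of 1000-2000 students
papers = large_courses + huge_courses   # ~33,750 exam papers

pages_per_exam = 10         # assumption, not from the post
sheets_per_ream = 500       # a standard ream
ream_height_cm = 5.0        # a 500-sheet ream is roughly 5 cm thick

pages = papers * pages_per_exam
stack_m = pages / sheets_per_ream * ream_height_cm / 100
print(f"{papers} papers, {pages} pages, a stack ~{stack_m:.0f} m tall")
```

Under those assumptions that is a single stack of paper roughly ten storeys tall, which makes “monumental” feel about right.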

The Mathematics Faculty has been successfully experimenting for a year with an online grading system called Crowdmark, developed by a company founded by Professor James Colliander of the Department of Mathematics at the University of Toronto.

Professor Colliander was faced with a similar problem: grading 5000 Canadian Open Mathematics Competition (COMC) papers each year with 100 volunteers. As with final exams, each paper is typically graded by a number of markers, so keeping track of which questions are graded on which papers, and when the papers are free to be passed to another marker, is a time-consuming and error-prone business.

Crowdmark (CM) attempts to eliminate some of the time and trouble spent managing the grading process. We are not talking about a quiz system with automatic grading: Crowdmark is hand-marking done online. Skilled people still grade, and tests and assignments are still created for printing on paper, so there is nothing new in this part of an instructor’s routine.

So, what is it that makes the marking process more efficient when done online?

  • Markers are able to grade the same paper at the same time. No more locating and waiting for a paper that someone else is grading, or waiting for a batch of papers to arrive at your location to begin your stage of grading. Grading can be done concurrently at multiple locations and times.
  • Grades can be automatically summed, collected, summarized, distributed and recorded in a Learning Management System without needing to check for arithmetic or transcription errors.
  • No time needs to be spent returning piles of exam papers.

There is a time and money cost to using online grading. The physical papers have to be scanned into digital format (a PDF file) before grading can start. A high-speed scanner (500 pages per minute) can process 1000 10-page exams in 20-30 minutes once they are delivered to the scanning machine.

Here I’ll briefly discuss how instructors and students use CM.

Steps for an instructor:

  • upload one test or exam pdf file into CM (leave 2 inches blank on the top of each page for CM ID info and set 1 question per page)
    • CM duplicates the test pdf for each student and adds a paper and page ID to each page
  • download from CM the pdf file of student tests and print it
  • after the test scan all written test papers into a pdf file and upload the file into CM
    • CM arranges the pdf file pages into a grid pattern: each row holds a student’s test pages
  • each marker clicks on a page in the grid to read, comment, and grade it
    • when grading is complete page grades are summed for each test paper by CM
  • match each test paper cover page student ID with a student name in your CM course (assigned seating at UW can eliminate this step)
  • you choose whether CM sends each student their grade and a CM link to their graded test paper, or whether to keep the grades and graded papers private and just download the grades for inclusion into a course grade

Steps for a student:

  • write the test paper by hand as usual
  • may receive an email from CM with a link to a CM page showing their test results
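As a hypothetical sketch (the function name and data layout are illustrative, not Crowdmark’s actual API), the grade bookkeeping in the instructor steps above, where each student’s pages arrive in scan order and page grades are summed per paper, amounts to something like:

```python
def sum_page_grades(page_grades, pages_per_test):
    """Sum per-page grades into one total per test paper.

    page_grades: flat list of page grades in scan order
    (all of student 0's pages, then student 1's, and so on).
    Mirrors the automatic per-paper summing step CM performs.
    """
    num_students = len(page_grades) // pages_per_test
    return [
        sum(page_grades[s * pages_per_test:(s + 1) * pages_per_test])
        for s in range(num_students)
    ]

# Two students, three pages each:
print(sum_page_grades([3, 5, 2, 4, 4, 4], pages_per_test=3))  # [10, 12]
```

Doing this mapping in software, rather than by shuffling paper between markers, is where the arithmetic and transcription errors disappear.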

The links at the end of this post provide further details about Crowdmark. In addition, two live sessions demonstrating Crowdmark are coming up at the end of August and the beginning of September. The first is an introduction to Crowdmark on Wednesday, August 31, and the second follows up a week later on Wednesday, September 7 (1:30-3 PM) with details about a University of Waterloo system named Odyssey that works with Crowdmark. Odyssey organizes test papers, students, and exam-room seating, providing relief from some time-consuming management overhead.

Crowdmark is not a free service, but the University of Waterloo has a licence so there is no charge to individuals (instructors or students) at the university.

If you are interested in learning more about online grading for your course please get in touch with me.

Paul Kates
Mathematics Faculty CTE Liaison, x37047, MC 6473

Intro to Online Marking using Crowdmark: Wednesday, August 31, 2016 – 10:30 AM to 11:30 AM EDT
Crowdmark home page,   help pages and  youtube channel.
UW Odyssey Examination Management

Paul Kates


Centre for Teaching Excellence (CTE) Liaison to the Faculty of Mathematics (


Online Math Numbers at Waterloo, and Comparative Judgments as a Teaching Strategy — Tonya Elliott, CEL

Online Math Numbers

If you weren’t already aware, here are a few numbers about online math at the University of Waterloo:

  • The Math Faculty has been offering fully online courses since Fall 2003 and, since that time, has offered 55 unique online courses to more than 21,000 students
  • The Centre for Education in Mathematics and Computing (CEMC), with support from the Centre for Extended Learning (CEL) and local software company Maplesoft, was the first group on campus to release a large set of open educational resources (OERs). Called CEMC courseware, the OERs include lessons, interactive worksheets, and unlimited opportunities for students to practice skills and receive feedback. At the time of this post, the resources have received over 1.8 million hits from 130,000 unique users in 181 different countries.
  • In 2015, the Canadian Network for Innovation and Excellence (CNIE) recognized CEMC, CEL, and Maplesoft for their OERs through an Award of Excellence and Innovation.
  • The Master of Mathematics for Teachers (MMT) program has the highest enrolment of all the fully online Master’s programs offered at the University of Waterloo. MMT and CEL staff who work on the program were one of three teams from Waterloo who won a 2016 Canadian Association for University Continuing Education program award.
  • Maplesoft is using a focus group from Math, CEL, and CTE to develop a new authoring environment that will specifically target the needs of online STEM course authors. It is anticipated that this tool will be released in early 2017 and, over time, should save development costs by 50%.
  • The Math Faculty, together with the Provost’s office, has dedicated $1.2 Million over the next three years for additional work on online projects; over 90 course development slots allocated by CEL have already been filled.

These numbers are some of the reasons Waterloo is considered a leader in the area of online math education.

Comparative Judgments

From June 19 to 22, a small group from Waterloo and I joined an international team of mathematics educators to discuss digital open mathematics education (DOME) at the Fields Institute in Toronto. Lots of great discussions happened, including the opportunities and limitations of automated STEM assessment tools, integrity-related concerns, and practical challenges like lowering the bar so that implementing fully online initiatives no longer demands the “heroic efforts” faculty often view it as requiring today. Of all the discussion topics, however, the one that got me most excited – and that my brain has returned to a few times in the month since the conference – is using Comparative Judgement (CJ) in online math courses.

The notion behind CJ is that we are better at making comparisons than we are at making holistic judgments, and this includes judgments made using a pre-determined marking scheme. It doesn’t apply to all types of assessments, but take this test on colour shades to see an example of how using comparisons instead of holistic rankings makes a lot of sense. Proof writing and problem solving may also lend themselves well to CJ, and three journal articles are listed at the end of this blog for those who would like to read more.
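Under the hood, CJ tools typically fit a pairwise-comparison model, most often Bradley-Terry or Rasch, to turn many individual “which is better?” judgements into a single ranked scale. Here is a minimal sketch with invented items and judgements, using the classic minorisation-maximisation update for the Bradley-Terry model:

```python
from collections import defaultdict

def bradley_terry(items, judgements, iterations=200):
    """Estimate a strength for each item from pairwise judgements.

    judgements: list of (winner, loser) tuples, one per comparison.
    Higher strength = judged better overall.
    """
    strength = {i: 1.0 for i in items}
    wins = defaultdict(int)
    for winner, _ in judgements:
        wins[winner] += 1
    for _ in range(iterations):
        new = {}
        for i in items:
            # Sum 1/(s_i + s_j) over every comparison involving item i.
            denom = sum(
                1.0 / (strength[i] + strength[w if i == l else l])
                for (w, l) in judgements if i in (w, l)
            )
            new[i] = wins[i] / denom if denom else strength[i]
        # Normalise so strengths sum to the number of items.
        total = sum(new.values())
        strength = {i: v * len(items) / total for i, v in new.items()}
    return strength

# Five invented judgements over three proofs:
scores = bradley_terry(
    ["A", "B", "C"],
    [("A", "B"), ("A", "B"), ("B", "C"), ("B", "C"), ("A", "C")],
)
print(sorted(scores, key=scores.get, reverse=True))  # ['A', 'B', 'C']
```

With enough judgements per item, the fitted strengths give both a ranking and a sense of how far apart the items are, which is what makes CJ usable for grading rather than just ordering.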

Here are some of the questions I’ve been pondering:

  • Are there questions we aren’t asking students because we can’t easily “measure” the quality of their responses using traditional grading techniques? How much/when could CJ improve the design of our assessments?
    • Example: Could CJ, combined with an online CJ tool similar to No More Marking, be used by students in algebra courses as a low-stakes peer assessment activity so students could see how different proofs compare to one another? Perhaps awarding bonus credit to students whose proofs were rated in the top X%.
  • Which Waterloo courses would see increases in reliability and validity if graders used CJ instead of traditional marking practices?
  • How much marking time could Waterloo departments save if high-enrolment courses used CJ techniques instead of marking schemes to grade exam questions or entire exams? Could CEMC save resources by using CJ for their yearly contest marking?

I don’t have answers to any of these questions yet, but my brain is definitely “on” and thinking about them. I encourage you to read the articles referenced below and send me an email if you like the idea of CJ, too, or have questions about anything I’ve written. If you have questions about Waterloo’s online math initiatives, you’re welcome to email me or Steve Furino.


Jones, I., & Inglis, M. (2015). The problem of assessing problem solving: Can comparative judgement help? Educational Studies in Mathematics, 89(3), 337-355.

Jones, I., Swan, M., & Pollitt, A. (2014). Assessing mathematical problem solving using comparative judgement. International Journal of Science and Mathematics Education, 13, 151-177.

Pollitt, A. (2012). The method of Adaptive Comparative Judgement. Assessment in Education: Principles, Policy & Practice, 19(3), 281-300.


Blackboard image courtesy of AJC1.




Tonya Elliott


