Designing for the user experience — Pia Zeni and Matt Justice, Centre for Extended Learning

We’ve all encountered scenes like the one pictured above – you may even be looking at one outside your office window: pedestrians ignoring the nicely constructed, costly, often very pretty footpaths designed for them and forging their own path instead.  But have you ever thought about what scenarios like this say about design?  Why aren’t pedestrians selecting the paths constructed for them? What do their choices say about the paths designers have constructed? What goal(s) motivate them to forge their own?  These are the types of questions user experience (UX) designers ask.

The picture presents a useful allegory for designers of any stripe: the idea being, of course, that if we want to design valuable things, we need to consult the needs, expectations, and yes, even wants, of our users.

Let’s translate that principle to an online learning context:  “If we want to design valuable online learning experiences for students, we need to take their needs, expectations, and yes, even wants into account.” Whether this strikes you as common sense, or fairly radical, it is a design approach that the Centre for Extended Learning (CEL) has recently adopted with our User Experience Design for Learning (UXDL) framework, an adaptation of UX Honeycomb, developed by leading user-experience advocate Peter Morville.

You can learn more about our UXDL framework and how our design process is evolving to put our users – our students – front and centre. This is a new initiative for us, so we welcome your ideas, thoughts, and reflections.

Pia Zeni

Matt Justice


[Photo Source: Kalve, S. (2014, September 11). Design vs UX I Nydalen [Photo].]

Looking Beyond the Evidence: What’s Your Story? — Donna Ellis, Director of the Centre for Teaching Excellence

Face covered with data
Have you ever felt overwhelmed?  I’m sitting at my computer on a late November afternoon contemplating what I have taken away from two recent events: a provincial symposium on assessing learning outcomes and an international conference for educational developers on transformative relationships in relation to fostering cultures of deep learning.

I attended numerous sessions and overall I came away with a sense of what I call “data overwhelmosis”. We have more data and more evidence available to us than ever before in higher education.  We have software to help us identify specific learning outcomes and each student’s level of achievement for each outcome. We have online templates for course syllabi that generate maps of the learning outcomes for an entire program’s curriculum. We can use learning analytics and data analytics to monitor students’ progress (or failure).  We can do social network analyses to show how we connect to one another, how information flows within a unit or across an entire institution (or beyond).  We know what educational development practices have empirical backing. The list goes on.  My point is that it’s clear that we can capture almost anything. We can collate massive amounts of data and generate evidence for (or against) almost anything you can imagine. But to what end? What’s the purpose? And what’s the overarching plan?

We’ve talked a lot about these questions as part of devising and implementing our Centre’s assessment plan as well as our upcoming external review.  Just because we can get data doesn’t mean it’s a good idea.  How much is enough? What will we do with what we collect?  Why will it matter?  Data collection takes time and effort.  We know this from any research project we have undertaken.  In our line of work, any time that we ask our staff to input data about their work, this is time not spent working with a client.  There has to be a good reason to ask staff members to spend time in this way.  This is where the role of questions becomes critical.

For research projects, we determine research questions.  We did the same when devising our assessment plan.  These questions guide our every move:  our methodological decisions, the types of data we need, the appropriate analysis methods, and the way we write up our results.  The questions enable us to select the data that will help us determine answers, and these limited data become the evidence for our conclusions.  We’ve realized that we don’t need every piece of data that we could collect – just the data that are relevant to the questions.  This is a freeing revelation.

But it doesn’t end there.  The evidence isn’t enough.  We need to find the story.  What does the evidence mean?  How will it affect what we do tomorrow or in the next five years?  I worry that higher education in general – and educational development specifically – is getting bogged down in the weeds and not stepping back to identify what those weeds are telling us.  The examples that I noted in the second paragraph help to illuminate the issue.  But what are we overlooking?  Which way is the wind blowing now and in the future?  Our questions create important frames to make data manageable and even meaningful, but thinking about how to tell the story of the evidence seems the most crucial of all to me.

In the next few months, we will be aiming to tell the story of CTE in our self-study, which will extend far beyond what we convey in our annual reports.  We will be analyzing existing relevant data and collecting new data as needed to fill perceived gaps.  We will be striving to ensure that we have sufficient information to assist our external reviewers in addressing the questions set in the Terms of Reference for the review.  But from all of this, what we most need is to tell our story and listen to what it is telling us.  I’m not entirely sure what we’ll hear, but I am very intrigued by what will emerge.  The evidence is critical, but we need to move beyond it to better understand where we are and where we’re going.

Donna Ellis

Donna Ellis has supported the teaching development of Waterloo faculty members and graduate students since 1994. In her role as Director, she oversees the development and delivery of all the Centre for Teaching Excellence programming and services, which include individual faculty consultations; events directed at graduate students, new faculty, and established faculty regarding face-to-face teaching, blended learning, and emerging technologies; online resources; curriculum and program review consultations; and research support services. Donna has a PhD from Waterloo’s Management Sciences program and completed her dissertation research on instructional innovations. She also has an MA in Language and Professional Writing from Waterloo, and has taught in the Speech Communication program. Donna, along with her husband, spends time away from work raising three fine boys.


“Go on a field trip!”¹: An opinion piece – Anita Helmers

Magic School Bus book cover

As a student at the elementary and secondary levels, I always looked forward to day trips to the zoo, the museum, or a provincial park. Honestly, who wouldn’t be excited to be out of school for a day?

Looking back, I regret that I appreciated my earlier field trips only as an opportunity to get out of the classroom and took their educational purposes for granted. It was not until I began my undergraduate career that I came to appreciate field trips for their educational value (although it is still nice to get out of that lecture hall).

At the university level, field trips are few and far between for many students. But why is that? Field trips at the university level can offer hands-on, “real-life” opportunities for students to, as Ms. Frizzle says, “Take chances! Make Mistakes! Get Messy!”¹ Field trips are an opportunity to put into practice the theories taught in the classroom. As a teaching method, field trips should be used to expose students to the realities of their surrounding environment and to provide a safe, low-risk space to learn from experience.

I am not suggesting that professors take students to the Moon in a Magic School Bus; professors do not even need to take students off campus. As a student in the School of Planning, I have been taken on field trips across Ring Road to Laurel Creek to learn about water testing; I have travelled to North Campus to learn about soil horizons; and I have toured campus learning how to classify trees. The main goal of field trips should be to enrich students’ educational experience and to further exemplify the theories and concepts covered in class. Without wandering around campus staring at trees, I would never have fully understood which elements of a tree to focus on in order to classify it. Nor would I have had the low-consequence experience of cross-contaminating my water samples from Laurel Creek. Field trips should be used as a stepping stone from the classroom to “real life” – a step that is cushioned to allow for chances, mistakes, and safe failures before the professional world.

I have also been spoiled with opportunities to travel off campus – something not every university student can say. I was given the chance to explore Spongy Lake, the Distillery District in Toronto, Liberty Village in Toronto, and Guelph’s abandoned correctional facility, to name a few places. Through these field trips I have learned about ecological processes and planning practices such as adaptation, and I have confronted the reality that I have so much left to learn before entering the professional world. Without being exposed to the realities of my surrounding environment, I would enter the planning profession with utopian, unrealistic perceptions of how cities develop.

So, although the University of Waterloo does not have a bus that can transform into a spaceship, a submarine, or even an alligator, students still desire hands-on experience and the chance to “get out there and explore!”¹ I ask that university professors consider field trips as a teaching method that is feasible for all disciplines. Whether you simply take students outside to study tree species or take them across the country to practice the French language, any and all exposure counts towards an enriched education.

¹ “The Magic School Bus™.” Magic School Bus FAQs. N.p., n.d. Web. 18 Nov. 2016.

Image provided by xmoltarx under the Creative Commons “Attribution-ShareAlike” license.

Looking Forward with CTE’s 2015-2016 Annual Report – Donna Ellis

Looking forward always entails looking back, which is why we at CTE are committed to continuous improvement founded on critical reflection and evidence. A year has passed since we produced our first annual report as part of our comprehensive plan to assess the work of the Centre. I am happy to announce the publication of this year’s annual report, which builds on the hard work of my colleagues to develop and engage in the assessment practices we know to be so important to the evaluation and ongoing development of the work that we do.

Numbers certainly don’t tell the whole story, but I want to share a few with you as a window into what we’ve been up to at CTE over the past year. Here are some highlights of our 2015-2016 year:

  • Our 2016 Teaching and Learning Conference, Learning from Failure and Challenge, attracted 260 participants, 96% of whom rated the conference as “good” or “excellent”
  • CTE staff provided 5,055 consultations to 1,172 instructors and delivered 226 workshops to 1,019 unique instructors, graduate students, and staff members
  • 18 postdoctoral fellows attended our Teaching Development Seminars, bringing the total of fellows who have taken the seminars to 165
  • 480 graduate students participated in 129 microteaching sessions, 1,674 attended workshops, 163 completed the Fundamentals of University Teaching program, and 18 completed the Certificate in University Teaching program
  • CTE’s online resources—our Teaching Tips in particular—were accessed more than one million times from locations around the world

CTE staff have also made strides in promoting teaching excellence beyond the University of Waterloo. Thirty-eight presentations were given by our staff members at conferences and other institutions, four articles and one book chapter written by our staff members appeared in peer-reviewed publications, and CTE staff received two research grants to conduct educational research.

We are already engaged in continuing the process of assessment and reflection for the 2016-2017 year. In the winter of 2017, we will prepare a self-study as part of CTE’s External Review, a process that will provide us with new insights into the work that we do in support of our mission: collaborating with individuals, academic departments, and academic support units to foster capacity and community around teaching and to promote an institutional culture that values effective teaching and meaningful learning.

There is so much more to say, but rather than dive deeper here, I encourage you to read the 2015-2016 annual report to get a more comprehensive sense of CTE’s story. My colleagues and I are already looking forward to a year of new achievements at CTE.

VoiceThread Project: Call for Participation — Gillian Dabrowski

Are you looking for ways to engage your students in learning? Consider partnering with the Centre for Extended Learning (CEL) and the Centre for Teaching Excellence (CTE) to pilot a new instructional tool in Waterloo’s on-campus and online classes: VoiceThread.

You may be interested in learning about the pilot if your goal is to engage students in any of the activities below:

  • Idea sharing and interaction
  • Community building
  • Social learning
  • Peer instruction
  • Critical reflection
  • Presentation practice
  • Digital literacy skills building
  • Language practice

What is VoiceThread?

VoiceThread is a media-based discussion tool. A key feature of VoiceThread is that it enables you and your students to create digital presentations and make them the centre of a discussion. Presentations can include documents, images, PowerPoint slides, audio, or video. Students attach comments to the presentation using a keyboard (text), a microphone or telephone (audio), or a webcam (video). Discussions are asynchronous, meaning students do not need to be online at the same time.

Why use VoiceThread?

Penn State’s Use Case Introduction gives several examples of why instructors use VoiceThread:

  • On campus, create digital presentations on difficult-to-comprehend concepts and processes. Students can review the content multiple times and ask the instructor questions on specific slides.
  • Enable students to present knowledge and research digitally. The class benefits from exposure to a multitude of topics; the presenter benefits from practice articulating ideas verbally and from peer feedback.
  • Actively engage students in online lectures by prompting them to comment on specific slides or respond to questions posed within the presentation.
  • Increase your online instructor teaching presence and build online class community by initiating weekly kick-off discussions.
  • Create an online ‘seminar’ course experience where students grapple with heavy readings together in both written and verbal formats.

Pilot Details

The VoiceThread pilot is scheduled to run from Winter 2017 to Winter 2018. Faculty who participate in the pilot will receive a VoiceThread account linked to their LEARN user account and a course site. Training and support for the pilot will be provided by CEL, CTE, and LEARN Help. Faculty participants and course participants will be asked to provide feedback via survey response, panel discussion, and interview.

If you would like to volunteer to be a part of this pilot, please contact CEL’s Gillian Dabrowski or your CTE Liaison with the following details:

  1. Name
  2. Course information (CourseID, name, section) and expected number of students
  3. A description of how you will use VoiceThread in your course to support student engagement and assessment. How might VoiceThread help solve a problem you are experiencing with discussions or assessment as you currently use them?

More Information


Gao, F. & Sun, Y. (2010). Supporting an online community of inquiry using VoiceThread. In C. Maddux et al. (Eds.) Research Highlights in Information Technology and Teacher Education 2010 (pp.9-18). Chesapeake, VA: Society for Information Technology and Teacher Education (SITE).

Advanced Assessment Considerations

We spend a fair bit of time at CTE talking about assessment, whether in workshops, one-on-one consultations about teaching, curriculum meetings, teaching observations, or just in passing conversation among colleagues. In these instances, discussion often focuses on the function of the assessment – is it being used for diagnostic (to ascertain prior knowledge), formative (to provide feedback and improve performance during the learning process), or summative (to evaluate the learner’s knowledge, skills, or values) purposes? Discussion also revolves around the type of assessment being used, as well as questions regarding grading, but there are aspects of assessment that go beyond this and begin to explore the role of the student in the assessment process.

When thinking about assessment and how we position it in our teaching, it can be helpful to think about it conceptually while considering what purpose we as instructors assign to it. To do so, we might turn to three approaches to assessment: assessment of learning, assessment for learning, and assessment as learning.

Assessment of learning suggests that assessment is being done for the primary purpose of determining what students have learnt in the class; it is typically done in a summative manner so that students can receive a grade.

Assessment for learning views assessment more as a process, whereby students learn from the assessment; feedback is of great importance as it assists the student in the learning process.

Assessment as learning takes this even further, situating students themselves as assessors who therefore take greater responsibility in the learning and assessment process.

None of these approaches to assessment is inherently better than the others, but I would encourage you to reflect on your assessments from a different perspective by adopting what I call Advanced Assessment Considerations. These considerations are intended to allow you to reflect on and evaluate your assessment during the design, facilitation, and grading & feedback stages. They do not suggest distinct or varied methods of assessment; rather, they attempt to provide options for modifying an existing assessment to make it even better. And by better, I mean better for the students – assessment should motivate and empower students to demonstrate what they have learnt, and the more we can have students become invested in the assessment process, the more meaningful it will be to them.

So what are these advanced assessment considerations? I’ll run through each briefly, providing some initial insight into what each consideration entails and leave you with some questions to consider.

Design considerations apply to how the assessment itself is constructed before the students begin to actively work on the assessment. They are intended to provide learners with agency to be contributors to the assessment and determine what their role will be in the assessment.

To incorporate these into our assessments, we should look at things such as:

  • Student choice
    • Develop student responsibility for learning with controlled options for assessment; students can choose the type of assessment they feel most suits their learning needs, but must reflect on why they chose it/didn’t choose others (Weimer, 2011)
  • Low-stakes and high-stakes assessments
    • Implement a variety of low-stakes and high-stakes assessments that allow students to build confidence and motivation and alleviate stress or anxiety; create a culture of routine assessment, and don’t be afraid to allow strategies (such as cheat sheets or open-book resources) to alleviate the stress of high-stakes assessments such as exams; good students do well, and poor students do poorly, regardless of the exam type (Gharib & Phillips, 2013)

Facilitation considerations apply to the process whereby students complete the assessment, and the ways in which the instructor can help manage this process. They are intended to ensure that assessments are structured logistically and rationally so as to promote student learning and alleviate difficulties that are external to the objectives of the assignment.

These can be implemented when considering the process underlying the assessment by thinking about aspects such as:

  • Spacing effect
    • Ensure sufficient space between assessments so as to continually reinforce understanding before information is forgotten; the repetition of course content and the process of retrieving content as students begin to forget it helps to reinforce and solidify understanding (Cepeda, Pashler, Vul, Wixted & Rohrer, 2006; Kornell, Castel, Eich & Bjork, 2010)
  • Testing effect
    • Ensure that formative assessment in the course matches the type of summative assessment; students learn better by testing themselves on course content as opposed to simply studying from a textbook, and they learn especially well if testing is accompanied by some form of feedback (Agarwal et al., 2007)

Finally, grading & feedback considerations apply to the means by which feedback is provided to and received by students as a way to close the learning cycle. These ensure that sufficient, meaningful feedback is provided to each student multiple times throughout the course, and that students reflect on the feedback they receive.

In some ways, these are the most important considerations, as they directly encourage learners to be involved as assessors to some extent; the degree of involvement can vary depending on the assessment. To involve the learner in this capacity, consider the following:

  • Peer feedback
    • Peer feedback allows for rich and meaningful insight that can directly support and scaffold learning (Dochy, Segers & Sluijsmans, 1999)
  • Self-assessment
    • Students need the opportunity to examine feedback and determine how to improve through structured means; this can be especially powerful if part of the peer review process (Reinholz, 2015)
  • Student-led feedback
    • Support individual student development by encouraging reflection on their own learning goals before receiving feedback; have students take ownership of what they want to receive feedback on
  • Assessment guideline creation
    • Involving students directly in the creation of assessment guidelines can result in sustained motivation; students can create their own rubrics for the assessments they are completing in consultation with the instructor

Ultimately, depending on your assessment, some of these considerations may simply not be applicable, and attempting to adapt many of these to a single assessment may prove equally challenging. I would however encourage you to think about how some of these considerations could find their way into your own assessments to make the experience all the more meaningful to students and instructors alike.


Agarwal, P.K., Karpicke, J.D., Kang, S.H.K., Roediger III, H.L. & McDermott, K.B. (2007). Examining the testing effect with open- and closed-book tests. Applied Cognitive Psychology, 22, 861-876.

Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132, 354–380.

Dochy F., Segers, M. & Sluijsmans, D. (1999) The use of self-, peer and coassessment in higher education: A review. Studies in Higher Education, 24(3), 331-350.

Gharib, A. & Phillips, W. (2013). Test anxiety, student preferences and performance on different exam types in introductory psychology. International Journal of e-Education, e-Business, e-Management and e-Learning, 3(1), 1-6.

Kornell, N., Castel, A.D., Eich, T.S., & Bjork, R.A. (2010). Spacing as the friend of both memory and induction in young and older adults. Psychology and Aging, 25(2), 498-503.

Reinholz, D. (2015). The assessment cycle: A model for learning through peer assessment. Assessment & Evaluation in Higher Education, 41(2), 301-315.

Weimer, M. (2011, June 21). A rose for student choice in assessment?


Kyle Scholz

Faculty of Arts and University Colleges Liaison


Thank you, Jane Holbrook, and all the best!

Observers of educational development at Waterloo will know that we’ve had a teaching centre onsite for 40 years. Christopher Knapper was the founder of the Teaching Resources and Continuing Education (TRACE) unit in 1976, which kept the same name until the 2006-2007 academic year, at which point a merger with Learning and Teaching Through Technology (LT3) and Learning Resources and Innovation (LRI) led to the formal creation of the Centre for Teaching Excellence pretty much as we know it today.

I think I’m feeling rather wistful and nostalgic at this point because our Senior Instructional Developer for Blended Learning, Jane Holbrook, retires this week. We can hardly believe this to be true, but true it is. I’d like to take a moment to acknowledge Jane’s work with LT3 and CTE. It’s difficult for me to accept that this marks nearly 10 years since CTE’s inception, and the occasion of Jane’s retirement is cause for reflection about where we’ve come from and where we’re going. Mainly, though, it’s an opportunity to appreciate Jane’s contributions to scholarship in the areas of blended learning and educational development, as well as her commitment to supporting our Waterloo teaching community over many years.

Jane back in the day

Jane started teaching courses in Biology here around 1989, and in or around 2001 she prepared a report for Tom Carey in LT3 about a new model of support for educators in Waterloo’s six Faculties. The result? Our much-praised and oft-copied Faculty Liaison model. Jane took up one such role, for Science, and others followed soon thereafter. I can remember looking at LT3 first from my vantage point at Trent’s Interactive Learning Centre, and later from Guelph’s Teaching Support Services, with a certain amount of envy — in large part because of this model.

I was very happy to join CTE, then, and to work directly with people whose efforts and processes I’d admired from afar. I was not disappointed. In the 9 years I have worked here as a Senior Instructional Developer, I have relied on Jane as a source of wisdom, especially as I learned the ropes of managing other people and managing multiple projects.

Jane Holbrook Winter 2016
Jane in 2016

Jane is a model of honest, astute, intelligent leadership. She never shies away from difficult conversations, always provides incisive input on university-wide and CTE committees or as a personal mentor, and pulls more than her share of administrative weight at one of Canada’s largest teaching centres. I aspire to emulate her level-headed, savvy, and caring approach towards both people and projects.

Jane Holbrook gestures over a copper pot.
Jane blends stuff

Working on blended learning initiatives, Jane has applied her considerable creativity and scholarly approach in ways that have helped many professors to think differently about their practice, and indeed change that practice in ways that increase learning for many generations of students to date, and many more to come.

Thank you, Jane, and all the best in your own next steps. I am thrilled to be working alongside Mary Power, your replacement in the SID role, and will also miss you enormously.


As Senior Instructional Developer, Curriculum and Programming, Trevor Holmes plans and delivers workshops and events in support of faculty across the career span. Prior to joining the Centre for Teaching Excellence, Trevor worked at a variety of universities teaching courses, supporting faculty and teaching assistants through educational development offices, and advising undergraduates. Trevor’s PhD is from York University in English Literature, with a focus on gothic literature, queer theory, and goth identities. A popular workshop facilitator at the national and international levels, Trevor is also interested in questions of identity in teaching and teaching development.
