As many of you know, one of CTE’s key services involves facilitating curriculum retreats for departments as they prepare a self-study for academic program review. We help faculty identify program-level outcomes, map courses to those outcomes, and explore how course assessments fit into the curriculum map. See our curriculum renewal website for more details. We feel strongly that we should undertake similar processes for our own work. Even though we have only a few programs that might look like a coherent curriculum, and fewer still that lead to some kind of certification, everything we do is meant to be an opportunity for learning about teaching.
As part of the Academic Programming strategic plan, CTE is up for external review in 2017. Like an academic department, we recognize that some preparation is in order, including thinking carefully about where we’ve been and where we’re going. One key preparatory step has been our work over the past few terms on designing a comprehensive plan to assess the work of the Centre. Despite a growing body of literature on teaching centre assessment, no set standards have emerged for assessing a Centre’s work, so we’ve been developing parameters and frameworks to guide our efforts. In many ways we have been anticipating this activity since we began preparing to support new academic review processes in 2007, and it is exciting to be acting more fully on it now.
Key principles underlying our plan’s development include being a) collaborative, by involving multiple stakeholders; b) defensible, by drawing from the evaluation literature; and c) comprehensive, without assessing everything all the time. Most importantly, we are striving to create a plan that is sustainable, so that no one feels overburdened by contributing to its implementation: we can focus on providing our services, and you can focus on your ongoing development.
Our plan is centred around four key questions:
- Who comes to us?
- To what extent are we meeting our participants’ needs?
- What intended outcomes are our participants meeting?
- How effective are our processes?
These questions are meant to help us gather evidence about our overall impact and indicate how well we are achieving our aims of building capacity, building community, and promoting a culture that values teaching and learning. We are interested in both short- and long-term results, including the ripple effects of our work, and we are poised to look more systematically at connections and transformations.
Building on the evaluation tool of a logic model, we have identified both output and outcome data to collect:
- Output Data: event registrations and staff reports (e.g., consultation numbers); resource hits (e.g., website traffic); internal planning (e.g., workload analyses)
- Outcome Data: surveys (post-event and long-term); participant reports/narratives; interviews and focus groups; other data (e.g., network analyses)
We are now at the stage of building a matrix that maps our various programs and services to the questions that need to be addressed in each case and to the most appropriate type(s) of data to collect. A CTE working group is also developing survey questions based on our programming outcomes. Our goal is to pilot our new instruments and processes this Fall, so expect to experience some new ways of providing feedback on our services. We thank you in advance for assisting us with this important work.
Another part of our plan is to release an annual report that highlights key assessment data. We plan to replace our Fall newsletter with this annual report, with the first one scheduled to come out this Fall. Our new assessment plan won’t be in place by then, but we already have a range of data to report that will help increase transparency around CTE and the impact of our services.
I am grateful for all of the talented individuals at CTE who are contributing to our assessment plan and annual report, and to our faculty colleagues who have been providing feedback along the way. Like academic program reviews, our assessment work is part of a larger ongoing continuous improvement process. There is more work ahead of us, but it is exciting to contemplate all that we can learn from it.
If you have feedback about our assessment approach or would like to learn more, please give me a call (ext. 35713) or send me an email (firstname.lastname@example.org).
Grabove, V., Kustra, E., Lopes, V., Potter, M.K., Wiggers, R., & Woodhouse, R. (2012). Teaching and Learning Centres: Their Evolving Role Within Ontario Colleges and Universities. Toronto: Higher Education Quality Council of Ontario.
Hines, S.R. (2011). How mature teaching and learning centers evaluate their services. In J.E. Miller & J.E. Groccia (Eds.), To improve the academy: Resources for faculty, instructional, and organizational development (pp. 277-279). San Francisco, CA: Jossey-Bass.
Wright, M.C. (2011). Measuring a teaching center’s effectiveness. In C.E. Cook & M. Kaplan (Eds.), Advancing the culture of teaching on campus: How a teaching center can make a difference (pp. 38-49). Sterling, VA: Stylus.
Program development and evaluation logic model templates from University of Wisconsin-Extension: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodelworksheets.html