Artificial Teaching Assistants

The "draughtsman" automaton created by Henri Maillardet around 1800.
The “draughtsman” automaton created by Henri Maillardet around 1800.

The dream of creating a device that can replicate human behaviour is longstanding: 2500 years ago, the ancient Greeks devised the story of Talos, a bronze automaton that protected the island of Crete from pirates; in the early thirteenth century, Al-Jazari designed and described human automata in his Book of Knowledge of Ingenious Mechanical Devices; in the eighteenth century, the clockmaker Henri Maillardet invented a “mechanical lady” that wrote letters and sketched pictures; and in 2016, Ashok Goel, a computer science instructor at Georgia Tech, created a teaching assistant called Jill Watson who isn’t a human – she’s an algorithm.

Goel named his artificial teaching assistant after Watson, the computer program developed by IBM that can answer questions posed in ordinary language. IBM’s Watson is best known for its 2011 victory over two former champions on the game show Jeopardy! In Goel’s computer science class, Jill Watson’s job was to respond to questions that students asked in Piazza, an online discussion forum. Admittedly, the questions to which she responded were fairly routine:

Student: Should we be aiming for 1000 words or 2000 words? I know, it’s variable, but that is a big difference.

Jill Watson: There isn’t a word limit, but we will grade on both depth and succinctness. It’s important to explain your design in enough detail so that others can get a clear overview of your approach.

Goel’s students weren’t told until the end of the term that one of their online teaching assistants wasn’t human – nor did many of them suspect. Jill Watson’s responses were sufficiently helpful and “natural” that to most students she seemed as human as the other teaching assistants.

Over time – and quickly, no doubt – the ability of Jill Watson and other artificial interlocutors to answer more complex and nuanced questions will improve. But even if those abilities were to remain as they are, the potential impact of such computer programs on teaching and learning is significant. After all, in a typical course, how much time do teaching assistants or the instructor spend responding to the same routine questions (or slight variations of them) asked over and over? In Goel’s course, for example, he reports that his students typically post 10,000 questions per term – and he adds that Jill Watson, with just a few more tweaks, should be able to answer approximately 40% of them. That’s 4,000 questions that the teaching assistants and instructor don’t have to answer, which frees up a lot of their time to provide more in-depth responses to the truly substantive questions about course content.

More time to give better answers: that sounds like a good thing. But there are also potential concerns.

It’s conceivable, for example, that using Watson might not result in better answers but in fewer jobs for teaching assistants. Universities are increasingly keen to save money, and if one Watson costs less than two or three teaching assistants, then choosing Watson would seem to be a sound financial decision. This reasoning has far broader implications than its impact on teaching assistants. According to a recent survey, 60% of the members of the British Science Association believe that within a decade, artificial intelligence will result in fewer jobs in a large number of workplace sectors, and 27% of them believe that the job losses will be significant.

Additionally, what impact might it have on students to know that they are being taught, in part, by a sophisticated chatbot – that is, by a computer program that has been designed to seem human? Maybe they won’t care: perhaps it’s not the source of an answer that matters to them, but its quality. And speaking for myself, I do love the convenience of using my iPhone to ask Siri what the population of Uzbekistan is – I don’t feel that doing so affects my sense of personal identity. On the other hand, I do find it a bit creepy when I phone a help desk and a ridiculously cheery, computerized voice insists on asking me a series of questions before connecting me to a human. If you don’t share this sense of unease, then see how you feel after watching 15 seconds of this video, featuring an even creepier encounter with artificial intelligence.

Communities of Practice — Rudy Peariso (Centre for Extended Learning)


Community! Not often the first word that comes to mind when thinking of online learning, but it is for a group of like-minded instructors at the University of Waterloo. The inaugural meeting of the Online Instructors Community of Practice took place during the last week of April.

Sometimes online classes can have the reputation of being solitary for both teachers and learners. Although at the Centre for Extended Learning we work with instructors to dispel that myth for learners, we hadn’t fully considered the impact that online teaching has on instructors. One of our instructors was looking for advanced workshops and a way to share her experiences, and the Online Instructors Community of Practice was born.

Wenger, McDermott, and Snyder (2002) define Communities of Practice as “groups of people who share a concern, a set of problems, or a passion about a topic, and who deepen their knowledge and expertise in this area by interacting on an on-going basis.”

Over a lunch hosted by the Centre for Extended Learning (CEL), nineteen instructors who teach online at the University of Waterloo discussed the successes and challenges of teaching online. Topics such as student engagement, teaching presence, academic integrity, and blended learning all emerged. Community members were overheard remarking how nice it was just to talk with others who face the same challenges and successes.

CEL is actively looking at ways to enhance the community and has opted to offer a meeting once per term. Suggestions for future meetings include a show and tell, selected topics, and a discussion of dilemmas. A newly created listserv gives instructors the opportunity to share suggestions and ask questions of the community.

If you currently teach online and want to join the Community of Practice, contact the Centre for Extended Learning.

If you are interested in establishing a Community of Practice for your discipline or interest, check out the following resources:

Image by Niall Kennedy, Creative Commons License.