Artificial Intelligence (AI) use has become part of the technology literacy discussion and Canada is embracing its potential uses.
AI use is on the rise: a recent Leger survey reported that 30 per cent of Canadians use AI, up five percentage points from the previous year.
The same survey reports that half of Canadians are familiar with chatbot AI systems such as automated chatting services, a four-point increase from the 46 per cent recorded in February 2023. The share of Canadians unfamiliar with chatbot AI continues to shrink.
As Canadians use AI more, they’re learning how to utilize this rapidly developing technology.
Erin Kjaer, program manager of Canada Learning Code, an organization that teaches adults digital skills, says a key part of accepting AI use is understanding that it is not a replacement for humans.
“A big part of what we talk about is how to use those tools in a way that is representative of yourself and your skills, and not as a replacement,” Kjaer said.
AI systems produce results based on an amalgamation of the information they gather.
“If I’m training an AI system that is designed to provide closed captioning and I'm only training it on people who have a specific regional dialect or a specific accent, I'm only training it to understand one way to say words,” Kjaer said. “If I wanted it to be more effective, I would want to train it on lots of different accents.”
Training an AI based on one type of human interaction can create implicit bias against race, gender, religion, income and other factors, which is the premise of the documentary Coded Bias.
Canada Learning Code hosted a screening of the film on Feb. 18 in the It's Ok* Studios on Queen Street West.
Charlotte Nurse, director of programs with Canada Learning Code, said the documentary digs into the root source of AI’s biases.
“The most important thing is these aren’t glitches. It’s built into the tools because of the inherent biases of the teams that build them,” Nurse said.
Coded Bias is about revealing the biases that are embedded into AI systems such as job application sorters and facial recognition software.
Joy Buolamwini, an accomplished computer scientist and activist, is the film's central figure on AI ethics and continues to be a symbol of digital activism today.
During her studies at MIT, Buolamwini took an interest in facial recognition software.
However, she ran into a critical error while testing facial recognition on herself: the software failed to detect her face. Buolamwini, a woman of colour, had to wear a white mask before the software could recognize her.
Similar technology was the centre of discussion at a 2019 hearing of the U.S. House Committee on Oversight and Reform.
Buolamwini testified as a witness, and her testimony is highlighted in the climactic sequence of Coded Bias.
Kjaer said educating adults on the ethics and implications of AI is important because they can teach others, including their kids.

“Part of that starts with educating adults so that they are prepared to have those conversations with their kids and with the students they teach,” she said.
The Renfrew County Catholic District School Board (RCCDSB) recently started allowing its teachers to freely use AI tools to familiarize themselves with this significant technology.
Tyson Holly, RCCDSB’s experiential learning and technology coordinator, said this rollout is part of a two-year plan to incorporate AI into the classroom.
Part of this plan was assembling the AI Working Group, which Holly chairs. Other members of the group include Mark Searson, the RCCDSB’s director of education.
“Starting in June 2024, we started to meet and talk about how we might use this year of learning for implementing AI into our district,” Holly said.
“We are encouraging of it, and I think you need to have more of a positive approach and embrace it,” he said.
Holly said the goal of this program is to incorporate AI into the classroom without undermining core learning values, while addressing the fact that AI cannot truly create original content.
“If we're going to be letting students use AI for their projects, we're going to need to update our policies, especially around plagiarism,” he said. “This isn’t just a tool to outsource your thinking.”
The RCCDSB gave its educators and other staff access to Google Gemini, which they can use for assistance with administrative tasks. Gemini is an AI assistant with speech recognition that can answer questions and work with Google's suite of apps; for example, it can create appointments in Google Calendar without manual input.
“We’ve had some very positive conversations with educators who have used the tool and found that it saved them time,” he said. “They’ve used it for all kinds of things like writing emails, creating lesson plans and creating rubrics.”
Once educators become more familiar with AI and how it functions, Gemini access will be extended to students in grades nine through 12 in the second phase of the RCCDSB’s approach to AI.
Until that second phase is rolled out, however, classroom discussions about proper AI use are left to individual educators.
“I think those conversations are going to happen when we launch Gemini for Grades 9 to 12,” Holly said. “Those conversations need to happen.”
Amina Yousaf, associate head of the Early Childhood Studies program at University of Guelph-Humber (UGH), said that educating children on AI is essential for ethical use.
“As long as we are teaching children to be critical thinkers, I think that's the missing piece,” Yousaf said. “If we are encouraging youth to be critical, then using AI can be done in an ethical sense.”