ChatGPT, which gives the public free access to AI technology, was released in November 2022. In the short time since, AI has increasingly become a factor in the provision of education programs at all levels.
The AI learning curve is steep and without guardrails for all involved — and there is no reliable map.
Some individual teachers may be using AI as a timesaver for scheduling, for developing general and personalized lesson plans, and even for report-card tracking. At the same time, a growing number of students are experimenting with AI as a study aid, a research tool and a way to collaborate with other students.
In September 2023, the Canadian Press surveyed 10 school boards in different parts of Canada asking whether they would implement a formal AI-use policy for the 2023-24 school year. Such a policy would cover teacher and student use of AI, such as chatbots that can solve math problems or write essays.
Among the boards that responded to the survey, none had an official AI-specific policy in place.
Here in B.C., the provincial government seems, as one education writer put it, to be viewing the inevitable pathway between AI and the classroom “from 33,000 feet.”
That opinion is validated by a Ministry of Education document entitled “Digital Literacy and the Use of AI in Education: Supports for British Columbia schools.”
The document suggests that “While AI can act as a supplementary tool and offer enrichment and depth to classroom practices, choices about the use of AI in the classroom are ultimately made by teachers based on the needs of the students.”
That leaves B.C.’s 40,000 teachers wondering how to deal with AI in their classrooms.
In contrast, according to the online research service Mindshift, educational organizations in the U.S. ranging from small school districts to Harvard University are developing clearly laid-out guidelines for generative AI use by both students and teachers.
In the many districts cited in the article, teacher professional development is also specifically designed to give educators a foundational understanding of how AI works.
Some districts have instituted Cyber Week, an optional week during the summer for teachers to explore innovative AI teaching practices.
At the other end of the education spectrum is Harvard’s “metaLAB,” part of the Berkman Klein Center for Internet & Society and home to the AI Pedagogy Project, a resource aimed at guiding educators on how, when and why to engage with AI tools in their teaching.
According to Sarah Newman, director of art and education at metaLAB, the project offers:
• A searchable collection of educator-designed assignments for integrating AI into syllabi responsibly and critically
• Plain-language descriptions of essential AI concepts and skills, presented in a streamlined guide
• Recommendations for educators on how to begin their AI journey in the classroom
• An interactive tutorial on using large language models
• A resource list for further AI exploration, including related projects
The AI Pedagogy Project, says Newman, seeks to demystify AI and encourage critical engagement and creative AI applications at all levels of education. It’s an evolving platform, she says, designed both to adapt alongside AI advancements and their societal impacts and to engage students in analyzing and critically re-imagining the future of this technology.
If that all sounds like a bit of a scramble since November of 2022, that’s because it has been. Nobody from teacher-training institutions to provincial policy makers wants to be caught up in the rush to be last.
At the same time, it is difficult for everybody involved to agree on what to do with AI, something we do not yet fully understand.
As far back as 1932, Franklin Delano Roosevelt, then campaigning for the U.S. presidency, offered this advice for those faced with a complex problem: “It is common sense to take a method and try it. If it fails, admit it frankly and try another. But above all, try something.”
That advice still rings true today.
Cliff Justice, a principal in KPMG’s Innovation & Enterprise Solutions team, which leads the firm’s Cognitive Automation initiatives, challenges us to think about AI in a different way, “more like a relationship with technology, as opposed to a tool that we program,” because, he says, “AI is something that evolves and learns and develops the more it gets exposed to humans.”
Now there’s a scary thought.
Geoff Johnson is a former superintendent of schools.