Steep learning curve: The future of education with AI
School days may look very different in the future.
“Imagine a classroom in which students spend part of their days engaging with AI, learning one-on-one,” says Dr Vitomir Kovanović, senior lecturer in learning analytics at the University of South Australia.
“After that, they enter a different kind of classroom, where the focus is on social connections, discussion and peer learning. Teachers would be mentors and guides in this collaborative process, rather than sole content providers as they are today.”
This could take place in schools or remotely. Either way, if AI technology is going to take on the role of personal tutor, it’s going to need some tweaking, Kovanović says.
“At the moment the problem with AI is that it is like a genie in a bottle: you ask it to do something, and it will do it for you. You might ask it, ‘I have an assignment this Friday, can you write it for me?’ It says, ‘Sure, here’s your essay’.
“An AI tutor, however, will push you to develop, nudging you, prompting you, getting you unstuck if you are stuck. But it won’t do it for you. We need to develop those pedagogical agents that will do as good tutors do.”
The way students are assessed may also have to change, he says. Rather than looking only at the final product, teachers will look at the processes students have gone through, including which prompts they have given their AI tutors, revealing their level of critical thinking.
“AI will be a problem if it’s merely going to be used by students to cheat. However, if it can be integrated into more productive learning processes and teaching processes, then it will become a great opportunity.”
Quick learners
Most governments’ immediate response to the fact that students suddenly had access to a ready-made personal essay writer in ChatGPT was to ban it in public schools early this year. It was a futile gesture, Kovanović says.
“Bans don’t achieve anything. Students have home internet access. They still use it. The genie is out of the bottle; there’s really no reason to fight it like that. We need to find ways to make it more productive.”
Education departments have since taken a somewhat more proactive approach. In April, the Western Australian government reversed its ban on teachers using ChatGPT in schools. In May, the federal government launched an inquiry into the use of generative artificial intelligence in the Australian education system, examining how AI might affect outcomes for students.
Then in July, a consultation paper seeking views on a draft National AI in Schools Framework was released. The result of a collaboration between the commonwealth, state and territory education ministers, and representatives of the non-government school sector, the framework “sets out the core elements and principles that will guide education systems, tools and teachers in using generative AI safely and ethically, to improve teaching and learning, lift student outcomes, and reduce the administrative and workload burden in schools”.
Leslie Loble, industry professor at the University of Technology Sydney, describes the framework as an important and useful first step.
“But we need to go further,” she says, “including standards to ensure the edtech in Australian schools is high quality and demonstrates both effectiveness and safety; investment in professional learning to support best practice in using edtech; and policies and resources to ensure the benefits of edtech are equitably distributed to help tackle our learning divide.”
That equity of distribution is going to be crucial as AI scales rapidly, Loble says. “Edtech tools are poised to become the next frontier of the digital — and learning — divide as better-off schools, teachers and students are able to access and master these learning applications, thus enhancing an already significant education advantage.”
Loble believes AI could improve student outcomes, help teachers plan, assess, and adapt instruction to different needs, reduce administrative tasks, and provide insights for schools and systems. “There are many potential benefits to embracing AI-enabled learning applications and the positive evidence is building, but the use of these tools in educational settings also raises thorny quality and ethical questions,” she says. “These technologies need to be carefully researched, monitored and guided to maximise educational benefit and minimise risks.”
Teaching trial
South Australia describes itself as the one jurisdiction that didn’t ban ChatGPT. Instead, it got together with Microsoft and developed an AI chatbot, EdChat, which it has just trialled in eight schools for eight weeks.
Based on the same technology as ChatGPT, but modified to block access to inappropriate material and with extra provisions for privacy and data security, EdChat was made available to students 24 hours a day, showing them how to use AI to support their studies. The government also provided schools and parents with guidance on the benefits and risks of using AI in education. The results of the trial are yet to be released.
The South Australian Department for Education’s chief executive, Professor Martin Westwell, describes the department’s approach to AI in schools as cautiously ambitious.
“The department wants to provide a way for our students and teachers to learn together about how to use this tool ethically and productively,” he says. “We know that this trial will help the department, schools and students to better understand how to support the positive uses of the technology and protect against any negative outcomes.
“The next steps are under consideration at the moment, and we expect the first step will be to continue the trial, but expand the program to another small group of schools. The feedback we have received will be used to design the professional learning that teachers have identified will be most useful, resources to use with students, and inform how we think AI will be used across the curriculum and learning.”
For Kovanović, this is just the type of trial that’s needed. “People are proposing the ideal way to use it, but we really have to start using it, and then that will emerge,” he says. “Educators can quickly adjust their trials, failing, and trialling again, and I have faith in their ability to cope with that, given the time and opportunities and resources.”