
Steep learning curve: The future of education with AI

School days may look very different in the future.

“Imagine a classroom in which students spend part of their days engaging with AI, learning one-on-one,” says Dr Vitomir Kovanović, senior lecturer in learning analytics at the University of South Australia.

“After that, they enter a different kind of classroom, where the focus is on social connections, discussion and peer learning. Teachers would be mentors and guides in this collaborative process, rather than sole content providers as they are today.”

This could take place in schools or remotely. Either way, if AI technology is going to take on the role of personal tutor, it’s going to need some tweaking, Kovanović says.

“At the moment the problem with AI is that it is like a genie in a bottle: you ask it to do something, and it will do it for you. You might ask it, ‘I have an assignment this Friday, can you write it for me?’ It says, ‘Sure, here’s your essay’.

“An AI tutor, however, will push you to develop, nudging you, prompting you, getting you unstuck if you are stuck. But it won’t do it for you. We need to develop those pedagogical agents that will do as good tutors do.”

The way students are assessed may also have to change, he says. Rather than looking only at the final product, teachers will look at the processes students have gone through, including which prompts they have given their AI tutors, revealing their level of critical thinking.

“AI will be a problem if it’s merely going to be used by students to cheat. However, if it can be integrated into more productive learning processes and teaching processes, then it will become a great opportunity.”

Quick learners

Most governments’ immediate response to the fact that students suddenly had access to a ready-made personal essay writer in ChatGPT was to ban it in public schools early this year. It was a futile gesture, Kovanović says.

“Bans don’t achieve anything. Students have home internet access. They still use it. The genie is out of the bottle; there’s really no reason to fight it like that. We need to find ways to make it more productive.”

Education departments have since taken a somewhat more proactive approach. In April, the Western Australian government reversed its ban on teachers using ChatGPT in schools. In May, the federal government launched an inquiry into the use of generative artificial intelligence in the Australian education system, examining how the technology might affect student outcomes.

Then in July, a consultation paper seeking views on a draft National AI in Schools Framework was released. The result of a collaboration between the commonwealth, state and territory education ministers, and representatives of the non-government school sector, the framework “sets out the core elements and principles that will guide education systems, tools and teachers in using generative AI safely and ethically, to improve teaching and learning, lift student outcomes, and reduce the administrative and workload burden in schools”.

Leslie Loble, industry professor at the University of Technology Sydney, describes the framework as an important and useful first step.

“But we need to go further,” she says, “including standards to ensure the edtech in Australian schools is high quality and demonstrates both effectiveness and safety; investment in professional learning to support best practice in using edtech; and policies and resources to ensure the benefits of edtech are equitably distributed to help tackle our learning divide.”

That equity of distribution is going to be crucial as AI scales rapidly, Loble says. “Edtech tools are poised to become the next frontier of the digital — and learning — divide as better-off schools, teachers and students are able to access and master these learning applications, thus enhancing an already significant education advantage.”

Loble believes AI could improve student outcomes, help teachers plan, assess, and adapt instruction to different needs, reduce administrative tasks, and provide insights for schools and systems. “There are many potential benefits to embracing AI-enabled learning applications and the positive evidence is building, but the use of these tools in educational settings also raises thorny quality and ethical questions,” she says. “These technologies need to be carefully researched, monitored and guided to maximise educational benefit and minimise risks.”

Teaching trial

South Australia describes itself as the one jurisdiction that didn’t ban ChatGPT. Instead, it partnered with Microsoft to develop an AI chatbot, EdChat, which it has just trialled in eight schools for eight weeks.

Based on the same technology as ChatGPT, but modified to block access to inappropriate material and with extra provisions for privacy and data security, EdChat was made available to students 24 hours a day, showing them how to use AI to support their studies. The government also provided schools and parents with guidance about the benefits and risks of using AI in education. The results of the trial are yet to be released.

The South Australian Department for Education’s chief executive, Professor Martin Westwell, describes the department’s approach to AI in schools as cautiously ambitious.

“The department wants to provide a way for our students and teachers to learn together about how to use this tool ethically and productively,” he says. “We know that this trial will help the department, schools and students to better understand how to support the positive uses of the technology and protect against any negative outcomes.

“The next steps are under consideration at the moment, and we expect the first step will be to continue the trial, but expand the program to another small group of schools. The feedback we have received will be used to design the professional learning that teachers have identified will be most useful, resources to use with students, and inform how we think AI will be used across the curriculum and learning.”

For Kovanović, this is just the type of trial that’s needed. “People are proposing the ideal way to use it, but we really have to start using it, and then that will emerge,” he says. “Educators can quickly adjust: trialling, failing, and trialling again. I have faith in their ability to cope with that, given the time, opportunities and resources.”
