AI will be essential in education — but do kids know how to use it?
As AI becomes increasingly prevalent in education, concerns arise over children’s proficiency in navigating its complexities and potential pitfalls.
Artificial intelligence (AI) is everywhere — it can rewrite famous novels, compose songs and create realistic videos. While older generations fear its takeover, the youth of today are embracing the technology.
A November 2023 study from the Pew Research Center in the United States revealed that nearly one in five teenagers aged 13 to 17 who have heard of the AI chatbot ChatGPT have used it to help with schoolwork, which amounts to roughly 13% of all teens in the U.S.
The same study showed that seven out of 10 teens said it is okay to use the chatbot when researching new topics. However, this raises concerns, particularly when it comes to AI producing misleading or false information.
This was recently the case when Google’s AI chatbot Gemini started producing “woke” and inaccurate depictions of historical scenes, for which it had to issue an apology.
Certainly, there is no going back from the introduction of AI to the next generation. However, the question still remains on best AI practices for youth, particularly when it comes to education.
Cointelegraph spoke with Brandon Da Silva, the CEO of ArenaX Labs, to better understand how AI can be implemented productively and safely into youth education.
AI boosts tech-savviness
ArenaX Labs recently released AI Arena, a player-vs-player fighter game in which players train AI models to battle each other autonomously with the goal of using “play” to boost AI literacy.
Da Silva said that teaching young people how to actually “train” or “program” AI holds significant importance, beyond simply asking tools like ChatGPT questions.
“If you’re using ChatGPT and it starts to give you a weird answer, it’s important to understand why,” he explained. “Otherwise, some people might start to think that what ChatGPT tells them is basically the gospel, without concerns that it might not be correct.” He added:
“It’s essential to understand the limitations of these tools, because you can only truly understand where, why, and how they might go wrong when you know how they work.”
He said kids who begin interacting with AI at a young age will likely become far more “tech-savvy” than peers who do not.
“We believe that AI will transform society and that it will be part of everyone’s lives going forward – and because of that, it’s important that people get familiar with it starting from a young age.”
Related: OpenAI accuses New York Times of hacking AI models in copyright lawsuit
Da Silva drew a parallel with people who learned how to program from a young age. “Many of these people were better at programming as high schoolers than some full-time employees who have done programming for 10 years as adults,” he said.
However, the issue is multifaceted: “there are both benefits and risks,” he continued. The tendency for kids to become “glued” to technology like iPads and phones could carry over into AI usage if not monitored.
He also pointed to the aforementioned risk of AI’s inherited bias, like with Google’s Gemini.
“This type of thing can be very dangerous – as adults, it’s easier to recognize bias. But as a child, you don’t know.”
AI in education
This is where proper AI education and attention from educators who are using AI come into play.
Just as people will need to develop the discernment to spot deepfakes, Da Silva said everyone, starting with young people, will need to learn to ask critical questions about bias whenever AI is involved. He said:
“Educators need to emphasize the importance of critical thinking skills when it comes to student interactions with AI.”
It will also be important to consider the user when teaching how to interact with AI – there is no “one-size-fits-all solution.” He said having different “levels” of communication with AI for different kinds of learners is important.
Another important aspect for educators and those dealing with youth’s interactions with AI is the emotional relationships one can develop with an AI. Recent research from the Digital Wellness Lab said children can form “parasocial relationships,” or one-way emotional attachments with AI-enabled digital assistants.
It cited a study of children aged 6-10, in which 93% of participants said that a “digital assistant” they were familiar with was smart, with 65% responding that the device could be a friend.
Da Silva said:
“Developing an emotional connection with an AI can help students become more invested in their learning. At the same time, having an emotional connection with something like an AI has a risk of believing what it says more than one should.”
In such cases, objectivity can take a back seat: that kind of connection can lead users, especially young ones, to treat the AI as a trusted authority whose answers don’t need fact-checking.
It is a critical moment for society, as AI continues to evolve at light speed while humans try to wrap their heads around the technology itself. For the youth of today, however, this is a moment to safely learn and engage with a tool that will most likely shape their future.