'A double-edged sword': Why some parents have concerns about introducing AI at Primary 4
With the Ministry of Education introducing artificial intelligence to Primary 4 pupils, parents are caught between wanting their children to be prepared for the future and worrying that they are too young for such technology.
Ms Shelina Singh looks on as her son Jayden uses his laptop at home on Apr 23, 2026. (Photo: CNA/Jeremy Long)
At just five years old, Mr Haojun See's son is already intimately familiar with using artificial intelligence (AI).
The kindergartener uses generative-AI tools to whip up the outlines of various objects and creatures, including his favourite one at the moment – dinosaurs. He sends them to the printer before he sits down to colour them.
The 40-year-old entrepreneur said: "If I control their AI usage, and one day I'm not there, something has the potential to go wrong. So I started teaching them (to use AI tools independently) as soon as the technology was introduced."
He added that he has had many open conversations with his two sons about the possibilities, limitations and potential harms of AI and other technologies.
In contrast, Ms Ariel Ng's nine-year-old daughter has only used an AI tool at home once, before she was warned never to do it again.
"She talked to it as though she thought it was a real person … That concerned me," Ms Ng said of the time her child chatted with the Meta AI function on her phone's WhatsApp application.
"From the way she used it, it made me feel like she doesn't know what she's doing."
The 37-year-old ergonomist added that her husband also tried expressing thoughts of self-harm to a generative-AI tool as an experiment, and found that the chatbot responded in a manner encouraging such worrying behaviour.
This made her wary of the possible dangers her daughter might face while interacting with AI.
Singapore parents' views towards their children's use of AI can be broadly split into two camps: those who are keen to get their kids acquainted with its benefits, and those who are less eager due to its potential downsides.
But for those in the second camp, it appears inevitable that their children will be introduced to at least some form of AI, whether they like it or not.
In February, Member of Parliament Cai Yinzhou of Bishan-Toa Payoh Group Representation Constituency filed a parliamentary question seeking clarity on students' AI usage, its correlation with cognitive skill decline, and what interventions are planned to prevent an over-reliance.
Education Minister Desmond Lee said in a written response that the Ministry of Education (MOE) does not have Singapore data on that correlation, and that it is now conducting studies on AI's impact on students’ learning.
He also said that AI is progressively being introduced in schools from Primary 4. At an education forum this month, he added that the move is done "under close supervision and low exposure".
In response to CNA TODAY's queries, MOE said that for younger students, the focus is on building strong fundamentals in literacy, numeracy, reasoning and self-discipline before gradually introducing AI to them.
It added that the use of AI from Primary 4 is structured and in-class, "mostly starting" with AI tools within the Student Learning Space (SLS), which is an online portal accessible to those in the national school system.
This AI literacy is part of the formal curriculum, co-curriculum and student resources, with MOE giving the following examples of how it is integrated:
- As part of the Character and Citizenship curriculum, students learn how to identify deepfakes in their Cyber Wellness classes
- Students learn to use AI tools in the "Code for Fun" programme
- Within the SLS, students may use the "Learning Assistant", which is a "teacher-activated dialogic agent" with safety guardrails to help users understand complex concepts through iterative questioning rather than providing direct answers
MOE added that before using the AI-enabled features in SLS, students go through self-paced modules to help them understand the benefits and risks of AI and how it should be used to support their learning.
As for AI use in tests and homework, schools may design assessments that allow it, provided students declare having done so. These are described as "low-stakes" opportunities to practise integrity and responsible AI use, and teachers intervene if they suspect that the work was not done by the student, MOE added.
Monash University's Professor Neil Selwyn, an education technology researcher with more than 30 years in the field, said that AI's presence in schools may already be further along than many parents realise.
"There is already a lot of AI in schools, whether we like it or not, baked into the big learning platforms and management systems that schools use," he said.
"Students are already encountering AI in their everyday uses of tech, so the idea of an AI-free classroom doesn't really make sense any more."
SLOWER DEVELOPMENT AND LOSS OF SKILLS?
CNA TODAY spoke to 15 parents of primary school-going children and found that two-thirds of them were not opposed to AI being introduced to their children at school – as long as the tools were vetted by MOE and did not generate messages that are dangerous or inappropriate.
Eleven-year-old Ernest Ho, for example, is already well-versed in various forms of AI.
He uses generative-AI products such as Google's Gemini and Microsoft Copilot to write code in his free time, with his father only checking in on his AI usage from time to time.
Ernest's father, Dr Shaun Ho, a 47-year-old university administrator, believes that early exposure to AI can be a good thing if children are taught how to use it responsibly.
"Ultimately, the kids will be exposed to AI. It's a part of our lives now. If they get their compasses right at the start, they will be prepared for all the different advances in AI going forward," he said.
However, even among parents who were agreeable to AI's introduction, concerns were raised about how exactly it is being executed and whether it might be "risky" for children at such a young age.
Ms Shelina Singh, a 41-year-old public relations manager whose 11-year-old son Jayden is in Primary 5, believes that formal exposure to AI should only take place once a child enters secondary school.
"Some adults I know are already losing their own ability to think critically and are already trusting AI blindly for every personal decision they need to make," she said.
Even though Ms Singh believes that the overall intent of introducing AI early on is positive, she fears it could become a "slippery slope".
"Once you get a taste of it, why not ask it about everything? It can tell you what to wear, tell you what to do on your holidays – literally any topic you want."
This potential overdependence worries parent-of-three Kate Lim as well.
Whenever her 11-year-old son struggled with a mathematics problem in the past, he would either approach his parents or his older brother for help.
Recently, though, at times when no one is around to help him at home, he has turned to ChatGPT instead to explain questions he doesn't understand.
Ms Lim does not restrict her son's use of AI because she feels it has indeed helped him, but she would still prefer a human to do the coaching.
"My biggest fear is if my youngest son starts to be more exposed to and reliant on AI, then it will take him away from human connections and reduce his trust in his own thinking," the speech therapist and counsellor said.
Experts said that such fears are warranted.
Assistant Professor Jacqueline Ho, a sociologist of education at Singapore Management University (SMU), pointed to existing research showing that AI can provide customised, on-demand feedback for students.
However, the same research shows that over-reliance on AI can impede the development of foundational skills such as reading, writing and thinking.
"This is particularly concerning for very young students whose neural pathways are still being formed," she added.
She noted, though, that there was not enough research to determine "the conditions under which AI produces these benefits and harms".
CNA TODAY has reached out to MOE on why Primary 4 was chosen as the age to introduce AI in schools.
WHAT'S REAL VS WHAT ISN'T
In the meantime, parents said that even if students' over-reliance on the tools is managed, there is still the worry that young children may find it hard to intuitively distinguish accurate information from inaccurate information.
Ms Valerie Tan, a 36-year-old private tutor, has three children, with her oldest in Primary 3 and her middle child in Primary 2.
She said that her kids found out about AI tools through their friends in school and came home one day asking if they could try "the GPT app that answers everything".
Since then, she has allowed her children to use generative-AI tools, but not before inserting contextual prompts such as: "You are conversing with an eight-year-old. Use only vocabulary that is suitable at a Primary 3 reading level, and do not generate any output that may be inappropriate for a child."
Although Ms Tan said that she generally agrees with MOE's approach to introduce the tools to students, she has reservations about the extent to which students will be able to understand the distinction between "true thinking" and the "predictions" of a large language model (LLM).
"Bearing in mind that the technology is only going to get better, it is now increasingly difficult to discern what’s real and what’s generated," she added.
Likewise, 53-year-old entrepreneur Eddie Lee, who has a son in Primary 5, encourages his son to use generative-AI tools at home to help him understand his homework.
However, Mr Lee is clear-eyed about the risks involved, so his son is only allowed to use them under his guidance, and only on Mr Lee's devices.
He recalled how, in the early days of the internet, young people would take whatever they found online "as gospel truth", and he worries that a new generation may do the same with AI.
"When the information becomes distorted and false, and when you start believing in it, then there's going to be a lot of problems with society as a whole."
THE DOWNSIDE FOR TEACHERS AND ITS EFFECT ON CHILDREN
The consensus among parents such as 37-year-old Lee Xin Ying, who works as the head of Chinese curriculum at a learning centre, is that introducing AI in schools is very much a "double-edged sword".
The mother of two children, aged seven and four, said that the upside to the technology is that it can bring a level of personalisation that is not viable for teachers in a mainstream school, given class sizes of 30 to 40 students.
She said AI can personalise the pace, the content and the extent of feedback according to what is known in psychology as the zone of proximal development – where a child can improve beyond their current level with the help of a "more knowledgeable other".
At the same time, given the large class sizes, Ms Lee is unsure how closely a teacher can supervise and monitor students' AI usage.
"The teacher cannot be going around the class, giving instant feedback. And that's where AI can supplement them.
"But it still has to be checked at the end of the day, because AI can generate the wrong things.
"If there is any direct interaction between a student and AI, preferably some adult or knowledgeable person can evaluate the responses and the interaction ... which is quite a lot of work."
Prof Selwyn from Monash University echoed these concerns.
He noted that such personalised learning systems are sometimes called AI tutors, which he believes to be "okay in small doses and for very procedural forms of learning".
Yet, in contrast with other forms of learning that require students to work with a teacher, he said AI tutors and personalised learning systems "offer narrow, dull and desocialised versions of learning".
"There is also the danger that an over-use of these systems de-professionalises and de-motivates classroom teachers, who can spend most of their time troubleshooting tech problems and maintaining classroom discipline," he added.
"Although the claim is that AI-driven learning 'frees up' teachers, the reality is that teachers often have to work in more robotic and scripted ways.
"The best teachers are inspiring, creative experts who can engage their students in the joys of learning."
MORE CLARITY NEEDED
Experts and parents agreed that much of the worry on the ground stems from uncertainty over how exactly AI is being taught or rolled out in primary schools.
There are also concerns about whether the "guardrails" put in place for responsible use of AI will leave too much room for interpretation, depending on the school or educator.
Asst Prof Ho from SMU gave the example of how one teacher may allow students to use AI to draft or outline their work, as long as they write the final product themselves. Another teacher may not allow AI in the drafting process at all.
"Both of these instructors could call their approaches 'responsible'," she said.
Ms Lim, the speech therapist, hopes for an open dialogue with her children's school on its specific plans to teach AI to her kids.
"If schools can be clear and consistent on how AI is used, and have a bit more education for the parents so we can supervise similarly at home, then it can work."
These calls for greater communication reflect a broader gap in how the policy was rolled out, Asst Prof Ho said.
"There is a feeling among some parents that they should have been consulted on such a major change, especially given that they are key stakeholders in their children's education."
She added that there are likely educators, students and parents who are resistant to the technology, but feel there is little room for their perspectives to be heard.
Going forward, Asst Prof Ho said, there needs to be greater involvement of stakeholders in decision-making around AI adoption and "not just communication about decisions already made".
MOE said that parents are key partners in supporting students' AI learning journey, and that the ministry will continue to provide parents with more resources to guide their children's use of AI.
It added that it will also "work closely with parents to ensure that students build good learning habits and self-regulation, and learn to use AI responsibly and safely".
Mrs Michelle Lim – a 37-year-old parenting coach with three children aged nine to 11 – said that she does not object to the use of AI in general, but draws the line at introducing it to children too early.
She believes that the focus of education at this early stage should be on developing children's emotional and relational intelligence and critical thinking skills, rather than their AI capabilities.
"These are skills they will carry for the rest of their lives," she said.
"Human skills should come first – then we complement it with technology."