Liberal Party of Canada members have voted to support stricter limits on youth access to social media and artificial intelligence tools, as delegates wrapped up their national convention in Montreal.
Grassroots members backed two resolutions on Saturday targeting digital platforms used by children and teenagers. One proposal supports a legal minimum age of 16 for creating social media accounts and would require companies to actively prevent underage users from signing up.
Another resolution calls for a ban on AI chatbot access for anyone under 16, framing these tools as a potentially harmful form of digital interaction for young users.
Delegates debated both measures before adopting them by vote. The resolutions do not automatically become law; they are intended to shape future federal policy discussions and legislation.
Supporters argue the measures address growing concerns about how digital platforms affect youth behaviour, pointing to risks tied to online interactions, including reduced social engagement and exposure to harmful content.
The resolution documents claim some platforms have encouraged inappropriate conversations or contributed to harmful outcomes among vulnerable users, and advocates say stronger safeguards are therefore necessary.
Quebec Member of Parliament Rachel Bendayan backed the social media age restriction at the convention, saying Canadians are capable of a serious discussion about protecting young people online and that youth perspectives should be part of that conversation.
Critics, however, warn that outright bans may fail to achieve their intended results, with experts suggesting teenagers could bypass restrictions or migrate to less regulated digital spaces.
Banning access could push teens to hidden online communities
Taylor Owen, a media and communications expert at McGill University, said the problem stems from platform design rather than user behaviour. Companies have built environments that prioritize engagement over safety, he argued, so regulation should focus on forcing safer product design.
Owen also cautioned that banning access could push teens into private or hidden online communities, where they may face greater risks with fewer safeguards. Policymakers, he added, should avoid placing responsibility solely on young users.
He has advocated for an independent regulator to oversee digital platforms in Canada, a body that would require companies to conduct risk assessments, provide transparency and meet enforceable standards designed to protect users.
Prime Minister Mark Carney has acknowledged that age restrictions remain under consideration, saying the government continues to examine options as it develops online harms legislation. He has not committed to a specific policy direction.
At a recent news conference in Tokyo, Carney described the issue as one requiring careful debate, with policymakers needing to weigh safety concerns against practical enforcement challenges. The government has not finalized its approach.
Advocacy groups have also pushed for stricter controls on youth access to digital platforms. Unplugged Canada, a national organization, has urged Ottawa to adopt a minimum age of 16 for social media use, and its open letter has attracted thousands of supporters.
One case tied to Tumbler Ridge incident
The letter includes endorsements from medical and mental health organizations, which argue that digital platforms should meet safety standards before being offered to young users. They compare the issue to road safety regulation, which combines age limits, engineering standards and oversight.
Supporters of stricter rules say age limits alone cannot address every risk and call for a comprehensive framework combining platform accountability, education and regulatory enforcement.
Meanwhile, the federal government continues to revisit elements of the proposed Online Harms Act. The legislation previously stalled but remains a key focus of digital policy discussions, and experts say any new framework must address emerging technologies such as AI chatbots.
Owen said excluding chatbots from regulation would undermine the effectiveness of any online safety law, noting that public concern has increasingly shifted toward these tools and that policymakers face pressure to broaden the scope of regulation accordingly.
Recent events have intensified scrutiny of AI platforms. One case involved a chatbot account linked to a violent incident in Tumbler Ridge, British Columbia. The company identified and banned the account months before the attack but did not alert authorities.
The case raised questions about the responsibilities of technology companies and highlighted gaps in communication between private firms and law enforcement.
Public opinion appears to favour stronger restrictions on youth access to digital platforms: a recent Angus Reid Institute poll found about three-quarters of respondents supported a ban on social media use for those under 16.
Experts caution, however, that public support does not guarantee effective policy outcomes, stressing the need for balanced approaches that combine regulation with education and platform reform.