The legal text, which will come into force in 12 months, was approved on Thursday the 28th by the Senate with 34 votes in favor and 19 against, after obtaining the green light in the House of Representatives with 101 votes in favor and 13 against. The law even has the support of some of the opposition.
The regulations, the first of their kind in Oceania, seek to bar minors under 16 from accessing social networks. They also provide for fines of up to 50 million Australian dollars (US$33 million) for platforms that fail to comply.
“This is about protecting young people, not punishing or isolating them,” said Michelle Rowland, Australia’s communications minister, who noted that drug abuse, eating disorders and violence are some of the harms children can encounter online.
The House of Representatives must still ratify the amendments introduced by the opposition in the Senate, a procedure that is considered a formality, since the Government has assured that they will be approved.
Nevertheless, critics say there are important unanswered questions about how the law will be applied, how users’ privacy will be protected and, crucially, whether the ban will actually protect children.
Consulted by El Comercio on the subject, Óscar Montezuma, a lawyer specializing in technology law, said that “these types of prohibitions have already been tried in the past and do not prevent minors from finding other ways to access the technology.” According to the expert, the approach could produce adverse effects, such as the creation of a black market for access. “It is more important to build minors’ capacity for the safe and responsible use of these platforms,” he added, stressing that education in schools is a key tool for addressing the problem.
For his part, Erick Iriarte, also an expert in digital law, noted that “platforms already set age limits as part of their self-regulation, but these measures tend to remain declarative. A minor can change their date of birth and the platforms have no way to verify it without a more robust system.” He added that the law seeks to turn this self-regulation into a legal obligation, requiring platforms to implement effective identity-validation mechanisms.
How does the new law work?
The initiative aims to protect children and adolescents from bullying and potential mental health problems. To achieve this, it creates a new category, “age-restricted social media platforms,” which minors under 16, including those who already hold active accounts, are barred from accessing.
Among the main networks affected are Facebook, Instagram (both Meta), Reddit, Snapchat, X and TikTok. However, platforms considered low risk, such as YouTube, are exempt from these regulations.
Companies will have a year to define how they will implement the restrictions before the sanctions take effect.
According to polls cited by The New York Times, the majority of Australians support the ban. Parent groups have been broadly supportive, although some say the law doesn’t go far enough and should cover more platforms.
But to digital media experts and some parent groups, the mixed nature of the platforms included in and excluded from the ban makes it unclear what, exactly, children are meant to be protected from.
Criticism and detractors
Senator David Shoebridge, of the Greens, said mental health specialists fear the law could dangerously isolate children who use social networks to find support, especially in regional communities and among the LGBTQI population. Shoebridge described the measure as “deeply flawed” and “dangerous.”
Amnesty International also expressed concern, noting that the law “does not address the fundamental problem that social media companies profit from harmful content, addictive algorithms and the surveillance of users,” according to a statement published Thursday.
The rule is designed so that responsibility for implementing these measures falls on the platforms, not on parents or minors. Montezuma questions this approach, saying that “digital literacy should be a priority in families and schools, not something that depends only on the platforms.” He also pointed out that “there are effective parental controls that allow parents to restrict certain content, although they are not perfect.” In Finland, for example, schools have implemented programs to teach children to spot fake news, a measure he considers more effective than simply blocking access.
Iriarte agrees, noting that “a regulatory solution is not enough; a training approach is also necessary. We must work on raising awareness and teaching users, especially minors, how to navigate these digital environments.”
Specialists consider it a mistake to assume that everything that happens on social networks is negative; there are also positive dynamics and support networks that could be affected. Furthermore, the biometric controls needed to implement these measures would be complex and expensive, raising the question of whether the benefits justify the high cost of implementation.
For their part, technology giants such as Meta and Google have asked the Australian Government for more time to develop age verification systems.
Precedents around the world
Australia is not the first country to take measures of this type. Similar initiatives already exist in the United States and the European Union.
In the US, 14 state attorneys general, both Democrats and Republicans, have sued TikTok over its negative impact on children’s mental health, accusing it of using addictive algorithms to profit from minors. Lawsuits have also been filed on similar grounds against Meta, owner of Instagram, WhatsApp and Facebook.
In March, a bill was approved that prohibits minors under 13 from opening accounts on social networks. New York has gone further with two laws: one requires parental consent for minors under 18 to use recommendation algorithms and another limits the collection of data from minors.
In Europe, the General Data Protection Regulation establishes 16 years as the minimum age to open accounts on social networks, although it allows member countries to reduce it to 13 years.
Since February 2024, the European Commission has been investigating TikTok due to complaints from parents who accuse the platform of causing serious damage to the mental health of their children, including cases of suicide.
In Spain, where the average age at which children get their first cell phone is 11, there are plans to raise the minimum age for opening social media accounts from 14 to 16.
Currently, 98.5% of Spanish adolescents are registered on some social network, and 83.5% on three or more.
In July 2024, Puerto Rico established 18 years as the minimum age to open a social media account.