The Australian government has made strides toward enforcing age verification for social media users as it approaches the December deadline for new legislation aimed at protecting children online. The Age Check Certification Scheme (ACCS) partnered with KJR, a user-focused software consultancy, to produce a preliminary report on the outcomes of an age assurance technology trial. The report, published on Friday, describes patterns and trends in broad terms but does not include detailed outcomes from the tests that were run.
Under the new laws, access to major platforms such as TikTok, Instagram, Snapchat and Facebook must be prohibited for all users under the age of 16. The government argues that the ban is necessary to protect the mental health and well-being of children and teenagers, and it responds to growing parental and public concern about social media use.
Preliminary Findings
The two-page summary from the Age Check Certification Scheme and KJR outlines critical challenges with age verification technology. Significantly, the trial’s results showed that there is no “one-size-fits-all” approach to age assurance. Rather, it will be up to social media platforms to determine and adopt the technology that best meets their needs.
Tony Allen, a spokesperson for KJR, commented on the findings, stating, “The preliminary findings indicate that there are no significant technological barriers preventing the deployment of effective age assurance systems.” He added, “These solutions are technically feasible, can be integrated flexibly into existing services and can support the safety and rights of children online.”
The report found that face-scanning technology tested on school students could estimate a student’s age only to within 18 months, and only in 85 percent of cases. This variability highlights the challenge of deploying meaningful age verification across a range of online platforms.
Yet the report stops short of including key information such as which tests were run, which methods and technologies were evaluated, and how each performed. This lack of transparency has frustrated stakeholders trying to assess the effectiveness of the proposed solutions.
Legislative Context
Australia is set to restrict under-16s from using TikTok and other major social media platforms under the new legislation, a change intended to address concerns about young people’s engagement with online spaces. As it stands, the legislation provides a 12-month implementation period, giving the eSafety Commissioner time to develop appropriate enforcement plans for the age restriction.
The government and members of the Coalition argue that these measures are necessary to protect children from harm, with a spokesperson for Anika Wells stating, “We know that social media age restrictions will not be the be-all and end-all solution for harms experienced by young people online, but it’s a step in the right direction to keep our kids safer.”
Many critics have raised doubts about the ban’s prospects of success. Lisa Given said, “I don’t believe the ban is sustainable,” stressing that the focus should be on more holistic solutions rather than enforcement and restrictions alone. She added, “The government must get this right. No more young lives can be lost or families destroyed because of the toxicity of social media.”
Future Implications
The complete findings from the trial are due to be delivered to the communications minister by the end of next month and released to the public in the months that follow. Stakeholders across the nation await the results, which should shed light on the efficacy of different age verification technologies and their potential effects on minors’ social media use.
The preliminary report also raised “concerning evidence” that some social media companies might be “over-anticipating the eventual needs of regulators” due to a lack of specific guidance from the government. This highlights gaps in the regulatory framework that must be addressed to foster innovation while protecting children.