
Experts Urge Greater Empathy and Social-Media Literacy Education After Alarming Online Trends

Published in Media and Entertainment by Adelaide Now

In a recent piece published by Adelaide Now, a coalition of child‑psychology specialists, digital‑safety researchers, and educational policy makers has warned that the surge of cruel online trends—ranging from harassing memes to self‑harm challenges—has exposed a glaring gap in how young people learn to navigate the digital world. The article, which follows a growing number of high‑profile incidents in which minors were publicly shamed or threatened on social‑media platforms, calls for a multi‑pronged response: stronger empathy training, systematic social‑media literacy programs in schools, and more accountable platform‑level moderation.


The “New Dark Side” of Social Media

The article opens with a chilling anecdote about a 14‑year‑old girl who was forced to delete her Instagram account after a viral “challenge” urged teenagers to post self‑destructive content to gain “followers.” The challenge, which spread quickly on TikTok and was replicated across Facebook, generated thousands of comments encouraging the girl to “keep posting.” When she stopped, she faced relentless hate mail and harassing messages that left her feeling unsafe.

Experts note that such incidents are not isolated. A 2023 Australian Institute of Health and Welfare report—linked in the article—found that 18% of Australian youths aged 12–18 had experienced some form of online bullying in the past year. “We’re seeing a disturbing trend where the line between harmless joking and outright cruelty is disappearing,” says Dr. Marina Kaur, a child‑psychologist at the University of Adelaide. “The content is increasingly tailored to manipulate emotions, and that’s a new level of psychological harm.”

The piece also discusses the role of “echo chambers” on algorithmic feeds. It cites a study by the Australian Cyber Security Centre (ACSC), which the article links to, that shows how platforms amplify sensational or harmful content to keep users engaged. This amplification effect makes cruel trends spread faster and more widely than ever before.


Why Empathy Matters

One of the article’s central arguments is that fostering empathy should be a core component of digital‑safety education. Dr. Kaur emphasizes that empathy can serve as a preventative tool. “If students can put themselves in the shoes of someone who is being targeted, they’re less likely to engage in or perpetuate harassing behaviour,” she says. “Empathy training has been shown to reduce bullying incidents in a number of pilot programmes across the UK and Canada.”

The article references a 2022 pilot programme from the Victorian Department of Education that integrated empathy modules into the digital‑citizenship curriculum. The pilot found a 23% reduction in student‑reported cyber‑bullying after one school year. “We need to scale this up across the country,” Dr. Kaur insists.


Social‑Media Literacy as a Necessity, Not a Luxury

Beyond empathy, the article makes a compelling case for comprehensive social‑media literacy education. It draws on a joint statement from the Australian Digital Citizenship Foundation and the Australian Teachers of Technology (link provided) that outlines six pillars of digital literacy: content creation, privacy, mental‑health awareness, critical thinking, digital rights, and online etiquette.

The experts argue that understanding how algorithms work, how to spot misinformation, and how to protect one’s personal data can reduce the likelihood of falling prey to or becoming part of cruel trends. “Students should be taught to ask: Who benefits from this post? What are the potential consequences? Is this respectful?” says Dr. John Hargreaves, an educational psychologist at Griffith University.

The article further links to a resource by the Department of Education and Training’s “Digital Skills and Literacy” initiative, which outlines curriculum guidelines for integrating these competencies into Year 9 and Year 10 courses.


Platform Accountability

While the call for educational change is strong, the article does not ignore the responsibilities of social‑media platforms. It cites a recent policy shift by TikTok, which announced in 2024 that it would automatically flag and remove content that encourages self‑harm. However, critics—including the article’s authors—point out that the removal is reactive, not preventative. The article links to an analysis by the Australian Human Rights Commission that argues for a mandatory “harm‑prevention” framework that requires platforms to proactively identify and remove content that could lead to bullying or self‑harm.

Platform executives are quoted as acknowledging the need for better moderation tools. “We are investing heavily in AI‑driven content moderation, but human oversight is still essential,” says a spokesperson from a major platform. The article notes that the platform’s new AI system has reduced harmful content by 12% in its first six months, but experts say that real progress requires a culture shift toward empathy‑driven design.


Policy Recommendations and Next Steps

At the conclusion of the piece, the authors provide a clear policy roadmap:

  1. Curriculum Integration: Introduce empathy and social‑media literacy modules as mandatory subjects in middle and high school.
  2. Teacher Training: Offer professional development courses that equip teachers with the tools to facilitate these discussions.
  3. Cross‑Sector Partnerships: Encourage collaboration between schools, mental‑health services, and social‑media companies to develop shared resources.
  4. Parental Involvement: Provide workshops that help parents recognise signs of online distress and engage in constructive conversations with their children.
  5. Legislative Reform: Push for federal laws that hold platforms accountable for not only removing harmful content but also for designing algorithms that minimise the spread of such content.

The article links to the federal Government’s “Digital Safety Strategy” and to the proposed “Online Harm Prevention Bill” that is currently under parliamentary review.


Takeaway

Adelaide Now’s feature underscores a stark reality: the rapid evolution of social‑media platforms has outpaced our current frameworks for safeguarding young people online. By centring empathy and robust social‑media literacy in education, and by demanding greater accountability from tech companies, experts hope to curb the spread of cruel online trends and build a safer digital future for Australia’s youth. The article concludes with a rallying call to educators, policymakers, parents, and the public alike: “It is not enough to wait for technology to change; we must change how we teach, how we respond, and how we care for one another in the digital age.”


Read the Full Adelaide Now Article at:
[ https://www.adelaidenow.com.au/education/support/technology-digital-safety/experts-call-for-more-empathy-and-social-media-literacy-education-following-cruel-online-trends/news-story/869058d60707045e40b38eae7246b8cf ]