ONE IN EVERY THREE STUDENTS AGED 13 TO 15 IS A VICTIM OF CYBERBULLYING: DATA


Sun 30 September 2023:

Anxiety, emotional distress and even child suicide are among the damaging consequences of cyberbullying, and better prevention strategies involving big tech must be developed, the UN Human Rights Council heard on Wednesday.

According to findings from the UN Children’s Fund (UNICEF), 130 million students worldwide experience bullying, a problem that has been exacerbated by the spread of digital technologies. UNICEF estimates that one in every three students aged 13 to 15 is a victim.

The Council heard heartfelt testimony from 15-year-old Santa Rose Mary, a children’s advocate from Uganda, who said that once one’s personal information or intimate photos have been shared online, “you can’t even face the community where you live, you can’t even face your own parents”.

She warned that such situations can drive a child to take their own life when they “have that feeling of not being needed in the community”.

UN deputy human rights chief Nada Al-Nashif noted that, according to the Committee on the Elimination of Discrimination against Women (CEDAW), cyberbullying affects girls almost twice as much as boys.

Al-Nashif cited research from the UN World Health Organization (WHO) showing that children who are subjected to bullying are more likely to skip school, perform worse on tests, and can suffer sleeplessness and psychosomatic pain.

Some studies also show “far-reaching effects extending into adulthood”, such as high prevalence of depression and unemployment, she said.

Al-Nashif told the Council that the “complex” topic of cyberbullying lies at the intersection of human rights, digital and policy issues.

“To get this right, we must adopt a holistic approach, and address root causes”, she said, underscoring that “central to this is the voice of children themselves”.

She also stressed the “centrality and power of companies in the online space”, insisting that tech companies have a responsibility to provide suitable privacy tools and to follow content moderation guidelines “in line with international human rights standards”.

A representative of Meta, Safety Policy Director Deepali Liberhan, took part in the discussion and spoke to the magnitude of the problem.

She said that in the third quarter of 2023 alone, some 15 million pieces of content constituting bullying and harassment had been detected on Meta’s platforms Facebook and Instagram, most of which the company removed proactively before it was even reported.

Ms. Liberhan highlighted the company’s content moderation policies and ways in which Meta was enforcing them on its platforms, partnering with experts to inform the action it takes, and incorporating anti-bullying tools into the user experience.

At the conclusion of the session, panellist Philip Jaffé, Member of the Committee on the Rights of the Child, stressed the “collective” responsibility for the safety of our children.

“We need to make children more aware of their rights and make States and other components of society more aware of their obligations to protect [them],” he insisted.

NEWS AGENCIES
