Aotearoa needs solutions to fight harmful online content

A new study published today highlights our growing concern about harmful content online.

Today the Classifications Office released its report, ‘What We’re Watching – New Zealanders’ opinions of what we see on screen and online’.

The report shows that 83% of New Zealanders are concerned about harmful or inappropriate content on social media, video-sharing sites or other websites. And more than half of us (53%) had seen content online that promotes or encourages harmful attitudes or behaviours, such as discrimination, terrorism or suicide.

InternetNZ Acting Chief Executive Andrew Cushen said the report’s findings add to growing evidence that this is a critical issue here in Aotearoa.

“Current systems for dealing with harmful, hateful and potentially illegal online content are not working for communities. There are real concerns, real risks and real hurts right now,” Cushen said.

The report’s findings show that most New Zealanders are concerned that our systems are not working well enough. It indicates that many lack the confidence to report harmful content. Most New Zealanders (74%) would consider reporting harmful, dangerous or illegal online content to an official agency in New Zealand. However, the results showed a high level of uncertainty about how to report such content, or what the response would be.

“Currently, harm reporting is split between different organisations and regulations. This makes it very difficult for people to know where to go, and also means that there is no shared data collection on the types of harm that occur.

“Harmful content online is a problem facing governments and communities globally. But while there are no easy or obvious answers, there are approaches we can take in New Zealand to make sure we find solutions that work best for the people and communities affected.

“The government is currently carrying out a thorough review of the content regulatory system. This is our chance to update the laws and regulations that have been put in place for a different world, but also to explore non-regulatory approaches.

“In order to put these systems in place, we need to listen to the people most affected by harmful behaviour online as well as the people and groups in our communities who are already working on these issues. The coordination and resources for this kind of dialogue could start now and should be part of how the system continues to evolve over time,” Cushen says.

Although online harm is widespread, we know that certain groups and communities are more affected than others. For example, the report states that it is more common for Māori and Pacific participants to see content promoting hatred or discrimination based on race, culture and religion. And it’s more common for young people between the ages of 16 and 29 to see content promoting violent extremism or terrorism.

The report also shows that people lack trust in tech companies to keep them safe. Only 33% ‘somewhat’ or ‘strongly’ agree that online platforms provide what people need to stay safe.

At the same time, Meta’s suite of products dominates the social media landscape in New Zealand. InternetNZ research shows that 79% of New Zealanders who are online use Facebook, Facebook Messenger, Instagram or WhatsApp on a daily basis.

“The government – together with communities, platforms and experts – must find effective ways to ensure that these services are part of the solution rather than part of the problem.

“We believe the content regulatory review is the best way to work through these issues. It could offer effective ways to address these harms, particularly if community voices are supported to participate in designing approaches that meet the needs of the people of Aotearoa,” says Cushen.

The ‘What We’re Watching’ report is available on the Classifications Office website.

© Scoop Media