NZFVC webinar on consultation on regulating online services and media platforms


Thu 29 Jun 2023

We are hosting a webinar on 10 July 2023 on the consultation by Te Tari Taiwhenua | Department of Internal Affairs, which is seeking feedback on a new system for keeping people safe online. Submissions for the consultation are due by 31 July 2023.


NZFVC webinar

The New Zealand Family Violence Clearinghouse is hosting a webinar on Understanding the Safer Online Services and Media Platforms consultation. The webinar will give an overview of the proposed reforms and why they are important for people working in family violence and sexual violence. The webinar will be on Monday 10 July from 12:00pm to 1:00pm. Panellists for the webinar are Anjum Rahman from Inclusive Aotearoa Collective Tāhono, Kate Hannah from The Disinformation Project and Jo Robertson from The Light Project.

See more information about the panellists and register to attend

Consultation on regulating online services and media platforms

Te Tari Taiwhenua | Department of Internal Affairs (DIA) has proposed changes to the way media and online platforms are regulated in New Zealand to keep people safe online. DIA is inviting feedback on these reforms.

Submissions to the Safer Online Services and Media Platforms consultation close 31 July 2023.

Suzanne Doig, General Manager Policy at the Department of Internal Affairs, says "Child protection and consumer safety for media and online content is not as strong as it should be in Aotearoa. The existing regulatory system is decades old and predates media platforms such as social media."

Why is reform needed?

The regulatory legislation, including the Films, Videos, and Publications Classification Act 1993 and the Broadcasting Act 1989, is over 30 years old. It does not cover the range of harms people experience across online services and media platforms.

A key focus of the proposed reform is protecting children and young people from harm caused by online content. The DIA Frequently Asked Questions also highlight that many people are harmed by online content:

"During the targeted engagement phase of the programme, we also repeatedly heard concerns about how social media is used to harass, bully, or otherwise harm people. This concern was consistent across our engagement with Māori, young people, advocates for children, women, older people, disabled people, LGBTQIA+ communities, and communities that have been or are targets of ethnic, racial and/or religious discrimination.

These are real harms that are happening to New Zealanders. That’s why we need to get these settings right."

Misogynistic abuse and violent threats against women in New Zealand, including wāhine Māori, were also identified as a significant issue.

The consultation Discussion Document highlights many gaps and issues in the current regulatory system including challenges for seeking help or reporting harmful content:

"New Zealanders must figure out which of five industry complaint bodies to go to if they feel content is unsafe or breaches the conditions of the platform it is on. On top of that, not all forms of content are covered by those bodies. The system is also very reactive because it relies mainly on complaints about individual pieces of content. For most forms of content, we do not have the tools and powers to ensure that platforms are doing what they should to manage the risks of harmful content."

The Discussion Document also highlights that under our current system, behaviour that is illegal is sometimes tolerated online:

"Our current system has legal powers to deal with the most awful and illegal content like child sexual exploitation and promotion of terrorism, regardless of whether it is delivered online or through traditional forms of media such as printed publications. But sometimes content that includes other illegal actions (such as threatening to injure) can be taken less seriously or even amplified online."

The major change in the proposed reform is the way that social media platforms are regulated. Social media content is not consistently regulated in New Zealand. Many platforms have their own systems for dealing with unsafe content, but these are not overseen by a regulatory authority in New Zealand and are often voluntary. The Discussion Document notes:

"Unlike traditional broadcasters, like television and radio, online platforms do not have a single agreed code of standards, ethics, and rules. While platforms have their own policies to manage these harms, it is now internationally acknowledged that they need to be brought into formal regulatory systems to reduce the risk of harm."

What is proposed?

The proposed reform outlines a new way to regulate providers, including social media platforms (such as Facebook, YouTube and Twitter) and traditional media platforms (such as radio and TV), under one framework. The focus is on regulating platforms rather than content. The purpose of the reforms is to enhance protections by reducing exposure to harmful content regardless of the platform. The changes would bring all platforms into one framework with consistent safety standards.

There are 4 key elements to the proposed changes. These elements are based on the principle that platforms are responsible for the safety of the products and services they provide. The 4 elements are:

  • an industry regulation model that uses codes of practice
  • an independent regulator
  • continuing to remove and block access to the most harmful content
  • investment in education and awareness

The consultation Factsheet explains the different roles in the proposed changes:

  • "Parliament will set New Zealand’s expectations for the safety-based outcomes platforms must achieve.
  • Codes of practice will set out more detailed minimum expectations for harm minimisation, user protection and transparency across services.
  • An independent regulator would be responsible for approving the codes and making sure platforms comply with those codes, as well as funding and finding opportunities for education."

The Discussion Document provides more detailed information about the key elements, the different roles and how the codes of practice would work. It also includes 4 appendices covering the current situation, the principles guiding the work, the rights framework in New Zealand, and a comparison of frameworks in other countries.

DIA is not proposing changes to what is considered illegal in New Zealand. The Discussion Document says:

"The system would retain powers of censorship for the most extreme types of content (called ‘objectionable’ material). This material is already illegal, and it will remain illegal to produce, publish, possess and share."

And it goes on to say:

"The regulator would also have powers to require illegal material to be removed quickly from public availability in New Zealand. These powers exist already for objectionable material. We are proposing that the regulator should also have powers to deal with material that is illegal for other reasons, such as harassment or threats to kill. We seek your feedback on what other kinds of illegal material the regulator should have powers to deal with."

The proposed changes focus on regulating platforms based on their role in distributing harmful content. The current proposals would significantly affect the functions of the Classification Office, Film and Video Labelling Body and Broadcasting Standards Authority (see page 65 of the Discussion Document).

The Discussion Document also notes:

"Organisations such as Netsafe will continue to help people navigate the new framework, and generally educate people on how to keep themselves safe online. Netsafe would also be an important partner in helping to identify emerging systemic issues for the regulator’s attention, as well as supporting the regulator’s monitoring and public awareness role."

More information

DIA has prepared several documents to explain the changes, including the Discussion Document, a Factsheet and Frequently Asked Questions.

Most of these documents are available in alternative formats including large print, audio recording and video recording with sign language. You can find all the information on the DIA Public Consultation: Safer Online Services and Media Platforms webpage.

DIA is hosting information session webinars. These sessions will explain the proposed changes and how to give feedback, and will answer questions. Two sessions remain, on 8 July 2023 and 20 July 2023. Register for the free sessions.

InternetNZ | Ipurangi Aotearoa is hosting a free webinar on the basics of the Safer Online Services and Media Platforms framework on 4 July 2023. Register for the webinar.

DIA has also provided background information and a report summarising key themes from initial targeted engagement, Content Regulatory Review Summary of initial targeted engagement: September 2021 (published April 2022).

Update: InternetNZ has created short guides about key aspects of the Safer Online Services and Media Platforms consultation, including information about how the consultation considers Te Tiriti o Waitangi and issues for Māori.

Update: The Institute for Strategic Dialogue published the policy paper Misogynistic Pathways to Radicalisation: Recommended Measures for Platforms to Assess and Mitigate Online Gender-Based Violence in September 2023.

Update: Announcing its 2022/23 Annual Report, the Broadcasting Standards Authority said:

"Urgent and long-overdue reforms are needed to bring outdated laws and regulations into line with today’s broadcasting reality and ensure a sustainable media sector, the Broadcasting Standards Authority says.

In its 2022/23 annual report published today, the BSA emphasises the need for change to the 34-year-old legislation it operates under, to respond to the risks of a fast-changing media landscape."

Update: Te Kāhui Tika Tangata | Human Rights Commission published the independent report, How to improve the Aotearoa New Zealand Code of Practice for Online Safety and Harms? in December 2023. The report looks at how the Code of Practice could be improved to address Te Tiriti and human rights. The Code is a voluntary code signed by Meta (Facebook and Instagram), Google (YouTube), TikTok, Twitch, and Twitter (now X) to guide how these tech companies manage online harm in Aotearoa. Anjum Rahman, Founder and Project Co-Lead of Inclusive Aotearoa Collective Tāhono, spoke with radio 531pi about the report and issues with the Code. Aliya 'Allyn' Danzeisen, National Coordinator for Islamic Women's Council New Zealand, was recently interviewed about digital responsibility and the part the government needs to play.

The 2023 report from the World Internet Project (NZ) found that 61% of people believed social media companies should be more strongly regulated than they are now. See The Press article Most Kiwis spending at least 5 hours a day online but trust in social media low.

How to give feedback

The Discussion Document outlines 26 questions. You can provide feedback on some or all of these questions. You can also provide feedback on issues that are not addressed in the questions.

You can give feedback in several ways; see the DIA consultation webpage for details on how to make a submission.

Questions for feedback are listed on pages 12-13 of the Discussion Document. For each question, the document lists the page where you can find the relevant information.

Comments from community agencies

Anjum Rahman, Co-Lead of Inclusive Aotearoa Collective Tāhono, “...welcome[d] the move to an independent regulator that will be protected from political interference by the State or any political actors.” However, Rahman highlighted concerns, saying:

“...as the proposal is for the regulator to regulate platforms rather than content, the content regulation is left to industry codes of practice.

"We have been advocating for the past two years to have proper involvement of impacted communities in the development of the Aotearoa New Zealand Code of Practice for Online Safety and Harms administered by NZ Tech. We continue to be of the opinion that this Code will not serve the needs of the communities who need it most.

“To have further codes of practice developed by industry, rather than civil society and communities, will lead to similar results. Any new regulatory structures must include a strong community voice embedded in the model, rather than at the discretion of industry actors."

InternetNZ Chief Executive Vivien Maidaborn welcomed the consultation, saying "It could potentially be one of the most important opportunities in a generation to enable regulation that helps address harmful online content." Maidaborn encouraged communities who disproportionately experience harm to give feedback, saying:

“It is especially important that communities that are at most risk have their say. This proposed regulation, and the codes to be developed under it, will not serve those communities unless they are heard during this process.

“Feedback from diverse voices is critical if the government is to get this right and the systems are going to be effective.”

Related news

InternetNZ | Ipurangi Aotearoa published a new report that explores what an ‘Internet for good’ means for people in Aotearoa. The report, An Internet that benefits (July 2023), shares findings from conversations with over 140 people across Aotearoa about their experiences with and aspirations for the internet. InternetNZ commissioned Toi Āria: Design for Public Good and Making Everything Achievable to do this research. The team actively sought to hear from tāngata whenua, Pasifika, people living with disability, LGBTQIA+ communities, migrant and former refugee communities and younger people. The report highlights 7 themes:

  • "Our Aotearoa context is unique
  • The Internet is here to stay
  • The Internet is changing us
  • The Internet is not safe for everybody
  • A better Internet is a more accessible Internet
  • A better Internet is a more diverse Internet
  • A better Internet needs better education".

For more information see the InternetNZ media releases New research shows people want an active part in shaping the future of the Internet and Aspirations for a better Internet for all of Aotearoa.

Related media

Young person found hoarding thousands of child exploitation, animal cruelty images, RNZ, 26.03.2024

Jo Robertson: Make Sense co-founder on Spark becoming New Zealand's first internet provider to join the Internet Watch Foundation, NewstalkZB, 24.03.2024

Combating child porn: New Zealand’s digital child exploitation filters need an overhaul, NZ Herald, 24.03.2024

Spark to introduce barriers to help stamp out objectionable material, cyber risks and scams, RNZ, 21.03.2024

Five years since the Christchurch terror attacks, are our online spaces any safer?, The Spinoff, 15.03.2024

Regulation needed for online media giants, Waatea News, 30.11.2023

‘I warned Mark Zuckerberg teens weren’t safe on Instagram – he ignored me’, Stuff, 20.11.2023

Māori tech solutions are key to helping those scammed online, Te Ao Māori News, 01.11.2023

Indigenous women share racist online experiences at conference, Te Karere TVNZ, 29.10.2023

Amokura Panoho | Member of Te Pūkotahitanga, Waatea News, 25.10.2023

The harsh and unregulated reality of online safety for indigenous women, NZ Herald, 25.10.2023

'Rampant' increase of digital harm on indigenous women, conference told, Stuff, 25.10.2023

X fined more than $600,000 over child safety, One News, 16.10.2023

Tech: EU's new laws change the internet, RNZ, 24.08.2023

Young people need more support coping with online sexual harms, The Conversation, 07.08.2023

Tech: Spotting AI content and the UK's Online Safety Bill, Newstalk ZB, 22.07.2023

Social media: The kids are not all right, Newsroom, 19.07.2023

A once-in-a-generation opportunity to make a safer internet, Stuff, 15.07.2023

Inclusive Aotearoa Collective Tahono, RNZ, 14.07.2023

Trust issues blight Māori internet use, Waatea News, 07.07.2023

Alarm at Twitter inaction over videos of Christchurch mosque terrorist attacks, The Spinoff, 03.07.2023

The stabbing attack at the University of Waterloo underscores the dangers of polarizing rhetoric about gender, The Conversation, 01.07.2023

Blindfolded during sex, then blackmailed - extortioner jailed, Stuff, 28.06.2023

Tackling harmful content never going to be a simple discussion, 06.06.2023

Don’t turn online safety into a new culture war, Newsroom, 02.06.2023

DIA Media Regulation Overhaul Shows Promise, But Deserves Rigorous Public Scrutiny, Press Release: Brainbox, Scoop, 02.06.2023

Officials eye reforms to combat Internet harm [interview with Vivien Maidaborn], RNZ, 02.06.2023

New code of conduct for online content planned [interview with Suzanne Doig], RNZ, 02.06.2023

How a new proposal could change online safety in New Zealand, The Spinoff, 01.06.2023

Consultation opens on proposals to make online spaces safer, RNZ, 01.06.2023

Regulator proposed to protect NZers from harmful online content, One News, 01.06.2023

Campaigners call for tighter restrictions on illegal sexual content online, RNZ, 25.05.2023

Jacinda Ardern on countering disinformation, and the Christchurch Call with Elon Musk, Stuff, 01.11.2022

Image: Andrew Neel on Pexels
