

The Department of Internal Affairs (DIA) wants to make social media subject to regulation similar to that of traditional media platforms.

It has today released a discussion document for consultation which aims to make it less common for people to see harmful and illegal content.

The department said different standards and rules currently applied to different platforms and the regulations were decades old and predated social media.

The proposed reforms would bring them all into one framework, to be overseen by a new independent regulator.

Its main focus would remain on the most risky material such as harm to children, and promotion of terrorism and violent extremism.

DIA said freedom of expression would be protected, and there would be no powers over editorial decision-making or individual users who shared legal content.

Govt says current system confusing, not as strong as it should be

DIA said the existing child and consumer protection was not as strong as it should be, was difficult to navigate and had big gaps.

People had to figure out which of five industry complaint bodies to go to if they felt content was unsafe or breached conditions, and not all forms of content were covered by those bodies.

It said the status quo was slow and reactive. Authorities could only take action after people had already been harmed.

Because of this, New Zealanders were being exposed to more harmful content than ever before, it said.

DIA said the proposed new system would apply some of the accountability mechanisms traditional media were subject to, to social media.

Platforms over a certain size (online or traditional media) would be required by law to comply with codes of practice to manage content and address complaints about specific harmful content.

"The system would keep powers of censorship for the most extreme types of harmful content," DIA said.

This included child sexual exploitation and promotion of terrorism material ('objectionable material'), and it was not proposing to change the definition of what was illegal.

"This material is already illegal, and it will remain illegal to produce, publish, possess, and share."

But other illegal content - such as harassment or threats to injure or kill - can be taken less seriously or even amplified online. The regulator would have powers to require illegal material to be removed quickly.

Parliament would set expectations that platforms must achieve, and there would be codes of practice with more detailed minimum expectations.

A new independent regulator would be responsible for approving the codes and making sure platforms complied with them.

"Government would only be involved with individual pieces of content if they are, or could be, illegal - this power already exists.

"We want to create safer platforms while preserving essential rights like freedom of expression, freedom of the press, and the benefits of media platforms.

"This is why public feedback on the review is essential."

Sector or industry organisations would help come up with enforceable codes of practice.

DIA said the proposal was a deliberate shift away from the status quo of regulating content, towards regulating platforms.

The existing system has processes in place to ensure that broadcasters like television, radio, and other traditional media comply with existing codes. These are a mixture of government and industry-led regulations.

Social media does not have similar compliance requirements in New Zealand. The proposed new regulator would make sure social media platforms followed codes to keep people safe.

Media services like TV and radio broadcasters would also need to follow new codes tailored to their industries.

Child protection and consumer safety not effective - DIA

The DIA report released today said it had heard widespread concerns about the harm some content was causing children and young people.
