For Compliance
Overview here: https://en.wikipedia.org/wiki/Online_Safety_Act_2023
Essence
Compliance with the Online Safety Act 2023 requires a proactive and ongoing commitment to online safety, with a focus on protecting users, especially children, from harm.
- BIG Position: we will make all reasonable efforts, proportionate to the probability and severity of the threat present
General overview
- Risk Assessments:
Organisations must conduct thorough risk assessments to identify potential harms on their platforms.
- BIG Position: Please add any risks you can identify by posting in this thread.
We will respond with measures
- Content Moderation:
Content moderation systems that proactively identify and remove illegal content are crucial.
This includes having clear policies and procedures for handling user reports of harmful content.
- BIG Position: Any person can flag any post at any time. Any moderator can delete any flagged content.
Proactive moderation, in the sense of acting before the event, is not possible. We will use best endeavours to minimise reactive response times
- Child Safety:
The Act places a strong emphasis on protecting children. Organizations must implement measures to prevent children’s exposure to harmful content and activities.
- BIG does not target nor ban people by age. We use the working premise that our users are over the age of majority, given the subject matter that we discuss.
BIG uses a standard of ‘normal societal acceptability’ - for example, what is shown on Netflix and the BBC after 21:00, as an informal benchmark
- Age verification and appropriate safety measures are essential
- Freedom of Expression:
Compliance efforts must balance online safety with users’ rights to freedom of expression.
- BIG: Users are free to express themselves within the bounds of the community’s norms, as judged by a representative group of members
- Moderation policies should be transparent and consistent.
- BIG: Anybody can flag anything at any time. If clearly in violation, it will be removed.
If it is a second strike, the user will be removed. If the content is of dubious status, the moderator will consult with one or more other moderators
- Transparency and Accountability:
Organizations must be transparent about their online safety policies and practices.
- BIG: This post, as updated from time to time, constitutes BIG’s policy and process
They must also be accountable for their actions in enforcing those policies.
- BIG: All decisions can be questioned and will be explained.
Explanations can be appealed to a different person
- Reporting Mechanisms:
Easy-to-use reporting mechanisms are essential so that users can report harmful content.
- BIG: All posts have an attached menu option to flag them (under the three-dot ellipsis menu)
- Regulation:
Ofcom is the regulator for this Act and will enforce the rules within it.
Key Elements of a Compliance Approach:
- Develop and implement clear online safety policies.
- BIG: This thread constitutes BIG’s policies and procedures
- Invest in content moderation technologies and resources.
- BIG: All members are expected to act as resources who can and will flag, moderate, and assist in any and all aspects as needed
- Provide user education on online safety.
- see https://…
- Establish clear reporting mechanisms.
- BIG: See flagging above
- Maintain records of compliance efforts.
- BIG: This post is record #1, dated 26/2/25
- Stay up to date with Ofcom guidelines.
Disclaimer:
- This information is for general guidance only and does not constitute legal advice or an undertaking