The social network has said more users are using features like groups and messaging, and it has to take action to stop misinformation, scams and "problematic" content from spreading on the platform.
Guy Rosen, Facebook's vice president of integrity, said: "Ultimately, the balance between protecting people's privacy and protecting public safety is something that societies have been grappling with for centuries probably, and we're certainly grappling with it."
Over the coming weeks, it will look at how moderators and administrators in groups decide what content to keep up, and use that to establish whether a community is violating the rules.
It will also launch a Group Quality feature to let admins see what content has been flagged and removed.
As a penalty, groups that repeatedly share misinformation will be shown lower down in the News Feed.
Facebook has confirmed that it uses human reviewers, technology and user reports to flag and remove offensive content, even in secret groups.
Rosen explained that this means the social network can be more proactive by deleting content before someone has even reported it.