Facebook makes it easier for creators to report impersonators | TechCrunch

"This is key to Facebook's continued success as a creator platform. Simply put, if unoriginal content and AI slop drown out original voices and reduce their ability to monetize, it will no longer be a destination creators prefer."
"Meta now says its earlier efforts have led to both views and time spent watching original content on Facebook approximately doubling during the second half of 2025, compared with the same time the year before. It also said it's made progress on removing impersonators, with 20 million accounts removed in total last year and a 33% drop in the number of impersonation reports related to large creators."
"Now, Facebook says it's testing enhancements to its content protection tools, which let creators take action when their reels are detected being republished by impersonators across Meta's platforms. From this dashboard, creators can report that content. With the coming update, Meta aims to make the reporting process even easier by allowing creators to submit reports all in one place."
Meta announced new tools to detect impersonation, along with updated creator guidelines defining original content on Facebook. Following accusations that the platform had become an "AI slop hellscape," the company implemented changes to reduce unoriginal and AI-generated content and elevate original creators. Meta reports that views and time spent on original content approximately doubled in the second half of 2025 compared with the same period the year before. The company removed 20 million impersonator accounts last year and achieved a 33% reduction in impersonation reports related to large creators. New enhancements to its content protection tools let creators report impersonated reels across Meta's platforms through a unified dashboard, though current matching focuses on content rather than creator likeness.
Read at TechCrunch