Censorship in Social Media Leaves Users in Frustration
December 15, 2016
EFF and Visualizing Impact Analyze Reports of Content Moderation Gone Awry

San Francisco - User reports of censorship of social media posts show deep frustration with companies' content moderation policies, according to an analysis by Onlinecensorship.org, a project of the Electronic Frontier Foundation (EFF) and Visualizing Impact.

In Censorship in Context: Insights from Crowdsourced Data on Social Media Censorship, researchers analyzed reports of content takedowns received from users of Facebook, Google+, Instagram, Twitter, and YouTube from April to November of 2016. At a time when many are asking for more content moderation - such as calls for Facebook to crack down on fake news - election-related censorship complaints focused on users' desire to speak their minds and share information about a tight election without worrying that their posts would disappear.

"Social media is where we receive news, debate, and organize. These companies have enormous impact on the public sphere, yet they are still private entities with the ability to curate the information we see and the information we don't see at their sole discretion," said Jillian C. York, EFF Director for International Freedom of Expression and co-founder of Onlinecensorship.org. "The user base is what powers these social media tools, yet users feel they have no control over or understanding of the system."

Censorship in Context recommends best practices for social media content moderation, including transparency about how company policies are enforced and what remedies are available. The researchers also urge strengthening systems of redress when content is removed in error, and doing a better job of educating users about what is and is not acceptable on a given platform.

"Many people depend on Facebook to talk to friends, family, clients, and fans, and to debate the issues of the day," said Project Strategist Sarah Myers West. "While these companies have the right to set their own rules, the least they can do is tell everyone how they're enforced."

Onlinecensorship.org was launched in November 2015 to spot trends in content removals and to learn how these takedowns affect different communities. The site also includes a guide to appealing a content takedown and hosts a collection of news reports on content moderation practices.

For the full whitepaper: https://onlinecensorship.org/news-and-analysis/onlinecensorship-org-launches-second-report-censorship-in-context-pdf

For more information contact:
Jillian C. York
Director for International Freedom of Expression
Electronic Frontier Foundation
Phone: -
Email: jillian@eff.org
Website: www.eff.org