New York: Twitter’s 13th biannual Transparency Report represents the company’s ongoing commitment to increasing the availability of critical information and data on how the microblogging platform handles legal requests and other content issues.
While originally focused on requests submitted to Twitter by government actors, such as court orders for information or content removal, the company has over the years been working to expand the report to include more detail about the actions it takes when enforcing the Twitter Rules.
In this latest report, Twitter has continued this evolution and is now including details about the enforcement of a number of key content policies. It has also added a new section covering platform manipulation, which you can read more about below and in the report itself.
Internet freedom and online expression
Internet freedom and online expression remain under significant pressure and constraint, a trend the company has observed across recent reports. The latest report shows that Twitter received approximately 80% more global legal demands, affecting more than twice as many accounts, compared to the previous reporting period.
Twitter Rules enforcement
The new Twitter Rules enforcement section within the Transparency Report is a significant milestone on the company’s transparency journey. It provides data and insights into the following areas of its enforcement approach: abuse, hateful conduct, private information, child sexual exploitation, sensitive media, and violent threats.
This also marks the first Twitter Transparency Report in which the company is publishing metrics pertaining to its actions to fight spam and other malicious forms of automation. This builds on its recent work to disclose a full database of previously removed content and accounts that had potential links to state-backed information operations.
Efforts to eradicate child sexual exploitation
Twitter does not tolerate any material that features or promotes child sexual exploitation, whether in Direct Messages or elsewhere on the service. This includes media, text, illustrations, and computer-generated images. When Twitter removes such content, the company immediately reports it to the National Center for Missing and Exploited Children (NCMEC). NCMEC makes reports available to the appropriate law enforcement agencies around the world to facilitate investigations and prosecutions.
Removing terrorist content
Twitter continues its efforts to eradicate content from its platform that violates the Twitter Rules prohibiting the promotion of terrorism. The company suspended a total of 205,156 accounts under this policy in the period from January 1, 2018, through June 30, 2018. Of those suspensions, 91% consisted of accounts that were proactively flagged by internal, proprietary tools.