Congress Grills Big Tech Executives on Accountability Issues

Prior to a House hearing on social media’s role in extremism and disinformation, Facebook chief executive Mark Zuckerberg submitted written testimony on Section 230, suggesting that “platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it.” Section 230 of the 1996 Communications Decency Act holds that platforms are not liable for content posted by their users. In a bipartisan effort, lawmakers are pushing for change. “Our nation is drowning in disinformation driven by social media,” said Rep. Mike Doyle (D-Pennsylvania). “We will legislate to stop this.”

Facebook Struggles to Contain Health Misinformation, QAnon

According to global civic movement Avaaz, over the past year Facebook enabled 3.8 billion views of health-related misinformation, almost four times the views of authoritative sources such as the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC). This occurred despite Facebook’s partnership with those organizations to surface reliable information to users. In a separate effort to curb misinformation, Facebook removed 790 QAnon groups and restricted another 1,950 groups, 440 pages and more than 10,000 Instagram accounts.