Twitter Investors Back Musk Offer as Whistleblower Testifies

Twitter shareholders this week approved Elon Musk’s $44 billion takeover bid, voting on the same day that whistleblower Peiter Zatko testified at a Senate Judiciary Committee hearing, telling lawmakers that the social media company’s leadership misled regulators about security failures. Senator Chuck Grassley (R-Iowa) expressed skepticism that Twitter CEO Parag Agrawal could keep his job if Zatko’s allegations prove true, noting that the executive “rejected this committee’s invitation by claiming that it would jeopardize Twitter’s ongoing litigation” with Musk. Twitter has categorically denied Zatko’s claims, which include allegations that foreign agents have infiltrated its workforce.

FTC Explores New Rules Surrounding Data Collection and AI

The Electronic Privacy Information Center is calling on the Federal Trade Commission to create rules that would protect the digital privacy of teens. Human Rights Watch is asking the FTC for safeguards to prevent education companies from selling minors’ personal information to data brokers, as well as a ban on data-driven advertising that targets children. Both groups were represented at the FTC’s first public forum on adopting new rules around data collection and the training of AI on personal data. Practices the FTC is examining include limits on how long companies can retain consumer data and mandatory audits of automated decision-making systems.

EU’s AI Act Could Present Dangers for Open-Source Coders

The EU’s draft AI Act is causing quite a stir, particularly as it pertains to regulating general-purpose artificial intelligence, including guidelines for open-source developers that specify procedures for accuracy, risk management, transparency, technical documentation and data governance, as well as cybersecurity. The first law on AI from a major regulator anywhere, the proposed AI Act seeks to promote “trustworthy AI,” but critics argue that, as written, the legislation could hurt open-source efforts to develop AI systems. The EU is seeking industry input as the proposal heads for a vote this fall.

Australia’s Highest Court Rules Google Links Not Defamatory

In a major reversal, Australia’s highest court found Google not liable for defamatory content linked through search results, ruling that the Alphabet subsidiary “was not a publisher” of the objectionable content. Google was sued for defamation over a 2004 article that appeared in its search results, and both the trial court and an appeals court had held Google responsible as a “publisher” because it was instrumental in circulating the contents of the offending article. The lower courts also rejected Google’s reliance on the statutory and common law defenses of innocent dissemination and qualified privilege.

Massachusetts Court Objects to Gig Worker Ballot Measure

A proposed Massachusetts ballot initiative designating gig drivers as independent contractors was nixed by a state court that deemed it an attempt by companies like Uber and Lyft to avoid liability in the event of an accident or crime. The Tuesday ruling effectively halted a $17.8 million campaign in support of a measure the Massachusetts Supreme Judicial Court said violates the state constitution, with hidden language excepting drivers from being “an employee or agent” of a gig company. The move is the latest in a series of skirmishes between gig companies and local governments.

Criminal Liability Will Be Added to the UK’s Online Safety Bill

Big Tech executives may find themselves facing UK prosecution or jail time sooner than expected, as the target date for Online Safety Bill (OSB) enforcement has been moved up to within two months of the bill becoming law, rather than the two years originally proposed. Several new offenses have been added to the bill, including criminal liability for those who destroy evidence, fail to cooperate with Ofcom investigations or impede regulatory inspections. Facebook, Instagram, TikTok, Twitter and YouTube can all expect audits for the sort of harmful content the OSB seeks to address.

UK Lawmakers Are Taking Steps to Toughen Online Safety Bill

British lawmakers are seeking “major changes” to the forthcoming Online Safety Bill, arguing that its crackdown on Big Tech does not go far enough. Expansions under discussion include legal consequences for tech firms and new rules covering online fraud, advertising scams and deepfake (AI-generated) adult content. Comparing the Internet to the “Wild West,” Damian Collins, chairman of the joint committee that issued the report, went so far as to suggest that corporate directors be subject to criminal liability if their companies withhold information or fail to comply with the act.

Senate Wants Social Firms to Pay for Holding Back Research

The U.S. Senate has introduced the bipartisan Platform Accountability and Transparency Act (PATA), which if passed into law would allow independent researchers to sue Big Tech companies that fail to provide requested data. The move follows last week’s Instagram hearing, where leaked internal research pointed to the platform’s negative effects on the mental health of teens. On December 6, an international coalition of more than 300 scientists sent an open letter to Mark Zuckerberg, CEO of Meta Platforms, the company that owns Instagram and Facebook, requesting that the social behemoth voluntarily share its research.

Government Questions Liability Shield Offered by Section 230

The U.S. House of Representatives is signaling intent to proceed with legislation to scale back the Section 230 liability shield for Big Tech. The move follows a frontal assault by Australia’s Parliament on that country’s version of the law, as well as global saber-rattling against protections that prevent social platforms from being held legally accountable for user-posted content that harms others. At a Wednesday hearing on various Section 230 bills, House Energy and Commerce Committee chairman Frank Pallone (D-New Jersey) said that while the protections were vital to the Internet’s growth, they have also resulted in anti-social behavior.

Australia Is Opening Door to Social Media Defamation Liability

The Parliament of Australia is preparing to crack down on social media trolls with legislation that would hold companies legally responsible for defamatory material posted to their sites. A draft of the proposed law would require companies to have formal complaint processes in place for reporting online abuse, and to provide complainants with the identities of alleged bullies once certain criteria are met. The proposed legislation is scheduled to be released this week and is expected to come before the Parliament next year. It is part of the country’s broader effort to overhaul defamation laws.

Australian Court Holds Media Firms Liable for User Comments

The High Court of Australia upheld a lower court ruling that found media companies — including newspapers and TV stations — that post on Facebook are liable for Facebook users’ comments on those posts. It stated that, by creating a public Facebook page, media outlets “facilitated and encouraged comments” from users and are responsible for defamatory content. News Corp Australia, a subsidiary of News Corp, and Nine Entertainment, which owns the Sydney Morning Herald, called for legislators to protect them from liability.

Amazon Quietly Changes Terms of Service to Allow Lawsuits

After being deluged by more than 75,000 individual arbitration demands filed by plaintiffs’ attorneys on behalf of Echo users, Amazon changed its terms of service to allow customers to file lawsuits. It now faces at least three potential class action suits, including one brought May 18 alleging that its Alexa-enabled Echo devices record people without their permission. Arbitration requirements are inserted in many consumer contracts, and the U.S. Supreme Court has repeatedly upheld the right to mandate arbitration.

Proposed Legislation Would Weaken Shields for Social Media

The Justice Department sent Congress draft legislation to weaken Section 230 of the Communications Decency Act, leaving Facebook, YouTube and other social media platforms vulnerable to legal action for content posted by users. The proposed changes would create liability for platforms that allow “known criminal content” to remain online once they are aware of it. President Trump claims that social media companies are biased against conservatives. Even now, the platforms are not protected against some civil suits.

Some States Say Amazon Is Liable for Third-Party Products

When Angela Bolger’s laptop caught fire due to a replacement battery she bought on Amazon, she suffered third-degree burns and filed a lawsuit against the popular e-commerce site. Amazon responded by providing a refund for the battery. Until recently, Amazon had successfully fought off such liability suits. The stakes are high, since almost 60 percent of all physical goods sold on its site now come from third-party sellers. The courts have traditionally sided with Amazon, but recent cases in a few states are changing that trend.

Court Finds Amazon Liable for Defective Third-Party Products

The California Fourth District Court of Appeal ruled that Amazon can be held liable for damages caused by a defective replacement laptop battery purchased from a third-party seller on its marketplace. The buyer, Angela Bolger, reportedly suffered third-degree burns when the battery, from Amazon third-party seller Lenoge Technology, caught fire. Amazon has historically defended itself successfully against such liability lawsuits, so the appeals court decision is a major blow to its e-commerce business. The company currently faces several other liability suits.
