Is Anything Actually Going To Happen To Facebook?

The Daily Caller

Since revelations about Facebook’s content recommendation algorithms surfaced in early September, lawmakers have vowed to hold social media platforms accountable, floating potential legislative solutions — yet it’s unclear what, if anything, will stick.

Top House Democrats on Thursday unveiled a major piece of legislation taking aim at the way Facebook amplifies and recommends content, proposing a law that would strip Section 230 liability protection from algorithm-boosted content that leads to physical or “severe emotional” harm.

The proposal was issued in response to testimony from Facebook whistleblower Frances Haugen in early October revealing how Facebook’s algorithms amplified and recommended incendiary content, including “misinformation.” House Energy and Commerce Committee Chair Rep. Frank Pallone said the bill would hold tech companies “accountable” for “designing personalized algorithms that promote extremism, disinformation, and harmful content.”

Republican Sen. Josh Hawley also took aim at Section 230, introducing a bill in late September in response to revelations that Facebook’s subsidiary Instagram was promoting content harmful to the mental health of teen users. Hawley’s bill would remove liability protections from tech companies found to have caused users under the age of 16 “bodily injury or harm to mental health,” and create a private right of action for parents to sue on behalf of their children.


Yet legal experts say that laws looking to curb Facebook’s harms by removing liability protections would do little to affect how Facebook boosts content, as the First Amendment, rather than Section 230, protects the tech giant from civil lawsuits.

“Excluding from Section 230’s protections claims based on harms caused by algorithmic recommendations will not magically lead to the liability that the bill’s authors might imagine,” Ari Cohn, free speech counsel at TechFreedom, said in a statement shared with the Daily Caller News Foundation.

“The First Amendment presents a near-insurmountable barrier to claims that expression caused ‘severe emotional harm,’” Cohn said. “Plaintiffs claiming physical injury as a result of algorithms’ amplification will likely lose, overwhelmingly, on the merits of their cases under basic principles of tort law.”

Others say the Democrats’ bill could have unintended consequences, severely limiting free speech by incentivizing tech companies to remove all controversial content while leaving Facebook’s core business model untouched.

“Exempting personalized and algorithmically amplified content from Section 230 protections wouldn’t prevent platforms from using algorithms to pick and choose what users see, it would just incentivize those platforms to show users more ‘sanitized,’ corporate content that has been vetted by lawyers as ‘non-controversial,’” Evan Greer, director of Fight for the Future, said in a statement shared with the DCNF.

“Instead of undermining global human rights and freedom of expression by tinkering with Section 230, lawmakers should take aim at the data harvesting and surveillance practices that provide the fuel for the harmful algorithms employed by companies like Facebook and YouTube,” Greer added.

Despite these obstacles, Section 230 remains a popular target for lawmakers seeking to rein in Big Tech.

Democratic Sen. Amy Klobuchar introduced legislation in July that would amend Section 230 to hold social media platforms liable for algorithmically boosting “health misinformation,” proposing the bill following comments from President Joe Biden that Facebook’s inadequate response to COVID-19 vaccine misinformation was “killing people.”

Republican Sen. John Thune suggested during Haugen’s testimony that his PACT Act, sponsored with Democratic Sen. Brian Schatz, would hold Facebook accountable by removing liability protections for tech companies that fail to remove illegal content and requiring platforms to publish information on their content moderation practices.

Jeff Kosseff, professor of cybersecurity law at the United States Naval Academy, told the DCNF that legislation targeting Section 230 is popular due to how liability protections are perceived as beneficial to major tech companies, but that such bills fail to address core issues with the largest social media platforms.

“Section 230 is seen as a benefit to Big Tech,” Kosseff said. “While it is true that their business models never would have developed in the same way without Section 230, I’d argue that Section 230 is less important now to the largest platforms than it is to the small and mid-sized platforms.”

Facebook itself has lobbied for reform of Section 230, with Chief Executive Mark Zuckerberg proposing in testimony before Congress that lawmakers should remove liability protections for unlawful content if tech companies lack adequate systems to detect and remove offending posts.

“It’s easier for a trillion-dollar company to face the prospect of defending defamation lawsuits on the merits than it is for a local news website that has three employees,” Kosseff said. “Many of the concerns about Big Tech more directly relate to competition and privacy, so it makes more sense to address those via competition and privacy law.”

Lawmakers have made progress in recent months on legislation aimed at addressing competition issues, with the House Judiciary Committee advancing six antitrust bills intended to curb tech companies’ perceived anti-competitive business practices.

Sens. Chuck Grassley and Klobuchar announced Thursday they would be introducing companion legislation to one of the House bills designed to prevent large tech companies from prioritizing their own products or services on their platforms. Klobuchar and Grassley, along with Sens. Tom Cotton, Hawley and others, are currently developing companion legislation to the other antitrust bills.

Kosseff also said that laws requiring tech companies to share more information on how their algorithms work would be a valuable first step, pointing to a proposal by Stanford Law School professor Nate Persily requiring platforms to share data with third-party researchers.

“The big problem is that we have a very unclear picture of the full range of uses of personalized recommendations,” Kosseff told the DCNF. “The large platforms have not been transparent, so we rely on disclosures from brave whistleblowers to fill the gaps.”

Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact licensing@dailycallernewsfoundation.org.