Facebook Whistleblower’s Testimony Builds Momentum for Tougher Tech Laws

Lawmakers called for strengthening privacy and competition laws, creating special online protections for children, increasing transparency into social-media algorithms and toughening accountability for the platforms.

“I think the time has come for action and you are the catalyst for that action,” said Sen. Amy Klobuchar (D., Minn.).

Ms. Haugen said lawmakers need to go beyond some of the legislative remedies under consideration.

“The severity of this crisis demands that we break out of previous regulatory frames,” she said. “Tweaks to outdated privacy protections…will not be sufficient.”

A critical starting point, she added, would be “full access to data for research not directed by Facebook. On this foundation, we can build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more.”

Sen. Richard Blumenthal (D., Conn.), the chairman of the Senate consumer protection subcommittee conducting Tuesday’s hearing, said that as a result of Ms. Haugen’s disclosures, Facebook and other large tech companies are “facing a big tobacco moment, a moment of reckoning.”


“Facebook knows its products can be addicting and toxic to children,” he said. Mr. Blumenthal called on Facebook founder Mark Zuckerberg to appear before Congress to testify, terming the company “morally bankrupt.”

Facebook didn’t immediately respond to a request for comment on Mr. Blumenthal’s comments. But company representatives on social media questioned the breadth of Ms. Haugen’s knowledge.

As lawmakers asked her about documents showing Instagram’s impact on children, Facebook spokesman Andy Stone said on Twitter that Ms. Haugen “did not work on child safety or Instagram or research these issues and has no direct knowledge of the topic from her work at Facebook.”

Documents Ms. Haugen gathered while employed at Facebook formed the foundation of The Wall Street Journal’s Facebook Files series.

Facebook has previously disputed the characterization of the documents in the Journal and by Mr. Blumenthal and other members of his committee, who questioned Facebook executive Antigone Davis about the documents last week.

“It is not accurate that leaked internal research demonstrates Instagram is ‘toxic’ for teen girls,” Facebook said in a statement. “The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced.”

The Journal has defended the series, saying Facebook hasn’t identified any factual errors.

Ms. Haugen said that when given the choice, Facebook leaders often chose a path that preserved profits over the safety of its users. She said this was part of a corporate culture that Mr. Zuckerberg built at Facebook. Mr. Zuckerberg has majority voting control and serves as CEO and chairman of Facebook’s board, which makes him unusually powerful within Silicon Valley, she said.

“There is no one currently holding Mark accountable but himself,” she said.

Sen. Marsha Blackburn and Sen. Richard Blumenthal listening to testimony from Frances Haugen Tuesday. (Drew Angerer/Getty Images)

She added that Facebook under Mr. Zuckerberg has been guided by numbers rather than by people, making it more likely to be toxic and cause harms. “Mark has built an organization that is very metrics-driven,” she said. “The metrics make the decision. Unfortunately that itself is a decision.”

Facebook’s teams that drive the company’s growth often work at cross-purposes with the teams responsible for keeping the platform safe, Ms. Haugen told the panel.

Ms. Haugen made the case for policy changes to address the concerns she raised. In products such as cars and cigarettes, she said, independent researchers can evaluate health effects, but “the public cannot do the same with Facebook.”

“This inability to see into Facebook’s actual systems and confirm that they work as communicated is like the Department of Transportation regulating cars by only watching them drive down the highway,” she said, arguing for an independent government agency that would employ experts to audit the impact of social media.

She said that if Congress moves to change Section 230, a law that protects Facebook and other companies from liability for user-generated content, it should distinguish between that kind of content and choices that companies make about what type of content to promote.

“Facebook should not get a pass on choices it makes to prioritize virality and growth and reactiveness over public safety,” she said.

Ms. Haugen, who resigned from Facebook in April, was a product manager hired to help protect against election interference on Facebook. She said she acted because she was frustrated by what she viewed as Facebook’s lack of openness about the platforms’ potential for harm and its unwillingness to address its flaws.

Ms. Haugen has sought federal whistleblower protection at the Securities and Exchange Commission. She is also interested in cooperating with state attorneys general and European regulators.

Ms. Haugen gathered internal documents showing how Facebook’s Instagram app led to depression and anxiety in many teenage girls.


‘I would simply say, let’s get to work,’ said Sen. John Thune (R., S.D.), who has sponsored several measures on algorithm transparency. (Stefani Reynolds/Bloomberg News)

The Instagram disclosures have built momentum to update the Children’s Online Privacy Protection Act, a 1998 law governing websites that gather data on children. The law, known as Coppa, has been widely criticized as inadequate in the age of social media.

“Updating Coppa will be essential,” Sen. Maria Cantwell (D., Wash.), who chairs the powerful Commerce Committee, said at last week’s hearing.

Critics say the law as written has measures that create enforcement challenges for the Federal Trade Commission. One is its requirement that a platform operator have “actual knowledge” that it is collecting personal information of children before the law’s toughest restrictions kick in. The other is its age cutoff—only children under 13 get its strongest protections.

Republicans and Democrats alike have supported updating the law.

Several lawmakers also expressed renewed interest in transparency measures that would give the public and policy makers more understanding of how algorithms work in suggesting content to users. “I would simply say, let’s get to work,” said Sen. John Thune (R., S.D.), who has sponsored several measures on algorithm transparency. “We’ve got some things we can do here.”

Other suggestions from lawmakers on Tuesday included crafting comprehensive privacy protections for U.S. consumers and new competition rules for the internet age. Some senators also pressed to cut back the immunity from liability that Congress long ago conferred on online platforms for the actions of their users.

But several lawmakers acknowledged that some efforts to regulate big tech already had drawn intense opposition from some major companies. “There are lobbyists around every single corner of this building that have been hired by the tech industry,” Ms. Klobuchar said. “Facebook and the other big tech companies are throwing a bunch of money around this town and people are listening to them.”

Facebook has publicly called for government action to better regulate the online environment.

In addition to the Instagram documents, Ms. Haugen released other internal documents showing, among other things, how the company’s moderation rules favor elites, how its algorithms foster discord, and how drug cartels and human traffickers use its services openly.

In a statement this week, Mr. Blumenthal promised more hearings “documenting why Facebook and other tech companies must be held accountable—and how we plan to do that…We must consider stronger oversight, effective protections for children and tools for parents, among the needed reforms.”

In its statement, Facebook said that its “teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place.”

“We continue to make significant improvements to tackle the spread of misinformation and harmful content,” the company said. “To suggest we encourage bad content and do nothing is just not true.”

Write to John D. McKinnon at john.mckinnon@wsj.com and Ryan Tracy at ryan.tracy@wsj.com

Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved.