Readout of the White House Listening Session on Tech Platform Accountability

Albert

Technology platforms help us stay connected, create vibrant marketplaces for ideas, and open up new opportunities to bring products and services to market, but they can also divide us and cause serious real-world harm. The rise of technology platforms, large and small, has posed new and difficult challenges, from tragic acts of violence linked to toxic online cultures, to deteriorating mental health and well-being, to threats to the basic rights of Americans and communities around the world.

Today, the White House held a listening session with experts and practitioners on the harms that tech platforms cause and the need for greater accountability. In the session, experts and practitioners identified concerns in six key areas: competition; privacy; youth mental health; misinformation and disinformation; illegal and abusive conduct, including sexual exploitation; and algorithmic discrimination and lack of transparency.

One participant explained the impact of large platforms’ anti-competitive conduct on small businesses and entrepreneurs, including the limits that large platforms place on how their products can work and on potential innovation. Another participant emphasized that large platforms can use their market power to engage in rent-seeking, which can influence consumer prices.

Several participants expressed concern about technology platforms’ collection of vast amounts of personal data. Some experts tied this to the problem of misinformation and disinformation on platforms, noting that social media platforms use personal data to deliver tailored content (often sensational, extremist, and polarizing content) designed to maximize “user engagement” for profit. Others raised alarms about the risks to reproductive rights and personal safety posed by companies that collect sensitive personal information, from a user’s physical location to medical history and choices. Another participant explained why technical self-help measures for protecting privacy are not enough. Participants also highlighted the public safety risks of platform-enabled spread of information that promotes radicalization, mobilization, and incitement to violence.

Experts explained that technology now plays a central role in access to key opportunities such as job openings, home sales, and credit offerings, but that companies’ algorithms too often display these opportunities unequally and target some communities with predatory products in a discriminatory way. Experts also explained that a lack of transparency means these algorithms cannot be scrutinized by anyone outside the platforms themselves, creating a barrier to meaningful accountability.

One expert described the risks that social media use poses to the health and well-being of young people, stating that while technology can offer the benefits of social connection, heavy social media use has been associated with significant clinical harms to the mental health of children and teens. Participants raised concerns about the amount of data collected from apps used by children and the need for better guardrails to protect children’s privacy and to prevent addictive use and exposure to harmful content. Participants also highlighted the scale of illegal and abusive conduct hosted or disseminated by platforms, which are shielded from liability and lack sufficient incentives to act responsibly against harms such as child sexual exploitation, cyberstalking, and the non-consensual distribution of intimate images of adults.

White House officials closed the meeting by thanking the experts and practitioners for sharing their concerns. They explained that the administration will continue to work to address the harms caused by a lack of sufficient accountability for technology platforms, and noted that President Biden has long called for fundamental legislative reform to address these issues.

Attendees at today’s meeting were:

  • Bruce Reed, Assistant to the President and Deputy Chief of Staff
  • Susan Rice, Assistant to the President and Domestic Policy Advisor
  • Brian Deese, Assistant to the President and Director of the National Economic Council
  • Louisa Terrell, Assistant to the President and Director of Legislative Affairs
  • Jennifer Klein, Assistant to the President and Director of the Gender Policy Council
  • Alondra Nelson, Assistant to the President and Director of Science and Technology Policy
  • Bharat Ramamurti, Assistant to the President and Deputy Director of the National Economic Council
  • Anne Neuberger, Deputy National Security Advisor for Cyber and Emerging Technologies
  • Tarun Chabra, Special Assistant to the President and Senior Director of Technology and National Security
  • Dr. Nusheen Ameenuddin, Chair of the American Academy of Pediatrics Council on Communications and Media
  • Danielle Citron, Vice President, Cyber Civil Rights Initiative; Jefferson Scholars Foundation Schenck Distinguished Professor in Law and Caddell and Chapman Professor of Law, University of Virginia School of Law
  • Alexandra Reeve Givens, President and CEO, Center for Democracy and Technology
  • Damon Hewitt, President and Executive Director, Lawyers’ Committee for Civil Rights Under Law
  • Mitchell Baker, CEO of the Mozilla Corporation and Chairwoman of the Mozilla Foundation
  • Karl Racine, Attorney General for the District of Columbia
  • Patrick Spence, CEO of Sonos

Principles for Enhancing Competition and Tech Platform Accountability

At the event, the Biden-Harris Administration announced the following core principles for reform:

  1. Promote competition in the technology sector. America’s information technology sector has long been an engine of innovation and growth, and the United States has led the world in the development of the Internet economy. Today, however, a handful of dominant Internet platforms use their power to exclude market entrants, extract rents, and gather sensitive personal information that they can exploit to their own advantage. We need clear rules of the road so that small businesses and entrepreneurs can compete on a level playing field, which will foster innovation for American consumers and maintain America’s leadership in global technology. We are encouraged to see bipartisan interest in Congress in passing legislation to address the power of tech platforms through antitrust legislation.
  2. Provide strong federal protections for Americans’ privacy. There should be clear limits on the ability to collect, use, transfer, and maintain our personal data, including limits on targeted advertising. These limits should not burden Americans with reading the fine print, but should instead require platforms to minimize the amount of information they collect. We especially need strong protections for sensitive data, such as a user’s location and health information, including information related to reproductive health. We are encouraged to see bipartisan interest in Congress in passing legislation to protect privacy.
  3. Protect children by adopting stronger privacy and online protections for them, including by prioritizing safety in the design standards and practices of online platforms, products, and services. Children, adolescents, and teens are especially vulnerable to harm. Platforms and other interactive digital service providers should be required to prioritize the safety and well-being of young people over profit and revenue in their product design, including by limiting excessive data collection and targeted advertising to young people.
  4. Remove special legal protections for large technology platforms. Technology platforms currently enjoy special legal protections under Section 230 of the Communications Decency Act, including broad protection from liability even when they host or disseminate illegal or violent conduct or materials. The President has long called for fundamental reform of Section 230.
  5. Increase transparency about platform algorithms and content moderation decisions. Despite their central role in American life, technology platforms are notoriously opaque. Their decisions about what content to show a given user, and when and how to remove content from their sites, have enormous consequences for Americans’ lives and for American society. Yet platforms have not provided enough transparency for the public and researchers to understand how and why those decisions are made, their potential effects on users, and the very real dangers these decisions may pose.
  6. Stop discriminatory algorithmic decision-making. We need strong protections to ensure that algorithms do not discriminate against protected groups, such as by failing to share key opportunities equally, by discriminatorily exposing vulnerable communities to risky products, or through persistent surveillance.

###
