HOW TO GET TO THE ROOT OF THE SOCIAL MEDIA CRISIS

Posted by jj on Feb 24, 2023 in Background, Tech
 Section 230 reform isn’t going to solve our problems.
 
By Leslie Stebbins
 

As a research librarian, my professional life has focused on connecting people to reliable information. Over the last thirty years, I have watched, stunned, as digital information that initially held so much promise for providing people with diverse and trustworthy content has instead spiraled into vast wastelands of clickbait, advertising, misinformation, and toxic content, while reliable information is often buried or locked behind a paywall.

Four years ago, I started looking into the problem of online misinformation and toxic behavior. With support from the Sloan Foundation, I waded through thousands of policy documents and research articles to identify the most promising solutions to our information crisis. When I started on this work, people were excited to hear about it but often threw up their hands in defeat. They said, “You can’t really fix this problem without threatening free speech rights!”

Our attention has remained focused on free speech, but this is not where the answers to our social media crisis lie. We are currently fixated on Section 230 of the U.S. Communications Decency Act of 1996, which designates platform companies as services rather than publishers and gives them legal immunity for most content posted on their platforms: Think phone company, not newspaper.

The problem with revising Section 230 is that if we turn platform companies into publishers and hold them accountable for content they promote, we would start seeing massive amounts of censorship because these companies would err on the side of caution and remove potentially controversial posts. But we need to understand that large social media platforms are not like phone companies or newspapers. They are a different animal altogether.

This term the Supreme Court will decide two cases—Gonzalez v. Google and Twitter v. Taamneh—that seek to broaden the scope of liability under Section 230 for the content platform companies promote. If successful, these cases would jeopardize our right to free speech. The Court will also likely hear two other cases, from Texas and Florida, that go after Section 230 from the other side: questioning whether platforms should be allowed to censor political content.

But the focus on Section 230—the issue of free speech—is a red herring.

In my research, I organized the most promising solutions into six areas where we need to move forward. I was also able to pinpoint two “big picture” takeaways. First, possibly the biggest lie being told about misinformation and toxic online content is that the crisis is uncontainable. It is not. Second, our attention has become hyper-focused on fixing our current social media platforms as they are currently designed by using band-aid content moderation strategies while trying to balance free speech rights. But this is the wrong approach. It is the underlying intentional design of these platforms that is causing much of our information crisis. We need to change how our social media platforms are designed to build better, healthier digital public spaces. We need to go after the problem at its roots.

Legal scholars view Facebook, Google, Twitter, TikTok, and a few other companies as controlling the infrastructure of the digital public square. This infrastructure is vital to the flow of information and ideas in our society. Like clean water, access to reliable information should be a human right. New technologies have disrupted the structures, imperfect as they were, that ensured access to a free press and the trustworthy information essential to our democracy and healthy public discourse.

Our current online spaces have contributed to declining trust in institutions and the media, and our access to reliable information is decreasing. Even Google has strayed and now devotes roughly half of its first-page search results to companies that Alphabet, Google’s parent company, owns. Teenagers are turning to TikTok to get information. Hashtags such as #mentalhealth and #anxiety have tallied up tens of billions of views, but the primarily younger audience seeking help is instead exposed to misinformation, bullying, fraud, and a system expertly designed to keep them online.

Changing the design of platforms can move forward on two interconnected fronts. First, regulations need to target the root causes of our information disorder, specifically the design features that are causing harm. The current business model rests on extracting and using personal data for microtargeting individuals and an advertising system that incentivizes and promotes misinformation and vitriol to keep people engaged. This makes billions of dollars for the tech companies, giving them little incentive to change. And second, by requiring these design changes and weakening the financial incentives, we can chip away at the vast concentration of power a few private companies have over our public discourse.

New structural requirements can be prophylactic. We can better serve the public interest by changing the current business model and insisting on using algorithms and tools that are transparent in their designs and open to oversight. The design can shift from promoting content that favors profit-maximizing personalized engagement to designs that promote reliable content and enhance public safety and privacy. We need to strategically design algorithms to counter systemic bias, racism, and inequity that are baked into our data and machine learning systems. In my research, I found that many exciting new tools are already at our disposal that can improve our digital spaces, but the current platform owners have not chosen to use them. They are not looking for solutions that will interfere with their bottom line.
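To make the design shift concrete, here is a minimal, hypothetical sketch of the two ranking philosophies described above. The post fields, weights, and scoring formulas are illustrative assumptions, not any platform's actual algorithm: the point is only that the same feed ranks very differently once a reliability signal is blended in alongside predicted engagement.

```python
# Toy sketch of two feed-ranking designs. All names, scores, and
# weights here are hypothetical illustrations, not a real platform's
# ranking system.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_engagement: float  # expected clicks/shares, 0-1
    source_reliability: float    # independent trust score, 0-1

def rank_by_engagement(posts):
    """Engagement-maximizing design: reliability is ignored entirely."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def rank_reliability_weighted(posts, reliability_weight=0.7):
    """Alternative design: blend engagement with a reliability signal."""
    def score(p):
        return ((1 - reliability_weight) * p.predicted_engagement
                + reliability_weight * p.source_reliability)
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("Outrage clickbait", predicted_engagement=0.95, source_reliability=0.1),
    Post("Careful local reporting", predicted_engagement=0.4, source_reliability=0.9),
]

print(rank_by_engagement(feed)[0].title)       # → Outrage clickbait
print(rank_reliability_weighted(feed)[0].title)  # → Careful local reporting
```

The only change between the two functions is the objective being optimized, which is the sense in which these are design choices, and potentially regulable ones, rather than content-moderation decisions.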

By addressing design issues, we can sidestep infringing on rights to free expression and changes to Section 230 while we create healthier digital spaces. Not a simple task, to be sure. Content moderation will still need to be a part of the process to remove illegal content, such as child sexual abuse material, and incitement to imminent lawless action. Platform companies and non-profits can be encouraged to experiment and use flexible design practices, but with transparency and oversight. We also will need to create new platforms that can better serve the public interest if current platforms are unwilling to shift their practices. Our democracy is at stake.

Author Bio:

Leslie Stebbins is an independent research librarian and the Director of research4Ed. She is the author of Building Back Truth in an Age of Misinformation (Rowman & Littlefield, 2023).
This article was produced by Economy for All, a project of the Independent Media Institute.

Original post blogged on Women's Voices Media.

Tags: By Women, For Women, Law, North America/United States of America, Tech

