Let the community sort it out: A return to the early days of the Internet could solve the social media legitimacy crisis

In the 2018 documentary The Cleaners, a young man in Manila, Philippines, describes his work as a content moderator: you see images on the screen, go through them, and delete the ones that don’t meet the guidelines, with a daily quota of 25,000 photos. As he talks, his mouse clicks delete the offending images and let others stay online.

The man in Manila is one of thousands of content moderators hired as contractors by social media platforms, 10,000 at Google alone. Content moderation on this industrial scale is part of the everyday experience of social media users: sometimes a post you make is removed, or a post you find offensive is left online.

Similarly, platforms add and remove features without input from the people most affected by those decisions. Whether or not such decisions upset you, few people stop to think about the history behind a system in which people in conference rooms in Silicon Valley and Manila dictate your experiences online.

But why should a few companies, or a few billionaire owners, have the power to decide how the online spaces used by billions of people are run? This unaccountable model of governance has led stakeholders of all stripes to criticize platform decisions as arbitrary, corrupt, or irresponsible. In the early days of the social internet, before the web, decisions about the spaces in which people gathered online were often made by community members. Our review of the early history of online governance suggests that social media platforms could at least partially return to community governance models to address their legitimacy crisis.

The documentary The Cleaners reveals some of the hidden costs of Big Tech’s customer service approach to content moderation.

A history of online governance

In many early online spaces, governance was handled by community members rather than professionals. One of the first online spaces, LambdaMOO, invited users to build their own governance system, which transferred power from the people who technically controlled the space, administrators known as wizards, to community members. This was done through a formal petition process and a set of appointed mediators who resolved conflicts between users.

Other spaces had more informal processes for incorporating community input. In bulletin board systems, for example, users voted with their wallets, withdrawing crucial financial support when they disagreed with the decisions of system administrators. Other spaces, such as the text-based newsgroups of Usenet, gave users considerable power to shape their own experiences: newsgroups left obvious spam in place but gave users the tools to block it if they chose. Usenet administrators argued that it was fairer to let each user make decisions reflecting their individual preferences than to adopt a one-size-fits-all approach.
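As a rough illustration, here is a minimal sketch in Python of the kind of user-side filtering Usenet readers relied on, often called a kill file. The post format and rules below are hypothetical stand-ins, not Usenet’s actual data structures; the point is that each reader keeps a personal blocklist, so what gets hidden reflects individual preferences rather than a single site-wide policy.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    subject: str
    body: str

# One user's personal "kill file": block by author or by subject keyword.
# The rule format is illustrative only.
kill_file = {
    "authors": {"spammer@example.com"},
    "subject_keywords": {"make money fast"},
}

def is_visible(post: Post, kill_file: dict) -> bool:
    """Return False if the post matches any of this user's kill-file rules."""
    if post.author in kill_file["authors"]:
        return False
    subject = post.subject.lower()
    return not any(kw in subject for kw in kill_file["subject_keywords"])

posts = [
    Post("alice@example.com", "Meeting notes", "See attached."),
    Post("spammer@example.com", "MAKE MONEY FAST", "Click here!"),
]

# Each reader sees the feed filtered by their own rules; the posts
# themselves remain on the server for everyone else.
visible = [p for p in posts if is_visible(p, kill_file)]
for p in visible:
    print(p.author, "-", p.subject)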

The graphical web helped grow the internet from a few million users to hundreds of millions within a decade, from 1995 to 2005. During this rapid expansion, community governance was replaced by customer service-inspired governance models focused on scale and cost.

This shift from community management to customer service made sense for the fast-growing companies of the late-1990s internet boom. Having promised investors that they could grow and change quickly, these companies sought approaches to the complex task of managing online spaces that centralized power and prioritized efficiency.

While this customer service model allowed early user-generated content sites like Craigslist and GeoCities to grow rapidly, it set the stage for the crisis of legitimacy that social media platforms face today. Contemporary battles over social media are rooted in the sense that the people and processes governing online spaces are not accountable to the communities that gather in them.

Paths to community control

Community governance on today’s platforms can take many forms, some of which are already being tested.

Advisory boards such as Meta’s Oversight Board are one way to involve external stakeholders in platform governance, providing independent, albeit limited, review of platform decisions. X (formerly Twitter) has taken a more democratic approach with its Community Notes initiative, which lets users add context to posts on the platform through crowdsourced notes and ratings.

Some may question whether community governance can work on platforms that serve billions of users. In response, we point to Wikipedia. Governed entirely by its community, it has produced an open encyclopedia that has become one of the most important sources of information in many languages. Wikipedia is surprisingly resistant to vandalism and abuse, with robust procedures that keep a resource used by billions accessible, accurate, and reasonably civil.

On a smaller scale, full self-governance that mirrors early online spaces can be key for communities serving specific subsets of users. Archive of Our Own, for example, was created after fan fiction writers, who write original stories using characters and universes from published books, TV shows, and movies, found existing platforms unwelcoming. Many of these writers had been banned from social media platforms because of overzealous copyright enforcement or concerns about sexual content.

A group of writers, fed up with platforms that didn’t understand their work or culture, designed and built a platform specifically to meet the needs of their community. AO3, as it’s colloquially known, serves millions of people a month, includes tools tailored to the needs of fan fiction writers, and is run by the same people it serves.

X, formerly Twitter, lets people use Community Notes to add relevant context to posts that contain inaccuracies.
Screen recording by The Conversation US, CC BY-ND

Hybrid models, such as Reddit’s, combine centralized control and community autonomy. Reddit hosts a collection of interest-based communities called subreddits, each with its own rules, norms, and moderator team. Underlying the governance of individual subreddits is a set of site-wide rules, processes, and features that apply to everyone. Not every subreddit is a perfect example of a healthy online community, but most are.
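This layered structure can be illustrated with a small sketch, again in Python and with hypothetical rules and community names rather than Reddit’s actual policies: every post must pass the platform-wide rules, and each community’s moderators can add their own checks on top.

def no_personal_info(text: str) -> bool:
    """Platform-wide rule: reject posts that share someone's address (toy check)."""
    return "home address:" not in text.lower()

def no_spam_links(text: str) -> bool:
    """Platform-wide rule: reject obvious link spam (toy check)."""
    return text.lower().count("http://") <= 3

PLATFORM_RULES = [no_personal_info, no_spam_links]

# Each community layers its own rules, set by its own moderators.
# These communities and rules are hypothetical examples.
COMMUNITY_RULES = {
    "r/askscience": [lambda text: "source:" in text.lower()],  # answers must cite a source
    "r/photos": [],                                             # no extra rules
}

def allowed(community: str, text: str) -> bool:
    """A post must satisfy both the platform's rules and its community's rules."""
    rules = PLATFORM_RULES + COMMUNITY_RULES.get(community, [])
    return all(rule(text) for rule in rules)

print(allowed("r/askscience", "Plants use chlorophyll. Source: my textbook."))  # True
print(allowed("r/askscience", "Plants use chlorophyll."))                       # False: no source cited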

There are also technical approaches to community governance. One would let users choose the algorithms that filter and rank their social media feeds. Imagine that, instead of being limited to Facebook’s algorithm, you could pick from a selection of algorithms provided by third parties, such as The New York Times or Fox News.
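Here is a minimal sketch of how such user-selectable ranking might look in Python. The provider names, scoring logic, and registry below are assumptions for illustration, not any platform’s actual interface; the idea is simply that the platform exposes a common interface and each user picks which ranker builds their feed.

from typing import Callable, Dict, List

Post = Dict[str, object]            # e.g. {"text": ..., "likes": ..., "age_hours": ...}
Ranker = Callable[[List[Post]], List[Post]]

def engagement_ranker(posts: List[Post]) -> List[Post]:
    """Rank by raw popularity, the default many platforms use today."""
    return sorted(posts, key=lambda p: p["likes"], reverse=True)

def recency_ranker(posts: List[Post]) -> List[Post]:
    """Rank newest first, ignoring engagement signals."""
    return sorted(posts, key=lambda p: p["age_hours"])

# A registry of rankers; in the scenario described above, news organizations
# or other third parties could contribute their own entries.
RANKERS: Dict[str, Ranker] = {
    "default": engagement_ranker,
    "chronological": recency_ranker,
}

def build_feed(posts: List[Post], user_choice: str) -> List[Post]:
    """Build a feed using whichever ranker the user has selected."""
    ranker = RANKERS.get(user_choice, engagement_ranker)
    return ranker(posts)

posts = [
    {"text": "Breaking news", "likes": 12, "age_hours": 1},
    {"text": "Viral meme", "likes": 950, "age_hours": 30},
]
print([p["text"] for p in build_feed(posts, "chronological")])

Under this kind of arrangement, sometimes called middleware, switching ranking providers is as simple as changing the user’s selection; the platform’s own infrastructure does not need to change.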

More decentralized platforms like Mastodon cede control to a network of servers structured much like email. This makes it easier to choose an experience that matches your preferences: you can pick which Mastodon server to join and switch easily, just as you can choose Gmail or Outlook for email and later change your mind, all while keeping access to the broader network.

Additionally, advances in generative artificial intelligence, which has shown early promise in producing computer code, could make it easier for people, even those without technical backgrounds, to build custom online spaces when they find existing ones unsuitable. This would reduce the pressure on online spaces to be all things to all people and support a sense of agency in the digital public sphere.

There are also more indirect ways to support community governance. For example, increasing transparency by providing access to data about the effects of platforms’ decisions can help researchers, policymakers, and the public hold platforms accountable. And encouraging strong ethical norms among engineers and product designers can help ensure that online spaces respect the communities they serve.

Going forward by going back

Between now and the end of 2024, national elections are scheduled in many countries, including Argentina, Australia, India, Indonesia, Mexico, South Africa, Taiwan, the United Kingdom, and the United States.

We believe it’s time to think not only about how online spaces can be run effectively and serve companies’ bottom lines, but also about how they can be governed fairly and legitimately. Giving communities more control over the spaces in which they participate is a proven way to do this.

