It is convenient to think that by killing the head of the snake with fire and fury, the garden's pests will finally leave it alone. Recent history in the Middle East has shown that scenario to be not only false but counterproductive: the gophers reemerge more emboldened, and the garden fills with holes.
The family dinner table, the coffee shop, the tabloid, and the scholarly publication are all mashed together into small image-and-text headlines scrolled past by billions around the world. The convergence of opinions, news, and entertainment on social media forms the bedrock of modern discourse. The importance of regulating these platforms has never been more evident. Accusations of manipulation by billionaire founders, political factions, and corporate interests highlight the pressing need for reform. While the extent of such manipulations may be debated, the underlying truth remains: to ensure a fair and balanced discourse, algorithmic transparency is essential. The current models of governance, content moderation, and platform management are inadequate, and without reform, the legitimacy of social media as a forum for public debate will continue to be undermined.
The Illusion of Governance: Failures of Current Moderation Models
Social media companies, in their attempts to regulate content, have adopted governance models that rely heavily on content moderation teams. These teams, often composed of contract workers, are tasked with filtering vast amounts of information. However, this approach has proven to be both slow and costly. More critically, it lacks the necessary transparency to build public trust in the filtering decisions being made. The opacity of these processes fuels skepticism and conspiracy theories, as users are left in the dark about the criteria used to judge content. A report by The New York Times notes that “the moderation process is so opaque that even the most informed users cannot discern why some content is allowed and other content is removed.”
Elon Musk, the owner of Twitter, has proposed a shift towards a more transparent and community-driven governance model, akin to Wikipedia. Whether this was a noble move by Musk, or a means to cut costs and purge ideological opponents, remains to be seen. Wikipedia's model of open governance and community moderation has been lauded for its democratic approach. However, it is not without flaws. As highlighted in a BBC News investigation, Wikipedia has faced "serious accusations of bias and manipulation by certain editors, some of whom have ties to political or state entities." This underscores the difficulty of balancing open governance with the integrity of the information presented. Musk's vision for Twitter, while ambitious, has yet to reach the level of balance required for effective fact-checking and unbiased opinion sharing.
The Power of Algorithms: The Need for Transparency
The algorithms that power social media platforms are designed to maximize user engagement, often by exploiting psychological tendencies towards hyperbole and sensationalism. This creates a feedback loop where the most attention-grabbing content is promoted, regardless of its accuracy or fairness. As noted by The Guardian, “the engagement-driven algorithms of platforms like Facebook and YouTube have been shown to amplify misinformation and extreme viewpoints, creating polarized environments.” The result is an environment where misinformation can thrive, and echo chambers are reinforced, isolating users from opposing viewpoints.
Algorithmic transparency is crucial to breaking this cycle. By revealing how content is prioritized and filtered, platforms can empower users to make informed decisions about the information they consume. Transparency would also allow for greater accountability, as platforms could be held responsible for the effects of their algorithms on public discourse. Allowing users to modify their feed would then empower people to do as they once did: simply change the channel.
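To make the idea concrete, here is a minimal sketch of what a transparent, user-tunable feed ranker could look like. This is a hypothetical illustration, not any platform's actual algorithm: the signals (`engagement`, `recency`, `source_trust`) and the weighting scheme are assumptions chosen to show how exposing the ranking weights lets a user "change the channel" from an engagement-first policy to a trust-first one.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    engagement: float   # clicks, shares, replies, normalized to 0-1
    recency: float      # newer posts score closer to 1.0
    source_trust: float # e.g., a third-party credibility score, 0-1

def rank_feed(posts, weights):
    """Rank posts by a visible weighted sum of signals.

    Because the weights are explicit, a user or auditor can see
    exactly why one post outranks another -- and change the policy.
    """
    def score(p):
        return (weights["engagement"] * p.engagement
                + weights["recency"] * p.recency
                + weights["trust"] * p.source_trust)
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("sensational rumor", engagement=0.9, recency=0.8, source_trust=0.1),
    Post("verified report",   engagement=0.4, recency=0.7, source_trust=0.9),
]

# An engagement-first policy surfaces the attention-grabbing rumor...
engagement_first = {"engagement": 1.0, "recency": 0.2, "trust": 0.0}
# ...while a user who reweights toward trust sees the verified report first.
trust_first = {"engagement": 0.2, "recency": 0.2, "trust": 1.0}
```

The point is not this particular formula but the design choice it embodies: when the scoring function is published rather than hidden, the "black box" objection loses its force, and users regain a meaningful dial to turn.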
The Influence of State Actors and Corporate Interests
The influence of state actors and corporate interests on social media is another critical issue that regulation must address. Platforms like TikTok, for example, have been widely criticized for their ties to the Chinese government. A report by The Wall Street Journal highlights that “TikTok’s algorithm is not just a powerful tool for engaging users but also a potential instrument for state influence, particularly concerning the spread of content that aligns with Chinese state interests.”
In China, the app provides a very different experience than it does in the United States, reflecting the Chinese government’s desire to manipulate public opinion both domestically and internationally. The potential for foreign influence on platforms used by millions of Americans is a significant national security concern. Similarly, the role of affiliate marketers and influencers in subtly shaping discourse raises questions about the integrity of the information being disseminated on these platforms.
The Integration of Public and Private Spheres: The New Digital Public Square
Social media has become more than just a platform for entertainment; it is now the primary space for public discussion on a wide range of issues. It combines elements of traditional media, such as television and newspapers, with the personal interactions typically found in coffee shops, restaurants, and even at family dinner tables. As noted by The Washington Post, "social media platforms have blurred the lines between public and private spheres, making regulation of these spaces all the more critical to ensure they serve the public interest." The platforms that host these discussions hold immense power but currently operate with little to no responsibility for the content they promote.
Advertising revenue and the sophisticated technology stacks that optimize these platforms have driven the focus of social media companies towards profit rather than public good. However, if we accept that the public square is now online, then these platforms must be held to higher standards, both to prevent censorship by plutocrats and to limit the spread of misinformation. Discourse in the digital town square must not be an echo chamber shaped by the undue influence of state actors and corporate interests, or governed by a black-box algorithm.