

Unity and Division on the Internet

The Internet doesn't divide people; people divide people.

Key points

  • The introduction of for-profit online platforms may predict deficits in mental health.
  • But the introduction of the Internet more generally may predict improvements in mental health.
  • Moreover, we can be biased in our evaluation of societal costs and benefits.
  • So what we dislike about online platforms may have more to do with their design and goals (and our perceptions) than with the Internet more generally.

How do social media platforms lead to unity or division? Cecilia Esteban, a student at Stevens Institute of Technology, recently sent me three questions about this. As I looked at some evidence, I found myself unconvinced that the Internet is especially blameworthy for the undesirable division we see on some online platforms. Echo chambers, polarization, and segregation predated the Internet. And, while some Internet platforms can replicate or even reinforce these outcomes, other online platforms seem to be more beneficial. Allow me to reflect on some distinctions and evidence.

Do you think the advent of the Internet and, more specifically, social media, has done more for social unity or division?

The Internet vs. online platforms. Cecilia's question is wise to distinguish between the Internet in general and its online platforms in particular. They are different. So their effects may also be different. Some evidence confirms this: the introduction of for-profit online social media platforms has corresponded with decreases in well-being (Braghieri et al., 2022), whereas the introduction of broadband Internet more generally has corresponded with increases in well-being (Donati et al., 2022).

For-profit vs. non-profit. Since the U.S. Department of Defense developed ARPANET, Internet protocols have helped people connect and share valuable information. Later, the Wikimedia Foundation used the Internet to facilitate a free, world-class encyclopedia (Wikipedia), a non-partisan news network (Wikinews), a social media platform (WT.social), and other online platforms. Wikimedia harnesses humanity's best motivations: donating our talents, knowledge, and other resources to benefit other people. However, some for-profit online platforms feed on humanity's worst qualities: impulsivity, laziness, self-promotion, tribalism, etc. So our mileage on the Internet's platforms may depend on their business model.

Many say that social media can act as a sort of “echo chamber” (specifically due to the algorithms behind these platforms), while others argue that social media actually exposes users to other perspectives/walks of life that they may have otherwise been unaware of. What kind of impact do you think both of these cases can have on a person’s beliefs, and what further implications may they hold?

Both children and adults sort themselves into groups of similar people, and they were observed doing so long before most people had access to the Internet (e.g., McPherson & Smith-Lovin, 1987; Shrum et al., 1988). Even though this homophily can be replicated online (Mele, 2017), the design of social media platforms may shape the way we opt in or out of communities and echo chambers.

User experience design. Networks of people may be more segregated into echo chambers on private or closed social platforms than on open or public ones (Kwon et al., 2017). I’ve argued that our allegiances to the beliefs or people in our echo chambers probably bias the way we think about the world (even when we’re trying to think carefully), but also that we might overcome this belief bias when our primary goal is nonpartisan rationality (Byrd, 2022). In principle, online platforms could be (re)designed to reinforce nonpartisan norms rather than loyalty to our preferred groups or orthodoxies. Perhaps Wikimedia's positive feedback for citing sources, admitting when evidence is lacking, and correcting errors helps users overcome belief bias more than platforms that seem to promote perceptions and expressions of outrage (Brady et al., 2021, 2023).

What do you think can be done in order to mitigate the more divisive impact of social media on society? Is there anything that has proven helpful in sowing unity amongst people in settings as diverse and vast (yet simultaneously condensed) as the ones borne of social media?

My answer to the first question suggests that the goals of online platforms matter. But the prioritization of those goals also matters, such as when the goal of profit conflicts with the goal of human flourishing. As Frances Haugen famously claimed, a for-profit platform often “chooses profit over safety” (Bauder & Liedtke, 2021). Wikimedia’s platform is not only less about profit; it is also distributed across more people, with a more diverse set of values and motivations than one corporation's stockholders.

Decentralize? Haugen is wise to remind us that “No one at [for-profit social media companies] is [necessarily] malevolent ... but [their] incentives [can be] misaligned.” When we can choose among only a few online platforms that prioritize advertisers over users, it is unsurprising to find so many cases of profit over safety. If, however, users distribute themselves across platforms that rely on different models, we can observe which models correspond to better user well-being. Perhaps user migrations from a few for-profit online platforms to a diverse set of non-profit and decentralized alternatives like Mastodon servers (Chan, 2023) will give scientists an opportunity to test this hypothesis.

Serve stakeholders, not just stockholders. When an online platform's model requires selling ads or user data, the platform wants users to spend more time on it (so it can monitor more user decisions and/or lure them into viewing more ads). Some platforms may become so successful at keeping users active that users become addicted to the platform (Cheng et al., 2021), although the frequency of this outcome probably depends at least in part on platform-independent factors (ibid.). But imagine if an online platform instead optimized for human flourishing. Consider “time banking” platforms, which show people how they can achieve mutual benefit by trading their time rather than their money, outrage, or capacity to look at ads (Goodwin & Cahn, 2018). For example, a time banking platform could analyze user data to connect an able-bodied lonely person with a differently-abled person who needs help with basic household chores. The idea is that the lonely person could get the social connection they need while helping their fellow human get the physical assistance they need: a win-win.
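To make that matching idea concrete, here is a minimal sketch, in Python, of how a hypothetical time-banking platform might pair offers of time with requests for help. The member fields and the scoring rule are illustrative assumptions, not a description of Goodwin and Cahn's proposal or of any real platform.

```python
# A toy matcher for a hypothetical time bank. Illustrative sketch only:
# the fields ("offers", "needs", "hours_banked") and the scoring rule are
# assumptions made for this example, not features of any real platform.
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Member:
    name: str
    offers: set[str] = field(default_factory=set)  # help this member can give
    needs: set[str] = field(default_factory=set)   # help this member wants
    hours_banked: float = 0.0                      # time credits earned so far

def match_score(giver: Member, receiver: Member) -> int:
    """Count how many of the receiver's needs the giver can meet."""
    return len(giver.offers & receiver.needs)

def best_match(giver: Member, members: list[Member]) -> Optional[Member]:
    """Pair a would-be helper with the member whose needs they can best meet."""
    candidates = [m for m in members if m is not giver and match_score(giver, m) > 0]
    return max(candidates, key=lambda m: match_score(giver, m), default=None)

# Example: a lonely member who can run errands is paired with a member who needs that help.
alex = Member("Alex", offers={"yard work", "grocery runs", "conversation"})
sam = Member("Sam", needs={"grocery runs", "yard work"})
pat = Member("Pat", needs={"tax advice"})

match = best_match(alex, [alex, sam, pat])
print(match.name if match else "no match")  # -> Sam
```

Even in this toy version, the design choice is visible: the quantity being maximized is met needs, not time spent on the platform or ads viewed.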

Conclusion

The Internet and its platforms are tools that can be used for good, bad, or neutral purposes. But even when people attempt to use the Internet's platforms for good, there may be unintended side effects. Moreover, biases about side effects (Knobe, 2003) may lead to a negativity bias about the Internet, its platforms, and/or its users. By reflecting on the design and use of different online platforms, we may notice more of the Internet's helpful achievements and side effects, which could give us a more accurate estimate of the Internet's costs and benefits for society.

References

Bauder, D., & Liedtke, M. (2021, October 4). Whistleblower: Facebook chose profit over public safety. AP NEWS. https://apnews.com/article/facebook-whistleblower-frances-haugen-4a3640440769d9a241c47670facac213

Braghieri, L., Levy, R., & Makarin, A. (2022). Social Media and Mental Health. American Economic Review, 112(11), 3660–3693. https://doi.org/10.1257/aer.20211218

Brady, W. J., McLoughlin, K., Doan, T. N., & Crockett, M. J. (2021). How social learning amplifies moral outrage expression in online social networks. Science Advances, 7(33), eabe5641. https://doi.org/10.1126/sciadv.abe5641

Brady, W. J., McLoughlin, K. L., Torres, M. P., Luo, K. F., Gendron, M., & Crockett, M. J. (2023). Overperception of moral outrage in online social networks inflates beliefs about intergroup hostility. Nature Human Behaviour. https://doi.org/10.1038/s41562-023-01582-0

Byrd, N. (2022). Bounded Reflectivism & Epistemic Identity. Metaphilosophy, 53(1), 53–69. https://doi.org/10.1111/meta.12534

Chan, W. (2023, April 18). Thousands fled to Mastodon after Musk bought Twitter. Are they still ‘tooting’? The Guardian. https://www.theguardian.com/technology/2023/apr/18/mastodon-users-twitter-elon-musk-social-media

Cheng, C., Lau, Y., Chan, L., & Luk, J. W. (2021). Prevalence of social media addiction across 32 nations: Meta-analysis with subgroup analysis of classification schemes and cultural values. Addictive Behaviors, 117, 106845. https://doi.org/10.1016/j.addbeh.2021.106845

Donati, D., Durante, R., Sobbrio, F., & Zejcirovic, D. (2022). Lost in the Net? Broadband Internet and Youth Mental Health (SSRN Scholarly Paper No. 4121345). https://papers.ssrn.com/abstract=4121345

Goodwin, N., & Cahn, E. (2018). Unmet Needs and Unused Capacities: Time Banking as a Solution. Interdisciplinary Journal of Partnership Studies, 5(1), Article 1. https://doi.org/10.24926/ijps.v5i1.911

Knobe, J. (2003). Intentional Action and Side Effects in Ordinary Language. Analysis, 63(3), 190–194. https://doi.org/10.1111/1467-8284.00419

Kwon, H. E., Oh, W., & Kim, T. (2017). Platform Structures, Homing Preferences, and Homophilous Propensities in Online Social Networks. Journal of Management Information Systems, 34(3), 768–802. https://doi.org/10.1080/07421222.2017.1373008

McPherson, J. M., & Smith-Lovin, L. (1987). Homophily in Voluntary Organizations: Status Distance and the Composition of Face-to-Face Groups. American Sociological Review, 52(3), 370–379. https://doi.org/10.2307/2095356

Mele, A. (2017). A Structural Model of Homophily and Clustering in Social Networks (SSRN Scholarly Paper No. 3031489). https://doi.org/10.2139/ssrn.3031489

Shrum, W., Cheek, N. H., & Hunter, S. MacD. (1988). Friendship in School: Gender and Racial Homophily. Sociology of Education, 61(4), 227–239. https://doi.org/10.2307/2112441
