
Up the Down Staircase: Human Downgrading & Digital Dystopias

Do we need the concept of human downgrading to prevent digital dystopias?

Source: Factor Daily

An anti-Silicon Valley ‘techlash’ is in full swing, and for good reason: from the punishing and dehumanizing floors of the Amazon warehouse, to the indignities and starvation wages of the gig economy, to facial recognition AI that might end privacy as we know it, dystopia seems to be Silicon Valley’s business model.

Luckily, some Silicon Valley insiders have become vocal critics. A set of particularly devoted turncoats formed the Center for Humane Technology (CHT). CHT’s mission is to raise awareness about the negative impact of the digital ecosystem on human well-being and to nudge tech companies and power players to combat what they refer to as Human Downgrading — a host of societal problems like addiction, social isolation, outrage, misinformation, and political polarization that they believe result from tech platforms’ efforts to capture and commoditize human attention.

Let’s take a closer look at their argument. Quoting the “father of sociobiology,” E.O. Wilson, they take as their starting point that “the real problem of humanity is that we have Paleolithic emotions, medieval institutions, and god-like technology.” They argue that Human Downgrading is the result of powerful, god-like digital technology designed to hack into our emotional and cognitive vulnerabilities. Specifically, the design elements of digital advertising and social media are particularly problematic because they attack us with laser-like precision on multiple fronts: our information-processing limitations, our propensity for addiction, our need for social belonging, our easily manipulated outrage and distrust, and our self-confirmation biases. According to this formulation, these hacks of our Paleolithic brains cause teen anxiety and suicide, social isolation, political polarization, and more.

I believe the idea that our brains are no match for our technology is wrong for three fundamental reasons. First, the very notion of Human Downgrading vastly overestimates how helpless we are in the face of these products and techniques. We are not just moths to the flame. Knowledge of how technology platforms target our vulnerabilities, along with short-term changes in how we use social media, is often enough to disrupt their negative effects. The idea of Human Downgrading also creates a victim mentality that is probably more useful for promoting a feeling of powerlessness than for inspiring us to take control of our technology.

The second problem is that Human Downgrading conflates effective design elements with hacking our brains. The kind of computer hack they seem to be referring to is when someone bypasses security measures to gain access to or control of a computer or network. That is not how brains work, nor how things like smartphones affect our brains. What digital advertising and social media actually do is appeal to and quickly satisfy needs and desires, and through this they can trigger compulsions and bad habits. Are casinos hacking our brains and causing gambling addiction? Are fast-food restaurants hacking our brains and causing us to eat unhealthy food? Social media are arguably the fast food of the social brain, so perhaps the more accurate metaphor is that Silicon Valley is like Ronald McDonald, doling out cheap and easily accessible burgers and fries.

The third problem is that the idea of Human Downgrading is anchored in the dangerously over-simplistic argument that digital technology directly causes our current social and mental health crises, such as teen suicide and political polarization. Laying the blame for our social and personal ills at the foot of technology risks getting it wrong in many cases, because many other factors cause and exacerbate a vulnerable person’s feelings of anxiety, despondency, rage, or hopelessness. It also risks convincing us that we can stop devoting real societal and personal resources to the known causes of, and solutions for, problems like mental illness and political incivility.

The founders of CHT are extremely well-intentioned, and I am grateful that they are fighting the good fight. They are right to argue that the techniques digital companies use create a problematic power differential. Yet CHT's critique eventually gets off track because they think like the Silicon Valley denizens they criticize: in black-and-white, binary terms. For them, it’s our Paleolithic, reptile brains versus god-like technology. It’s the unassailable genius of the tech industry and the superiority of those who run it versus the cattle-like mindlessness of the rest of us consumers, who click where they tell us to click.

If we’re going to fix the very real problems that exist within our digital lives, and if we’re going to demand a humane digital ecosystem, I believe a constructive approach will yield better results. The metaphor of Human Upgrading is an obvious choice. Here, we can admit that digital technology is powerful and can attack our vulnerabilities, while recognizing that understanding how it does so gives us the opportunity to defend ourselves and turn those vulnerabilities into strengths. But Human Upgrading goes further by recognizing that our ‘Paleolithic brains’ are incredible masterpieces. They have allowed us to survive and thrive for millennia, and they provide us with flexibility, resourcefulness, and resilience. Digital technologies can as easily support this resilience as compromise it. The early promise of Silicon Valley was to do just that: assist our Human Upgrading. It’s time the industry started keeping that promise.
