

AI and Our Future Roundabout

Social science needs to be part of the planning, not called in for cleanup.

“I see it as we need to unite the will of the spirit with the work of the flesh.” —from the film The Way (2010)

A roundabout is a road junction at which traffic moves in one direction around a central island to reach one of the roads converging on it. Symbolic of life’s journey in many ways, a roundabout is about entering one road in order to reach a more desirable one. It is a journey of discovery. Roundabouts remind us that we are constantly entering and leaving different experiences in our lives.

So, what is the symbolism of the roundabout for our development of self? How do roundabouts connect to the evolution of self? We all enter life and exit life; that cannot be denied. What else is happening?

Because we are sentient beings, we experience sensations and emotions and have a perceptual awareness of our own existence. This awareness drives us, no pun intended, toward new and novel experiences and discoveries about who we are and where we want to end up. Maybe this is our spirit-self, which needs clarification and purpose. Yet it is also our physical-self, because we cannot leave that behind. A 1990s song by Crowded House suggested we “always take the weather with you”: wherever we travel, the whole self comes along.

AI and the Roundabout

Spirit and flesh are married from birth; no ceremony needed. They are the yin and yang of self. As we enter the roundabout of our lives in 2023, AI, which Google CEO Sundar Pichai calls the greatest change to human existence, is about to enter with us. Pichai admits that the AI industry has a “black box,” an area we cannot predict or know about at the present time. He also admits that we will need more than engineers to navigate the future of AI: we will need social scientists.

Interestingly, social scientists are treated as necessary after a technology has been implemented, not during its planning phases. This sounds very familiar. Isn’t this the same situation we find ourselves in with mobile phones and other screen devices now in the hands of two-year-olds? Over time, we have learned that these devices are producing a population that is more distracted, compulsive, lonely, depressed, and anxious.

The time to incorporate knowledge about the social impacts of technology is in the pre-implementation phase, not after the fact. The history of technology to date has run on one premise: “Because we can, we should.” Social scientists are then expected to clean up after the techno-bias that pushed society into this place too early.

The time for social science input is before entering the roundabout. Otherwise, we enter without knowing where all the other cars are traveling. Asking social scientists to help after exiting the roundabout is too late; by then we are being drafted as firefighters to put out the fire.

“AI is a tool. The choice about how it gets deployed is ours.” —Oren Etzioni

Technocrats need to investigate the potential impact of their creations before thrusting them onto an unsuspecting and uninformed population. Admitting to a “black box” is admitting that we are not ready to release AI on the public. We have already started down this road in many areas, and more research is needed before AI becomes not just an asset but also a liability. Studies indicate that in some current contexts, the downsides of AI systems disproportionately affect groups already disadvantaged by factors such as race, gender, and socio-economic background (Barocas & Selbst, 2016).

Pichai admitted that fake news, already a major problem, could become even more problematic with the onset of AI. Our roundabout with AI, and the “black box” it has unleashed, looks like a threatening intersection for the human condition. The term “black box” is telling, since a black box is what we usually search for after an accident. With AI, we need to look pre-accident.

Creating the flesh of technology, the machines, is the easy part. Alongside such efforts, designers and researchers from a range of disciplines need to conduct what Crawford and Calo call a social-systems analysis of AI: assessing the impact of these technologies on their social, cultural, and political settings (Crawford & Calo, 2016). We need to qualify and quantify the spirit of AI. How will AI affect the spirit of being human in a world that relies on superhuman machines? Will the human psyche adapt, as the engineers suggest? Or is there some aspect of human adaptability the engineers have not considered? We need that information before, not after.

References

Crawford, K., & Calo, R. (2016). There is a blind spot in AI research. Nature, 538, 311–313.

Barocas, S., & Selbst, A. D. (2016). Big data’s disparate impact. California Law Review, 104, 671–732.
