Ira Rosofsky, Ph.D.

Immortality: Are We The Last Generation Not To Live Forever?

It's not time to cancel your life insurance policy.

Will my children or grandchildren, some hundreds of years from now, be lamenting, "Poor old pops. He died just a bit too soon, and missed out on all this"?

If you listen to the proselytizers of physical immortality, it seems like immortality is just around the corner.

It sounds as if only a little resveratrol here, a boost of telomerase there, a healthy dose of biological and technological engineering, and 22nd century, here we come.

The Immortality Institute asks, "Is death inevitable?"

And answers, "It is possible to slow down, stop, and eventually reverse the aging process. It comes as a surprise to many people--especially to non-scientists--that there may be treatments available in the foreseeable future to stop and reverse the aging process."

Aubrey de Grey, one of the main gurus of anti-aging and co-founder of Strategies for Engineered Negligible Senescence, argues that we already understand the fundamental science of aging, and that it's just a matter of getting the funding in place to cure what ages you.

Immortality International, Inc. declares "life and extension of life is an innate right shared by all individuals." And they're not sitting idly by: "We are working to further the public understanding that aging and death are conventions that soon will no longer be necessary."

At the Methuselah Project, founder David Gobel proclaims, "The good news is that we're closer to finding a true cure for age-related diseases than ever before."

Before that happens, the project is looking for enrollees in The 300--not Spartans fighting off Persians, but 300 people willing to donate $25,000 to the cause and achieve an immediate immortality that will be monumental: "Join The 300 now and share in an immortal legacy. Your name and image will be etched into a steel and marble monument, and your wishes for a healthy tomorrow will be displayed for all time."

Already, 269 have signed up. There are only 31 places left on the "steel and marble monument."

Me? I'm afraid I'll settle for my name chiseled into a graveyard monument that will last as long.

If I were convinced that immortality was just around the corner, it would make me sad to think that I missed it by only a generation. But count me among the skeptics.

A cursory look at the history of life expectancy can lull the uncritical into thinking we are progressing toward forever.

Estimates of prehistoric life expectancy range from 20 to 30 years, which--given that our species is still here--was more than enough for Homo sapiens to survive. During the Middle Ages, life expectancy may have bumped up to around 50, and for the most part it remained there until modern advances in health science raised the current world average to 67--which happens to be what U.S. life expectancy was in the year I was born, 1946.

In the U.S., overall life expectancy is now 78.2 years--75.6 for males and 80.8 for females. (There are 37 countries that do better, almost all of which have universal health care, but that's a wholly different rant.)

Since I was born, that's an increase of roughly 15 to 20 years over my 63 years of life so far. Should we expect similar--or even exponentially greater--increases in the next generation or two?

Again, count me among the skeptics.

Most of the increase in life expectancy has occurred not because of any defeat of aging, but because of significant reductions in infant mortality and in maternal deaths during childbirth.

There's a big difference between life expectancy at birth and life expectancy as an older adult who has already negotiated the pitfalls of early mortality.

Stroll through a garden-variety colonial cemetery, and you will be struck by the number of deaths of infants and of women of childbearing age. But you will also be struck by the number of people who, having survived infancy or childbearing, went on to live well into their 80s.

Add to the reductions in infant mortality and childbearing deaths the conquest of most--if not all--infectious diseases, and average life expectancy rises significantly. The worldwide influenza pandemic of 1918-1919 may have killed 100 million people, eclipsing the piddling 16 million fatalities of the just-concluded World War I.

As an index of progress, the Hong Kong flu pandemic of 1968-1969 killed a relatively small 1 million worldwide. And the current H1N1 pandemic has so far resulted in only 482,000 confirmed cases and a minuscule 6,071 deaths.

Eliminate infant mortality, childbearing deaths, and infectious diseases, and the question remains, "How much longer can I, a reasonably healthy and fit 63-year-old man, expect to live?"

Can I expect to live any longer than a reasonably healthy 63-year-old 19th century man?

In many ways, we are going in reverse. According to a recent CDC report, overweight and obesity percentages have stabilized in the U.S., but at a startlingly high rate--68 percent of all adults. That's a 50 percent increase since 1960, when we weren't exactly slim at a 44 percent overweight/obesity rate. And projections indicate that in only five years, by 2015, 76 percent of us will be overweight or obese.

Given these trends, and the close association between obesity and life-shortening ailments such as diabetes and heart disease, life expectancy could arguably decline.

This is quite a change from the 19th century when obesity rates--to the extent they can be estimated--were probably lower than 10 percent in the U.S. Malnutrition and being underweight were more of a problem.

Many life-extension proponents say that an extremely thin--even underweight--body is one of the keys to a long life. So the 19th century is a kind of laboratory for aging--putting aside deaths in infancy, from childbirth, and from infectious diseases. In fact, it may have been a superior real-world environment for promoting longevity.

First, although the industrial revolution, and smokestack pollution, had come to many urban environments--particularly in Britain, the U.S., and Germany--the vast majority of people lived in pollution-free agrarian communities.

The average farm worker--with less machine assistance than today--was probably burning between 3,000 and 4,000 calories daily.

And even people in more sedentary occupations were far more active than their counterparts today. I once heard a lecture by the nutritionist Jean Mayer in which he sketched out the daily life of a typical office worker of that era. He would get up early, spend some time on physical chores--including chopping wood--walk three miles to his office, spend 10 to 12 hours working on his feet, walk three miles home, chop some more wood, and go to bed early. And that was the sedentary worker.

Contrasting these energy expenditures to today's truly sedentary life, it's clear that longevity solutions have much obesity and idleness to overcome.

But suppose it doesn't matter how fat or lazy we are. Could there be an anti-aging pill around the bend? After all, fat or thin, a statin drug will lower your cholesterol.

A look at cancer advances over the last 60 years--my current lifespan--is not encouraging. Research on cancer occupies much of the same intellectual terrain as longevity and anti-aging research. Put simply, cancer research primarily looks at why new cell production goes bad, which is also at the center of many of the theories of aging. If we can ensure the continued new production of healthy cells, the reasoning goes, we are on the road to defeating aging.

But a recent article in the New York Times, "As Other Death Rates Fall, Cancer's Scarcely Moves" (April 24, 2009), underlines how little progress there has been against cancer in my lifetime.

Although death rates have plummeted for heart disease, stroke, influenza, and pneumonia, for cancer they have barely moved.

Currently, the death rate for cancer is 200 per 100,000 people of all ages, and 1,000 per 100,000 people over 65--a mere 5 percent drop from 1950. In contrast, heart disease deaths have fallen by two-thirds since 1950, and the yearly number of cancer deaths is approaching the yearly number of heart disease deaths.

In a previous generation, more people smoked, there were no statin drugs for treating cholesterol, aspirin was not prescribed on a daily basis, and medical technologies--stents, bypass surgery, and anti-clotting interventions--were not widely available.

There have not been similar advances in cancer therapies.

Unless you are willing to believe that cardiac researchers are smarter than cancer researchers, the obvious conclusion is that effective cancer therapy is a much tougher nut to crack than effective heart therapy.

Physical longevity aside, the immortality proselytizers argue that in the future we will not only live longer but be healthier. But even if our bodies become sounder, will our minds follow?

Not if the projections about dementia are accurate.

As I wrote in a recent LA Times op-ed, "When it comes to dementia, forget the drugs," any increase in longevity will bring with it an increase in failing minds: "Alzheimer's and other forms of dementia afflict up to 5 million people in the United States and about 26 million people worldwide. By 2050, there could be 13 million cases of Alzheimer's alone among U.S. baby boomers and the aging Generations X and Y, according to the National Institutes of Health. Some reports have the global prevalence of Alzheimer's growing to as many as 100 million people by mid-century."

Currently, if you are fortunate enough to live to be 80, your chances of developing dementia are 1 in 2.

Progress against dementia is even more limited than progress against cancer. Anti-dementia drugs such as Aricept have small, clinically meaningless effects on cognition. If you have dementia and take Aricept, you may--according to one study--manage to keep yourself out of an institution only a few weeks longer than if you took no drugs at all. And the drug's effect on cognition appears to be trivial: you might have 4 percent less dementia--as measured on an IQ test--than if you hadn't taken the drug, but you will still have dementia.

In the UK, the National Health Service has sharply restricted the use of Aricept and other anti-dementia drugs, saying the money would be better spent on human service workers--who can't cure dementia either, but who can hold your hand and feed you.

I know of no researcher in the field who expects any near-term breakthrough in either dementia or cancer research, so why are the anti-aging folks so sanguine?

I have to conclude that any optimism about a near-term advance against aging is, as Rodgers and Hammerstein would put it, "cock-eyed optimism."

When research scientists don't even understand or agree about the causes of aging, I am confident I can go to my grave without any immortality envy for future generations.

I wish my children long life and prosperity, but I doubt they will be feeling lonely for me hundreds of years from now--maybe, with luck, a hundred years from now.

If you have a life insurance policy, it's not time to cancel.

---------------------------------------------------------------------------------------------

Click here to read the first chapter of my book, Nasty, Brutish, and Long: Adventures in Old Age and the World of Eldercare (Avery/Penguin, 2009). It provides a unique, insider's perspective on aging in America. It is an account of my work as a psychologist in nursing homes, the story of caregiving to my frail, elderly parents--all to the accompaniment of ruminations on my own mortality. Thomas Lynch, author of The Undertaking, calls it "A book for policy makers, caregivers, the halt and lame, the upright and unencumbered: anyone who ever intends to get old."


About the Author
Ira Rosofsky, Ph.D.

Ira Rosofsky, Ph.D., is a psychologist in Connecticut who works in eldercare facilities and the author of Nasty, Brutish, and Long: Adventures in Old Age and the World of Eldercare.
