These words came from an elderly woman sitting behind me on a late-night flight from Los Angeles to Washington, D.C. The plane was dark and quiet. A man I assumed to be her husband murmured almost inaudibly in response, something to the effect of “I wish I was dead.”
Again, the woman: “Oh, stop saying that.”
I didn’t mean to eavesdrop, but couldn’t help it. I listened with morbid fascination, forming an image of the man in my head as they talked. I imagined someone who had worked hard all his life in relative obscurity, someone with unfulfilled dreams—perhaps of the degree he never attained, the career he never pursued, the company he never started.
At the end of the flight, as the lights switched on, I finally got a look at the desolate man. I was shocked. I recognized him—he was, and still is, world-famous. Then in his mid‑80s, he was beloved as a hero for his courage, patriotism, and accomplishments many decades ago. As he walked up the aisle of the plane behind me, other passengers greeted him with veneration. Standing at the door of the cockpit, the pilot stopped him and said, “Sir, I have admired you since I was a little boy.” The older man—apparently wishing for death just a few minutes earlier—beamed with pride at the recognition of his past glories.
For selfish reasons, I couldn’t get the cognitive dissonance of that scene out of my mind. It was the summer of 2015, shortly after my 51st birthday. I was not world-famous like the man on the plane, but my professional life was going very well. I was the president of a flourishing Washington think tank, the American Enterprise Institute. I had written some best-selling books. People came to my speeches. My columns were published in The New York Times.
But I had started to wonder: Can I really keep this going? I work like a maniac. But even if I stayed at it 12 hours a day, seven days a week, at some point my career would slow and stop. And when it did, what then? Would I one day be looking back wistfully and wishing I were dead? Was there anything I could do, starting now, to give myself a shot at avoiding misery—and maybe even achieving happiness—when the music inevitably stops?
Though these questions were personal, I decided to approach them as the social scientist I am, treating them as a research project. It felt unnatural—like a surgeon taking out his own appendix. But I plunged ahead, and for the past four years, I have been on a quest to figure out how to turn my eventual professional decline from a matter of dread into an opportunity for progress.
Here’s what I’ve found.
The field of “happiness studies” has boomed over the past two decades, and a consensus has developed about well-being as we advance through life. In The Happiness Curve: Why Life Gets Better After 50, Jonathan Rauch, a Brookings Institution scholar and an Atlantic contributing editor, reviews the strong evidence suggesting that the happiness of most adults declines through their 30s and 40s, then bottoms out in their early 50s. Nothing about this pattern is set in stone, of course. But the data seem eerily consistent with my experience: My 40s and early 50s were not an especially happy period of my life, notwithstanding my professional fortunes.
So what can people expect after that, based on the data? The news is mixed. Almost all studies of happiness over the life span show that, in wealthier countries, most people’s contentment starts to increase again in their 50s, until age 70 or so. That is where things get less predictable, however. After 70, some people stay steady in happiness; others get happier until death. Others—men in particular—see their happiness plummet. Indeed, depression and suicide rates for men increase after age 75.
(Illustrations by Luci Gutiérrez)
This last group would seem to include the hero on the plane. A few researchers have looked at this cohort to understand what drives their unhappiness. It is, in a word, irrelevance. In 2007, a team of academic researchers at UCLA and Princeton analyzed data on more than 1,000 older adults. Their findings, published in the Journal of Gerontology, showed that senior citizens who rarely or never “felt useful” were nearly three times as likely as those who frequently felt useful to develop a mild disability, and were more than three times as likely to have died during the course of the study.
One might think that gifted and accomplished people, such as the man on the plane, would be less susceptible than others to this sense of irrelevance; after all, accomplishment is a well-documented source of happiness. If current accomplishment brings happiness, then shouldn’t the memory of that accomplishment provide some happiness as well?
Maybe not. Though the literature on this question is sparse, giftedness and achievements early in life do not appear to provide an insurance policy against suffering later on. In 1999, Carole Holahan and Charles Holahan, psychologists at the University of Texas, published an influential paper in The International Journal of Aging and Human Development that looked at hundreds of older adults who early in life had been identified as highly gifted. The Holahans’ conclusion: “Learning at a younger age of membership in a study of intellectual giftedness was related to … less favorable psychological well-being at age eighty.”
This study may simply be showing that it’s hard to live up to high expectations, and that telling your kid she is a genius is not necessarily good parenting. (The Holahans surmise that the children identified as gifted might have made intellectual ability more central to their self-appraisal, creating “unrealistic expectations for success” and causing them to fail to “take into account the many other life influences on success and recognition.”) However, abundant evidence suggests that the waning of ability in people of high accomplishment is especially brutal psychologically. Consider professional athletes, many of whom struggle profoundly after their sports career ends. Tragic examples abound, involving depression, addiction, or suicide; unhappiness in retired athletes may even be the norm, at least temporarily. A study published in the Journal of Applied Sport Psychology in 2003, which charted the life satisfaction of former Olympic athletes, found that they generally struggled with a low sense of personal control when they first stopped competing.
Recently, I asked Dominique Dawes, a former Olympic gold-medal gymnast, how normal life felt after competing and winning at the highest levels. She told me that she is happy, but that the adjustment wasn’t easy—and still isn’t, even though she won her last Olympic medal in 2000. “My Olympic self would ruin my marriage and leave my kids feeling inadequate,” she told me, because it is so demanding and hard-driving. “Living life as if every day is an Olympics only makes those around me miserable.”
Why might former elite performers have such a hard time? No academic research has yet proved this, but I strongly suspect that the memory of remarkable ability, if that is the source of one’s self-worth, might, for some, provide an invidious contrast to a later, less remarkable life. “Unhappy is he who depends on success to be happy,” Alex Dias Ribeiro, a former Formula 1 race-car driver, once wrote. “For such a person, the end of a successful career is the end of the line. His destiny is to die of bitterness or to search for more success in other careers and to go on living from success to success until he falls dead. In this case, there will not be life after success.”

Call it the Principle of Psychoprofessional Gravitation: the idea that the agony of professional oblivion is directly related to the height of professional prestige previously achieved, and to one’s emotional attachment to that prestige. Problems related to achieving professional success might appear to be a pretty good species of problem to have; even raising this issue risks seeming precious. But if you reach professional heights and are deeply invested in being high up, you can suffer mightily when you inevitably fall. That’s the man on the plane. Maybe that will be you, too. And, without significant intervention, I suspect it will be me.
The Principle of Psychoprofessional Gravitation can help explain the many cases of people who have done work of world-historical significance yet wind up feeling like failures. Take Charles Darwin, who was just 22 when he set out on his five-year voyage aboard the Beagle in 1831. Returning at 27, he was celebrated throughout Europe for his discoveries in botany and zoology, and for his early theories of evolution. Over the next 30 years, Darwin took enormous pride in sitting atop the celebrity-scientist pecking order, developing his theories and publishing them as books and essays—the most famous being On the Origin of Species, in 1859.
But as Darwin progressed into his 50s, he stagnated; he hit a wall in his research. At the same time an Austrian monk by the name of Gregor Mendel discovered what Darwin needed to continue his work: the theory of genetic inheritance. Unfortunately, Mendel’s work was published in an obscure academic journal and Darwin never saw it—and in any case, Darwin did not have the mathematical ability to understand it. From then on he made little progress. Depressed in his later years, he wrote to a close friend, “I have not the heart or strength at my age to begin any investigation lasting years, which is the only thing which I enjoy.”
Presumably, Darwin would be pleasantly surprised to learn how his fame grew after his death, in 1882. From what he could see when he was old, however, the world had passed him by, and he had become irrelevant. That could have been Darwin on the plane behind me that night.
It also could have been a younger version of me, because I have had precocious experience with professional decline.
As a child, I had just one goal: to be the world’s greatest French-horn player. I worked at it slavishly, practicing hours a day, seeking out the best teachers, and playing in any ensemble I could find. I had pictures of famous horn players on my bedroom wall for inspiration. And for a while, I thought my dream might come true. At 19, I left college to take a job playing professionally in a touring chamber-music ensemble. My plan was to keep rising through the classical-music ranks, joining a top symphony orchestra in a few years or maybe even becoming a soloist—the most exalted job a classical musician can hold.
But then, in my early 20s, a strange thing happened: I started getting worse. To this day, I have no idea why. My technique began to suffer, and I had no explanation for it. Nothing helped. I visited great teachers and practiced more, but I couldn’t get back to where I had been. Pieces that had been easy to play became hard; pieces that had been hard became impossible.
Perhaps the worst moment in my young but flailing career came at age 22, when I was performing at Carnegie Hall. While delivering a short speech about the music I was about to play, I stepped forward, lost my footing, and fell off the stage into the audience. On the way home from the concert, I mused darkly that the experience was surely a message from God.
But I sputtered along for nine more years. I took a position in the City Orchestra of Barcelona, where I increased my practicing but my playing gradually deteriorated. Eventually I found a job teaching at a small music conservatory in Florida, hoping for a magical turnaround that never materialized. Realizing that maybe I ought to hedge my bets, I went back to college via distance learning, and earned my bachelor’s degree shortly before my 30th birthday. I secretly continued my studies at night, earning a master’s degree in economics a year later. Finally I had to admit defeat: I was never going to turn around my faltering musical career. So at 31 I gave up, abandoning my musical aspirations entirely, to pursue a doctorate in public policy.

Life goes on, right? Sort of. After finishing my studies, I became a university professor, a job I enjoyed. But I still thought every day about my beloved first vocation. Even now, I regularly dream that I am onstage, and wake to remember that my childhood aspirations are now only phantasms.

I am lucky to have accepted my decline at a young enough age that I could redirect my life into a new line of work. Still, to this day, the sting of that early decline makes these words difficult to write. I vowed to myself that it wouldn’t ever happen again.
Will it happen again? In some professions, early decline is inescapable. No one expects an Olympic athlete to remain competitive until age 60. But in many physically nondemanding occupations, we implicitly reject the inevitability of decline before very old age. Sure, our quads and hamstrings may weaken a little as we age. But as long as we retain our marbles, our quality of work as a writer, lawyer, executive, or entrepreneur should remain high up to the very end, right? Many people think so. I recently met a man a bit older than I am who told me he planned to “push it until the wheels came off.” In effect, he planned to stay at the very top of his game by any means necessary, and then keel over.
But the odds are he won’t be able to. The data are shockingly clear that for most people, in most fields, decline starts earlier than almost anyone thinks.

According to research by Dean Keith Simonton, a professor emeritus of psychology at UC Davis and one of the world’s leading experts on the trajectories of creative careers, success and productivity increase for the first 20 years after the inception of a career, on average. So if you start a career in earnest at 30, expect to do your best work around 50 and go into decline soon after that.

The specific timing of peak and decline varies somewhat depending on the field. Benjamin Jones, a professor of strategy and entrepreneurship at Northwestern University’s Kellogg School of Management, has spent years studying when people are most likely to make prizewinning scientific discoveries and develop key inventions. His findings can be summarized by this little ditty:
Age is, of course, a fever chill
that every physicist must fear.
He’s better dead than living still
when once he’s past his thirtieth year.
The author of those lines, the physicist Paul Dirac, overstates the point, but only a little. Looking at major inventors and Nobel winners going back more than a century, Jones has found that the most common age for producing a magnum opus is the late 30s. He has shown that the likelihood of a major discovery increases steadily through one’s 20s and 30s and then declines through one’s 40s, 50s, and 60s. Are there outliers? Of course. But the likelihood of producing a major innovation at age 70 is approximately what it was at age 20—almost nonexistent.
Much of literary achievement follows a similar pattern. Simonton has shown that poets peak in their early 40s. Novelists generally take a little longer. When Martin Hill Ortiz, a poet and novelist, collected data on New York Times fiction best sellers from 1960 to 2015, he found that authors were likeliest to reach the No. 1 spot in their 40s and 50s. Despite the famous productivity of a few novelists well into old age, Ortiz shows a steep drop-off in the chance of writing a best seller after the age of 70. (Some nonfiction writers—especially historians—peak later, as we shall see in a minute.)
This research concerns people at the very top of professions that are atypical. But the basic finding appears to apply more broadly. Scholars at Boston College’s Center for Retirement Research studied a wide variety of jobs and found considerable susceptibility to age-related decline in fields ranging from policing to nursing. Other research has found that the best-performing home-plate umpires in Major League Baseball have 18 years less experience and are 23 years younger than the worst-performing umpires (who are 56.1 years old, on average). Among air traffic controllers, the age-related decline is so sharp—and the potential consequences of decline-related errors so dire—that the mandatory retirement age is 56.
In sum, if your profession requires mental processing speed or significant analytic capabilities—the kind of profession most college graduates occupy—noticeable decline is probably going to set in earlier than you imagine.

Sorry.
If decline not only is inevitable but also happens earlier than most of us expect, what should we do when it comes for us?
Whole sections of bookstores are dedicated to becoming successful. The shelves are packed with titles like The Science of Getting Rich and The 7 Habits of Highly Effective People. There is no section marked “Managing Your Professional Decline.”
But some people have managed their declines well. Consider the case of Johann Sebastian Bach. Born in 1685 to a long line of prominent musicians in central Germany, Bach quickly distinguished himself as a musical genius. In his 65 years, he composed more than 1,000 works for all the available instrumentations of his day.
Early in his career, Bach was considered an astoundingly gifted organist and improviser. Commissions rolled in; royalty sought him out; young composers emulated his style. He enjoyed real prestige.
But it didn’t last—in no small part because his career was overtaken by musical trends ushered in by, among others, his own son, Carl Philipp Emanuel, known as C.P.E. to the generations that followed. The fifth of Bach’s 20 children, C.P.E. exhibited the musical gifts his father had. He mastered the baroque idiom, but he was more fascinated with a new “classical” style of music, which was taking Europe by storm. As classical music displaced baroque, C.P.E.’s prestige boomed while his father’s music became passé.
Bach easily could have become embittered, like Darwin. Instead, he chose to redesign his life, moving from innovator to instructor. He spent a good deal of his last 10 years writing The Art of Fugue, not a famous or popular work in his time, but one intended to teach the techniques of the baroque to his children and students—and, as unlikely as it seemed at the time, to any future generations that might be interested. In his later years, he lived a quieter life as a teacher and a family man.
What’s the difference between Bach and Darwin? Both were preternaturally gifted and widely known early in life. Both attained permanent fame posthumously. Where they differed was in their approach to the midlife fade. When Darwin fell behind as an innovator, he became despondent and depressed; his life ended in sad inactivity. When Bach fell behind, he reinvented himself as a master instructor. He died beloved, fulfilled, and—though less famous than he once had been—respected.

The lesson for you and me, especially after 50: Be Johann Sebastian Bach, not Charles Darwin.
How does one do that?
A potential answer lies in the work of the British psychologist Raymond Cattell, who in the early 1940s introduced the concepts of fluid and crystallized intelligence. Cattell defined fluid intelligence as the ability to reason, analyze, and solve novel problems—what we commonly think of as raw intellectual horsepower. Innovators typically have an abundance of fluid intelligence. It is highest relatively early in adulthood and diminishes starting in one’s 30s and 40s. This is why tech entrepreneurs, for instance, do so well so early, and why older people have a much harder time innovating.
Crystallized intelligence, in contrast, is the ability to use knowledge gained in the past. Think of it as possessing a vast library and understanding how to use it. It is the essence of wisdom. Because crystallized intelligence relies on an accumulating stock of knowledge, it tends to increase through one’s 40s, and does not diminish until very late in life.
Careers that rely primarily on fluid intelligence tend to peak early, while those that use more crystallized intelligence peak later. For example, Dean Keith Simonton has found that poets—highly fluid in their creativity—tend to have produced half their lifetime creative output by age 40 or so. Historians—who rely on a crystallized stock of knowledge—don’t reach this milestone until about 60.

Here’s a practical lesson we can extract from all this: No matter what mix of intelligence your field requires, you can always endeavor to weight your career away from innovation and toward the strengths that persist, or even increase, later in life.

Like what? As Bach demonstrated, teaching is an ability that decays very late in life, a principal exception to the general pattern of professional decline over time. A study in The Journal of Higher Education showed that the oldest college professors in disciplines requiring a large store of fixed knowledge, specifically the humanities, tended to get evaluated most positively by students. This probably explains the professional longevity of college professors, three-quarters of whom plan to retire after age 65—more than half of them after 70, and some 15 percent of them after 80. (The average American retires at 61.) One day, during my first year as a professor, I asked a colleague in his late 60s whether he’d ever considered retiring. He laughed, and told me he was more likely to leave his office horizontally than vertically.
Our dean might have chuckled ruefully at this—college administrators complain that research productivity among tenured faculty drops off significantly in the last decades of their career. Older professors take up budget slots that could otherwise be used to hire young scholars hungry to do cutting-edge research. But perhaps therein lies an opportunity: If older faculty members can shift the balance of their work from research to teaching without loss of professional prestige, younger faculty members can take on more research.
Patterns like this match what I’ve seen as the head of a think tank full of scholars of all ages. There are many exceptions, but the most profound insights tend to come from those in their 30s and early 40s. The best synthesizers and explainers of complicated ideas—that is, the best teachers—tend to be in their mid-60s or older, some of them well into their 80s.

That older people, with their stores of wisdom, should be the most successful teachers seems almost cosmically right. No matter what our profession, as we age we can dedicate ourselves to sharing knowledge in some meaningful way.
A few years ago, I saw a cartoon of a man on his deathbed saying, “I wish I’d bought more crap.” It has always amazed me that many wealthy people keep working to increase their wealth, amassing far more money than they could possibly spend or even usefully bequeath. One day I asked a wealthy friend why this is so. Many people who have gotten rich know how to measure their self-worth only in pecuniary terms, he explained, so they stay on the hamster wheel, year after year. They believe that at some point, they will finally accumulate enough to feel truly successful, happy, and therefore ready to die.
This is a mistake, and not a benign one. Most Eastern philosophy warns that focusing on acquisition leads to attachment and vanity, which derail the search for happiness by obscuring one’s essential nature. As we grow older, we shouldn’t acquire more, but rather strip things away to find our true selves—and thus, peace.
At some point, writing one more book will not add to my life satisfaction; it will merely stave off the end of my book-writing career. The canvas of my life will have another brushstroke that, if I am being forthright, others will barely notice, and will certainly not appreciate very much. The same will be true for most other markers of my success.

What I need to do, in effect, is stop seeing my life as a canvas to fill, and start seeing it more as a block of marble to chip away at and shape something out of. I need a reverse bucket list. My goal for each year of the rest of my life should be to throw out things, obligations, and relationships until I can clearly see my refined self in its best form.

And that self is … who, exactly?
Last year, the search for an answer to this question took me deep into the South Indian countryside, to a town called Palakkad, near the border between the states of Kerala and Tamil Nadu. I was there to meet the guru Sri Nochur Venkataraman, known as Acharya (“Teacher”) to his disciples. Acharya is a quiet, humble man dedicated to helping people attain enlightenment; he has no interest in Western techies looking for fresh start-up ideas or burnouts trying to escape the religious traditions they were raised in. Satisfied that I was neither of those things, he agreed to talk with me.
I told him my conundrum: Many people of achievement suffer as they age, because they lose their abilities, gained over many years of hard work. Is this suffering inescapable, like a cosmic joke on the proud? Or is there a loophole somewhere—a way around the suffering?

Acharya answered elliptically, explaining an ancient Hindu teaching about the stages of life, or ashramas. The first is Brahmacharya, the period of youth and young adulthood dedicated to learning. The second is Grihastha, when a person builds a career, accumulates wealth, and creates a family. In this second stage, the philosophers find one of life’s most common traps: People become attached to earthly rewards—money, power, sex, prestige—and thus try to make this stage last a lifetime.

The antidote to these worldly temptations is Vanaprastha, the third ashrama, whose name comes from two Sanskrit words meaning “retiring” and “into the forest.” This is the stage, usually starting around age 50, in which we purposefully focus less on professional ambition, and become more and more devoted to spirituality, service, and wisdom. This doesn’t mean that you need to stop working when you turn 50—something few people can afford to do—only that your life goals should adjust.
Vanaprastha is a time for study and training for the last stage of life, Sannyasa, which should be totally dedicated to the fruits of enlightenment. In times past, some Hindu men would leave their family in old age, take holy vows, and spend the rest of their life at the feet of masters, praying and studying. Even if sitting in a cave at age 75 isn’t your ambition, the point should still be clear: As we age, we should resist the conventional lures of success in order to focus on more transcendentally important things.
I told Acharya the story about the man on the plane. He listened carefully, and thought for a minute. “He failed to leave Grihastha,” he told me. “He was addicted to the rewards of the world.” He explained that the man’s self-worth was probably still anchored in the memories of professional successes many years earlier, his ongoing recognition purely derivative of long-lost skills. Any glory today was a mere shadow of past glories. Meanwhile, he’d completely skipped the spiritual development of Vanaprastha, and was now missing out on the bliss of Sannyasa.

There is a message in this for those of us suffering from the Principle of Psychoprofessional Gravitation. Say you are a hard-charging, type-A lawyer, executive, entrepreneur, or—hypothetically, of course—president of a think tank. From early adulthood to middle age, your foot is on the gas, professionally. Living by your wits—by your fluid intelligence—you seek the material rewards of success, you attain a lot of them, and you are deeply attached to them. But the wisdom of Hindu philosophy—and indeed the wisdom of many philosophical traditions—suggests that you should be prepared to walk away from these rewards before you feel ready. Even if you’re at the height of your professional prestige, you probably need to scale back your career ambitions in order to scale up your metaphysical ones.
When the New York Times columnist David Brooks talks about the difference between “résumé virtues” and “eulogy virtues,” he’s effectively putting the ashramas in a practical context. Résumé virtues are professional and oriented toward earthly success. They require comparison with others. Eulogy virtues are ethical and spiritual, and require no comparison. Your eulogy virtues are what you would want people to talk about at your funeral. As in He was kind and deeply spiritual, not He made senior vice president at an astonishingly young age and had a lot of frequent-flier miles.
You won’t be around to hear the eulogy, but the point Brooks makes is that we live the most fulfilling life—especially once we reach midlife—by pursuing the virtues that are most meaningful to us.
I suspect that my own terror of professional decline is rooted in a fear of death—a fear that, even if it is not conscious, motivates me to act as if death will never come by denying any degradation in my résumé virtues. This denial is destructive, because it leads me to ignore the eulogy virtues that bring me the greatest joy.
How can I overcome this tendency? The Buddha recommends, of all things, corpse meditation: Many Theravada Buddhist monasteries in Thailand and Sri Lanka display photos of corpses in various states of decomposition for the monks to contemplate. “This body, too,” students are taught to say about their own body, “such is its nature, such is its future, such is its unavoidable fate.” At first this seems morbid. But its logic is grounded in psychological principles—and it’s not an exclusively Eastern idea. “To begin depriving death of its greatest advantage over us,” Michel de Montaigne wrote in the 16th century, “let us deprive death of its strangeness, let us frequent it, let us get used to it; let us have nothing more often in mind than death.”
Psychologists call this desensitization, in which repeated exposure to something repellent or frightening makes it seem ordinary, prosaic, not scary. And for death, it works. In 2017, a team of researchers at several American universities recruited volunteers to imagine they were terminally ill or on death row, and then to write blog posts about either their imagined feelings or their would-be final words. The researchers then compared these expressions with the writings and last words of people who were actually dying or facing capital punishment. The results, published in Psychological Science, were stark: The words of the people merely imagining their imminent death were three times as negative as those of the people actually facing death—suggesting that, counterintuitively, death is scarier when it is theoretical and remote than when it is a concrete reality closing in.

For most people, actively contemplating our demise so that it is present and real (rather than avoiding the thought of it via the mindless pursuit of worldly success) can make death less frightening; embracing death reminds us that everything is temporary, and can make each day of life more meaningful. “Death destroys a man,” E. M. Forster wrote, but “the idea of Death saves him.”
Decline is inevitable, and it occurs earlier than almost any of us wants to believe. But misery is not inevitable. Accepting the natural cadence of our abilities sets up the possibility of transcendence, because it allows the shifting of attention to higher spiritual and life priorities.
But such a shift demands more than mere platitudes. I embarked on my research with the goal of producing a tangible road map to guide me during the remaining years of my life. This has yielded four specific commitments.

JUMP

The biggest mistake professionally successful people make is attempting to sustain peak accomplishment indefinitely, trying to make use of the kind of fluid intelligence that begins fading relatively early in life. This is impossible. The key is to enjoy accomplishments for what they are in the moment, and to walk away perhaps before I am completely ready—but on my own terms.
So: I’ve resigned my job as president of the American Enterprise Institute, effective right about the time this essay is published. While I have not detected deterioration in my performance, it was only a matter of time. Like many executive positions, the job is heavily reliant on fluid intelligence. Also, I wanted freedom from the consuming responsibilities of that job, to have time for more spiritual pursuits. In truth, this decision wasn’t entirely about me. I love my institution and have seen many others like it suffer when a chief executive lingered too long.
Leaving something you love can feel a bit like a part of you is dying. In Tibetan Buddhism, there is a concept called bardo, which is a state of existence between death and rebirth—“like a moment when you step toward the edge of a precipice,” as a famous Buddhist teacher puts it. I am letting go of a professional life that answers the question Who am I?
I am extremely fortunate to have the means and opportunity to be able to walk away from a job. Many people cannot afford to do that. But you don’t necessarily have to quit your job; what’s important is striving to detach progressively from the most obvious earthly rewards—power, fame and status, money—even if you continue to work or advance a career. The real trick is walking into the next stage of life, Vanaprastha, to conduct the study and training that prepare us for fulfillment in life’s final stage.

SERVE

Time is limited, and professional ambition crowds out things that ultimately matter more. To move from résumé virtues to eulogy virtues is to move from activities focused on the self to activities focused on others. This is not easy for me; I am a naturally egotistical person. But I have to face the fact that the costs of catering to selfishness are ruinous—and I now work every day to fight this tendency.
Fortunately, an effort to serve others can play to our strengths as we age. Remember, people whose work focuses on teaching or mentorship, broadly defined, peak later in life. I am thus moving to a phase in my career in which I can dedicate myself fully to sharing ideas in service of others, primarily by teaching at a university. My hope is that my most fruitful years lie ahead.
WORSHIP
Because I’ve talked a lot about various religious and spiritual traditions—and emphasized the pitfalls of overinvestment in career success—readers might naturally conclude that I am making a Manichaean separation between the worlds of worship and work, and suggesting that the emphasis be on worship. That is not my intention. I do strongly recommend that each person explore his or her spiritual self—I plan to dedicate a good part of the rest of my life to the practice of my own faith, Roman Catholicism. But this is not incompatible with work; on the contrary, if we can detach ourselves from worldly attachments and redirect our efforts toward the enrichment and teaching of others, work itself can become a transcendental pursuit.
“The aim and final end of all music,” Bach once said, “should be none other than the glory of God and the refreshment of the soul.” Whatever your metaphysical convictions, refreshment of the soul can be the aim of your work, like Bach’s.
Bach finished each of his manuscripts with the words Soli Deo gloria—“Glory to God alone.” He failed, however, to write these words on his last manuscript, “Contrapunctus 14,” from The Art of Fugue, which abruptly stops mid-measure. His son C.P.E. added these words to the score: “Über dieser Fuge … ist der Verfasser gestorben” (“At this point in the fugue … the composer died”). Bach’s life and work merged with his prayers as he breathed his last breath. This is my aspiration.
CONNECT
Throughout this essay, I have focused on the effect that the waning of my work prowess will have on my happiness. But an abundance of research strongly suggests that happiness—not just in later years but across the life span—is tied directly to the health and plentifulness of one’s relationships. Pushing work out of its position of preeminence—sooner rather than later—to make space for deeper relationships can provide a bulwark against the angst of professional decline.
Dedicating more time to relationships, and less to work, is not inconsistent with continued achievement. “He is like a tree planted by streams of water,” the Book of Psalms says of the righteous person, “yielding its fruit in season, whose leaf does not wither, and who prospers in all he does.” Think of an aspen tree. To live a life of extraordinary accomplishment is—like the tree—to grow alone, reach majestic heights alone, and die alone. Right?
The secret to bearing my decline—to enjoying it—is to become more conscious of the roots linking me to others. If I have properly developed the bonds of love among my family and friends, my own withering will be more than offset by blooming in others.
When I talk about this personal research project I’ve been pursuing, people usually ask: Whatever happened to the hero on the plane?
I think about him a lot. He’s still famous, popping up in the news from time to time. Early on, when I saw a story about him, I would feel a flash of something like pity—which I now realize was really only a refracted sense of terror about my own future. Poor guy really meant I’m screwed.
But as my grasp of the principles laid out in this essay has deepened, my fear has declined proportionately. My feeling toward the man on the plane is now one of gratitude for what he taught me. I hope that he can find the peace and joy he is inadvertently helping me attain.
— by Arthur C. Brooks in the July 2019 issue of The Atlantic, pp. 66–76. Brooks was president of the American Enterprise Institute from 2009 until 2019. He will become a professor of public leadership at Harvard this month. His most recent book is Love Your Enemies.
My own retirement from teaching at our local college was not followed by the angst or despair that many experience, because I immediately got to teach at a sister school in China. I returned to China for at least part of each year three more times. The challenges and joy of teaching – especially in another culture – kept me busy and happy. And Barry and I traveled.
Back at home, I volunteered, joined book clubs, and put energy into becoming an outrigger canoe paddler – and competed. I love being on the ocean, and the challenge and camaraderie of being part of a team. This last regatta season has been fantastic. Undefeated, we won gold for all of our races here on our island – beating our arch-rival each week. When we competed in the Hawaii State Races at the beginning of this month, we earned a bronze medal: my first Hawaii State medal. We were a unified, happy bunch.
From right: Audrey, Wanda, Sandy, Diane, Jolyne, and me
However, this Labor Day weekend, I’m at home – not, as usual, at the Queen Lili’uokalani Races in Kona. This year, no one invited me to be in a crew. In the past, I would have been devastated about not being included. However, I’m being philosophical about being “left behind.”
The reality is that the 18-mile race under a hot sun is hard. The last time I got to paddle the Queen Lili, I was a desperation choice for a team whose members were 10-15 years younger than I. I was willing, and I worked my hardest. We did water changes in the ocean waves far from shore; the crew switched paddling sides every 19-21 strokes, not the 10-12 that I’d practiced — and I thought I might die! When we finally got to shore, I collapsed in the shade. The crew did earn a silver medal, so at least I felt that I hadn’t held them back, but that win wasn’t from a unified crew. It was not all that much fun.
So today at home, I got to paddle in a double-hull outrigger at sunrise. I wasn’t chosen to stroke; two Novice B girls were. The reality is that they are decades younger and stronger than I am and are learning the newest recommended stroke. I was in seat two and got an excellent workout. And I got to go to coffee afterwards with paddlers I usually just see on the beach or in another canoe. In my own way, I’m working on being more like Bach than Darwin. And I’m having fun.
Will you be a Darwin or a Bach? It’s probably not too soon to be thinking about it.
“The skies were clearing, the remaining clouds scudding fast across the blue as Watergate Bay stretched endlessly ahead. We followed the beach, too fragile to face the up and down of the clifftop. The wide expanse of sand lay pristine and empty beyond the restaurants and cafés. The only person ahead came into focus as an old man with two spaniels. He stopped to speak as we passed.
‘Are you walking the coastal path?’
‘Some of it. To Land’s End at least.’
‘I’ve always wanted to do that … just walk for days and days.’
‘Then do it. Just pack a rucksack and do it now. You never know how long your fetch will be–depends on the wind'” (p. 129).
From: The Salt Path by Raynor Winn
“The uplifting true story of the couple who lost everything and embarked on a journey of salvation across the windswept South West coastline. …” Google Books
“Did you know that bees need a water source?” asks Laryssa, an artistic young Servas guest who at the time we met her kept bee hives on the roof of the building where she rented her apartment in Philadelphia. That winter, her bees froze to death; then she moved to Hawaii. She is a great source for bee information. August 17 was National Honeybee Day in the U.S. In recognition of the fact that bees are extremely necessary for us, here are some of Laryssa’s ideas to keep bees cool in the summer and ideas from others of what else we can do to help bees.
Laryssa says, “Bees gather water and bring it to the hive in order to cool it down. They don’t drink water because nectar is mostly water. When bees bring water to the hive, they spray it onto the frames in the hive. Other bees fan their wings. This essentially creates air conditioning that cools the hive. The baby bees are very sensitive to the temperature, so the hive cannot get too warm or too cold.
Alternative ways to create a water source for bees are to float wine corks in a container of water, or to drape a towel over the side of the container and let some of the towel touch the water. The problem with these bee water sources is that you’re creating a breeding ground for mosquitoes, so it can be tricky.
Or if you have a pool, you can put a small towel over the pool edge so that it slightly touches the water. Then the bees at least won’t drown trying to get to the water. They definitely do not want salt water, but they’ll take whatever they can find.
Here is what Carol Ann said recently on Nextdoor (a community web site):
“Bee Water Cooler!
Aloha All ~
After the first fire [we’ve had over 10,000 acres burn this summer on Maui], I noticed bees hovering around our pool. I learned that these bees are tasked with bringing water back to the hives. I rescued many from drowning as they fell into the pool. Unfortunately, some also drowned.
The ‘aina [Hawaiian term literally means ‘that which feeds’] is very dry now and thousands of acres have been destroyed by fires, so every day the numbers of thirsty bees are increasing.
I researched how best to provide bees with water (so I could stop playing lifeguard!).
Please consider providing our honey-making friends the water they need (and keep them from drowning in your pool). All you have to do is get a shallow pan (pictured here is a plastic pan used under potted plants), fill it with gravel and a few larger stones, add water and VOILA – you have a bee water cooler! The bees need to be able to stand on the gravel to drink, so don’t cover the stones all the way.
Friend Mary’s Bee Water Cooler
Carol Ann continued, “I set my Bee Water Cooler next to the corner of the pool where they were already drinking in an attempt to lure them away from this dangerous (for them) spot.
P.S. These bees are not aggressive and I have not been stung once.”
My simple bee water cooler
Bees land on Bob’s swimming suit when he hangs it out to dry.
CNN reminds us, “People owe a lot to bees — namely, many of the foods we enjoy, like strawberries, avocados and broccoli. The US Department of Agriculture estimates that every 1 in 3 bites of food exists because of animal pollinators, and bees lead the charge.”
Photo from CNN article.
What each of us does matters – for the bees and more.
It’s likely a Bee Water Cooler would help bees where you live.
Do you have other good ideas to help bees? Please share them.
“Did you know that poor diets kill millions worldwide? Diets lacking whole grains and fruit and high in processed meats, trans fats, and sugary drinks may be responsible for one-fifth of all deaths. That makes poor diet the biggest risk factor in the world.”
This conclusion is from a Global Burden of Disease study tracking dietary factors from 1990 to 2017 in 195 countries, conducted by researchers at the Institute for Health Metrics and Evaluation, University of Washington, Seattle, and published in The Lancet. Seen in BottomLine Personal, July 15, 2019, p. 13.
Today as I was volunteering and getting to share the latest in humpback whale information at the Maui Ocean Center, one group – a mom and her four daughters – seemed particularly interested. Most people at the Ocean Center come to see the many beautiful fish and other sea creatures, and I get to say a few facts as they pass by. But for this particular group, I got to tell about why the humpbacks don’t eat while they are in Hawaii, how the male humpback whales have the most complex acoustical display of any in the animal kingdom, and more. Since I could hear a slight accent, I asked the mom and girls where they were from — Saudi Arabia! Uncovered, unescorted, all speaking English well (and of course, Arabic, and they are learning French); the mom says that the women drive; the girls are learning guitar too, and tomorrow, they are taking hula lessons at their hotel. The mom said that life in Saudi Arabia isn’t really as it is portrayed in the news. I asked if they were afraid of traveling in the U.S. They said, “No.” They are having a wonderful time and find everyone friendly. They see the sensational news as just the news. I would have loved getting to know them.
That seeking out of people, especially ones from cultures much different from his own, is what Rick Steves shares in his book Travel as a Political Act, which offers many significant insights. For instance, in describing his time in Iran, Rick Steves notes,
“It’s not easy finding a middle ground between the ‘Great Satan’ and the ‘Axis of Evil.’ Some positions (such as President Ahmadinejad denying the Holocaust) are just plain wrong. But I don’t entirely agree with many in my own government, either. Yes, there are evil people in Iran. Yes, the rhetoric and policies of Iran’s leaders can be objectionable. But there is so much more to Iran than the negative image drummed into us by our media and our government.
I left Iran impressed more by what we have in common than by our differences. Most Iranians, like most Americans, simply want a good life and a safe homeland for their loved ones. Just like my country, Iran has one dominant ethnic group and religion that’s struggling with issues of diversity and change–liberal versus conservative, modern versus traditional, secular versus religious. As in my own hometown, people of great faith are suspicious of people of no faith or a different faith. Both societies seek a defense against the onslaught of modern materialism that threatens their traditional ‘family values.’ Both societies are suspicious of each other, and both are especially suspicious of each other’s government.
When we travel–whether to the ‘Axis of Evil’ or just to a place where people yodel when they’re happy, or fight bulls to impress the girls, or can’t serve breakfast until today’s croissants arrive — we enrich our lives and better understand our place on the planet. We undercut groups that sow fear, hatred, and mistrust. People-to-people connections help us learn that we can disagree and still coexist peacefully.
Granted, there’s no easy solution, but surely getting to know Iranian culture is a step in the right direction. Hopefully, even the most skeptical will appreciate the humanity of 70 million Iranian people. Our political leaders sometimes make us forget that all of us on this small planet are equally precious children of God. Having been to Iran and met its people face-to-face, I feel this more strongly than ever” (pp. 192-193).
Wherever you are, find someone of a different culture–listen, reflect, and learn. Talk to people with accents; you are likely to be glad when they share something of their lives.
If you can’t go traveling tomorrow, get Rick Steves’ Travel as a Political Act.
Happy traveling; happy reading. Aloha, Renée
What’s important to this young man? What brings him joy & sorrow? What do we have in common? It would be interesting to find out. Photo by POOYAN ESHTIAGHI on Unsplash
Banner photo: Rick Steves with schoolgirls in Iran.
“A child of an Evangelical Friends Sunday school, at an early age I was both born again and schooled in pacifism. While I don’t rightly know how it all fit together theologically, I know that the World War II veterans and their wives grieved what they understood as their necessary service as they loved us into loving Jesus.
As I was coming of age and studying theology, I found myself seeing the ways that war (even “just war”) becomes necessary when we neglect the things that make for peace. I was stunned to learn about the voyage of the MS St. Louis, a ship filled with Jewish Germans seeking asylum in the United States; it was turned away, leaving its refugees to return to Europe and Nazis’ terror (ultimately several European countries received the passengers that we denied). There were things we coulda‐woulda‐shoulda done that would have prevented the Holocaust, things that would have prevented the need for what I’d been taught was a necessary war. Pacifism, I learned, must be proactive and intensely active.
In more recent years I’ve spent many nights praying with my feet in Ferguson, Mo., and beyond. I’ve seen the police state wage war on the people; tasted tear gas; heard the beat of the batons; watched the entrenched, systemic racism up close and personal. As we call for nonviolent resistance, we too often fail to recognize that violence is already present.
In this light, platitudes for peacemaking sound more like acquiescence with evil and have no rightful place. Pacifism, it seems, is a position of privilege more so than justice. And yet as we watch the rise of the alt‐right (essentially Nazi 2.0), I’m finding myself rethinking it all yet again.
Back in the “righteous war” of the European theatre, we defeated one man and his regime with the best of American war‐making tools (or so the story is told). Success was declared, and decades of relative prosperity awaited those heralded as victors. Because our victory was militaristic and focused on one man’s empire, we never addressed what propelled the mass of people to support the madness. Make no mistake, most German folk went along (“it’s a job,” “it’s the law,” “I have to feed my family”), and many actually supported the regime. We never addressed the white supremacist ideology that undergirded the Nazi agenda, the same ideology upon which our nation was founded.
Likely we didn’t address it because it was too close to our own. In the midst of our warring, Jim Crow was having a field day back here at home. After the war, in the era of relative prosperity, the question was raised as to whether the prosperity belonged to everyone or just white folk. Slowly (with huge sacrifice by Black leaders) some doors opened. But even then white folk never really talked about race and ethnicity. We shared metaphors that allowed us to pretend that everyone is white (melting pot, salad bowl, color‐blind) while maintaining a system of goods and services that were never shared.
Refusing to address the underlying values of the Third Reich (white capitalist patriarchy), we have been destined to relive them. We have a president who recently called a Black woman (his former aide) a “dog,” welcomed the white nationalist folk to the White House Lawn, and has continued to refuse to return hundreds of Brown‐skinned children to their parents. All the while his base cheers wildly and his party stands behind him. In vivid and horrifying detail we are seeing the fruit of the poisonous taproot that we failed to address when we laid the blame for the Holocaust at the feet of a single contorted human. The blame then, and now, belongs with an underlying value system that elevates and dehumanizes in binary categories.
Pacifism is not passive: it is that active work of looking at the deepest causes of violence. Pacifism is a call to address violently oppressive power structures, not a judgment of the response by the oppressed. Pacifism is proactive and militant and actively disrupting [my emphasis].
Had we (white folk) embraced pacifism, we might have engaged the work necessary to identify and unlearn the racism that is suffocating us all. We might have found the courage to atone for our nation’s most original sins.
Instead we are reviving them.” — Katherine Hawker, St. Louis, MO, April 1, 2019
“On an individual level, the human condition changed day by day, even hour by hour, and while you were soaking in self-pity over a misfortune, you might miss an opportunity for a redeeming triumph.
And for every act of inhumanity, the species managed to commit a hundred acts of kindness; so if you were the type to brood, you would be more sensible if you dwelt on the remarkable goodwill with which most people treated others”
— Dean Koontz By the Light of the Moon quoted in The Sun, August 2018, p. 48.
Parker J. Palmer, American author, educator, and activist who focuses on issues in education, community, leadership, spirituality and social change, notes:
“I have thought often and painfully of the education I received — in some of the best colleges in this country — about the history of the Third Reich. I was taught by good historians, some of them award-winning. But I was taught the history of Nazi Germany in such a way that I felt as if all of that murderousness had happened to another species on another planet.
Parker J. Palmer image from Wikipedia
My teachers were not Holocaust revisionists. They weren’t saying it didn’t happen. They taught the statistics and the facts and the theories behind the facts, but they presented them at such objective arm’s length that the inwardness of the events was never revealed to me. All was objectified and externalized, and I ended up morally and spiritually deformed as a consequence.
There are two things that I failed to learn from my history courses on Nazi Germany — things that I should have learned, and did learn painfully in later years. One was that the very community I grew up in, on the North Shore of Chicago, had its own fascist anti-Semitic tendencies. I grew up in Wilmette, Illinois, and if you were a Jew in the Chicago area, you didn’t live in Wilmette. You didn’t live in Evanston or Kenilworth, either, because there was fascism at work. I should have been taught that. Had my life been connected with history in that way, it would have helped me understand my own time and place, and my own involvement in the same evil. Without that knowledge, there was no way for me to grow morally.
The second, even more deeply inward thing I didn’t learn is that there is within me, in the shadow of my soul, a little Hitler, a force of evil that, when the difference between you and me gets too great, will order me to kill you off. I won’t do it with a bullet or a gas chamber but with a category, a word that renders you irrelevant to my universe: ‘Oh, you’re just a [fill in the blank].’”
(“The Grace of Great Things,” September 1998, quoted in The Sun, June 2019, p.46).
Palmer doesn’t even mention the separation of Poles, Italians, Chinese . . . , but especially the Blacks and Whites in Chicago, where I was born and lived until I was four years old, when my family moved downstate. I returned after I had earned my Bachelor’s degree – at Southeast Missouri State College in Cape Girardeau, birthplace of Rush Limbaugh. I don’t remember any Blacks or Latinos in any of my classes in the late 1960s – although that has changed now.
After graduation, my first teaching job was in an inner-city Chicago public high school. The students were smart enough; the educational opportunity, not so great. My ninth-grade students read at anywhere from a first- to a tenth-grade level – all in the same classroom. In that first year, with minimal training and no experience, I taught classes in English, history, geography, and EMH (Emotionally & Mentally Handicapped – not a really good label for anyone). The administrators thought I was a good teacher because my students stayed in my classroom. I probably worked harder than I ever have, before or since; I loved my students, but I wasn’t well qualified, and many of the students had huge challenges. Several of my girls were pregnant or already had babies; at 14, they said they wanted someone finally to love them.
One of my biggest shocks as a teacher was when we were covering WWII in my history class; several of the students said they had never heard of our U.S. bombing of Hiroshima and Nagasaki!!! They didn’t know about the U.S. use of atomic bombs that struck mainly civilians and had medical ramifications for generations after the attacks. Of course, they learned about it in my class.
If we, in the U.S., cannot admit to how things have been, how can we prevent them from happening again? If we can see only positive aspects of the U.S., how can we change the bad aspects? In Michael Moore’s movie, Where to Invade Next, he looks at great practices in other countries: Italians get lots of vacation time; French public school children serve each other at small round tables and practice conversation as they eat healthy, several-course lunches; in Germany, students are taught the bitter truth about the Holocaust and the Third Reich.
The rise of hateful voices in the U.S. would not come as such a surprise if we had really been paying attention to fringe groups and learning why they believe and act as they do.
One group working to provide accurate history and insights that will help us make good choices is –
Facing History and Ourselves, a non-profit empowering teachers and students to think critically about history and to understand the impact of their choices. <https://www.facinghistory.org>.
The Facing History curriculum transforms schools (and people).
“There are acts that oppose the flow of life and growth and human dignity. They must be dealt with courageously,” said Stephen R. Schwartz in “The Prayer of the Body III.”
Much is good about our world, but much needs to change.
Moral growth is essential: for ourselves and our nation.
“Someplace in the world, somebody is making love and another a poem. Elsewhere in the universe, a star manyfold the mass of our third-rate sun is living out its final moments in a wild spin before collapsing into a black hole, its exhale bending spacetime itself into a well of nothingness that can swallow every atom that ever touched us and every datum we ever produced, every poem and statue and symphony we’ve ever known–an entropic spectacle insentient to questions of blame and mercy, devoid of why.
In four billion years, our own star will follow its fate, collapsing into a white dwarf. . . .
But until that day comes, nothing once created ever fully leaves us. Seeds are planted and come abloom generations, centuries, civilizations later, migrating across coteries [communities] and countries and continents. Meanwhile, people live and people die–in peace as war rages on, in poverty and disrepute as latent fame awaits, with much that never meets its more, in shipwrecked love.
I will die.
You will die.
The atoms that huddled for a cosmic blink around the shadow of a self will return to the seas that made us.
What will survive of us are shoreless seeds and stardust.”
"Shanghai Street Stories" by Sue Anne Tay
Originally from Singapore, Sue Anne Tay has lived in the U.S., England, and now China. She is a talented street photographer who gets the stories behind her photos. You are likely to love her posts.
Check out the terrific programs for sustainable living. Go experience living on a kibbutz for a week or more.
Pick Up America
This is an old blog, but it is an example of what one person can do with a good idea. On the road for 24 months, Davey Rogner and friends have walked across the U.S. picking up litter and making people more conscious.
Ruth and her husband Danny taught at ZAFU last year and are now at Xiamen University; Ruth records their experiences.
Where was Johnny and how is Captain Rob doing?
Check out Captain Rob’s sail blog. He and Johnny were in the Solomon Islands (famous for JFK’s PT boat and, previously, cannibalism; now the islands are known for their beautiful coral reefs and great surfing).