The stigma about mental-health conditions remains rampant, even among those who have access to world-class care. Jeffrey Lieberman, M.D., argues that if we understand the breakthroughs of 21st-century psychiatry, the barrier to seeking treatment will fall.

Four decades ago, when my cousin needed treatment for her mental illness, I steered her away from the most prominent and well-established psychiatric facilities of the time, fearing they might only make things worse for her. Today, I wouldn’t hesitate to send her to the psychiatric department of any major medical center. As someone who has worked on the front lines of clinical care, I’ve seen firsthand the sweeping progress that has transformed psychiatry; sadly, not everyone has been able to benefit from it.

Jeffrey A. Lieberman, MD, is the Lawrence C. Kolb Professor and Chairman of Psychiatry at the Columbia University College of Physicians and Surgeons and Director of the New York State Psychiatric Institute. He holds the Lieber Chair for Schizophrenia Research in the Department of Psychiatry at Columbia and serves as Psychiatrist in Chief at New York Presbyterian Hospital-Columbia University Medical Center. Formerly the President of the American Psychiatric Association, he is a member of numerous scientific organizations and in 2000 was elected to the National Academy of Sciences Institute of Medicine. He lives with his wife and two sons in New York City.

Editor: Nadeem Noor

Shortly after I became chair of psychiatry at Columbia University, I was asked to consult on a 66-year-old woman named Dr. Kim.* She had been admitted to our hospital with a severe skin infection that seemed to have gone untreated for a long time. This was puzzling: Dr. Kim was both educated and affluent. She had graduated from medical school and, as the wife of a prominent industrialist, she had access to the best health care.

I quickly discovered why a psychiatrist had been called to see a patient with a skin infection. When I tried to ask Dr. Kim how she was feeling, she began to shout incoherently and make bizarre, angry gestures. When I remained silent and observed her, she talked to herself—more accurately, she talked to nonexistent people. Because I could not engage her in conversation, I decided to speak with her family. The next day, her husband and adult son and daughter reluctantly came to my office. After much cajoling, they revealed that shortly after Dr. Kim graduated from medical school, she had developed symptoms of schizophrenia.

Her family was ashamed of her condition. Despite their wealth and resources, neither Dr. Kim’s parents nor her husband sought any kind of treatment for her illness; instead, they decided to do whatever they could to prevent anyone from discovering her diagnosis. They sectioned off her living quarters in a wing of their spacious home and kept her isolated whenever they had guests. Despite her having received a medical degree, practicing medicine was completely out of the question. Until she developed the skin rash, Dr. Kim rarely left the property. Her family tried over-the-counter remedies, hoping for relief. But when the rash became infected and rapidly began to spread, they were frightened and called in the family doctor. When he saw her torso dotted with abscesses, he implored the family to take her to the hospital, where she was diagnosed with a severe staph infection.

I repeated back to them what they had told me—that for the past 30-some years, they had conspired to keep their wife and mother shut off from the world to avoid public embarrassment. They unabashedly nodded their heads in unison. I told them that their decision to withhold treatment was both cruel and immoral—though, tragically, not illegal—and I urged them to let us transfer her to the psychiatric unit when she had been medically cleared so that she could be treated. They refused.

They informed me that even if Dr. Kim could be successfully treated, at this point the resulting changes would be too disruptive to their lives and their position in the community. They would have to explain the reason why she suddenly began to appear in public after such a long absence—and who knows what she herself might say or how she would behave in such circumstances? The Kims perceived the stigma of mental illness as so oppressive that they would have this once intelligent, otherwise physically healthy woman remain untreated and incapacitated, her brain irreversibly deteriorating, rather than face the social consequences of acknowledging her mental illness.

A few short generations ago, the greatest obstacles to the treatment of mental illness were the lack of effective therapies, unreliable diagnostic criteria, and an ossified theory of the basic nature of the conditions. Today, the single greatest hindrance to treatment is not any gap in scientific knowledge or shortcoming in medical capability, but social stigma. This stigma, unfortunately, has been sustained by the legacy of psychiatry’s historic failures and its enduring reputation—no longer justified—as the stepchild of medicine.

Though we live in a time of unprecedented tolerance of different races, religions, and sexual orientations, mental illnesses—involuntary medical conditions that affect one out of four people—are still regarded as a mark of shame, a scarlet letter C for crazy.

I encounter this shame and sensitivity nearly every day. Many of the patients seen by our faculty opt to pay out of pocket rather than use health insurance, for fear of their treatment becoming known. Other patients choose not to visit our doctors at the Columbia Psychiatry Clinic, preferring a private medical office without any signs indicating the specialty practiced inside.

A few years ago, I gave a talk about mental illness at a luncheon in Manhattan to raise funds for psychiatric research. Afterward, I circulated among the attendees—smart, successful, and outgoing people who had all been personally invited to the event by a woman I'll call Sarah, a prominent socialite whose schizophrenic son had committed suicide. They chatted over poached salmon and Chablis, openly praising Sarah's selfless efforts to raise awareness—though none of them admitted any direct experience with mental illness themselves. Instead, mental illness was treated like the genocide in Sudan: an issue deserving public attention, but one distant and removed from the patrons' own lives.

Several days later, I received a call. One of the attendees, an editor at a publishing company, asked if I could help her. It seemed that she had lost interest in her job, had trouble sleeping, and frequently became emotional, even tearful. Was she having a midlife crisis? I agreed to see her and eventually diagnosed her as suffering from depression. But before she made the appointment with me, she insisted I keep it completely confidential—adding, "Please don't tell Sarah!"

In the next few weeks, I received more calls from Sarah’s invitees seeking help. Over time, fully half of the people who attended this event reached out to me, and none wanted Sarah to know about their problems. 

It’s finally time to end this stigma. And, now, there’s good reason to think we can. One key reason for this psychiatric renaissance is my profession’s adoption of a pluralistic view of mental illness. Contemporary psychiatry embraces neuroscience, psychopharmacology, and genetics—but also wields psychotherapy to understand a patient’s unique problems and history.

But the hardest part for so many people—like the Kims—is simply believing that a psychiatrist can actually help. In the media, psychiatrists are still portrayed as shrinks spouting psychobabble. And although in the past, psychiatry sometimes served as a safety net for med students seeking to solve their own problems or feeling unable to compete in other disciplines, it now vies with other specialties for top trainees. 

In 2010, we were trying to recruit a talented young doctor named Mohsin Ahmed, who was considering applying to our psychiatry program. He had completed his doctorate in neurobiology under a celebrated neuroscientist who proclaimed him one of the most talented graduate students he had ever had. Ahmed was a prized recruit and had his pick of any program. Although he had signaled his interest in psychiatry, it was clear he harbored some reservations.

When the results of matching graduating medical students with training programs came out, I was thrilled to see that he had selected psychiatry after all and was coming to Columbia. But midway through his first year, he started having second thoughts and told our training director he wanted to switch to neurology. 

I promptly arranged to meet with him. He told me he was fascinated by the daunting complexities of mental illnesses but disappointed by the clinical practice of psychiatry. “We still base diagnoses on symptoms and assess the effectiveness of treatments by observing the patient rather than relying on laboratory measures,” he lamented. “I want to feel that I have some real sense of why my patients are sick and what our treatments are doing in their brains to help them.”

How could I argue? Ahmed's concerns were common and valid. But I explained that even though we were still bridging the gap between psychological constructs and neurobiological mechanisms, it was entirely possible to embrace both, as many world-class researchers have done. The most exciting psychiatric research in the 21st century is linked to neuroscience, and all the leaders in our field now have some kind of biological or neurological training. At the same time, there is still steady progress in psychotherapy. Cognitive-behavioral therapy, one of the most effective forms of psychotherapy for depression, has recently been adapted by its pioneer, Aaron Beck, to treat the negative symptoms of patients with schizophrenia.

Another promising area of research is genetics. It is virtually certain that no single gene alone is responsible for any particular mental illness, but through genetic techniques we are starting to understand how certain patterns or networks of genes confer levels of risk. These genetic signatures will lead to more precise diagnosis of patients, as well as earlier identification of those vulnerable to severe mental illness.

Actress Glenn Close's family provided one of the first examples of the application of genetics in psychiatry. In 2011, her sister Jessie and nephew Calen, both of whom have a mental illness, volunteered for a study led by Deborah Levy, a psychologist at Harvard. An analysis of Jessie's and Calen's DNA revealed that they share a rare genetic variant resulting in extra copies of the gene that produces an enzyme that metabolizes the amino acid glycine, which has been implicated in psychotic disorders. This meant that Jessie and Calen were deficient in glycine because their bodies overproduced the enzyme that metabolizes it. When Levy gave them supplemental glycine, their symptoms markedly improved—it was like watching a patient's fever decline after giving him aspirin. When they stopped taking the supplemental glycine, their symptoms worsened.

Using a genetic test on this mother and son pair to identify a specific drug that could ameliorate their mental illness was one of the first applications of personalized medicine in psychiatry. This technology holds the promise of revolutionizing the diagnosis and treatment of mental illness. 

In light of these advances, Ahmed’s generation would be the one to finally close the gap between psychodynamic constructs and biological mechanisms—and given his own abilities and passions, he could lead the way. Ahmed is now conducting an innovative project on the pathophysiology of psychotic disorders. Ironically, despite maintaining his focus on neuroscience research, he has shown himself to be a most empathic and skilled psychotherapist. To my mind, he personifies the 21st-century psychiatrist. No longer an alienist, shrink, pill-pusher, or reductionist neuroscientist, he has become a compassionate and pluralistic psychiatric physician.


Courtesy: Psychology Today