Monday, January 6, 2014

Medicalizing Misery and the Loss of Social Suffering


Happy Pills
The real money always lay not in convincing sick people they were ill, but well people they were sick

During the 1990s, the UK Royal College of Psychiatrists and Royal College of General Practitioners, generously enabled by the pharmaceutical industry, campaigned to educate clinicians and the public about depression and its treatment. The Defeat Depression campaign was pithily encapsulated in the slogan: “Depression. Treat it. Defeat it.” Today depression is so overdiagnosed that a recent study found over 60% of those diagnosed did not meet the diagnostic criteria for a major depressive episode, rising to over 90% of those over the age of 65. A recent article in the BMJ claims that the DSM-5, by expanding the definition of depressive illness to include reactions to bereavement, will further erode the concept of normal sadness, leading to more people being incorrectly diagnosed and treated for depression. If not depressive illness, what are we experiencing? The answer: social suffering.

The changing face of depression

From 1992 to 1996, the Royal Colleges of General Practitioners and Psychiatrists led the Defeat Depression campaign to raise public awareness of depression as a medical illness and to educate professionals to diagnose and treat it. A UK survey of public attitudes at the time found that 85% believed counseling to be effective but were opposed to antidepressants, and almost 80% believed antidepressants were addictive. By 2003, traces of metabolites of the antidepressant Prozac would be found in the water supply. By 2012, 50 million antidepressant prescriptions were issued in England alone, with 1 in 6 people in some areas using antidepressants. This is despite the proportion of people suffering from depressive illness at any one time remaining roughly stable at about 6%. How did this sea change occur? So successful were the public education campaign and other sources of information that many people have now come to believe that depression is caused by a chemical imbalance in the brain that is corrected by antidepressants. As the sociologist Nikolas Rose notes, we have come to recode our moods in terms of neurotransmitters, and our identity in terms of what he calls ‘the neurochemical self’. In addition, primary care doctors, who bear the brunt of this endemic minor misery, were heavily pushed into screening for depression, and reimbursed on quality measures that included documenting depression scores on self-report rating scales like the PHQ-9. The result was an increase in new antidepressant prescriptions throughout the 1990s. This was coupled with the lack of availability of psychological treatments, with waiting lists in some areas as long as two years.

Although the detection of hitherto unnoticed individuals suffering extreme mental anguish explains some of the rise in depression diagnoses and antidepressant prescriptions, in the early 2000s something strange happened. In the UK at least, the number of new prescriptions for antidepressants was not increasing, but the overall number of prescriptions was. That is to say, it wasn’t that more and more people were being prescribed antidepressants; it was that those who were on them stayed on them. Depressive illness, once thought of as a temporary aberration, was becoming chronic, sometimes lifelong. Antidepressants themselves may have played some role in this. There is some suggestion that individuals on long-term antidepressants can become depressed as a result of the chemical changes occurring in their brains, the so-called tardive dysphoria syndrome. Part of this is likely psychological too – in the narrative of chemical imbalances, taking an antidepressant is just like taking insulin for diabetes. If you stop taking your antidepressant, won’t your chemical imbalance come back? Further, by locating the source of distress inside a broken brain or twisted molecules, we have disempowered individuals from taking charge of their lives. The reality is, as I discussed previously, that the notion that depression or any mental illness is caused by chemical imbalances is an oversimplification at best and a myth at worst. That is not to say there is no biological basis to depressive states, but this despair cannot be meaningfully reduced to aberrant brain chemistry.

The currency of depression

Today the word depression provides a currency of validation. When we see a doctor feeling deflated, tired, sleepless, joyless, or sad, a diagnosis of depression is like a badge of honor for the wounded warrior: it confers recognition that we have suffered. There is something reassuring in hearing an ‘expert’ tell us they know what is wrong and they know how to help. When you feel like you are drowning, a prescription for a pill is like a lifeline that keeps you afloat. And even though you know the reasons you feel so terrible, reasons you are constantly reminded of, it becomes convenient to believe that something as simple as a chemical imbalance is at the root of it all. Even if you realize that the remedy may lie in deeper psychological work, having more money, a better relationship, a better economy, better behaved children, a sense of self-worth, or even taking better care of yourself, those things are unavailable or not forthcoming. We make do with what’s on offer. And what’s on offer is antidepressants.

Incidentally, whilst states of despair and misery have existed throughout time and space, depression has only recently gained currency, and only in the West. In China, whilst the same syndrome of major depressive disorder is recognizable, it has no cultural cachet; instead the diagnosis of neurasthenia is made. In Zimbabwe, anxious and depressive ruminations are captured by the term ‘kufungisisa’ (which is included in the DSM-5), meaning ‘thinking too much’. In Somalia, there is no linguistic concept of depression; the closest thing is ‘Qu’lub’, which translates as ‘the feelings a camel has when its friend dies’. In Latvia, the term ‘nervi’, or damaged nerves, captured the endemic suffering following the fall of the Soviet Union. Depression is as much a cultural concept as it is a biological one.

Whither Social Suffering?

In the anthropological literature, there is a concept that describes the misery individuals experience in contexts where they are powerless, where things have no prospect of improvement, and where the feelings are entirely understandable. That concept is social suffering. In the medical and psychiatric literature, little attention has been paid to it, yet it afflicts much of the world’s population, the inhabitants of unstable and uncertain sociopolitical landscapes. Although the term is usually applied to those in developing countries, it seems apt to describe many patients in the developed world as afflicted by social suffering too.

It is not surprising that psychiatrists have neglected social suffering. There isn’t a pill for it. We may empathically bear witness to the suffering of others, but that only goes so far. It is not surprising either that these individuals are incorrectly diagnosed as depressed, and that this misdiagnosis increases in those over 65, who live in a world that has no use for, and sees no value in, those who no longer work. It’s much easier for everyone to transform this endemic misery into depressive illness, because if it’s depressive illness then it may benefit from medical or psychological remedy. Those in primary care feel an enormous pressure to ‘do something’, and prescribing a pill is easy enough. It temporarily satisfies both patient and clinician that something has been done. The reality is that by locating the problem in the individual, we conveniently ignore the wider social, economic, and political forces that oppress, and yes, depress. These are problems that need resolution through public policy, not through pills or psychotherapy.

Blame the DSM-5

The Diagnostic and Statistical Manual of Mental Disorders is certainly beleaguered with problems, but the claim that its description of major depression is responsible for overdiagnosis is just not true. The DSM is irrelevant to clinical practice, especially in primary care, because nobody uses it. People who felt depressed following bereavement were diagnosed and treated for depression long before the DSM-5, and they will continue to be. Sadly, the DSM-5 does not include a term that would aptly describe much of what we see without labeling it a mental illness. That term is demoralization. Many of the patients I see are demoralized by their situation – a relationship they cannot escape, a diagnosis of terminal illness, dashed career hopes. This is not an illness, but it deserves recognition.

As discussed above, most people diagnosed with depression don’t meet the criteria for DSM major depressive disorder. This isn’t because doctors can’t tell the difference between normal sadness and depressive despair. It’s because, when you have 10 minutes to see someone, and they tell you they’re depressed, and they want your help, and you feel pressure to ‘do something’, you are going to tell them they’re depressed and you’re going to give them what they want. And in the UK at least, there was a whole campaign that told you that’s exactly what you were supposed to do.

The medicalization of endemic misery, and its medical treatment, is our own doing. It takes time and resources away from the management of those with more serious physical and mental maladies, and it profits neither clinicians nor patients. The one actor that has profited, and handsomely at that, is the pharmaceutical industry. The real money always lay not in convincing sick people they were ill, but well people they were sick. And so we continue to mask social suffering in the guise of depression.

Monday, May 13, 2013

Does DSM-5 Matter? Yes, but not to Psychiatrists.

It is an open secret that most psychiatrists do not use the DSM.




Read the news and you may be forgiven for thinking there is some violent fervor about the release of DSM-5. Its arrival is apparently “long awaited” and “hotly anticipated.” Petitions denounce it. Organizations note their “concern”. Lobby groups have called it unsafe, unfit for purpose. Campaigns for the abolition of psychiatric diagnoses appear. Survivor groups issue premature pronouncements of psychiatry’s death. I’ve been wondering: who exactly has been awaiting its arrival? It’s not researchers: The National Institute of Mental Health has made it clear that the psychiatric research agenda has moved on from categorical diagnoses. It’s not clinicians: most psychiatrists do not even use the DSM to make diagnoses. It’s not insurance companies: even in the US, most payers do not accept the DSM for billing purposes. It’s certainly not patients: a new system of classification will not improve patient care or revolutionize treatment. So then, what’s all the fuss about? Does the release of DSM-5 even matter? The answer is yes, but not as a psychiatric document. What makes the DSM so pernicious is that it is a cultural document whose influence transcends not only psychiatric practice but also the Western civilization from which it originates. Each revision of the DSM rescripts and reimagines how we make sense of our experiences, reinterprets what thoughts, feelings and behaviors are socially sanctioned, and ultimately what it means to be human.

Psychiatrists Don’t Use the DSM

One of the fiercest criticisms of DSM-5 is that it will expand the borders of mental disorder and that psychiatrists will thus wrongly diagnose and treat people as mentally ill. Allen Frances, former chair of the DSM-IV task force, voices this criticism most ardently. He comes across as a silly old man nursing a narcissistic injury (he was excluded from DSM-5), throwing his toys out of his pram. He makes the assumption that psychiatrists use the DSM to make diagnoses. It is an open secret that most psychiatrists in fact do not! If most psychiatrists used the DSM constructs, we would not see an epidemic of bipolar diagnoses in children as young as two. In fact, most of the patients who come to me with the label of bipolar disorder do not meet the criteria for the DSM-IV bipolar disorder construct. Schizoaffective disorder, supposedly a rare diagnosis, is possibly the most common diagnosis I see in the charts of inpatients, which is deeply suspicious. More systematic studies show that the diagnoses patients garner have little to do with the DSM. For example, one study in the Veterans Administration system suggests that 25% of schizophrenia diagnoses did not meet DSM criteria, and that psychiatrists often made up diagnoses so veterans could get benefits. In private systems, fraudulent diagnoses are given because diagnosis determines remuneration.

Most psychiatric diagnoses are not made by psychiatrists but in primary care. Most primary care physicians do not know the diagnostic criteria for most of the common mental disorders as described in the DSM, but that does not stop these labels from being used. Even for some common mental disorders, most psychiatrists do not know the diagnostic criteria by heart, and even if they do, they take no heed. Take posttraumatic stress disorder as an example. This is a common mental health diagnosis. The diagnostic criteria for the construct are many and complex. I would wager that over 90% of psychiatrists do not know the diagnostic criteria verbatim. Even if they did, one criterion is that an individual responded to a traumatic event with “fear, helplessness, or horror.” I do not know of any psychiatrists who ask their patients whether they responded in one of these three legitimated ways of responding to severe adversity, and if they did, their patients would probably be puzzled. Having no immediate reaction to rape, or feeling anger or shame instead of “fear, helplessness or horror”, will not preclude a psychiatrist from making a PTSD diagnosis, but if you stayed faithful to the DSM-IV, PTSD could not be diagnosed. For depression, the bereavement exclusion is going, and there has been concern that people will now be diagnosed with depression following bereavement. It is already happening and has been happening for years.

That is not to say that diagnostic assessments are never useful, but this goes beyond the DSM. Diagnosis is important when it comes to identifying whether a morbid mental state is secondary to a medical condition. For example, I have treated patients who present with ‘depressive psychosis’ that is due to myxedema coma, or who are behaving bizarrely but have a metabolic encephalopathy. It is also important to identify whether the individual has fried their brain with drugs such as methamphetamine, ‘bath salts’, or ‘spice’, which can lead to florid perceptual distortions and erratic behavior.


DSM diagnoses no longer guide treatment

Perhaps diagnosis informed treatment once upon a time, but today it does so only partly, if at all. Individuals have experiences of mental life that cause distress and lead them to behave in ways others feel are bizarre or un-understandable. As a result they may see a psychiatrist. The psychiatrist can engage in the semiotic act of making a diagnosis. In order to do that, he engages in a precursor semiotic act, which involves recoding individual experience and observable mental phenomena or behaviors into ‘symptoms’ and ‘signs’ respectively. If he stops there, he can, and often does, ‘treat’ the patient. If those ‘symptoms’ and ‘signs’ are regarded as psychosis, the patient will end up on a neuroleptic. If the patient is seen as ‘depressed’, he may end up on a serotonin reuptake inhibitor. If he appears ‘anxious’, perhaps a benzodiazepine will be prescribed. If ‘mood swings’ are observed, lithium or an anticonvulsant will be the order of the day. Many patients have experiences that are recoded into a bewildering combination of depression, elation, irritability, psychosis, and anxiety, and may end up on an ‘antidepressant’, anticonvulsant, neuroleptic, and benzodiazepine; if there is no response, this will be interpreted as ‘treatment-resistance’ and another medication will be added! I would like to say that this is a caricature of American psychiatry, but it appears to be the rule rather than the exception. This is not how I practice, and I am fortunate to have thoughtful trainers, but outside the academic ivory tower and in the community, rampant polypharmacy is the rule. This happens in spite of the diagnostic constructs in the DSM, not because of them. Sometimes response to these cocktails is even used to support a diagnosis, in a backward logic. In this way the DSM is largely irrelevant to the practice of psychiatry. Systems of psychiatric classification are relevant in the consultation room more through their influence on cultural consciousness and the experience of the self than through any use in guiding diagnosis and treatment.

Redefining Personhood

Throughout history there have always been individuals who have been regarded as mad, or, as Philippe Pinel called it, suffering from ‘mental alienation.’ For Pinel, to be mad meant one’s “character, as an individual of the species is always perverted; sometimes annihilated”. Without reason, man is no different “from the beasts that perish”. It is not madness itself that causes one to relinquish personhood, but being identified as mad. Psychiatrists, as the moral arbiters of mental life, are thus also the high priests of personhood. Psychiatric diagnoses today extend far beyond ‘mental alienation’ and include a wide array of behaviors and experiences regarded as deviant. The removal of homosexuality from the psychiatric canon is the best example of how personhood was restored to individuals previously regarded as pathological and deranged. For DSM-5, ‘gender identity disorder’ is being replaced with ‘gender dysphoria’. This parallels homosexuality being replaced with ego-dystonic homosexuality before being expunged altogether. So whilst transgender individuals will no longer be regarded as mentally ill, it is a mental illness if you feel shit about it. A step towards reclaiming personhood perhaps, but the trans person’s response to an intolerant society is still seen as pathological.

Far away from the locked psychiatric unit and the consultation room, the DSM exists in classrooms, in libraries, on the internet, and in the popular imagination. Each diagnosis at once hijacks personhood and redefines it. With the disappearance of Asperger’s syndrome, a cohort of socially awkward computer geeks have been disenfranchised, forced to rejoin the ‘neurotypicals’ or be redefined as autistic. The DSM provides the script for how we should respond to trauma, the narrative of resilience replaced with one of vulnerability. It is a veritable ‘how-to’ for those wanting to be anorexic or bulimic and join ‘pro-ana’ communities. It conveniently rewrites the ways we can be seen as ill, seek professional help, gain compensation, or even claim moral exculpation for our behavior. From Portland to Port Moresby, the DSM unites us with a global template for being mentally ill. In doing so, the DSM not only seeks to describe the landscape of psychopathology, it actively shapes it. Whilst removing the bereavement exclusion for diagnosing major depression may not change the psychiatrist’s attitude, it does refashion the cultural expectations of what constitutes acceptable misery. What is pernicious about the DSM is not how it shapes psychiatric practice directly – it doesn’t. Instead, it at once erodes the personhood of those seen as ‘mad’, and for everyone else creates a cultural expectation that we are all sick and in need of treatment.

Wednesday, February 6, 2013

Prescribing Masturbation: An Idea Whose Time Has Come (Again)

Should vibrators be available on prescription and covered by healthcare plans?




Masturbation is the most ubiquitous expression of good sexual health. Despite this, not a moment of my medical training was devoted to the topic. Whilst masturbation is no longer explicitly considered a disease entity or a cause of disease, the idea that masturbation is pathological or immoral persists. For example, childhood masturbation continues to be called ‘gratification disorder’ by pediatricians, whilst the endurance of the term masturbation itself, which literally means defilement by hand, harks back to a 19th century notion that the act was ‘Forbidden by God, [and] despised by men.’ Nevertheless medicine has enjoyed a complex relationship with masturbation, regarding it both as a cause of disease and as a cure. Whilst the evidence for the therapeutic uses of masturbation is not robust, I can’t help but feel that since medicine has done so much to malign masturbation, we now have a moral obligation to promote it. The time has come once more for us to prescribe masturbation.

The Medicalization of Masturbation

Whilst medical men had remarked upon masturbation on occasion since the time of Hippocrates, the belief that masturbation was not only a vice but also a disease did not take hold until the 18th century. With the publication of Onania in 1759, the stage was set for masturbation to establish itself as a pathological process that posed a looming threat to humanity. The belief in the deleterious effects of masturbation on human health was not unanimous; however, such was the popularity of this text that there appeared to be a sweeping consensus on the dangers of masturbation. By the 19th century, masturbation had become associated with consumption, scrofula, feeble-mindedness, insanity, a diminution of vision, and syphilis. If Onania presented masturbation as both a moral vice and a cause of maladies physical and spiritual, the Swiss physician Samuel Tissot expunged all discussion of the moral and spiritual and secured the place of masturbation as the cause of many maladies, on a “scientific” basis. In addition to the usual complaints experienced by men, Tissot proclaimed that female masturbators could also experience hysteria, jaundice, ulceration and prolapse of the uterus, and clitoral rashes. His ‘scientific’ theory was that masturbation led to disease first through the unnatural loss of ‘la liqueur séminale’ and secondly through the mental activity required, which effectively damaged the brain. Quite how the former ‘scientific’ theory explained the ill effects of masturbation in women is unclear.

Antimasturbation fervor was at its greatest in America. Treatments including cold baths, tying of the hands, and even applying carbolic acid to the penises of young boys were all enthusiastically used in the treatment of this ‘disgusting and revolting’ act. The Michigan-based physician Alonso Garwood documented the case of an orphan boy from a poorhouse, whom he raised as his own and who had a particularly severe compulsion to masturbate, noting in the Northwestern Medical and Surgical Journal:

After using every moral means in my power, I tried cold bathing, restricting his diet to plain unstimulating food, whipping him as hard as I dared to without injuring the child, blistered his penis till it was all over raw, and as a dernier resort tied his hands. All these efforts were entirely abortive; whilst his penis was raw, he indulged as much as ever, and did not seem to regard the soreness. And when his hands were tied, he would bring on a seminal discharge by friction against his clothes, between his thighs, or between his abdomen and bed clothes, and at last he obtained such command over the abdominal, perineal and glutial muscles, in connection with the force of imagination, that he could produce a discharge sitting on a chair in my presence when there was no motion perceptible.

The desire of self gratification appeared to be constantly in his mind, and I am convinced that he would forgo any and everything else, even death itself, before he would quit the practice. Giving up all hopes of effecting a cure, and his presence becoming so disgusting and repulsive, I laid the case before the superintendents of the county and the board of supervisors, accompanied with the request, that they would destroy the indentures, and receive him again as a pauper, which they did accordingly.

Incidentally, although clitoridectomies were occasionally performed to curtail excessive female sexuality, the available medical literature almost entirely refers to males. It is almost as if the notion that women could obtain sexual pleasure without penetration was too offensive to male sensibilities.

Female masturbation did not go unremarked, however. Even in Onania, the author remarked that “to imagine that Women are naturally more modest than Men, is a Mistake” and noted that female masturbators suffer from imbecility, fluor albus [leucorrhoea], hysteric fits, barrenness, and a “total Ineptitude to the Act of Generation itself.” The psychiatrist Richard von Krafft-Ebing, in his Psychopathia Sexualis, cites the case of two sisters who masturbated from childhood, regarding them as ‘most revolting’ and noting that hot iron treatment to the clitoris failed to temper their enthusiasm for the practice. He further recounts the case of a woman who started masturbating in childhood, noting with horror that she ‘continued to practice masturbation when married, and even during pregnancy. She was pregnant twelve times.’ Krafft-Ebing believed that ‘since woman has less sexual need than man, a predominating sexual desire in her arouses a suspicion of its pathological significance.’ The Swiss psychiatrist Eugen Bleuler is said to have smelt the hands of one of his schizophrenic female patients for evidence of masturbation, presumably believing in a causal connection.


Epidemiology of Masturbation

Given the prevalence of masturbation, and the rarity of many of the conditions ascribed to it, it is not surprising that the view that masturbation caused so many ills did not go unchallenged. The Scottish surgeon John Hunter was among those to point out that one would expect impotence to be far more common if it were truly caused by masturbation. More recent epidemiological surveys shed light on the frequency of masturbation in various populations.

In a British study of 11,161 participants, 73% of men and 36.8% of women reported masturbating in the 4 weeks prior to telephone interview. In striking contrast, whilst men who reported masturbation were less likely to report vaginal sex during the same period, women were more likely to report vaginal intercourse. Both men and women reporting same-sex sexual partners were more likely to report masturbation. Similarly, in a study of Australian adolescents aged 15-18, 58.5% of boys reported ever having masturbated, compared with 38.3% of girls. Further, a US cross-sectional survey of adolescents aged 14-17 found that whilst prior masturbation increased with age in females, recent masturbation did not. This contrasted with males, where 67.6% of 17-year-olds reported recent masturbation, compared with 42.9% of 14-year-olds. The gender disparity in the epidemiology of masturbation is not new. The Kinsey studies, which were the first to systematically outline sexual behavior in men and women, found that whilst 92% of men reported masturbation to the point of orgasm at some point in the life course, only 58% of women did. This prevalence figure for women was still higher than expected during the sexually conservative 1950s, and this finding was one among many that made the publication on sexual behavior in women far more controversial and condemned than the earlier publication delineating sexual behavior in the human male. According to data pooled from the online dating website OkCupid, from a sample of 78,200 users, 21% of Jewish women claimed to have never masturbated, compared with 9% of Jewish men. In contrast, 7.5% of women identifying as agnostic claimed to have never masturbated, along with 5% of agnostic men. Further, 18% of Muslim women and 17% of Hindu women reported having never masturbated, far higher than their male counterparts of the same religion. In sum, there exists a significant gender disparity in masturbation, and it persists across cultural bounds.

Masturbation on Prescription?

Since the time of Hippocrates, the treatment of hysteria in women has involved massage of the genitalia by the physician or midwife. Despite this therapy, it appears that women themselves were never encouraged to bring themselves to orgasm by stimulating their own genitalia. In fact, this was something that was explicitly discouraged on the grounds that it was deleterious to health, as discussed above. Quite why the hands of the physician or husband should be therapeutic, but the woman’s own hands toxic to her own genitalia, is inexplicable. Inexplicable but for the implication that women were incapable of arousing themselves without men. The social historian Rachel P. Maines talks of the androcentric model of sexuality, which she notes has been the predominant model in the history of sexuality. The androcentric model recognizes preparation for penetration, penetration, and male orgasm as the constituents of sexual activity. Female orgasm, though expected, is incidental and irrelevant. Save for a few reports by medical men, female masturbation is but a footnote in the history of masturbation, and female masturbators are caricatured as morbid, pathological, and deranged.

By the end of the 19th century, the first medical vibrator was devised, which effectively reduced the effort and manpower needed to manually stimulate the genitalia of ‘hysterical’ women. It seems likely that not only was female sexual pleasure not a goal of electromechanical stimulation, it was not even conceived of as a side-effect. If orgasm was the result of penetration in the prevailing worldview, it was not going to be achieved in this way. Little did the inventors know that not only could vibrators facilitate orgasm, they would often be far superior to penetration.

Vibrators as medical devices?

Today, vibrator use is exceedingly common. In one cross-sectional study of women who have sex with women, over three-quarters reported vibrator use, and over a quarter reported use within the past three months. In another cross-sectional study of over 1,000 participants, this time males, 44.8% reported vibrator use, either in solo or partnered sexual activities, with 10% having done so in the past month. Vibrators are often recommended in the treatment of both male and female sexual dysfunction. There has been a proliferation of devices available on the market, yet there is a dearth of data on which vibrators may be best for whom. Clinical research has been particularly captivated by the move to comparative effectiveness research, which tests different interventions against one another, on multiple outcomes, in order to answer questions such as which performs better in different groups or for different conditions. Could this sort of methodology be applied to vibrators? The answer is a resounding yes, but at what cost? A multitude of questions arise. Should vibrators be registered and regulated as medical devices? Who will pay for the head-to-head comparisons of different vibrators? Should vibrators be available on prescription and covered by healthcare plans? Perhaps most concerning, do we want to risk remedicalizing masturbation and the vibrator? The answer, then, is not that vibrators should once again be medical devices and tested as such, but that we need more comparative data, in the form of Consumer Reports and other such methodologies, that can better inform women’s choices. There appears to be a relative dearth of impartial information on this topic, and that is not surprising. Even today, the notion of women’s sexual pleasure, especially without men, appears to offend our sensibilities. Recently the Mayor of Boston’s office rejected Trojan’s request for a permit to give away free vibrators in Boston’s City Hall Plaza. Whilst we may have advanced in our attitudes towards masturbation, taboo and stigma persist.


Prescribing Masturbation: the moral imperative

There is a paucity of research investigating the efficacy of masturbation as a therapeutic treatment or as a public health intervention. Although it has been suggested that promoting masturbation may reduce HIV and STI transmission, particularly in endemic regions, the evidence supporting this is weak. On the other hand, masturbation is an important expression of good sexual health, a way for individuals to acquaint themselves with their bodies, and a way to relieve stress. Given how much the medical establishment has done to demonize masturbation and denounce it as the cause of all disease and degeneration, the time has now come for us to promote it. As most men masturbate, seeking to redress the gender inequalities in masturbation would be a logical starting point. Clinicians should first seek permission to discuss the topic with women, whilst remaining culturally sensitive. They can then address any misconceptions or barriers that exist in women who do not masturbate, suggesting it as a possible activity to add to the repertoire of good sexual health. At the same time, clinicians should be mindful to explore attitudes, beliefs, and concerns about masturbation without extolling its virtues beyond the evidence base. Sexual health screenings and well-woman checks could provide opportune moments for this discussion, and education and counseling about masturbation could be incorporated into comprehensive preventive care and thus covered by health insurance plans.

Incorporating education about masturbation into healthcare will be challenging because taboos surrounding the discussion of masturbation persist. Arguments will be made that broaching this topic in a clinical consultation constitutes an unnecessary and unwanted intrusion into the personal sphere, and would be uncomfortable for patients and clinicians alike. Such criticisms are untenable. Given how ardent practitioners of the past were in denouncing masturbation as the harbinger of disease and debility, without a shred of supporting evidence, it seems perfectly reasonable that clinicians of today might respectfully enquire whether their patients would like to talk about masturbation as part of a wider discussion of sexual wellbeing. The real challenge lies not in archaic notions of sin or taboo. Rather, the obstacle to redressing gender inequalities in masturbation is the entrenched androcentric view that women either cannot or should not be capable of sexual satisfaction without penetration. Masturbation, then, is not just a tool for sexual wellbeing, but an expression of autonomy and liberation, and a challenge to the persisting attitude that women, like female orgasms, are not only incidental but irrelevant.

Thursday, January 17, 2013

The Problem with PTSD



By wedding traumatic experiences to PTSD we have de-emphasized the role of trauma in other forms of psychopathology



“The voices, they tell me they gonna kill me, and it’s my fault.”

“Sometimes, when we hear voices, they just reflect our own anxieties, sometimes they can echo things we’ve been told in the past. When the voices tell you that they’re going to kill you, does that echo anything you may have been told in the past?” I ask. 

Monique*, a 36-year-old African American woman with a long history of crack cocaine abuse and a diagnosis of paranoid schizophrenia, pauses, then fixes her gaze intently on me.

“My daddy, he molested me from when I was 6 until I left home. ‘Said he gonna kill me if I tell anyone and ain’t nobody gonna believe me anyway.” She pauses again. “I never told nobody, not my mamma, not my girlfriends, not nobody.” Monique felt a sense of relief, her secret unburdened; her experiences, hitherto deemed ‘un-understandable’ and all the more frightening for it, were now understandable to her. Yet in the 18 years that Monique had been a psychiatric patient, these facets of her mental life had not been explored at all. I’m sure she had been asked about trauma in the perfunctory way that characterizes modern psychiatric assessment, but never in a way that would allow her to make any meaningful connections between her past experiences and the current distress she lived with. For contemporary psychiatry recognizes in its diagnostics only one outcome of traumatic experience, and that is posttraumatic stress disorder.

The Invention of Posttraumatic Stress Disorder
When PTSD entered the psychiatric nomenclature in 1980, it did so at a time when psychiatry had been remedicalized, all remnants of its former psychodynamic self expunged from the official system of diagnosis and classification of mental disorders. Any mention of hysteria, neurosis, reactions, and other psychodynamic terms all but disappeared. PTSD was also unique in that it was the only psychiatric disorder in the new classification for which a cause was implied. The DSM-III was supposed to be atheoretical, foolishly attempting to be value-free. And yet the assumption was made that PTSD was caused by exposure to traumatic stressors, an assumption that has been questioned by the occurrence of symptoms of this syndrome in those who have not experienced a traumatic event. My purpose here is not to discuss the validity of the PTSD construct, but rather to suggest that, by wedding trauma to the diagnosis of PTSD, the role of trauma in other forms of psychopathology was de-emphasized. This was implicit, deliberate, and exacting. The new remedicalized psychiatry of the 1980s had no time to discuss the social world, captivated instead by the notion of broken brains, defective genes, and twisted molecules. By creating a new diagnostic category, traumatic experiences could forever be entangled with PTSD, and the rest of psychiatry could remain unencumbered by life stories. The recovered memories and multiple personality disorder debacle of the same decade seemed to confirm that there was some terrain best left untouched by psychiatry.

Trauma in the Clinic
When a patient attending for psychiatric evaluation today discloses a history of trauma, be this childhood physical or sexual abuse, rape, domestic violence, kidnapping, attempted murder, or combat exposure, the line of questioning takes a predictable turn. The patient will be bombarded with questions about whether they have nightmares or flashbacks, whether they always feel on edge, or whether there are any situations or people they avoid. It is as if these are the only types of symptoms that could possibly occur following traumatic events. This not only flies in the face of clinical experience, it also flies in the face of epidemiological studies, which show that individuals are just as likely to experience depression or anxiety following a traumatic event as they are PTSD. My own clinical experience is that even more common than the traditional symptoms of PTSD are physical symptoms – chronic unexplained pains, unexplained neurological symptoms, gastrointestinal disturbance, and so on. The effects of trauma are not so much embedded in a fractured mind as in a fractured body.

It has become increasingly uncommon for psychiatrists to consider the role of traumatic experiences in other forms of mental disorder, and the more ‘severe’ the disturbance, the less likely it is that traumatic experiences will be considered. Even when life experiences are considered in the onset of severe mental illness, these experiences are rarely engaged with, and it is rarer still for meaningful connections between these life events and the symptoms to be made. Whilst it is true that most of the research into the role of trauma in psychosis is lacking in rigor and quite frankly wanting, there is a distinction to be made between what people experience and why they experience it. Traumatic experiences seem to be non-specific to the development of mental illness, inasmuch as they are associated with a wide range of problems including, but not limited to, depression, anxiety disorders, substance abuse, personality disorders, somatoform disorders, eating disorders, and so on. How much of a causal role these experiences play is largely irrelevant in clinical practice. What is relevant is that the narratives of suffering, chaos, vulnerability, and resilience are so often interwoven with physical symptoms, delusions, hallucinations, and other experiences. The process of meaning-making between these narratives of lived experience and the ‘symptoms’ of mental and physical distress was irrevocably broken with the invention of PTSD.

Though there is no doubt that the experiences people have, even in the so-called ‘severe’ mental disorders, are often related to traumatic occurrences in the life course, I do not wish to over-emphasize the role of trauma or its psychological or physical consequences. Whilst psychiatry has done a great disservice by packaging off trauma with the diagnosis of PTSD as if it were not relevant to any other form of psychopathology, the narratives that I am privileged to hear every day are not so much narratives of vulnerability as of resilience. My initial reaction is to be amazed and to give testimony to our ability to overcome the most horrendous adversity, but the reality is that such a reaction is the product of a culture which cultivates victimhood and sees the effects of trauma as damaging, perpetual, even intergenerational. It is ironic that our society should be so concerned with the toxic effects of trauma on the one hand, whilst psychiatry seems oblivious to the meaning of trauma in the phenomenology of mental life on the other. Traumatic experiences neither explain away all psychic woes, nor are they completely irrelevant. How much meaning traumatic experiences take on should not be a matter for psychiatry or cultural pressures, but for the individual in her quest for meaning. For Monique, the recognition that her life experiences, far from being irrelevant to her current psychic crisis, were central to it made her ‘psychosis’ seem more understandable, less omnipotent, and more manageable.

*For confidentiality reasons, Monique is not a single patient, but represents a composite of different patients

Monday, December 3, 2012

The Denial of Pain and Mortality: Or, the Art of Self-Prescribing and the Philosopher’s Stone

I couldn’t help but feel that as a physician I had been trained to be an archetypal overreacher like Andrew, to fly too close to the sun, to steal fire from the Gods to help my patients cheat death, to fight the realities of existence.



“Don’t look at me! Save yourself!”

Andrew* was a 25-year-old with an imposing build, softened only by his despair and terror. Andrew was losing his mind. I didn’t have to see Andrew, and I somewhat wish I never had. I had received a call late at night from Andrew’s nurse. “You gotta give him something man, I mean, he’s freaking out and I feel really bad.”

I knew nothing about Andrew, and as he was not on the psychiatric service, it was bizarre that the nurse should be calling me; I was not responsible for his care. But something bothered Andrew’s nurse enough that he called me, and so I went to see Andrew. When I arrived, he was pacing in his room, chattering to his tormentors, his mind racing, his body unable to stop moving. He was, he told me, the worst person in the world; he would contaminate me with his evil if I dared to enter his room. He covered his eyes so that he would not destroy me with his gaze. I told him I would take my chances, convinced him to sit down, and to let me sit next to him. It was not possible to be with Andrew without wanting to cry. In his despair he believed he was dead and soon to be tortured; he was a Promethean figure being punished for his grandiosity, castigated by nameless voices that had hounded him for days. To know Andrew was to know despair. It was also to discover why the nurse had wanted me to knock Andrew out. As he said, “I feel really bad.” He didn’t say Andrew felt really bad, which of course he did. Whether aware of it or not, the nurse was not asking for an elixir for Andrew’s pain; he was asking me to treat his own.

The art of prescribing

The defining act of the physician is the ability to prescribe. This has changed in recent years as prescribing privileges have been extended to other providers, but prescribing remains central to a physician’s identity. We prescribe because it’s expected, sometimes because it’s necessary, always because it’s what we do. Prescribing goes beyond drugs to any intervention used to cure or control symptoms. Sometimes the act of prescribing includes ordering investigations with the aim of assuaging the anxiety of a patient, fending off the threat of litigation, or even palliating our own anxiety that we have missed something. Whether a prescription is necessary or not, it is often also a self-prescription. The maxim that sometimes the best medicine is no medicine at all has become anathema. We intervene because not to intervene confronts us with the pain of living, the experience of suffering, the reality of death, and most of all our own helplessness. Such is the hidden curriculum of medical school, in which death (and by extension suffering) is failure: if our patient dies, or suffers, then we have failed. Failure is unacceptable. Intervention, with prescription as the fundamental medical intervention, suppresses feelings of failure.

Prescribing as Identity

I always half-joke with my medical students that they were far more empathic before they ever started medical school. Some nod their heads knowingly, others pretend to agree with me, a few openly scoff at me. By the end of their time with me, nearly all tell me they hadn’t realized just how right I was: they are far less empathic now and never even noticed it happening. Two years of endless lectures on every aspect of the body in health and disease had changed the way they saw the world, but so incrementally that they didn’t quite realize it was happening. The systematic gutting of humanity that occurs is implicit, deliberate, and unspoken. It occurs because it makes the practice of medicine more tolerable. It is but one more defense against the pain of living that we are confronted with on a daily basis.

In the same way, prescribing is never explicitly taught as a defensive maneuver that can be employed to avoid engaging with the patient or to assuage one’s own feelings of helplessness. Prescribing is instead presented as the therapeutic act that defines the identity of the physician. This is how my students are introduced to it.

I remember a student of mine who was assigned a patient who had attempted suicide and was doing therapeutic work with him. I told him that as it was his patient, he could use whatever treatment he wanted. My student was keen to treat the patient’s ‘depression’ with both interpersonal therapy and an antidepressant. He had discussed different medications with the patient at great length, and they had agreed on an SSRI. My own understanding of the patient was that he was lonely, and I did not think that an antidepressant would be beneficial. I asked my keen student whether he still wanted to use a medication. He told me, “Well I’ve never prescribed anything before, so I would [like to]!” The act of prescribing was not about treating the patient’s loneliness; it was part of the student’s transition to becoming a doctor.

Unfortunately, he had not mastered the basic concepts of pharmacology, and when I reminded him that these agents are not benign and rattled off the long list of side effects, as well as the possible psychological complications of antidepressants, he seemed a lot less confident in his identity as a physician manqué.

The denial of death

Five years ago, when I was still a medical student, I was constantly confronted with the institutional denial of death in clinical medicine. It was during my emergency medicine rotation, which I enjoyed immensely but during which I also began to think about quitting medical school altogether. I remember the mute patients with advanced dementia whom we referred for surgical evaluation, but more than that I remember the patients who were already dead whom we wouldn’t let go. There is something brutally demoralizing about resuscitation. Most people don’t know how many times we frantically stab patients with needles trying to get a line in. Or the way the body writhes when it is defibrillated. Most people don’t know what osteoporotic ribs sound or feel like when they crack under the pressure of chest compressions. In physiology, death is the endpoint of life. In clinical practice, it is the failure to prolong life. Once, it was only the physician who felt this sense of failure or humiliation, an alien feeling to someone used to succeeding. Today, intervention, and successful intervention at that, is expected.

The Expectation to Intervene

We have a religious faith in the capacity of modern medicine to cure and to regenerate. Medicine is the Philosopher’s Stone of our age. We are asked to imagine a world without pain, without suffering, without sickness, and without death. In the US, where medicine is a multibillion-dollar profit-driven complex, we have cultivated an image of the promise of cure. For those conditions where no cure exists, one is on the horizon. America no longer promises only the pursuit of happiness, but the pursuit of immortality. With this pursuit comes the public’s expectation that the physician will intervene. A clinic or ER visit that does not result in a prescription is a disappointment, and leads to a disgruntled “the doctor didn’t do anything.” Even at the very end of life, I am troubled by just how many patients and their families insist on futile interventions, including resuscitation, and by the institutional reluctance to discuss the inevitability of death.

The Archetypal Overreacher

As I sat with Andrew, he told me how he had believed he could be better than God and push the limits of human capability. It was this belief that was his downfall. Now he was tormented, and there was no escape, because he was already dead and this was his hell. I couldn’t help but feel that as a physician I had been trained to be an archetypal overreacher like Andrew, to fly too close to the sun, to steal fire from the Gods to help my patients cheat death, to fight the realities of existence. The price was an intolerable, enduring sense of failure. My training was supposed to have deadened the visceral human response to Andrew’s suffering. It was supposed to have taught me to intervene with a prescription, to deftly avoid engaging with Andrew on an emotional level and simultaneously retain feelings of mastery and omnipotence over his suffering. My training had failed me. I knew he needed a prescription to sleep. But he also needed something that was not available on prescription. He needed a hug. This simple act of human connection and kindness is one of the few things proscribed in medical institutions. Instead I put my arm on his shoulder. I realized why we so often prescribe without listening, why we engage in doing to the patient rather than being with him: because to be with him is to be confronted with failure.

*For confidentiality reasons, Andrew is not a single patient but represents a composite of a number of patients

Thursday, November 29, 2012

Suckling Pigs, Stray Dogs, and Psychiatric Diagnoses



By defining the unknown, and classifying psychopathology, we bring an element of knowability to that which we do not know.

In The Order of Things, Michel Foucault, the great French philosopher, cites a ‘certain Chinese encyclopedia’ that notes ‘animals are divided into: (a) belonging to the Emperor, (b) embalmed, (c) tame, (d) suckling pigs, (e) sirens, (f) fabulous, (g) stray dogs, (h) included in the present classification, (i) frenzied, (j) innumerable, (k) drawn with a very fine camelhair brush, (l) et cetera, (m) having just broken the water pitcher, (n) that from a long way off look like flies’. The contemporary reader may look upon this classification of animals with amusement and bewilderment, or contempt and derision. Yet the same level of sophistication bedevils our classification of morbid mental life today, as cataloged in the Diagnostic and Statistical Manual of Mental Disorders, the official psychiatric bible. This cultural document, its current permutation a product of fin-de-siècle America, holds up emotions, behaviors, and beliefs deemed pathological as if they exist in external nature as timeless, universal ‘things’, and aims to define and classify them. Attempting to avoid any reference to the causal basis of the disorders listed within, it also ignores the values, meanings, and assumptions embedded within the system of classification, and the context in which these ‘disorders’ are experienced. About to enter its fifth revision, the DSM has been embroiled in debates regarding which elements of mental life should be recognized as morbid, which disorders should bow out, and where the disorders that remain should be classified. These debates miss the point. Without acknowledging the inherent meaning-making in any system of classification, and the context in which mental life is defined as disorder and then categorized, the book is not only not valid, it is not useful.

A Brief History of Psychiatric Classification

Attempts at classifying elements of morbid mental life are not new. The first known system dates back to India around 1400 BCE and the Ayurveda, which regarded aberrant mental states as resulting from different forms of possession. In the West, Hippocrates, Aretaeus of Cappadocia, and Galen also tried their hand at categorizing madness. Nor were they the only ones. Throughout the intervening years a number of physicians, philosophers, and theologians delineated their own classifications of mental disorder. It was not until the 19th century, however, that classifications of mental disorder came to rest on careful observation of apparent causes, clinical course, and prognosis to differentiate the forms of madness. Esquirol, more than anyone, ushered in this era, delineating five types of madness: lypemania (melancholy), mania, monomania, dementia, and idiocy. It was the German psychiatrist Emil Kraepelin, though, who made the most detailed study of the classification of mental disorder; it was his life’s work, and he refined his classification over eight editions of his famous Textbook of Clinical Psychiatry. The final classification consisted mostly of medical problems, including traumatic brain injury, epilepsy, syphilis, intoxication, infection, thyroid disease, and abnormal mental states in the context of brain disease. His most enduring contribution to psychiatry was taking the “amorphous mass of madness”* that existed before him and separating it into manic-depressive insanity (which encompasses today’s depression and bipolar states) and dementia praecox (today’s schizophrenia). Dementia praecox was further divided into hebephrenic, catatonic, and paranoid types. Amazingly, Kraepelin’s classification remains the basis for our psychiatric classification some 90 years later.

Why this long tradition of classifying pathological mental states? By defining the unknown, and classifying psychopathology, we bring an element of knowability to that which we do not know. Further, we create the illusion, and indulge in the deception, that we know more about it than we do. It is only natural that humans should try to reason with unreason, to bring order to disorder, to dispassionately delineate the boundaries between sanity and madness. It is ironic, then, that this attempt at understanding and demystifying abnormal mental states should have compounded the sense of otherness and alienation endured by those experiencing mental distress.

Is depression even a mood disorder?

The DSM attempts to catalog together aspects of mental life that share some element of commonality. For example, depression and mania are classified as mood disorders, whilst generalized anxiety disorder, specific phobia, and social phobia are classified as anxiety disorders. It would seem to make intuitive sense to file major depressive disorder under the rubric of mood disorders, and generalized anxiety under the rubric of anxiety disorders. It’s in the names after all. But when we inspect more closely the experience that is labeled depression in the DSM, we discover that one does not even need to experience depressed mood to be diagnosed with depression. For many people, the experience is instead characterized by the inability to derive any joy from existence, by persistent feelings of stress and worry, endless rumination about the past, and feelings of inadequacy, self-loathing, and worthlessness. It is almost as if the experience of ‘depression’ for these individuals is summed up by thinking too much. Indeed, the concept of depression does not exist for the Shona people of Zimbabwe. Instead, what would be labeled depression in the West is called kufungisisa, which means “thinking too much”. For others the experience of depression is characterized neither by thinking too much nor by depressed mood. It is instead felt as a profoundly visceral sapping of the vital forces, of unending fatigue, heaviness, nausea, malaise, tinnitus, and unexplained aches and pains. The appetite wanes, sleep is sparse and fitful. It is an extremely physical experience. In China, the concept of depression does not exist, and despite attempts to make the diagnosis, it is not accepted. Instead, what would be understood as depression in the West is diagnosed as neurasthenia. Once regarded as an American disease, neurasthenia has today been expunged from the American psychiatric classification as if it never existed. Even the selection of a particular aspect of mental experience as the hallmark of a whole category of disorders is a cultural act, laden with assumptions.

That generalized anxiety disorder should be grouped with the other anxiety disorders rather than with depression is also more puzzling than it might appear. For the Shona people of Zimbabwe, the idea of “thinking too much” might equally be diagnosed as generalized anxiety disorder or major depression in the West. It turns out there is so much overlap between the two experiences that the idea of mixed anxiety and depression is a common one in primary care. Intriguingly, this overlap was not always the case. In the previous edition of the DSM, generalized anxiety disorder had different diagnostic criteria altogether. Instead of focusing on symptoms of worry, or constructing anxiety in cognitive terms, generalized anxiety disorder was a fear-based diagnosis, constructed in somatic terms. The experience was seen as characterized by persistent sweating, shakiness, tremulousness, being on edge, palpitations, breathlessness, a sinking sensation in the stomach, a sense of impending doom. Just as neurasthenia (a physical experience and diagnosis) was supplanted by the more psychologically experienced depression, a somatic, fear-based anxiety disorder has been supplanted by a more psychologically experienced one. As we become more psychologically minded, the way we experience distress is transformed from a somatic idiom to a psychic one.

The Problem with Psychiatric Classification

The process of psychiatric classification is woven together with the assumption that with each new edition some new truth has been discovered, a new disorder unveiled, the boundaries between mental health and mental illness more firmly delineated; that the process is the result of scientific progress. But as Foucault demonstrated throughout his life, what appear as progress in seeing the world are merely different ways of seeing; they are not necessarily better. In the pursuit of scientific progress, our system of psychiatric classification has attempted to uniformly describe the acceptable ways in which one can go mad, as if to lose one’s mind were a uniform, discrete experience discontinuous from the sadness, joy, fear, disgust, and terror we experience in our daily lives. A psychiatric classification that ignores the wider sociocultural forces at work, rather than taking them as the heart of the matter, is woefully misguided. A psychiatric classification that attempts to homogenize madness, rather than accept the enormous variation in the experience of mental distress and the process of meaning-making, has missed the point. I will continue to dutifully document my multi-axial diagnoses in my notes. But like the DSM, my notes will be most salient not for what has been written, but for what has not.

Notes
*Brockington, I. F. & Leff, J. P. (1979) Schizo-affective psychosis: definitions and incidence. Psychological Medicine, 9, 91-99.

Sunday, June 24, 2012

Chemical Imbalances and Other Black Unicorns


Like a black unicorn, we have created a dangerous mythology in promoting the idea that mental illnesses are caused by chemical imbalances.

“What do you think caused your problems?”, I asked.

“I have a chemical imbalance, a chemical imbalance, an imbalance in the brain that makes me ill.”

Sarah* had a diagnosis of bipolar disorder. Since her adolescence she had become acquainted with dark, shifting moods that meant she was sometimes uncontrollable and frenzied, and at other times found it so effortful to live that she would retire to her bed for weeks, not eating, not bathing, not sleeping. Her every waking moment was spent contriving her own end, yet even that felt too effortful.

Sarah joins an ever-growing troupe of patients who tell me that they have a chemical imbalance in their brain. Some have been told this by psychiatrists, others by their relatives, others still by mental health charities. None have heard this term from me. The notion that mental illnesses are caused by chemical imbalances is neither true nor helpful. Worse still, the idea of mental illnesses as chemical imbalances is making us ill.


A Medical or a Marketing Term?

The term ‘chemical imbalance’ is not a medical or scientific term. Indeed, a quick search of the scientific literature will show that the term is conspicuous by its absence. Despite this, patients and their families are often told by their physicians that their problems are caused by a chemical imbalance in the brain. Most pharmaceutical advertising for psychiatric drugs also tells consumers that mental illnesses are caused by chemical imbalances. The wide array of information for patients and their families on mental illness likewise frequently frames mental disorders as the result of chemical imbalances. At once so simple and yet so technical, it is easy to see why so many people find the idea that their problems are due to chemical imbalances so compelling. It provides a simple explanation at a time when individuals crave certainty, and it is packaged in the respectable veneer of pseudo-medical jargon. Make no mistake, however. There is only one reason why we have ‘learned’ that mental disorders are caused by chemical imbalances. To sell more drugs. There is one main reason, too, why doctors tell their patients their problems are due to chemical imbalances. To convince people to take these drugs.

It was supposed to be a beautiful narrative. A previously well person becomes depressed, feels too listless and tired to live. A chemical imbalance is identified as the perpetrator. The ‘chemical imbalance’ is corrected with an antidepressant, and the patient is restored to her previous self. It is a story of restitution. It is a story where medicine is the hero and bad biochemistry the villain. It is a story with no basis in reality. Instead, we have convinced individuals that they are in some way defective and in need of lifelong treatment.


Making Us Sick

When a physician prescribes an antidepressant, he cannot help but also prescribe an idea. He may not wish to prescribe the idea; indeed, he is often not aware he is prescribing the idea, but prescribe it he nevertheless does. The idea is that the problem is a chemical one, with a chemical solution. If it is a chemical problem, then it is largely outside of one’s control. The source of distress is no longer rooted in the fabric of society, in interpersonal discord, in a life story punctuated by loss, trauma, and abuse; it is located within the individual. It is located within the brain. Suddenly, the problem is no longer unemployment, widening inequality, social disadvantage, or alienation: the problem is you.

Once individuals become accustomed to dealing with their problems with psychiatric medication, they often increasingly see their emotions and life problems as outside their control. Further, they have little hesitation in medicating away emotions well within the usual scope of mental life. It is not unusual for such patients, who are a little upset, a little anxious, or angry, and mostly understandably so, to dull away these feelings with a dose of an antipsychotic or a benzodiazepine. In doing so, they undermine their coping skills and their ability to tolerate the rich array of emotions threaded into the tapestry of life.

The most troubling aspect of the message is that, instead of resilience and recovery, it is one of vulnerability and reliance. Although part of the reason antidepressants ‘work’ is that the idea provides a lifeline, a message of hope, to the individual, this is transient. Eventually, patients come to wonder, ‘If I have a chemical imbalance, won’t it come back if I stop taking this pill?’ or ‘If antidepressants are like insulin for diabetes, don’t I need to take them forever?’ Whilst antidepressant prescriptions have on the whole been rising, the number of new prescriptions for antidepressants has not been increasing year on year. This fits with epidemiological data showing that the number of new cases of depression has actually been decreasing, while the total number of people who are depressed has been increasing². What this suggests is not that more people are becoming depressed, but that fewer people are getting better. It is not so much that we are all becoming depressed, but that when we do, we stay that way. In convincing people that they have a chemical imbalance, we have disempowered them from looking at how they can change their lives for the better, and instead made them reliant on medication. As a result, instead of making people better, we have kept them sick.


The New Phrenology and the Eclipse of the Social World

Today, the majority of research into the causes of mental distress focuses on neuroimaging and genetics. There are other niche interests, including immunology, endocrinology, and proteomics, but on the whole, most research is biologically oriented and focuses on brain scanning and genes. This has come at the expense of research into the social world in which people become depressed, go manic, or have psychotic experiences.

Now, we should not ignore avenues of research that have the potential to transform our understanding and help individuals. But my contention is that, with the possible exception of dementia, not a single patient has actually benefited from any neuroimaging research. Despite billions of research dollars, much of it at public expense, not a single treatment or innovation has come out of this funding. In contrast, the finding that the relapse rate for schizophrenia was higher in families with high expressed emotion led to the development of family therapies; the finding that depression followed particular life events led to the development of interpersonal therapy; and the finding that women lacking a close confiding relationship were more likely to develop depression led to the development of befriending programs for depressed women. Yet it has become exceedingly hard to get research funding to explore further the social and environmental determinants of health. If I wanted to do a neuroimaging study of manic hedgehogs, I would not have much difficulty getting funding. If, on the other hand, I wanted to explore the role of social support in outcomes for those who have psychotic experiences, it would be an uphill battle.

It comes as no surprise that when there is a Republican administration, research exploring the social determinants of mental health dwindles, and there is more funding for biological research. The obfuscation of the wider social determinants of mental distress is deliberate. Unfortunately, we have become so obsessed with finding the elusive cause of mental illness using new technologies that we have become complicit in forgetting about the determinants of our mental health in the social world.

Like a black unicorn, we have cultivated a dangerous mythology in promoting the notion that mental illnesses are due to chemical imbalances. Whilst there is of course a biological basis to our emotions, thoughts, and behaviors, this level of explanation is unhelpful because it ignores what our feelings and experiences of living mean, and ignores the context in which we experience joy, love, anger, sadness, and fear. By convincing individuals that their problems are due to chemical imbalances, we have succeeded not only in creating a generation that has recoded its moods and feelings into neurochemicals, but also in undermining people’s ability to manage these problems themselves. Most troubling of all, the notion of chemical imbalances has transformed mental illnesses from temporary aberrations of mental state, understandable within a particular context, into permanent disorders of the self embedded in the brain.

*Sarah represents a composite of different patients and not one individual.