A History of Psychiatry: Hero or Villain?

My favourite subject at university by far was the history of minds and madness, so I thought I’d share my essay with you all. Enjoy!


The notion of madness has long been acknowledged, but advancements have repeatedly altered the way the mentally ill person is conceived, and accounts of treatment throughout history remain significantly diverse[1]. Science routinely attempts to legitimise itself by highlighting its roots, but recollection of the history of the psych sciences (i.e. psychiatry and proto-psychiatry) is complicated by sporadic progress and diverse origins[2]. Furthermore, competing biological, social and psychological schools have repeatedly rewritten the past as they have risen to the forefront[3]. In this essay, I put forth that the Whiggish perspective of a linear progression[4] is too simplistic, and that reflection is required to frame the changing epistemology and content of the psych sciences within wider social and ideological contexts. While there has been significant advancement in effective treatments, I argue that psychiatry needs to be humble in acknowledging that these originated in its past, and refrain from ‘recasting its heroes as villains’[5], as events have only come to be seen as abuses in hindsight. Furthermore, I aim to highlight the present decline in progress in conceptualising and understanding mental illness. To do this, I will expand on the themes of humoralism, confinement and moral therapy, shock therapy, the emerging pharmaceutical industry, and the classification system of the third Diagnostic and Statistical Manual of Mental Disorders (DSM-III)[6].

The emergence of humoralism caused a departure from the long-held tradition of attributing madness to supernatural and divine causes[7]. Hippocrates medicalised illness through his proposition that wellbeing was contingent on the balance of four humours (blood, phlegm, yellow bile, and black bile)[8]. It was believed that physical and mental health were equivalent, and that mental ailments could therefore be treated through physical methods such as cupping, bloodletting and induced vomiting, which would restore the humours to optimal balance[9]. This notion of physical qualities provided a naturalistic conception of madness that remained popular for much of the 17th century[10], as well as a concept of balance[11] that recent disease specificity has lost. Robert Burton endorsed humoralism in his Anatomy of Melancholy[12], whose title references the theory of melancholia: significant sadness and anxiety thought to be caused by an excess of black bile[13]. Mania, on the other hand, was described as the possession of ideas that differed from the truth[14]. In 1886, Wundt regarded these as two different dimensions[15], and thus established an early dimensional model. The notion of physical origin later became associated with reason (or the lack of it) through Cartesian dualism, but this was challenged by Locke’s tabula rasa, which was significant in proposing methods of retraining the brain[16]. Thus, while it was eventually invalidated, the long endurance of humoralism can be attributed to its relatively sophisticated structure in a time of few alternatives.

Following humoralism came the mad-doctors and institutionalisation of the 17th and 18th centuries, and the subsequent moral therapy[17]. The early beginnings of ‘the great confinement’, as Foucault refers to it, can be traced back to the foundation of the Hôpital-Général in Paris in 1656[18]. For a long time madness was seen as an individual issue, but greater awareness, population growth and ideological changes saw it ‘transform[ed]… into a social problem’[19]. These first hospitals were an attempt to deal with the impoverished, the incompetent and society’s outcasts[20]. They were not state-led, as is sometimes argued[21], which highlights that the conception of the psychiatric inpatient predated medical psychiatry[22]. Mad-doctors like Monro continued to use humoralism-inspired methods[23], and patients were often managed like animals. Foucault reasons that this went beyond physical restraint by removing the fascination and ‘voice’ of madness[24], thereby reducing it to an absence of humanity. Such was the case for James Norris, who was chained leash-like for 12 years at Monro’s Bethlehem Hospital[25]. Yet many argue that, contrary to Foucault’s claim, there was nothing ‘great’ or large-scale about confinement in the 17th century, as many favoured private asylums, and outside France institutionalisation rates were low[26]. Furthermore, despite the common interpretation that ‘the 18th century…was a disaster for the insane’[27], massive rehabilitation of institutions occurred through the Mad House Act (1828), and alongside the legends of Tuke and Pinel, through Battie[28] and Conolly[29], and the emerging moral therapy[30]. Moral therapy was based on the premise that routine and compassion could treat madness, now understood as an illness[31], and asylums were restructured and built to be places of ‘calmness… hope… [and] satisfaction…where humanity…shall reign supreme’[32]. Nevertheless, the movement failed due to an increasingly chronic and ageing patient population[33] and an inability to maintain high physician-to-patient ratios in public institutions[34]. Asylums were not simply where psychiatry was practised; they were where it was developed to treat and manage patients, and where its theories were put into practice[35]. While to Foucault asylums remained repressive[36], it was here that humoralism was shown to be ineffective and psychosurgery to be rarely useful, and it was here that optimism and early intervention were born[37]. Hence, though the asylums are considered inhumane, moral therapy conveyed a level of care not previously available.

Before moving on to the biological movement, it seems necessary to define psychiatry and recognise other schools of thought. The term psychiaterie, meaning soul and mind physician, was introduced as a medical discipline in 1808 by Johann Reil, who advocated for humane treatment and de-stigmatisation and argued that ‘we will never find pure mental, pure chemical or mechanical diseases’[38]. This idea of interrelatedness contrasts with the purely biological view of psychiatry, and with the psychoanalytic period that Shorter describes as a ‘blip’ in psychiatry’s history[39]. Influenced by Charcot’s inability to find a biological cause for hysteria, Freud’s popular 20th-century paradigm focused on talk therapy and argued that mental illness was caused by conflict and repression in the unconscious[40]. To the psychiatrists who wanted to legitimise their field as evidence-based, it was a disaster; to those who viewed psychiatry as pseudo-medical[41], it was ‘the most sophisticated sector’[42]; and from a psychological view, it was the foundation for psychotherapy. Psychoanalysis occurred in parallel with significant medical discoveries[43].

The ‘second’ biological movement (the first having involved phrenology and lobotomy) held that madness was caused by a malfunction of the brain, and its treatments were designed to shock the brain into correcting itself[44]. The medical term shock refers to states of hypotension and hypothermia[45], and attempts to produce this were made through deep sleep therapy and insulin comas (1933)[46], which were originally thought to induce seizures[47]. This was founded on Meduna’s observations that epilepsy was uncommon in schizophrenics, and of contrasting changes in the brains of epileptics and schizophrenics, all of which suggested that schizophrenia was an antagonist of epilepsy[48]. Subsequently, drugs like camphor came to be used to induce seizures, but while arguably effective at reducing psychosis they also caused residual seizures[49]. These side effects motivated the development of the Bini-Cerletti ECT machine in 1938 and the method of using electricity to treat schizophrenia[50]. The original experiment suggested a decline in symptoms, but it is criticised for being vague and likely doctored, especially as it is not popularly known that the first patient, Enrico X, later relapsed[51]. New methods have shown that the prevalence of epilepsy in schizophrenia was actually higher (1.5%) than early studies had suggested (0.13%), and support for biological antagonism declined[52]. A search for a new basis for ECT found that it was more useful for depression than schizophrenia (1940s)[53]. As is a common theme in psychiatry, ECT became a treatment for a disease it was not intended for, which calls into question the basis and legitimacy of the original experiment. However, it is now thought to be one of the most successful treatments for severe mood disorders, psychosis and schizophrenia[54], highlighting again that much of the underlying mechanism remains unknown.

Additionally, the growth of biological psychiatry demanded the development of psychopharmacology. Shorter claims that psychiatrists tend to abandon a treatment once a better one appears, and ECT was for a time largely discarded in favour of medications[55]. The beginnings of psychopharmacology were marked by the discovery of lithium (1940s) and of chlorpromazine (1951), the first antipsychotic[56]. At a time when psychoanalysis’ influence was apparent, chlorpromazine was beneficial, as it appeared to cause sedation while preserving the capacity for therapy[57]. Psychopharmacology began by using drugs whose mechanisms were not understood to treat illnesses that were not yet defined[58]; however, the results of those that worked allowed investigation into the mechanisms behind mental illness[59]. Thus chlorpromazine made it possible to explore a neural basis of schizophrenia: in 1963 it was found to increase the metabolism of dopamine[60], which led to the dopamine hypothesis (1966), that schizophrenics may have overactive dopamine pathways[61]. This changed the understanding of neural signalling from electrical impulses to ‘chemically mediated’ signals and led to the discovery of more neurotransmitters[62]. This in turn allowed breakthroughs into the mechanisms of other illnesses, other antipsychotics, and ultimately antidepressants[63]. However, it is argued that the search to discover treatments has shifted to ‘medicalizing aspects of the human condition’[64], and that in trying to progress, psychiatry has distanced itself from essential social and cultural discoveries of the past[65]. Furthermore, knowledge about pathology has often been overplayed: psychiatry’s miracle drug chlorpromazine has been labelled by antipsychiatry a ‘chemical straitjacket’[66], and research by Kirsch questions the foundations of the neural theories, attributing their results to the placebo effect[67], though there is increasing evidence in favour of the benefits of medication[68]. Regardless, psychiatry has been beneficial insofar as individuals have come to be seen as suffering from a disease ‘in the same sense that cancer or high blood pressure are diseases’[69], a step towards de-stigmatising mental illness. In summary, to evaluate the utility of medication it remains necessary to understand the influences involved in its creation, and how perceptions have changed[70].

Moreover, the line between mental illness and normality has still not been clearly drawn. As the Journal of Mental Science put it, ‘our knowledge of the mental functions of the brain is still comparatively obscure’[71]. Early attempts at taxonomy were made by Kraepelin, who abandoned the psychoanalytic focus on aetiology, and the idea of a continuum between normality and madness, in favour of descriptive pathology and the categorisation of distinct biological diseases[72]. This was replicated in the creation of the DSM-III (APA, 1980)[73], a manual commonly used to diagnose mental health disorders. The DSM, now in its fifth edition, has undergone many revisions in its attempts to satisfy whichever movement was dominant at the time. Its definitions indirectly influence society’s notion of mental illness through the inclusion of symptoms that subsequently, if not already, become perceived as abnormal; hence, the DSM has arguably created many illnesses[74]. Furthermore, despite intending to provide clear definitions to be used by trained professionals, it has regularly failed to do so, and subjectivity persists within the medical field. It is possible that this has been caused by the extension of the diagnosing role to professionals other than psychiatrists. Nevertheless, to classify a disorder requires that it be understood, but time and again history has highlighted the lack of knowledge around aetiology, and thus the categories are likely to be incongruent with the structure of mental illness[75]. Furthermore, the DSM was highly influenced by prestige, race, politics, culture and gender prejudices[76], when what makes something a disorder should instead depend on science and the level of disability it imposes on an individual[77]. There is some argument that the DSM has also been shaped by drug use: occasionally, responses to a psychoactive drug have led to distinctions between disorders or to perceived boundary changes[78]. Additionally, the purely descriptive approach has impaired the development of valid diagnostic categories due to the absence of a testable system[79]. While it may have been progressive in its time, the DSM highlights the lack of scientific progress since the conception of the DSM-III.

In conclusion, retracing the past may appear to contrast with the linear progression narrative that aligns with biological psychiatry, yet progress is rarely undeviating, and it is necessary to understand how psychiatric methods and ideas became recognised as acceptable despite their many risks[80]. I argue that the use and abuse of psychiatry, from humoralism to the asylums and more recent times, was often the result of limited knowledge, and that in the future modern methods may likewise be criticised for being ‘blinded by biology’[81]. Psychiatry has made considerable advancements in treatments for mental illness, but it is necessary to remain humble about the past and to recognise that what is perceived as moral varies depending on the context. Furthermore, I contend that psychiatry’s conception of mental illness has scarcely progressed at all, which remains a significant problem when treatments are based on classifications provided by the DSM. Future classifications should critique the early theories that led to the DSM and attempt to provide a measure that aligns with the true nature of mental illness.

 


[1] Eric J. Engstrom, “Cultural and Social History of Psychiatry,” Current Opinion in Psychiatry 21, no. 6 (November 2008): 585, doi: 10.1097/YCO.0b013e328312674f.

[2] Mark S. Micale and Roy Porter, Discovering the History of Psychiatry (Oxford University Press, 1994), 4.

[3] Micale and Porter, 5.

[4] Edwin R. Wallace and John Gach, eds., History of Psychiatry and Medical Psychology (Boston: Springer, 2008), xxiii.

[5] Roy Porter, Madness: A Brief History (Oxford: Oxford University Press, 2002), 12.

[6] American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders, 3rd ed. (Washington: APA, 1980).

[7] George Androutsos et al., “Health and Disease in Ancient Greek Medicine,” International Journal of Health Science 1, no. 2 (2008): 32.

[8] Lois Hague, “The Four Elements, Four Qualities, Four Humours, Four Seasons, and Four Ages of Man,” Wellcome Library (London: Wellcome Library, 1991).

[9] Androutsos et al., “Health and Disease in Ancient Greek Medicine.”

[10] Michel Foucault, Madness and Civilization: A History of Insanity in the Age of Reason (London: Tavistock, 1967), 197.

[11] Androutsos et al., “Health and Disease in Ancient Greek Medicine,” 35–36.

[12] Robert Burton, Anatomy of Melancholy: What It Is, with All the Kinds, Causes, Symptomes, Prognostickes & Severall Cures of It: In Three Partitions, with Their Severall Sections, Members & Subsections Philosophically, Medicinally, Historically, Opened & Cut Up, 4th ed. (Oxford, 1632).

[13] Hague, “The Four Elements, Four Qualities, Four Humours, Four Seasons, and Four Ages of Man.”

[14] Foucault, Madness and Civilization, 125.

[15] Robert M. Stelmack and Anastasios Stalikas, “Galen and the Humour Theory of Temperament,” Personality and Individual Differences 12, no. 3 (1991): 261, doi: 10.1016/0191-8869(91)90111-N.

[16] Porter, Madness: A Brief History, 35–36.

[17] Porter, 53–54.

[18] Foucault, Madness and Civilization, 39.

[19] Gerald N. Grob, “The Transformation of American Psychiatry,” in History of Psychiatry and Medical Psychology (Boston: Springer, 2008), 533, doi: 10.1007/978-0-387-34708-0_18.

[20] Foucault, Madness and Civilization, 40.

[21] Porter, Madness: A Brief History, 53.

[22] Grob, “Transformation of American Psychiatry,” 534.

[23] Andrew Scull, The Insanity of Place / The Place of Insanity: Essays on the History of Psychiatry (Oxford: Routledge, 2006), 56.

[24] Foucault, Madness and Civilization, 38.

[25] Foucault, 72.

[26] Porter, Madness: A Brief History, 53.

[27] Scull, The Insanity of Place, 42.

[28] Porter, Madness: A Brief History, 56.

[29] Scull, The Insanity of Place, 21.

[30] Porter, Madness: A Brief History, 58.

[31] Aaron Rosenblatt, “Concepts of the Asylum in the Care of the Mentally Ill,” Psychiatric Services 35, no. 3 (1984): 244.

[32] John Conolly, On the Construction and Government of Lunatic Asylums (London: Churchill, 1849), 143.

[33] Grob, “Transformation of American Psychiatry,” 540.

[34] Porter, Madness: A Brief History, 64.

[35] Porter, 55.

[36] Foucault, Madness and Civilization, 266.

[37] Porter, Madness: A Brief History, 56–57.

[38] Andreas Marneros, “Psychiatry’s 200th Birthday,” The British Journal of Psychiatry 193, no. 1 (July 1, 2008): 1–2, doi: 10.1192/bjp.bp.108.051367.

[39] Edward Shorter, A History of Psychiatry: From the Era of the Asylum to the Age of Prozac (New York: John Wiley & Sons, 1997).

[40] Porter, Madness: A Brief History, 94.

[41] David Cooper, “Introduction,” in Madness and Civilization: A History of Insanity in the Age of Reason (London: Tavistock, 1967), viii.

[42] Cooper, ix.

[43] Porter, Madness: A Brief History, 98.

[44] Edward Shorter and David Healy, Shock Therapy: A History of Electroconvulsive Treatment in Mental Illness (New Jersey: Rutgers University Press, 2007), 6.

[45] Shorter and Healy, 9.

[46] Nancy A Piotrowski and Frank Guerra, “Shock Therapy,” Magill’s Medical Guide (Online Edition) (Salem Press, 2016).

[47] Shorter and Healy, Shock Therapy, 17.

[48] Shorter and Healy, 24.

[49] Shorter and Healy, 39.

[50] Piotrowski and Guerra, “Shock Therapy.”

[51] German E. Berrios, “The Scientific Origins of Electroconvulsive Therapy: A Conceptual History,” History of Psychiatry 8, no. 29 (1997): 110.

[52] Berrios, 108.

[53] Berrios, 109.

[54] Piotrowski and Guerra, “Shock Therapy.”

[55] Shorter and Healy, Shock Therapy, 164.

[56] Thomas A. Ban, “Fifty Years Chlorpromazine: A Historical Perspective,” Neuropsychiatric Disease and Treatment 3, no. 4 (2007): 495–500.

[57] Shorter and Healy, Shock Therapy, 165.

[58] Samuel H. Barondes, “The Biological Approach to Psychiatry: History and Prospects,” The Journal of Neuroscience 10, no. 6 (1990): 1708.

[59] Shorter and Healy, Shock Therapy, 170.

[60] Bertha K. Madras, “History of the Discovery of the Antipsychotic Dopamine D2 Receptor: A Basis for the Dopamine Hypothesis of Schizophrenia,” Journal of the History of the Neurosciences 22 (2013): 63, doi: 10.1080/0964704X.2012.678199.

[61] Madras, 64.

[62] Ban, “Fifty Years Chlorpromazine: A Historical Perspective,” 497.

[63] Ban, 497.

[64] David Healy, Creation of Psychopharmacology (Harvard University Press, 2002), 2.

[65] Andrew Scull, “Madness and Meaning,” The Paris Review, 2015, https://www.theparisreview.org/blog/2015/04/22/madness-and-meaning/.

[66] Healy, Creation of Psychopharmacology, 4.

[67] Irving Kirsch, “Antidepressants and the Placebo Effect,” Zeitschrift Fur Psychologie 222, no. 3 (2014): 128, doi: 10.1027/2151-2604/a000176.

[68] F. Hieronymus et al., “Efficacy of Selective Serotonin Reuptake Inhibitors in the Absence of Side Effects: A Mega-Analysis of Citalopram and Paroxetine in Adult Depression,” Molecular Psychiatry 00 (2017): 1, doi: 10.1038/mp.2017.147.

[69] Nancy C. Andreasen, The Broken Brain: The Biological Revolution in Psychiatry (New York: Harper & Row, 1984), 18.

[70] Andrea Tone, “Listening to the Past: History, Psychiatry, and Anxiety,” The Canadian Journal of Psychiatry 50, no. 7 (June 1, 2005): 377, doi: 10.1177/070674370505000702.

[71] Porter, Madness: A Brief History, 105.

[72] Jonathon Y. Tsou, “Natural Kinds, Psychiatric Classification and the History of the DSM,” History of Psychiatry 27, no. 4 (2016): 412–13, doi: 10.1177/0957154X16656580.

[73] American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders.

[74] Rachel Cooper, Classifying Madness: A Philosophical Examination of the Diagnostic and Statistical Manual of Mental Disorders (New York: Springer, 2005), 1.

[75] Cooper, 4.

[76] Porter, Madness: A Brief History, 103.

[77] Cooper, Classifying Madness, 2–4.

[78] Cooper, 126.

[79] Tsou, “Natural Kinds,” 407.

[80] Tone, “Listening to the Past,” 375.

[81] Scull, The Insanity of Place, 85.


On Completing University Part-time

At the beginning of my degree, if I’d been asked whether I would ever go part-time with my course, I would certainly have said ‘no’. Yet that’s not how things turned out: after my first year I dropped a subject (still considered full-time) and then later took a leave of absence for a year. Unsurprisingly, my views about going part-time have changed.

University can be full-on, and I found myself drowning in coursework and exams, and for a while I forgot who I was as a person. It’s important to have a balance between studying, work, and things that you find enjoyable. My problem was that I was too focused on grades and on living up to my peers, and therefore wasn’t enjoying it.

The best thing I ever did was take time off. The notion of completing an undergrad in three years is changing, and I’ve come across a lot of people who are choosing to do it over four or even five! You are not weak or abnormal for choosing to take that little bit longer during your studies, and you may actually be better off. Spending more time on fewer subjects means you will have a better chance at getting those HDs and be able to enjoy university life more.

If you need to take longer than usual to finish your studies, that is okay. It is perfectly normal. You are not worth less than your peers. You are not doing anything wrong. You are not expected to complete them in a set time. I wish someone had told me this when I first started, because I might have had a better quality of life during the beginning of my studies.

How I Write My Notes Now

  1. I used to organise my notes on the computer, as seen here, but found that it was difficult for me to remember them; I am the type of learner who needs to physically write my notes out.
  2. The method that works best for me now involves printing off the lecture slides, writing on them during lectures and then organising them in folders. I tend to have one folder for two subjects, so two folders per semester. This is a big change from how I previously took my notes, which were primarily computer-based. I also take a plastic document folder to all my classes for loose pieces of paper, which go into my folder when I’m at home. Then during SWOTVAC, I rewrite my notes into a spiral notebook and/or cue cards and mind maps.

It’s taken me a long time and a lot of trial and error to work out what works for me, but now I have and it’s become almost automatic. You need to find a method that works for you and for the subject you are studying, because everyone learns differently.

A History of Psychiatry in 5 Objects

1. 400-500 BC: Humorism


Figure 1: Humorism

Humorism was an early theory of the mechanisms of the body. Hippocrates (400-500 BC) stated that illness was caused by an imbalance of four humours: blood, phlegm, yellow bile, and black bile (Stelmack and Stalikas 1991, 257), an idea that reigned until the 17th century (Bos 2009, 29). Galen (Hague 1991) later linked blood to a sanguine disposition (hopeful); yellow bile to a choleric one (easily angered); phlegm to a phlegmatic one (calm, neutral); and black bile to melancholia (sadness). It was understood that imbalances could be adjusted with physical treatments, such as bloodletting and emetics, which would remove a humour that was in excess (Androutsos et al. 2008, 32), with the resulting bleeding and vomiting taken as evidence that the imbalance had been rectified.

Humorism provided a holistic view of wellness that acknowledged the importance of both physical and environmental factors, such as diet and exercise (Telles-Correia and Marques 2015). It was predominantly a departure from supernatural ideology and the role of the gods, in favour of the concept that physical and mental health were intrinsically linked (Bos 2009, 31). Bos (2009, 29) believes its decline was linked to a shift away from a focus on character in favour of alternative theories. Nevertheless, the idea of balance remains prevalent.

2. From the 17th Century: The Asylum


Figure 2: Bethlehem Hospital 1714

The concept of locking up the insane began in the late 17th century (Porter 2002, 51-52). The asylum was an institution based on moral therapy: the premise that psychology and compassion could be used to treat severe mental illness (Rosenblatt 1984). Prior to this, individuals were largely kept in the community (Porter 2002). One of the earliest hospitals was Tuke’s York Retreat (1796) (Rosenblatt 1984, 246), which echoed the theory behind Pinel’s breaking of patients’ chains (Porter 2002, 58). Both advocated routines, pleasant surroundings and the abolition of restraint.

The asylums were self-contained yet isolated, with architecture that was part of the treatment (Porter 2002, 62). Early on, conditions varied greatly, and following outrage at the abuse of patients like James Norris at the Bethlehem Hospital (1814), a move was made towards regulation through the Mad House Act of 1828 (Wiles 2016). However, due to overpopulation, patient care declined, and by the 1890s more patients were leaving dead than cured, influencing the subsequent deinstitutionalisation (Wiles 2016; Porter 2002, 64).

Originating as a place of refuge, asylums were a sign and place of progress (Porter 2002, 65). While it is argued that moving the focus from the body to the mind merely switched the type of repression (Foucault 1988, 266), this does not warrant overlooking the introduction of compassion, hope and a person-centred approach (Shorter 1997, 4).

3. 1939: Electro-Convulsive Therapy

Figure 3: Australian ECT Machine (Melbourne Museum 2017)

Electro-convulsive therapy (ECT) is arguably the greatest discovery of psychiatry, and the most effective and empirical treatment (Shorter and Healy 2007, 2).

Replacing chemical alternatives such as metrazol and insulin, both dangerous and less successful, ECT is a form of shock therapy (1935; Shorter and Healy 2007, 6) based on Meduna’s idea that schizophrenia was an antagonist of epilepsy. The concept was to treat symptoms by disrupting brain activity through seizures, coma or loss of consciousness (Piotrowski and Guerra 2016). The development of the Bini-Cerletti ECT machine was motivated by the desire to find a safer treatment.

The results of electrically induced seizures in a patient called Enrico X were presented at the 3rd International Neurological Congress (1939), with a reported significant reduction of symptoms (Shorter and Healy 2007, 43), from which ECT rose to popularity in the 1940s. Early on, seizures would result in physical harm, and this led to the development of muscle relaxants and the use of anaesthetics (Piotrowski and Guerra 2016).

Despite initial popularity, in the 1980s antipsychiatry and politics resulted in a rapid decline in the use and reputation of ECT, damage that is still being reversed today (Shorter and Healy 2007, 145). ECT’s development is significant as it is an empirical treatment that can produce rapid responses to acute symptoms of psychosis and depression (Shorter 1997, 3).

4. 1951: The First Antipsychotic (Chlorpromazine)

Laborit and Rhône-Poulenc’s discovery of chlorpromazine (1951), the first antipsychotic, was a precursor to the rapid development of the psychopharmacology industry (Ban 2007).


It was initially used with general anaesthesia to sedate and prevent shock (Carpenter and Davis 2012, 1168), and when trialled on the patient Jacques Lh. undergoing ECT, an improvement in psychiatric symptoms was observed (Ban 2007). This was replicated by Deniker and Delay, who announced in Luxembourg that chlorpromazine reduced psychosis symptoms (Carpenter and Davis 2012). Following this, chlorpromazine saw the wards of asylums grow calm as noisy schizophrenic patients became quieter and more docile (Elkes and Elkes 1954, 560), thus reducing violence and the number of hospitalised patients. Hence, despite some side effects like tardive dyskinesia, it became regarded as a miracle drug (Ban 2007; Carpenter and Davis 2012).


Following moral therapy’s failure, antipsychotics offered the possibility of a scientific and medical approach that shifted the location of clinical care (Carpenter and Davis 2012, 1168). It wasn’t long before advertisements began marketing chlorpromazine for not only schizophrenia but also emotional instability, hiccups and cancer (APA 1956, 2; APA 1958), highlighting the lack of understanding of the drug’s mechanisms. Over time its uses became more specific, psychopharmacology grew rapidly, and chlorpromazine prompted the development of the dopamine hypothesis (Carpenter and Davis 2012, 1170).

5. 1980: The Diagnostic and Statistical Manual of Mental Disorders-III

Figure 6: DSM-III

The third edition (1980) of the APA’s DSM (Diagnostic and Statistical Manual of Mental Disorders) was revolutionary for the diagnosis and treatment of mental illness (Decker 2013, xvii).

The DSM-III influenced psychiatry’s shift from an aetiological and psychoanalytic focus towards descriptive classification (Decker 2013, xvi). Neo-Kraepelinian in nature (Tsou 2016), it emphasised symptoms and course and aimed to provide clear and valid definitions (APA 2017). This was influenced by the five-axis system implemented by Spitzer, who advocated for biological ideas (Decker 2013, 315-317). The manual was larger than its predecessors and carried the caveats that the criteria were not completely discrete and should only be used by psychiatrists (APA 1980), for whom it became convenient shorthand.

The DSM-III’s flaws can be seen in the attempts of subsequent editions to rectify mistakes, such as the removal of homosexuality as a diagnosis, a process begun in 1973 (Cooper 2004, 6), and some argue that it has not managed to progress beyond description (Tsou 2016). The criteria were not as clear or evidence-based as intended, and symptom thresholds excluded many people from diagnosis (Cooper 2004, 5-22). Nevertheless, the DSM-III was a milestone document due to its descriptive diagnostic categories and the support it gave to the disease model (Decker 2013). It was the first DSM to become widely used by professionals and to provide a uniform method of diagnosis (Tsou 2016).


Going into my final year of study

This year will be my fourth year of my undergraduate degree. I started it straight out of high school and have now completed two full years of study. It’s difficult knowing that many of my classmates have graduated, and sometimes that makes me feel left behind, but other times it doesn’t bother me too much. I may have taken longer, but I needed to due to my health, and I believe my grades have been much higher than they would otherwise have been. So in some ways my peers have surpassed me, but in other ways they have not. Slow and steady wins the race, as they say, but this is my personal experience and what works for me might not work for others.

This year I am completing two full-time semesters and one winter subject that I need to make up. I am majoring in psychology and this means that I have two core subjects plus some elective psych subjects to do. I am excited because this means that I finally get to go more in-depth in clinical psych and neuroscience, areas which I deeply enjoy. However, I am also scared about what this means because at the end of this year my degree will be finished. Then I have to choose where I want to go from there. Will I continue with honours? Will I complete my masters? And in what and where? I’m anxious even thinking about it. I am just as uncertain about my future as I was when I had just finished high school.

I hope that this year I can do well with my grades and get into something that makes me happy and that I’m passionate about. I want to do well but I also want to stay mentally and physically well. I think I am finally learning to balance and manage my illnesses with other commitments. It’s taken me a long time but I’m heading in the right direction and that gives me hope.

Navigating University With a Mental Health Condition

Being independent is difficult and even more so, when you are balancing a mental health condition. If the media is correct, then going off to university is meant to be the peak of a young person’s life; it’s the border between being a teenager and becoming an adult. Yet, for many young people it can be the source of disappointment or uncertainty. Below are some tips for how to make the most of university and the resources available.

Register for the disability service

Most unis have a service dedicated to supporting students with an illness, including mental illness. They can provide important information about special consideration, help with applying for extensions, offer alternative assessment arrangements and check in to see how you’re doing.

Utilise course planning services and student advisors

You can often feel like your identity has been reduced to just one number among many, so making appointments with advisors can help you to engage with university staff and feel like you are being listened to. There are staff dedicated to helping you plan your degree, sort out accommodation and financial aid, and navigate other services.

Get Organised

Diaries are an extremely important and simple way to keep track of assignments and dates, and getting your stationery together before classes start can help you to feel on top of things. You can also carry around your class timetable in case you ever need to check it.

Download the Lost on Campus app

The Lost on Campus app provides a map and directions that can be used to find classrooms and lecture theatres. If you ever get lost, all you need to do is open the app on your phone and put in where you want to go, and it will direct you.

Seek outside support

Whether this is through a GP or psychologist (most unis have a service too, but you may still want an outside one), or through Centrelink, having extra support can be crucial to reaching your potential.

Talk to staff

Your tutors are there to help you, so talk to them if any issues come up and you’re not sure what to do. They’re also there to help you learn, so make sure to ask questions about the content and assessments, and don’t be a stranger to the teaching staff. The best way to learn is by asking questions.

Look after yourself

Self-care is an essential part of staying well. Find other things that you enjoy and that aren’t related to study, to give yourself a break. This could be through university clubs, learning an instrument or language, work, volunteering, art groups or reading. Whatever it is, make sure you have something that breaks up the study so that your life is enriched and you don’t burn out. Grades are important but there is also more to life than studying.

My undergrad degree was worth it

Going into my degree, a three-year Bachelor of Science, I knew that it wouldn’t result in an instant qualification or an easy transition into a high-paying job. Some days I regret that, but most days I don’t. Yes, I could have done a health care undergrad and enjoyed it, but as a 17-year-old I was confused and just wanted to learn. I had an idea of what I wanted to do but I wasn’t ready to commit.

Now, coming towards the end of my degree, the irony is that I’m still tossing up between the same options: nursing, OT and psych, though I’ve also added research.

Was it worth it? Yes. Yes, it was.

I was given the chance to choose electives from history and philosophy, creative writing and linguistics. I studied plant science, Australian flora and fauna, chemistry and biology. All of this enriched my psychology major, and I’m grateful for the diversity of the knowledge I’ve been able to develop.

Psychology is one of those areas that can benefit from studies of history, biology, and philosophy. In the APA major, only limited time is given to each of these, and my elective subjects have provided me with a much broader understanding that I wouldn’t have otherwise gained.

Biology has helped me to understand how the brain works in relation to the body. Philosophy has taught me about mind-body dualism and the philosophy of the mind and mental illness. History has taught me how health was once conceived by Galen as a balance of four bodily fluids, that mental illness was once thought to be caused by witchcraft, and that the asylums were not all doom and gloom, with moral therapy providing a more humane understanding. I learned that psychiatric medications and the DSM were all formed on unstable and even experimental foundations, and that even today ECT is the treatment with the most evidence behind it. I wouldn’t have understood any of this from just my major.

In addition to this, I’ve learned things that have made me a better person. I gained an understanding of the major schools of philosophy: Buddhism, Descartes, Aristotle, Spinoza, and Kant. Through physiology, I understand many of the processes happening in my body. I know that fat and carbs aren’t bad. That the news can and often does lie. I learned how to think critically and to evaluate studies that are often portrayed as true when they are not.

I have developed opinions and the ability to think for myself and not just agree with other people. Through university, I have found my own unique voice and have become a person that I am proud of.

The Melbourne Model worked for me. Yes, it’s taken me longer to get on a path towards my career but it’s also given me so much that is invaluable and will be for the rest of my life. My degree supported my intellectual transition into adulthood and provided me with the foundations to become a critical thinker and a hard-working and compassionate human being. For that, I will be eternally grateful.

The new antidepressant hypothesis

Being a science student and having studied antidepressants, I thought I knew how they worked, or at least why people thought they did, but after a lecture on the history of their development, now I’m not so sure.

Originally, in the 1950s-60s, the thesis was that monoamine neurotransmitters such as serotonin and norepinephrine were being broken down (oxidised) by the enzyme monoamine oxidase (MAO) in the brain. This led to a deficiency of them and thus depression. Drugs (MAO inhibitors) were developed to maximise the amount of monoamines in the brain by preventing them from being oxidised. The idea is that high levels = better communication = stronger mood-regulating circuit. The results were good! They appeared to work, yet serious side effects were often seen.

In order to reduce side effects, a new hypothesis was developed: the serotonin hypothesis. The idea was that too much serotonin was being taken back up by presynaptic neurons in the brain, leading to a deficiency and ultimately depression. People therefore speculated that altering the level of serotonin would alleviate the symptoms of depression, and this is how SSRIs were developed. They act by stopping the reuptake of serotonin and increasing its levels in the brain. Again, they appear to work. SNRIs do the same thing, but for both serotonin and norepinephrine.

The basis of these theories was PURELY SPECULATIVE. No one has been able to prove them. It could be argued that new research does exactly that, but that would depend on who you talk to.

Irving Kirsch showed that for people who’ve been on SSRIs before and experienced the side effects, if they’re then given a placebo drug with the same side effects, it appears to work. In his study, there was no significant difference between an active placebo (with side effects) and SSRIs. Yet many people have criticised his work.

Studies have suggested that changing serotonin levels in a healthy individual doesn’t cause depression, that attacking symptoms of depression with SSRIs is no better than with an active placebo, and that the placebo effect is critical in treating depression. Furthermore, one popular antidepressant, bupropion, appears to reduce depression in some people despite having no impact on serotonin at all; it acts on dopamine and norepinephrine instead.

Obviously, though, I couldn’t just accept all of this as true, because I have seen antidepressants work, hell, I’ve even FELT them work at times. In my day-to-day life I have yet to come across a pill that makes me feel ‘not depressed’, but I know that when I’m at my worst, medication is the only thing that can pull me out of it. So I turned to Doctor Google.

Apparently, there is another theory: that antidepressants cause neurogenesis (the birth of new brain cells). This could explain why ADs take a while to work (whereas if it were just about the amount of neurotransmitters, you would expect them to work right away), because they are altering pathways in the brain. I was drawn to how many of these studies list physiological and psychological stress as a causal factor, because it reflects the idea of the early mad-doctors that neuroses were caused by stress. However, it’s not a foolproof theory, because some AD studies show neurogenesis and some don’t, and it could be unrelated to what ADs actually do. Yet it’s still really interesting.

I refuse to believe that antidepressants don’t work, but maybe they don’t work the way we think they do. Or maybe they do and we just happened to stumble upon the MAO idea by accident without having a clear reason for why this is the case.

What is your personal opinion and/or experience on antidepressants?

 

 

Identity and career confusion

I have a confession to make. For so long I’ve been determined to complete my science degree and pursue psychology as a career. I was certain that this was what I wanted and was going to do. Now… I’m not so sure.

I was forced to take a leave of absence at the end of last year and have tried to go back twice since then but haven’t managed to. I’ve been off for a year and still have a year and a half of a three-year degree to go. I want to finish it, I do. I just don’t know if I can handle the pressure or if it’s the right course for me. If I could go back and tell my 17-year-old self anything, it would be to decide what I wanted to do then, because at least if I didn’t like it, I’d know by now.

I chose science because I was indecisive and it left options open, but what I didn’t consider was whether a high-pressure environment was the right choice for me. It wasn’t. Don’t get me wrong, I love my university, but I wonder whether the pressure contributed to my declining mental health.

Right now I’m still not sure what I want to do. I guess that’s a common thing. It’s hard to plan out the rest of your life when there is so much uncertainty, and I think it’s unfair that young people coming straight out of school are expected to know what they want to do.

This all has left me feeling disheartened and confused.

I’ve been considering doing a course in nursing or teaching, both of which can be done as a masters after my bachelors, or I could choose to start a new bachelor’s degree. There’s also the option of completing a short course at TAFE for six months and going back to university study at the beginning of next year.

The honest truth is I still don’t know what I’m going to do. Whatever decision I make, I feel like it’s not going to be the right one. I know I have to make one but at the moment it feels impossible.

My first day studying since being sick

What a day.

Today was my first day back at uni, and I had fewer classes than I usually would, it being the first week, so I thought things would be okay. Not so much. Something playing on my mind is that I can’t just choose 2 or 3 subjects; I want to do ALL the subjects. Which is hard when you haven’t studied in a while.

My day started off well. I had my first neuroscience lecture with the lecturer I have idolised ever since starting uni. He was the one who inspired me to consider neuroscience as a real option, and he was the one who motivated me and stopped me from dropping out after that first lonely semester. This guy is just one of those people who are so passionate about what they do, and yet also so funny and real, that you can’t help but like them. If I wasn’t interested in this subject, I probably still would have chosen it just because his lectures are so much fun and so thought-provoking.

Yeah so I was on a high when I went upstairs to my next lecture (research methods and stats). And here’s where things went downhill.

I tried to pay attention, but when I looked up I realised I’d missed about 30 minutes without noticing; I couldn’t focus on what was being said and everything was just going straight over my head. Not that the content was hard, just that I wasn’t mentally there.

I sat in my seat panicking because I was at the end of the row and couldn’t leave, until the halfway break, when I escaped out the back to go and break down in the bathroom. It was like all my dreams had been taken away at once.

I assumed going back would be as difficult as when I first started, but today was so much harder. And what’s worse is that no one seems to understand when I tell them that. They think I’m being dramatic and should be fine, but I’m not.


I guess it’s just going to take time.

In other news, it was hilarious to see that o-week has morphed into a two-week festival of everything from water slides to overnight sleepovers and parties on campus. The change makes me feel so old, even though it’s only been two years since my o-week. My memories are of painfully lining up in the heat to be told all the free food was gone, and then getting lost on campus. How things change.

Anyway, I’m out. This was just a quick update to let you know how today went. Here’s hoping things only get better.