The slow rise and rapid fall of the lobotomy can give us pause to wonder: In the future, what practice of today will be looked on with horror?
“Each physician has a different nature. One believes in the principle: primum non nocere (first, do no harm). The other says: melius anceps remedium quam nullum (better a dangerous remedy than nothing). I lean toward the second category.” —Gottlieb Burckhardt, the father of psychosurgery (1891)
Psychosurgery—an ill-defined combination of neurosurgery and psychiatry—has long been one of the most controversial fields in medicine. It has captivated the minds of both the physician and the philosopher alike, having a complicated history of medical uncertainty and ethical divide. Perhaps one of the most familiar terms within the field of psychosurgery is lobotomy—a word that has been broadly used to describe various procedures such as the leucotomy, the topectomy, and the neuroinjection of different sclerosing agents.
Origins of psychosurgery
The origins of psychosurgery can be traced back to antiquity, with evidence of Stone Age craniotomies dating as far back as 5100 BCE. Archaeological findings suggest that prehistoric shamans could access the brain through trephination, a process that involves the drilling or incision of a hole in the skull using a bladed surgical tool. Trephination has been well documented throughout early history leading into premodern times—not only in medical literature but also in certain works of visual art. For example, Renaissance painter Hieronymus Bosch depicts psychosurgical trephination in one of his most popular works, The Extraction of the Stone of Madness (circa 1494). Clearly, there has been a longstanding interest in the brain–behavior relationship and the potential role for psychosurgery in manipulation of this complex connection.
The Extraction of the Stone of Madness by Hieronymus Bosch.
Reproduced with permission of the Museo Nacional del Prado, Madrid.
It wasn’t until the mid-19th century that psychosurgery took on a more familiar form, when the scientific community became interested in the hallmark neuropsychiatric case of Phineas Gage, a 25-year-old railroad worker who was speared by a rod measuring 109 cm long and 3 cm thick through his prefrontal cortex during an unfortunate workplace explosion.[6,7] To the surprise of the masses, Gage walked away from the incident without any remarkable somatic complaints, but to those who knew him well, the Gage who survived the explosion was not the Gage they had known before. Once an upstanding model citizen, he had become easily irritable, disinhibited, and extremely labile. Gage’s physician followed his case closely and published the following description:
Previous to his injury, although untrained in the schools, he possessed a well-balanced mind, and was looked upon by those who knew him as a shrewd, smart businessman, very energetic and persistent in executing all his plans of operation. In this regard his mind was radically changed, so decidedly that his friends and acquaintances said he was “no longer Gage.”
The case of Phineas Gage spurred an entire field of research into the specific functioning of different parts of the brain, and how this might be related to the clinicopathology of various psychiatric diseases with similarly disinhibited presentations.
Inception of lobotomy
Inspired by an emerging understanding of the frontal lobe and its undeniable force in shaping human behavior, Swiss psychiatrist Gottlieb Burckhardt was the first known physician to translate theories about the brain–behavior connection into a targeted surgical practice. Working with a small cohort of severely schizophrenic patients who were refractory to other treatment measures, Burckhardt removed segments of a patient’s brain to treat the psychiatric disease and to change the patient, in his words, from “an excited to a quieter demented [schizophrenic] person.” In his landmark research, which he reported in 1891, Burckhardt performed and documented multiple open-brain surgical procedures on six schizophrenic patients—with varying degrees of success. His results ranged from patients being successfully “quieted” by the procedure (which was the case for three of the six patients) to one patient dying of postoperative complications.[6] While Burckhardt intended for the utility of his surgery to be “at most palliative,” his research was harshly rejected by the medical community as highly disturbing and grossly ineffective. Burckhardt abandoned the work after publishing his results, and psychosurgical exploration faded into the background for some decades.
In the early 1930s, psychosurgery experienced a sudden and surprisingly swift revival. In Europe, Portuguese neurologist António Egas Moniz and his neurosurgical colleague Almeida Lima were experimenting with connections between the frontal cortices and the thalamus, and began to slowly reintroduce some of the principles of Burckhardt’s research. To further refine Burckhardt’s surgical technique, the duo developed a more targeted, specific process called the leucotomy, which involved inserting a small surgical rod with a retractable wire loop (called the leucotome) into the brain. The instrument could then be used to cavitate areas of white matter, with the express intent of altering a patient’s disposition.[6,11] With a body of research that was very much in its infancy and without having produced convincing results to support their new technique, Moniz and Lima began promoting the controversial procedure across Europe with charisma and political savvy. Indeed, it was then that the lobotomy began to gain acceptance as a primary treatment for psychiatric disease—even though Moniz and Lima kept poor records of patient follow-up and even had returned some patients to asylums postoperatively, never to be seen again.
As the lobotomy was popularized across Europe, the procedure was also being introduced to an eager North American medical audience. Neurologist Walter Freeman and neurosurgeon James Watts championed this migration, aiming to improve on the results of their international colleagues. The duo modified the procedure so that it required nothing more than a small 1 cm burr hole drilled superior to the zygomatic arch for the insertion of the leucotome. This undoubtedly made the procedure much simpler and slightly less invasive, but it still carried the inherent postoperative risks of seizure disorders, infections, and even death.[6,12] Freeman eventually found himself mesmerized by the work of an Italian colleague who, in the late 1930s, had developed a transorbital approach to the procedure that required nothing more than a simple, ice pick–like instrument that could be tapped through the orbital bone and swept across the prefrontal cortex. Freeman quickly and eagerly adopted this method.
The work of Freeman and Watts had simplified the lobotomy so much that Freeman began performing the procedure without the help of his neurosurgical colleague and without the sterile field that was often required in the operating room. This served to distance Watts from the pair’s research, as he was disturbed by the crude nature of the transorbital approach and unimpressed with the substandard, nonsterile perioperative care that Freeman was providing. In time, the duo severed their ties, but Freeman continued with his passionate crusade to popularize the transorbital lobotomy throughout North America. Tens of thousands of psychiatric patients underwent the procedure—with varying degrees of success—until the lack of evidence supporting the lobotomy finally caught up with Freeman and his psychosurgical colleagues.
Downfall of the lobotomy
While the rise of the lobotomy was slow and sequential, its demise seems to have happened all at once. Amid growing doubt about the procedure, Moniz was awarded the 1949 Nobel Prize for Physiology or Medicine for his earlier work with the contentious surgery. In an instant, the global medical community cast its critical eye on the research of Burckhardt, Moniz and Lima, and Freeman and Watts, and so began the downfall. Critics argued that the lobotomy did not “confer the greatest benefit to mankind”—a stated criterion for the Nobel Prize—but instead caused a more grievous harm. An impressive library of antilobotomy literature quickly formed.
It wasn’t until chlorpromazine was introduced into the psychopharmaceutical market that the lobotomy was truly displaced. Chlorpromazine was the first psychotherapeutic drug approved to treat schizophrenia with positive effect, and during its first year on the market it was administered to an estimated 2 million patients. With a safer, more reliable option readily available to the entire medical community, the lobotomy officially fell out of favor.
This article has been peer reviewed.
1. Burckhardt G. Ueber Rindenexcisionen, als Beitrag zur operativen Therapie der Psychosen [On cortical excisions, as a contribution to the surgical treatment of the psychoses]. Allgemeine Zeitschrift für Psychiatrie und psychisch-gerichtliche Medicin [General journal for psychiatry and forensic psychiatric medicine]. 1891;47:463-548. German.
14. Lindsten J, Ringertz N. The Nobel Prize in Physiology or Medicine, 1901-2000. Nobelprize.org. 26 June 2001. www.nobelprize.org/nobel_prizes/themes/medicine/lindsten-ringertz-rev/.
Mr Gallea is a third-year medical student at the University of British Columbia.