13 Fascinating Facts About Lobotomies

Hieronymus Bosch, 'Cutting Out the Stone of Madness' (c. 1501-1505), now in the Museo del Prado / Museo del Prado, Wikimedia Commons // Public Domain

An estimated 50,000 lobotomies were performed across the U.S. between 1936 and the late 1950s. At least 3,500 of them were done by one man, Walter Jackson Freeman, dubbed the father of the lobotomy. First hailed as brain surgery that cured intractable mental illnesses, the lobotomy was never proven effective, and it’s difficult to know how patients fared (though the fatality rate was around 14 percent). Read on for the dark and fascinating story of an ethically dubious medical practice that fortunately fell out of fashion.

1. Humans have a long history of boring holes in skulls.

Trephination (or trepanation) refers to a hole being drilled or scraped into a skull. According to Charles G. Gross, author of A Hole in the Head, thousands of trephined (or trepanned) skulls have been discovered across the globe. The specimens, from both sexes and all ages, date from the late Paleolithic Era to this century.

It’s unclear why the earliest trephinations were performed, but scientists have identified hard stone knives of obsidian or flint, metal knives, and drills as trephination tools. We know the holes did not immediately kill the people they were bored into, because scarring, which can take years to form, is often observed along the edges of the openings.

One of the first texts from the Hippocratic Corpus, in which several types of head wounds are described, suggests that trephination was a recommended treatment even in cases of minor bruising. A likely rationale, Gross writes, is that Hippocratic physicians viewed stagnant blood, like stagnant water, as bad; therefore, letting the blood flow out would prevent it from spoiling. Trephination was also used to treat cases of epilepsy and mental illness. A 13th-century text recommended opening the skulls of people with epilepsy so “that the humors and air may go out and evaporate.”

2. The “American Crowbar Case” captured media and medical attention.

Phineas Gage with the tamping iron / Originally from the collection of Jack and Beverly Wilgus, and now in the Warren Anatomical Museum, Harvard Medical School, Wikimedia Commons // CC BY-SA 3.0

In 1848, an explosion propelled a 3.5-foot, 13-pound tamping iron through the skull of Phineas Gage, a railway construction foreman. The object skewered Gage’s head, puncturing his left cheek, passing behind his left eye, ripping through his prefrontal lobe, and erupting clear through the crown. His survival was miraculous.

By all accounts, however, he was not the same. Though intellectually intact, with a grip on his memories, he was transformed from a friendly man into someone ornery and rude with few inhibitions. (His personality reportedly returned to normal after a couple of years.) Supposedly, he carried the iron rod that tore through his brain wherever he went until he died from a seizure 12 years later. 

The “American crowbar case” became one of several critical points in the evolutionary timeline of the lobotomy.

3. Understanding of how each part of the brain functions was in its infancy in the mid-19th century.

At the time of Gage’s accident, the crude mapping of brain function as it relates to location within a seemingly uniform mass was still a novel concept. Scientists in labs in Europe and the U.S. excised or purposely damaged brain regions in dogs and great apes to see how the injuries affected them. Researchers studied the effects of cerebral lesions, such as cysts or tumors, in patients. They inspected postmortem brains to correlate disease or damage with the deceased patients’ symptoms.

Eventually, a newer, though still coarse, picture of the mysterious organ took shape. The frontal lobes seemed to house aspects of affect, behavior, and impulse control. And if removing those parts from chimps made them calmer and more compliant, scientists thought, maybe it could do the same for people struggling with severe mental health conditions, like schizophrenia.

4. Swiss psychiatrist Gottlieb Burckhardt performed the first brain surgery to treat a mental disorder in the 1880s.

Burckhardt believed the mind to be made up “of small faculties, holding their seats in distinct parts of the brain,” wrote British psychiatrist William Ireland. “Where excess or irregularity occurs, he seeks to check it by ablation of a portion of the irritated centers”—or, in other words, to remove the area of the brain where he thought the illness was located. To test his theory, Burckhardt opened the skulls of six patients with schizophrenia, all living at the facility where he was director. He used a sharp spoon to scoop out specific sections of cortex, in a procedure known as a topectomy. While he reported improvement in three patients, one patient died and the remaining two experienced no change. Some developed aphasia (the inability to understand or express speech) or seizures afterwards. Critics accused Burckhardt of being needlessly reckless.

5. Research into effective treatments for mental illness ramped up in the early 20th century.

Spiral staircase inside the Octagon on New York's Welfare Island
Inside the Octagon, part of New York City's asylum for mentally ill patients in the 19th and early 20th centuries / Historic American Buildings Survey, Library of Congress // Public Domain

Prior to the mid-1930s, those with schizophrenia had few options beyond confinement in an overcrowded, inhumane asylum. Developing surgical techniques that could offer relief to the “soul sick,” of which there seemed to be no shortage after the First World War and the Great Depression, appeared worth exploring [PDF]. (Other therapies like medication wouldn’t come along until the mid-1950s.) But what happened next pushed the boundaries of science while blatantly ignoring medical ethics.

6. A Portuguese neurologist is credited as the founder of psychosurgery—and won the Nobel Prize for his research.

Psychosurgery may sound like something out of American Horror Story, but it really describes surgically induced changes to the brain intended to influence behavior or treat mental health disorders. The Portuguese neurologist António Egas Moniz, who coined the term, was already widely recognized for developing cerebral angiography—a way to visualize blood vessels in the brain. In 1935, he turned his attention to psychosurgery and the severely mentally ill.

Moniz believed that mental illness was a problem of persistent, repetitive thoughts occurring in the brain’s frontal lobes. While attending the 1935 International Neurological Conference in London, he heard about a study in which two chimpanzees, Becky and Lucy, displayed dramatic behavioral shifts after the removal of their frontal lobes.

Invigorated after the meeting, Moniz developed the prefrontal leucotomy (from the Greek words leukos, “white,” and tomia, “to cut”), a surgery targeting the white matter between the prefrontal cortex—a region just behind the eyes and forehead—and the thalamus, considered to be the “emotional brain.”

Moniz and his colleague, Pedro Almeida Lima, carried out leucotomies on 20 psychiatric patients who had features of schizophrenia, mood disorder, or anxiety neurosis. They used a leucotome, a surgical rod with a retractable wire loop, to “core” 12 chunks, 1 centimeter in diameter, in the white matter connecting the prefrontal cortex and the thalamus, severing the communication between the two regions. Moniz quickly announced the success of his technique in June 1937. “It is claimed [the surgery] cuts away sick parts of the human personality and transforms wild animals into gentle creatures,” The New York Times reported [PDF], noting that 15 percent of the 20 patients were greatly improved and 50 percent moderately improved. Critics would later point out the dearth of information in Moniz’s publications, especially related to the methods and results. There was never any proof that the patients improved.

Moniz would go on to win the 1949 Nobel Prize in Physiology or Medicine for psychosurgery. (Many have called for the prize to be revoked posthumously, but that is unlikely to happen.)

7. In 1936, a 63-year-old woman from Topeka, Kansas, became the first lobotomy patient in the United States.

Walter Freeman, a neurologist and psychiatrist without any surgical training, attended the same medical conference that captivated Moniz in 1935. At the time, Freeman was implementing new protocols at George Washington University Hospital. He experimented with “shock” therapies through the use of medications (such as insulin or metrazol) or electricity (to induce “therapeutic” seizures and comas). But he was bewitched by the lobotomized chimps, and closely followed Moniz’s leucotomic handiwork in Europe. 

Freeman partnered with James Watts, a neurosurgeon from the same university, to practice Moniz’s technique on some brains from the hospital’s morgue. Just one year after the medical conference, the duo believed they were ready for a live human patient. 

They chose Mrs. Alice Hood Hammatt—a homemaker from Topeka, Kansas, diagnosed with agitated depression—as their first patient. According to Jack El-Hai in his book The Lobotomist, the doctors told Hammatt she would be committed to a hospital if she did not have the operation. Freeman and Watts acted as co-surgeons using an instrument similar to Moniz’s. They made two holes on the side of her head and then extracted cores of white matter. It took about one hour. 

The surgery was deemed successful, and two months later, Freeman began calling the operation a lobotomy. Mrs. Hammatt’s husband told Freeman that she was a changed woman. “As she expressed it, she could go to the theater and really enjoy the play without thinking what her back hair looked like or whether her shoes pinched,” Freeman wrote.

8. Freeman sought the spotlight in jaw-dropping ways.

Walter Freeman and James Watts study an X-ray
Freeman (left) and Watts study a patient's X-ray before performing psychosurgery. / Harris A. Ewing, 'The Saturday Evening Post,' Wikimedia Commons // Public Domain

By 1942, six years after Hammatt’s operation, Freeman and Watts had performed 200 lobotomies and reported that 63 percent showed improvement following the procedure, 23 percent experienced no change, and 14 percent experienced severe detriment or death.

El-Hai writes that Freeman advertised his services, which was considered unethical for physicians at the time. He appeared at conventions to grab the attention of the press. “I found the technique of getting noticed in the papers,” Freeman wrote, “[was] to arrive a day or two ahead of the opening and install the exhibit in the most graphic manner and then be alert for prowling newsmen.” He usually had a lobotomized animal on display.

The Saturday Evening Post profiled Freeman and Watts, claiming that “a world that once seemed the abode of misery, cruelty, and hate is now radiant with sunshine and kindness” thanks to lobotomies. Newspapers and magazines made lobotomy sound like a miracle cure, when more often it only made patients more docile—if it didn’t incapacitate or kill them.

9. Freeman eventually changed the lobotomy procedure so that no drilling was necessary.

Freeman and Watts continued to refine their technique (so to speak) while staying true to Moniz’s original premise. In their 1942 surgical protocol, Freeman wrote, “The depth of the incision must be judged by the surgeon, any increased resistance being the signal for withdrawing the instrument as a precaution against lacerating an artery. Once the primary incision has been made, it is safe to deepen the incision by radial thrusts of the knife.”

He and Watts performed nearly 1,000 lobotomies together, but Freeman grew restless. He tinkered with the procedures and tools. In 1952, TIME magazine reported that “[Freeman] has fallen completely out of love with the prefrontal lobotomy in which a knife is inserted through a hole drilled in the temple … Now he is a devotee of the transorbital lobotomy, in which approach to the frontal lobe is made through the eye socket.”

10. Freeman really was inspired by an icepick.

Freeman grew frustrated with the need for an expensive neurosurgeon to be present at every lobotomy. He wanted to find a method that was faster, easier, and cheaper, one that could be applied to the masses.

This time he found inspiration in the work of Italian psychosurgeon Amaro Fiamberti, who had developed a new way to access the brain: by inserting a thin tube through the fragile bone at the back of the eye socket. He would then inject alcohol or formalin through the tube into the frontal lobes, completing the lobotomy. Freeman preferred cutting the prefrontal cortex over Fiamberti’s injections. He searched for the ideal tool for the procedure and chose an icepick from his kitchen drawer; he would eventually alter his surgical instrument to resemble it.

In 1946, Ellen Ionesco, a 29-year-old homemaker and mother with suicidal ideation, became Freeman’s first transorbital lobotomy patient. Freeman would insert the icepick-shaped tool through the unconscious patient’s tear duct, tap it with a surgical hammer to break the eye socket bone, and swish the instrument around the frontal lobe. The process was then repeated on the other side. Some have compared the sweeping movement of the tool to windshield wipers. Ionesco, and later patients, went home in a taxi an hour later. Ionesco’s daughter later said that after the procedure, her mother came back to her as the person she remembered.

Supposedly, the first nine of these procedures were done in Freeman’s office without Watts’s knowledge. During the tenth, Watts was either invited in or accidentally walked in on Freeman operating in his decidedly nonsterile office. Either way, he was found out.

11. Freeman took his lobotomy show on the road.

Lobotomy tools belonging to Walter Freeman
A pair of Freeman's icepick-like lobotomy tools / Wellcome Collection, Wikimedia Commons // CC BY 4.0

When Freeman made it known that he planned to start training other non-surgeon psychiatrists to perform lobotomies, Watts severed their partnership. 

Now answerable to no one, Freeman expanded operations. He promoted the icepick lobotomy to patients with postnatal depression, headaches, chronic pain, indigestion, insomnia, or behavioral difficulties. And he came to believe that the sickest patients, those at risk of disability or suicide, were too far gone to be helped.

Freeman set out on a cross-country tour in a camper van promoting the transorbital lobotomy as a 10-minute miracle operation. Evidently, he was convincing: Over the course of his career he performed lobotomies in 55 hospitals across 23 states, though not all of them could be considered successful. On one occasion, he stopped midway through a procedure to snap a photo, and his instrument slid deeper into the patient’s brain, killing the person.

12. “Lobotomy gets them home” was Freeman’s motto.

There were some successful lobotomies, where patients returned to a semblance of normal life. Freeman often took before-and-after photos of his subjects as proof that the lobotomies worked. In one series, Patient 121 glares at the camera in her pre-surgery photo; 11 days after her lobotomy, she is smiling. “She giggles a lot,” the caption reads.

Unfortunately, there were likely many more unsuccessful procedures. One of Freeman’s most famous patients was Rosemary Kennedy, younger sister of future president John F. Kennedy, who received a lobotomy at age 23 and was severely injured. She required constant care for the rest of her life. 

Freeman eventually published a long-term follow-up report of his schizophrenic lobotomy patients. He wrote that although the majority improved, 73 percent were still hospitalized or at home in a “state of idle dependency.”

13. Effective medications finally brought an end to lobotomies.

In 1955, the antipsychotic drug Thorazine was approved in the U.S., launching a new era of treatments for severe mental illness using medications rather than surgery. 

At the same time, depictions of lobotomized characters in literature, film, and theater further illuminated the ethical lapses of the mental healthcare system. In Tennessee Williams’s 1958 play Suddenly, Last Summer (later made into a movie starring Elizabeth Taylor and Katharine Hepburn), a wealthy woman schemes to have her niece lobotomized to keep her from revealing family secrets. In 1975, Jack Nicholson played a patient in a power struggle with the horrific Nurse Ratched in One Flew Over the Cuckoo’s Nest (based on the 1962 novel of the same name), in which his character undergoes a lobotomy.

Freeman was finally banned from performing surgery in 1967, following the death of a patient named Helen Mortensen. She died of a fatal brain hemorrhage during her third transorbital lobotomy, which Freeman performed.