For a few weeks a year, as winter turns into spring, or summer gives way to fall, people in heavy coats coexist with those in sandals and shorts. Similarly, in an office where the thermostat is set at 74°F, some workers will be comfortable in short sleeves, while others will be wearing sweaters and scarves.
Underlying this disagreement are the different ways people perceive cold—and scientists are still trying to understand them.
Men, Women, and Metabolism
In work settings, men and women often have different opinions about the ideal temperature. A 2019 study found that women performed better in math and verbal tasks at temperatures between 70°F and 80°F, while men did better below 70°F. The researchers proposed that gender-mixed workplaces might boost productivity by setting the thermostat higher than the current norm (which the Occupational Safety and Health Administration suggests should be between 68°F and 76°F).
The discrepancy has a known physical basis: Women tend to have lower resting metabolic rates than men, due to having smaller bodies and a higher fat-to-muscle ratio. According to a 2015 study, indoor climate regulations are based on an “empirical thermal comfort model” developed in the 1960s with male workers in mind, which may overestimate female metabolic rates by up to 35 percent. To compound the problem, men in business settings often wear suits year-round, while women tend to have more flexibility to wear skirts or sundresses when it's warm outside.
Culture and the Cold
Cultural factors are also involved. European visitors are routinely alarmed by the chilly temperatures in American movie theaters and department stores, while American tourists are flabbergasted by the lack of air conditioning in many European hotels, shops, and offices. The preferred temperature for American workspaces, 70°F, is too cold for Europeans who grew up without the icy blast of air conditioners, Michael Sivak, a transportation researcher formerly at the University of Michigan, told The Washington Post in 2015.
The effects of cultural change on the human ability to withstand extreme temperatures can be dramatic. In the 19th century, 22 percent of women on the Korean island of Jeju were breath-hold divers (haenyeo). Wearing thin cotton bathing suits, haenyeo dove nearly 100 feet to gather shellfish from the sea floor, holding their breath for more than three minutes on each dive. In winter, they stayed in 55°F-57°F water for up to an hour at a time, then warmed up by a fire for three or four hours before jumping back in.
In the 1970s, haenyeo started wearing protective wet suits. Studies conducted between the 1960s and the 1980s showed that their tolerance for cold diminished.
Blame Your Brain
Beyond the effects of cultural practice and body composition, scientists have started to identify the cognitive factors that influence our temperature perception. It turns out that what feels unpleasantly cold versus comfortably chill is partly in our own minds.
One example is the phenomenon described as “cold contagion.” A 2014 study asked participants to view videos of people immersing their hands in visibly warm or cold water. Observers not only rated the hands in cold water as cooler than those in warm water, but their own hands became cooler while they watched the cold-water videos. There was no comparable effect for the warm-water videos, however. The findings suggest that we may feel colder when surrounded by shivering coworkers at the office than when we're there alone, even with the thermostat set at the same temperature in both cases.
Other studies highlight the psychological aspects of temperature perception. Experimental participants at the Institute of Biomedical Investigations in Barcelona, Spain, watched their arms turn blue, red, or green in virtual reality while the neuroscientist Maria Victoria Sanchez-Vives and her team applied heat to their actual wrists. As the temperature increased, participants felt pain earlier when their virtual skin turned red than when it turned blue or green.
Subjectivity in temperature perception has led to some creative treatments for burn patients. In the 1990s, Hunter Hoffman, David Patterson, and Sam Sharar of the University of Washington developed a virtual-reality game called SnowWorld, which allows patients in hospital burn units to experience virtual immersion in a frozen environment. Amazingly, playing SnowWorld counteracted pain during wound care more effectively than morphine did.
“The perception of temperature is influenced by expectations,” Sanchez-Vives tells Mental Floss. “Putting one’s hand inside a virtual oven is perceived as ‘hot,’ while sticking one’s hand into a virtual bucket filled with iced water is perceived as ‘cold,’ despite being at room temperature in each scenario.”
In other words, if you expect to feel cold walking into the office or out on the street, chances are that you will.