During World War II, Sex Was a National-Security Threat


In June 1942, a woman named Billie Smith was arrested in her hotel room in Little Rock, Arkansas, and charged with prostitution and violation of the state’s immorality laws. Smith pled guilty and paid the fine of $10—equivalent to $150 today—but authorities weren’t quite done.

Smith was turned over to the city’s health examiner, who ran tests for syphilis and gonorrhea. When both came back positive, the officer ordered her committed to a federally run quarantine center in Hot Springs, Arkansas. Three days after her arrest, she petitioned for a writ of habeas corpus, arguing that the quarantine amounted to unlawful imprisonment—but Smith’s condition posed too much of a threat, the court argued, for it to do anything but let the quarantine stand.

“It affects the public health so intimately and so insidiously,” the court wrote, “that considerations of delicacy and privacy may not be permitted to thwart measures necessary to avert the public peril.”


Technically speaking, the “it” referred to sexually transmitted diseases, which the government had recently declared to be “military saboteur number one.” In practice, though, the real saboteurs were considered to be the women who carried them. Over the course of the United States’ involvement in World War II, federal authorities detained hundreds of women in quarantine centers across the country, determined to protect the country’s fighting men from sex workers and other women who flocked to the towns that housed army bases. These women were known as “Khaki Wackies,” “good-time Charlottes,” “camp followers,” and, in a portmanteau coined by the U.S. Public Health Service, “patriotutes.”

“Controlling these women was considered important to the defense effort,” said John Parascandola, a medical historian and the author of Sex, Science, and Sin: A History of Syphilis in America. In 1939, three years before Smith’s arrest, the War Department, the Navy, the Federal Security Agency, and state health departments crafted the Eight-Point Plan, a set of measures intended to curb the spread of STDs “in areas where armed forces or national-defense employees are concentrated.”

The problem was, treating STDs at the time was often a lengthy and involved process. Penicillin was first used as a treatment for syphilis in 1943, but it was scarce for civilians during the war—the more common treatment at that time was a regular injection of arsenic-based drugs, administered once a week for up to a year. Gonorrhea could be cured with a round of pills, but even that required careful adherence to the dosing schedule in order to work. Counting on such a high degree of cooperation was too much of a gamble.

The strategy, then: Eliminate the need for cooperation.

In 1941, the government created the Social Protection Division, an agency whose goal was to combat prostitution in these areas, and appointed Eliot Ness, a Prohibition agent who had helped indict Al Capone, as its head. The same year also saw the passage of the May Act, which made it a federal offense to solicit sex near a military base.

“Uncle Sam is not taking camp followers for granted,” Ness wrote in 1942.  Under his direction, the government opened a network of so-called “rapid treatment centers” across the country, where women—it was almost always women—could be detained and given a concentrated intravenous dose of the syphilis drugs over a period of days or weeks.

At first, health officials focused on sex workers who had been arrested near military bases or factories. As the war progressed, however, the focus widened from sex workers to “any women who were somehow viewed or under suspicion as being delinquent,” Parascandola said. Some places dispatched health workers to bars and dance halls to scout out women who appeared too sexually forward; in other cases, officials would wait at bus stops, questioning the women who came off the bus about their reasons for traveling to the town.

Women who didn’t agree to submit to testing could still be quarantined via court order if officials suspected them of having an STD—a caveat that was interpreted liberally. “For example, they might arrest a woman who they found hanging around the camps under vagrancy charges,” he said, “and they might use some claim of suspicion for venereal disease because this woman was hanging around with all these men.”

While the quarantine itself was legal, the treatment was more of a legal gray area—for syphilis, in particular, the insertion of an intravenous line could be considered a surgical procedure, which legally couldn’t be forced. In practice, however, “I’m not sure how much of a choice they had,” Parascandola said. “For one thing, they could keep them quarantined [indefinitely] … They could simply say, ‘If you don’t get treated, you still have the infection and we’re not going to release you.’”

While the centers were primarily hospitals, the staffers frequently tried to “cure” their patients in more ways than one. “There were attempts to counsel them, to set them on the ‘right path,’ if you will,” Parascandola said. Most centers offered vocational training, sometimes funneling the women directly from treatment into jobs at factories that supported the war effort—and by extension, they believed, eliminating the threat of the number-one saboteur.


“A lot of people who were supposedly experts on venereal disease had a ‘one-way transmission idea’ of how it moved. They saw it going from prostitutes to their clients,” said Linda Gordon, a professor of history at New York University. The idea of women as the carriers of STDs and men as the hapless victims wasn’t new—in fact, it was as old as syphilis itself—but it was embedded in the public-health rhetoric of wartime America.

“What you have is not so much what we would consider a scientific conclusion as a sexist bias that saw prostitutes as inherently dirty,” Gordon said. “Polluted, but also polluting to anyone they came across.”

The double standard manifested itself in other ways, too—while condoms were readily available for male soldiers on military bases, the official policy of the Women’s Army Corps was abstinence.

When Women Could Drink at 18, and Men at 21


Until quite recently, drinking in America was strongly associated with maleness. According to one study, as late as 1990 only 80 percent of Americans thought it was acceptable for a woman to drink at a bar with friends, compared with 85 percent who thought it was fine for a man to do so.

We can thank many things for the change in social norms: the increase in college attendance among women, feminism, and, of course, Sex and the City and its river of cosmopolitans.

But some places were ahead of their time. I recently came across this map, which shows that until 1961 in Illinois, the drinking age was lower for women than it was for men. It was the only state where women could knock one back at 18, but men had to wait until 21.

To find out why, I reached out to Joy Getnick of the State University of New York at Geneseo, who has studied drinking ages. It turns out the Illinois provision was tied to the age of majority—when someone is considered an adult—which was lower for women.

“The premise behind the old laws had been that women matured faster than men, and perhaps married younger than men,” Getnick said in an email. “By 1961 those views had changed. Women matured (or didn’t) similarly to men, and there were growing concerns about younger women buying older (but still not yet legal) men drinks, and all of the social (and legal) problems that went with that.”

Some considered the law an outrage, however, long before then—mostly because it cramped men’s style. A 1948 editorial in the Chicago Tribune called it one of the “silliest laws” on the books:

A married man under 21, alone or accompanied by his wife or others, may not be served intoxicating beverages … A law that tells girls they can start their public liquor drinking at 18, and tells boys they must wait three years, places boys in the embarrassing position of trying to persuade their girl friends not to frequent taverns until they (the boys) are old enough to accompany them.

In 1961, Illinois Governor Otto Kerner signed a law making the drinking age 21 for both genders. Shortly after, a 19-year-old woman named Virginia Wantroba filed suit, saying the new restriction infringed on her right to have the occasional frothy afternoon cocktail. (Her boss, it’s worth noting, was the attorney for the state’s Beverage Dealer’s Association.) You’ve got to fight, as they say, for your right to party.

‘Primary’ Caregiver Benefits Sound Gender-Neutral but Aren’t


It sounds like a fair, gender-blind idea for American businesses: offering paid leave not to moms specifically but to a baby’s “primary caregiver.” Unfortunately, in reality, this policy is often used to reinforce old stereotypes—and in the end can discourage men from taking leave.

Many American businesses have leave policies that are stuck in the past. Families have evolved, and dads take on much more responsibility at home. But workplaces tend to offer much more generous leave for moms than for dads, pushing moms to stay at home and be caregivers, while men are pushed to stay in the office and be the primary earners.

On the surface, “primary caregiver” benefits should change this and give families real choices. Adobe, for example, offers 16 weeks of paid leave to primary caregivers, defined as parents who take “primary responsibility for care of the child during the typical Adobe work hours.” An employee must sign an affidavit averring that he or she is the “primary” caregiver. EY (formerly Ernst & Young) offers six weeks of paid leave to any primary caregiver, and two weeks for the other partner.

But what’s written on paper and what occurs in practice are very different. Dads who want to be equal partners at home face tremendous stigma in the workplace. Many men have been demoted or even fired for taking time off to care for loved ones, including their newborn children, a series of studies overseen by a working group at the Center for WorkLife Law found.

This is why even among the minority of American men who get some paid paternity leave, most don’t use it up. It’s tough enough for many men to even acknowledge being caregivers. So declaring themselves the “primary” caregivers is often completely out of the question.

“By forcing men to prove they are primary caregivers in order to ‘earn’ paternity leave, (a company) subverts the man’s already difficult struggle to obtain some semblance of work-family balance,” attorney Keith Cunningham-Parmeter wrote in the Stanford Law Review. A man who “musters the courage” to ask “will have to overcome a policy that is predicated on the assumption that parental leave is woman’s work.”

And that’s the best-case scenario, when the policies are written to treat men and women at the company the same, at least in theory. But often the primary-caregiver benefits are written in such a way that they’re not even available to dads who are partnered with a child’s biological mom (perhaps the most common situation).

I faced such a policy at CNN in 2013. It allowed for 10 paid weeks of leave for biological moms (under the company’s disability leave) and for “primary caregivers” in cases of adoption or surrogacy. When I informed the company that I would be the primary caregiver for my daughter, corporate parent Time Warner said no. (I took legal action. The company has since updated its policy, making it much better for dads like me and for moms after a birth. We recently settled.)

But beyond problems with how such policies are written and carried out, there is something about the idea that a child has one parent who is a “primary” caregiver and another who is secondary that is startlingly outdated. About 60 percent of families with children at home have two working parents who share caregiving responsibilities. Workplaces should be doing what they can to encourage an even distribution of those responsibilities, not encoding the idea that one parent will do more.

“I regularly counsel employers not to set up [primary-caregiver benefits],” says Cynthia Calvert, president of Workforce 21C, a consultancy that helps companies update their policies. “It reinforces the idea that there is one main parent. It also has the effect of excluding men and is difficult for employers to administer,” she told me in an interview for my book All In.

All this hurts both men and women. Men can’t get time off to bond with their new babies, and women are hampered by a culture “that automatically assumes they are the primary caregiver for a newborn,” Cunningham-Parmeter wrote in a study of law firms. “Women who do not take maternity leave may be viewed as suspect mothers; at the same time, women who take time off to be with their newborns face the opposite presumption that they are somehow less ‘committed’ to the firm.” In other words, women are damned either way.

Some businesses may feel that limiting a benefit to “primary caregivers” helps ensure it won’t be used as an excuse to simply have paid time off. But this fear is unfounded. First, the business doesn’t gain anything from insisting on the “primary” designation. If leave is for caregiving, it must be used for caregiving.  

Meanwhile, policies that don’t turn on these distinctions are proven to benefit the bottom line. Paid family leave helps companies attract and retain high-quality employees who are happier and more productive.

Of course, most companies offer no parental leave at all, let alone for “primary caregivers.” More than half of companies offer some paid leave to women, but usually through disability. Only 14 percent of businesses offer any paid paternity leave. Federal law requires 12 weeks of unpaid leave, but the statute doesn’t apply to about 40 percent of workers, and one in five companies admits to not fully complying.

There is reason for hope. In recent weeks, companies such as Netflix, Microsoft, Virgin, and Johnson & Johnson have made news for establishing generous paid parental-leave programs for some employees, with no insistence on “primary” caregiving. Still, the overall trend is moving backward. In recent years, businesses that offer paternity leave have been cutting back.

To see how alive and well the old ways of thinking are, consider a recent Miami Herald article in which local CEOs were asked to weigh in on the idea of paternity leave. While some expressed support, others were firmly against it. Two said paternity leave should not exist. One said it should be allowed “only for one week to support the childbearing wife.” Another argued for it “only when mothers with medical reasons are not able to take care of their baby.”

Many people in my generation grew up believing in gender equality. We were recipients of the long struggle waged by women, and we thought that when we became parents we’d be parents equally. But then we got jobs, had kids, and discovered the depressing truth: The American workplace still hasn’t grown up.