History - TIME
TIME politics

Then as Now, the Tea Party Proved Divisive

Boston Tea Party
Artist's rendering of the Boston Tea Party of Dec. 16, 1773. MPI / Getty Images

Dec. 16, 1773: Colonial activists dump 45 tons of tea into Boston Harbor to protest the Tea Act

Members of today’s Tea Party movement embrace as kindred spirits the colonists who turned Boston Harbor into a teapot 241 years ago. And while it’s true that both groups formed around a robust opposition to the government in power and an equally vigorous objection to the taxes it levied, it would be a mistake to say that the Boston Tea Party was triggered by a tax hike.

On this evening, Dec. 16, in 1773, dozens of colonists boarded three ships laden with East India Company tea and dumped the entire stock — 45 tons of tea, worth roughly $1 million in today’s economy — into the harbor to protest Parliament’s recent Tea Act. The act, however, didn’t increase taxes: It lowered the price of tea by allowing the struggling East India Company to sell directly to colonists without first stopping in England. This cut out colonial middlemen and essentially gave the company a monopoly on tea sales.

So, although organizers of the original tea party echoed the popular refrain of “No taxation without representation,” many were motivated by a personal interest that continues to motivate 241 years later: profit. Boston’s wealthy merchants, some of whom made a fortune smuggling Dutch tea, stood to lose big when the Tea Act was passed. John Hancock, one of the main agitators behind the tea party, was among them.

Although the Boston Tea Party has become synonymous with patriotism, not all of early America’s top patriots were on board. The protest appalled many colonists with its destructiveness and waste, according to Harlow Unger, the author of American Tempest: How the Boston Tea Party Sparked a Revolution. “Far from uniting colonists, the Tea Party had alienated many property owners, who held private property to be sacrosanct and did not tolerate its destruction or violation,” Unger wrote.

Ben Franklin suggested to Hancock and co-agitator Samuel Adams that they reimburse the East India Company for the lost tea. He wrote, in a letter from London shortly after the protest, “I am truly concern’d, as I believe all considerate Men are with you, that there should seem to any a Necessity for carrying Matters to such Extremity, as, in a Dispute about Publick Rights, to destroy private Property.”

George Washington was similarly disapproving. His take on the Boston Tea Party clashes with the modern-day tea party’s more reverent view — and with their claim to channel the beliefs of the Founding Fathers.

When a contemporary Tea Partier, on a visit to Colonial Williamsburg, brought up the topic with a historical interpreter dressed as Washington, he was surprised by the answer, according to a 2010 Washington Post story. “…Asked whether the Boston Tea Party had helped rally the patriots, Washington disagreed with force,” the Post reported. “The tea party ‘should never have occurred,’ he said. ‘It’s hurt our cause, sir.’”

Read more about the modern Tea Party here, in TIME’s archives: Why the Tea Party Movement Matters

TIME People

James Brown: The Hardest-Working, Fastest-Driving Man in Show Business

James Brown
James Brown performing on Jul. 1, 1990, in the Netherlands Paul Bergen—Redferns / Getty Images

Dec. 15, 1988: James Brown begins serving a prison sentence for charges related to a high-speed police chase

After a tumultuous few years of increasingly bizarre, sometimes violent public outbursts, the self-styled Soul Brother No. 1 became Inmate No. 155413 at South Carolina’s State Park Correctional Center.

On this day, Dec. 15, in 1988, James Brown began serving a six-year sentence for carrying a deadly weapon at a public gathering, attempting to flee police, and driving under the influence of drugs, as reported in his 2006 New York Times obituary. Rumors of a PCP habit had already surfaced by the time his erratic behavior came to a head in September, when he reportedly stormed into the insurance company next to his office, waving a shotgun and complaining that “strangers were using his bathroom,” as TIME reported in its take on his crime and punishment.

When the police arrived, Brown led them on a high-speed chase through Georgia and South Carolina. He tried to ram police cars with his pickup truck. They shot out two of his tires; he drove on the rims for six miles. Years later, this episode would frame the 2014 Brown biopic Get On Up.

It became the latest entry on a rap sheet that had begun during Brown’s impoverished childhood in rural South Carolina, where he went to prison for the first time at age 15 for breaking into cars. He sang in the prison choir and started a band when he got out. In many ways, his was a classic American bootstrapping success story, fueled by raw talent and unrelenting effort. He became a soul and R&B legend for his innovative songwriting and his impassioned showmanship, influencing performers from Michael Jackson to Mick Jagger. He earned the nicknames he gave himself: “Godfather of Soul,” “Minister of Super Heavy Funk,” and the “Hardest-Working Man in Show Business,” among others.

But despite his staggering successes, he couldn’t stay out of legal trouble for long. The 1980s were a particularly rocky time, according to TIME’s 1988 report on his prison stint, which noted:

Brown’s fall from the top of the charts to a four-man prison cell has been going on for several years. In 1985 the IRS slapped a lien on his 62-acre spread on rural Beech Island, about ten miles outside Augusta, and he was forced to auction it off. His eight-year marriage to Adrienne, his third wife, has been tempestuous. Last April she filed suit against him for assault, then dropped the charge. (Among other things, he allegedly ventilated her $35,000 black mink coat with bullets.)

He was freed in 1991 after serving half his six-year sentence for the blowup at the insurance company. But in 1998 he reprised his antics and was arrested again on nearly identical charges: discharging a rifle, this time at his South Carolina home, and leading police on another car chase.

On this occasion he was sentenced to a drug rehabilitation program, although his recovery doesn’t seem to have lasted. In 2004, at age 70, he was arrested on domestic violence charges against his fourth wife. When he died two years later, of congestive heart failure, obituaries listed his arrests alongside his achievements, including a Grammy Lifetime Achievement Award in 1992, Kennedy Center Honors in 2003, and 116 singles on Billboard’s Hot 100 singles chart from 1958 to 1986.

TIME noted in remembering his life that Brown had thrown his full energy into all pursuits in music and in life, including his two police chases.

“Unlike O.J.’s,” TIME’s Richard Corliss wrote, “J.B.’s were naturally high-speed.”

Read TIME’s take on Brown’s 1988 incarceration, here in the archives: Soul Brother No. 155413

TIME Television

Here’s What Critics Said About The Colbert Report When It Premiered

Scott Gries—Picturegroup/Comedy Central

When skepticism gave way to praise

When The Colbert Report began its nearly decade-long run in 2005—the show rolls its final credits on Dec. 18—viewers and critics were excited but nervous. Colbert’s blowhard persona had been a mainstay of The Daily Show, but many worried that, given a whole half-hour to anchor, the character would grow tiresome.

Even in the first few weeks, the jury was out.

“Unfortunately, in just two weeks on the air, this half-hour spoof of a no-spin-zone type show has already stretched Colbert’s character and the artifice that supports it past its natural breaking point,” wrote USA Today. “Colbert was an invaluable part of the Daily Show, but as the whole show, he’s not enough and too much simultaneously.” And New York magazine decided that one of Colbert’s rivals at Comedy Central, David Spade’s The Showbiz Show, was the better of the two. The Colbert Report, wrote critic Adam Sternbergh, “has problems so intrinsic as to be potentially unfixable.” (The Showbiz Show ended in 2007.) Even critics who liked the show, like Heather Havrilesky at Salon, found it “foolish, bizarre, idiotic fun,” mostly interesting with regard to his character’s spoof of Bill O’Reilly.

Still, when the consensus emerged, it was one that stuck: The show was great. “[H]e packs more wit and acid commentary in 22 minutes of his one-man show than multiple skits by the entire cast of ‘SNL,’” declared The New York Times, and The Los Angeles Times said that “Colbert, with his young Republican haircut and dead-serious eyes, is a terrifically artful speaker; there may be no better reader of writing on TV than him.”

TIME’s James Poniewozik concurred:

Many people, Colbert included, were worried that that guy would be too much to take for 30 minutes. (Then again, people blow a full hour on Bill O’Reilly.) But Colbert inhabits his pose so lustily–“I’ve just swallowed 20 condoms full of truth, and I’m smuggling them across the border!”–that his glee is infectious. Like the band Weezer or The O.C.‘s Seth Cohen, he is in the grand modern tradition of the swaggering nerd.

And here’s another reason to swagger: His legions of fans would say that the question of whether he’s too much to take—for 11 seasons, much less 30 minutes—seems as “foolish, bizarre, idiotic” as the character to whom they’re saying good-bye.

Read James Poniewozik’s full Nov. 14, 2005, piece on The Colbert Report: The American Bald Ego

TIME movies

7 Things You Didn’t Know About Gone With the Wind

Dec. 25, 1939, cover of TIME
The Dec. 25, 1939, cover of TIME TIME

The film premiered on Dec. 15, 1939

When the movie of Gone With the Wind premiered in Atlanta 75 years ago — on Dec. 15, 1939 — it made news far beyond Hollywood. The movie scored the cover of TIME, with coverage of the city of Atlanta and an in-depth look at how the movie came to be.

Though die-hard fans of “G with the W” — as 1939’s cinema moguls referred to the picture — may think they already know everything there is to know about the film, here are a few of the more surprising facts from TIME’s cover story:

The day of the premiere was a statewide holiday in Georgia. The state’s Governor prepared to call out the National Guard for the event, and the mayor of Atlanta decided that, within the city, the festivities would last three days. He also took it one step further, asking Atlantans to dress up for the occasion: “Mayor Hartsfield urged every Atlanta woman and maid to put on hoop skirts and pantalets, appealed to every Atlanta male to don tight trousers and a beaver, sprout a goatee, sideburns and Kentucky colonel whiskers. He also requested citizens not to tear off the clothes of visiting movie stars, as happened in Kansas at the premiere of Dodge City.”

Hundreds of thousands of people came to watch the stars arrive in Atlanta — in fact, TIME reported, the crowds (an estimated 300,000 fans) were bigger than the combined armies at the Battle of Atlanta in 1864. The masses tossed confetti as Vivien Leigh, Laurence Olivier, Clark Gable, Gable’s wife Carole Lombard and producer David O. Selznick made their way into the city from the airport.

At the time, it was the third most expensive movie ever made. Gone With the Wind cost $3.85 million to make, which was less than only Ben-Hur ($4.5 million) and Hell’s Angels ($4 million). That’s about $66 million in today’s money, which is significantly less than the typical production budget for today’s major blockbusters. (Avatar reportedly cost about $425 million; $65 million is closer to the cost of Lincoln.) Gone With the Wind was also one of the longest movies ever made, though its nearly four hours wasn’t that long considering the first draft of the script would have run nearly twice as long.

Producer David O. Selznick bought the film rights to the book without having read it. Kay Brown famously sent Margaret Mitchell’s novel to Selznick with a note urging him to buy it, but TIME reported that when he read the synopsis and saw that it was a Civil War story, he decided against it. He had recently made an unsuccessful Civil War movie called So Red the Rose and didn’t want a repeat. But, when the chair of his company’s board offered to put up the money, Selznick went ahead and bought the rights. He read it within a week while on a trip to Hawaii.

The movie finished filming before the script was done. The process of turning an epic novel into a normal-length movie without disappointing fans meant that there were many, many drafts. In fact, there were so many drafts that, according to TIME, “when the filming was practically complete the last day’s call sheet read: Script to come.” Part of the reason the shoot ran so long was that the actors had to wait for the script to catch up as they went along.

The “city” used for the burning of Atlanta was made up of 20 years’ worth of old movie sets. It was almost exactly a year before the premiere — Dec. 11, 1938 — that Selznick had the old sets in his studio’s back lot doused with kerosene and lit on fire, with a painted flat of an Atlanta scene in front of them.

The process of casting Scarlett O’Hara went on for years — but Vivien Leigh knew about the part earlier than most. It was a calculated decision by Selznick to wait a few years after the book’s release to get going on the movie, and in that time ideas for casting Scarlett were all over the place. There was even a play about the question, which TIME reports ran for two months before the answer came in the form of Vivien Leigh. Though Leigh met Selznick after the first scenes of GWtW were filmed, apparently she’d had it on her radar for a long time. “British Director Victor Saville, now in Hollywood, read one of the first copies of Gone With the Wind to reach England,” TIME reported. “As soon as he had finished it, he rushed to the telephone and mischievously called Vivien Leigh. Said he: ‘Vivien, I’ve just read a great story for the movies about the bitchiest of all bitches, and you’re just the person to play the part.'”

Read the full 1939 article, here in the TIME Vault: G With the W

TIME movies

What Is It About Gone With the Wind That Still Enchants Us?

LIFE Gone with the Wind - Front Cover
LIFE

Read the introduction to LIFE's book Gone With the Wind: The Great American Movie 75 Years Later, excerpted here

Scarlett O’Hara famously vowed she would lie, steal, cheat or kill to survive. And just like its heroine, Gone with the Wind has shown remarkable resilience. Celebrating its 75th anniversary this year, the film remains a fixture in popular culture. Its iconic status is more secure than ever, thanks to television, DVDs, parodies and revivals that roll around as regularly as national holidays. Just as amazingly, Margaret Mitchell’s 1,037-page novel has been in print since it became a best-seller in 1936, its life extended by a prequel, a sequel—and, of course, the movie.

The story of a small corner of the South in the 19th century, a war movie with no battle scenes, revolving around a heroine of questionable morals, has proved uncannily adept at crossing barriers of geography and time. The poster of Scarlett and Rhett posed against the flaming sky is as instantly recognizable in China or Ethiopia or France as the American flag. The movie is still the biggest blockbuster in history with ticket prices adjusted for inflation. And if it doesn’t have quite the must-see thrill it once did, if many of its transgressions appear fairly innocuous today, others are as fresh and controversial as they were in 1939.

Politically incorrect and racially retrograde, GWTW has offended so many sensibilities that the overture should be preceded by a trigger alert: Beware! This is history written by the losers. The Yankees are irredeemable villains; the slaves too happy in their subjugation to yearn for freedom. The marital rape, in which Rhett forces himself on Scarlett and—horrors!—she enjoys it, can still raise the blood pressure of feminists. But the allure of Gone with the Wind is more powerful, fed by fantasies that run roughshod over ideology.

I came to it as a Southern teenager in the ’50s, when the book was a sort of underground bible. We consumed it under covers with a flashlight, much as Margaret Mitchell read the romance novels her bluestocking mother deplored. The movie, ideally cast, preserved all the disreputable qualities of its heroine, the delicious ambiguities of good boy and bad boy in her two lovers. As with the book, we embraced the movie in a state of critical and political innocence. Max Steiner’s sweeping score is nothing if not relentless, yet who can hear the first few chords of Tara’s theme without experiencing a frisson?

There is a reason so many studios turned the property down. The book was too long, its legion of admirers too passionate. They would detect any alteration, would brook no compromise. And who could play the crucial and near-impossible role of Scarlett? A known movie star would bring too much baggage, an unknown wouldn’t have the chops. The budget would be prohibitive. Only producer David O. Selznick had both the ego and cultural pretensions to even attempt it, and he passed many an insomniac, pill-fueled night as the $4.25 million production went through five directors, 15 screenwriters, firings and rewritings, not to mention finding the leading lady only after shooting had begun. In the end, what should have been one of the great disasters was a triumph, not just a blockbuster and winner of 10 Academy Awards, but a showcase of a kind of filmmaking that we would seldom see again. Yet the irony can’t have escaped spectators: The New World’s most democratic medium had given us the portrait of an aristocratic past whose seductiveness depended on the denial of unpleasant truths.

The scapegrace daughter of a high-minded suffragette, Margaret “Peggy” Mitchell was a tomboy truant, then a flapper who acted up at parties while her mother marched for the vote and attended to the sick. From this dividedness comes Scarlett, an unresolved amalgam of the high-spirited party girl, thumbing her nose at proprieties, and the lost girl, longing for the love of a disapproving mother.

Officially she would have nothing to do with Selznick’s film, thus providing herself with deniability should her fellow Atlantans be outraged by the movie’s vulgarities. In her letters, she took a line of baffled innocence. The book had “precious little obscenity in it,” she wrote disingenuously to a correspondent, “no adultery and not a single degenerate, and I couldn’t imagine a publisher being silly enough to buy it.”

Macmillan bought it, of course. With very little effort on the publisher’s part, Gone with the Wind sold 1 million copies in the first six months, then (with a great deal of effort on the part of Selznick et al.) it became an Oscar-winning movie, all with its degeneracy intact. I’m thinking of such no-no’s as a house of prostitution patronized by Rhett Butler and other Atlanta notables; the marital rape; a near rape in Shantytown (Scarlett is attacked by a black man in the book, saved by one—Big Sam—in the movie); adultery of the soul if not of the body between Scarlett and Ashley; and a farewell punctured by a four-letter word not allowed on the heavily censored movie screens of the time. The offenses against gentility include a harrowing childbirth and, against virtue and Hollywood conventions, a heroine of unprecedented selfishness who lies and cheats her way through Reconstruction, stealing her sister’s man in the process. Mitchell’s way of rationalizing her she-devil protagonist was to maintain that Melanie was the heroine, not Scarlett. Or was meant to be.

But we teenagers knew forbidden fruit when we tasted it. In the uptight, prefeminist ’50s Scarlett was a slap in the face to all the rules of white-gloved ladylike behavior in which we were steeped, a beacon (however tarnished) of female wiliness and defiance. She looked marriage and adulthood square in the face—a life on the sidelines, matronly chaperones in dowdy clothes—and would have none of it. Proudly adolescent, a rebuke to grown-up hypocrisy and conformity, she’s the opening salvo in the teenage revolution, pioneer of a new demographic that would become official with rebel James Dean.

Naturally, the first American readers and audiences saw GWTW as a fable of the Depression, when men were laid off and women were compelled to find ways to survive. Later it would inevitably echo the reality of the Second World War, with men fighting abroad and women going to work for the first time. Perhaps more surprising is the way the movie has enraptured hearts and minds around the world. From postwar France, left-wing cine clubs in Greece and prisons in Ethiopia, there have come stories of people who are by no means sympathetic to slave-owning Dixie, but who nevertheless identify with the South and see it as a mirror of their own travails. Maybe it’s because GWTW is not about bravery on the battlefield but the courage of resistance, of holding it together, of coming through in the clutch—in other words, gumption, Margaret Mitchell’s favorite word. The Darwinian struggle is between, as Ashley says, “people who have brains and courage . . . and the ones who haven’t.”

This confusion of good and evil, of winners and losers, is embedded in the very marrow of Gone with the Wind, as it is in the idealized vision of the South so long cherished by the former Confederacy. Margaret Mitchell spent hours as a child on her grandmother’s porch, listening to relatives tell war stories. She claimed not to have realized until she was 10 years old that the South had lost the war. And so it is that in the face of unacceptable defeat, she gives spiritual victory to her characters. If Scarlett is the motor, Yankee-like in her drive, the impractical, ungreedy Ashley embodies the South’s moral ascendancy. Audiences, as well as characters in the movie, tend to cut Scarlett a surprising amount of slack, rationalize her selfishness as necessary (and very American) expediency. GWTW is full of such questionable fudgings and South-justifying sentimentalities, and its reception, never unmixed, has been plagued by stories that haunt us. Butterfly McQueen could never escape the role, or voice, of Prissy. And though Hattie McDaniel would end up winning the Oscar for playing Mammy, she couldn’t attend the premiere in segregated Atlanta.

Still, it’s important to give Mitchell and Selznick the benefit of context—different time, different rules; and they were more progressive than many around them. As a Jewish man, Selznick understood persecution and didn’t want to go to his grave with the racist legacy of D.W. Griffith. He listened to advisers and blacks on the set, and dropped the word n-gger (used in the book by blacks in reference to one another). Mitchell was a product of the Jim Crow South, but wound up funding education for blacks at considerable risk to herself.

The film, with all its complications and controversies, with all its success, proved as much a burden for its authors as a joy. Margaret Mitchell was overwhelmed by attention and ailments and died at age 48. Selznick, too, suffered from the stress, and Vivien Leigh, the third obsessive of the trio, gave so much of her unstable self to the incandescent Scarlett that she displayed symptoms of burnout the rest of her life.

You are about to embark on a fascinating journey into the heart of an American epic—some would say the American epic. As you read the juicy stories about the making of the movie and the making of Mitchell, you may find, as I did, that the film remains a testament to the manic dedication of Selznick, Mitchell and Leigh . . . and to a fourth partner, the viewers, who have made the film—intensely—their own.

LIFE’s book Gone With the Wind: The Great American Movie 75 Years Later is available here.

See photos from the Gone With the Wind set here, and from the making of the movie here

TIME politics

Why Does Congress Seem Concerned With Families Only When Sex Trafficking Is at Issue?

It's an old tradition in America, going back to the Mann Act of 1910

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

In one of the rare instances of bipartisan cooperation, the House’s Ways and Means Committee and the Senate’s Finance Committee passed the Preventing Sex Trafficking and Strengthening Families Act (P.L. 113-183/H.R. 4980), which President Obama signed into law on September 29, 2014. On the whole, the law seeks to encourage states to reform their foster care systems by encouraging and streamlining adoption processes. Though foster care reform is an admirable legislative concern that intersects with the real-world needs of children, what interests me is the way that foster care reform has been linked to sex trafficking within this bill.

Senate Finance Committee Chairman Ron Wyden (D-OR) declared, “This legislation will ensure no state turns a blind eye to child sex trafficking by requiring state child welfare systems to identify victims and build a systematic response.” Looking at the public record, it is incredibly unlikely that any state could turn a blind eye to domestic minor trafficking, as states have raced to save children from traffickers. According to the Polaris Project, starting in 2003, all 50 states have passed laws prohibiting sex trafficking within their borders, and 45 states have passed additional laws specifically targeting domestic minor sex trafficking. These state laws are fortified by the federal Trafficking Victims Protection Act (2000, reauthorized in 2003, 2005, 2008, and 2013) and the Mann Act (1910). Indeed, judging from the public rhetoric coming out of city halls, state houses, and the halls of Congress, it seems that the United States is plagued with the scourge of sex traffickers preying on children. In a period of intense bipartisan division, sex trafficking seems to be one of the few issues that brings together members of rival political classes. One hundred years ago, the United States was just as captivated by the issue of sex trafficking and the dangers it posed to youth.

Sensational stories about sex trafficking dominated the nation’s newspapers from 1907 until the outbreak of World War I in 1914. Called white slavery at the time, newspapers and magazines warned that a clandestine network of sex traffickers imported sex slaves from Europe to the United States to fill America’s brothels. But foreign-born girls were not the only youths at risk. Edwin Sims, a U.S. Attorney in Illinois, noted in 1909, “Literally thousands of innocent girls from the country districts are every year entrapped into a life of hopeless slavery and degradation…[by] ‘white slave’ traders who have reduced the art of ruining young girls to a national and international system.”

As numerous historians have noted, these sensational stories of sex trafficking encapsulated a host of intersecting anxieties circulating during the period: fears of the immigrant hordes, worries over the new heterosocial recreations offered by the city, dismay over the availability of legal prostitution, dread of interracial relationships, unease over women’s increased entry into the wage marketplace and public life, and concern about the eroding of traditional familial and community relationships in a period of marked rural-to-urban migration. With so many fears expressed within the stories of sexual slavery, it wasn’t long until social purity reformers, women’s rights activists, and other moral reformers turned to Congress to protect “somebody’s daughters.”

In 1909, Edwin Sims and his ally Clifford Roe approached Illinois Congressman James R. Mann about drafting a nationwide domestic anti-trafficking law that would complement existing immigration laws. Mann’s proposed law would make it illegal to take a woman or girl over state lines for the purposes of prostitution, debauchery, or “any other immoral purpose.” When the bill was debated in 1910, most congressmen signaled their support, with Thetus W. Sims (D-TN) urging passage of the law to “take care of the girls, the women—the defenseless.” He suggested that “we will prevent, I hope forever, the taking away by fraud or violence, from some doting mother or loving father, of some blue-eyed girl and immersing her in dens of infamy.” Within the Progressive imagination, the most sympathetic victim of sex trafficking was the young, white girl with a previous reputation of chastity who, through no fault of her own, had become alienated from a stable family structure—the doting mother or loving father. The legislation sailed through the House and Senate and was signed into law on June 25, 1910, by President Taft.

Enforcement of this broad anti-trafficking law, with its vague “any other immoral purpose” clause, fell to the young Bureau of Investigation (renamed the Federal Bureau of Investigation in 1935). When handed the mandate of the Mann Act, the Bureau had only 61 special agents, yet within three years it would have well over 300 special agents scattered across the nation. The Bureau initially faced two challenges in enforcing the law: first, it questioned whether a widespread network of traffickers really preyed on innocent young women; second, the parameters and constitutionality of the “any other immoral purpose” clause were very much in doubt. In the face of such concerns, the Bureau initially used the law as an anti-prostitution law to expand its reach, until the Supreme Court ruled in 1917 that the “any other immoral purpose” clause truly meant any other immoral purpose.

Amid the changing cultural mores of the 1920s, the Bureau became a force for conservative values within the federal government. Mann Act investigations continued to make up the bulk of the agency’s day-to-day activities, but the type of Mann Act cases pursued changed. The Bureau actively responded to parents’ requests to track down runaway daughters and husbands’ demands for help locating adulterous wives. Until the outbreak of WWII, the anti-sex-trafficking law was used to uphold patriarchal familial privilege and the dependency of wives and daughters—strengthening the family along the values of the early twentieth century.

The Preventing Sex Trafficking and Strengthening Families Act fits into a long tradition linking the dangers of sex trafficking to the frailty of family stability. Senator Wyden argues that the new law “helps build bridges to permanent families and stable relationships, which are key to protecting children from predators.” But as the Bureau’s investigations into Mann Act cases remind us, building stable families and fighting sex trafficking has historically meant empowering the law enforcement state rather than growing social services or providing for victims’ assistance. Legal scholar Jennifer Sheldon-Sherman notes that the current trend in sex trafficking policy follows the same pattern of prioritizing law enforcement over victims’ services and preventive care. Perhaps the Preventing Sex Trafficking and Strengthening Families Act, with its efforts to protect vulnerable foster kids from prostitution, represents a step toward constructing a comprehensive way to combat sex trafficking.

Jessica R. Pliley is an assistant professor of women’s history at Texas State University and the author of “Policing Sexuality: The Mann Act and the Making of the FBI” (Harvard University Press, 2014).

TIME Civil Rights

What the International Response to the Civil Rights Movement Tells Us About Ferguson

Little Rock, Ark., 1957: National Guardsmen, having admitted white children to a school, bar the way to a black student Paul Popper—Popperfoto / Getty Images

International criticism during the Civil Rights Movement helped bring about new legislation

Images of armed soldiers blocking nine African-American high school students from integrating a public high school in Little Rock, Ark., shocked the world nearly 60 years ago. Organs of Soviet propaganda, determined to disrupt perceptions of a tranquil American democracy, wrote in the newspaper Izvestia of American police “who abuse human dignity and stoop to the level of animals.” In the midst of stiff Cold War competition for hearts and minds around the world, the prospect of controlling international perceptions motivated officials at the highest levels of the U.S. government to support new civil rights measures.

The U.S. representative to the United Nations warned President Dwight Eisenhower that the incident had damaged American influence, and the President listened.

“Before Eisenhower sent in the troops, there were mobs around the school for weeks, keeping these high school students from going to school,” says Mary Dudziak, a professor at Emory whose book Cold War Civil Rights argues that international pressures encouraged the federal government to work to improve civil rights, and which tells the above story about Little Rock. “The issue caused people from other countries to wonder whether the U.S. had a commitment to human rights.”

Today, the highly-publicized killings of unarmed black men like Michael Brown and Eric Garner have attracted similar international condemnation, and some historians wonder whether concerns about U.S. appearances around the world could once again influence the federal government.

Read More: One Man. One March. One Speech. One Dream.

During the Cold War, the Soviet Union, the sworn enemy of the U.S., had a lot to gain by showing that American democracy wasn’t all it was cracked up to be. Today’s opponents are less influential, but they appear equally eager to highlight American dysfunction. Iran’s Supreme Leader Ali Khamenei used the attention surrounding Michael Brown to remind his Twitter followers of America’s history on race issues, tweeting images of police dogs in Birmingham during the Civil Rights Movement alongside an image of Brown.

North Korea may have one of the world’s worst human rights records, but that didn’t stop the country from criticizing the U.S. for “wantonly violating the human rights where people are subject to discrimination and humiliation due to their races.”

It’s not surprising that North Korea and Iran would criticize the U.S., but the reprimands haven’t been limited to adversaries. The U.N. High Commissioner for Human Rights said he is unsure whether the decision not to indict the police officer who shot Michael Brown “conforms with international human rights law.”

“It is clear that, at least among some sectors of the population, there is a deep and festering lack of confidence in the fairness of the justice and law enforcement systems,” Zeid Ra’ad Al Hussein, U.N. high commissioner for human rights, said in a statement.

Criticism from the U.N. is significant, but the international body’s desires, much less those of North Korea or Iran, have never driven U.S. policy — and the fact is, while there are many links between the Cold War era and today, times have changed. Thus far, the President has walked a fine line in his response. He proposed some measures, including encouraging police to wear body cameras, but it seems unlikely that he’ll be proposing any game-changing legislation like the Civil Rights Act of 1964.

It’s not surprising that Obama has hesitated to involve the federal government in what is typically a local or state issue. The international mandate simply isn’t as strong as it was during the Cold War. There’s no equivalent to the Soviet Union offering a credible alternative to America’s system of governance.

But that may not be the case forever: historians and political scientists say that a growing movement against police brutality has the potential to increase international pressure and, perhaps, force change.

“I’m sure the Obama administration and the State Department are concerned about [international perceptions],” says Rick Valelly, a political science professor at Swarthmore. “Right now it’s embarrassing, but I don’t think it’s internationally consequential.”

Movements to end police brutality don’t yet have the “same kind of legs” that the Civil Rights Movement had, Valelly says. This year’s demonstrations have been attention-grabbing — and the “Justice for All” March planned for Washington, D.C., this Saturday is certain to make headlines — but it may take many more years of sustained protest before the movement is noticed internationally on a much larger scale, as the Civil Rights Movement of the 1960s was. For now, experts in the field remain optimistic about the benefits of international attention, relatively minor though it may be.

“Public diplomacy begins with listening,” says Nick Cull, a professor at the University of Southern California, “and this would be a really good time to listen.”

TIME movies

Corleone Family Values: The Godfather Part II at 40

Al Pacino in a scene from the film 'The Godfather: Part II', 1974. Archive Photos / Getty Images

The second 'Godfather' film made enduring stars of its lead players, defined the machismo of its generation and influenced every TV drama about ordinary families with the darkest secrets

Francis Ford Coppola didn’t want to make a sequel to his 1972 Oscar-winning blockbuster, recommending Martin Scorsese, fresh off Mean Streets, for the job. He finally said yes when Robert Evans of Paramount Pictures agreed that Part II could include extensive flashbacks of the young Vito Corleone, played by Mean Streets comer Robert De Niro. Marlon Brando was out because he wanted too much money, and the Clemenza character got dropped because Richard Castellano wanted his dialogue to be written by a friend. Director Elia Kazan, the first choice to play Hyman Roth, spent so much time with his shirt off during a conversation with Coppola that when Lee Strasberg was hired for the role, Coppola insisted he play one scene topless. That’s Francis’s mother Italia in the casket as the deceased Mama Corleone — actress Morgana King thought it bad luck to lie in a coffin — and his Uncle Louie, a dead ringer for Brando, in the Havana cake scene.

Coppola provided these anecdotal nuggets in a commentary on the 2001 five-disc DVD of the saga he made from Mario Puzo’s novel: The Godfather in 1972, The Godfather Part II 40 years ago (its New York premiere was Dec. 12, 1974; it arrived in theaters about a week after) and The Godfather Part III in 1990. Third time was not the charm, but the first two were sensationally popular, influential and cherished. Both won Oscars for Best Picture — the first and only time that’s happened — and made enduring stars of De Niro, Al Pacino (as Michael Corleone), Diane Keaton (Michael’s wife Kay), James Caan (his brother Sonny) and Robert Duvall (Consigliere Tom Hagen).

In TIME’s Godfather Part II review, titled “The Final Act of a Family Epic” — who knew, back then, that every movie epic had to be a trilogy? — Richard Schickel described the Lake Tahoe scene of a party celebrating Michael’s son’s First Communion and noted:

What happens at this point is that delicious sensation of letting-go familiar to readers of huge 19th century novels, but much less readily available to a moviegoer today. A skilled popular artist — the kind of man who can blend subtly observed details with a gift for socko showmanship — takes over to lead a guided tour of an exotic yet humanly recognizable and completely realized world. That’s really entertainment.

This is a much colder film, with austere aspirations — not fully realized — to transcend its melodramatic origins and to become an authentic tragedy. … As Michael plots his careful, lethal moves, the recurring, unforgettable image is of his eyes growing colder, until they finally go dead to the horrors around him.

The Godfather was its year’s box-office champ — its $135 million take at North American theaters would be nearly $700 million today — and, in real dollars, still in the all-time top 25. Part II was sixth on the 1974 chart (Mel Brooks’s Blazing Saddles was No. 1) and is widely considered the darker and stronger of the pair. Together they formed a bold mural of America: crime infiltrating big business and Washington politics, all intersecting with the Corleones’ family values.

God I and God II had both immediate and lasting impact. They helped define machismo for a couple of generations of young males, maybe females too. The films spawned countless Mafioso movies, and of course The Sopranos — in fact, every vaunted TV drama about loving families with a dirty secret. (They’re vampires, they’re polygamous, they run a meth business, they’re Commie spies.) True Blood, Big Love, Breaking Bad, The Americans and countless others were forged from the Corleone template.

On the Internet Movie Database’s all-time Top 250, “as voted by regular IMDb users,” the two Godfather films rank second and third, behind only The Shawshank Redemption (a terrific film but… come on). And who wouldn’t rate Michael’s New Year’s Eve takedown of his traitorous brother Fredo (John Cazale) as one of the greatest movie kisses? That “You broke my heart” moment was both passionate and chilling, like the films themselves. Part II’s final shot, closing in on Michael as he ponders the sins that brought him his power, left viewers to determine his admixture of hero and monster.

Forty years on, the films’ core stars are still prominently around. After seven Oscar nominations (two for The Godfathers), Pacino finally won Best Actor in 1993 for Scent of a Woman. De Niro matched his Supporting Actor Oscar for The Godfather with a Best Actor for Raging Bull. Keaton was Oscared for Annie Hall (one of her four nominations) and Duvall for Tender Mercies (one of his six). To these and Caan, add Francis’s sister Talia Shire (Connie Corleone) and his director daughter Sofia (“Child on Ship” in Part II). Also two legendary 88-year-olds: B-movie mogul emeritus Roger Corman (“Senator #2”) and that supreme hangdog character actor Harry Dean Stanton (“F.B.I. Man #1”).

As for Coppola, he had a great 1970s — The Conversation and Apocalypse Now as well as the first two Godfathers — but found it increasingly hard to raise financing for films he wanted to make. “It’s ironic that people should look back decades later and celebrate films I was given a lot of trouble on,” he said on the DVD commentary, “but that nobody wants me to make a movie right now. Talking to me about The Godfather is like talking to me about my first wife when I’m sitting next to my second one. I’d rather get some encouragement on what I’m doing now than celebrate old projects. It was no fun 30 years ago, and I’m still doing it, and I didn’t want to.”

That was in 2001, when the director was in a 10-year dry spell between projects. Since then, he has poured the profits from the Francis Ford Coppola winery into making his own low-budget, minimally-released indie films: Youth Without Youth, Tetro and the 3-D Twixt — intimate, moody melodramas closer in spirit to his 1963 Dementia 13 than to the impossibly ambitious films of his early prime.

At an indefatigable 75, does Coppola look around, see the stars he discovered still in the game, and wonder if maybe it’s time for a Godfather Part IV?

Read TIME’s review of The Godfather Part II here in the TIME Vault: The Final Act of a Family Epic

TIME Sports

How the Heisman Trophy Got Its Name

John Heisman
ATLANTA - John Heisman, head coach of the Georgia Tech Yellow Jackets, circa 1904-1919. Collegiate Images / Getty Images

John Heisman's name is famous — his career, not so much

The latest winner of the Heisman Trophy, the annual top award for college football players, will be announced on Saturday. But, while the trophy and its signature pose have become synonymous with college football success, the actual career of this “Heisman” guy isn’t exactly well-known.

So who was Heisman, and why is there a trophy named after him?

As the award’s official site lays out, John W. Heisman was — unsurprisingly — a college football player, active in the late 19th century. After that, his story gets a lot more noteworthy. For one thing, while studying law and playing ball at the University of Pennsylvania, his eyesight was damaged — his official bio says by a bolt of lightning striking nearby, but other sources argue that the damage was more likely to have come from exposure to flashing lights during a football game he played at Madison Square Garden. In any case, he graduated but was unable to go straight to a law practice, as his eyes needed to rest.

In 1892, he began coaching at Oberlin College — and he never did get around to practicing law. His coaching career took him to schools including Georgia Tech, Auburn and UPenn. He’s also credited with introducing the center snap, the hidden ball trick and the practice of paying coaches. After he retired from coaching, he worked at the Downtown Athletic Club in New York City, where he founded the American Football Coaches Association and organized a vote to pick the best college player of the year. The first such award, in 1935, went to Jay Berwanger of the University of Chicago; the trophy has been in the shape of a player with his arm extended since the very beginning.

Heisman died the next year and the award was then named in his honor — an honor that continues to this day.

Not that everything about the Heisman Trophy is about honoring him: when Berwanger died in 2002, it was reported that the player’s aunt had used the first-ever Heisman Trophy as a doorstop.

Read a 1973 story about Oberlin’s athletic program, here in the TIME Vault: Overhaul at Oberlin

TIME Boston

Time Capsule from Paul Revere and Samuel Adams Discovered in Boston

The copper box was entombed in the state capitol in 1795

Someone call Nicolas Cage, because Bostonians may have just discovered a new real-life national treasure.

A time capsule from 1795, in the form of a small copper box, was unearthed in the Massachusetts statehouse in Boston on Thursday. The container was first placed in the cornerstone of the building on Beacon Hill by Revolutionary War hero Paul Revere and the then-governor of Massachusetts, Samuel Adams.

The cigar-box-sized capsule will be X-rayed over the weekend and its contents revealed next week. The artifact is believed to be one of the oldest time capsules in Massachusetts, making it by default one of the oldest such items in the United States.

During the excavation, coins fell out of the plaster that held the box in place. They were good-luck tokens tossed in with the box in 1855, the last time the capsule was unearthed and reburied, The Boston Globe reports.

[The Boston Globe]

Read next: Here’s What Really Killed the Dinosaurs
