TIME Opinion

Girl Gone Wild: The Rise of the Lone She-Wolf

Wild — Fox Searchlight

A woman on a solitary journey used to be seen as pitiful, vulnerable or scary. Not any more.

The first few seconds of Wild sound like sex. You hear a woman panting and moaning as the camera pans across the forest, and it seems like the movie is starting off with an outdoor quickie. But it’s not the sound of two hikers hooking up: it’s the sound of Cheryl Strayed, played by Reese Witherspoon, climbing a mountain all by herself.

It lasts only a moment, but that first shot contains everything you need to know about why Wild is so important. It’s a story of a woman who hikes the Pacific Crest Trail for 94 days in the wake of her mother’s death, but more than that, it’s a story of a woman who is no longer anything to anybody. We’re so used to seeing women entangled with other people (with parents, with men, with children, in neurotic friendships with other women), that it’s surprising, almost shocking, to see a woman who is gloriously, intentionally, radically alone.

When it comes to women onscreen, the lone frontier is the last frontier. It’s no big deal to see women play presidents, villains, baseball players, psychopaths, superheroes, math geniuses, or emotionally stunted losers. We’ve even had a female Bob Dylan. But a woman, alone, in the wilderness, for an entire movie? Not until now.

Which is unfair, considering all the books and movies dedicated to the often-tedious excursions of solitary men, from Henry David Thoreau to Jack Kerouac to Christopher McCandless. Audiences have sat through hours of solo-dude time in critically acclaimed movies like Cast Away, Into the Wild, Life of Pi, 127 Hours, and All Is Lost. America loves a Lone Ranger so much, even Superman worked alone.

In fact, the only thing more central to the American canon than a solitary guy hanging out in the woods is a guy on a quest (think Huckleberry Finn or Moby Dick). The road narrative may be the most fundamental American legend, grown from our history of pilgrimage and Western expansion. But adventure stories are almost always no-girls-allowed, partly because the male adventurer is usually fleeing from a smothering domesticity represented by women. In our collective imaginations, women don’t set out on a journey unless they’re fleeing from something, usually violence. As Vanessa Veselka writes in her excellent essay on female road narratives in The American Reader: “A man on the road is caught in the act of a becoming. A woman on the road has something seriously wrong with her. She has not ‘struck out on her own.’ She has been shunned.”

MORE: The Top 10 Best Movies of 2014

The ‘loner in nature’ and the ‘man on the road’ are our American origin stories, our Genesis and Exodus. They’re fables of an American national character which, as A.O. Scott pointed out in his New York Times essay on the death of adulthood in American culture, has always tended towards the boyish. Wild is the first big movie – or bestselling book, for that matter – to retell that central American story with a female protagonist.

But Wild is just the most visible example of what’s been a slow movement towards loner ladies onscreen. Sandra Bullock’s solo spin through space last year in Gravity was the first step (although her aloneness was accidental, and it was more a survival story than road narrative). Mia Wasikowska’s long walk across Australia in Tracks this year was another. But Wild, based on Strayed’s bestselling memoir and propelled by Witherspoon’s star power, is the movie that has the best shot at moving us past the now-tired “power woman” towards a new kind of feminist role model: the lone female.

Because for women, aloneness is the next frontier. Despite our chirpy boosting of “independent women” and “strong female leads,” it’s easy to forget that women can never be independent if we’re not allowed to be alone.

For men, solitude is noble: it implies moral toughness, intellectual rigor, a deep connection with the environment. For women, solitude is dangerous: a lone woman is considered vulnerable to attacks, pitiful for her lack of male companionship, or threatening to another woman’s relationship. We see women in all kinds of states of loneliness–single, socially isolated, abandoned–but almost never in a state of deliberate, total aloneness.

Not to mention the fact that women’s stories are almost always told in the context of their relationships with other people. Even if you set aside romance narratives, the “girl group” has become the mechanism for telling the stories of “independent” women – that is, women’s stories that don’t necessarily revolve around men. Think Sex & The City, Steel Magnolias, A League of Their Own, Sisterhood of the Traveling Pants, Girls: if a woman’s not half of a couple, she must be part of a gaggle.

When Cheryl Strayed describes her experience of “radical aloneness,” she’s talking about being completely cut off from human contact–no cell phone, no credit card, no GPS. But her aloneness is also radical in that it rejects the female identity that is always viewed through the lens of a relationship with someone else. To be alone, radically alone, is to root yourself in your own life, not the role you play in other people’s lives. Or, as Strayed’s mother Bobbi wistfully puts it, “I always did what someone else wanted me to do. I’ve always been someone’s daughter or mother or wife. I’ve never just been me.”

MORE: The Top 10 Best Movie Performances of 2014

And that’s the difference between aloneness and independence. The “independent woman” is nothing new – if anything, it’s become a tired catchphrase of a certain kind of rah-rah feminism. “Independence” implies a relationship with another thing, a thing from which you’re severing your ties. It’s inherently conspicuous, even performative. Female independence has become such a trope that it’s now just another role for women to play: independent career woman, independent post-breakup vixen, independent spitfire who doesn’t care what anyone thinks. And usually, that “independence” is just a temporary phase before she meets a guy at the end of the movie who conveniently “likes a woman who speaks her mind.”

Aloneness is more fundamental, and more difficult. It involves cultivating a sense of self that has little to do with the motherhood, daughterhood, wifehood or friendship that society calls “womanhood.” When interviewed by the Hobo Times about being a “female hobo,” Strayed says: “Women can’t walk out of their lives. They have families. They have kids to take care of.” Aloneness, then, isn’t just a choice to focus on one’s self – it’s also a rejection of all the other social functions women are expected to perform.

In 1995, when Strayed hiked those 94 days, being that alone was hard. In 2014, it’s even harder. Thanks to the internet, our world is more connected than ever before, and escaping other people takes more effort than it ever has. But aloneness is at the root of real independence; it’s where self-reliance begins and ends. So these days, if you want to be independent, maybe you can start by trying to be alone.

Read next: Reese Witherspoon Isn’t Nice or Wholesome in Wild, and That’s What Makes It Great

TIME

Viral Threats

Militants of the Islamic State are seen before the explosion of an air strike on Tilsehir hill, near the Turkish border village of Yumurtalik in Sanliurfa province, Oct. 23, 2014. BULENT KILIC—AFP/Getty Images

Why combatting the extremists of ISIS is harder than fighting an Ebola outbreak

As images of brutal beheadings and dying plague victims compete for the world’s shrinking attention span, it is instructive to compare the unexpected terrors of the Islamic State of Iraq and Greater Syria (known as ISIS or ISIL) and Ebola. In October, the U.N. High Commissioner for Human Rights pointed out that “the twin plagues of Ebola and ISIL both fomented quietly, neglected by a world that knew they existed but misread their terrible potential, before exploding into the global consciousness.” Seeking more direct connections, various press stories have cited “experts” discussing the potential for ISIS to weaponize Ebola for bioterrorist attacks on the West.

Sensationalist claims aside, questions about similarities and differences are worth considering. Both burst onto the scene this year, capturing imaginations as they spread with surprising speed and severity. About Ebola, the world knows a lot and is doing relatively little. About ISIS, we know relatively little but are doing a lot.

In the case of Ebola, the first U.S.-funded treatment unit opened on Nov. 10—more than eight months after the epidemic came to the world’s attention. The U.S. has committed more than $350 million and 3,000 troops to this challenge to date. To combat ISIS, President Obama announced on Nov. 7 that he would be sending an additional 1,500 troops to Iraq to supplement his initial deployment of 1,500. And he has asked Congress for a down payment of $5.6 billion in this chapter of the global war on terrorism declared by his predecessor 13 years ago and on which the U.S. has spent more than $4 trillion so far.

Over recent centuries, medicine has made more progress than statecraft. It can be useful therefore to examine ISIS through a public-health lens. When confronting a disease, modern medicine begins by asking: What is the pathogen? How does it spread? Who is at risk? And, informed by this understanding, how can it be treated and possibly prevented?

About Ebola, we know the answers to each. But what about ISIS?

Start with identification of the virus itself. In the case of Ebola, scientists know the genetic code of the specific virus that causes an infected human being to bleed and die. Evidence suggests that the virus is animal-borne, and bats appear to be the most likely source. Scientists have traced the current outbreak to a likely animal-to-human transfer in December 2013.

In the case of ISIS, neither the identity of the virus nor the circumstances that gave rise to it are clear. Most see ISIS as a mutation of al-Qaeda, the Osama bin Laden–led terrorist group that killed nearly 3,000 people in the attacks on the World Trade Center and Pentagon in September 2001. In response to those attacks, President George W. Bush declared the start of a global war on terrorism and sent American troops into direct conflict with the al-Qaeda core in Pakistan and Afghanistan. In the years since, the White House has deployed military personnel and intelligence officers to deal with offshoots of al-Qaeda in Iraq (AQI), Yemen (AQAP), Syria (al-Nusra) and Somalia (al-Shabab).

But while ISIS has its roots in AQI, it was excommunicated by al-Qaeda leadership in February. Moreover, over the past six months, ISIS has distinguished itself as a remarkably purpose-driven organization, achieving unprecedented success on the battlefield—as well as engaging in indiscriminate violence, mass murders, sexual slavery and apparently even attempted genocide.

Horrifying as the symptoms of both Ebola and ISIS are, from an epidemiological perspective, the mere emergence of a deadly disease is not sufficient cause for global concern. For an outbreak to become truly worrying, it must be highly contagious. So how does the ISIS virus spread?

Ebola is transmitted only through contact with infected bodily fluids. No transfer of fluids, no spread. Not so for ISIS, where online images and words can instantly appear worldwide. ISIS’s leadership has demonstrated extraordinary skill and sophistication in crafting persuasive messages for specific audiences. It has won some followers by offering a sense of community and belonging, others by intimidation and a sense of inevitable victory, and still others by claims to restore the purity of Wahhabi Islam. According to CIA estimates, ISIS’s ranks of fighters tripled from initial estimates of 10,000 to more than 31,000 by mid-September. These militants include over 15,000 foreign volunteers from around the globe, including more than 2,000 from Europe and more than 100 from the U.S.

Individuals at risk of Ebola are relatively easy to identify: all have come into direct contact with the bodily fluids of a symptomatic Ebola patient, and almost all these cases occurred in just a handful of countries in West Africa. Once symptoms begin, those with the virus soon find it difficult to move, much less travel, for very long undetected.

But who is most likely to catch the ISIS virus? The most susceptible appear to be 18- to 35-year-old male Sunni Muslims, among whom there are many Western converts, disaffected or isolated in their local environment. But militants’ individual circumstances vary greatly, with foreign fighters hailing from more than 80 countries. These terrorists’ message can also inspire “lone wolf” sympathizers to engage in deadly behavior thousands of miles from any master planner or jihadist cell.

In sum, were Ebola judged a serious threat to the U.S., Americans would have the knowledge to stop it in its tracks. Imagine an outbreak in the U.S. or another advanced society. The infected would be immediately quarantined, limiting contact to appropriately protected medical professionals—thus breaking the chain of infection. It is no surprise that all but two of the individuals infected by the virus who have returned to the U.S. have recovered and have not infected others. Countries like Liberia, on the other hand, with no comprehensive modern public-health or medical system, face entirely different challenges. International assistance has come slowly, piecemeal and in a largely uncoordinated fashion.

Of course, if ISIS really were a disease, it would be a nightmare: a deadly, highly contagious killer whose identity, origins, transmission and risk factors are poorly understood. Facing it, we find ourselves more like the Founding Fathers of the U.S., who in the 1790s experienced seasonal outbreaks of yellow fever in Philadelphia (then the capital of the country). Imagining that it was caused by the “putrid” airs of hot summers in the city, President John Adams and his Cabinet simply left the city, not returning until later in the fall when the plague subsided. In one particularly virulent year, Adams remained at his home in Quincy, Mass., for four months.

Not until more than a century later did medical science discover that the disease was transmitted by mosquitoes and its spread could be stopped.

We cannot hope to temporarily escape the “putrid” airs of ISIS until our understanding of that scourge improves. Faced with the realities of this threat, how would the medical world suggest we respond?

First, we would begin with humility. Since 9/11, the dominant U.S. strategy to prevent the spread of Islamic extremism has been to kill its hosts. Thirteen years on, having toppled the Taliban in Kabul and Saddam Hussein in Baghdad, waged war in both Iraq and Afghanistan, decimated the al-Qaeda core in Pakistan and Afghanistan and conducted 500 drone strikes against al-Qaeda affiliates in Yemen and Pakistan, and now launched over 1,000 air strikes against ISIS in Iraq and Syria, we should pause and ask: Are the numbers of those currently infected by the disease shrinking—or growing? As former Secretary of Defense Donald Rumsfeld once put it: Are we creating more enemies than we are killing? With our current approach, will we be declaring war on another acronym a decade from now? As we mount a response to ISIS, we must examine honestly past failures and successes and work to improve our limited understanding of what we are facing. We should then proceed with caution, keeping in mind Hippocrates’ wise counsel “to help, or at least, to do no harm.”

Second, we would tailor our treatments to reflect the different theaters of the disease. Health care professionals fighting Ebola in West Africa face quite different challenges of containment, treatment and prevention than do their counterparts dealing with isolated cases in the Western world. Similarly, our strategy to “defeat and ultimately destroy” ISIS in its hotbed of Iraq and Syria must be linked to, but differentiated from, our treatment for foreign fighters likely to “catch” the ISIS virus in Western nations. While continuing to focus on the center of the outbreak, the U.S. must also work to identify, track and—when necessary—isolate infected individuals within its borders.

Just as Ebola quarantines have raised ethical debates, our response to foreign fighters will need to address difficult trade-offs between individual rights and collective security. Should citizens who choose to fight for ISIS be stripped of their citizenship, imprisoned on their return, or denied entry to their home country? Such a response would certainly chill “jihadi tourism.” Should potential foreign fighters be denied passports or have their travel restricted? How closely should security agencies be allowed to monitor individuals who visit the most extremist Salafist websites or espouse ISIS-friendly views? Will punitive measures control the threat or only add fuel to radical beliefs?

Finally, we should acknowledge the fact that for the foreseeable future, there may be no permanent cure for Islamic extremism. Against Ebola, researchers are racing toward a vaccine that could decisively prevent future epidemics. But the past decade has taught us that despite our best efforts, if and when the ISIS outbreak is controlled, another strain of the virus is likely to emerge. In this sense, violent Islamic extremism may be more like the flu than Ebola: a virus for which we have no cure, but for which we can develop a coherent management strategy to minimize the number of annual infections and deaths. And recalling the 1918 influenza pandemic that killed at least 50 million people around the world, we must remain vigilant to the possibility that a new, more virulent and contagious strain of extremism could emerge with even graver consequences.

Allison is director of the Belfer Center for Science and International Affairs at Harvard’s John F. Kennedy School of Government

TIME Opinion

The Problem With Frats Isn’t Just Rape. It’s Power.

The Phi Kappa Psi fraternity house at the University of Virginia in Charlottesville, Va., on Nov. 24, 2014. A Rolling Stone article alleged a gang rape at the house, which has since suspended operations. Steve Helber—AP

Too many frats breed sexism and misogyny that last long after college. Why we need to ban them—for good.

At the university I called home my freshman year, fraternity row was a tree-lined street full of Southern-style mansions, against a backdrop of the poor urban ghetto that surrounded the school. Off-campus frat parties weren’t quite how I pictured spending my weekends at a new school – I wasn’t actually part of the Greek system – but it quickly became clear that they were the center of the social structure. They controlled the alcohol on campus, and thus, the social life. So there I was, week after week, joining the throngs of half-naked women trekking to fraternity row.

We learned the rules to frat life quickly, or at least we thought we did. Never let your drink out of your sight. Don’t go upstairs – where the bedrooms were housed – without a girlfriend who could check in on you later. If one of us was denied entry to a party because we weren’t deemed “hot” enough – houses often ranked women on a scale of one to 10, with only “sixes” and up granted entry to a party – we stuck together. Maybe we went to the foam party next door.

In two years at the University of Southern California, I heard plenty of stories of women being drugged at frat parties. At least one woman I knew was date raped, though she didn’t report it. But most of us basically shrugged our shoulders: This was just how it worked… right?

If the recent headlines are any indication, it certainly appears so. Among them: women blacked out and hospitalized after a frat party at the University of Wisconsin, only to discover red or black X’s marked on their hands. An email guide to getting girls in bed called “Luring your rapebait.” A banner displayed at a Texas Tech party reading “No Means Yes, Yes Means Anal” – which happened to be the same slogan chanted by frat brothers at Yale, later part of a civil rights complaint against the university.

And now, the story of Jackie, who alleged in a Rolling Stone article — one that has swiftly become the subject of a debate over fairness in reporting and whether the author was negligent in not reaching out to the alleged rapists — that she was gang raped by seven members of the Phi Kappa Psi house at the University of Virginia, and discouraged from pressing charges to protect the university’s reputation.

The alleged rape, it turned out, took place at the same house where another rape had occurred some thirty years prior, ultimately landing the perpetrator in jail.

“I’m sick about this,” says Caitlin Flanagan, a writer and UVA alumna who spent a year documenting the culture of fraternity life for a recent cover story in the Atlantic. “It’s been 30 years of education programs by the frats, initiatives to change culture, management policies, and we’re still here.”

Which raises the question: Why isn’t every campus in America dissolving its fraternity program — or at least instituting major, serious reform?

Not every fraternity member is a rapist (nor is every fraternity misogynistic). But fraternity members are three times more likely to rape, according to a 2007 study, which notes that fraternity culture reinforces “within-group attitudes” that perpetuate sexual coercion. Taken together, frats and other traditionally male-dominated social clubs (ahem: the Princeton eating club) crystallize the elements of our culture that reinforce inequality, both gender and otherwise.

For starters, they are insulated from outside perspective. It wasn’t until the late 1960s that Greek organizations eradicated whites-only membership clauses; as a recent controversy at the University of Alabama revealed, only one black student had been permitted into that Greek system since 1964. Throughout the country, fraternities grew into a “caste system based on socioeconomic status as perceived by students,” John Chandler, the former president of Middlebury, which has banned frats on campus, recently told Newsweek.

And when it comes to campus social life, they exert huge social control: providing the alcohol, hosting the parties, policing who may enter–based on whatever criteria they choose. Because sororities are prohibited from serving alcohol, they can’t host their own parties; they must also abide by strict decorum rules. So night after night, women line up, in tube tops and high heels, vying for entrance. Even their clothes are a signifier of where the power lies. “Those with less power almost invariably dress up for those who have more,” Michael Kimmel, a sociologist at Stony Brook University, wrote in a recent column for TIME. “So, by day, in class, women and men dress pretty much the same … At parties, though, the guys will still be dressed that way, while the women will be sporting party dresses, high heels and make up.”

And when frat boys grow up? They slide right into the boys club of the business world, where brothers land Wall Street jobs via the “fraternity pipeline,” as a recent Bloomberg Businessweek piece put it — a place where secret handshakes mean special treatment in an already male-dominated field. Fraternities have graduated plenty of brilliant Silicon Valley founders: the creators of Facebook and Instagram, among others. They’ve also brought us Justin Mateen, the co-founder of Tinder, who stepped down amid a sexual harassment lawsuit, and Evan Spiegel, the Snapchat CEO, who recently apologized for e-mails sent while in the Stanford frat where Snapchat was founded, e-mails that discussed convincing sorority women to perform sex acts and drunkenly peeing on a woman in bed.

(VIDEO: My Rapist Is Still on Campus: A Columbia Undergrad Tells Her Story)

If we lived in a gender-equal world, fraternities might work. But in an age where 1 in 5 college women are raped or assaulted on campus, where dozens of universities are under federal investigation for their handling of it, and where the business world remains dominated by men, doesn’t the continued existence of fraternities normalize a kind of white, male-dominated culture that already pervades our society? There is something insidious about a group of men who deny women entry, control the No. 1 asset on campus – alcohol – and make the rules in isolated groups. “[Colleges] should be cultivating the kind of sensibility that makes you a better citizen of a diverse and distressingly fractious society,” Frank Bruni wrote in a New York Times column this week. “How is that served by retreating into an exclusionary clique of people just like you?”

The argument for Greek life – at least for the mainstream, largely white frats that seem to be the problem – goes something like this: It’s about fostering camaraderie. (According to a 2014 Gallup poll, fraternity and sorority members have stronger relationships with friends and family than other college graduates.) It’s about community: As the Washington Post reported, chapters at UVA reportedly raised $400,000 for charity and logged 56,000 hours of community service during the past academic year. It’s part of students’ right to congregate freely. And it’s about training future leaders: According to Gallup, fraternity and sorority members end up better off financially, and are more likely to start businesses, than other college graduates.

But the real benefit – at least the unspoken one – may be about money. Frats breed generous donors: as Flanagan pointed out in her Atlantic piece, fraternities save universities millions of dollars in student housing. At least one study has confirmed that fraternity brothers also tend to be generous to their alma maters.

All of which is part of the problem. Who wants to crack down on frats if it’s going to profoundly disturb campus life?

UVA, for its part, has suspended the frat in question until the new year, a move the Inter-Fraternity Council described as a helpful opportunity for UVA’s Greek system to “take a breath.” The university’s president has said that the school “is too good a place to allow this evil to reside.” But critics saw the punishment as a slap on the wrist: a suspension, when most students are out of town for the holidays?

There are other options on the table: The school is reportedly considering proposals to crack down on underage drinking and even a ban on alcohol. Other universities have explored making fraternities co-ed. And there’s some evidence that fraternity brothers who participate in a rape prevention program at the start of the academic year are less likely to commit a sexually coercive act than a control group of men who also joined fraternities.

Yet all the while, the parade of ugly news continues. A group of frat brothers at San Diego State University interrupted a “Take Back the Night” march last week by screaming obscenities, throwing eggs and waving dildos at marchers. The next night, a woman reported she was sexually assaulted at a party near the school’s campus; she was the seventh person to come forward this semester. And on Monday, Wesleyan announced that its Psi Upsilon fraternity would be banned from hosting social events until the end of 2015, also because of rape accusations.

Fraternities have created something that’s fairly unique in the modern world: a place where young men spend three or four years living with other men whom they have vetted as like them and able to “fit in.” What do you expect to happen at a club where women are viewed as outsiders, or commodities, or worse, as prey, and where men make the rules? It should be no surprise they end up recreating the boys club — and one that isn’t all so great for the boys, either.

Jessica Bennett is a contributing columnist at Time.com covering the intersection of gender, sexuality, business and pop culture. She writes regularly for the New York Times and is a contributing editor on special projects for Sheryl Sandberg’s women’s non-profit, Lean In. You can follow her @jess7bennett.

Read more views on the debate about preventing sexual assault on campus:

Caitlin Flanagan: We Need More Transparency on the Issue of Fraternity Rape

A Lawyer for the Accused on Why Some Rules About Consent Are Unfair to Men

Ban Frat Parties–Let Sororities Run the Show

TIME Opinion

Why Ferguson Should Matter to Asian-Americans

A female protester raises her hands while blocking police cars in Ferguson, Mo., on Nov. 25, 2014. Adrees Latif—Reuters

Ferguson isn’t simply black versus white

A peculiar Vine floated around social media Monday evening following the grand jury announcement in Ferguson, Mo. The short video shows an Asian-American shopkeeper standing in his looted store, with a hands-in-his-pockets matter-of-factness and a sad slump to his facial expression. “Are you okay, sir?” an off-screen cameraman asks. “Yes,” the storeowner says, dejectedly.

The clip is only a few seconds, but it highlights the question of where Asian-Americans stand in the black and white palette often used to paint incidents like Ferguson. In the story of a white cop’s killing of a black teen, Asian-Americans may at first seem irrelevant. They are neither white nor black; they assume the benefits of non-blackness, but also the burdens of non-whiteness. They can appear innocuous on nighttime streets, but also defenseless; getting into Harvard is a result of “one’s own merit,” but also a genetic gift; they are assumed well-off in society, but also perpetually foreign. Asian-Americans’ peculiar gray space on the racial spectrum can translate to detachment from the situation in Ferguson. When that happens, the racialized nature of the events in Ferguson loses relevance to Asian-Americans. But seen with a historical perspective, it’s clear that such moments are decidedly of more colors than two.

VOTE: Should the Ferguson Protestors Be TIME’s Person of the Year?

Michael Brown’s death has several parallels in Asian-American history. The first to come to mind may be the story of Vincent Chin, a Chinese-American killed in 1982 by a Chrysler plant superintendent and his stepson, both white, neither of whom served jail time for the racially motivated killing; like Brown, Chin unified his community to demand protection under the law. However, the most direct parallels share one distinct dissimilarity to Ferguson: they have not spurred widespread resistance, nor have they left a visible legacy.

There is the story of Kuanchang Kao, an intoxicated Chinese-American fatally shot in 1997 by police threatened by his “martial arts” moves. There is Cau Bich Tran, a Vietnamese-American killed in 2003 after holding a vegetable peeler, which police thought was a cleaver. There is Fong Lee, a Hmong-American shot to death in 2006 by police who believed he was carrying a gun. None of the three cases resulted in criminal charges against the police or in public campaigns that turned the victim’s memory into a commitment to seek justice. One op-ed even declared how little America learned from Tran’s slaying.

While Ferguson captures the world’s attention, why do these Asian-American stories remain comparatively unknown?

One possible answer could be found in the model minority myth. The myth, a decades-old stereotype, casts Asian-Americans as universally successful, and discourages others — even Asian-Americans themselves — from believing in the validity of their struggles. But as protests over Ferguson continue, it’s increasingly important to remember the purpose of the model minority narrative’s construction. The doctored portrayal, which dates to 1966, was intended to shame African-American activists whose demands for equal civil rights threatened a centuries-old white society. (The original story in the New York Times put forward an image of Japanese-Americans quietly rising to economic success despite the racial prejudice responsible for their unjust internment during World War II.)

Racial engineering of Asian-Americans and African-Americans to protect a white-run society was nothing new, but the puppeteering of one minority to slap the other’s wrist was a marked change. The apparent boost of Asian-Americans suggested that racism was no longer a problem for all people of color — it was a problem for people of a specific color. “The model minority discourse has elevated Asian-Americans as a group that’s worked hard, using education to get ahead,” said Daryl Maeda, a professor of ethnic studies at the University of Colorado, Boulder. “But the reality is that it’s a discourse that intends to pit us against other people of color. And that’s a divide and conquer strategy we shouldn’t be complicit with.”

Through the years, that idea erased from the public consciousness the fact that the Asian-American experience was once a story of racially motivated legal exclusion, disenfranchisement and horrific violence — commonalities with the African-American experience that became rallying points in demanding racial equality. That division between racial minorities also erased a history of Afro-Asian solidarity born of the shared experience of sociopolitical marginalization.

As with Ferguson, it’s easy to say the Civil Rights movement was entirely black and white, when in reality there were many moments of interplay between African-American and Asian-American activism. Japanese-American activist Yuri Kochiyama worked alongside Malcolm X until he was assassinated in front of her. Groups protesting America’s involvement in the Vietnam War, like the student-run Third World Liberation Front, united resisters across racial lines under a collective radical political identity. W.E.B. DuBois called on African Americans to support the 1920s Indian anti-colonial resistance, which he compared to whites’ oppression of blacks. Chinese-American activist Grace Lee Boggs, who struggled as a female scholar of color, found passion in fighting similar injustices against African-Americans alongside C.L.R. James in the 1950s. Though Afro-Asian solidarity wasn’t the norm in either group’s resistance movements, the examples highlight the power of cross-racial resistance, and the hardships both groups shared as non-whites.

The concept of non-whiteness is one way to begin the retelling of most hyphenated American histories. In Asian-American history, non-whiteness indelibly characterized the first waves of Asians arriving in the mid-1800s in America. Cases like People v. Hall (1854) placed them alongside unfree blacks, in that case by ruling that a law barring blacks from testifying against whites was intended to block non-white witnesses, while popular images documented Asian-American bodies as dark, faceless and indistinguishable — a racialization strengthened against the white supremacy of Manifest Destiny and naturalization law. Non-whiteness facilitated racism, but it in time also facilitated cross-racial opposition. With issues like post-9/11 racial profiling, anti-racism efforts continue to uphold this tradition of a shared non-white struggle.

“This stuff is what I call M.I.H. — missing in history,” said Helen Zia, an Asian-American historian and activist. “Unfortunately, we have generations growing up thinking there’s no connection [between African-Americans and Asian-Americans]. These things are there, all the linkages of struggles that have been fought together.”

The disassociation of Asian-Americans from Ferguson — not just as absent allies, but as forgotten legacies — is another chapter in that missing history. In the final moments of the Vine depicting the Asian-American shopkeeper’s looted store, the cameraman offers a last thought in a conversation that has trailed off into a brief pause. “It’s just a mess,” the cameraman says. The observation, however simplistic, holds a truth: as an Asian-American who has become collateral damage in a climate too often painted black and white, he, like all of Ferguson, must first clean up — and then reassess the unfolding reality outside.

TIME

When One Twin is More Academically Gifted

My son tested into the gifted program at school, but my daughter didn't. Should I split them up?

Splitting up twins in school is never easy. But splitting up twins so that one goes on the advanced learning track and the other follows the regular program is one of the most agonizing decisions a parent can face. And no amount of Internet searching will give you helpful advice. The consensus: Figure it out, parents. That’s what you’re (not) paid for.

As you may have guessed, I have twins, a boy and a girl, and they’re in the first grade. I happen to be a fraternal twin myself, so I’m sensitive to always being compared to a sibling. My son is like his engineer father — completely committed to being a lovable nerd. The other day he found a book of math problems at Barnes and Noble and was so excited it was as if Santa arrived, handed him a gift, and then let him ride a reindeer. My daughter is like her freelance writer mother – studying is not really her thing. She reminds me of the prince in Monty Python and the Holy Grail who is to inherit a large amount of land and says, “But I don’t want any of that. I’d rather sing!” That’s my girl.

We were first introduced to our school’s Spectrum (advanced learning) program last year, at the beginning of kindergarten, in Seattle, Washington. The kids could be tested that year and would enter the program—or not—in first grade. I hadn’t really thought about whether to have my kids tested. Other parents apparently had. One asked: “Should we have our child practice at home with the same kind of mouse they’re going to use in the test?”

In the beginning, my husband and I laughed at the idea of advanced learning in the first grade. We joked about “Level Two Crayons” and “Expert Alphabet.” But then, as the day to decide about testing came closer, we started hearing from our son’s teacher about how gifted he was. What first grader wants to practice math and reading on his own during the evenings and weekends? My son. And then there was my daughter, who was right on track, but, like most kids her age, was happy to leave school stuff at school. “Let’s just get them both tested and see what happens,” I said.

As far as my kids knew, they were just going to school to talk about what they know and what they don’t. They were never told that the results of the test had any sort of consequences and weren’t the least bit curious. But when we got the results–my son tested into the advanced program and my daughter didn’t–I immediately became anxious. I wanted to let my son move into the advanced program because I knew he would love it and thrive. But I worried for my vibrant, passionate daughter who at the age of six doesn’t think she has any limits. How was I going to separate her from her brother because he could do something better?

As a child I never felt smart enough. Not because of my twin sister, but because of my mother, who was brilliant. She used her intelligence to get off of the Kentucky farm where she grew up and into a New York City law firm. She placed a lot of value on the power of education and what good grades could do. I felt perpetually unable to meet her high expectations. Now I had a daughter who, in kindergarten, was already resistant to doing her reading homework. I was terrified that placing her brother in a higher academic track would affect my daughter’s self-esteem.

I contacted Christina Baglivi Tingloff from the site Talk About Twins. She’s a mother of adult twins and author of six books, including Double Duty and Parenting School-Age Twins and Multiples. “It’s tough when twins differ in abilities,” she says, “and I’d say that it’s the biggest challenge of parenting multiples. [But] kids take their cues from their parents. If you make this a non-issue in your household, I think your kids will follow suit.”

My husband and I have no lofty goals for our kids besides wanting them to be able to pay their own bills, not hurt themselves or anyone else, and be happy. “So many parents of twins try to even the playing field,” says Tingloff. “In my opinion, that’s a bad course of action because…kids then never develop a strong emotional backbone. Your job as a parent is to help them deal with the disappointments in life.”

We ended up putting our son in the Spectrum program and our daughter in the regular learning track. In the years to come, I will make sure that they understand that advanced or regular doesn’t mean better or worse, it just means different. I want both of my children to do the best they can, whether that means taking advanced classes or singing the hell out of the school musical.

When my daughter wanders through the house making up her own songs and singing at the top of her voice, I support her…most of the time. “Really encourage your daughter in the arts,” says Tingloff. “Find her spotlight. At some point her brother will look at her accomplishments and say, ‘Wow, I can’t do that.'” While I had been worrying all this time about my daughter feeling outshined by her brother, I had never considered that he might also feel outperformed by her.

Despite all of my talk about how my daughter’s interests were every bit as valid as her brother’s, I had not been treating them the same. I saw the dance and drama as diversions and hobbies. I never gave those talents the respect that I gave to her brother’s academic interests.

Now that I am more aware of how I have been valuing their different strengths, I’ll be able to give my daughter’s interests the same amount of focus and praise as her brother’s. Hopefully, I can assure them that our only concern is their happiness. Then my husband and son can go do math problems together, and take things apart to see how they work, and my daughter and I will lie on the grass and find shapes in the clouds while we wonder about the world and sing.

The truth is, both my kids are gifted.

 

TIME Opinion

Confessions of a Lumbersexual

Jordan Ruiz—Getty Images

Why plaid yoga mats and beards are the future

Several years ago I was riding in a van with two female friends in the front seats when one of them pointed out the window and yelled “Wait! Slow down…is that him?” We were passing the bar that employed her ex-boyfriend.

“I don’t know,” said her friend who was driving. “A guy in Brooklyn with a beard and a plaid shirt? Could be anyone.”

I looked down over my beard at my shirt and both girls looked at me and we all laughed.

I’ve had a beard most of my adult life and my wardrobe consists largely of cowboy-cut plaid shirts and Wrangler blue jeans. On cold days I wear a big Carhartt coat into the office. In my youth in Oklahoma I did cut down some trees and split firewood for use in a house I really did grow up in, but in those days I dressed like a poser gutter punk. I nurture an abiding love for outlaw country and bluegrass, though, again, during my actual lumberjacking days it was all Black Flag, Operation Ivy and an inadvisable amount of The Doors.

After a decade living in urban places like Brooklyn and Washington, I still keep a fishing rod I haven’t used in years, woodworking tools I shouldn’t be trusted with, and when I drink my voice deepens into a sort of growl the provenance of which I do not know. I like mason jars, craft beer and vintage pickup trucks. An old friend visiting me a few years ago commented, as I propped a booted foot against the wall behind me and adjusted the shirt tucked into my blue jeans, that I looked more Oklahoma than I ever did in Oklahoma.

I am a lumbersexual.

The lumbersexual has been the subject of much Internet musing in the last several weeks. The term is a new one on me but it is not a new phenomenon. In 2010 Urban Dictionary defined the lumbersexual as, “A metro-sexual who has the need to hold on to some outdoor based ruggedness, thus opting to keep a finely trimmed beard.” I was never a metrosexual and I’m actually most amused by Urban Dictionary’s earliest entry for lumbersexual, from February 2004: “A male who humps anyone who gives him wood.” But I do think defining the lumbersexual as a metrosexual grasping at masculinity gets at something.

It doesn’t take a lot of deep self-reflection to see that my lumbersexuality is, in part, a response to the easing of gender identities in society at large over the last few decades. Writing for The New Republic nearly 15 years ago, Andrew Sullivan observed that “many areas of life that were once ‘gentlemanly’ have simply been opened to women and thus effectively demasculinized.” The flipside of this happy consequence of social progress is a generation of men left a bit rudderless. “Take their exclusive vocations away, remove their institutions, de-gender their clubs and schools and workplaces, and you leave men with more than a little cultural bewilderment,” Sullivan wrote.

If not a breadwinner, not ogreishly aggressive, and not a senior member in good standing at a stuffy old real-life boy’s club, what is a man to be?

On the other hand, the upending of gender norms frees men in mainstream culture to do things verboten by a retrograde man-code once enforced by the most insecure and doltish among us. We carry purses now (and call them murses, or satchels, but don’t kid yourselves, fellas). We do yoga. That the ancient core workout is so associated with femininity that pop culture has invented the term “broga” only goes to show what a sorry state masculinity is in. The lumbersexual is merely a healthier expression of the same identity crisis.

Which is, I think (?), why I dress like a lumberjack (and a lumberjack from like 100 years ago, mind you; real lumberjacks today, orange-clad in helmets and ear protection, do not dress like lumbersexuals). As a 21st-century man who does not identify with the pickup artist thing or the boobs/cars/abs triad of masculinity on display in most 21st-century men’s magazines (Maxim et al), is not particularly fastidious or a member of any clearly identifiable subculture and who is as attracted to notions of old-timey authenticity as anyone else in my 20s-30s hipster cohort (all of you are hipsters, get over it), I guess this is just the fashion sense that felt most natural. I am actually fairly outdoorsy, in a redneck car-camping kind of way. Lumbersexuality just fit right, like an axe handle smoothed out by years of palm grease or an iPhone case weathered in all the right places to the shape of my hand.

There is a dark side to this lumbersexual moment, however. It’s an impulse evident in Tim Allen’s new show Last Man Standing. Whereas in the 1990s, Tim the Tool-Man Taylor from Home Improvement was a confident and self-effacing parody of the Man Cave, complete with silly dude-grunting and fetishizing of tools, Mike Baxter, played by Tim Allen in Last Man Standing, is an entirely un-self-aware, willfully ignorant reactionary. The central theme of the show is Baxter, in a household full of women, struggling to retain his masculinity, which is presumed to be under assault because of all the estrogen around. He does this through all manner of posturing, complaining and at times being outright weird. In an early episode, Baxter waltzes into the back office at his job in a big box store modeled on Bass Pro Shops and revels in the fact that it “smells like balls in here.” The joke is a crude attempt at celebrating maleness but it rings distressingly hollow to anyone who has spent any time in rooms redolent with the scent of actual balls. In later seasons the show softened, but the central concern of a man whose masculinity is under assault because he is surrounded by women speaks to this moment in our popular culture.

If my beard is a trend-inspired attempt to reclaim a semblance of masculinity in a world gone mad then so be it. Beats scrotum jokes.

TIME Family

Why Your Kids Don’t Thank You for Gifts

Images by Tang Ming Tung—Getty Images/Flickr RM

And how to help them develop some gratitude

When we shop for holiday gifts, many of us look for things that will make our children happy. We can’t wait to hear their appreciative cries of “thank you! thank you!” once the wrapping gets ripped off.

But here’s a tip: Don’t count on it.

In this season for thanks and giving, even the most thoughtful children may not offer much gratitude for the gadgets, gizmos, and games they receive. And you’d be wise not to expect it.

I’ve spent the last year living more gratefully because of a book I’m writing on the subject, so I’m confident that gratitude can make us happier, healthier, and even fitter. Seeing the world through grateful eyes can lower depression and improve sleep. It creates a pay-it-forward spirit that is good for the world. Encouraging children to write down events that made them grateful—and not just on Thanksgiving—can begin a habit that lasts a lifetime.

Read: What I’m Thankful For, by Nick Offerman, Wendy Davis and others

But gratitude for the endless stuff we buy them? All the research I’ve done has convinced me that it’s not going to happen. And there are several reasons why.

In one study, Yale assistant professor of psychology Yarrow Dunham found that 4- to 8-year-old kids responded differently when given a gift they thought they had earned versus one that was granted out of simple generosity. He called the earned gift an “exchange relationship.” The children were happy for the trinket but didn’t experience the deeper resonance of gratitude that might also make them more generous to others. The gift given for no reason, however, had a different emotional impact, and the children showed thanks by being more likely to share candies they received in a follow-up game.

As parents, we don’t consider our holiday gifts an “exchange relationship,” since we know the time, money, and effort we put into buying them. But kids have a different view. One mom told me that when she asked her 16-year-old son to thank her for buying him a cellphone, he said, “But that’s what moms do, isn’t it?” He wasn’t being rude—just practical.

From a teenager’s vantage, it’s a parent’s responsibility to take care of the family, and playing Santa is part of the job. According to Dunham, “when teenagers code it that way, a gift is no longer something given freely and voluntarily”—it’s just mom and dad living up to their obligation. And who’s going to be grateful for parents doing what they’re supposed to do?

Read: 40 Inspiring Motivational Quotes About Gratitude

Asking our children to be grateful for gifts is sending the wrong message, anyway. Cornell psychology professor Tom Gilovich has found that people are more likely to be grateful for experiences than for material possessions. A family dinner, a songfest around the fireplace, or even a hike in the woods creates a spirit of gratitude that outlasts even the nicest Nintendo.

Parents may get exasperated when a teenager tosses a new cashmere sweater on the floor, and, gratitude aside, we do have the right to demand good manners. Children should know to say thank you (profusely) to every parent, child, aunt, and uncle who gives them something.

But kids can’t know how blessed they are unless they have a basis for comparison. And they don’t learn that by a parent complaining that they’re ungrateful. We need to give our children the gift of a wider world view. Take them to a soup kitchen instead of to the mall. Become the secret Santa for a needy family. Show by example that gratitude isn’t about stuff—which ultimately can’t make any of us happy anyway. It’s about realizing how lucky you are and paying your good fortune forward.

My favorite idea: Collect all the charitable appeals you get this time of year into a big basket and find a night when the whole family can sit down together to go through them. You set the budget for giving and the kids decide how it’s distributed. Going through each request, you have the opportunity to discuss with children and teens (and also your spouse) what it means to need a food bank or to live in a part of the world where there is no clean water. You can talk about teenagers who are caught in war zones or those suffering from disabilities. Then write the checks together or go online and make your contributions.

Once the conversation about gratitude gets started, it’s much easier to continue all year. Set up a family ritual at bedtime where kids describe three things that made them grateful. When kids go off to college, text them a picture each week of something that inspired your appreciation. Whether it’s a friend, a snowflake, or a sunset, the spirit of the photos will help you (and them) see the world differently.

Teaching children to focus on the positive and appreciate the good in their lives is perhaps the greatest gift we can give them. And we can all learn together that the things that really matter aren’t on sale at a department store.

So I hope my kids will thank me for the gifts I buy them this year. But gratitude? That needs more than wrapping paper and a bow.

TIME Opinion

Feminist Is a 21st Century Word

From left: Gloria Steinem, Jane Fonda and Robin Morgan, co-founders of the Women's Media Center, on CBS This Morning in New York City on Sept. 18, 2013. CBS Photo Archive/Getty Images

Robin Morgan is an author, activist and feminist. She is also a co-founder, with Gloria Steinem and Jane Fonda, of the Women's Media Center

I know, I know, TIME’s annual word-banning poll is meant as a joke, and this year’s inclusion of the word feminist wasn’t an attempt to end a movement. But as a writer — and feminist who naturally has no sense of humor — banning words feels, well, uncomfortable. The fault lies in the usage or overuse, not the word — even dumb or faddish words.

Feminist is neither of those. Nevertheless, I once loathed it. In 1968, while organizing the first protest against the Miss America Pageant, I called myself a “women’s liberationist,” because “feminist” seemed so 19th century: ladies scooting around in hoop skirts with ringlet curls cascading over their ears!

What an ignoramus I was. But school hadn’t taught me who they really were, and the media hadn’t either. We Americans forget or rewrite even our recent history, and accomplishments of any group not pale and male have tended to get downplayed or erased — one reason why Gloria Steinem, Jane Fonda and I founded the Women’s Media Center: to make women visible and powerful in media.

No, it took assembling and researching my anthology Sisterhood Is Powerful to teach me about the word feminism. I had no clue that feminists had been a major (or leading) presence in every social-justice movement in the U.S. time line: the Revolutionary War, the campaigns to abolish slavery, debtors’ prisons and sweatshops; mobilizations for suffrage, prison reform, equal credit; fights to establish social security, unions, universal childhood education, halfway houses, free libraries; plus the environmental, antiwar and peace movements. And more. By 1970, I was a feminist.

Throughout that decade, feminism was targeted for ridicule. Here’s how it plays: first they ignore you, then laugh at you, then prosecute you, then try to co-opt you, then — once you win — they claim they gave you your rights: after a century of women organizing, protesting, being jailed, going on hunger strikes and being brutally force-fed, “they” gave women the vote.

We outlasted being a joke only to find our adversaries had repositioned “feminist” as synonymous with “lesbian” — therefore oooh, “dangerous.” These days — given recent wins toward marriage equality and the end of “don’t ask don’t tell” in the military, not to mention the popularity of Orange Is the New Black — it’s strange to recall how, in the ’70s, that connotation scared many heterosexual women away from claiming the word feminist. But at least it gave birth to a witty button of which I’ve always been especially fond: “How dare you assume I’m straight?!”

Yet in the 1980s the word was still being avoided. You’d hear maddening contradictions like “I’m no feminist, but …” after which feminist statements would pour from the speaker’s mouth. Meanwhile, women’s-rights activists of color preferred culturally organic versions: womanist among African Americans, mujerista among Latinas. I began using feminisms to more accurately depict and affirm such a richness of constituencies. Furthermore, those of us working in the global women’s movement found it fitting to celebrate what I termed a “multiplicity of feminisms.”

No matter the name, the movement kept growing. Along the way, the word absorbed the identity politics of the 1980s and ’90s, ergo cultural feminism, radical feminism, liberal/reform feminism, electoral feminism, academic feminism, ecofeminism, lesbian feminism, Marxist feminism, socialist feminism — and at times hybrids of the above.

Flash-forward to today when, despite predictions to the contrary, young women are furiously active online and off, and are adopting “the F word” with far greater ease and rapidity than previous feminists. Women of color have embraced the words feminism and feminist as their own, along with women all over the world, including Afghanistan and Saudi Arabia.

As we move into 2015, feminism is suddenly hot; celebrities want to identify with it. While such irony makes me smile wryly, I know we live in a celebrity culture and this brings more attention to issues like equal pay, full reproductive rights, and ending violence against women. I also know that sincere women (and men of conscience), celebs or not, will stay with the word and what it stands for. Others will just peel off when the next flavor of the month comes along.

Either way, the inexorable forward trajectory of this global movement persists, powered by women in Nepal’s rice paddies fighting for literacy rights; women in Kenya’s Green Belt Movement planting trees for microbusiness and the environment; Texas housewives in solidarity with immigrant women to bring and keep families together; and survivors speaking out about prostitution not being “sex work” or “just another job,” but a human-rights violation. From boardroom to Planned Parenthood clinic, this is feminism.

The dictionary definition is simple: “the theory of the political, economic, and social equality of the sexes.” Anyone who can’t support something that commonsensical and fair is part of a vanishing breed: well over half of all American women and more than 30% of American men approve of the word — the percentages running even higher in communities of color and internationally.

But I confess that for me feminism means something more profound. It means freeing a political force: the power, energy and intelligence of half the human species hitherto ignored or silenced. More than any other time in history, that force is needed to save this imperiled blue planet. Feminism, for me, is the politics of the 21st century.

Robin Morgan, the author of 22 books, hosts Women’s Media Center Live With Robin Morgan (syndicated radio, iTunes, and wmcLive.com).

TIME Culture

How the Cult of Early Success Is Bad for Young People

Photograph by Martin Schoeller for TIME

Taylor Swift and Malala Yousafzai are great role models. They've also set an impossible standard for success

Taylor Swift is on the cover of TIME magazine this week as the new queen of the music industry. She’s been in the business for more than 11 years, but at 24, she’d still have trouble renting a car.

It should be inspiring for young people to see someone so young achieve such phenomenal success. “Other women who are killing it should motivate you, thrill you, challenge you and inspire you rather than threaten you and make you feel like you’re immediately being compared to them,” she told my colleague Jack Dickey. “The only thing I compare myself to is me, two years ago, or me one year ago.”

But despite her best efforts to set a positive example, Swift also represents a generation of super-youth to which normal young people are inevitably compared. “You see someone so young, your age or even younger, being so wildly successful, and you can think ‘they just have it, they have something I don’t have,’” says Dr. Carol Dweck, a professor of Psychology at Stanford University and author of Mindset: The New Psychology of Success. “You think, ‘I’m so young and already I’m doomed.’”

Forget Forbes’s 30-under-30 list: when it comes to “freshness,” 30 is the new 40. At her age, Taylor Swift isn’t even considered precociously successful—she’s just regular successful. In fact, it’s been a banner year for wunderkinds, and not just in entertainment (which has always been fixated on the young and beautiful). 18-year-old Saira Blair just became the youngest American lawmaker when she was elected to the West Virginia Legislature. 18-year-old fashion blogger Tavi Gevinson took up a second career—as a Broadway star—as her magazine Rookie rakes in 3.5 million hits a month. 17-year-old Malala Yousafzai became the youngest person ever to win the Nobel Prize.

As most millennials move sluggishly through their twenties, the hyper-visible hotshots are getting younger and younger, steadily lowering the age by which someone is expected to get their “big break.”

For every young cultural force like Lena Dunham or genius app creator like Evan Spiegel, there are thousands of other twenty-somethings sitting in their parents’ basements wondering why they haven’t invented an app or started a fashion line. According to a Pew survey, young people today have more debt and less income than their parents and grandparents did at their age, which means we’re the least financially stable generation in recent memory. We’re making life decisions later than ever, delaying marriage and babies longer than previous generations did (partly because of the cash-flow problems), and taking much longer to settle into a career. Yet, thanks to platforms like YouTube and Kickstarter that remove the traditional gatekeepers, there’s a pervasive expectation that young people should be achieving more, faster, younger.

“There’s a lot of attention paid to people who have success very young, like Taylor Swift and Mark Zuckerberg, but the average young person is not coming into their career until later these days,” says Dr. Jean Twenge, author of Generation Me. “Across the board, what you can see is much higher expectations among millennials compared to Boomers and Gen Xers, but a reality which is if anything more difficult than it was for those previous generations when they were young.”

Middle-aged sourpusses have long complained about America’s cultural fixation on youth, and to be fair, the Beatles weren’t much older than Taylor Swift. Bill Maher even devoted a whole segment of last Friday’s “Real Time” to ageism, calling it “the last acceptable prejudice in America.” But today, the world is dominated by tech, and tech is dominated by young people. “I want to stress the importance of being young and technical,” Facebook founder Mark Zuckerberg said at Y Combinator’s Startup School at Stanford in 2007. “Younger people are just smarter.”

But even for those of us who happen to be young, a youth-obsessed culture is a pretty raw deal: the perception that young people are “smarter” implies they should be finding success more quickly, and often, they’re not. “In the internet age, the idea that fame is just out of reach has become more common,” says Dr. Twenge, noting that technological advances like YouTube helped launch the careers of stars like Justin Bieber. “I think there’s an impression that it’s easier to become famous now, or easier to be discovered… There’s a perception that it’s easier, but that may not be entirely true.”

That expectation that it’s easy to get rich and famous may also contribute to some of the negative stereotypes about millennials, especially the reputation for laziness or entitlement. In other words, next to Lorde, the rest of us look like schlubs.

“I don’t think they’re comparing themselves to those wunderkind necessarily, but maybe their elders are, who are so critical of them,” says Dr. Jeffrey Arnett, who coined the phrase “emerging adult” and says he’s found little evidence to support the claim that millennials are lazy. “I wonder if that’s partly related to the fact that you have these amazingly successful young people, and people are saying ‘well, if Mark Zuckerberg can do this, why can’t you?’”

Of course, none of these comparisons are Taylor Swift’s fault, and she does everything in her power to nix that competitive instinct, especially among other women. But the fact that young superstars seem to have been born fully formed implies that growth and learning aren’t part of the recipe for success. “It not only tells them they don’t have time to grow, it saps them of the motivation to grow,” Dr. Dweck says.

Even Taylor recognizes that her darling days are numbered. “I just struggle to find a woman in music who hasn’t been completely picked apart by the media, or scrutinized and criticized for aging, or criticized for fighting aging,” she said. “It just seems to be much more difficult to be a woman in music and to grow older.”

When politicians proclaim that “young people are the future,” they mean we’ll inherit mountains of debt and a destroyed environment. But when young people think about our own futures, we should look at the way middle-aged and older people are treated—because like it or not, that’s going to be us one day. If young people were really so smart, we wouldn’t forget that.

Read next: The Secret Language of Girls on Instagram
