Opinion - TIME
TIME life hacks

How Not to Be ‘Manterrupted’ in Meetings

Kanye West takes the microphone from Taylor Swift and speaks onstage during the 2009 MTV Video Music Awards on Sept. 13, 2009 Kevin Mazur—WireImage/Getty Images

A guide for women, men and bosses

Manterrupting: Unnecessary interruption of a woman by a man.

Bropropriating: Taking a woman’s idea and claiming credit for it.

We all remember that moment back in 2009, when Kanye West lunged onto the stage at the MTV Video Music Awards, grabbed the microphone from Taylor Swift, and launched into a monologue. “I’m gonna let you finish,” he said as he interrupted Swift as she was accepting the award for best female video. “But Beyoncé had one of the best videos of all time!”

It was perhaps the most public example of the “manterruption” – that is, a man interrupting a woman while she’s trying to speak (in this case, on stage, by herself, as an award honoree) and taking over the floor. At the VMAs it might have counted as entertainment, but ask any woman in the working world and we all recognize the phenomenon. We speak up in a meeting, only to hear a man’s voice chime in louder. We pitch an idea, perhaps too uncertainly – only to have a dude repeat it with authority. We may possess the skill, but he has the right vocal cords – which means we shut up, losing our confidence (or worse, the credit for the work).

We might have thought we were just being paranoid. But thanks to Sheryl Sandberg and Wharton business school professor Adam Grant (a man!) we can feel just a little less crazy when we mentally replay those meetings gone wrong. In a new op-ed in the New York Times, they point out the perils of “speaking while female,” along with a bevy of new research to prove that no, this is not all in our heads. (Disclaimer: I edit special projects for Sandberg’s women’s nonprofit, LeanIn.Org, though I did not edit her Times op-ed.)

Sandberg and Grant cite research showing that powerful male Senators speak significantly more than their junior colleagues, while female Senators do not. That male executives who speak more often than their peers are deemed more competent (by 10%), while female executives who speak up are judged less competent (by 14%). The data follows a long line of research showing that when it comes to the workplace, women speak less, are interrupted more, and have their ideas more harshly scrutinized.

“We’ve both seen it happen again and again,” Sandberg and Grant write. “When a woman speaks in a professional setting, she walks a tightrope. Either she’s barely heard or she’s judged as too aggressive. When a man says virtually the same thing, heads nod in appreciation for his fine idea.”

My friends have come up with terminology for it: Manterrupting. Manstanding. (Or talk-blocking, if you want the gender-neutral version.)

And the result? Women hold back. That, or we relinquish credit altogether. Our ideas get co-opted (bro-opted), re-appropriated (bro-propriated?) — or they simply fizzle out. We shut down, become less creative, less engaged. We revert into ourselves, wondering if it’s actually our fault. Enter spiral of self-doubt.

But there are things we can do to stop that cycle: women, men, and even bosses.

Know That We’re All a Little Bit Sexist — and Correct for It

The reality is that we all exhibit what scholars call “unconscious bias” — ingrained prejudices we may not even know we have. (Don’t think you’re among the culprits? Take this Implicit Association Test to be proved wrong.) When it comes to women, that bias is the result of decades of history; we’ve been taught that men lead and women nurture. So when women exhibit male traits – you know, decision-making, authority, leadership – we often dislike them, while men who exhibit those same traits are frequently deemed strong, masculine, and competent. It’s not only men who exhibit this bias: as one recent study found, women interrupt other women at work, too. But acknowledging that bias is an important step toward correcting for it.

Establish a No-Kanye Rule (Or Any Interruption, for That Matter)

When Glen Mazzara, a showrunner on The Shield, an FX TV drama from the early 2000s, noticed that his female writers weren’t speaking up in the writers’ room – or that when they did, they were interrupted and their ideas taken over — he instituted a no-interruption policy while writers (male or female) were pitching. “It worked, and he later observed that it made the entire team more effective,” Sandberg and Grant wrote.

Practice Bystander Intervention

Seriously, stop an interrupter in his (or her) tracks. Nudge him, elbow him, or simply speak up to say, “Wait, let her finish,” or “Hey, I want to hear what Jess is saying.” The words are your choice — but don’t stay silent.

Create a Buddy System With a Friend

Or, better yet, if you’re a woman, create a buddy system with a friend who is a dude. Ask him to nod and look interested when you speak (when he’s interested, of course). Let him back you up publicly in meetings. Seriously, try it. It’s not fair, no. But dammit, it works.

Support Your (Female) Colleagues

If you hear an idea from a woman that you think is good, back her up. You’ll have more of an effect than you think and you’ll establish yourself as a team player too.

Give Credit Where It’s Due

Yes, everyone wants credit for a good idea. But research shows that giving credit where it’s due will actually make you look better (as well as the person with the idea).

Women: Practice Assertive Body Language

Sit at the table, point to someone, stand up, walk to the front of the room, place your hand on the table — whatever it takes. Not only do these high-power poses make you appear more authoritative, but they actually increase your testosterone levels – and thus, your confidence. In some cases, it may actually help to literally “lean in”: in one study, researchers found that men physically lean in more often than women in professional meetings, making them less likely to be interrupted. Women more often leaned away — and were more likely to be interrupted.

… And Own Your Voice

Don’t undermine your authority with “I’m not sure if this is right, but—.” Speak authoritatively. Avoid the baby voice (leadership and authority are associated with the deep masculine voice, not with a softer, higher pitched tone). And please, whatever you do, don’t apologize before you speak.

Support Companies With Women in Power

We know that companies with more women on their corporate boards achieve better outcomes and higher returns. Teams with more diverse members perform better too. But having more women in power may actually encourage women to bring their ideas forward. In one study cited by Sandberg and Grant, researchers looked at the employees of a credit union where women made up 74% of supervisors and 84% of front-line employees. Shocker: women there were more likely to speak up, and be heard.

If all else fails, you can always learn how to talk really, really loud.

Jessica Bennett is a contributing columnist at Time.com covering the intersection of gender, sexuality, business and pop culture. She writes regularly for the New York Times and is a contributing editor on special projects for Sheryl Sandberg’s women’s nonprofit, Lean In. You can follow her @jess7bennett.

Read next: A Better Feminism for 2015


TIME

Pakistan’s New Strategy to Beat the Taliban

The Peshawar massacre must mark a turning point in Pakistan's battle against Taliban militants

Nearly a week after Pakistan’s worst-ever terrorist attack resulted in the death of 132 schoolchildren in Peshawar, the grief has turned to anger. As the Pakistan army pounds militant targets, the country’s politicians have achieved rare unity against the Taliban. For the first time, there are large protests outside mosques in Islamabad notorious for their pro-Taliban sympathies.

None of this should be surprising. No society can remain unmoved by the mass slaughter of its most vulnerable. That message appears to have finally registered with horror-hardened Pakistanis in a way that hasn’t been the case these past several years. “We are not making any differentiation,” Khawaja Muhammad Asif, the Defense Minister, said of the new approach. “All Taliban are bad Taliban.”

But many are right to question the durability of this new resolve. After all, in the past, Pakistan has seen assassinations, massacres of minorities, attacks on high-profile installations, even the seizure of large territory. Each time, there would be a bout of public outrage that would inevitably dissipate. Old arguments about whether the Taliban should be confronted or negotiated with would be revived.

This time, though, there is evidence of real change. Since the summer, the Pakistan military has been mounting an ambitious ground offensive in North Waziristan, the most hazardous of the country’s seven tribal areas. The armed forces had long resisted doing so out of fear of a backlash, despite repeated Western pressure. It took worsening violence from the militants and a new army chief to make a difference.

The Peshawar massacre demonstrates that the militants are being hurt by the offensive. They feel the need to raise the human cost to Pakistanis of such military operations—and they did so in blood. But this time, the politicians aren’t balking. They have resolved that this war is their own, and that they can no longer afford to discriminate between so-called “good Taliban”—those who operate in Afghanistan—and the “bad Taliban” fighting the military in Pakistan.

The problem in Pakistan hasn’t been support for the Taliban. That exists and exists still, as the well-attended funerals of militants hanged in the aftermath attest. The enthusiasts have always been a minority. The problem is with those who don’t believe the Taliban exist, pleading that Muslims could never slaughter coreligionists, fingering India, Afghanistan, the U.S. and Israel instead. And there are those who still see the militants as a merely misguided group that would cease its violence if the state stopped attacking them. These apologists and equivocators have long enjoyed prestige and influence in the Pakistani media.

The Pakistani leadership is finally taking a more clear-eyed view of the militant menace. They aim to destroy not only the Taliban, but, Defense Minister Asif told me, extremism altogether. “Extremism of any kind, of thought, action, religious or political extremism is bad,” he said. “We have to eliminate them wherever we find them.”

As for those preachers who continue to retain some affection for child-murderers, ordinary citizens are assailing them on the streets. On Monday, protesters gathered in five different cities across Pakistan to “reclaim their mosques” from Taliban sympathizers who abuse their pulpits to incite militant violence. They are calling on the police to arrest these imams, braving serious threats from militants.

There’s reason to be skeptical. As one Pakistani columnist sourly mused, there have been so many “last straws” in the struggle against the Taliban that there’s now a mountainous haystack. And the response so far has been characterized more by an immediate desire for vengeance than a long-term pursuit of justice. The execution of convicted militants gratifies widespread calls for revenge, and helps the government and military show people they are doing something.

But when facing an enemy that craves “martyrdom,” such measures hardly constitute a long-term strategy. For a state that has nurtured jihadists as instruments of official policy, and long encouraged its citizenry to look upon them as holy warriors, rolling back that history is a tremendous challenge.

In recent years, Pakistan has only ever fought militants when it felt it absolutely must. More often it has appeased them when it could. It has tolerated those that don’t attack the state directly. And it has steadily supported the ones who use its soil to launch attacks in Kashmir and Afghanistan. As some have quipped, it has been both “the fireman” and “the arsonist” of militancy.

Given the frailty of a state that can’t enforce basic laws, collect taxes or provide electricity, it would be foolish to expect Pakistan to mount a simultaneous assault on this bewildering array of scattered groups. But Pakistan does need to stop being the arsonist. In the short term, the militants that pose the greatest threat—the Pakistani Taliban—will have to be a priority. As the Taliban are targeted, the state will also have a responsibility to protect its citizens. More massacres would severely strain the new consensus. The government will also have to overhaul its security structure. In the cities, and in the largest province of Punjab, the sledgehammer of military action won’t be effective.

Pakistan will need civilian law-enforcement agencies that can act, but also prosecutors who can effectively bring culprits to justice and protect those who help the state in that task. One of the greatest scandals of this government has been the failure to prosecute the self-confessed killers of hundreds of Pakistani Shias, murdered by sectarian militants who regard them as infidels. The witnesses, judges, and prosecutors were too afraid of reprisals to act.

This won’t be a short war, either. Unlike the U.S. in Afghanistan, Pakistan cannot simply withdraw from the region. It has to stay— forever. In the long run, madrassas will have to be reformed, mosques cleared of extremist preachers, and militant groups defanged of their vast arsenals.

It will be a war whose end cannot be foreseen today. It is easy to sit in Western capitals and complain that Pakistan isn’t doing enough, as many argued last week. But from the point of view of a long-traumatized population that is repeatedly forced to lower its children into early graves, the sentiment trespasses the boundaries of taste. Pakistanis don’t want pity or sympathy. At this crucial moment, they deserve the world’s solidarity.

TIME

Why Having Kids Won’t Fulfill You

hand in hand
Getty Images

Jennifer Aniston, take note. You haven't failed as a woman if you don't have kids.

I was struck by the comments Jennifer Aniston made to Allure magazine this week about the badgering she gets on a topic that she finds painful: her lack of children. She tells the magazine: “I don’t like [the pressure] that people put on me, on women – that you’ve failed yourself as a female because you haven’t procreated. I don’t think it’s fair. You may not have a child come out of your vagina, but that doesn’t mean that you aren’t mothering — dogs, friends, friends’ children.” For Aniston, 45, the topic is fraught with emotion. “Even saying it gets me a little tight in my throat,” she said.

I thought about Aniston’s comments—what many women in their early 40s without children are forced to feel—and then I thought about my own life. In some respects I’m Aniston’s exact opposite: I’m a 41-year-old mother of two who spent my entire adult life telling myself that children were my destiny. I did what society and my family expected, never questioning the choice. But sometimes I wonder how much of the blueprint of my life was drawn by me, and how much was sketched by experiences I had when I was way too young to be the architect of my own destiny.

For all intents and purposes, my mother was a single parent. My father left when I was twelve, but long before then my mother had taken over the role of head of the household. She worked full-time as a waitress while my father flitted between different construction jobs. There always seemed to be an injury or a reason he wasn’t able to work. The image of him lying on our living room floor in front of our television is burned into my brain. He was there so much — diagonally and on his side with his head perched upon his hand — that I actually thought it was odd when I went to friends’ houses and their fathers weren’t in that position. I also found it odd that my friends’ parents shared a bedroom. My dad had taken up residence on the couch for so long, it seemed normal.

It was the obviously unhappy marriage that birthed the mantra my mother would repeat to me throughout my young life: “Do not depend on a man for anything.” That was followed closely by: “You and your sister are the best things I’ve ever done.” My mother made it clear that we were her reason for living. There was never a time I didn’t feel loved by my mother. But there was also a latent message that became clear after my father left: I am not alone because I have children. If it weren’t for you two I would be falling apart.

Before I hit adolescence, I decided that children were the only things that could fulfill me when I grew older.

“I’ve always wanted kids.” I don’t think I could possibly count the number of times in my life I have uttered those words. But, the same enthusiasm never escaped my lips when talking about marriage. I was never that girl who fantasized about her wedding day. So I skipped the marriage part, feeling like a renegade who was bucking the patriarchal confines of society.

It took five years for my partner and me to have a pregnancy that didn’t end in loss. After the third miscarriage, I began to panic: what if I really couldn’t have children? What would my life become? I was a bartender at the time that we were trying and my partner was a musician — we were in no way financially prepared for children. But the panic and fear that the narrative I had chosen for myself so many years earlier was not going to play out made me a woman consumed.

For five years we spent month after month trying for a child. The obsession I had with ovulation calendars and pregnancy tests only paused when a test came back positive, then the obsession switched to worrying about whether the pregnancy was going to last. I gave birth to a healthy baby boy in 2010, when I was thirty-eight. I was finally a mom.

My life changed — but only the daily tasks. I was still working full-time. Once we added a baby, the only difference was we now had no downtime. I was not a new person. I was the woman I had always been, I just added another label to my list of identifiers: friend, photographer, bartender, girlfriend, writer, mother. I reached the endgame, and nothing about myself had changed — save my ability to multitask.

My assumption that I was destined to be maternal made me never consider the idea that maybe I wasn’t. The possibility that I wasn’t actually hard-wired to mother never occurred to me until I looked into my child’s eyes for the first time and didn’t feel that thunderbolt everyone talks so much about. Those overwhelming feelings of love arrived eventually, but they certainly weren’t automatic.

Had we continued having infertility issues and not been able to conceive, I am certain that I would have felt that there was something “missing” from my life. But only because I believed the narrative my mother sold that children bring fulfillment. Since I’ve become a mother and seen that the essence of what makes me who I am has not changed, I’ve learned that nothing outside of you can fulfill you. Fulfillment is all about how you perceive the fullness or emptiness of your life. But how can a woman feel fulfilled if she’s constantly being told her life is empty without children? How can she ever feel certain she’s made the right decision if society is second-guessing her constantly?

There is nothing wrong or incomplete about building a life with a partner or alone, unburdened by the added stress of keeping another human being alive. This is something that men have always been allowed – women, not so much. A woman is constantly reminded of the ticking time bomb that is her biological clock. We don’t believe that a life without children is something a woman could possibly want. It’s why successful, wealthy women like Aniston are still asked the baby question every single time they sit down for an interview. Everyone is always looking for the latent sadness, the regret. What if it’s not there?

It’s been 40 years since the women’s liberation movement told us that just because we have a uterus doesn’t mean we have to use it. We still don’t believe it. Whether we realize it or not, the need to tap into our maternal side is so wired into our being that we can’t escape it. If we could, there wouldn’t be debates about whether women could “have it all” or whether we are turning against our nature if we decide not to procreate.

I never questioned my desire to have children, because I didn’t have to; I took the well-traveled road. That desire is expected of me – it’s expected of all women. It took me decades to realize that the maternal drive I carried with me my entire adult life, the one that led me to try for five years to have children, may not have been a biological imperative at all. It may just have been a program that was placed into my psyche by the repeated mantras of a woman who was let down by a man and comforted by her children. That’s okay. I love my children and I’m happy about the experiences I’ve had and the paths that have led me to this place. But if this isn’t your place—whether you’re a famous movie star or not– you didn’t take a wrong turn.

 

TIME Parenting

From BFF to ‘Friend Divorce:’ The 5 Truths We Should Teach Our Girls About Friendship

People walking, blurred motion
Getty Images

There's no such thing as a perfect friendship. It’s time to teach girls the truth about the complexities of BFFs.

Girls may love movies about fairytale princes, but their most captivating romance is with their friends. Every year, I stand on the stages of school auditoriums and ask thousands of girls this question: “How many of you have had a friend divorce?”

Instantly, a sea of hands shoot up in the air – this is not a term I need to define. The girls look around furtively, surprise spreading across their faces. They are astonished to discover they are not the only ones who have lost close friends.

That’s because girls receive unrealistic messages about how to have a friendship. Films and television see-saw between two extremes: mean girl-fests (think Real Housewives) and bestie love-fests (Sex and the City). Adults, meanwhile, aren’t always the perfect role models, either. The result is a steady diet of what I call “friendship myths”: find a best friend, and keep her forever. A good friendship is one where you never fight and are always happy. The more friends you have, the cooler you are.

These myths are all part of the pressure girls face to be “good girls”: liked by everyone, nice to all, and putting others before themselves. It’s a subject I wrote an entire book on, and one I see often with my students.

Research has found that girls who are more authentic in their friendships – by being open and honest about their true feelings, and even having conflicts – have closer, happier connections with each other. Yet when a girl’s social life goes awry, she often blames herself. Many girls interpret minor problems as catastrophes. Some may not even tell their parents out of embarrassment.

But there are things we can do to prepare girls for the gritty realities of real-life friendships. We can teach them that friendship challenges are a fact of life. That hiccups – a moody friend, a fight over a love interest, or a mean joke – are simply par for the course. And when we do? They probably won’t beat themselves up as much when conflicts happen. They’ll be more willing to seek out support and to move on. Instead of expecting perfection all the time, they’ll adapt more easily to stress.

Here are five hard but important truths we can teach our girls about their relationships — perhaps sparing them that traumatizing “friend divorce” later on.

There is no such thing as a perfect friendship.

A healthy friendship is one where you share your true feelings without fearing the end of the relationship. It’s also one where you sometimes have to let things that bug you slide. The tough moments will make you wiser about yourself and each other. They will also make you stronger and closer as friends.

You will be left out or excluded.

It may happen because someone is being mean to you, or because someone forgot to include you. It will happen for a big reason or no clear reason at all; it will have everything or nothing to do with you. You will feel sad about it, and as your parent, I will be there to support you.

No matter how hard you try, your apology may not be accepted.

Some people just can’t move on from a conflict. You are only responsible for your own actions, not others’. You cannot make anyone do anything they don’t want to do. If you have done everything you can to make things right on your side, all you can do is wait. Yes, you may wait a long time, maybe even forever, but I will be there to support you.

Friend divorce happens.

Just like people date and break up, friends break up, too. “Best friends forever” rarely ever happens; it’s just that no one talks about it. Friend divorce is a sign that something was broken in your relationship, and it creates space in your life to let the next good friend in. You may be heartbroken by this experience, but your heart is strong, and you will find a new close friend again soon. I will be there to support you.

Friendships ebb and flow.

There are times in every friendship when you or your friend are too busy to call, or are more focused on other relationships. It will hurt, but it’s rarely personal. Making it personal usually makes things worse, and being too clingy or demanding can drive a friend even further away. Like people, friendships can get “overworked” and need to rest. In the meantime, let’s figure out other friends you can connect with.

I know plenty of grown-ups who still haven’t learned these truths – and they can be painful. But that’s all part of friendship: understanding just how hard – but at the same time, rewarding — it can be.

 

Rachel Simmons is the co-founder of the Girls Leadership Institute and the author of the New York Times bestselling books “Odd Girl Out: The Hidden Culture of Aggression in Girls” and “The Curse of the Good Girl: Raising Authentic Girls With Courage and Confidence.” Follow her on Twitter @racheljsimmons.

TIME Opinion

Girl Gone Wild: The Rise of the Lone She-Wolf

Wild
Fox Searchlight

A woman on a solitary journey used to be seen as pitiful, vulnerable or scary. Not any more.

The first few seconds of Wild sound like sex. You hear a woman panting and moaning as the camera pans across the forest, and it seems like the movie is starting off with an outdoor quickie. But it’s not the sound of two hikers hooking up: it’s the sound of Cheryl Strayed, played by Reese Witherspoon, climbing a mountain all by herself.

It lasts only a moment, but that first shot contains everything you need to know about why Wild is so important. It’s a story of a woman who hikes the Pacific Crest Trail for 94 days in the wake of her mother’s death, but more than that, it’s a story of a woman who is no longer anything to anybody. We’re so used to seeing women entangled with other people (with parents, with men, with children, in neurotic friendships with other women), that it’s surprising, almost shocking, to see a woman who is gloriously, intentionally, radically alone.

When it comes to women onscreen, the lone frontier is the last frontier. It’s no big deal to see women play presidents, villains, baseball players, psychopaths, superheroes, math geniuses, or emotionally stunted losers. We’ve even had a female Bob Dylan. But a woman, alone, in the wilderness, for an entire movie? Not until now.

Which is unfair, considering all the books and movies dedicated to the often-tedious excursions of solitary men, from Henry David Thoreau to Jack Kerouac to Christopher McCandless. Audiences have sat through hours of solo-dude time in critically acclaimed movies like Cast Away, Into the Wild, Life of Pi, 127 Hours, and All Is Lost. America loves a Lone Ranger so much, even Superman worked alone.

In fact, the only thing more central to the American canon than a solitary guy hanging out in the woods is a guy on a quest (think Huckleberry Finn or Moby Dick). The road narrative may be the most fundamental American legend, grown from our history of pilgrimage and Western expansion. But adventure stories are almost always no-girls-allowed, partly because the male adventurer is usually fleeing from a smothering domesticity represented by women. In our collective imaginations, women don’t set out on a journey unless they’re fleeing from something, usually violence. As Vanessa Veselka writes in her excellent essay on female road narratives in The American Reader: “A man on the road is caught in the act of a becoming. A woman on the road has something seriously wrong with her. She has not ‘struck out on her own.’ She has been shunned.”

MORE: The Top 10 Best Movies of 2014

The ‘loner in nature’ and the ‘man on the road’ are our American origin stories, our Genesis and Exodus. They’re fables of an American national character which, as A.O. Scott pointed out in his New York Times essay on the death of adulthood in American culture, has always tended towards the boyish. Wild is the first big movie – or bestselling book, for that matter – to re-tell that central American story with a female protagonist.

But Wild is just the most visible example of what’s been a slow movement towards loner ladies onscreen. Sandra Bullock’s solo spin through space last year in Gravity was the first step (although her aloneness was accidental, and it was more a survival story than road narrative). Mia Wasikowska’s long walk across Australia in Tracks this year was another. But Wild, based on Strayed’s bestselling memoir and propelled by Witherspoon’s star power, is the movie that has the best shot at moving us past the now-tired “power woman” towards a new kind of feminist role model: the lone female.

Because for women, aloneness is the next frontier. Despite our chirpy boosting of “independent women” and “strong female leads,” it’s easy to forget that women can never be independent if we’re not allowed to be alone.

For men, solitude is noble: it implies moral toughness, intellectual rigor, a deep connection with the environment. For women, solitude is dangerous: a lone woman is considered vulnerable to attacks, pitiful for her lack of male companionship, or threatening to another woman’s relationship. We see women in all kinds of states of loneliness–single, socially isolated, abandoned–but almost never in a state of deliberate, total aloneness.

Not to mention the fact that women’s stories are almost always told in the context of their relationships with other people. Even if you set aside romance narratives, the “girl group” has become the mechanism for telling the stories of “independent” women– that is, women’s stories that don’t necessarily revolve around men. Think Sex & The City, Steel Magnolias, A League of Their Own, Sisterhood of the Traveling Pants, Girls: if a woman’s not half of a couple, she must be part of a gaggle.

When Cheryl Strayed describes her experience of “radical aloneness,” she’s talking about being completely cut off from human contact–no cell phone, no credit card, no GPS. But her aloneness is also radical in that it rejects the female identity that is always viewed through the lens of a relationship with someone else. To be alone, radically alone, is to root yourself in your own life, not the role you play in other people’s lives. Or, as Strayed’s mother Bobbi wistfully puts it, “I always did what someone else wanted me to do. I’ve always been someone’s daughter or mother or wife. I’ve never just been me.”

MORE: The Top 10 Best Movie Performances of 2014

And that’s the difference between aloneness and independence. The “independent woman” is nothing new– if anything, it’s become a tired catchphrase of a certain kind of rah-rah feminism. “Independence” implies a relationship with another thing, a thing from which you’re severing your ties. It’s inherently conspicuous, even performative. Female independence has become such a trope that it’s become another role for women to play: independent career woman, independent post-breakup vixen, independent spitfire who doesn’t care what anyone thinks. And usually, that “independence” is just a temporary phase before she meets a guy at the end of the movie who conveniently “likes a woman who speaks her mind.”

Aloneness is more fundamental, and more difficult. It involves cultivating a sense of self that has little to do with the motherhood, daughterhood, wifehood or friendship that society calls “womanhood.” When interviewed by the Hobo Times about being a “female hobo,” Strayed says: “Women can’t walk out of their lives. They have families. They have kids to take care of.” Aloneness then, isn’t just a choice to focus on one’s self– it’s also a rejection of all the other social functions women are expected to perform.

In 1995, when Strayed hiked for 94 days, that would have been hard. In 2014, it’s even harder. Thanks to the internet, our world is more social now than ever before, and it’s harder than ever to escape other people. But aloneness is at the root of real independence; it’s where self-reliance begins and ends. So these days, if you want to be independent, maybe you can start by trying to be alone.

Read next: Reese Witherspoon Isn’t Nice or Wholesome in Wild, and That’s What Makes It Great

TIME

Viral Threats

Militants of the Islamic State are seen before the explosion of an air strike on Tilsehir hill, near the Turkish border village of Yumurtalik in Sanliurfa province, Oct. 23, 2014. BULENT KILIC—AFP/Getty Images

Why combatting the extremists of ISIS is harder than fighting an Ebola outbreak

As images of brutal beheadings and dying plague victims compete for the world’s shrinking attention span, it is instructive to compare the unexpected terrors of the Islamic State of Iraq and Greater Syria (known as ISIS or ISIL) and Ebola. In October, the U.N. High Commissioner for Human Rights pointed out that “the twin plagues of Ebola and ISIL both fomented quietly, neglected by a world that knew they existed but misread their terrible potential, before exploding into the global consciousness.” Seeking more direct connections, various press stories have cited “experts” discussing the potential for ISIS to weaponize Ebola for bioterrorist attacks on the West.

Sensationalist claims aside, questions about similarities and differences are worth considering. Both burst onto the scene this year, capturing imaginations as they spread with surprising speed and severity. About Ebola, the world knows a lot and is doing relatively little. About ISIS, we know relatively little but are doing a lot.

In the case of Ebola, the first U.S.-funded treatment unit opened on Nov. 10—more than eight months after the epidemic came to the world’s attention. The U.S. has committed more than $350 million and 3,000 troops to this challenge to date. To combat ISIS, President Obama announced on Nov. 7 that he would be sending an additional 1,500 troops to Iraq to supplement his initial deployment of 1,500. And he has asked Congress for a down payment of $5.6 billion in this chapter of the global war on terrorism declared by his predecessor 13 years ago and on which the U.S. has spent more than $4 trillion so far.

Over recent centuries, medicine has made more progress than statecraft. It can be useful therefore to examine ISIS through a public-health lens. When confronting a disease, modern medicine begins by asking: What is the pathogen? How does it spread? Who is at risk? And, informed by this understanding, how can it be treated and possibly prevented?

About Ebola, we know the answers to each. But what about ISIS?

Start with identification of the virus itself. In the case of Ebola, scientists know the genetic code of the specific virus that causes an infected human being to bleed and die. Evidence suggests that the virus is animal-borne, and bats appear to be the most likely source. Scientists have traced the current outbreak to a likely animal-to-human transfer in December 2013.

In the case of ISIS, neither the identity of the virus nor the circumstances that gave rise to it are clear. Most see ISIS as a mutation of al-Qaeda, the Osama bin Laden–led terrorist group that killed nearly 3,000 people in the attacks on the World Trade Center and Pentagon in September 2001. In response to those attacks, President George W. Bush declared the start of a global war on terrorism and sent American troops into direct conflict with the al-Qaeda core in Pakistan and Afghanistan. In the years since, the White House has deployed military personnel and intelligence officers to deal with offshoots of al-Qaeda in Iraq (AQI), Yemen (AQAP), Syria (al-Nusra) and Somalia (al-Shabab).

But while ISIS has its roots in AQI, it was excommunicated by al-Qaeda leadership in February. Moreover, over the past six months, ISIS has distinguished itself as a remarkably purpose-driven organization, achieving unprecedented success on the battlefield—as well as engaging in indiscriminate violence, mass murders, sexual slavery and apparently even attempted genocide.

Horrifying as the symptoms of both Ebola and ISIS are, from an epidemiological perspective, the mere emergence of a deadly disease is not sufficient cause for global concern. For an outbreak to become truly worrying, it must be highly contagious. So how does the ISIS virus spread?

Ebola is transmitted only through contact with infected bodily fluids. No transfer of fluids, no spread. Not so for ISIS, where online images and words can instantly appear worldwide. ISIS’s leadership has demonstrated extraordinary skill and sophistication in crafting persuasive messages for specific audiences. It has won some followers by offering a sense of community and belonging, others by intimidation and a sense of inevitable victory, and still others by claims to restore the purity of Wahhabi Islam. According to CIA estimates, ISIS’s ranks of fighters tripled from initial estimates of 10,000 to more than 31,000 by mid-September. These militants include over 15,000 foreign volunteers from around the globe, including more than 2,000 from Europe and more than 100 from the U.S.

Individuals at risk of Ebola are relatively easy to identify: all have come into direct contact with the bodily fluids of a symptomatic Ebola patient, and almost all these cases occurred in just a handful of countries in West Africa. Once symptoms begin, those with the virus soon find it difficult to move, much less travel, for very long undetected.

But who is most likely to catch the ISIS virus? The most susceptible appear to be 18- to 35-year-old male Sunni Muslims, among whom there are many Western converts, disaffected or isolated in their local environment. But militants’ individual circumstances vary greatly, with foreign fighters hailing from more than 80 countries. These terrorists’ message can also inspire “lone wolf” sympathizers to engage in deadly behavior thousands of miles from any master planner or jihadist cell.

In sum, if Ebola were to become a serious threat in the U.S., Americans have the knowledge to stop it in its tracks. Imagine an outbreak in the U.S. or another advanced society. The infected would be immediately quarantined, limiting contact to appropriately protected medical professionals—thus breaking the chain of infection. It is no surprise that all but two of the individuals infected by the virus who have returned to the U.S. have recovered and have not infected others. Countries like Liberia, on the other hand, with no comprehensive modern public-health or medical system, face entirely different challenges. International assistance has come slowly, piecemeal and in a largely uncoordinated fashion.

Of course, if ISIS really were a disease, it would be a nightmare: a deadly, highly contagious killer whose identity, origins, transmission and risk factors are poorly understood. Facing it, we find ourselves more like the Founding Fathers of the U.S., who in the 1790s experienced seasonal outbreaks of yellow fever in Philadelphia (then the capital of the country). Imagining that it was caused by the “putrid” airs of hot summers in the city, President John Adams and his Cabinet simply left the city, not returning until later in the fall when the plague subsided. In one particularly virulent year, Adams remained at his home in Quincy, Mass., for four months.

Not until more than a century later did medical science discover that the disease was transmitted by mosquitoes and its spread could be stopped.

We cannot hope to temporarily escape the “putrid” airs of ISIS until our understanding of that scourge improves. Faced with the realities of this threat, how would the medical world suggest we respond?

First, we would begin with humility. Since 9/11, the dominant U.S. strategy to prevent the spread of Islamic extremism has been to kill its hosts. Thirteen years on, having toppled the Taliban in Kabul and Saddam Hussein in Baghdad, waged war in both Iraq and Afghanistan, decimated the al-Qaeda core in Pakistan and Afghanistan and conducted 500 drone strikes against al-Qaeda affiliates in Yemen and Pakistan, and now launched over 1,000 air strikes against ISIS in Iraq and Syria, we should pause and ask: Are the numbers of those currently infected by the disease shrinking—or growing? As former Secretary of Defense Donald Rumsfeld once put it: Are we creating more enemies than we are killing? With our current approach, will we be declaring war on another acronym a decade from now? As we mount a response to ISIS, we must examine honestly past failures and successes and work to improve our limited understanding of what we are facing. We should then proceed with caution, keeping in mind Hippocrates’ wise counsel “to help, or at least, to do no harm.”

Second, we would tailor our treatments to reflect the different theaters of the disease. Health care professionals fighting Ebola in West Africa face quite different challenges of containment, treatment and prevention than do their counterparts dealing with isolated cases in the Western world. Similarly, our strategy to “defeat and ultimately destroy” ISIS in its hotbed of Iraq and Syria must be linked to, but differentiated from, our treatment for foreign fighters likely to “catch” the ISIS virus in Western nations. While continuing to focus on the center of the outbreak, the U.S. must also work to identify, track and—when necessary—isolate infected individuals within its borders.

Just as Ebola quarantines have raised ethical debates, our response to foreign fighters will need to address difficult trade-offs between individual rights and collective security. Should citizens who choose to fight for ISIS be stripped of their citizenship, imprisoned on their return, or denied entry to their home country? Such a response would certainly chill “jihadi tourism.” Should potential foreign fighters be denied passports or have their travel restricted? How closely should security agencies be allowed to monitor individuals who visit the most extremist Salafist websites or espouse ISIS-friendly views? Will punitive measures control the threat or only add fuel to radical beliefs?

Finally, we should acknowledge the fact that for the foreseeable future, there may be no permanent cure for Islamic extremism. Against Ebola, researchers are racing toward a vaccine that could decisively prevent future epidemics. But the past decade has taught us that despite our best efforts, if and when the ISIS outbreak is controlled, another strain of the virus is likely to emerge. In this sense, violent Islamic extremism may be more like the flu than Ebola: a virus for which we have no cure, but for which we can develop a coherent management strategy to minimize the number of annual infections and deaths. And recalling the 1918 influenza pandemic that killed at least 50 million people around the world, we must remain vigilant to the possibility that a new, more virulent and contagious strain of extremism could emerge with even graver consequences.

Allison is director of the Belfer Center for Science and International Affairs at Harvard’s John F. Kennedy School of Government

TIME Opinion

The Problem With Frats Isn’t Just Rape. It’s Power.

The Phi Kappa Psi fraternity house at the University of Virginia in Charlottesville, Va., on Nov. 24, 2014. A Rolling Stone article alleged a gang rape at the house, which has since suspended operations Steve Helber—AP

Too many frats breed sexism and misogyny that lasts long after college. Why we need to ban them—for good.

At the university I called home my freshman year, fraternity row was a tree-lined street full of Southern style mansions, against a backdrop of the poor urban ghetto that surrounded the school. Off-campus frat parties weren’t quite how I pictured spending my weekends at a new school – I wasn’t actually part of the Greek system – but it became clear quickly that they were the center of the social structure. They controlled the alcohol on campus, and thus, the social life. So there I was, week after week, joining the throngs of half-naked women trekking to fraternity row.

We learned the rules to frat life quickly, or at least we thought we did. Never let your drink out of your sight. Don’t go upstairs – where the bedrooms were housed – without a girlfriend who could check in on you later. If one of us was denied entry to a party because we weren’t deemed “hot” enough – houses often ranked women on a scale of one to 10, with only “sixes” and up granted entry to a party – we stuck together. Maybe we went to the foam party next door.

In two years at the University of Southern California, I heard plenty of stories of women being drugged at frat parties. At least one woman I knew was date raped, though she didn’t report it. But most of us basically shrugged our shoulders: This was just how it worked… right?

If the recent headlines are any indication, it certainly appears so. Among them: women blacked out and hospitalized after a frat party at the University of Wisconsin, only to discover red or black X’s marked on their hands. An email guide to getting girls in bed called “Luring your rapebait.” A banner displayed at a Texas Tech party reading “No Means Yes, Yes Means Anal” – which happened to be the same slogan chanted by frat brothers at Yale, later part of a civil rights complaint against the university.

And now, the story of Jackie, who alleged in a Rolling Stone article — one that swiftly became the subject of a debate over fairness in reporting and whether the author was negligent in not reaching out to the alleged rapists — that she was gang raped by seven members of the Phi Kappa Psi house at the University of Virginia, and discouraged from pressing charges to protect the university’s reputation.

The alleged rape, it turned out, took place at the same house where another rape had occurred some thirty years prior, ultimately landing the perpetrator in jail.

“I’m sick about this,” says Caitlin Flanagan, a writer and UVA alumna who spent a year documenting the culture of fraternity life for a recent cover story in the Atlantic. “It’s been 30 years of education programs by the frats, initiatives to change culture, management policies, and we’re still here.”

Which raises the question: Why isn’t every campus in America dissolving its fraternity program — or at least instituting major, serious reform?

Not every fraternity member is a rapist (nor is every fraternity misogynistic). But fraternity members are three times more likely to rape, according to a 2007 study, which notes that fraternity culture reinforces “within-group attitudes” that perpetuate sexual coercion. Taken together, frats and other traditionally male-dominated social clubs (ahem: the Princeton eating club) crystalize the elements of our culture that reinforce inequality, both gender and otherwise.

For starters, they are insulated from outside perspective. It wasn’t until the late 1960s that Greek organizations eradicated whites-only membership clauses; as a recent controversy at the University of Alabama revealed, only one black student had been permitted into that Greek system since 1964. Throughout the country, the fraternities grew into a “caste system based on socioeconomic status as perceived by students,” John Chandler, the former president of Middlebury, which has banned frats on campus, recently told Newsweek.

And when it comes to campus social life, they exert huge social control: providing the alcohol, hosting the parties, policing who may enter–based on whatever criteria they choose. Because sororities are prohibited from serving alcohol, they can’t host their own parties; they must also abide by strict decorum rules. So night after night, women line up, in tube tops and high heels, vying for entrance. Even their clothes are a signifier of where the power lies. “Those with less power almost invariably dress up for those who have more,” Michael Kimmel, a sociologist at Stony Brook University, wrote in a recent column for TIME. “So, by day, in class, women and men dress pretty much the same … At parties, though, the guys will still be dressed that way, while the women will be sporting party dresses, high heels and make up.”

And when frat boys grow up? They slide right into the boys club of the business world, where brothers land Wall Street jobs via the “fraternity pipeline,” as a recent Bloomberg Businessweek piece put it — a place where secret handshakes mean special treatment in an already male-dominated field. Fraternities have graduated plenty of brilliant Silicon Valley founders: the creators of Facebook and Instagram, among others. They’ve also brought us Justin Mateen, the co-founder of Tinder, who stepped down amid a sexual harassment lawsuit, and Evan Spiegel, the Snapchat CEO, who recently apologized for e-mails sent while in the Stanford frat where Snapchat was founded, which discussed convincing sorority women to perform sex acts and drunkenly peeing on a woman in bed.

(VIDEO: My Rapist Is Still on Campus: A Columbia Undergrad Tells Her Story)

If we lived in a gender-equal world, fraternities might work. But in an age where one in five college women is raped or assaulted on campus, where dozens of universities are under federal investigation for their handling of it, and where the business world remains dominated by men, doesn’t the continued existence of fraternities normalize a kind of white, male-dominated culture that already pervades our society? There is something insidious about a group of men who deny women entry, control the No. 1 asset on campus – alcohol – and make the rules in isolated groups. “[Colleges] should be cultivating the kind of sensibility that makes you a better citizen of a diverse and distressingly fractious society,” Frank Bruni wrote in a New York Times column this week. “How is that served by retreating into an exclusionary clique of people just like you?”

The argument for Greek life – at least for the mainstream, largely white frats that seem to be the problem – goes something like this: It’s about fostering camaraderie. (According to a 2014 Gallup Poll, fraternity and sorority members have stronger relationships with friends and family than other college graduates.) It’s about community: As the Washington Post reported, chapters at UVA reportedly raised $400,000 for charity and logged 56,000 hours of community service during the past academic year. It’s part of a student’s right to congregate freely. And it’s about training future leaders: according to Gallup, fraternity and sorority members will end up better off financially, and more likely to start businesses, than other college graduates.

But the real benefit – at least the unspoken one – may be about money. Frats breed generous donors: as Flanagan pointed out in her Atlantic piece, fraternities save universities millions of dollars in student housing. At least one study has confirmed that fraternity brothers also tend to be generous to their alma maters.

All of which is part of the problem. Who wants to crack down on frats if it’s going to profoundly disturb campus life?

UVA, for its part, has suspended the frat in question until the new year, a move the Inter-Fraternity Council described as a helpful opportunity for UVA’s Greek system to “take a breath.” The university’s president has said that the school “is too good a place to allow this evil to reside.” But critics saw the punishment as a slap on the wrist: a suspension, when most students are out of town for the holidays?

There are other options on the table: The school is reportedly considering proposals to crack down on underage drinking and even a ban on alcohol. Other universities have explored making fraternities co-ed. And there’s some evidence that fraternity brothers who participate in a rape prevention program at the start of the academic year are less likely to commit a sexually coercive act than a control group of men who also joined fraternities.

Yet all the while, the parade of ugly news continues. A group of frat brothers at San Diego State University interrupted a “Take Back the Night” march last week by screaming obscenities, throwing eggs and waving dildos at marchers. The next night, a woman reported she was sexually assaulted at a party near the school’s campus; she was the seventh person to come forward this semester. And on Monday, Wesleyan announced that its Psi Upsilon fraternity would be banned from hosting social events until the end of 2015, also because of rape accusations.

Fraternities have created something that’s fairly unique in the modern world: a place where young men spend three or four years living with other men whom they have vetted as like them and able to “fit in.” What do you expect to happen at a club where women are viewed as outsiders, or commodities, or worse, as prey, and where men make the rules? It should be no surprise they end up recreating the boys club — and one that isn’t all so great for the boys, either.

Jessica Bennett is a contributing columnist at Time.com covering the intersection of gender, sexuality, business and pop culture. She writes regularly for the New York Times and is a contributing editor on special projects for Sheryl Sandberg’s women’s non-profit, Lean In. You can follow her @jess7bennett.

Read more views on the debate about preventing sexual assault on campus:

Caitlin Flanagan: We Need More Transparency on the Issue of Fraternity Rape

A Lawyer for the Accused on Why Some Rules About Consent Are Unfair to Men

Ban Frat Parties–Let Sororities Run the Show

TIME Opinion

Why Ferguson Should Matter to Asian-Americans

A female protester raises her hands while blocking police cars in Ferguson, Mo. on Nov. 25, 2014. Adrees Latif—Reuters

Ferguson isn’t simply black versus white

A peculiar Vine floated around social media Monday evening following the grand jury announcement in Ferguson, Mo. The short video shows an Asian-American shopkeeper standing in his looted store, with a hands-in-his-pockets matter-of-factness and a sad slump to his facial expression. “Are you okay, sir?” an off-screen cameraman asks. “Yes,” the storeowner says, dejectedly.

The clip is only a few seconds, but it highlights the question of where Asian-Americans stand in the black and white palette often used to paint incidents like Ferguson. In the story of a white cop’s killing of a black teen, Asian-Americans may at first seem irrelevant. They are neither white nor black; they assume the benefits of non-blackness, but also the burdens of non-whiteness. They can appear innocuous on nighttime streets, but also defenseless; getting into Harvard is a result of “one’s own merit,” but also a genetic gift; they are assumed well-off in society, but also perpetually foreign. Asian-Americans’ peculiar gray space on the racial spectrum can translate to detachment from the situation in Ferguson. When that happens, the racialized nature of the events in Ferguson loses relevance to Asian-Americans. But seen with a historical perspective, it’s clear that such moments are decidedly of more colors than two.


Michael Brown’s death has several parallels in Asian-American history. The first to come to mind may be the story of Vincent Chin, a Chinese-American killed in 1982 by a Chrysler plant superintendent and his stepson, both white, neither of whom served jail time for the racially motivated killing; like Brown, Chin became a rallying point for his community’s demand for equal protection under the law. Most direct parallels, however, have had one distinct dissimilarity to Ferguson: they have not spurred widespread resistance, nor have they left a visible legacy.

There is the story of Kuanchang Kao, an intoxicated Chinese-American fatally shot in 1997 by police threatened by his “martial arts” moves. There is Cau Bich Tran, a Vietnamese-American killed in 2003 after holding a vegetable peeler, which police thought was a cleaver. There is Fong Lee, a Hmong-American shot to death in 2006 by police who believed he was carrying a gun. None of the three cases resulted in criminal charges against the police or in public campaigns that turned the victim’s memory into a commitment to seek justice. One op-ed even declared how little America learned from Tran’s slaying.

While Ferguson captures the world’s attention, why do these Asian-American stories remain comparatively unknown?

One possible answer could be found in the model minority myth. The myth, a decades-old stereotype, casts Asian-Americans as universally successful, and discourages others — even Asian-Americans themselves — from believing in the validity of their struggles. But as protests over Ferguson continue, it’s increasingly important to remember the purpose of the model minority narrative’s construction. The doctored portrayal, which dates to 1966, was intended to shame African-American activists whose demands for equal civil rights threatened a centuries-old white society. (The original story in the New York Times put forward an image of Japanese-Americans quietly rising to economic success despite the racial prejudice responsible for their unjust internment during World War II.)

Racial engineering of Asian-Americans and African-Americans to protect a white-run society was nothing new, but the puppeteering of one minority to slap the other’s wrist was a marked change. The apparent boost of Asian-Americans suggested that racism was no longer a problem for all people of color — it was a problem for people of a specific color. “The model minority discourse has elevated Asian-Americans as a group that’s worked hard, using education to get ahead,” said Daryl Maeda, a professor of ethnic studies at the University of Colorado, Boulder. “But the reality is that it’s a discourse that intends to pit us against other people of color. And that’s a divide and conquer strategy we shouldn’t be complicit with.”

Through the years, that idea erased from the public consciousness the fact that the Asian-American experience was once a story of racially motivated legal exclusion, disenfranchisement and horrific violence — commonalities with the African-American experience that became rallying points in demanding racial equality. That division between racial minorities also erased a history of Afro-Asian solidarity born of the shared experience of sociopolitical marginalization.

As with Ferguson, it’s easy to say the Civil Rights movement was entirely black and white, when in reality there were many moments of interplay between African-American and Asian-American activism. Japanese-American activist Yuri Kochiyama worked alongside Malcolm X until he was assassinated in front of her. Groups protesting America’s involvement in the Vietnam War, like the student-run Third World Liberation Front, united resisters across racial lines under a collective radical political identity. W.E.B. DuBois called on African Americans to support the 1920s Indian anti-colonial resistance, which he compared to whites’ oppression of blacks. Chinese-American activist Grace Lee Boggs, who struggled as a female scholar of color, found passion in fighting similar injustices against African-Americans alongside C.L.R. James in the 1950s. Though Afro-Asian solidarity wasn’t the norm in either group’s resistance movements, these examples highlight the power of cross-racial resistance and the hardships the two groups shared as non-whites.

The concept of non-whiteness is one way to begin the retelling of most hyphenated American histories. In Asian-American history, non-whiteness indelibly characterized the first waves of Asians arriving in America in the mid-1800s. Cases like People v. Hall (1854) placed them alongside unfree blacks, in that case by ruling that a law barring blacks from testifying against whites was intended to block non-white witnesses, while popular images documented Asian-American bodies as dark, faceless and indistinguishable — a racialization reinforced by the white supremacy of Manifest Destiny and naturalization law. Non-whiteness facilitated racism, but in time it also facilitated cross-racial opposition. With issues like post-9/11 racial profiling, anti-racism efforts continue to uphold this tradition of a shared non-white struggle.

“This stuff is what I call M.I.H. — missing in history,” said Helen Zia, an Asian-American historian and activist. “Unfortunately, we have generations growing up thinking there’s no connection [between African-Americans and Asian-Americans]. These things are there, all the linkages of struggles that have been fought together.”

The disassociation of Asian-Americans from Ferguson — not just as absent allies, but as forgotten legacies — is another chapter in that missing history. In the final moments of the Vine depicting the Asian-American shopkeeper’s looted store, the cameraman breaks a brief pause in their conversation with a last thought. “It’s just a mess,” he says. The observation, however simplistic, holds a truth: as an Asian-American caught as collateral damage in a climate so often rendered in black and white, the shopkeeper, like all of Ferguson, must first clean up, and then reassess the unfolding reality outside.

TIME

When One Twin is More Academically Gifted

My son tested into the gifted program at school, but my daughter didn't. Should I split them up?

Splitting up twins in school is never easy. But splitting up twins so that one goes on the advanced learning track and the other follows the regular program is one of the most agonizing decisions a parent can face. And no amount of Internet searching will give you helpful advice. The consensus: Figure it out, parents. That’s what you’re (not) paid for.

As you may have guessed, I have twins, a boy and a girl, and they’re in the first grade. I happen to be a fraternal twin myself, so I’m sensitive to always being compared to a sibling. My son is like his engineer father — completely committed to being a lovable nerd. The other day he found a book of math problems at Barnes and Noble and was so excited it was as if Santa had arrived, handed him a gift, and then let him ride a reindeer. My daughter is like her freelance writer mother – studying is not really her thing. She reminds me of the prince in Monty Python and the Holy Grail who is to inherit a large amount of land and says, “But I don’t want any of that. I’d rather sing!” That’s my girl.

We were first introduced to the Spectrum (advanced learning) program at our school in Seattle, Washington, last year, at the beginning of kindergarten. The kids could be tested that year and would enter the program—or not—in first grade. I hadn’t really thought about whether to have my kids tested. Other parents apparently had. One asked: “Should we have our child practice at home with the same kind of mouse they’re going to use in the test?”

In the beginning, my husband and I laughed at the idea of advanced learning in the first grade. We joked about “Level Two Crayons” and “Expert Alphabet.” But then, as the day to decide about testing came closer, we started hearing from our son’s teacher about how gifted he was. What first grader wants to practice math and reading on his own during the evenings and weekends? My son. And then there was my daughter, who was right on track, but, like most kids her age, was happy to leave school stuff at school. “Let’s just get them both tested and see what happens,” I said.

As far as my kids knew, they were just going to school to talk about what they know and what they don’t. They were never told that the results of the test had any sort of consequences and weren’t the least bit curious. But when we got the results–my son tested into the advanced program and my daughter didn’t–I immediately became anxious. I wanted to let my son move into the advanced program because I knew he would love it and thrive. But I worried for my vibrant, passionate daughter who at the age of six doesn’t think she has any limits. How was I going to separate her from her brother because he could do something better?

As a child I never felt smart enough. Not because of my twin sister, but because of my mother, who was brilliant. She used her intelligence to get off the Kentucky farm where she grew up and into a New York City law firm. She placed a lot of value on the power of education and what good grades could do. I felt perpetually unable to meet her high expectations. Now I had a daughter who, in kindergarten, was already resistant to doing her reading homework. I was terrified that placing her brother in a higher academic track would hurt my daughter’s self-esteem.

I contacted Christina Baglivi Tingloff from the site Talk About Twins. She’s a mother of adult twins and author of six books, including Double Duty and Parenting School-Age Twins and Multiples. “It’s tough when twins differ in abilities,” she says, “and I’d say that it’s the biggest challenge of parenting multiples. [But] kids take their cues from their parents. If you make this a non-issue in your household, I think your kids will follow suit.”

My husband and I have no lofty goals for our kids besides wanting them to be able to pay their own bills, not hurt themselves or anyone else, and be happy. “So many parents of twins try to even the playing field,” says Tingloff. “In my opinion, that’s a bad course of action because…kids then never develop a strong emotional backbone. Your job as a parent is to help them deal with the disappointments in life.”

We ended up putting our son in the Spectrum program and our daughter in the regular learning track. In the years to come, I will make sure that they understand that advanced or regular doesn’t mean better or worse, it just means different. I want both of my children to do the best they can, whether that means taking advanced classes or singing the hell out of the school musical.

When my daughter wanders through the house making up her own songs and singing at the top of her voice, I support her…most of the time. “Really encourage your daughter in the arts,” says Tingloff. “Find her spotlight. At some point her brother will look at her accomplishments and say, ‘Wow, I can’t do that.'” While I had been worrying all this time about my daughter feeling outshined by her brother, I had never considered that he might also feel outperformed by her.

Despite all of my talk about how my daughter’s interests were every bit as valid as her brother’s, I had not been treating them the same. I saw the dance and drama as diversions and hobbies. I never gave those talents the respect that I gave to her brother’s academic interests.

Now that I am more aware of how I have been valuing their different strengths, I’ll be able to give my daughter’s interests the same amount of focus and praise as her brother’s. Hopefully, I can assure them that our only concern is their happiness. Then my husband and son can go do math problems together, and take things apart to see how they work, and my daughter and I will lie on the grass and find shapes in the clouds while we wonder about the world and sing.

The truth is, both my kids are gifted.

