Bobbie Jo Racette auditioned fifty potential business partners before founding her company, meeting each candidate at an Ordinary Joe’s restaurant in Calgary. She had a hard time finding a match.
Maybe fifteen of the fifty had a problem with her being gay.
“They would hint: ‘Well, that we can hide.’ Or, ‘We don’t need to tell people that you’re gay,’” says Racette. “And I’m like, ‘Oh no, no, no. You’re not putting me back in the closet.’”
That was in 2015. Racette is now the COO of that business, a virtual personal assistant service called Virtual Gurus. She spoke on a panel of young entrepreneurs at the 5th Annual LGBT Summit of the Americas, which was held this year in Toronto to coincide with Pride.
The summit is a conference for LGBT business people, and anyone who wants to do business with them. It might sound like another instance of corporate opportunism, in the vein of the rainbow-decked banks that will ditch their pride flags by the end of the month.
But organizers and speakers say they’re working for the economic advancement of LGBT people. They say it’s wrong to think of being queer as a purely personal matter, which has no place in the professional world. And they’re ready to make the case that inclusion is good business, too.
The Canadian Gay and Lesbian Chamber of Commerce, which hosted the conference, estimates that there are 140,000 LGBT-owned businesses in Canada, with an annual buying power of $90 billion. That doesn’t count the productivity of around 2 million LGBT people. When those people are marginalized at work or excluded from the labour force, the chamber argues, they can’t reach their economic potential.
The four previous LGBT business summits were held in Mexico, Peru, Colombia, and Costa Rica to mark the launch of LGBT chambers of commerce in those countries. Organizers of this LGBT business movement argue that including queer workers allows them to build up power and visibility while contributing more to the national economy.
Researchers at the Williams Institute at UCLA, who presented at the conference, have found that LGBT inclusion is closely tied to economic development in countries around the world. They argue that police abuse, violence, workplace discrimination, and exclusion from health services and education all push LGBT people out of the workforce.
Their 2014 study concluded: “At this micro-level, the costs to the economy of just these five examples of types of exclusionary treatment include lost labour time, lost productivity, underinvestment in human capital, and the inefficient allocation of human resources.”
In Canada, the barriers to inclusion look different—sometimes more subtle—but are no less real. Racette says she has run into obstacles around appearance, and the tacit demands of an office culture.
She once spent an unhappy six months as an office administrator where she wasn’t out at work. Racette, who came to the conference in grey slacks and a black button-down with sleeves rolled up to show her intricate tattoos, says she felt pressure to hide her forearms, dress “girlie,” and not talk about her personal life. It was demoralizing, she said.
“I stood out because everyone else in the office was wearing skirts and high heels,” she said. “That’s what they wanted.”
No one in the hotel ballroom seemed shocked to hear about the thinly veiled homophobia of Racette’s would-be business partners. Many have heard this message for years: be gay if you want to, but don’t bring it to work.
But Racette is among a younger generation of LGBT entrepreneurs who won’t compromise or conceal their identity. To them, being visible at work is not only a question of justice. It also makes for a better workplace.
“I’m going to be who I am,” said Racette. “I’m going to be tattooed. I’m going to be gay. And I’m always going to be First Nations.”
When she started thinking about her own company, Racette didn’t want a business partner who told her to conceal her sexuality, because she didn’t want a company where anyone would have to. Today, she has a lot of queer people working for her.
“I feel like a lot of them come to me because they feel like they can,” said Racette. “When they start working with me, they realize that I’m completely out, and I’m proud, and I treat everyone the same.” That encourages her employees to come out at work.
Her message is: “It’s comfortable. If you’re bisexual, you’re bisexual. No haters. No judging. Like, don’t worry about it.” That goes for clothing, tattoos, and hairstyle, too.
Racette says her employees sometimes worry that their appearance might alienate new clients: “They’re coming for the quality service, not for the way you look,” she tells them. And it seems to work. These days, Racette says, she has more clients than she can handle. And she gets the best from her employees because they feel comfortable.
The corporate world in Canada is beginning to see the economic case for inclusion too. Conference goers wore their nametags on green TD lanyards. The bank was a sponsor, as were accounting behemoths Ernst & Young and KPMG.
Mary Lou Maher, KPMG’s chief inclusion officer, spoke earlier in the day, before Racette’s panel. Her presence and her title both show the progress that’s been made in recent years. But her remarks were also a reminder of how recent that progress is.
“I came out when I made partner,” she said. That was 1995, twelve years after she joined the firm. Many at the conference, Maher said, would remember what it was like to get up in the morning and put on a mask before going to work. She recalled carefully watching the pronouns she used when she answered water-cooler questions like “What did you do last weekend?”
Maher’s two minutes on stage were as good an answer as any to the questions that are often used to justify queer exclusion at work: what does being gay have to do with business anyway? Why bring sexuality into it at all?
Every business is made up of people. Some of those people are going to be gay, or lesbian, or bisexual, or trans—whether you can see it or not. If you don’t include them, you may lose them or saddle them with an unjust and unnecessary burden.
Maher ended her remarks with a nod to the catchphrase of the conference. You should, she said, be able to “bring your whole self to work.”
The first Democratic presidential debate of the 2016 campaign had just begun when the moderator, CNN’s Anderson Cooper, summoned the spectre that many assumed would devour Senator Bernie Sanders’ candidacy. “You call yourself a democratic socialist,” Cooper said. “How can any kind of socialist win a general election in the United States?”
The question was presented as its own answer: only a politician of truly colossal naiveté would stand on a Las Vegas stage and expound the virtues of Scandinavian-style democratic socialism. Surely, the candidate would have no choice but to backpedal, re-frame his politics as “progressive,” and maybe mumble a few conciliatory words about hardworking American families. However, Sanders seemed to believe that he could win because of his socialism, that all he had to do was explain what his form of socialism meant—an alternative to a “rigged economy” in which “the top one-tenth of one percent own almost as much wealth as the bottom 90 percent.” It meant providing healthcare as a human right, and paid medical and family leave to end the American travesty of separating mothers from their newborn babies. It meant a strategic deployment of Scandinavian solutions, to “learn what they have accomplished for their working people.”
But Cooper was less impressed with the details of Sanders’s politics than he was with the very fact of his self-proclaimed socialism. Cooper concluded his Joseph McCarthy routine by asking if any of the other four candidates on the stage was “not a capitalist.” No hands went up, which was, of course, the point. Sanders, viewers were to understand, was a relic, a political dinosaur whose ideological DNA had somehow been preserved in amber. His platform—free post-secondary education, wealth redistribution, and more regulations for Wall Street—bore the hallmarks of an unregenerate leftist. Had the Berlin Wall not been sledgehammered down, just as Reagan demanded? Had the Soviet Union not collapsed beneath the weight of its own bloody contradictions? Were we seriously debating whether a socialist could win the White House in 2016?
In fairness, Cooper was only channeling the collective wisdom of the entire American political establishment. Indeed, across the West, few factions are more ferociously committed to the “death of socialism” narrative than centre-left parties themselves. Witness the particular malice with which former British prime minister Tony Blair treated Labour leader Jeremy Corbyn during Britain’s recent election. For a Third Way centrist like Blair—who eliminated “common ownership of the means of production” from the Labour constitution in 1995—Corbyn, an unapologetic socialist with plans to nationalize the railways, scrap tuition, and abandon Britain’s nuclear arsenal, represents a stale menu of policies rejected around the world a generation ago. Today’s voters “do not think their challenges can be met by old-fashioned state control as the way to personal or social empowerment,” Blair wrote in the Guardian, “and they realize that a party without a serious deficit-reduction plan is not in these times a serious contender to govern them.”
Over the past few decades, the assumed victory of laissez-faire capitalism over socialistic alternatives has been the sine qua non of Western economic policy. Austerity, de-regulation, de-unionization, trade liberalization, tax cuts—the free-market fundamentalism underlying these policies is not, we are told, a contestable ideological position, but rather economic reality. Anyone who dares challenge the essential wisdom of the market is labeled an irresponsible fantasist, unworthy of the people’s trust. Indeed, partly because of Corbyn’s leadership, pollsters heading into Britain’s June election predicted a historic victory for the incumbent Conservatives (who already held a majority), forecasting their strongest electoral showing since 1979.
We know how that turned out: significant gains (and political vindication) for Corbyn’s Labour, while Theresa May’s diminished Tories now hobble along with a minority government. This outcome was just the latest instance in which a conservative party in the West managed to snatch defeat from the jaws of presumptive victory. 2017, widely assumed to be the year in which right-wing nationalism would go viral, has instead seen electoral defeats for populists in Austria (Hofer), the Netherlands (Wilders), and France (Le Pen). Kellie Leitch, Canada’s contribution to this pathetic pantheon, never received more than 8 percent of the vote through nine rounds of ballots in the Conservative Party leadership contest.
Far from the predicted ascension of right-wing nationalism, 2017 has seen a generational revival on the left. An increasingly educated electorate is capable of repudiating the atrocities perpetrated in the names of Marx and Lenin while also recognizing that specific, achievable goals—a livable minimum wage or guaranteed annual income, universal healthcare, reduced income inequality—are properly called socialist goals, and that their realization would enable better lives for more people. In Canada, with the NDP leadership race now underway, it seems inevitable that at least one candidate will look at the popularity Corbyn and Sanders were able to garner in a short period and say, Why not here?
The case for a socialist NDP platform finds support in the party’s recent electoral fortunes: when Thomas Mulcair maneuvered the party to the centre in 2015, the NDP lost 51 seats, a greater share of the popular vote than Stephen Harper’s Conservatives lost, and its Official Opposition status. Yet the victory of an unabashedly socialist platform today is far from certain. Corbyn’s success, many have argued, is inextricable from Theresa May’s failure, while Sanders’ ascendance was grounded in the perception that Hillary Clinton’s priorities lay with preserving an economic order that had left working-class Americans behind. In short, the socialist surge was animated by British and American anxieties that are mostly absent in Canada, where the politics of austerity are more muted and social programs are not under comparable threat.
At the same time, there’s no denying that some economic and environmental anxieties transcend borders—particularly among millennials, who are rapidly becoming the largest voting demographic in the West. As the dream of home ownership recedes further into the realm of fantasy, young, urban voters in Canada could be receptive to housing policy akin to Corbyn’s right-to-buy scheme, which would regulate rental markets and guarantee tenants the opportunity to buy their homes at subsidized mortgage rates. At a time when more young Canadians than ever are attending post-secondary education—and when more parents than ever are paying for that education—tuition relief policies, embraced by both Sanders and Corbyn, could also resonate here. And as the Trudeau government approves more pipelines and encourages further tar sands development, space emerges on the left for a more credible environmental policy.
If the NDP has anything to learn from Sanders and Corbyn, however, the lesson must include style as well as substance. Both of these seasoned socialists speak in terms of class consciousness—even class warfare—that accurately register the stark realities of wealth and income inequality in modern capitalism. Regardless of policy, Sanders and Corbyn have proven the rhetorical efficacy of calling for political “revolution” and of framing the interests of their constituents in direct contrast to those of the “billionaire class,” and the NDP should inject its arguments with similar brio. At a minimum, the party must dispense with the stale canard that left-wing parties must accept market-based “realities,” scrub off that unionist stench, get cuddly with corporations, and fight for the scraps of the political centre.
Yes, centrism scored a recent victory in France, where Emmanuel Macron’s En Marche party achieved an absolute majority—though not before a Communist-backed far-left candidate, Jean-Luc Mélenchon, took nearly 20 percent of the first-round presidential vote. But if Canada’s Liberals fail to address anxieties on housing, the economy, and the environment, the Trudeau dynasty could be radically abridged by an NDP that promises to do just that.
The unexpected ascendance of socialist politics is not so much a repudiation of populism as it is another manifestation of the same underlying discontent. That malaise is an effect with multiple causes, including the wholesale “unwinding” (to invoke George Packer’s indispensable book) of entire industries such as steel, tobacco, and manufacturing; the long-term erosion of unions, religious institutions, and public schools; increased automation; and the displacement of commodity production by financial capitalism.
But that discontent is also related to long-term structural trends in the rapidly de-industrializing West—“crisis symptoms” which are, in the view of economic sociologist Wolfgang Streeck, now irreversible. These mutually reinforcing trends include a chronic decline in economic growth among OECD countries, the continuous rise of household and governmental debt, and spiking economic inequality measured by both income and wealth. The longer these trends persist, the stronger the desire among lower and middle-class voters for political voices that speak to the chronic injustice of our current arrangements. Whether any party on our political horizon is capable of actually correcting those injustices is, of course, another question entirely.
Nearly thirty years have passed since Francis Fukuyama published his often discussed essay “The End of History?” in which he argued that the West’s “triumph . . . is evident first of all in the total exhaustion of viable systematic alternatives to Western liberalism.” The upshot of Fukuyama’s argument, accepted as axiomatic by all mainstream Western political parties in the intervening span, is that any serious political analysis must accept the death of utopian possibilities: the ideological fantasies of both left and right would be replaced (Fukuyama prophesied) “by economic calculation, the endless solving of technical problems, environmental concerns and the satisfaction of sophisticated consumer demands.”
Today, confronted by a systemic failure of technocratic solutions to capitalism’s contradictions and a corresponding resurgence of both left- and right-wing political alternatives, it has become increasingly obvious that Fukuyama’s idealized vision of Western liberal democracy was itself the utopian possibility. As the unstoppable forces of automation and globalization continue their work of creating jobless voters, and as a growing demographic of educated citizens recognizes the political function of the historical libel against socialism—that is, the upward redistribution of wealth from the poor to the rich—a genuine socialist alternative will appear increasingly viable. Whether the heirs of Tommy Douglas will provide that alternative remains an open question, though one thing is certain: our political future is going to involve a continuing reappraisal of some old ideas.
Correction: an earlier version of this article misstated the number of rounds Kellie Leitch participated in during the Conservative Leadership contest. She was on the ballot for nine rounds, not one. The Walrus regrets the error.
After weeks of secret deliberation, Republicans in the US Senate have revealed their changes to the bill that promises to replace the Affordable Care Act (ACA)—and, while they had promised significant changes, it looks a lot like the bill that passed the House of Representatives early last month. It seems that Republicans in every branch of government have begun to realize that healthcare is a very complicated issue to deal with. But the complexities of healthcare don’t simply provide building blocks for future policy lessons—they teach us lessons about our society, ourselves, and the differences between us.
The Republicans have expended monumental effort to repeal and replace the ACA—President Obama’s landmark healthcare law of 2010, which strove, among other things, to reduce total health system costs, increase access to care and services, and provide Americans with the type of coverage that would eliminate discrimination based on pre-existing conditions. A large part of the Republicans’ reasoning, as articulated by House Speaker Paul Ryan, has been that “government shouldn’t be (this) involved in people’s healthcare.” It comes back to the party’s notion that the role of government is to follow the path of least intrusion. However, for those of us who practice, follow, and believe in the principles of behavioural economics, this is folly. Behavioural economics studies the effects of psychological, social, cognitive, and emotional factors on the economic decisions of individuals and institutions, and the consequences of those decisions for market prices, returns, and resource allocation. The health and social policy literature is littered with examples of people needing “nudges,” and of social experiments showing clearly that individual decision-making is shaky and tenuous.
Part of this has to do with health literacy—an area of healthcare that is woefully ignored. Health literacy, as defined by the US Department of Health and Human Services, is “the degree to which individuals have the capacity to obtain, process and understand basic health information and services needed to make appropriate health decisions.” It’s hard to decide what to do when you can’t understand what you’re reading and you lack prior experience to guide your decision-making. So, we know that people do not always make healthcare decisions that are in their own best interest; in some instances, they don’t even make decisions as good as those someone else could make for them. Conventional examples help to illustrate this concept: people still smoke, lead sedentary lives with minimal exercise, and consume diets rich in fats and added sugars. Each of these behaviours, on its own, let alone in combination, is clearly not in the best interest of the individual. Other people—a clinician or a loved one, for example—could make a better decision for the individual concerned. Public vaccination programs are another example. People don’t always get a flu shot or an HPV vaccine. Whether it is a literacy issue, such as not understanding the importance of vaccinations, or simply a bad decision made for other reasons, the decision, or “nudge,” to vaccinate sometimes needs to come from someone else.
However, the mistaken idea that people are the best judges of their own health remains a tenet of our approach to healthcare, and it helps explain the debate south of the border. People make bad decisions. This is not an exhortation for extreme government involvement in all aspects of healthcare; it is an acknowledgment that, intentionally or unintentionally, countries that have some level of essential benefits and services set out by the government may be onto something. We need to accept that people bring biases to decision making, and that those biases may be a function of many things—including one’s inability to understand what they are being told.
Another lesson from the raging debate south of the border manifests itself in the “individual mandate.” This is the aspect of the ACA that requires individuals to either purchase health insurance or face a penalty. There was an attempt with Obamacare to establish a “point of indifference” that might motivate healthy people to buy insurance, which would help premiums stay within an affordable range. Here’s what happens in a nutshell: healthy people are mandated to buy health insurance and they do (sometimes very grudgingly). Sick and vulnerable people also buy health insurance. The “risk pool” now has lots of healthy people and a fair number of sick people. Insurance companies are happy, as they collect premiums from the entire risk pool and only pay out to those who are sick and make claims. Premiums stay relatively stable in this scenario—until healthy people are released from the onus of having to buy health insurance.
Those same risk pools now have an exodus of healthy members, but they retain their sick members. Eventually, a significant majority of healthy people have left. The insurance companies now start paying out claims at a higher rate than the premiums they’re collecting. They raise premiums the following year to offset their anticipated losses. And when they raise rates, the sick people can’t afford their new policies, so they go without insurance.
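The arithmetic behind that spiral is easy to sketch. The following toy simulation, written in Python with invented figures for claim costs, pool size, and the share of healthy people who drop coverage each year (none of them drawn from the ACA or the Senate bill), shows how a pool’s break-even premium climbs once healthy members start leaving while sick members stay.

```python
# Illustrative only: invented numbers, not actual ACA or insurer figures.
HEALTHY_CLAIM = 1_000    # assumed average annual claims per healthy member ($)
SICK_CLAIM = 20_000      # assumed average annual claims per sick member ($)

def break_even_premium(healthy, sick):
    """Premium at which premiums collected exactly cover expected claims."""
    total_claims = healthy * HEALTHY_CLAIM + sick * SICK_CLAIM
    return total_claims / (healthy + sick)

healthy, sick = 9_000, 1_000   # pool under the mandate: mostly healthy members
for year in range(1, 6):
    premium = break_even_premium(healthy, sick)
    print(f"Year {year}: {healthy:>5} healthy, {sick:>5} sick -> premium ${premium:,.0f}")
    # With the mandate gone, assume a fifth of the remaining healthy members
    # drop their coverage each year; the sick members, who need it, stay.
    healthy = int(healthy * 0.8)
```

Under these made-up assumptions, the break-even premium rises every year even though the number of sick members never changes, which is exactly the dynamic the mandate was designed to prevent.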
In health economics, we talk about “indifference curves.” An indifference curve traces the combinations, or bundles, of two goods that give a consumer the same level of utility, so that the consumer has no reason to prefer one bundle on the curve over another. In other words, there is a point at which health-care consumers ought to be indifferent between buying insurance and paying the penalty for going without, such that they do buy the insurance. The irony is obvious. In the field of medicine and public health, we are taught from our early days to focus on caring for and treating people with respect and dignity. Instead, sometimes not caring—being, in fact, indifferent—just might be a big part of the solution.
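To put that indifference point in rough symbols (a deliberately simplified, risk-neutral sketch using my own notation, not a formula from the article or the ACA’s design): let $p$ be the annual premium, $t$ the mandate’s penalty, and $\mathbb{E}[L]$ a healthy person’s expected out-of-pocket medical costs if uninsured. Such a person will buy coverage when

$$ p \le t + \mathbb{E}[L], $$

and is exactly indifferent when the two sides are equal. The mandate works by setting $t$ high enough that the inequality holds even for people whose expected costs are small.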
It’s hard to take something away once you’ve given it to society. It’s even harder when you have no backup plan. This is true of the entire Affordable Care Act, as Republicans are finding out. It’s not easy to make a piece of legislation that contributes to an industry responsible for almost 20 percent of the US’s gross domestic product (GDP) go away. But it’s also true of the individual elements and clauses within the act itself. Pre-existing conditions: don’t touch that! Children covered on their parents’ plan until the age of twenty-six: need it. Federally established essential health benefits and services: an absolute must. If Republicans are to succeed (or, for that matter, if any political party in any jurisdiction is to succeed in this type of bold attempt to overturn existing legislation), the focus needs to be on “building upon and improving” instead of “repealing and replacing.” This is as much a lesson in communication as it is about navigating the corridors of power. Words like “repeal” and “replace” create a palpable sense of a void, and with it an inescapable connotation: that repealing and replacing implies I must give something up.
To be clear, different words alone would probably have made only a minor difference; there still needed to be an actual ‘plan’ that made sense. However, the behavioural sciences have taught us for decades that taking things away, or having to give something up, elicits a very strong reaction in people. It’s called “loss aversion,” and empirical studies have shown that it’s about twice as powerful psychologically as the prospect of gaining something. Those same behavioural sciences have also taught us that language and words can be critically important drivers of behaviour. It’s bizarre that a president who has arguably used psychology and communication as effectively as, if not more effectively than, anyone else who has occupied his seat missed this one.
The great martial artist and philosopher Bruce Lee said this of water: “When you pour water in a cup, it becomes the cup. When you pour water in a bottle, it becomes the bottle. When you pour water in a teapot, it becomes the teapot.” This is our fourth lesson. Healthcare is like water. A nation’s identity is the cup. Healthcare becomes the embodiment of a nation. We speak with pride of our commitment to a social system that relies and insists upon the idea that I will look after you when you are old, and that the next generation will look after me when I am old. It is, at the risk of sounding cheesy, a part of who we are as Canadians. Our healthcare, with all its imperfections, has become an integral part of our identities. And it is not just a “Canadian” thing; when you travel to Europe and speak with the English, the French, the Germans, the Swiss, it becomes immediately obvious that other countries, too, have allowed healthcare to take the shape of their national identities. In America, this phenomenon has not occurred. Healthcare is, in some respects, regarded as an enemy, a suspicious intruder that is here to rob us of our hard-earned dollars and that answers to private corporate entities that do not have our best interests at heart. How the nation organizes its health-care system has never been an intrinsic part of what it means to be American. Hence, it feels like it’s easier for them to discard it (or parts of it) with the arrival of every new administration.
Every nation struggles with the harsh reality that there is no one perfect solution to the problem of how we should take care of ourselves and each other. However, our American neighbours, citizens of a country that has produced some of history’s greatest medical innovations and some of the finest clinicians in the world, and that spends more per capita on healthcare than any other place on the planet, seem to struggle more deeply than others. Perhaps it is due to partisan politics and political myopia. Or perhaps it is due to ignoring the most basic lesson of them all: If you have your health, nothing else matters. If you don’t have your health, nothing else matters.
The only time that the azan—the Islamic call to prayer that I’ve heard all my life—has truly frightened me was one evening four years ago, when it rang unexpectedly from my phone just as I was leaving the Legislative Assembly of Ontario. I was working as an usher in the home of Ontario politics, helping manage and maintain the institution’s decorum and traditions. After a five-hour shift, six of my colleagues and I were heading home via the Queen’s Park subway station just as the sun was setting over that gorgeous castle of a building. Everything was illuminated in a perfect soft Instagram-filter glow, and rush hour had slowed down to a pleasant hum as we walked and talked.
As a precautionary measure, my prayer app is always set to vibrate—an intense, impossible-to-ignore series of vibrations that buzz five times a day. But phones sometimes do mysterious things. On that day, mine decided to violate its human-set instructions. Loudly. Allahu akbar. Alllaaaaaaaaahhhh-hu-akbar.
Time suddenly stood still, as though my phone had burst into an obnoxious ringtone in the middle of a wedding service. My hands immediately began to rummage through my shoulder bag in search of the volume-down button on the phone that was inevitably buried at the very bottom. When I looked up, my colleagues were all looking silently at me, at my bag. I’d never wanted powers of invisibility or teleportation more. “What was that?” asked one of the boys. “Sorry guys,” I said, “it’s just time for the evening prayer, and my phone went spaz.”
I shouldn’t have apologized—there was nothing to apologize for—but the moment seemed to demand it. Actually, their slightly alarmed, confused looks seemed to demand it. I had inadvertently disrupted the peaceful evening with a sound that’s been deemed scary, traumatic even—the last words many victims likely heard before they were killed in terrorist attacks around the world. But I wasn’t a Muslim terrorist; I was just a Muslim, and my colleagues obviously knew that. They knew the distinction between the two. And yet that moment was deeply unsettling—for all of us.
My prayer app continues to be set on silent, even though these days, being Muslim is cool. In the past year, hijab-wearing women have been featured on the covers of high-profile magazines such as Playboy and Women’s Running and, most recently, Allure. Following the rise of American Olympic fencer Ibtihaj Muhammad, who was also the first veiled Muslim woman to ever appear on The Ellen Show, Nike generated social media traction when it announced a line of athletic wear tailored for Muslim women. Stephen Colbert has hosted every famous Muslim person on his show. Ms. Marvel is now a young Muslim-American girl—a superhero who struggles with her sense of self as she grapples with her faith and her superpowers. The glossy message, it seems, is that Muslims can be sexy, badass, and even liberated.
Around the world, non-Muslims have started to increasingly protect Muslims as they pray or break their fast in public places, from major international airports such as JFK to protest sites outside the New York Trump tower. If you attack Sadiq Khan, beloved Muslim mayor of the city of London, UK, you’re collectively told to “BACK THE F**K OFF.” And the phrase “Muslim ban” has galvanized a burgeoning movement of Muslims and non-Muslims alike that is supporting our freedom of religion and our right to belong like never before.
Meanwhile, in Canada, we have a prime minister who makes every Muslim swoon when he says “As-salam-walaykum” in his videos marking religious events such as Ramadan and Eid. The same prime minister has visited mosques, sat down for iftar, greeted Muslim refugee families at Toronto Pearson airport, and helped pave the way for anti-Islamophobia legislation across the country. Friday prayers still happen at schools despite community-led efforts to get Ontario’s Peel District School Board to dismiss them as “indoctrination.” October is nationally known as Canadian Islamic History Month, and has been since 2007. (Ontario declared October to be Islamic Heritage Month last year.)
With the exception of the occasional political stumble—a proposed niqab ban, a barbaric cultural practices hotline, immigration limits—Muslims are told they’re welcome in Canada. We have Calgary’s Naheed Nenshi, the first Muslim mayor of a major Canadian city and an object of almost universal adoration. And the Aga Khan and Nobel laureate Malala Yousafzai have both addressed Parliament, conferring their celebrity status onto the Muslim communities of any country they visit.
These combined gestures seem to point to a vast improvement over the associations that have plagued Muslims since 9/11. For years, the tired but still necessary trope that not all Muslims are terrorists has forced the community to identify as what we aren’t rather than what we are—and I, for one, have struggled to hold on to a secure sense of self. Too often, I’ve found myself formulating my identity in terms of how I differ from mainstream white culture: I don’t eat pork, I don’t drink alcohol, I don’t wear mini-skirts, I don’t date, and I pray five times a day, sometimes at a mosque.
Nowadays, magazine covers and government communications tell me that not only do those actions not make me a suspected terrorist—they’re to be celebrated. But here’s the really fine print: according to mainstream white culture, it’s okay to be a Muslim, but only if you’re the cool kind of Muslim—the Western Muslim. Fashionable. Current. Hip. Liberal. Relatable. Religious, but not too religious. You have to blend into society not by embracing your differences, but by minimizing them. Pray, but don’t flaunt it. Believe in God, but don’t air it.
These expectations have only strengthened the binaries of perceived Muslim identity. Muslim women are either oppressed or they’re badasses. They’re either victims of terror or soldiers against the patriarchy. Muslim men are either murderers or they’re heroes. Imams are either chauvinist dictators or inspiring faith leaders. I’m either . . . well, I don’t know.
“Just Muslim” isn’t a socially acceptable answer because my community is usually portrayed as one extreme or the other. That leaves little room in the public consciousness for Muslims who, like me, follow the scripture, but also love Harry Potter, listen to Drake, are LGBTQ allies, and fight rape culture. Because even if I am all these things, there’s a chance that someone will still attack me, verbally or physically. Just like seventeen-year-old Nabra Hassanen, who was killed last Sunday in Virginia after attending Ramadan prayers in her abaya and scarf. Or those at the Finsbury Park Mosque in London who were hit by a man in a van after the same prayers. Or, in my own country, those who were shot at a Quebec City mosque last January.
Being caught between binaries is not a new feeling. At school, I always used to slip away during lunch hour to pray. I had a list of excuses at the ready to justify leaving my friends: Bathroom break. Forgot to print my homework. Need to return a library book. Then I’d head to my locker, grab my abaya and scarf, and head to whatever obscure corner the school’s prayer room was hidden in.
Even though I grew up in Saudi Arabia and the United Arab Emirates, prayer rooms were an afterthought in the white halls of the British International schools I attended. In their original designs, no one thought that anyone at school would have need of prayer space. In fact, the rooms were only set up after an official ministry review in both countries mandated that every school should have one.
At my middle school, the prayer space was on the top floor at the back of the campus, accessible by a blue spiral fire escape. At my high school, it was near the cafeteria—an even bluer fire escape at the side of the building led up to a loft space above it.
The locations were perfect for a fourteen-year-old who prayed in secret and wanted to avoid the awkwardness that ensues when your mostly white classmates find out you pray. Awkwardness demanded that I justify and explain my actions in front of a jury that had already made a decision. It meant answering questions like, “Do your parents force you to?” or “Do you have to?” or, my favourite, “But why?” Being a Muslim was one thing, made respectable by the place we lived in, but being a fourteen-year-old Muslim girl who actually did Muslim things seemed bizarre—and, to some friends and coworkers today, that’s still the case.
As I got older, I tried to make my exits more natural, but I only became more apologetic. When we were hanging out at a friend’s house I’d ask to go to another room. “Sorry guys, I’m going to go pray real quickly.” If we were outside, I’d apologize for making a pit stop and pausing the fun. “Sorry, I just want to stop at the mosque real quick.” In order to be true to my faith, to be who I wanted to be, I felt the need to downplay my actions and ease any tension I might cause. Always apologize.
I apologize because I realize I’m not represented in mainstream portrayals of Muslims—even those that stray from the clichéd terrorist. Think Aziz Ansari in Master of None—a self-identified Muslim man whose lifestyle seems entirely devoid of any Islam. In the Season Two episode “Religion,” he struggles to pretend to be a good Muslim when his parents invite him to Eid dinner with their friends. He usually eats pork, drinks, and doesn’t attend mosque—all habits his community wouldn’t approve of, and which his parents would prefer to keep secret. But that episode supposedly captures what the modern Muslim is expected to be: someone forced into Islamic traditions by the older generations in their family, but who actively disregards them.
Master of None is not the only show to erase my existence. I was excited when I first watched the hit FBI drama Quantico and saw that two of the characters are twin hijab-wearing Muslim women training to become agents. One seems more grounded in her Muslim identity than the other; she wants to wear the hijab while her sister does not. Their arguments about faith seem real, even nuanced. The skeptic eventually removes her veil and becomes an integral part of high-level FBI operations. The other falls in love with a nice Jewish man; when he dies, she fades away from the central narrative, only to come back once or twice. The message, it seems, is that Muslim women can only attain heroine status when they defy their community’s traditions.
There’s nothing wrong with any of these portrayals. There is, after all, no one kind of Muslim. But such depictions recast Muslims solely in a western-friendly light, one that isn’t on my terms, and that continues to exclude me. Would it have been so difficult for the producers of Master of None to include a young person who, unlike Ansari, was actually trying to be an observant Muslim? Would fans have lost interest in Quantico if one of the twins were simply a practicing female Muslim FBI agent?
Then there’s real life. Think, for a second, of what happened in Portland and Manchester last month. Muslim members of those communities have come out in waves to prove they’re just as upset as everybody else, if not more. The headline of one Evening Standard report praises “The Muslim heroes of Manchester who rushed to help in the wake of the city’s terror attack.” The news of the Muslims who raised tens of thousands of dollars for the families of the Portland men who were stabbed to death while defending two young girls from a hate crime went viral. So did the unprecedented report that more than 130 imams refused to perform funeral prayers for the perpetrators of the London and Manchester attacks. So did the report of the Muslim who saved 64 Christians from being executed in the Philippines. Mayor Sadiq Khan earned accolades for saying that he is “a proud, patriotic British Muslim” and that terrorist acts by Islamist groups are “not done in my name.”
Such pointed headlines and sound bites, and the positive social-media hot takes and retweets that accompany them, are only entrenching the binary further. They make me think I can only fit in by shunning tradition entirely—or, if I do practice my faith, I have to become a superhero to justify it. Until something changes, the only way I can validate my place in society is by winning Olympic medals, stuffing my face with bacon, or rescuing children from burning buildings.
Every Ramadan, my sister and I put on our abayas and headscarves and head to the mosque for the special night prayers. Growing up in Saudi Arabia, this was so normal to do, so natural—it was like a uniform for prayer, simple, smart, and decent. We’ve both mastered our individual hijab styles—hers is a little more spread out; mine is more fitted around the face. We’ve been wearing them for so long, we can put them on without looking in a mirror.
But it feels different here, today. “It must be scary for people here, you know,” my sister said one evening, “watching two girls in black abayas walking down the street.” For a moment, I felt hyper-aware of strangers’ reactions. All through the drive there, I wondered if the officers in the police car we had passed thought I looked suspicious, or if the family crossing the intersection was surreptitiously keeping an eye on us.
Some days my mom watches news coverage of the rising number of hate crimes aimed at Muslim girls and women, and worriedly tries to persuade me to wear the hijab full-time. “At least people will know immediately that you’re a Muslim,” she argues. “You’ll have a clear identity wherever you go.” Her point is that I’ll have more credibility—I’ll become an obvious card-carrying member of the community. She’s never pressured me to wear the hijab (it’s a choice, not a uniform); she’s simply acting out of concern. In a world that demands that I fit neatly into one category or another, maybe the hijab will spell out where I stand. No explanations or apologies needed.
But I shouldn’t have to change my habits to help shift the Western-accepted paradigm dictating how Muslim I get to be. Instead, people like me have to share their own stories, revealing the broad spectrum of Muslimness around the world.
If it’s cool to be the Aziz Ansari brand of Muslim today, the next step is to make it cool to be my kind of Muslim, too. The kind who chooses to pray and also chooses not to wear a veil. The one who finds solace in her faith and believes in Allah wholeheartedly. The kind who struggles with her identity, caught between two worlds, because that’s okay too.
This past month, I asked two event organizers and a restaurant employee for a space to pray. It was the first time I’d ever done it in Canada, and I was worried I’d have to explain myself—answer to the same unsettling looks my colleagues gave me four years ago. But all three happily obliged, without judgement or awkwardness. One of them took me to a storage closet in a back alleyway. But it’s a start.