
Washington University

POLITICAL REVIEW 27.2 | October 2017



CONTENTS

Constructions of Home, Constructions of Identity | Elizabeth Feldman & Claire Wild
The (Mis)construction of Knowledge | Dan Sicorsky
Book of Life | Rachel Butler
Wash U: Then, Now, Tomorrow | Hanna Khalil & Ryan Mendelson
The Moral Tones in our Politics | Reuben Siegman
Neoliberalism and Climate Change | Jeffrey Conner
Metro Unlinked: How the St. Louis MetroLink Marks Social Division | Hanna Khalil & Ryan Mendelson
Dear Colleges: Pause the Construction | Sienna Ruiz
A Tale of Two Cities | Emma Baker
Rebuilding Opportunity | Christopher Hall
Deconstructing the Harvey Experience | Madi Bangs
How to Debate Non-Interventionists | Nicholas Kinberg
7 Years of Plenty, 70 Years of Famine: Understanding Venezuela's Crisis | Daniel Smits
A Story of Subways and Metrolinks | Hanna Khalil
The Trinitarian Streak: Religious Orthodoxy and the American Presidency | Luke Voyles
The Last True Conservative | Max Handler
The Greatest Supreme Court Justice | Luke Voyles
North Korea: A Pointed Example of Diplomacy's Limits | Johnathan Romero

EDITORS' NOTE

Editors-in-Chief: Rachel Butler, Dan Sicorsky
Executive Director: Sam Klein
Staff Editors: Michael Fogarty, Max Handler, Katelyn Taira, Sabrina Wang
Features Editors: Hanna Khalil, Ryan Mendelson
Finance Director: Adya Jain
Director of Design: Dominique Senteza
Web Editor: Nicholas Kinberg
Director of External Operations: Jack Goldberg
Programming Director: Liza Sivriver
Front Cover: Gavi Weitzman
Theme Spread: Neema Samawi, Dominique Senteza

Dear Reader,

Think back to the last thing you built. Was it a relationship? A LEGO tower? A fresh career path, or a new conception of yourself? Consider, too, all that is constantly created and recreated around you. Towers rise to new heights, cultures redefine their traditions, and political leaders rebuild countries according to their worldviews. All around us, the world is under construction.

Often, what is constructed leads to positive change. But, at the same time, “Nothing is so painful to the human mind as a great and sudden change,” as Mary Shelley wrote in Frankenstein. No doubt feelings are heightened in the face of interruptions to how things once were. When new presidential administrations reshape old ways and amend longtime policies, for example, many are unsettled. Or—in an example closer to our hearts—when tractors attack Olin Library and cranes tower over Brookings, feelings of annoyance and anticipation compete within our psyches.

In this issue, writers and artists tackle a theme that is abstract but pointed, vague yet provocative. This openness inspired the cover designer, Gavi Weitzman, to create a mesmerizing pattern that depicts no one type of construction, and neither do the pieces you will find in the pages that follow. Some contributors examine constructions in their most literal sense. Sienna Ruiz writes about heavy spending on infrastructure development at many universities, including our own. Houston native Madi Bangs explains the substantial impact Hurricane Harvey dealt to her hometown. More writers explore other physical constructions, including those of climate change, transportation, and facts. Not all constructions, of course, are of the Bob-the-Builder kind. Social constructions don’t involve concrete and hard hats, but they nevertheless gave us race, gender, and class. Emma Baker examines one of these less concrete constructions: the disparities on both sides of the Delmar Divide. In a feature designed by Neema Samawi, writers Lulu Feldman and Claire Wild detail how Wash U has shaped their perceptions of home and identity. As always, other articles consider national and international issues. Luke Voyles shares his views on Supreme Court Justice Byron White and Daniel Smits analyzes the economic crisis in Venezuela, to name a few.

So, grab some LEGOs, plop down on the floor, and pour right in. The “Constructions” issue was built to be rich in content and design, and we are thrilled to share it with you.

Warmly,
Rachel Butler & Dan Sicorsky
Editors-in-Chief

CONSTRUCTIONS OF HOME I came into college equipped with posters, fairy lights, the DC flag, and pictures of friends and family. I had all the ingredients to construct my dorm room into a home, and yet that home evolved differently than I could have imagined—gradually, and also in leaps. Each time I return to my dorm I bring a small part of Wash U with me: a Domino’s coupon from the club fair, the smell of burnt popcorn on my sweatpants from the common room, muddy shoes from my first run in Forest Park. These are the physical and emotional tokens that transform my dorm into home. Then there are the leaps. I remember the first time I walked to class and saw people I knew. It was an improbable feat—seeing four out of the five friends I had made so far in a span of ten minutes—and yet campus felt transformed. Once a blank space occupied by foreign faces, campus was now alive with people moving in different directions, but working together like clockwork. And I contributed to that clockwork. I remember when I discovered my regular meal at Bear’s Den. It was past midnight and I was craving something, something unidentifiable, until a cook at the Grizzly Grill suggested the fried egg sandwich—two fried eggs with perfectly melted cheese on sourdough bread—a masterpiece. “We make them all day,” he informed me proudly. At home, fried egg sandwiches are my go-to food. I have learned to love the tikka masala at WUrld Fusion and the pesto gnocchi at the DUC vegetarian station, but it’s the fried egg sandwich, available all day, that gives me comfort, bringing home to Wash U. A month into school, my dorm is full of motion. Chaos. Late nights. Naps. Some homesickness. New friends. Where one might see disorganization, I see evidence of life, of making my way into the Wash U community. A towel on the ground because I came back from water polo practice at 11pm, and I was too tired to hang it up.
Sticky notes covering my desk because I have plans in the making: join Green Action, order my textbooks! 5:30pm cycle, office hours for lit class?? Like entropy, the disorder of my room is always increasing. The more my dorm spirals out of control, allowing Wash U to reach in, the more I begin to join the community. I cannot say that in just four weeks I am there, that I am ingrained in Wash U and Wash U is ingrained in me, but I am certainly on my way. Elizabeth (Lulu) Feldman ‘21 studies in the College of Arts & Sciences. She can be reached at

CONSTRUCTIONS OF IDENTITY I came into college with my hands completely empty; no baggage from who I was perceived to be in high school, no personal agenda to make the “right” friends, no urgency to establish my M.O. I was an architect with absolutely no complete blueprints to show—pieces, yes. Sketches, Post-it notes, but nothing finished. And I was absolutely ecstatic about it. I did have a blueprint of myself in high school, but I knew it wasn’t quite right. STEM classes filled much of my perceived identity, as did leadership and tennis, but it often felt like these were the only windows built into my entire structure. The promise of college is that whatever previous identity structure or plan existed can be absolutely bulldozed. Though it sounds radical, this part of the transition is inevitable. As you pass your new classmates for the first time, they have no idea who you once were. Granted, a blank slate can be as intimidating as it is exciting. But isn’t it a tremendous gift to finally try all of your could-be’s? All of the ideas and inklings that kept you up at night? Walking through the activities fair, I could identify all of the organizations to which I had once devoted myself, and the notion that I could return to some and try thirty new ones was euphoric. What I have realized throughout this fun run of identity seeking is that no matter what choices I make, there are parts of my identity that are constant. At the end of my pre-orientation program, each group member wrote affirmations for one another. Upon receiving mine, I saw words that my closest family members and friends have used on numerous occasions. Seeing the familiar words lent a little perspective to my new blueprint ideology: I knew more about myself than I thought I did, and not everything necessarily needs changing. With that, I can confidently say that I have about 30% of that blueprint finished.
There is still much brainstorming, drafting, and erasing that needs to be done, and I am excited. I’m scared. I know it’s okay to screw it up a little bit. But I’m ready for all of it. Claire Wild ‘21 studies in the College of Arts & Sciences. She can be reached at


BOOK OF LIFE Rachel Butler | Design by Dominique Senteza

Colored light dapples empty
Pews, wood worn to splinters by children
Scraping the shiny surface with fingernails, edges
Of books of prayer, who really was praying, we asked
Laughing, peering at each other behind skirts, over hats, rolling marbles on the floor.

Adam and Eve looked down frowning
At us, apple still uneaten and glowing full
Of innocence, afternoon light, and there was Moses still
Adrift in the reeds, face half hidden in the dark basket, though
I secretly knew he looked like me, I had a dream that he opened an eye and it was mine.

The window in the back was new, they put it there
When Moses at the mountain was struck by a baseball
Flying from the dark outside, now there were only shapeless sharp
Slices of green, deep blue, deeper black, and there was no story watching us
When we whispered about the pieces of our own stories, glass bright and thin as air.



THE MORAL TONES IN OUR POLITICS Reuben Siegman | Illustration by Eddie Ho


Since the 2016 election, there has been a palpable discomfort among friends who supported different candidates. The Thanksgiving table was tenser, and some friendships have chilled. After the 2012 election, on the other hand, even though people may have ardently disagreed with you, it was still easy to be friends. The current division is not going away anytime soon. There is a moral tone in our politics, one in which someone is not only wrong politically, but is also condemned as morally bad because of the politics they espouse. While this moral tone may be here to stay, the tension doesn’t have to.

One of the primary reasons for the moral tone in our politics is that each of us has different values and approaches to solving societal problems, and this is reflected in our politics. Part of the reason the moral overtones became so loud this past election is that instead of dog-whistling to racism and bigotry, as many politicians have done in the past, Donald Trump used a loudspeaker. Hillary Clinton supporters then felt emboldened in making moral judgments against those who supported Trump, hence the infamous “deplorables” comment. Furthermore, Clinton and her supporters were enormously confident that she would win the election, empowering them to proclaim their opinions louder and louder. They assumed winning would prove that the majority of Americans did in fact share their values and were not willing to tolerate Trump’s racism. This divide, however, did not end after the 2016 election. People have realized that policies and politics inherently reflect our values. As Joe Biden said, “Don’t tell me what you value. Show me your budget, and I will tell you what you value.” The health care debate this summer demonstrated that one’s preferred policy not only incorporates one’s philosophy on how to improve society, but also reflects one’s values. Some feel strongly that health care is a right to which all citizens are entitled, while others believe that the government should not interfere in the health care market. This is a huge disagreement, and it would be foolish to ignore the fact that this policy affects

the lives of millions of people. These people could end up losing their health insurance, with potentially devastating consequences, as many activists brought up through personal narratives. Such divisions will bring out deeply held beliefs. These differences are not something to brush off, as they reveal our worldview and values. Realizing that someone you care about holds different values than you evokes a much stronger emotion than a mere disagreement over policy. We like to imagine that all of the people we have relationships with are like-minded and share our values, but this is simply not the case. This need not be distressing to think about. Reasonable people disagree, and our democracy was made to allow disagreement. Just because people disagree over deeply held values does not make them bad people. That way of thinking is dangerous. Not only does it hurt people who have good intentions, it lessens our authority to call out true evil. We must be very careful about what we designate as morally bad. Racism is evil, but having a liberal or conservative solution is not. Beliefs of superiority that reject the equality of all human beings are detestable and detrimental to our society. Those are the beliefs and politics that must be called out. When someone supports policies like the “Muslim Ban,” or opposes policies like DACA (on principle, not

whether the president has the power to enact the policy), there is no doubt that racism is prominent in their thinking. At the end of the day, however, almost everyone wants a better country to leave to the next generation. While people may have different conceptions of what that better country looks like, as long as it is based in equality, we should keep in mind that people are working with good intentions. We can still be friends, family, or co-workers with those who have vastly different ways of thinking and values. Not only can we maintain our relationships with them, we should work towards understanding why each person has different values and approaches, and have compassion for one another. As John F. Kennedy said, “If we cannot now end our differences, at least we can help make the world safe for diversity.” If you care about your values, you should work towards convincing people of them. This moment of deep division in our country is also an opportunity. The gaps in our society have been exposed, allowing us to see them as they are, and not hide them. Only by approaching one another in an honest manner with an open mind and compassion can we work together to mend those gaps. Reuben Siegman '18 studies in the College of Arts & Sciences. He can be reached at



NEOLIBERALISM AND CLIMATE CHANGE Jeffrey Conner | Photo courtesy of Wikimedia Commons


The effects of climate change, long warned about but always seemingly far off in the consciousness of ordinary people, are already here, and they’re deadly. Harvey and Irma have devastated American coastlines in quick succession, causing unprecedented damage and ruining countless lives. India has been pummeled by severe flooding in recent years, one instance of which is considered the worst natural disaster in the subcontinent’s history, resulting in at least 11,000 deaths. In California, a state of emergency has finally been lifted for most of the state after it experienced its worst drought in over 1,200 years. The unprecedented nature of these disasters is no coincidence. This trend of increasing frequency and intensity of extreme weather events will characterize life in the 21st century. Unfortunately, we find ourselves operating within a political and economic system that is woefully unprepared to address the coming environmental catastrophe.

Not only is neoliberal capitalism largely to blame for inaction on climate change, it also intensifies the problem with development practices that serve the interests of private capital but leave everyone else to fend for themselves. This is especially apparent in the case of India. Following the 1991 economic crisis, the country took a strong neoliberal turn. Beginning with the New Economic Policy of 1991, India gradually liberalized its economy, and today it is one of the most unequal societies in the world. The consequences of changing weather patterns for the nation’s poor have been exacerbated by this market-driven development. Just this past August, a flood in Mumbai was made worse by the clearing of wetlands and the cutting of mangroves, which act as natural water drains, for the construction of luxury housing for India’s upper classes. Surrounding these pockets of extreme wealth are slums and shantytowns, inhabited by people who have little to no government support to fall back on. So when the floods swept through Mumbai, the water had nowhere to go, and the poorly constructed homes of India’s underclasses were decimated. People lost everything and were forced to scavenge for food, eating rats and


drinking floodwater mixed with sewage, while the government took little action. The Indian state’s minimal role in ensuring sustainable housing for its citizens and rebuilding communities after disasters has already resulted in many additional climate-related fatalities, and the body count will only rise in the years to come. The United States, despite having a far more developed economy, suffers from a similar lack of preparation that disproportionately affects the poor. The consequences are also heavily divided along racial lines, as years of discriminatory housing policies have made minority communities far more vulnerable to natural disasters. Additionally, due to the lack of government protection and extreme wealth inequality in America, it is much more difficult for the underprivileged to evacuate, putting them at far greater risk of death. People can’t simply get up and leave when they don’t have anywhere to go, and leaving home means potentially losing everything. Some, as was reported in Florida during Hurricane Irma, even face the possibility of getting fired from their jobs just for not showing up to work. Negligent urban planning also bears much of the blame for the severity of the damage and its human toll. In the wake of Hurricane Harvey, the city of Houston has seen this all too clearly. A combination of environmentally unconscious practices, poor zoning laws, and a lack of political will to spend money on anything that might reduce long-term climate risk (the city had deemed drainage bayous clogged with debris too expensive to clear out) has left Houston particularly ill-equipped to handle flood conditions. This is especially frightening given that the city has experienced three 500-year floods in just the past three years. Disaster relief by the United States government has been both insufficient and, as shown in the aftermath of Hurricane Katrina, cruel to the nation’s poor.
This is not due to a lack of resources; the United States is the richest country in the world. The government’s complete failure to do its job after Katrina

was a result of the fact that it just didn’t care about New Orleans’ predominantly African-American lower class. Billions of taxpayer dollars went straight to private contractors looking to line their pockets rather than toward helping the people of New Orleans. The predictable result of putting the relief effort in the hands of profit-seeking entities was that the people who could afford help got it, but those who couldn’t languished. With no access to basic necessities, many were forced to turn to looting in order to survive and were subsequently vilified as dangerous, anarchic criminals by both the government and the media. To keep the

suffering population in check, the Bush administration contracted the private mercenary army Blackwater, which employed many of the same counter-insurgency tactics it had previously used in Iraq and Afghanistan to maintain order in New Orleans. The destruction wrought by Katrina was also seen as an opportunity by free-market fundamentalists to push through a radical right-wing agenda on the city. Schools were rapidly privatized and public housing was destroyed to make room for more profitable luxury condos. All of this amounts to a brutal class war waged by the powerful against the weak. While things might have turned out better had Katrina not occurred during the Bush administration, this is not saying much. The response


under a Democratic president would likely have been less overtly cruel, but the underlying systemic inequalities would still exist. In contrast, the United States’ impoverished southern neighbor Cuba is a model for hurricane preparedness and response, garnering praise from the UN. Across the past 18 hurricanes to hit Cuba, there have been only 45 deaths (ten of which were from category 5 Irma), while Katrina, a category 3 storm at landfall, resulted in over 1,500 fatalities in New Orleans alone. The Center for International Policy even published a report stating that Americans are 15 times more likely to die in a hurricane than Cubans. How does a relatively poor country so drastically outperform the world’s richest country in such a crucial measure of development? Large-scale disasters require massive citizen participation and cooperation, and Cuba’s communist government is particularly effective in carrying that out. Due to its location, Cuba is particularly vulnerable to hurricanes. To protect itself, Cuba enacted a national preparation and response system that prioritizes human life over private property (while also guaranteeing the protection of people’s personal possessions). As a result of the state’s mobilization efforts, hurricane preparation has become a way of life for Cubans. From an early age, children are taught how to help their community in the event of a hurricane, and this education continues into adulthood. All citizens are required to go through a civilian defense training program, and every year before hurricane season the entire country holds a two-day hurricane drill known as Ejercicio Meteoro. To make sure that every Cuban’s needs are accommodated, municipalities keep detailed information on citizens that allows them to identify what services each person might require.
When a hurricane is imminent, entire communities are mobilized and every individual is given a role to play in protecting their neighborhoods and evacuating fellow citizens. This approach to natural disasters doesn’t require huge amounts of money. People naturally come together to protect their communities, and the government should harness that solidarity by providing the organizational structure and education for them to do so effectively.

Climate justice and long-term sustainability have to be priorities of policymaking in the 21st century, and this requires that we recognize the current system’s impotence in bringing that about. If radical changes in the structure of our political economy are not made, the world will begin to look more and more like a dystopian nightmare. The gap between the haves and the have-nots will become accentuated as dwindling resources drive prices for basic necessities higher, forcing the have-nots to resort to desperate measures to stay alive. They will be met by severe crackdowns from police states that prioritize capital over human life. This may sound alarmist, but we have already seen it happen on a much smaller scale in the aftermath of Hurricane Katrina. The mass human displacement that is already inevitable will be even worse if the same development practices that have herded people into overcrowded living conditions especially susceptible to natural disasters are allowed to continue. Not only is promoting equitable and environmentally conscious development the morally correct thing to do, it is also in the interest of the wealthier countries of the world. Refugee crises are incredibly destabilizing forces in developed countries. Bangladesh, a country of 163 million people, will be mostly underwater by the end of the century. This alone will

create a far greater refugee crisis than the Syrian civil war, and the rest of the world must be prepared to handle human displacement on this scale well in advance. Neoliberal capitalism is simply not equipped to deal with the effects of climate change. This is a collective problem that requires collective action. There are no market-based solutions. Following the Cuban model, governments must take the lead in preparing for a changing world and protecting their citizens when disaster strikes. Failing to do so will result in a barbaric future.

Jeffrey Conner ‘19 studies in the College of Arts & Sciences. He can be reached at


METRO UNLINKED: HOW THE ST. LOUIS METROLINK MARKS SOCIAL DIVISION Hanna Khalil & Ryan Mendelson | Design by Dominique Senteza

Location not served by MetroLink (Ferguson, MO): Median Household Income $36,645; Majority Demographic 67% Black; Average Home Value $65,900 (est.)

Delmar Loop (West End neighborhood): Median Household Income $32,937; Majority Demographic 85% Black; Average Single-Family Home Value $157,417

Clayton (Clayton, MO): Median Household Income $93,009; Majority Demographic 75% White; Average Single-Family Home Value $597,700

5th and Missouri (East St. Louis, IL): Median Household Income $19,520; Majority Demographic 95% Black; Average Property Value $56,100

St. Louis City: Median Household Income $35,599; Key Demographics 46.8% White, 47.2% Black, 6% Other; Average Home Value $120,400

Map legend: White Population, Black Population, MetroLink Red Line, MetroLink Blue Line








Last spring, I sat outside of Mallinckrodt during Alumni Weekend, and a woman passed by with her children. Her sons ran ahead, and I heard her say to her husband, “I don’t know what this building is. Campus looks so different now.” As they wandered away, I marveled at the thought that our small campus, which is essentially just a 345-acre rectangle, could transform so much that it would be unrecognizable to an alumna who graduated within the last 15 years. But then I thought of the construction projects underway at Olin Library and the Loop overpass, as well as the extensive plans for Brookings that had not yet gone into place, and I realized that if I ever visited in the future, I too would wander aimlessly through an unfamiliar campus.

But what does this mean for us as students now? On move-in day, student helpers' shirts boldly declare, “Welcome Home,” but I do not know of any “home” that has one-third of itself under construction at any given time. Each project undertaken restarts the cycle that absorbs every student’s time at Wash U: if you live through the construction of a new facility, you are unlikely to see the results. How are we expected to form a connection to a campus that is always changing? When once it seemed like the iconic bricks and architecture would always remain, now the only constant about Wash U is that it will never stay the same. Why would I want to come back if there is nothing that I remember to come back to? Nowadays it feels like I do not so much go to Wash U as walk through it, or what remains of it. Yet I cannot even imagine how I would feel if I had experienced my favorite places on campus being taken away. As a sophomore, I never saw Whispers, but when I talk to upperclassmen about their memories there is a palpable sadness concerning its loss. Once a legendary social hub, Whispers has left behind a desert in the center of campus that forces students to the edges of the school to find any place for food or socialization. Construction fundamentally

changes the way we interact with this school and ultimately disconnects us from the campus we live in and study at for four years.

Of course, Wash U is following in the footsteps of universities around the country that are also caught up in a disturbing craze for development, but this is no excuse. An obsession with construction has taken hold of colleges around the country, and huge public schools as well as small liberal arts colleges are borrowing billions of dollars to claim titles like “best dorms” or “best facilities” that help them rise in rankings and court the wealthiest students in the nation. Universities continue to go into debt to construct new amenities even though, according to The New York Times, overall debt levels at more than 500 colleges rose by over 200 percent between 2000 and 2011 while gifts and investments dropped by 40 percent. At some colleges, the costs of campus projects fall on the students, through increased prices for room and board or special service fees. Spending more on construction increases prestige but decreases accessibility, which raises the question: who do these projects really serve? Right now, it seems as if the cranes and bulldozers are in place for a future class of wealthier students that can better shoulder the

expenses of all of the current outlandish projects made in the name of recruitment. Constantly taking out loans for modern dorms and expansion projects is ultimately an unsustainable method of development, so it is about time we ask what we truly need from our universities. I know that a plea to stop or at least limit construction is hopeless and will likely be drowned out by drills digging yet another pit into the Danforth Campus. Maybe I just wrote this to retain my sanity amidst the noise. Or maybe this is all to challenge the way we view colleges and what they have to offer; do we really need an education supplemented by state-of-the-art facilities? What would campus be like if, instead of funding new buildings, we put money into reconstructing notoriously inaccessible ones? What if, instead of trying to attract incredibly wealthy students with ever more expensive buildings, we redirected funds to recruit students for greater socioeconomic diversity? Or what if we put money into resources that students have repeatedly asked for, like more mental health counseling or support systems for students of color? A different approach to how we as students value and rank our universities will not stop the next building from being torn down and replaced, but it can radically change the priorities of higher education.

Sienna Ruiz ‘20 studies in the College of Arts & Sciences. She can be reached at



A STORY OF SUBWAYS AND METROLINKS Hanna Khalil | Illustration by Thomas Fruhauf


I followed my mother urgently onto the platform, through the crowds of fast-moving stiletto heels, backpacks, strollers, briefcases, and scrubs that were always running late. I concentrated on staying close behind while simultaneously taking attentive mental photographs of my surroundings: right turn off the train, to the stairs labeled with an “uptown” sign, turn right again. I wrote in my head a map I could reference tomorrow, one I would use until graduating high school.

As a new seventh grader, that day marked a coming of age. Not only was I entering the unexplored territories of middle school and beyond, but I was also embarking on my first morning commute – a ritual that would start and end my days, five days a week, for the next six years. I had ridden the New York City subway as long as I could remember, but always following my parents blindly through the grimy tunnels, never paying attention to the signs or service announcements, my small body ducking under the turnstiles so that we didn’t have to pay another $2.50 fare. But now, I was a twelve-year-old – independent, mature, and capable of navigating this transportation system by myself. Or at least the two lines that would get me from Astoria, Queens to the Upper East Side of Manhattan, where my magnet school was situated 45 minutes away. It didn’t take long for the journey to lose its shiny grandeur. My glamorous impressions of “going to school by myself” gave way to complaining about delays, commuters lacking basic etiquette, and being physically touched on all sides by a large man’s sweaty armpit, a petite woman holding six Trader Joe’s bags, a mother managing a stroller, and the dirty metal doors that could just barely close shut. But even though my morning commute was no longer exciting, the possibilities that came with this new accessibility to the subway were endless. Along with my free student MetroCard came passage to five boroughs, 24 subway lines, and 469 stations. As I made new friends who lived in different neighborhoods an hour away in opposite directions, I began to memorize avenue patterns and gain an internal GPS.


Slowly but surely, I began to know the different personalities of New York’s neighborhoods: the bustling tourist streets of 5th Avenue, the hip-and-cool cobblestone paths of SoHo, the townhouses of the West Village I dreamed of living in if I ever won the lottery, and the edgy, smoky ambience of St. Mark’s. No longer were my parents responsible for accompanying or driving me places. I transported myself to the museum I worked at after school, to the doctor for my yearly physical, to run errands, and to work on projects with classmates on the weekends. Following that very first solo ride, it felt like the borders of my city had been pushed wide open, with more to explore than I had time for.

When I moved to Wash U and left behind the spiderweb subway map of my home for the two horizontal lines of the St. Louis metro, I realized for the first time how much influence a transportation system had had on my life. I couldn’t relate to my friends’ stories of the liberation that came with turning sixteen and getting your license (I still couldn’t drive) or angsty teenage accounts of making out in the backseat of a car. But beyond personal development, as I began to get to know my new “home away from home,” I

could see that the construction of public transportation had huge political and social consequences as well. Public transportation helps facilitate a sense of proximity and community across distance. In New York, the subway acted as a democratizing force within a socioeconomically and racially fragmented city. Though the pristine, majority white-owned townhouses of the Upper East Side felt a world away from the “rough” parts of the South Bronx, on the subway, you didn’t know if the person sitting next to you was a hotshot on Wall Street, a small business owner in Chinatown, or someone on their way to a shift conducting the train you would get onto next. There was something equalizing about the knowledge that even if you could afford a taxi, there was no method more efficient for getting around than the trains crisscrossing underneath the city streets. The subway is a force that allows people from all walks of life and socioeconomic, racial, and ethnic backgrounds to identify with being a New Yorker. And while it would be naïve to think that the rich used the subway to engage with areas outside of their elite comfort zones, it did give people in middle-class and working-class neighborhoods, like myself, access to resources otherwise unavailable. It facilitated my pursuit of a better education than my zone school could provide, 45 minutes away, despite my parents both working full-time jobs and being unable to take me themselves.

As I left this environment and began to learn more about the unique history of St. Louis, I began to note the ways in which the construction of transportation influenced the meaning of community and identity within a totally different context. The separation of the affluent neighborhoods surrounding Wash U from East St. Louis felt especially stark, knowing that the easiest way to get from one to the other was by car.
Interaction with communities of lower socioeconomic backgrounds would require an intentional decision by the more privileged, as these areas were separated not only socially, but structurally as well by the construction of the city. The city-county divide, which Wash U sits comfortably at the intersection of, is reinforced by the lack of


trains connecting nearby suburbs such as Creve Coeur and Webster Groves to the city proper. As an outsider, the racial segregation of St. Louis I had read about felt even sharper given how unlikely it is for someone in Clayton to ever need to interact with someone living north of the Delmar Divide, if they don’t want to. Despite St. Louis being a much smaller metropolis than New York, its divisions felt especially pronounced. In times of racial tension, especially in the aftermath of the Stockley case, it is easy for white residents to stay comfortably in their white neighborhoods, not having to engage with protests a ten-minute car ride away. In contrast, I remember that following the murder of Eric Garner, protests in New York blocked Times Square – the location of a subway station connecting 12 lines as well as countless buses to the Port Authority Terminal.

This is not a personal judgment of the people of St. Louis or New York. In both cities, racial inequality flourishes. In both areas, the wealthy live in bubbles of privilege that too few challenge themselves to break out of. I was keenly aware that only three blocks north of my “elite specialized high school,” the streets transformed into an area where public schools were under-resourced, fancy apartment buildings gave way to affordable housing, and chic cafes were replaced by fast food chains. It was clear why the more affluent rarely seemed to venture uptown of their pristine enclave. New Yorkers do not hold a moral high ground for community engagement. Instead, the construction of their city’s transportation network forces a minimum of interaction across racial difference that people do not

necessarily go out seeking, but are impacted by nonetheless.

The impact of transportation on the connectedness of community could be seen in both cities in the debates surrounding metro expansion. In New York, some criticized the extremely expensive and slow construction of the Second Avenue line, which serves the affluent neighborhood of the Upper East Side, as a misuse of funds benefiting the wealthy. Some argued that the $4.5 billion spent on only three stops could have been invested in aging parts of the existing system, especially those serving middle- and working-class neighborhoods of the outer boroughs. In St. Louis, the proposal for the expansion of MetroLink led to debate over whether the new line should connect the more affluent county to the city, or take a North-South route that could revitalize the economically neglected area of north St. Louis. As it stands, the system runs East-West, connecting a limited and more well-off section of the community. But a North-South line would connect under-resourced northern

parts of the county, such as Ferguson, to the economic opportunities in the center. It’s clear that these decisions would have implications beyond transportation itself. Future development of public transport will reflect which parts of the St. Louis community are deemed more important and valued. It could either exacerbate current social divides or help bridge them. Transportation has the power to construct new realities for a community. It is important for us all to reflect on the way public transportation shapes us, shapes the places we call home, and defines our mobility without our even realizing it.

This summer, at the age of 19, I finally began to learn how to drive. I wonder whether my relationship with St. Louis will change once I get my license, or if I will still tie my independence so closely to a system of trains. I hope that in either case, I challenge myself to go beyond the limits of the transportation around me and engage with my communities regardless. Ultimately, the bubble is as big or as small as you want it to be.

Hanna Khalil ‘20 studies in the College of Arts & Sciences. She can be reached at





The Delmar Divide stands as a ghost of its racial history. The phrase, coined to summarize census data revealing a wide disparity marked by the street of the same name, signifies St. Louis’s understanding of its most economically divided area. But the phrase also imparts an incomplete view not only of the scope of the wealth disparity, but also of its scale.

Sitting in the shadows of these statistics, the Divide marks where the average mortgage collapses from $310,000 to $78,000, where the rate of individuals over the age of 25 with bachelor's degrees drops from 67 percent to 5 percent, and where this self-ravaging cycle of urban economic behavior to the south of the Divide is destroying the opportunity for economic growth to the north. The economic dichotomy that aggravates the development of the two halves acts as a determinant of their sociopolitical trajectories. To the west of the Gateway Arch, technology startups nestle themselves among charming housing development projects. To the east of the Arch, 45.8 percent of the population lives below the poverty line, and commercial employment opportunities are constrained to education and welfare administration. Neighborhoods to the west have a 30 percent African-American population, and those to the east have a 99 percent African-American population. In St. Louis city, the average man is expected to live 70 years, while his St. Louis county counterpart is expected to live 76 years. In the city, the average woman lives 77 years, but her county peer gains four more.

The concise assertion is that the city currently reads two different census reports: one of the thriving and one of the surviving. These socioeconomic factors have morphed into passive ones that empirically determine where wealth lands and flourishes. While wealth disparities remain a modern symptom of an “urban renaissance,” their roots take hold in a much darker period of the country’s history.

The singular disparity in these data points does not tell of the lives of the individuals, but it does tell the story of healthcare access and of public education—the two primary indicators that determine the stability of a community.


These residential prejudices have adopted new pseudonyms: public safety, loan records, and education quality. As a city with some of the widest gaps in regional public health as well as unfair access to a thorough education, St. Louis is built on the legacy and reputation of zip codes.

The health disparities observed by the census were constructed from a legacy of policy that favored white wealth fleeing urban centers in favor of suburban land developments. Because of this Delmar Divide, St. Louis is now one of the most racially and socioeconomically segregated metropolitan areas in the United States.

In 1917, the East St. Louis riots took at least 40 lives, fueling the Roaring Twenties with iterations of Caucasian homeowners’ failed attempts to restrict housing for their minority neighbors. In the closing months of 2017, a full century later, the Carnahan Courthouse sits regally on the intersection of Market and Tucker, just four short


miles west of East St. Louis and just 12 miles from Ferguson. Defended by metal gates in the hours preceding the controversial Stockley decision, the courthouse and the swelling tension around it were an eerie reminder of Ferguson and the racist history in the bones of these neighborhoods.

This is not the St. Louis most of us know. The narrative dictated by this data has inadvertently constructed an empathy gap in the way that we discuss wealth inequality. Like most complicated subjects, urban economic inequality is layered, confusing, and contradictory. Only in the context of the people is that empathy gap brought to light. This is not to say that the data is wrong. Accurately collected and processed, data is unbelievably valuable in locating public need, writing policy, and surveying trends, among a myriad of other benefits. Instead, this is to say that data is a poor storyteller; it aids in defending urban stereotypes. It is unbelievably easy to confuse trends in data with trends in communities, families, or individuals, and it is unbelievably easy to pretend that there is no difference between those two distinct forms of information.

The pulse of this city has blossomed along tense racial lines. The 90-some municipalities bounded by the Mississippi and Missouri Rivers exacerbate the seething residential narrative of the city that is—at its best—bleak, and—at its worst—suffocating. St. Louis has served as a national symbol of systemically derived racial inequality, and this is not a new story we’re telling. But the data will never sculpt the people of this city alone, because this is not the story we have to build.

Emma Baker '21 studies in the College of Arts & Sciences. She can be reached at




Since President Trump first proposed a “trillion-dollar infrastructure plan” on the campaign trail, it has featured prominently on Congress’s and the White House’s to-do lists. After the failure of the repeal-and-replace bill for the Affordable Care Act, its importance has only increased. There has been a great deal of discussion as to how much of the actual infrastructure should be private and how much should be public, whether it will include high-speed rail and upgrades to ports, and how to pay for it all. What has not been mentioned is that this bill is a golden opportunity to rejuvenate and reform America’s construction sector for the benefit of all, especially low-income families.

Construction is among the world’s least efficient industries. According to The Economist, over the last seventy years productivity in the U.S. has grown by 800% in manufacturing, an incredible 1,600% in agriculture, and almost 400% overall. In construction, it has not budged. This lack of efficiency is hurting our nation’s most vulnerable, as builders and developers have increasingly found it unprofitable to build low-income housing because the cost is too high. As a result, lower-income individuals and families are forced into crowded, often unsafe buildings for which they pay high rents, as the demand for low-income housing far outstrips supply. According to the National Low-Income Housing Coalition, in no state were there more than 56 units of cheap housing available for every 100 low-income households.

In the past, we have tried to build our way out of this problem. The failure of Pruitt-Igoe in north St. Louis dramatically illustrates the limits of this approach. The buildings were poorly designed and underfunded, and they quickly declined in safety and quality of life until they were torn down in the mid-1970s. The lack of low-income housing in the United States ultimately comes down to the fact that new units are not being built.
A lack of productivity growth is keeping costs high and hurting those in most need. What has held productivity growth back in the construction sector? Increasing regulation is partially responsible, but The Economist notes that it only accounts for about one-eighth of lost

growth since the 1980s. The biggest problem is that the industry is reliant on manual labor and small firms. My family owns a small homebuilding business, and as I grew up working on jobsites I saw firsthand how the vast majority of construction is still done by hand, with laborers employed by small firms. But why? Wouldn’t it be cheaper for businesses to consolidate and invest in capital, as they have in other industries? The answer lies in the deeply cyclical nature of construction. When the economy is doing well, few industries rise as far as housing, and when the economy collapses, few industries fall as far. The high cost of investing in labor-saving technology makes little sense if you know your business will go through a crisis in a few years. Workers can be laid off, but the costs of capital are fixed. As a result, construction firms have not invested in labor-saving technology, the cost of building has not fallen, and the poor suffer from a lack of affordable housing.

This is where the infrastructure bill comes in. If it produces even a fraction of the proposed trillion-dollar investment, it will be one of the biggest boosts to the construction sector in decades. Even if most of the actual building will be done by private firms, the federal government has enormous power over the process. It should use that power to push firms to invest in labor-saving technology. Congress should commit to spending a set amount of money over an extended timeframe. This will provide enough stability in the construction sector that firms will have an incentive to invest in devices such as self-driving bulldozers and cranes, as well as introduce technology already available, such as computer-aided drafting. The government should favor firms that do so in the bidding process, prompting modernization. Relevant regulatory agencies should look with favor on firms that try to merge.
These measures will spur industry consolidation and investment in labor-saving technologies, helping the construction industry to increase its productivity and lower the cost of building. Once the cost of building new units is lower, private companies and developers will have an incentive to build low-income housing, giving families more options and making them

less beholden to predatory landlords.

Will this alone solve the lack of affordable housing in America? No. But it would be a bipartisan step in the right direction. The only way to truly get the number of affordable housing units we need in this country is to make it cheaper and easier to build them. This bill is a golden opportunity for Congress to encourage long-term, sustainable change in the construction sector, boosting the fortunes of lower-income families as they build infrastructure for the twenty-first century.

Christopher Hall ‘18 studies in the College of Arts & Sciences. He can be reached at



DECONSTRUCTING THE HARVEY EXPERIENCE Madi Bangs | Photo courtesy of Wikimedia Commons


I’ll start with a few statistics. Over the course of six days from August 24th to the 30th, Hurricane Harvey dumped an estimated 27 trillion gallons of water onto the area from the tip of Texas to Louisiana. As observed by Vox, that’s approximately 1 million gallons of water for every inhabitant of the state of Texas. Data collected by the Nevada Geodetic Laboratory shows that the weight of all this water was enough to push the earth’s crust down by two full centimeters in Houston. The LA Times reports that some areas of Houston had over 50 inches of rain, and over 30,000 people were estimated to be displaced and in temporary shelters. We still don’t know how many lives were lost, how many homes were damaged, or how much the reconstruction will cost. But none of those numbers mean much of anything to me on paper.

My first week of school was different this year. Instead of buying books, seeing friends, and trying to find a routine, I sat in bed hunched over my laptop for hours, glued to local news, calling family friends, and trying to anticipate what disaster would strike next. I pored over elevation maps, trying to make some sort of bargain with the universe about how high the water was allowed to reach with each band of rain. I would follow the height of the bayou and reservoirs near my house, until the water level overtook the sensors. I felt like with the amount of information I had at my fingertips, I should be able to confine the storm to a number, a concrete prediction. But the universe would not be reasoned with, and every hour I was forced to keep negotiating back as places I cared about were swallowed. If you were a resident of Texas, where would you put your 1 million gallons of water? Mine, along with many others’, probably ended up in the Addicks and Barker Reservoirs, behind my house. Even as Harvey was winding down, they were still edging towards their limits, so the Army Corps of Engineers began controlled releases into the bayous and gutters, which


moved into the roads and filled the streets. So even after the rain had stopped, the situation had not gotten any better. I watched water rush into the homes and businesses of several friends days after we thought we were safe. We knew the alternative may have been worse. But that’s hard to rationalize when it’s your home that fills up instead.

Water is a peculiar form of home invader, because there is no way to win. You can’t fight it, push it back, or reason with it. There is no way to humanize yourself to an encroaching flood; it doesn’t care who you are or what you’ve done. It inexorably swallows whatever is in its path as it creeps, leaving behind months’ worth of mold, debris, and seething mounds of fire ants. The water acts as a mill, an equalizer, taking everything with meaning and turning it back to dirt. The floods do not discriminate, but they always manage to hit hardest those who are most vulnerable to their effects anyway. As glad as I was to have been out of Houston

when Harvey hit, my only wish was to be there, at home. I felt like if I were on the ground, there would be something that I could do to help, some comfort I could give. The wet grip that Harvey had over my city had settled into my chest, and it seemed wrong to look for a distraction just because I wasn’t at home, dealing with this mess like everyone else. Instead I was here, in St. Louis, trying to sit through an hour’s worth of lecture without checking the weather. Every time my phone buzzed my stomach would drop, because I knew it could be someone else I knew telling me they were now homeless, their neighborhood was under water, they had lost everything. Weighed down by an odd form of survivor’s guilt, I found it strange to watch people back at school going about their normal lives. When I would talk to friends about the destruction at home, the most common refrain I would get was “wow, I can’t even imagine.” And I would laugh, because a week ago I couldn’t either. When you grow up


on the Gulf Coast, hurricane coverage is something that usually fades quickly into background noise by midsummer. Every year, we Houston natives make our hurricane boxes and distantly watch the pixelated green and yellow blobs on the weather map spiral off towards Mexico or the Atlantic Ocean. But then something like Harvey hits. Andrew, Allison, Katrina, Ike, Sandy, Irma, now Jose and Maria. Because of these storms, I am now more familiar with the aerial views of my neighborhood when it is flooded than when it is not, thanks to the number of times I’ve seen them this week on national news. The standard shock of seeing the flooding and destruction on the internet is transformed by an almost numb jolt of recognition when you realize the photo is of a favorite bakery taken two blocks away from where you live.

To truly understand the impact of Harvey, though, you need to understand Houston. And I don’t think many people in America did until the mangled sights of our day-to-day were thrust into the spotlight as we were put through nature’s spin cycle. By the numbers, Houston, my hometown, is the fourth largest city in the United States, with a population of 2.3 million and a metropolitan area made up of over 6 million people. We’re a pretty flat city, but we have a larger footprint than some US states. We are global leaders in energy and medicine, and according to Houston’s government website, if we were an independent nation, Houston would rank as the world’s 30th largest economy. We speak more than 90 languages, our food is excellent, and we are a home to anyone who needs one, from immigrants to the tens of thousands of Hurricane Katrina refugees who found a new home with us. We are a city of new beginnings and survivors.

Anyone who knows me knows how viciously proud and protective I am of my city. And I have never been more proud of us than in the past few weeks. As a child, I watched the destruction of Katrina from afar.
And from the survivors, I know that my shell-shocked response to this disaster is a privileged one. Even before the rain stopped, caravans full of heavily equipped strangers were on their way into the city, trailing rescue boats and stretching for miles down what were now impassable roads. So many people have opened their hearts and their homes to our community. I feel so proud to know so many people I would now consider heroes, and others who have been so strong in the face of absolute adversity.

But while Houston was healing, and starting to pull things together, it seemed like the rest of the country was looking for someone to blame. On the left and right, it appeared everyone wanted a head to roll, something to hold responsible for the swirling mass of wind and rain that interrupted their regularly scheduled news hour and caused the devastation the country couldn’t look away from. But in reality, Houston became a very public canvas for certain factions seeking to spin our narrative to their advantage. Reactions ranged from ignorant to absurd across the political spectrum. Factions of the left decided that Harvey and Irma were expressions of nature’s own wrath upon the Trump-loving regions of Texas and Florida, conveniently ignoring the fact that both Miami and Houston are historically left-leaning cities. Factions of the right, when faced with criticism of Trump’s performance following Harvey, chose Obama as a point of comparison, vehemently condemning his unpresidential reaction to Hurricane Katrina in 2005. Where was President Obama during Katrina? Probably golfing, decided Twitter, in stunning ignorance of both basic subtraction and the length of presidential terms. A newly elected senator of the same name, however, was in Houston helping resettle refugees.

And of course, everyone had something to say about climate change. My favorite reaction probably came by way of Ann Coulter in a late-night tweet: “I don't believe Hurricane Harvey is God's punishment for Houston electing a lesbian mayor. But that is more credible than ‘climate change.’” Former mayor Annise Parker responded in the only appropriate way, confirming via Twitter a few days later that she could, in fact, control the weather.

In all cases, the narratives hurt. These were the things we saw online, the questions my friends and family had to field as we tried to escape from the reality of reconstruction. And something similar is happening in Puerto Rico now. All the focus on defining and respecting the flag, and what it means to be an American in the contiguous United States, has led us to ignore the Americans right outside our door.

The story that I have told is one of destruction. The story I hope to tell soon is one of reconstruction. Resilience and new beginnings are a key part of Houston’s story, and as I mourn for my community and everyone touched by Harvey, and now Irma and the other storms ripping through the Gulf, I know that we will persevere. So many people chose Houston as a place to start their lives over, and I know that they will do so again. It’s hard to accept that we will have to find a new normal from now on, and that my city will not be the same when I return. However, the stories that I have heard from my family and friends, of people from all walks of life coming together to support each other and help their city heal, give me hope. But part of me worries that the rest of America will get worn out. As the ink on the sopping maps of Texas, Florida, the Caribbean, and now Puerto Rico bleeds together, it’s easy to forget those for whom these disasters will define the next several years of their lives. We still need your help, but we are not here to support your narrative, or to be another prophetic bullet point in anyone’s agenda. The road to recovery will be long, hard, and in some places, probably still underwater for the next couple of weeks. We can’t do it alone.

Madi Bangs ‘19 studies in the College of Arts & Sciences. She can be reached at



THE (MIS)CONSTRUCTION OF KNOWLEDGE Dan Sicorsky | Illustration by Avni Joshi

Justice Anthony Kennedy wrote in two rulings nearly 15 years ago that sex offenders’ rate of re-offense, at almost 80 percent, is “frightening and high.” Since then, his “statistic” has been used by hundreds of lower courts and lawyers to defend policies that banish offenders from most communities. The severity of the punishment would be arguably reasonable, if only its backing were true.

It couldn’t be further from the truth. Justice Kennedy found his 80-percent statistic in a 1988 Justice Department guide, which in turn had cited a Psychology Today article written by the self-interested counselors of a male sex offender treatment program. The piece was filled with junk science. It claimed that “most untreated sex offenders go on to commit more offenses—indeed, as many as 80 percent do,” even though hundreds of studies show sex offenders’ recidivism rate is among the lowest of all criminals, at around 3.5 percent. The author of the article has since publicly regretted having exaggerated, but the game of broken telephone he and his co-author initiated had already snowballed, constructing a fact backed by the full weight of none other than the Supreme Court. Before the Psychology Today writers knew it, like the wind-blown feathers of a slashed pillow, their claim dispersed, inhibiting the reentry to society of generations of sex offenders, including the over 740,000 in the U.S. today.

Our words are powerful, and how we use them matters. Curated well, they can explain abstract concepts, tell stories, and describe laws. But misused, or repeated without enough care, words can also cause irreparable harm. The onus is overwhelmingly on public figures to understand the weight of their words, and it is also on all of us to differentiate between trustworthy and misleading sources.

What’s concerning is that we’re not even talking about individuals we objectively should not trust to inform us, like the recently deceased Paul Horner, the 38-year-old owner of cnn. and other deceitful fake news sites, who told the Washington Post after President Trump’s election that compared to past years, “Honestly, people are definitely dumber. They just keep passing stuff around. Nobody fact-checks anything anymore.” No, we’re talking about authors, judges, journalists, politicians—public figures we understandably trust to tell it to us like it is, but who too often fall short.

It is alarmingly easy to fabricate knowledge. An editorial in The New York Times earlier this year stated that former Gov. Sarah Palin’s PAC played a role in inciting the 2011 assassination attempt on then-Rep. Gabrielle Giffords, when in fact no such link existed. The paper later corrected itself and apologized publicly (and in court, when Palin unsuccessfully sued), but in the meantime, trusting readers were left believing an alternative truth.


High-profile figures and institutions also fudge facts when they make careless statements. Politicians lie; need I cite examples? How celebrities misrepresent information might be less obvious. Whoopi Goldberg, for example, claimed on ABC’s The View in January that former President Obama “didn’t do executive orders in the beginning.” PolitiFact later flagged her statement a Pants on Fire! lie, but it’s unlikely The View’s 2.7 million viewers ever found out.

This is especially worrying considering research on false memories that suggests people are alarmingly prone to believing fake information. Studies have found, for instance, that a 14-year-old could be “implanted” with the false memory of having gotten lost in a mall as a child, and that college students could be misled to believe they had been hospitalized with an ear infection. People are bound to trust Alex Jones’s claim that as part of their jobs police officers smoke marijuana yearly (another Pants on Fire! lie) if they’re prone to believing in false hospitalizations. That strikes a nerve.

The situation is worsened by motivated reasoning, a decision-making mechanism that explains how people go so far as to instantly trust information they like while deeply scrutinizing what they don’t like. Consider how motivated reasoning might have played a role in Justice Kennedy’s ruling: When drafting the Court’s majority opinion, Justice Kennedy embraced supporting evidence from the Justice Department that purported to prove a “frightening and high” recidivism rate, but presumably second-guessed a source that offered contrary—but more accurate—information. Scores of lawyers and politicians also exhibited motivated reasoning when they unconditionally believed Justice Kennedy’s sweet-sounding finding.

Lying public officials, malleable minds, competing information… Is this what it feels like to live in a post-truth society? Our tools of instant information allow anyone to share anything


anywhere, we are psychologically at-risk for gullibility, and not even trusted public figures are telling it straight. Either humankind is widely incompetent, doomed to descend into an informational abyss for our inability to distinguish between facts, opinions, and falsities, or we need to revisit the ways we share and consume knowledge. Our collective wisdom is endangered but not extinct. In what remains, I propose some concrete steps to preserve the integrity of our universal knowledge. Essentially, it is up to the arbiters, the creators, and the consumers of information to ensure that what facts are out there deserve to be out there. Above all, fact-checking must grow omnipresent. Research in political science and psychology ascertains that when experts flag untruths, citizens’ misinformed views can often be righted, especially when the fact-check is presented by an empathetic source (such as a fellow Republican or Democrat), or through a chart or graph. One study even found that politicians were less likely to lie if a letter sent to their offices warned them that PolitiFact was fact-checking that state’s politicians. Though fact-checking works, the practice has yet to receive support from two crucial partners: the media and the government. All news outlets, save for Bloomberg TV, refused to provide on-screen fact-checking during the 2016 election’s televised debates. This was unfortunate, considering how debates, as well as congressional hearings, speeches, and campaign ads, are ripe with misinformation. Television outlets would be providing a valuable service if they introduced a recurring fact-checking sidebar to keep speakers accountable and audiences well-informed. Independent fact-checking agencies do not have the wingspan to monitor thousands of public figures and their hundreds of thousands of untruths. 
That’s why the government could create a federal agency—a Department of Truth (not to be confused with the Ministry of Truth, the bureaucracy in George Orwell’s 1984 ironically concerned with lies) comes to mind—dedicated to monitoring public feeds and flagging statements that misinform the public. A federal agency would have the unique power to fine public figures who misinform deliberately not once or twice, but repeatedly. A federal agency would also oversee government reports, such as the flawed 1988 Justice Department guide Justice Kennedy referenced.

A second step toward restoring informational integrity is for media organizations to review their processes for ensuring the accuracy of their content. In traditional newspapers, that responsibility would fall upon copyeditors. But of the 10,676 copyeditors working in newsrooms in 2002, according to a journalism watchdog, fewer than 5,680 remained in 2012. If newspapers are truly interested in preserving accuracy, as they should be, they can start by hiring, not firing, copyeditors. Take it from the 100 copyeditors The New York Times removed from their positions this June, who, in an open letter to their bosses, called themselves “the immune system of this newspaper” who “go to great lengths to ensure quality and, most important, truth.”

Media organizations might also require statistics training for all reporters. If Fox News publishes the headline, “Police Officer Deaths on Duty Have Jumped Nearly 20 Percent in 2017,” as it did this summer, it cannot ethically quote Blue Lives Matter’s national spokesman talking about “the war on cops” but omit the major caveat that the spike is due to an increase in non-violent accidents like collisions. A journalist trained in statistics who still reports inaccurately should be removed from her role of informing the public.

Realistically, though, even with a Department of Truth or more copyeditors, falsified information will filter through the cracks and onto our screens. So, a third and especially crucial solution is for consumers of information to develop an eye for lies masquerading as truth. Classroom lessons on spotting biases and fake news don’t have to be boring, since the information young people consume through, say, films is just as prone to falsification. Imagine a new generation, one knowing Pocahontas’s true lover was not John Smith, as the Disney film portrayed, but another man, John Rolfe.
Lest we blame it on the kids, though, we would be wise to remember that adults can also develop into better readers, listeners, and thinkers. We should also be more demanding of our thought leaders: Politicians would think twice before lying if they felt their constituents’ support rested on their credibility. And all it takes to protest Sean Hannity or Jim Acosta is changing the channel.

In The Descent of Man, Charles Darwin wrote: “False facts are highly injurious to the progress of science, for they often endure long; but false views, if supported by some evidence, do little harm…” It is not opinions that we must scrutinize, since disagreements are necessary for a forward-thinking, democratic society to search together for truth. But facts, current events, politics, histories? These are the foundations of our knowledge—yours, mine, that of those who came before us and that of those who are yet to arrive. When this wisdom is tampered with, very real people are led to believe very unreal things. Certain forces will always yearn to fool. But faced with the misconstruction of facts, it will always be our civic responsibility to defend our minds.

Dan Sicorsky ’19 studies in the College of Arts & Sciences. He can be reached at



Interested in writing, design, or illustration? Join WUPR! WUPR.ORG/CONTRIBUTE We always welcome new contributors.

GLOBAL INEQUALITY CONFERENCE: Dialogue + Perspectives + Collaboration October 26-28, 2017

Thursday October 26th: Brown Hall, Brown Lounge

Saturday October 28th: Goldfarb 132

5:00-5:30 pm Reception 5:30-7:00 pm Keynote address: Interventions to Address Global Inequality Professor James Midgley, University of California – Berkeley

Environmental Justice Case Competition 8:00-8:30 am Registration and Welcome 8:30-11:30 am Teams prepare case 11:30 am- 1:00 pm Presentations 1:00-2:00 pm Lunch Provided 2:00 pm Awards and Closing

Professor Midgley is regarded as a pioneer in the fields of international social work and social policy in the developing world.

Friday October 27th: Women’s Building Formal Lounge 10:00-11:30 am Roots of Global Inequality: Structures, Processes and Practices Panel Discussion 11:45 am-12:45 pm Student Poster Presentations of Research and Practice (Lunch provided) 1:00-2:30 pm Social Identities and the Experience of Inequality Panel Discussion

RSVP: For questions, please contact Tammy Orahood at 314-935-2863 or Sponsored by:

WASH U: THEN, Hanna Khalil & Ryan Mendelson | Design by Dominique Senteza

Construction of Brookings Hall, circa 1900

Construction of Brookings Hall, circa 1900

Brookings Exterior Grounds, circa early 1900s

Construction of Brookings Archway, circa 1901

NOW, TOMORROW

Brookings Construction, October 2016



Brookings Construction, October 2016

Ann and Andrew Tisch Park, Scheduled Completion in 2019

Photos courtesy of the Washington University Construction Library Archives



THE GREATEST SUPREME COURT JUSTICE Luke Voyles | Photo courtesy of Wikimedia Commons


Justice Byron White does not have an excellent reputation among legal scholars or the general public. Conservatives do not usually remember White fondly because of his liberal rulings, and liberals do not particularly cherish him because of his conservative rulings. When noted Yale law professor Robert M. Cover wrote his famous New York Times article comparing Supreme Court justices to professional baseball players, he compared White to Jackie Jensen. Cover explained that both Jensen and White were “better as running backs” (White was a noted running back at the University of Colorado in the 1930s). He also found that White was completely overwhelmed by the towering presences on the Earl Warren Court, but that White became more powerful on the Warren Burger Court. Another assessment was even more scathing in its view of Justice White, calling him “an unmoving pragmatist” who “lives in infamy both for his individualistic approach to law and his success as a professional athlete.” As the title might give away, I completely disagree with these views of White. I believe that White was a magnificent centrist dedicated to a more equal United States, and he was also suspicious of governmental overreach.

His life and career were unorthodox for a Supreme Court justice. Born in 1917, he played professional football for one year for the Pittsburgh Steelers and went into the army as an intelligence officer. He wrote the report on the sinking of PT-109, onboard which was the man who would appoint him to the Supreme Court in 1962: John F. Kennedy. White served as deputy attorney general under Kennedy and helped protect both Martin Luther King, Jr. and the Freedom Riders in Mississippi. As a Supreme Court justice, legal scholars had no reason to question White’s credentials on the liberal Earl Warren Court, as he was an appointee of John Kennedy.

At first, White did not seem to deviate too much from Chief Justice Warren. White joined the majority in Abington School District v. Schempp (1963). He wrote the majority opinion in McLaughlin v. Florida in 1964, which partially overruled Pace v. Alabama (1883) and allowed two people of different ethnicities to live together. It would take three more years for Pace to be completely overturned in Loving v. Virginia (1967), a case in which White joined a unanimous Court. He helped the Court incorporate the Sixth Amendment’s guarantee of a trial by jury against the states in his majority opinion for Duncan v. Louisiana (1968).

Even after Warren Burger replaced Earl Warren as chief justice in 1969, White continued to support liberal causes. In 1972, White concurred with the 5-4 ruling in Furman v. Georgia that the death penalty laws of the United States were too arbitrary. He wrote for the majority in banning the death penalty for people guilty only of rape in Coker v. Georgia (1977), a landmark ruling in Eighth Amendment jurisprudence. White also spoke for the majority of justices in Duren v. Missouri (1979), explaining that Missouri’s automatic exemption of women from jury service amounted to discrimination.

As the 1970s progressed, White revealed himself to be quite the conservative on certain issues. For context, the one dissenter in Duren was Justice William Rehnquist, who joined the Court in 1971. While White continued to hold the center-right of the Court, he frequently began joining with Rehnquist against many other members of the Court. In 1973, they were the sole dissenters in Roe v. Wade. White also joined Rehnquist in supporting Congress’s power to expel non-citizens for no reason whatsoever in Immigration and Naturalization Service v. Chadha (1983). It was also during his time on the Burger Court that White made his great blunder as the writer of the majority opinion in Bowers v. Hardwick

(1986). White was incorrect both in his conclusion and in his methodology. Justice Sandra Day O’Connor joined White’s opinion, but would later explain in her concurrence in Lawrence v. Texas (2003) why she had agreed with White. O’Connor believed that states had the right to ban certain sexual acts (i.e., sodomy) for both sexes. However, O’Connor concurred in Lawrence because the Texas law in question banned only homosexual sodomy, which she considered to be gender discrimination and therefore unconstitutional. White did not take that road out, and specifically allowed such a ban as a bar on homosexual activities.

Because I consider White to be the greatest Supreme Court justice ever, I must stop to explain a certain hypothesis that might rescue White’s reputation from the Bowers quagmire. All justices eventually write or join an opinion that is later condemned by practically everyone who reads it. Chief Justice John Marshall wrote the majority opinion in Barron v. Baltimore (1833), forbidding the incorporation of the Bill of Rights against the states in such an authoritative manner that legal scholars are still undoing the damage 170 years later. The revered


Justice Joseph Story wrote the majority opinion in Prigg v. Pennsylvania (1842), which allowed the Fugitive Slave Act to override state laws banning the importation of African-Americans into slavery. The sole dissenter in Plessy v. Ferguson (1896) was John Marshall Harlan I, who wrote in that same dissent about not including people of Chinese origin among the American citizenry. The oft-quoted Justice Oliver Wendell Holmes, Jr. wrote the majority opinion in Buck v. Bell (1927), where he argued that a state could sterilize people whom the state considered mentally ill. Chief Justice Earl Warren dissented from the majority in Street v. New York (1968), arguing for reinforcing a ban on burning the American flag in any manner other than the officially recommended way (White admittedly dissented in that case as well). Even White’s most liberal colleagues were not immune from making such blunders. In Osborne v. Ohio (1990), the Court ruled that a state could make the mere ownership of child pornography illegal. Justice William Brennan wrote a dissenting opinion that was joined by Thurgood Marshall and John Paul Stevens. Brennan could have argued about whether somebody with such materials in their house knew they had them, or whether a family member might have owned them, but he did not truly make that argument. Instead, he argued for the right of someone to own what most people considered “distasteful” as part of their First Amendment rights. The lesson of this paragraph is quite simple: No justice can ever be even close to perfect on the Supreme Court.

White actually served his final seven years on the Court with Rehnquist as chief justice, further increasing his influence and prestige. He got to see the new guard join Rehnquist as Sandra Day O’Connor, Antonin Scalia, Anthony Kennedy, and Clarence Thomas joined the Court in his final years as a justice. White began disagreeing with his new colleagues and began to see how they might be as questionable for the cause of liberty and for a just society as his more left-leaning colleagues were. No case better exemplified his distaste for the conservative new guard than Harmelin v. Michigan (1991), in which the Court held that Michigan could impose a life sentence without parole on someone for owning 672 grams of cocaine. White dissented vigorously against the rest of the conservative bloc. Harmelin basically crippled Solem v. Helm (1983), a case holding that a state could not give a life sentence without parole to someone for merely passing a bad check for around $100. White had previously dissented in Solem in 1983, but now saw the Court as destroying a harmless precedent of American jurisprudence. White became more concerned about the conservatives while continuing his crusade against abortion, dissenting from the Court when it upheld Roe with certain restrictions in Planned Parenthood v. Casey (1992).
By 1993, White was the most senior associate justice on the Court, and in that same year Bill Clinton became president of the United States. White stated that someone else should have the chance to experience service on the Supreme Court, and he retired under a president who he knew would replace him with a left-leaning justice, in that case Justice Ruth Bader Ginsburg. He retired honorably and in relatively good health, having seen the struggles of his colleague Thurgood Marshall, who desperately waited for a Democrat to enter the White House. As he saw Marshall honorably leave in 1991, so White followed his example two years later. White died in 2002. Former colleague John Paul Stevens praised White for his knowledge of the balance between the three branches of government, but no one argued that he was the greatest justice in the Supreme Court’s illustrious history.

I believe that White was the greatest Supreme Court justice because of his strong morality and for his hatred of insipid factionalism. White followed his strong moral compass into decisions that were unpopular in their time and occasionally remain so in the present. White’s Christian beliefs did not allow him to sign off on abortion, even as he saw how Christianity and other faiths could be oppressed through school-sponsored readings of certain religious texts. He tried to uphold a just and equal society that was fair to all citizens without infringing on the rights of too many people. He refused to cater to any one side, and proved all the while that William Butler Yeats’s poem “The Second Coming” could not have been more incorrect on one particular point. The center can, in fact, hold.

Luke Voyles '18 studies in the College of Arts & Sciences. He can be reached at



THE LAST TRUE CONSERVATIVE Max Handler | Illustration by Natalie Snyder


The Republican-controlled House of Representatives has voted to repeal Obamacare, officially known as the Affordable Care Act or by the acronym “ACA,” over 50 times since its passage. Two separate Republican presidential nominees, Mitt Romney and Donald Trump, and countless would-be Representatives and Senators campaigned, and in many cases won, on promises that they would repeal the bill. And yet despite Republican control of both houses of Congress and the executive branch, repealing the ACA seems increasingly unlikely.

Twice now the Senate has made formal attempts at repeal. The first came this summer, with the now-infamous “skinny repeal.” The second is the so-called Graham-Cassidy bill, named for its authors, South Carolina Senator Lindsey Graham and his Louisiana counterpart Bill Cassidy. Neither bill truly represented the full repeal of Obamacare that Republicans have long sought (the skinny repeal didn’t touch many of the ACA’s taxes and subsidies, and both allowed children to stay on their parents’ plans until they turned 26), but many on the right saw each as a first step toward full repeal.

The main villain of both these failures, for many Obamacare opponents, is John McCain, senator from Arizona and former Republican presidential nominee. After voting for a motion to proceed with the skinny repeal, McCain voted ‘no’ on the repeal itself. And despite the fact that the most recent bill was put forward by McCain’s close friend Lindsey Graham, McCain once again refused to support repeal. In both instances, McCain had the same reasoning: process.

As finding the 60 votes typically required for passage would have been impossible, Republicans have instead used reconciliation, a complicated process that allows them to pass a bill with just 50 votes without having to worry about breaking a filibuster. Fifty votes, of course, means a tie, which Mike Pence, as Vice President, gets the chance to break using a rarely-invoked power of the office. But the legislative chicanery does not stop there.


The skinny repeal was drafted in secret, and it was only voted on in the wee hours of the morning by Senators who did not know what was in it. Committee hearings, normally a necessity, have been done away with. Reports by the Congressional Budget Office (CBO), which estimate the financial impact of potential bills and thus provide the public with vital information about them, have been either done away with entirely or flat-out ignored.

These actions hid the actual contents of the bill from the public, preventing them from forming an opinion. Beyond that, the normal legislative process allows experts and other senators to provide feedback on a bill and make it better. These departures from the normal process precluded any such improvements, leading to a worse bill.

John McCain is almost the only Republican concerned by these irregularities. Lisa Murkowski and Susan Collins, the other two GOP no votes in the Senate, have also expressed their disagreement with the manner in which their colleagues have gone about passing a repeal, but their most strident criticisms are substantive in nature. McCain by all accounts seems willing to support a normal repeal of Obamacare. Still, that is quite simply not enough for many on the right. And those feelings of anger at McCain are to some degree understandable. Many Republicans, including John McCain, campaigned on a promise to repeal Obamacare, and voters are right to be mad that the politicians for whom they voted have not delivered on that promise. However, to make McCain the villain of the

GOP’s failures is tantamount to forgetting what it means to be a conservative. Conservatives fight to conserve institutions and their power, as they know that it is much harder to build things than to tear them down. Stable legislative institutions are vital to the strength and success of this nation, allowing for the passage of new laws only after careful consideration. Obamacare ought to be repealed, but it ought to be repealed through the normal process. The Senate was meant to be a deliberative body in which the minority party has the power to influence bills. Republican actions threaten that purpose. Norms only exist because of action; if Republicans continue at this rate, the Senate will continue its slide from an institution that produces good legislation through careful debate to a purely partisan, barely functioning one.

Seeking to enact preferred policies by any means necessary is destructive; there was a time when those who called themselves conservatives understood that. But as recent events have made abundantly clear, that sort of conservative is in desperately short supply. Thus, conservatives ought to applaud John McCain for remembering what conservatism is all about, and for refusing to depart from his principles despite being under pressure to do so.

Max Handler ‘18 studies in the College of Arts & Sciences. He can be reached at




For the past 104 years, the chief executive of the United States has been a Trinitarian Christian.

1. I would like to show how American presidents have become more religiously orthodox.
2. I would like to explore various explanations as to why American presidents have embraced orthodoxy.

When William Howard Taft is remembered, it is mostly as the most overweight president or as the only president to also serve as Chief Justice of the Supreme Court. However, Taft held another distinction: he was the last known Unitarian to serve as the chief executive of the United States. Starting with the inauguration of President Woodrow Wilson in 1913, every U.S. president has been an orthodox (i.e., Trinitarian) Christian. A Trinitarian is a Christian who worships the Father, the Son (Jesus Christ), and the Holy Spirit as three coequal persons of one God. Trinitarianism is nearly universal across Catholic and Protestant denominations, leaving Unitarianism more marginal than it was in certain eras of the past.

Though Taft was the last non-Trinitarian Christian president, he was not the first. He was not even the most unorthodox. Historian William A. DeGregorio wrote that John Quincy Adams refused to believe in either the Virgin Birth of Jesus or in the possibility that miracles occurred in nature. His father, second U.S. president John Adams, was also a Unitarian. Additionally, the Unitarian Millard Fillmore challenged a New York law forcing public officials to claim that they believed in God, a kind of law that would not be completely abolished in the United States until the Supreme Court’s Torcaso v. Watkins ruling in 1961. Thomas Jefferson was a sort of Christian deist, who subscribed to the axioms set forth by the Bible without thinking that the Creator God actually intervened in the affairs of the world. Even Abraham Lincoln was not known to be

particularly religious, or even to believe in God, though he respected the views of the majority of Americans. In fact, DeGregorio could not find an example of a zealously Christian president until William Henry Harrison, the ninth to hold the office, who came to it 52 years after George Washington’s first election.

The aforementioned facts do not imply that every president before Woodrow Wilson was irreligious or even unorthodox. The 14th president, Franklin Pierce, would not open letters on Sundays. Grover Cleveland was a loyal Presbyterian, and William McKinley was a true-blue Methodist. Still, the now-104-year streak is striking in the face of such a varied religious history. There are some consistent factors, though: No president of the United States has publicly declared dedication to agnosticism, atheism, or any religion other than Christianity.

A partial explanation of why presidents became both more orthodox and more overtly religious may be attributed to the effects of the Second Great Awakening. Historian Donald Scott wrote about the evangelical impact of the Second Great Awakening, as opposed to the revival within existing congregations that characterized the First Great Awakening of the 18th century. Ordinary women and men outside of the churches were the focus of this later movement, and their conversion to Trinitarian Christianity was of the greatest importance to its pastors.

The conversion of massive numbers of people to a specific form of Christianity possibly encouraged politicians to be more public with their faith. Nevertheless, the individualism of some of the early presidents cannot be discounted. The first presidents were adherents of the Enlightenment, a movement that encouraged skepticism and promoted empiricism. The end of the Enlightenment and the beginning of the Second Great Awakening encouraged belief in miracles, healings, and the equality of all humans before the Judeo-Christian God.

Presidents are now far more open about their faith in God than ever before. Democratic President Harry Truman of Missouri declared a National Day of Prayer, while his successor, Republican President Dwight Eisenhower, founded the National Prayer Breakfast. As these examples show, the Trinitarian streak is bipartisan, as both Republicans and Democrats have grown more openly religious. Democrat James Buchanan did not join a church while president from 1857 to 1861 because he thought the public would find the decision shameless. By contrast, Democratic President Jimmy Carter taught Sunday School classes while in office, and Republican President Ronald Reagan invoked God after the Challenger disaster of 1986. Clearly, the changes since the start of the Trinitarian streak are obvious and widespread.

In short, historians will never know exactly what the men who have served as president of the United States believed. For a contemporary example, Hillary Clinton and Donald Trump never directly challenged Trinitarian Christianity. Neither wore Trinitarianism on their sleeve, but attacking the ideology was out of the question. We will never know exactly what they believe or whom (if anyone) they worship. Nevertheless, the ubiquity of Trinitarianism among the major candidates shows its importance in gaining votes at present.

Luke Voyles '19 studies in the College of Arts & Sciences. He can be reached at




HOW TO DEBATE NON-INTERVENTIONISTS Nicholas Kinberg | Photo courtesy of Wikimedia Commons


I can’t count how many times I’ve heard people say “the U.S. should mind its own business” or “we should leave the Middle East alone” or “alliances with none, trade with all.” Every time I do, I cringe a bit, not only because I disagree, but also because points like these are quite difficult to refute. As a policy, non-interventionism seems like a simple fix: if the United States chooses not to get involved in the affairs of other countries, “calamities” like the Iraq and Afghan wars never would’ve happened. Yet people fail to consider the justifications for these wars and conflicts like them, the fact that hindsight is 20/20, and that they can’t exactly prove that things would’ve been better without these events.

Take Iraq, for example. Were you to go up to anyone on the street and ask them what they thought of the war, they would tell you that it was, one, not justified, and two, stupid in every way. Tell that to the Kurds, against whom the Hussein regime committed genocide in the al-Anfal campaign of 1988. Tell that to the nearly one million Iraqis killed by the Hussein regime over its reign. Tell that to the UN weapons inspectors who were kept from inspecting Hussein’s weapons facilities in the months leading up to the invasion. Hussein’s invasion of Kuwait in 1990 practically begged the international community to remove that cancer of a government. And in 2003, Hussein finally got what was coming to him.

Now, what if Iraq hadn’t been invaded? One must consider two premises in this scenario. The first is that the intelligence community was correct in its assessment that Hussein harbored weapons of mass destruction. To this day, speculation abounds about what he had. The Bomb in My Garden: The Secrets of Saddam’s Nuclear Mastermind presents chilling evidence of the existence of WMDs and of how scientists under the Hussein regime were forced to hide the weapons near their houses. Had it turned out that Hussein did


have some type of weapon of mass destruction, the United States would’ve looked quite stupid. This would have been not least because Hussein had used chemical weapons against the Kurds and because he refused to allow UN weapons inspectors to study his weapons facilities in the months leading up to the invasion.

For the second premise, consider the country through the lens of a policymaker in 2011. The Arab Spring has just begun. Revolutions are springing up across the Arab world. And in Iraq, Hussein is poised to put down any uprising, just as he did in 1992, when he brutalized even more Kurds. Much like Assad in Syria, Hussein not only refuses to give up power, but kills hundreds of thousands of his own people in the process. The result is worse than the Syrian Civil War. Over the span of a few months in 1988, over 100,000 Kurds were gassed to death. Imagine the carnage over the length of a few years.

Let’s move on to a more uplifting example. Afghanistan, though the Taliban was brutal, was not nearly as rogue a state as Iraq. For most of its history, it has been a collection of tribes all just trying to get on with their lives. But there’s a problem. Al-Qaeda has taken root in Afghanistan, with the Taliban as a complicit host. And the terrorist

group has just killed nearly 3,000 Americans one cold September morning. Setting aside any emotional response (which would be completely justified), it made complete sense to go after Al-Qaeda wherever it sat, even if that meant overthrowing the governments that gave it asylum.

A common rebuttal might be: instead of invasion, why not special operations? I find that the people who suggest such a thing have no idea what it takes to carry out just one. According to Brookings, “spec ops” take a tremendous amount of planning, intelligence, money, and luck. Compared to an invasion, which would still be costly but would get the job done much more quickly, a series of “surgical strikes” over the span of decades makes no sense. But one might respond that even with the invasion, Al-Qaeda is as strong as, if not stronger than, it was sixteen years ago. That should demonstrate, then, how tough an organization it is, and how much stronger it would have been without any substantive action taken. President Bush’s surge of troops in Iraq from 2007 to 2008 virtually destroyed Al-Qaeda and its affiliates there by 2010. Iraq was, compared to the beginning of its civil war in 2014, a stable, mildly prosperous parliamentary democracy. Our withdrawal in 2011 endangered that progress, directly leading to the declaration of the Islamic State of Iraq and al-Sham in 2013 and the capture of Mosul the following year.

Now, let’s address the elephant in the room: Syria. If Iraq is the reason why we shouldn’t intervene, Syria is the reason why we should. Since the beginning of the Syrian Civil War in 2011, President Assad has gassed his own people multiple times, caused over 75 percent of civilian casualties (according to UNHCR’s Syria Emergency page), killed over 500,000 of his own people and displaced millions more, abetted the rise of the Islamic State by making deals with


and fighting it only in token fashion, and promoted Russian and Iranian interests to the detriment of his majority-Sunni population. The best we did for Syria was fund certain rebel groups, and even then, once the Islamic State came on the scene, we restricted groups like the Syrian Democratic Forces to fighting only that terrorist organization. As a result, several other rebel groups bolted to fight the regime, usually joining the Islamist rebel group Ahrar al-Sham or the formerly Al-Qaeda-affiliated jihadist group Hay'at Tahrir al-Sham (HTS). Beyond that, because of the indifference of the world's greatest superpower, American allies like Israel must now negotiate with Russian counterparts to keep mortal enemies like Iran and Hezbollah from strengthening themselves just outside the Golan Heights and in Lebanon. Syria is now a puppet state of Iran, as most of Assad's forces are loyal to Tehran. Sunnis have been alienated to the point of having to live under the rule of HTS and the Islamic
State, as these groups unfortunately provide the most competent governance in the region. And to top it all off, no one trusts our word. How could they, when we vowed to intervene should Assad use chemical weapons, then settled for the "removal" of said weapons when he did? How could they, when we declared in 2011 that Assad had to go, yet in 2017 he remains? How could they, when we say "never again" but "every few years" is more apt?

And if humanitarian arguments don't sway you, consider American interests. Since the end of the Second World War, it has been in the United States' interest to maintain a global presence. That influence is threatened whenever the United States refuses to use its soft and hard power. For an example, look no further than President Obama's Middle East policy of "benign neglect." In 2007, Iraq was relatively stable, the Gulf countries weren't sanctioning each other (see Qatar), Iran was much weaker, and Al-Qaeda was all but destroyed. In 2017, Iran has made puppet states out of Syria,
Lebanon, Yemen, and Iraq; Gulf countries have sanctioned each other; and Al-Qaeda is poised to expand across the Muslim world. That people began comparing any military action against Syria to Iraq after our April strike on al-Shayrat (a response to Assad's use of chemical weapons) shows a dangerous level of ignorance among the general populace. Americans, and humanity most of all, must recognize and embrace the obligation to stop madmen like Assad and Hussein. Dictators will never have proper legitimacy, and their regimes will therefore be inherently unstable. That makes neutrality in the face of repression painful for all actors. Nonintervention will always take the side of the oppressor.

Nicholas Kinberg '20 studies in the College of Arts & Sciences. He can be reached at





“I'd have been better off broke.” This is how Abraham Shakespeare, a $30 million American lottery winner, described his windfall.

“It's just upheaval that they're not ready for,” a financial consultant explained. “People have had terrible things happen.” A common “filler” story for news channels is the “Curse of the Lottery.” These cautionary tales warn that receiving large and sudden wealth without preparation leads to ruin. Venezuela should have heeded these warnings. The ongoing Venezuelan crisis can be better understood through this mythical “Lottery Curse.”

In 1998, the Venezuelan people democratically elected Hugo Chávez as their president. For the first few years, Chávez ran the country competently. Then the bonanza hit. In 2002, oil prices surged. Venezuela has the largest oil reserves in the world, and its only oil company is, unsurprisingly, state-owned. Thanks to the price surge, Venezuela brought in over a trillion dollars of revenue over the next decade, and its GDP per capita quadrupled. Hugo Chávez had just drawn the dream hand for a socialist leader: popular support, consolidated power, and seemingly unlimited cash. For all intents and purposes, Venezuela had won the lottery.

Like many who stumble upon such a jackpot, Chávez's first instinct was to spend. He created countless new social programs and showered his political and military allies with wealth. Venezuelans prospered under Chávez's lavish rule. During his fourteen years as president, infant mortality dropped by nearly half, unemployment was halved, and the poverty rate decreased by two-thirds. The minimum wage spiked, as did the literacy rate and the general quality of life across Venezuela.

Material wealth lulled the country into a false sense of security. Enamored of its present, Venezuela neglected its future. Chávez had not just redistributed wealth; he had also redistributed power. His party seized control by rewriting Venezuela's constitution, taking over
businesses, and weakening the legislative branch while packing the courts. The military and political elites lined their pockets with petrodollars and enjoyed priority access to invaluable foreign currencies. Criticism was squashed and speech was silenced. As Chávez's health declined, he handpicked Nicolás Maduro as his protégé and entrusted him to defend his legacy. Maduro, a bus driver without a high school education, won the highly contested 2013 election by a thin margin. He then took the wheel and drove the unwitting Venezuelan people straight into disaster.

This became clear when the inevitable happened: oil prices dropped. As many lottery winners do, Venezuela had spent instead of saved. Oil prices fell in 2014, and—since oil constituted 98 percent of Venezuela's exports—Chávez's gilded house of cards collapsed. Instead of allowing excessive social programs to fail, Maduro propped them up by printing money uncontrollably. Both Chávez and Maduro neglected to make capital investments, failed to support local manufacturing, and weakened any long-term safety nets. As is the case with many lottery winners, they crippled Venezuela by making it dependent on temporary wealth.

Maduro attempted to mitigate the damage by fixing foreign currency exchange rates and capping the prices of necessities such as food and medicine. At the same time, he inflated the Bolívar, Venezuela's currency, to world-record levels through exorbitant printing. Inflation is soon expected to reach 1,660 percent, and the Bolívar has at times been worth less than the virtual currency used in the popular video game World of Warcraft. Venezuelans now face shortages of necessities from life-saving medications to bread and toilet paper. Poverty, disease, and starvation have become commonplace. Four out of five households, twice as many as in 1998, are experiencing poverty. Corruption is rampant as public
funds are pocketed by the political and military elite. At the same time, crime is widespread and growing. Heavily armed militias take over poor neighborhoods. Purported “Mega-Gangs” roam the streets with bazookas and grenade launchers. Caracas, the country's capital, holds the dubious honor of being the world's most violent city, with about 4,000 murders a year.

Meanwhile, Maduro continues to consolidate his power and crush the last vestiges of democracy. Opposition groups are in disarray as the government suppresses demonstrations and protests. Police have killed over 120 protesters in the past three years and injured thousands more. A formerly “thriving” experiment in socialist democracy is teetering on the edge of becoming a military dictatorship.

Venezuela, through corrupt leadership and poor planning, has managed to turn a blessing into a curse. In responsible hands, Venezuela's "jackpot" could have revived its economy, infrastructure, and humanitarian resources. This bonanza petrolera could have been the golden ticket to a prosperous future. Instead, Chávez and Maduro embraced populism, corruption, and totalitarianism, and thus doomed their country.

Daniel Smits '21 studies in the College of Arts & Sciences. He can be reached at


Build a portfolio. Express yourself. Be heard. WUPR.ORG/CONTRIBUTE WUPR is always seeking submissions from Wash U undergraduates.





The evolving crisis on the Korean Peninsula represents one of America's most consequential foreign policy challenges. In an armed conflict, military and civilian casualties would likely exceed half a million in the first 90 days alone. Four of the world's 12 largest economies are located near North Korea, and the world's largest, the U.S., would be directly harmed as well. Waves of refugees, numbering in the millions, would flee to neighboring countries, culminating in a migrant crisis that would likely dwarf the Syrian experience. Such a conflict would be more devastating to the global economy than any war since WWII, while total casualties would exceed those of the Vietnam War.

The North Korean government—officially the Democratic People's Republic of Korea, or DPRK—has conducted 13 ballistic missile tests in 2017, culminating in last week's successful launch. In recent years, the DPRK has ramped up its launch schedule, with a record 23 tests last year and a total of 82 since 2011—far more than in the previous 27 years since testing began. While advanced missile tests alone would spark far less concern, the progress of North Korea's nuclear program has drawn considerable condemnation and alarm from many nations, particularly neighboring South Korea and its close ally, the United States. Just this year, Kim Jong-un's regime oversaw a nuclear test that greatly exceeded the explosive power of its previous record, set in 2016. In numerical terms, the most recent test yielded an estimated 120 to 160 kilotons of TNT; September 2016's test yielded only 17 kilotons, while the “Little Boy” bomb dropped on Hiroshima equaled just 15. Under Kim Jong-un's leadership, North Korea's nuclear program has made startling progress. So what is preventing the international community from swiftly halting these
developments? The situation is rife with competing geopolitical interests that complicate any diplomatic or military action. Foremost among them is the DPRK's desire for self-preservation and for eventual reunification of the Korean Peninsula. Without its nuclear and complementary ballistic missile programs, both of these goals are—in its view—unattainable. With such high importance attached to these twin programs, there is little reason to believe that any measure short of complete military intervention would halt further progress. Against this backdrop lies the antagonism between the China-Russia geopolitical alignment and that of the U.S. and South Korea. For years, China, North Korea's sole ally, perceived the DPRK as a necessary land buffer against the sizable military forces of the United States and its allies. As such, many analysts long assumed China would come to the DPRK's aid in the event of a preemptive attack, which would likely ignite a conflict unprecedented in scale since WWII. In recent years, however, particularly since Kim Jong-un's ascension in 2011, China has grown increasingly frustrated by the defiant and belligerent behavior of its neighbor to the northeast. Notably, China recently approved new, sweeping economic sanctions against North Korea. Russia has played a more subdued role in supporting Kim Jong-un's regime, but President Vladimir Putin is now attempting to fill the vacuum of influence China has left. Doing so would let Putin burnish his foreign policy credentials while expanding Russia's sphere of influence, particularly through humanitarian and military aid in the event of conflict. On the other side, South Korea seeks to preserve its own existence, both politically and literally. Moreover, any military intervention
in North Korea would imperil South Korea's civilian population. Seoul, the country's capital, boasts a population of 25 million and lies just 35 miles from the Demilitarized Zone, or DMZ. Figuratively at South Korea's side stands the United States, the nation the DPRK has always viewed as its archenemy. While geographically distant from the region, the U.S. is South Korea's most powerful military, economic, and political ally and has about 30,000 troops stationed on the peninsula. More alarmingly, the DPRK has on numerous occasions touted its military capacity and its desire to annihilate the United States.

Underlying these realities is the long and eventful history of the DPRK, an oft-ignored component in understanding and ultimately resolving the ongoing crisis in the region. The historical origins of North Korea's aggressive and defiant foreign policy posture lie in the armistice signed in 1953, which concluded the Korean War. It left North and South Korea in an essentially frozen state of war and resulted in the formation of the DMZ, along which 28,500 American, 522,000 South Korean, and 1,000,000 North Korean soldiers are stationed today. In this climate, North and South Korea are never far from conflict. In South Korea, this is evident in the two-year military service mandate for all physically fit men between the ages of 18 and 35; an even larger proportion of North Korean men serve in the armed forces. In 1958, Kim Il-sung ordered the withdrawal of the Chinese forces that had assisted in the DPRK's reconstruction since the end of the Korean War; no foreign troops have set foot on North Korean territory since. In the 1950s, a key power struggle had occurred between the regime led by Kim Il-sung and rival factions seeking influence over the nation. It led to the transition of the "Hermit Kingdom" from a multi-party authoritarian state somewhat
influenced by the Soviet Union and China to a single-party totalitarian government under the firm control of Kim Il-sung. Since that formative period, the isolated nation has constructed a political and social culture that revolves around the regime's current leader, whom the people worship as an exalted, superhuman figure with almost religious fervor. The education and propaganda network is structured around the dualistic struggle between North Korea and its archnemesis, the United States. Anyone who hesitates to accept the dogma and rules of the regime is subject to internment in brutal forced labor camps. Such conditions have produced a firmly united people, bound by conditioned loyalty and the specter of fear. Such a nation would not only be difficult to subdue in a conflict, but would also be challenging to integrate into modern society if the current government were to collapse.

What is often overlooked is the critical fact that North Korea's motives are not strictly defensive; the country survived for decades before even signaling nuclear ambitions. Instead, its actions reveal an ultimate goal of reunifying the whole of the Korean Peninsula. In the face of U.S.-led military resistance and an economy estimated to be smaller than that of Birmingham, Alabama, a conventional military victory is out of the question for the North Korean government. To underline this point, the U.S. military budget for the upcoming year is a staggering $700 billion, more than those of the next 10 largest military spenders combined. With the threat of nuclear retaliation, however, the DPRK may effectively deter American and international military intervention altogether. This would leave South Korea's military as the lone stalwart against the DPRK in its endeavor to conquer the entire peninsula.
As the regime's nuclear and complementary ballistic missile programs are evidently essential to achieving this long-standing goal, a cessation of such efforts seems extremely unlikely. North Korea's government is today one of the most reclusive and meticulously controlled in the world. When Kim ascended to power in 2011, many analysts were skeptical that he would last more than a few years; he has instead consolidated control. North Korean society mirrors a real-world version of the widely read dystopian novel 1984. Unfortunately, many ludicrous misconceptions about North Korea pervade the media, both because reliable information on domestic affairs is scarce and because of the lengths to which the government goes to accompany, monitor, and choreograph the few journalists who visit the nation. Of the little knowledge that does leave the country, much sounds inconceivable but is actually true. One striking example is the Sinchon Museum of United States War Atrocities, a museum devoted entirely to alleged American military crimes during the Sinchon Massacre, a scantily sourced atrocity from the dawn of the Korean War. According to Korean Central Television, one exhibit describes it in gruesome detail: “In the last liberation war... during our strategic retreat, the American hyenas...arrested Min Youngshik...stabbed
her muscles with a three-pronged spear and sucked her flowing blood.” Kim Jong-un himself has released statements through his state-run news network claiming that “they are cannibals and homicides seeking pleasure in slaughter.” North Koreans are regularly fed propaganda in various forms, reflecting the extent to which the regime exercises control over the lives of the populace. All of this said, it is clear that wherever Kim leads, the DPRK follows. Given Kim's audacity and ambition, it seems reasonable to expect that he will not easily budge on issues important to himself and to the fate of his nation. To paraphrase Jonathan Pollack, a senior fellow in Foreign Policy at the Brookings Institution's Center for East Asia Policy Studies, the last concession Kim would make is the relinquishment of his nuclear and ballistic missile programs. A peaceful resolution of this crisis, resulting in the dismantlement of the nation's nuclear arsenal, is unforeseeable. Instead, the United States and its allies must prepare for one of two outcomes: a North Korea that possesses advanced nuclear armaments made perilously mobile by capable intercontinental ballistic missiles, or a decisive, full-scale war with one of the world's only remaining totalitarian states.

Johnathan Romero ‘20 studies in the College of Arts & Sciences. He can be reached at


WUPR is more than a magazine! We would love to meet you. Head to

WUPR.ORG/CONTRIBUTE to learn how to get involved.


WUPR's October 2017 issue, "Constructions"!