Political Review 19.3 | November 2013 | wupr.org
What We’re Losing
Editors’ Note

Dear Reader,

During our tenure as WUPR members and Editors-in-Chief, we have struggled with the “Political” part of our name. In its narrowest interpretation, “political” implies elections and campaign finance reform, policy debates and Gallup polls, foreign embassies and free trade agreements—in short, a world of discourse rooted solidly in C-SPAN programming. Here at WUPR, we feel this definition is far too limiting; we prefer to approach the subject from a more expansive perspective. The word “politics” comes from the Greek politikos, meaning “pertaining to citizens.” If we apply this understanding to the Political Review, it is hard to imagine many topics that do not belong within the publication.

With this scope in mind, our theme for this issue is “What We’re Losing.” In an age of seemingly endless innovation, we must turn our focus to what is slipping away. Pieces of the world we’ve come to know over the course of our lifetimes are fading fast, some irrevocably. And while we may know that change is inevitable, it is easy to look around and feel anxious when long-familiar touchstones begin to disappear. We’ve left it up to our writers to weigh in on the implications of these losses, and they have tackled a wide array of subjects, exploring the potential loss of species, ideologies, customs, and more.

As we mentioned, to us, everything is politics. We chose this theme in the hopes of challenging popular conceptions of political topics, and we think our writers have done a fantastic job. Most of the articles in our theme section focus on issues far removed from the traditional notion of “politics” limited to legislation and policy. We hope you enjoy what they have to say and encourage you to join the conversation. Your opinions are always welcome.
Best,
Moira Moynihan
Will Dobbs-Allsopp
Table of Contents
How United are the Nations? Aryeh Mellman
The Days of Our Chilean Lives Adam Flores
Reinvention and Revival in the Conservative Caucus Jimmy Loomis
Show Me the Money Charlie Thau
Germany: The Hypocrisy of Republican Economic Policy Ari Moses
12 Looking Beyond the Turban Shivani Desai
A Medical Emergency: When States Opt Out of Providing Care Nathaniel Thomas
14 Is Texas Finally Feeling Blue? Alex Beaulieu
15 Hassan Rouhani: A Sheep or a Wolf?
16 Give ‘Em Hell Nina Hugh Dunkley, Jr.
17 What’s in a Name? Grace Portelance
20 Goodbye to the Porkpie: The Decline of American Hat Culture Gabriel Rubin
21 In God We Doubt Jared Turkus
22 How Polarized is America, Really? Govin Vatsan
23 The Plight of the Bees Moira Moynihan
24 The Power of Paper Danny Steinberg
26 Losing Hope and Losing Face Naomi Duru
27 The Beginning of the End of the Middle: An S.O.S. from the Middle Class Hannah Waldman
Staff List
Editors-in-Chief: William Dobbs-Allsopp, Moira Moynihan
Executive Director: Nicolas Hinsch
Staff Editors: Nahuel Fefer, Gabriel Rubin, Sonya Schoenberger
Features Editor: Aryeh Mellman
Director of Design: Michelle Nahmad
Asst. Director of Design: Alex Chiu
Managing Copy Editors: Stephen Rubino
Director of New Media: Raja Krishna
Programming Director: Hannah Waldman
Finance Director: Alexander Bluestone
Staff Writers: Maaz Ahmad Stephanie Aria Alex Beaulieu Vinita Chaudhry Aaron Christensen Henry Clements Benjamin Cristol Kaity Shea Cullen Aashka Dalal Gabe Davis Shivani Desai Kevin Deutsch Hugh Dunkley, Jr. Naomi Duru Patrick Easley Rahmi Elahjji Adam Flores Chris Gibson Matthew Hankin Hana Hartman Arian Jadbabaie Jack Krewson Miranda Kroeger Joe Lenoff Wallis Linker Martin Lockman Jimmy Loomis Billie Mandelbaum Brett Mead Ari Moses Henry Osman Grace Portelance Naomi Rawitz Andrew Ridker Jonathon Robeson Gabriel Rubin Razi Safi Victoria Sgarro Ari Spitzer Danny Steinberg Charlie Thau Nathaniel Thomas Jared Turkus Govin Vatsan Scott Witcher Camille Lynn Wright Megan Zielinski
Front and Inside Cover Photography: Kevin George Back Cover Illustration: Esther Hamburger Editorial Illustrators: Alex Chiu Margaret Flatley Simin Lim Katherine McCarter Gretchen Oldelm Zach Rouse Steph Waldo Board of Advisors: Robin Hattori Gephardt Institute for Public Service Professor Bill Lowry Political Science Department
The Washington University Political Review is a student organization committed to fostering awareness of political issues. We shall remain dedicated to providing friendly and open avenues of discussion for students, irrespective of political affiliation or ideology. Submissions: email@example.com
How United are the Nations? Aryeh Mellman | Illustration by Katherine McCarter
The United Nations (UN), like its predecessor, the League of Nations, was born out of the ashes of a world war. With the advent of atomic weapons, a third world war could have been apocalyptic. And so, in a quest to quite literally save the world, the founding countries of the UN strove to create a body where the nations of the world could comfortably assemble and deliberate in order to promote global peace and security. Naturally, the UN took on additional functions as time went on, and today, aside from worldwide stability, its purposes include “helping nations work together to improve the lives of poor people, to conquer hunger, disease, and illiteracy, and to encourage respect for each other’s rights and freedoms.” The UN expanded its charge from merely keeping the world safe to improving it. This is an honorable resolution, to be sure, but the UN’s attempts to achieve such noble aspirations are often bizarrely misguided and hypocritical.

It should be noted that the UN has achieved important successes in terms of its primary purpose, global stability. The International Atomic Energy Agency (IAEA), a UN organization, has made important contributions by regulating and inspecting nuclear arsenals, thus helping prevent a nuclear world war. Yet many other supposedly peacemaking actions of the UN seem decidedly wrongheaded.

The UN Commission on Human Rights (UNCHR) should have been the apotheosis of the UN’s moral agenda: a body committed to seeking out and stamping out violations of human rights, as expressed in the Universal Declaration of Human Rights. One would think that this band of moral crusaders would have been committed to the highest ideals of the sanctity of human life, dignity, and equality before the law; but the presence of notorious human rights abusers like China and Sudan among its members marred the hopeful visage of this virtuous commission.
Libya, under the brutal dictatorship of Muammar Gaddafi, was even selected to chair the commission. The presence of such nations on the commission flabbergasted most logical observers. A commission dedicated to protecting human rights placed some of the world’s most notorious violators among its leaders. Instead of prosecuting human rights violations, these infamous commission members often used their privileged positions to deflect attention from their countries and continued their human rights violations free from scrutiny. The commission was abolished in 2006 and a new body, the UN
Human Rights Council (UNHRC), which operated by a new member voting procedure, replaced the hypocritical UNCHR. Finally, true defenders of human rights would be placed on the council to safeguard our inalienable rights. Well, that was the idea. In practice, the UNHRC has been just as unsuccessful as the UNCHR at preventing human rights abusers from joining its ranks. Past, current, and future members of the UNHRC include China (yes, again), Qatar, and pre-revolution Egypt, the last of which was described by Amnesty International as a “torture centre” at the time. In an ineffective move, the UN went through what was undoubtedly months of bureaucratic red tape and spent mounds of cash in order to replace an old body with a new one that doesn’t differ in any significant way. Even in the area of global stability, its raison d’être, the UN has offered baffling solutions to serious problems. The nuclear disarmament committee, established to tackle one of the primary goals of the UN’s charter, has seen singularly unqualified candidates appointed. Iraq was selected to chair the prestigious annual Conference on Disarmament in 2003, a time when it was suspected of having WMDs
and was refusing to cooperate with nuclear inspectors. Just recently, Iran, currently under sanctions from both the US and the UN Security Council itself for its uncooperative stance on nuclear weapons, was given a senior seat on the UN Nuclear Disarmament Committee.

Much of the controversy over these appointments comes down to philosophical differences. The UN believes that including countries that abuse human rights in international conversations will somehow induce them into a more clement approach, while the prevailing opinion in the United States is that these appointments only serve to insulate these countries from criticism. Yet despite these theoretical disagreements, the UN remains a crucial tool in protecting global stability. While it may fall short in its secondary goals, it has (so far) achieved its stated purpose: preventing a third world war. While the circle of dictators it places on human rights and nuclear disarmament committees may remain puzzling, no conflict has since ripped apart the globe in quite the way the first two world wars did. Our world is an inherently unstable place, and the UN justifies its existence by making this planet even a tiny bit more stable.

Aryeh Mellman is a sophomore in the College of Arts & Sciences. He can be reached at Aryeh.firstname.lastname@example.org.
The Days of Our Chilean Lives Adam Flores | Illustration by Steph Waldo
Michelle and Evelyn, childhood friends, know each other well. Their fathers worked closely as Air Force generals more than four decades ago. But questions about the death of Michelle’s father continue to swirl around the two families. Was Evelyn’s father complicit in the torture and death of Michelle’s father? The two women will find themselves facing off once again, this time in the first round of Chile’s presidential election on November 17.

Michelle Bachelet of the left-leaning Socialist Party and Nueva Mayoría coalition and Evelyn Matthei of the center-right Independent Democratic Union and Alianza coalition have been battling for months over the Chilean electorate. Bachelet has a natural advantage, as she was president of the Andean country from 2006 to 2010, while Matthei is a former senator and Minister of Labor under the current president, Sebastián Piñera.

In 1970, Salvador Allende became the first democratically elected Marxist president of any Latin American country. Michelle Bachelet’s father, Alberto Bachelet, supported the Allende presidency. After three years under President Allende, however, General Augusto Pinochet led a CIA-backed coup d’état that toppled Allende and installed Pinochet as the military dictator of Chile. His regime was backed by Matthei’s father, Fernando Matthei. The elder Bachelet was imprisoned and tortured for opposing the coup; he died from the effects of his imprisonment. Many have questioned Fernando Matthei’s role in General Bachelet’s death, given that Matthei led the military installation in which Bachelet was held.

Under the Pinochet regime, Chile underwent numerous social, political, and economic transformations. The Pinochet government was responsible for killing or “disappearing” some 3,000 people (socialists, university students, and others), and for the torture of thousands more.
Additionally, Pinochet implemented neoliberal economic reforms promoted by Milton Friedman of the University of Chicago. While these reforms fostered national economic stability, they also resulted in extreme internal socio-economic inequality. The dictatorship divided the
Chilean public for decades, including Michelle and Evelyn. During the dictatorship, Michelle Bachelet worked covertly to transmit messages among socialist organizations and was herself imprisoned and tortured as a result. Conversely, the younger Matthei supported Pinochet during the 1988 plebiscite that determined whether he would remain in power or the country would return to a more open democratic system. Chile finally transitioned back to a democracy in 1990.

Years after the end of the Pinochet regime, the effects of his governance still reverberate throughout the Chilean political and economic systems. His economic and social reforms – such as a ban on all abortions and a market-driven education system – remain in place. And his political influence ensured that he avoided the judicial system for his alleged crimes against humanity until a Spanish judge ordered his arrest in London in 1998. Although young Chileans are less concerned with the Pinochet dictatorship than their elders (just as the Cold War paradigm is no longer relevant to younger voters in the United States), his time in power remains, seven years after his death, one of the most divisive subjects among the Chilean public. That notwithstanding, two months after the country commemorated the September 11 coup that toppled Allende
in 1973, the Chilean people will vote for a new president. This commemoration brings to the forefront the atrocities of a dictatorship in a year when two women on opposite sides of that regime vie for the nation’s highest office. In a changing Chilean society increasingly distancing itself from the 17-year rule of Pinochet, the two leading candidates represent a transition that defines the country. Both women serve as reminders of the nation’s uncomfortable past, but they also point to a future that is defined not by dictatorship, but rather by hope for new possibilities. Matthei promises to preserve the economic growth that accelerated under President Piñera, a fellow center-right leader. Former President Bachelet promotes a plan of social justice that includes free public university education, greater equality for same-sex couples, and increased socio-economic equality. Yet despite their differences in vision, the two leading presidential candidates represent a shift from a past that has defined Chile for too many years.
Adam Flores is a senior in the College of Arts & Sciences. He can be reached at adam. email@example.com.
Revival and Reinvention in the Conservative Caucus Jimmy Loomis
An all-star lineup of prominent conservative political leaders and activists convened on September 28 at the St. Charles Convention Center to kick off the pinnacle of conservative ideological gatherings: the Conservative Political Action Conference (CPAC). CPAC is an annual forum sponsored by the nation’s leading conservative political body, the American Conservative Union (ACU). Founded in 1964, the ACU is the United States’ oldest and largest grassroots conservative group. Local, state, and national speakers and panelists issued a “call to action” to an audience buzzing with a shared enthusiasm for a tidal wave of conservative change in political direction.

The crisp fall morning began with an invocation and prayer. Talk radio host Dana Loesch greeted the estimated 300 conservative patriots who had traveled from all across the United States. Al Cardenas, Chairman of the ACU, presented a brief overview of the program to come.

The first to speak was Senator Mike Lee (R-UT). Known for his youthful vigor and his leadership in the Tea Party movement, Senator Lee received a standing ovation as he took the stage. Comparing the federal government to a dictatorship, he criticized the policies of the Obama administration and the Democratic Party and stressed the importance of defeating the Affordable Care Act (Obamacare). In closing, he pledged to continue the conservatives’ fight in Washington, reassuring the faithful: “Fear not—the American people will always have the final word...”

The daylong program continued until 6:00 p.m. and featured an impressive array of local, state, and national figures, including Rick Perry, Governor of Texas; Rick Santorum, former US Senator; Elbert Guillory, State Senator from Louisiana; and Sam Brownback, Governor of Kansas, along with members of renowned conservative think tanks, political organizations, and traditionally conservative media outlets.
Participants, diverse in their backgrounds and experience, spoke about the issues facing the country: US involvement in foreign combat, education, the dismal state of the economy, Obamacare, and the moral decay of society. Other speakers participated in panel discussions on political hot topics, such as healthcare, immigration, climate change, and IRS targeting of conservative individuals and groups.

Democrat-turned-Republican Elbert Guillory, the self-proclaimed “first black senator since Reconstruction” from Louisiana, implored conservatives to heed the sense of urgency for reform in America. He likened today’s Democratic Party and politics to “termites” attacking the moral foundation of our nation, and accused the “head termite” living in the White House of “destroying the fiscal integrity of our great nation.” Emphatically, he claimed that the demise of morality and the decline of the traditional two-parent family structure, rather than guns or weapons, are responsible for the “culture of killing” that now reigns over America; the theory is that children who are exposed to violence in the streets and in the media from a very young age grow up believing more violence is the appropriate response to almost any negative situation. In no uncertain terms, Guillory denounced homosexuality, stating that “the basic
foundational or family unit consists of one man and one woman.” To bolster his opinion, he told the story of creation, noting that when God took Adam’s rib to create a partner, God did not awaken Adam to say “Adam, here is Leroy,” and the story of the flood, noting that God did not tell Noah to pick any two of every species, but rather a male and a female.

The program featured several of the “10 Under 40,” rising young conservative stars to watch, each younger than 40 years old, such as Clarice Navarro (Colorado state representative) and Casey Guernsey (Missouri state representative). Ms. Navarro-Ratzloff, a Hispanic state representative from Colorado, used herself to illustrate the changing face of the conservative movement. Telling the crowd of her working-class, one-parent upbringing, she spoke of the broadening appeal of the conservative ideology to traditionally liberal and Democratic demographics.

Possibly the most anticipated speaker of the day was Texas Governor Rick Perry. Driving home the party’s central theme of less federal government involvement, Perry told the crowd that “[t]he answer to our economic ills will not be found in Washington, but rather among the states.” Governor Perry held up the booming Texas economy as living proof of the success of conservative policies. Texas, he boasted, has balanced its budgets, lowered taxes, created jobs, and, most importantly, controlled spending. In stark contrast, he decried the present administration’s domestic and foreign policy failures, stating, “Long before our president presided over the downgrading of our credit, he had downgraded our standing in the world.
He alienated Israel, he emboldened Iran, he muddled the foreign policy that we saw in the Arab Spring, and his latest gamble in Syria is a demonstration of weakness in a world that needs a strong America.” As many of the other speakers did, Perry also invoked the name of Ronald Reagan as he called for a return to the Reagan foreign policy of “peace through strength.”

Former Senator Rick Santorum’s speech emphasized the importance of Christian family values. The audience watched a trailer from Santorum’s current project, a film entitled “The Christmas Candle,” to be released at Christmastime. As Santorum remarked, it is the only film being released during the holidays about the true meaning of Christmas, “which is not about Santa Claus, reindeer, or gift giving.”

Lt. Col. Oliver North, former United States Marine Corps officer and White House advisor, proudly proclaimed the worldwide supremacy of the United States military, yet stated that recent foreign policy has made America weak, and appear weak in the eyes of our enemies. He showed footage of military operations, which featured a double amputee reaffirming his and the military’s commitment to continue to fight for Americans’ freedoms. North reminded the crowd that America remains “the home of the free, because of the brave.” During a closed press conference, this reporter had the opportunity to question Lt. Col. North on his view of the President’s handling of the most recent crisis in Syria, to which North retorted: “The administration has been making
it up as they go along, do[ing] absolutely nothing to prepare for ‘what-if.’”

Another highly anticipated speaker was Grover Norquist, a conservative libertarian Republican and President and Founder of Americans for Tax Reform (ATR). ATR is a taxpayer advocacy group Norquist founded in 1985 at the request of President Reagan. Norquist is best known for his “No Tax Pledge,” which commits signatories (elected officials) to vote against any new taxes or tax increases. Norquist succinctly summed up the ACU’s position vis-à-vis the federal government: “We want one thing from the government. We wish to be left alone…Don’t raise our taxes, leave our property rights alone, leave our Second Amendment rights alone.”

The enthusiastic crowd responded to speakers with clapping, cheering, and nods of approval throughout. However, although the conservatives appeared to present a united front, with their focused and relentless criticism of current policy, it was apparent that there are several factions within the movement: the establishment, whose party faithful have been loyal to the Republican Party for generations; the libertarian wing, bearing the brand of the
Paul family, which supports individual freedoms and property rights, as well as laissez-faire economics; and the Tea Partiers, perhaps the loudest fragment of the party. With Senator Ted Cruz (R-TX) at the forefront, the Tea Party’s beliefs center on strong affirmation or vehement condemnation of social issues and on governmental reform in fiscal policy, such as decreasing the national debt through reduced government spending and reshaping entitlement programs. Paradoxically, in the movement’s strength also lies its weakness, for as the conservative movement grows, the differences in ideology within the factions will yield a splintered Republican Party.

If the Republicans wish to successfully capture a majority in the Senate, maintain the current majority in the House, or achieve their ultimate goal of putting their nominee in the White House in 2016, they must first find a way to unify their own party. To do this, they must satisfy not only the needs of the often vociferous conservative and populist base, but also those of the less vocal “Silent Majority”; keep in mind that it was President Nixon’s ability to harness the power of this modest constituency that
propelled a shaky former Vice President into the Oval Office in the 1968 presidential election. Today, however, this large populace of moderate Republicans is leaving the ranks of the once-strong GOP in record numbers, hesitant to brand themselves as members of a party overtaken by political extremism. As this moderate element casts its ballots for independents, or a moderate Democrat here and there, the party loses valuable votes; it is almost as if the party is pushing these voters away, forcing them to cast a vote devoid of passion in an act of resignation brought on by the alienation of their once-strong political identity. Though not as dominant as in years past, the vein of populist American conservatism is clearly alive and well among the party base, and it will continue to be a significant force in shaping America’s political landscape.
Jimmy Loomis is a freshman in the College of Arts & Sciences. He can be reached at JLoomis@wustl.edu.
Advanced courses, internship opportunities, independent study, and the new minor in Religion and Politics equip Washington University students with the ability to identify, engage, and shape many of the most pressing issues in American public life. Visit us today to learn how coursework or a degree in Religion and Politics will contribute to your professional development and civic engagement.
Show Me the Money Charlie Thau
This past year, Texas A&M received $750 million in donations, and it was not because of its economics program, a dramatic increase in Rhodes scholars, or any other academic reason. Incredibly, this dramatic surge in contributions can be almost entirely traced to a single person: reigning Heisman Trophy winner Johnny Manziel. Johnny Football, as he is known, is absolutely electric on the football field, making spectacular, nearly magical things happen weekly, and is almost as compelling a character off of it. If he has made the university all this money, how much is he taking home? As it turns out, Johnny Manziel is not getting a dime.

Throughout the past two months, the debate over college athletes and compensation has resurfaced due to recent allegations that Manziel received at least $7,500 in exchange for autographs. In a perfect example of the NCAA’s hypocrisy, this witch hunt took place while its official merchandise website was selling Texas A&M jerseys with his number and the word “Football” on the back. The issue was quickly resolved, and Mr. Football was given an almost comical punishment: a suspension for the first half of A&M’s opening game against Rice.

This hypocrisy is not new. Back in the 1990s, the famous “Fab Five” at the University of Michigan revolutionized college basketball. These five players popularized much of modern basketball style, including baggy shorts, black socks, and custom sneakers. Michigan, of
course, saw the marketing opportunity and started selling apparel mirroring the players’ gear. The Fab Five, despite being directly responsible for the culture change and the resulting dramatic increase in sales, gained none of the profit. In fact, they lived in poverty and were forced to borrow money to afford basic necessities. Fab Five member Jalen Rose summed it up best when he said, “I felt like a professional athlete who wasn’t getting paid.”

Some, including ESPN’s Jay Bilas, have valiantly crusaded against the NCAA for leeching off of its student-athletes’ likenesses while the players receive no compensation. While the great “academic institutions” of this country have sat idly by and counted their cash for decades, the familiar cry of “pay the players!” is once again re-emerging. The solution is not as easy, however, as simply paying all student-athletes a flat fee. Clearly Johnny Manziel is more valuable to the school than the backup punter, so why pay them the same amount? There is no reason to. A better solution is a distinctly American one: the free market. Players should be compensated according to market value, meaning that they should be able to profit off of merchandise and advertising that use their likeness. Under this proposal, they would not receive a salary from the school aside from their scholarship (roughly valued at $200,000 over four years). For example, Johnny Manziel could sign
[Infographic by Wei Jin Weng: college football revenues, in millions, for the University of Texas at Austin, University of Michigan, University of Alabama, University of Notre Dame, Louisiana State University, University of Arkansas, University of Georgia, University of Florida, and University of Nebraska. Data taken from ESPN.com.]
as many autographs as he wanted and make as much money as the market would allow. This would enable the most popular players to make the most money from their successes.

Understandably, many might have concerns with this system, but there are very reasonable counters to the most pressing of these potential problems. First, there is the traditionalist view that “student athletes should be amateurs and we must preserve the tradition of amateurism in college sports.” Please, give me a break. It is the worst kept secret (maybe of all time) that college sports, particularly football and basketball, are simply feeding grounds for professional teams. Further, college football in particular is a billion-dollar industry, with schools such as the University of Texas making over $100 million in revenue off of its team last year, much of which simply goes back into the program. Some might counter by saying that the rest of that revenue goes towards funding other sports programs within the university, especially many women’s programs, which may actually lose money. The money can also be used to better a university’s academics. All of this may be true, but according to Forbes, Texas football made $77,917,481 in profit this past year. My guess is that this profit is still more than large enough to cover the cost of all the sports at the university, even if it were to take a hit from siphoning off merchandising revenues to athletes. Moreover, it is not like the academic reputations of any big football schools are dramatically improving—it hardly seems like that money is being spent on academic pursuits now. You know what expenditures that revenue does cover? The $5.2 million, which increases by $100,000 every year, paid to Texas Head Coach Mack Brown each year until 2020, as well as the $1,109,041 paid to Athletic Director DeLoss Dodds. Meanwhile, if a player can’t afford a hot meal and Mack Brown feels bad and buys him dinner, it’s an NCAA violation and the player will be suspended.

Another concern revolves around the notion that if you pay players of certain sports such as football and men’s basketball, then you would have to pay the student athletes in other sports, a claim that speaks to the anxieties of Title IX proponents. Unfortunately, whether for better or for worse, the sports that generate the most profit are men’s basketball and football. Under a pay-for-play system it would be hard to justify paying student athletes in any other sport. The system that I am proposing, however, operates outside of that concern, because any athlete can take advantage of the free market. If a company wanted to use former Baylor women’s basketball star Brittney Griner for an advertisement, it should be able to, just as a company should be able to contract with South Carolina’s Jadeveon Clowney. The market should dictate which athletes have the most financial success based on who can generate the most revenue for a specific corporation. Yes, the star quarterback at Michigan is most likely going to receive more advertisement money than somebody on the swimming team in any given year, but there is also much more demand for that quarterback.

The most valid concern, in my opinion, is the belief that if the free market were to be opened, the system would be tainted by boosters and agents who would intentionally pay enormous sums to the players through advertisements, autograph deals, and miscellaneous crooked ways of cashing in. But it is naïve to assume that this doesn’t happen already. For example, Oklahoma State recently found itself facing allegations of improper booster contributions, such as boosters distributing cash-filled envelopes and paying one player $400 to take a Christmas tree out of an attic. It seems that the most sensible solution to this problem would be to institute a rule barring players from profiting from boosters of their own school. If somebody who gives more than, say, $100,000 to a school annually is found to be essentially paying a player, then such an action would be treated as a violation of NCAA rules, the same way that Johnny Manziel getting paid $7,500 is today.

Another legitimate concern is that the free-market proposal might corrupt the recruiting process—players might well go to schools that put them in the best position to make money in college. Yet schools already try to outdo each other in methods that are as bad if not worse than simply highlighting for athletes how well-positioned they would be to earn a profit for their efforts. Look no further than Oregon’s $68 million space-age facilities, which contain, among other things, a barbershop and luxury lockers.

Letting college athletes profit from the free market is the smartest and most effective course of action. The players benefit according to their skill and marketability, and the universities benefit from increased attention to certain players. If the right precautions were taken, it would not corrupt the system (at the very least, not any more than it already is) and would draw even more fans to college sports. Lastly, and I think this is the most subtle aspect of this idea, it would encourage kids to stay in school. Players are commonly faced with the dilemma of staying in school for another year or making money in professional football or basketball to provide for their families. This system would allow many student athletes who wish to stay in school longer and get their degrees to do so while still making some money. At the very least, they will be no worse off than they are now. Hey, not everybody can be Johnny Football.
Charlie Thau is a freshman in the College of Arts & Sciences. He can be reached at firstname.lastname@example.org.
Germany: The Hypocrisy of Republican Economic Policy Ari Moses | Illustration by Alex Chiu
The Republican Party frequently condemns the Democratic Party for deficit spending, but what alternative economic policies does it have in mind? According to members of the party, Republicans want austerity measures and wish to cut financing for most social programs. Yet they praise the German economic system; Representative Joe Wilson extolled the "German (Economic) Miracle" on the House floor. However, do conservatives truly want an economic system remotely similar to Germany's social market economy? The current German government runs massive welfare programs as well as other social programs in order to ensure that its citizens have a set minimum quality of life. These social programs include the Kurzarbeit program, strong unions, and extensive vacation benefits. John McCain, on the other hand, recently proposed that citizens should "[work] a second job [and skip] a vacation," a sentiment that is wholly incompatible with Germany's social market economy.

Unemployment, Welfare, and Vacation Benefits

Prominent Republicans, such as Senator Mark Kirk and Senator Jeff Sessions, have both argued against expanding unemployment insurance in the United States. Yet in Germany, unemployment benefits are increasing. Both the employer and the employee pay the government for unemployment insurance, a stark contrast to the United States' method, where the tax liability falls solely upon the employer. The German government provides unemployment insurance for a longer duration while also maintaining a higher level of compensation. Germans receive a minimum of 60 percent of their previous wages, whereas in the United States, the rate of unemployment benefits is lower than 50 percent of previous wages. The same Republicans who shut down the United States government over an individual healthcare insurance mandate praise Germany, a country that enforces a mandate on all citizens to contribute to the national unemployment insurance fund.
In Germany, the unemployment rate has remained below 6 percent for the last few years, drawing admiration from the Republican Party. However, this achievement was accomplished because of the Kurzarbeit program, under which German workers accept reduced hours and lower pay for an agreed-upon amount of time. This program relies on the enormous power of unions in Germany to create agreements with employers, resulting in low unemployment. In a recession, for example, labor demand by firms in the United States declines, leading to layoffs and unemployment. Meanwhile, social programs in Germany grant more flexibility to employers, allowing them to reduce labor hours instead of enacting layoffs, generating little to no rise in unemployment. Similarly, the vacation benefits that laborers receive in Germany are much more generous than those of their American counterparts. A German worker is legally entitled to four weeks of paid vacation a year plus additional leave time, resulting in 20 percent lower annual working hours; American workers are not entitled to any paid vacation. Germany may have a lower GDP per capita and lower worker income than the United States, but German laborers have more benefits, which makes up for the difference.

Republican Ambition

Why, then, do Republicans praise the German economic model? In the years following the global financial crisis, the US economy has grown alongside the German economy. The Republican viewpoint asserts that the Democratic president and the entire Democratic Party have hurt the economy through deficit spending. Contrary to their argument, the massive stimulus package enacted by the president has achieved substantial GDP growth. In order to gain more congressional seats, the Republicans have been, for lack of better phrasing, misleading the American people through a misinterpretation of German economic success. The Republican view that Germany enacts only austerity policies is incorrect; in fact, its large social welfare programs can be interpreted as a stimulus package designed to incentivize employers to retain workers. The hypocrisy demonstrated by the Republican Party serves only to weaken and fracture the nation it swore to protect. Republicans claim to admire the German social market economy, yet when any legislation proposed in the United States even resembles that of the German system, conservative Republicans ardently refuse to support it.
The stubborn behavior of some in the Republican Party has blocked economic legislation to fund government social programs. This party no longer acts in favor of the American people; it acts counterproductively, hindering economic development instead of enhancing growth.
Ari Moses is a freshman in the Olin Business School. He can be reached at email@example.com.
Looking Beyond the Turban Shivani Desai | Illustration by Gretchen Oldelm
It is incredible how much power and meaning one piece of fabric can hold. For those who adhere to the Sikh religion, a turban (dastar) is a symbol of devotion and spirituality. It is a connection to the roots and history of the Sikh people and a way for them to portray their faith to the world. However, somewhere between this beautiful meaning and the American perception, something has splintered, leaving the "land of the free" deeply and inherently flawed. Imagine walking near Central Park one warm autumn evening, when suddenly you see a group of leering men and hear the words "Osama," "terrorist," and "get him." And by him, they mean you. Lying on the ground with a fractured jaw, bruises, and several other injuries, you feel helpless and shocked. This is not the community that you know. Lady Liberty's homeland has become a crossfire of hate crimes and intolerance. Now, picture this: you have been forced to appear in court because you encountered prejudiced police officers who took one look at you, wrote you a ticket, and called you depraved. To make matters worse, as you walk into the pristine courtroom—the embodiment of American justice—you hear the words, "take that rag off of your head or get out." As you exit, the words echo and crash repeatedly. This is not what freedom of religion means to you. Unfortunately, these bleak scenarios are not just "what if" situations. They are all too real. On September 21, 2013, Columbia University Professor Prabhjot Singh was brutally attacked by a group of men. On September 27, 2013, Jagjeet Singh was ordered to remove the "rag" from his head or leave the courtroom. These realities are sobering enough on their own, yet they are only indicative of a larger pattern of hate crimes, violence, and intolerance toward those of the Sikh faith.
In the last twelve years, news of ruthless violence against Sikhs has cropped up periodically in the headlines and captured the American consciousness, only for the issue to quickly disappear back into obscurity until the next hate crime occurs. On December 12, 2001, Surinder Singh Sidhi was beaten with a metal pole by two men who repeatedly screamed, "We'll kill Bin Laden today." On March 14, 2004, a Sikh temple was defaced with the message "Rags Go Home." And on August 5, 2012, a shooting occurred
at a Wisconsin Sikh temple, killing six, injuring more, and proving to be the worst attack on an American place of worship since the 1963 Sixteenth Street Baptist Church bombing in Birmingham. Since then, Sikhs have been murdered, assaulted, and injured on numerous occasions, all with similar sentiments hurled at
them. Surveys conducted by the Sikh Coalition have found that three out of four male Sikh students have faced scorn and harassment based on their religious identity. Nine percent of Sikhs in New York and ten percent in San Francisco have experienced hate crimes and violence on account of their religion. According to Kirtan-Singh Khalsa, a spokesperson for an international council on Sikh affairs, "Sikhs are accustomed to ridicule because of their turbans." Examining the data, it truly seems that the physical marker of the religion, the turban, lies at the heart of the problem. Many people seem to associate the turban with labels such as "terrorist" and "foreigner." The hateful see it and jump to assumptions and conclusions; they demonize the turban, unable to look beyond the object to the person underneath it. The truly ironic thing is that the Sikh religion rests on a central tenet of peace. Sikhism also relies on the ideas of naam japna, recognizing a uniting force between all human beings; vand chakna, giving back to others and the community; and kirat karni, earning an honest and just living. It is a beautiful religion that gives its followers strength, direction, and purpose. So why is there so much intolerance for it? Many people just do not know or truly understand Sikhism; they see something different, and it scares them. But that is not an excuse. The bottom line is this: if people act based on preconceived, incorrect notions, fear, and the appearance of a few yards of fabric, then we are not living up to the ideals of our founding. Freedom of religion is guaranteed to us under the First Amendment, but such written protection is meaningless if the people who should be upholding it are too busy committing injustices and perpetuating violence. A concerning pattern is emerging. Aggression towards Sikhs is occurring over and over again, rather than as a series of anomalous cases.
It is becoming more than an individual hate crime; it is becoming a culture of hate crimes. And America, the land of the free, America, the nation that embraces all diverse peoples, cannot afford to accept or tolerate a subculture of hatred. Shivani Desai is a freshman in the College of Arts & Sciences. She can be reached at firstname.lastname@example.org.
A Medical Emergency: When States Opt Out of Providing Care Nathaniel Thomas

This past May, the Republican-controlled legislature in Missouri voted to opt out of the federal government's upcoming Medicaid expansion. This decision is not unique to Missouri—25 other states have opted out as well, claiming that the increased tax burden of a Medicaid expansion outweighs its benefits. The majority of states that rejected Medicaid are poor, southern states with large numbers of people who need better coverage and who would benefit from the expansion most. In fact, the choice not to expand will preclude over half of the nation's low-wage workers and roughly two-thirds of poor blacks and single mothers from obtaining adequate health insurance. So why are the states that need Medicaid the most the ones opting out? The answer revolves around the conservative politicians who have been largely responsible for garnering opposition to Obamacare: essentially, they claim that an expansion of Medicaid would not be cost effective, that the increase in taxes would outweigh the benefit of providing millions with health insurance. This argument cropped up many times in the states' discussions about Medicaid, including Missouri's, where Republicans claim that coverage is generally good and that the tax burden of Medicaid would hurt poor families more than it would help them. In some states, this argument is almost justifiable. In Mississippi, for example, under the Medicaid expansion, one-third of the state's population would be eligible for coverage, according to The New York Times. Republicans claim that this would drastically increase taxes, which is true. However, for the first three years of the Medicaid expansion the federal government would fund it in its entirety, and after those three years, states would only have to fund ten percent. This ten percent, Republicans claim, is too much.

It is true that funding ten percent of Medicaid in states like Mississippi would require a hefty sum of money, but the benefits far outweigh the costs. The bulk of these benefits come in the form of federal money that goes to the previously uninsured. A study of fourteen states by the RAND Corporation, a think tank, found that by forgoing Medicaid expansion the states are effectively losing $8.4 billion of Medicaid funding over three years and assuming roughly $1 billion of costs for those who currently provide healthcare to the indigent. The same study estimates that the cost of expansion is less than the cost of assuming uncompensated care, which is the cost that hospitals must assume if their patrons do not have health insurance and cannot pay for their healthcare. This Medicaid expansion is a win-win for states because it increases medical coverage and reduces costs. In Missouri, for example, Medicaid would cover 800,000 additional people. But increased medical care would not only be beneficial to our surrounding area—it would help nationwide. Ultimately then, the claims of the politicians who attempt to stop the expansion of Medicaid make little sense because the economic and health benefits far outweigh the costs. Mainly, their objections to Medicaid are symbolic. They see the federal government's extension of Medicaid as fundamentally bad because it raises taxes and increases the size of the government. It is inevitable that expanding government-subsidized health insurance programs will increase the size of the government, so if politicians wish to make symbolic gestures or profess their disapproval of Obama, they should choose a different method. At the end of the day, sacrificing people's health insurance for a political agenda is not justifiable because the politicians in Washington aren't losing out—in the end, the losers are the poor who cannot afford health insurance, the constituents of the politicians who oppose expanding Medicaid.
Nathaniel Thomas is a freshman in the College of Arts & Sciences. He can be reached at email@example.com.
Is Texas Finally Feeling Blue? Alex Beaulieu | Illustration by Zach Rouse
No one said it was possible. No one believed anyone could accomplish such a feat. Even the rather liberal New York Times declared that "turning Texas blue - or even purple - is going to be a lot harder than most folks imagine." But has the time finally arrived? Can one state senator turn the Lone Star State blue and consequently transform Texan politics? Leading up to the 2014 gubernatorial election, Wendy Davis has a long road ahead of her. But numerous factors indicate she can do exactly that. Davis attained national recognition after her filibuster of a controversial anti-abortion bill. Sporting pink sneakers, Davis stood in front of the Texas State Senate for 11 hours voicing her concerns over a bill that would essentially eliminate most abortion clinics throughout Texas and prevent abortions altogether after the twentieth week of pregnancy. Although the bill eventually passed, it was too late: Wendy Davis had become a national star. When Governor and former presidential candidate Rick Perry announced he would not seek a fourth term, voices throughout the country cried out for a Davis candidacy. And although she originally appeared hesitant, Wendy Davis eventually announced her run for governor in early October. Although the odds are against her (no Democrat has occupied the Governor's Mansion since 1995), the state senator possesses favorable traits that will make this race interesting.

A common argument cited against Davis's electoral prospects is that her fame may be recognized nationally but does not resonate at the statewide level. But according to a poll conducted by Public Policy Polling, 68 percent of Texans have heard of the state senator. The same poll also found that 39 percent hold a favorable opinion of her, while 29 percent have an unfavorable opinion. With an entire year before Election Day, posting those kinds of numbers in a heavily conservative state is impressive. And the numerous endorsements from popular interest groups, such as EMILY's List, will certainly improve local name recognition. Also, Davis's likely contender, Texas Attorney General Greg Abbott, only leads the state senator by a margin of 8 percent, with half of Texan voters undecided. This is especially alarming for the relatively well-known Abbott considering Davis has been widely known for only a few months. Another observation skeptics highlight is Abbott's seemingly insuperable fundraising operation. Although the attorney general has already collected $20 million, Wendy Davis will likely continue to rake in big donations thanks to her impressive national recognition. Most likely, these two candidates will be fairly even in fundraising. In the past ten years, Rick Perry collected more than $100 million in contributions for his gubernatorial elections. So, having an equal financial footing will significantly boost Davis's chances compared to previous Democratic nominees, who simply could not match Perry.

Another factor favorable to Davis is Texas's changing demographics. With Hispanics continuing to immigrate to Texas, the demographics this election are expected to be quite different from those during the 2010 gubernatorial race. In fact, projections indicate that Hispanics may come close to equaling Caucasians by the 2014 election. This statistic is important because about 60 percent of Texas Hispanics vote Democratic. Moreover, throughout his political career, Abbott has alienated minorities through his controversial rhetoric. Earlier this year, Abbott claimed that "without voter fraud, Obamacare would not exist," essentially placing the blame on minorities, who, by voting fraudulently, were able to elect Democrats to pass Obamacare. Such bombast may not just increase the percentage of Hispanics voting Democratic, but could also galvanize many who otherwise would not have voted.

Although these previous factors are significant, the most important element in a possible Davis victory is the candidate herself. Few Texas Democrats have achieved a following comparable to Davis's so early in their political careers. With a Harvard law degree and extensive experience in the field, she clearly possesses the aptitude for the job. But more importantly, she epitomizes the roots upon which this country was founded: the American Dream. Growing up poor and living in a trailer park as a single mom at the age of 19, Davis has a story of endurance and resilience that has already inspired thousands.
Her story will continue to motivate and attract those who would not typically vote for a Democrat. With the right coalition, cogent messaging, and unrelenting determination, Wendy Davis will single-handedly turn Texas blue. Alex Beaulieu is a freshman in the Olin Business School. He can be reached at firstname.lastname@example.org.
Hassan Rouhani: A Sheep or a Wolf? Gabe Davis
When Israeli Prime Minister Binyamin Netanyahu took the stage to address the most recent UN General Assembly in New York, he was not a happy man. Iranian President Hassan Rouhani's friendly media blitz had inspired optimism regarding a more moderate Iran, but Netanyahu was intent on reminding us with whom exactly we were dealing. He lambasted the Iranian president, referring to his recent jovial tactics as a "charm offensive" that should be regarded as "a ruse." Netanyahu proceeded to claim, "I wish I could believe Rouhani… [but I don't]." Aware that Rouhani's smile-filled interviews and catchy tweets were distinguishing him from former President Mahmoud Ahmadinejad, his inflammatory predecessor, the Israeli PM stated that "the only difference between [Ahmadinejad and Rouhani] is this: Ahmadinejad was a wolf in wolf's clothing, Rouhani is a wolf in sheep's clothing." Though Rouhani insisted in his own speech that Iran simply desired a peaceful nuclear-energy program, Netanyahu bluntly concluded, "Iran is developing nuclear weapons," and urged President Obama not to soften economic sanctions against Iran, but rather to tighten them. Many criticized Netanyahu's speech, arguing that it had an air of unwarranted desperation and negativity that undermined the potential for diplomatic efforts. The fact is, no one appreciates a killjoy. It is no surprise that the prospect of a levelheaded, in-touch, moderate Iranian president caused such optimism. One can picture Obama's mouth watering as he envisions a legacy-defining deal that stifles Iran's nuclear weapon prospects while improving US-Iranian relations. Perhaps it was this vision that enticed Obama to break the United States' thirty-year hiatus in communications with Iran when he reached out to Rouhani by phone and engaged in a cordial and positive conversation with him.
Though it is easy to eat up Rouhani's tantalizing rhetoric, and though we want so badly to see a stable, democratic Iran, Netanyahu's angry warning is more than warranted, both in tone and content. We cannot forget that only a year ago, former President Mahmoud Ahmadinejad infamously urged world forces to annihilate Israel. Rouhani was voted into office by an electorate calling for a more moderate Iran, but the current supreme leader of Iran, Ayatollah Khamenei, held the same position throughout Ahmadinejad's hostile rule. The Ayatollah holds a higher office than the president, and their relationship is somewhat murky. We must be wary of the Ayatollah's own agenda working against Rouhani's, as he too has a stake in Iran's policy decisions and has yet to show how moderate he will be with a new president. Considering the uncertainty of the situation, simply talking about moderation should not warrant the amount of public optimism from President Obama that it has. This seems especially true when we consider that the economic sanctions the United States has in place have been crippling for Iran, making life nearly unbearable for many of its citizens. According to a poll conducted by Gallup, 85 percent of Iranians say that the sanctions have hurt Iranian citizens, and 31 percent claim they have personally suffered as a result of sanctions. Clearly, Iran has little leverage; as long as the sanctions remain in place, the country and its citizens remain in economic shackles. Israel's PM would argue that with Iran's back against the wall, now is the time to push Rouhani to make substantive policy changes, such as an agreement to decrease Iran's uranium-enriching capabilities. Unlike Obama and other world leaders, Netanyahu cannot afford to turn immediately towards diplomacy. If Iran obtains a nuclear weapon, it is Israel that most fears the threat of instant annihilation. Netanyahu likely desires a diplomatic resolution somewhere down the road, but he must maintain Israel's hawkish tone until Iran proves itself truly moderate. With so much at stake, Netanyahu does not mind being the spoiler of any fuzzy diplomatic feelings. If the Israeli leader aimed to make Obama and other world leaders consider whether Rouhani was crafting an illusion of peace, he certainly succeeded. After Netanyahu's speech, Obama met with the Israeli PM and reassured him that the United States has an "unshakeable bond" with the Israeli people, adding that he would "take no options off the table, including military action," with regard to the Iranian nuclear question. Netanyahu's speech was a necessary reminder that we are still dealing with Iran. We cannot forget that though Rouhani may represent a more moderate Iran, he will have to do a lot more than smile and wave before Israel even discusses alternatives to sanctions and military action. Real action is needed.
If Rouhani genuinely cares for his people and has no interest in a nuclear weapon, then an agreement in which Iran agrees to decrease its uranium enrichment capabilities to 20 percent, the level necessary for energy production, should be entirely realistic. Until action is taken, however, President Obama should be as hesitant as Prime Minister Netanyahu in trusting President Rouhani. Only time will tell if the new Iranian leader is a sheep or a wolf. Gabe Davis is a freshman in the College of Arts & Sciences. He can be reached at email@example.com.
Give ‘Em Hell Nina Hugh Dunkley, Jr.
Though most of America is enjoying its 2013 electoral off year, some will be looking forward to next year and the full slate of federal and state elections in 2014. Intriguing races, from Senate Republican Leader Mitch McConnell's reelection fight to Texas gubernatorial candidate Wendy Davis's campaign, make 2014 a year to watch. But instead of focusing on McConnell, Davis, or any other nationally known politicians, it would be prudent to look out for the woman who could be one of the biggest game changers in 2014: Ohio secretary of state candidate and current state senator Nina Turner. Turner could easily be compared to another motivational Midwesterner: brassy, bold, and swaggering President Harry Truman. With a similarly combative speaking style that could be summed up by the legendary populist saying "raise less corn and more hell," Turner, like Truman, seeks both to aid her increasingly minority voter base and to serve as a voice for the working class. In the same way Truman was able to become the patron saint of Missouri and to energize the Midwest, Nina Turner is now attempting to energize her home city of Cleveland and her state of Ohio. Unlike Truman, though, who focused on the "Do Nothing" 80th US Congress, Turner focuses on current Ohio Secretary of State Jon Husted, whom she deems the "secretary of suppression." Turner, with multiple guest appearances on MSNBC and endorsements from both the Ohio Democratic Party and Democratic National Committee Chairwoman Debbie Wasserman Schultz, is an obvious candidate for Democrats to highlight in 2014. Further, a 2014 victory for Nina Turner would be reminiscent of Harry Truman's come-from-behind victory in the 1948 presidential race. If elected, Nina Turner would become the first black Democrat to hold statewide office in Ohio. Currently, all six statewide offices are held by Republicans.
Similarly, just as Nina Turner has to compete next year within a Republican-controlled state government, so too did Harry Truman in his 1948 reelection bid have to contend with his Republican opponent, Thomas E. Dewey, while facing a Congress where both houses were dominated by Republicans. And, while the position that Turner is vying for may, at the outset, make this race seem less important to national politics than other 2014 races, we have seen several cases where clearing up contested votes and voting laws in Ohio has made a difference. A recent example of the office's importance came in early 2012, when Secretary of State Jon Husted shortened voting hours and strengthened voter ID laws,
no doubt influencing the outcome of the elections that day. Husted's previous decisions may be just enough to put Turner over the top next November. If Turner, like Truman, uses her rhetorical skills to exploit the chinks in her opponent's armor, continually highlighting Husted's voting-law changes while simultaneously catering to working- and middle-class voters, she could prevail. In addition to benefits for her party at the national and state levels, Turner could also use the position to help her gain a greater office in Ohio politics. While a Senate or gubernatorial bid for Nina Turner may be years away, let's look at the facts as they currently stand: 1) Current Ohio Senator Sherrod Brown served as Ohio's secretary of state before reaching Congress, and 2) assuming that Governor John Kasich is reelected in 2014, his term ends in 2018. In 2018, Ohio will again be a bellwether state despite eight straight years of Republican governance. If Nina Turner, with her rambunctious spirit, polished speaking skills, and colorful personality, ran for governor after serving a term as secretary of state, she could have a plausible chance at being elected. So, what campaign strategies could Nina Turner use to win her 2014 election? For starters, Turner could implement Barack Obama's successful strategy from the 2012 election: win big in the swing counties of Franklin and Cuyahoga, while finding some way to wrest control of Hamilton County from its current Republican establishment. Nina Turner may be the underdog throughout her entire election, facing an obstructive Republican political establishment. But if Turner, like Harry Truman, can successfully rile up her base while attracting Ohio's independent voters, then she may still be victorious in the long run.
Hugh Dunkley, Jr. is a freshman in the College of Arts & Sciences. He can be reached at firstname.lastname@example.org.
What’s in a Name? Grace Portelance
Think back to the last election you voted in: it may have been for student government; it may have been for your state government. Odds are, you found yourself staring at two or more unfamiliar names, and you made an arbitrary choice, picking the candidate who “seemed the best.” Unless you closely followed every race on the ballot, you had no information whatsoever about the candidates and made a snap judgment. When little other information is available, our subconscious biases can come out in ways that seriously alter political outcomes. Hundreds of elections are decided by voters with less than perfect information, yet something informs their choices. Is it the perceived race of the candidates? Consider the 2012 Washington State Supreme Court elections. In this race a well-qualified incumbent, Steve Gonzalez, was heavily endorsed for re-election by state officials and the legal community because of his exemplary work and extensive experience. His opponent, Bruce Danielson, did not campaign and was widely seen as extremely unqualified. The majority of voters likely knew only the names of the candidates; no information was distributed officially, and the race was relatively low-profile. Given all this, one would assume that a race so lopsided in qualifications would yield a clear result for Gonzalez. Shockingly, 30 out of 39 Washington counties voted for Danielson, who garnered 42 percent of the vote. So what’s in a name? That an unqualified unknown with an Anglo name could come so close to beating an exceptionally qualified incumbent with a Latino name is troubling, to say the least. The effects of a name are not lost on candidates. Loretta Brixey, a California politician, upon losing a local race as a Republican, switched parties and names to become Democrat Loretta Sanchez (her maiden name) and won an election to represent a heavily Latino part of the state. These kinds of changes are not made arbitrarily; in contemporary politics, names are a tool to be manipulated just as appearance, diction, and message are.
Our judgments based on names extend far beyond simple race comparisons. Presidential candidate Hillary Clinton frequently used name changes to her advantage: while she began her career in the public eye as Hillary Rodham, she altered her name to fit her new roles as her public profile changed. When her husband ran for president, she became Mrs. Bill Clinton, a family woman who could fill the traditional role of First Lady. This simple adoption of her husband’s name has cultural meaning that resonates with voters who have a certain image in mind of whom they want in that role. Clinton’s name games continued when Mrs. Bill Clinton became Hillary Rodham Clinton as she gained more political power; by keeping both names she projected both a family persona and an independent, feminist attitude. These changes are considerable when you realize that Hillary initially wished to keep her maiden name (Rodham) because it “showed I was still me.” It appears sense of self is not what sells in politics today.
By Andrew Teman [CC BY-NC 2.0]
Name perceptions can be difficult to shake. President Obama’s middle name, Hussein, while an extremely common Arabic name, also carries a heavy political load because of recent conflict in its region of origin. The idea that people may feel that simply because our president shares a name with Saddam Hussein he is unfit to be president, anti-American, or even a terrorist speaks volumes. On a similar (yet lighter) note, former presidential candidate Mike Huckabee essentially became the butt of every late-night joke during his brief run because of his allegedly ridiculous-sounding name. According to Conan O’Brien, “[T]he only way it could be worse is if his name was George W. Huckabee.” Voters look for strength in a leader, so a weak and silly-sounding name can be very damaging; after all, a candidate who is ridiculed by popular media figures hardly projects a leaderlike image to the nation. Names can be molded like any other facet of political life to achieve an intended image, yet the negative implications of name politics are severe. Whether a name is criticized for sounding ethnic, weak, or anti-American, a whole lot of judgments are being made on an arbitrary and inaccurate basis. So what’s really in a name? Names hold our heritage and our history. They cannot adequately convey someone’s ability to represent his or her constituents, and they cannot be used to decipher values or beliefs. Name politics seems to be part of a larger trend of identity politics. In times of intense media scrutiny and mass misinformation, the focus has shifted, and those looking to be elected have reacted in kind, zeroing in on how public persona can be changed to help election outcomes. Politicians do this because it works. At the end of the day, a politician’s defeat or success could hinge on his phrasing something the right way, wearing the right outfit, conveying the right sense of authority. Or, perhaps, it could hinge on his having the right name.
Grace Portelance is a freshman in the School of Arts & Sciences. She can be reached at email@example.com.
What We’re Losing
Goodbye to the Porkpie: The Decline of American Hat Culture Gabe Rubin | Illustration by Alex Chiu
John F. Kennedy showed up at his inauguration wearing a fedora. But sometime between his arrival at the Capitol and “ask what you can do for your country,” he left the hat by the wayside: a new-generation president discarding years of tradition and announcing a dark new day for American headgear. At least, that’s the simplistic answer for why American men stopped wearing hats. But like so much else about the Camelot myth, Kennedy’s influence on the decline of the fedora (and its younger sibling, the slightly smaller Trilby) is far overblown. The winds of change that swept the nation during the 1960s blew off plenty of hats in the process, and Kennedy had nothing to do with it.
The fedora and the Trilby revolutionized American style around the turn of the century, with nearly every man wearing one or the other by the 1920s. Photos from public gatherings in the first half of the century invariably show men clad in the gray or brown brimmed hats. Though men in hats were ubiquitous, the images of certain men in hats became the indelible images of the era. Frank Sinatra cocked his Trilby back at an angle; saxophonist Lester Young pulled his porkpie low; Al Capone’s ravenous eyes glared out from under his fedora. J. Robert Oppenheimer, the physicist who led the Manhattan Project, became so identified with his porkpie that when Physics Today put him on its cover in 1948, it simply used a photograph of his hat resting on a pressure valve. But even beyond gangsters and musicians, few men would leave their homes with their heads uncovered. The prevailing style of the era dictated formality, and no man was ever fully dressed without a jacket, a tie, and a hat.
Even in the depths of the Great Depression this uniform prevailed. Historian Robert Caro recounts how chronically unemployed men in the Depression dug ditches for the Civil Works Administration, breaking their backs through New York winters. They wore the only clothes they owned: suits and hats. Caro describes the sight of the men as both comical and profoundly heart-wrenching; the suits and hats represented a dwindling hope for dignified work that was nowhere to be found.
Following the war, returning GIs exchanged their helmets for fedoras. Many American cities in the 1950s reached their all-time population peaks while the roaring US export economy supported millions of industrial jobs in urban areas. But automobile purchases also soared during the ’50s, and the construction of interstate highways and the passage of suburb-friendly federal housing policies began a marked change for inner cities. Commuting to work became increasingly commonplace. Though popular images of the time would seem to suggest otherwise, the ’50s car boom signaled the coming decline of American hat culture. Men wore hats on their way to work largely because they were outdoors (walking or taking mass transit) and in public. They didn’t need to wear hats in self-contained automobiles. And if they neglected to don a Trilby in the morning, they were unlikely to wear one later in the day, either.
The counter-culture of the 1960s finished what car culture had started. Vietnam made short hair a symbol of the militaristic American establishment, so innumerable young men chose to let their freak flags fly. The sullen upper echelon of Washington continued to wear hats, and the young men at risk of being sent to their deaths had no interest in mimicking the establishment’s sense of style. (Ironically, Defense Secretary Robert McNamara, one of the chief architects of the Vietnam quagmire, had also been a top executive at the Ford Motor Company; so forget Kennedy, McNamara clearly wanted to kill the hat.) Beyond the war, the eruption of ethnic pride in the late ’60s further splintered American popular style. Groups like the Black Panther Party and the Nation of Islam had their own sense of style, wearing black berets and bowties as their respective trademarks. Hair became a powerful symbol of self-expression as well, and few have ever managed to wear a fedora over an Afro. The brimmed hat had become a symbol of white patriarchy, and its rejection became inevitable.
By the 1970s, the fedora and its cousins had gone into deep hibernation. In recent years they have begun to reemerge, largely due to the popularity of gangster movies and the importance of irony in hipster culture. Like the mustache and coke-bottle glasses, the hat is a former symbol of white cultural hegemony that hipsters can’t seem to resist using to show their individuality. Yet the hat won’t return to its perch atop the collective heads of American men (car culture is too entrenched), and the slide toward informality since the end of the 1950s has only continued. The fedora died in the ’60s. Just don’t give Kennedy the credit.
Gabriel Rubin is a junior in the College of Arts & Sciences. He can be reached at firstname.lastname@example.org.
In God We Doubt Jared Turkus
“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.” While the United States was founded on this twofold principle of freedom of religion and freedom from religion, that principle did not keep the religious right from gaining influence over public policy during the early years of the Cold War. “Under God” was added to the Pledge of Allegiance in 1954, and “In God We Trust” became the official US motto in 1956 and has appeared on Treasury notes since 1957. In the 1950s, this trend towards religiosity represented a strategic response to Soviet secularism; six decades later, however, and nearly a quarter century after the fall of the Soviet Union, religion continues to pervade our national politics. Self-described atheists hardly ever run for Congress, none has ever actively sought the White House, and the current climate hardly encourages a break from this pattern. According to a 2012 Rasmussen poll, 48 percent of voters rank a candidate’s faith (or lack thereof) as “important to their vote.” Given that 64 percent of Americans believe in Jesus Christ’s resurrection, according to another Rasmussen poll, professions of atheism or agnosticism would put most candidates at a distinct political disadvantage. Many candidates even avoid the topic of evolution, as nearly half of Americans claim belief in intelligent design. Judeo-Christian norms and institutions are deeply entrenched at the state and national levels. The religious right has successfully kept gay conversion ‘therapy’ legal in forty-eight states and same-sex marriage illegal in thirty-six. The oath of office for the Vice President, members of the House, Senate, and executive Cabinet, and all military officers and federal employees except the President often concludes with the words “So help me God.” Congress has made a mockery of our First Amendment and violated its mandate by establishing a legislative framework that favors Judeo-Christian institutions.
Somewhat paradoxically, despite this religious framework, cultural attitudes towards abortion, same-sex marriage, women’s equality, and the role of science in classrooms are moving rapidly to the left. According to Pew Research, 16.1 percent of Americans now list their religious preference as ‘none,’ an all-time high. This growing secularity is starting to influence public policy outcomes at both the national and state levels. According to a CBS News poll, 55 percent of Americans now support legalizing same-sex marriage. Earlier this year, the Supreme Court struck down a key provision of the Defense of Marriage Act, allowing gay couples to access federal tax benefits available to heterosexual couples. All fourteen states that have legalized gay marriage did so in the last nine years. Similar liberalization has occurred in the realm of women’s rights. According to a recent CNN/ORC poll, 79 percent of Americans believe abortion should be legal in all or some circumstances. As more women began to enter the workforce and traditional family values began to erode, issues of gender discrimination rose to national prominence. The gender gap in wages (the Institute for Women’s Policy Research estimates that women earn seventy-seven cents for every dollar men earn) stems from the domestic norms of a patriarchal family structure that finds its roots in religion, but politicians are taking steps to address this disparity. In 2009, President Obama signed the Lilly Ledbetter Fair Pay Act, which made it easier for workers to challenge gender-based pay discrimination.
The influence of intelligent design in education is also waning. The 2005 Dover ruling banned teaching intelligent design in Pennsylvania’s public school science classrooms. Gallup found that 61 percent of Americans now believe evolution should be taught in public schools, compared to 54 percent who believe schools should teach creationism. While these data point to an overlap between people who believe in evolution and intelligent design, they also show that
By Kevin Dooley [CC BY 2.0]
Americans are still very much divided over the appropriate role of religion in the classroom. According to Pew Research, 25 percent of youths today identify as religiously unaffiliated, the highest percentage of any age group; teaching evolution as the primary theory of biological development is likely partially responsible for the trend towards secularity. If the rise of atheism and agnosticism continues, perhaps belief in God will no longer be a prerequisite for public office. Perhaps the word “God” will disappear from our currency, national anthem, oath of public office, and national motto. While religion maintains a strong grip on the American legal and political system, new legislation surrounding abortion, same-sex marriage, and women in the workforce, along with evolving attitudes on the role of science in public education, suggests that this hold is slipping away.
Jared Turkus is a junior in the College of Arts & Sciences. He can be reached at email@example.com.
How Polarized is America, Really? Govin Vatsan | Illustration by Margaret Flately
We constantly hear that the United States is becoming too “polarized.” The news media point to increasing and uncompromising partisanship as the defining character of the American political experience, and the basis of a “Culture War.” But while pundits fixate on polarization as the root of our nation’s political ills, such sweeping arguments are built upon misconceptions—and misrepresentations—of our nation’s political landscape. In some respects, Americans are indeed becoming more polarized. According to a 2012 Gallup poll, the percentages of Americans identifying as liberal and as conservative both jumped four points between 1992 and 2011, while the percentage identifying as moderate dropped eight points over the same period. During this two-decade window, conservatives overtook moderates to become the largest ideological group in the United States, comprising 40 percent of the population. But while these numbers demonstrate a trend towards increased polarity in our political system, an analysis of the breakdown of opinions on specific issues reveals that we may be oversimplifying this trend. Let’s break down the numbers. According to a 2013 Gallup poll, the percentage of self-identifying economic conservatives has actually decreased by two points over the past decade, while the percentage of economic liberals has increased by the same amount. The perceived moral acceptability of most social issues has increased over the last 12 years, notably for gay rights (a 19-point increase), births out of marriage (15 points), and stem cell research (8 points). Additionally, support for the legalization of marijuana has increased by a dramatic 35 points since 1991, from one in six supporting legalization to one in two.
On the conservative side, however, support for increased gun control legislation has dropped almost 30 points, and support for abortion, the death penalty, and doctor-assisted suicide has either decreased or remained unchanged. Regardless, these statistics cast doubt upon the common claim
that Americans, as an entire population, have gotten more economically or socially conservative in recent decades. So what drives the misconception about increasing polarization? First, although the overall ideological differences between liberals and conservatives may be small, the
difference between the Democratic and Republican parties has grown. Over the last two decades, the percentage of self-described Democrats has stayed roughly stable, the percentage of Republicans has decreased from 31 to 24 percent, and independents have grown from 29 to 38 percent of the American population. Moreover, over the last decade, the percentages of both moderate/conservative Democrats and moderate/liberal Republicans have decreased, largely due to absorption into the growing class of independents. This loss of moderate voters gave rise to ideological realignment within both parties and
thus heightened partisanship. As a result, we are more likely to link conservatism with the Republicans and liberalism with the Democrats, leading to a conflation of the terms. It is clear, though, that Congress is getting more polarized. According to recent statistics, both the House and the Senate are now more polarized than at any previous point in history. Political scientists reason that this bifurcation is caused by primary elections that are increasingly small and ultimately unrepresentative of the national electorate. The average percentage of eligible citizens voting in primaries has been steadily decreasing, down to less than 19 percent in 2010 from over 30 percent in the ’50s and ’60s. In comparison, over 37 percent of the eligible electorate voted in the 2010 national midterm election, almost twice the share that voted in the primaries. This creates a far more partisan voter base for primaries, which in turn affects the ideology of the candidates sent to the general election. Essentially, restrictive primaries give rise to extreme candidates. Consequently, in the general election the more moderate national electorate has to choose between candidates selected by the partisan few. Our elected officials thus hold views that are inconsistent with those of the majority of Americans, which results in a more polarized Congress. And when the media get involved, polarization in Washington becomes confused with polarization of the American populace. Political elites with strong and uncompromising party loyalties are able to manipulate average, under-informed American voters. Elites use the media as a tool through which to attack and accuse one another. Journalistic fixation on this back-and-forth then creates the illusion of widespread polarization.
Ultimately, the American Culture War is not an organic outgrowth of ideological differences that divide ordinary Americans, but rather a myth manufactured by the symbiotic relationship between partisan politics and sensationalized media coverage.
Govin Vatsan is a junior in the School of Engineering. He can be reached at gsvatsan@wustl.edu.
The Plight of the Bees Moira Moynihan
During my semester abroad in France, our program sent us on a short stay in a small town along the Spanish border named Céret. A century earlier, the likes of Pablo Picasso, Henri Matisse, and Salvador Dalí had been drawn there by the allure of southern France and the vast countryside sprawling along the foothills of the Pyrenees. At the dawn of each spring, the landscape would change with the blossoming of the cherry trees, a sure marker of the upcoming harvest crucial to the town’s agricultural economy. But in the space of a century, this process has grown more uncertain each year with the seemingly inexplicable loss of the bee populations crucial to the pollination and bloom of these plants. My host mother’s home sat immediately beside one of these cherry tree plots, and I was there just in time for the annual blossom. I spoke with the farmer who owned this and many other plots of land, and despite the beautiful backdrop of fields of trees in bloom, underlying our conversation was the harsh reality that M. Alexandre’s livelihood depended on cultivating crops that relied heavily on pollination, and that the drastic decline of bee populations affected him more each year. This narrative is not unfamiliar. In the past months, Time has run a cover story on the disappearance of bees; articles have appeared in The New York Times and stories have run on NPR; even Whole Foods has adopted the cause, asking its customers to “Bee the Solution.” Yet, despite all the media attention and scientific study, the
loss of the world’s bee populations remains inexplicable. Experts are sure of neither the root of the problem nor how we should go about solving it. At present, scientists suspect that either parasites or pesticides bear the lion’s share of the blame. The Varroa mite’s dominance as a parasite is a byproduct of global agricultural mechanisms: its introduction into non-native environments has proven fatal for countless bee colonies. The mites can devastate a colony, and they serve as a catalyst for viruses that prove fatal to bees. The beekeepers in Céret with whom I spoke reported having to burn entire colonies upon discovering these mites, accepting the harsh reality that the loss of one colony carried significantly less negative impact than a Varroa mite infestation. In the past half-century, American bee populations have diminished by 40 to 50 percent, according to a Yale study. Unsurprisingly, this fall in population is closely correlated with the agricultural world’s increased reliance on pesticides. While the early, extremely harsh pesticides used on plants have largely been phased out, a new class of pesticides introduced in the 1990s, the neonicotinoids, has now become another suspect in the disappearance of
bees. Though thought to be safer for bees than the harsh pesticides of decades past, these neonics became suspects after another sudden drop in the bee population in 2004. However, the data surrounding this claim are contradictory and confounding: though the US bee population has substantially diminished, Australia, Canada, and Europe, where the pesticides are also widely used, have not experienced the same drastic losses. Moreover, the beekeepers I spoke to in southwestern France neither used pesticides nor regularly experienced Varroa infestations, yet they too would return to find entire hives dead or vanished, further underscoring just how little we know about the disappearance. The loss of bees would fundamentally alter the composition of our agricultural world, eliminating at least one-third of the foods we eat, according to the same Yale study. Moreover, because the cause of the loss of bees is still unknown, advocates struggle to rally people behind the cause. Unfortunately, our inability to understand the problem does not exempt us from its effects. In China, we see the results of inaction, as apple farmers now must pollinate their trees by hand. Humans perform this work with enormous inefficiency compared to our bee counterparts, and it would be nearly impossible for the world to adopt this method and maintain its current levels of pollinated plant consumption. The outlook for bees is not optimistic. At the end of my interview with one of the French beekeepers, it was clear that, to him, our chances of ameliorating the problem are slim. He assured me that “they will disappear; man has no discipline,” a sentiment that has stuck with me as the plight of the bees moves further into the public eye. Man’s seeming inability to weigh the long-term consequences of our actions against the gratification they provide in the short term may ultimately be the downfall of this lynchpin insect.
As we consume more meat, demand cheaper produce, and expect longer shelf lives, the burden on our agricultural system to produce greater quantities more efficiently cannot be met without consequence. Thus, our choices in the grocery store aisle matter enormously. Of course, these choices exist as a result of a certain amount of socioeconomic and educational privilege, so it is all the more important that those who have these privileges take action. While we do not fully understand how to quell the loss of crucial bee populations, choosing to consume organic, sustainably grown products cannot hurt. The world must begin to make choices on behalf of bees; it simply cannot afford not to. Moira Moynihan is a senior in the College of Arts & Sciences. She can be reached at firstname.lastname@example.org.
The Power of Paper Danny Steinberg
I was recently going through various boxes, drawers, and bins around my house. Among the pictures of my three-year-old self and old VHS tapes, I found one of the most powerful remnants of this millennium: the front page of The New York Times from September 12, 2001. I hadn’t known we still had it, and it took me more than a few minutes to fully reabsorb the banner headline and the pictures below. Aside from the memories I have of that day twelve years ago, what else could have stirred such vivid feelings? Perhaps it was the physical paper in my hands. Having something that powerful in my hands, being able to manipulate it, move it closer, fold it, turn it, all enhanced my experience, and I fear that with the decline of print news, the same intimacy I have been fortunate enough to share with events both local and global will be harder, even impossible, to attain in the future. Newspapers are woven into the fabric of American society. In any portrayal of the first forty-odd years of the twentieth century, one of the most common features is a young boy selling newspapers and yelling, “Extra!” Perhaps the most iconic job for any child with a bike is a paper route. For many years, one of the hallmarks of a big city was a widely respected, well-read newspaper. I have even been instructed on the proper method of folding a paper for optimal convenience on a subway. So what has happened to newspapers, that quintessentially American form of communicating information? From before the assassination of Abraham Lincoln to the present day, readers of newspapers have been captivated by the banner headlines that proclaim, in the most emphatic of tones, that which is most recent and most important. I remember my dad teaching a younger me how to look at the front page of a newspaper and tell which stories were most important. I remember asking, with wonder, if he had ever seen a banner headline.
In a world where speed is the name of the game, a printed piece of paper that must be laid out, sent to the press, organized, wrapped up, and delivered takes a long time to get to its final destination. In its short existence, the Internet has done away with the delay between an event occurring and its being publicized. In addition, the number of sources from which one can get information has increased exponentially. Anyone with access to a computer or a smartphone can blog, tweet, or post anything they want to and have it instantly available to the entire connected world. Consider the Arab Spring; much of the initial information, as well as the organization of protests, was made public by average citizens, not seasoned reporters, and all these citizens needed was access to the Internet. Immediate, online news isn’t necessarily bad, but it doesn’t compare to news provided by a trained journalist. As much as the global availability of information is a good thing, this glut of information also comes at the cost of quality and reflection. It is all too easy to push thoughtful debate to the sideline while under pressure to get the story out before the competition. In contrast, a story’s life in the realm of paper consists of editing, revising, more editing, and more
revising. As a result of the high standards any well-respected publication holds itself to, nothing gets printed without passing the strictest scrutiny. Print media are certainly in decline. Founded in March 2007, NewspaperDeathWatch.com, a site that follows the declining newspaper industry, lists over ten American metropolitan newspapers that have folded over the last six years, as well as several others that have decreased the frequency of their publication. At the beginning of October, it ran a small piece noting that at the end of the year, Lloyd’s List, a publication dedicated to maritime shipping and the self-proclaimed world’s oldest daily newspaper, will cease producing its print edition and move entirely to an online format. This shift is representative of how newspaper companies are reacting to the changing news delivery environment. As people increasingly own tablets and other mobile devices with which they can easily access the Internet, the papers that adapt to the digital age fastest and most fluidly will have the best chance of surviving and maintaining their influence. However, even the best-designed, most user-friendly layout will never compare to the feeling of thin paper between fingers, the sound of crinkling while flipping from page to page, and the minimal loading time from article to article.
Danny Steinberg is a senior in the College of Arts & Sciences. He can be reached at email@example.com.
Losing Hope and Losing Face Naomi Duru
America: land of the free and home of the brave. On the international stage, we used to stand for something much bigger than ourselves. Countries sought our help in times of need, and we were usually quick to come to their aid. In recent decades, however, the perception of America as an international Samaritan has been in decline. Our failure to intervene in the Rwandan genocide that began in 1994 and our self-interested policies in the Middle East have raised questions about the moral compass by which the United States chooses intervention or passivity. It is time to question whether US foreign policy does more harm than good, for Americans and for citizens of the world. There was a time when US interventions consistently ushered in favorable results. In 1965, we invaded the Dominican Republic, where the assassination of the dictator Rafael Trujillo had been followed by the election, and then the overthrow, of Juan Bosch. When a popular rebellion broke out demanding the reinstatement of the country’s elected leader, President Lyndon B. Johnson, fearing communist control in the region, sent in over 20,000 troops to protect lives and property. The United States played a similar role in the 1960s when the Congo underwent a national crisis that killed 100,000 citizens. The United States didn’t intervene with full force, but it did send three military transport aircraft to provide logistical support to the central government in its time of need. Though these interventions were all conducted within a Cold War framework, they are representative of a period of successful US interventions designed to safeguard democratic institutions abroad. Unfortunately, later efforts in this vein were less successful and more fraught with unforeseen complications and inconsistencies that have tarnished the image of the US abroad. Abstention from action during the Rwandan genocide constituted a pivotal point in the history of US foreign policy.
Unlike with the Holocaust, we knew the implications of the violence in Rwanda before, during, and after the fact. Yet the United States passed on an opportunity to mediate the situation as the genocide was just beginning, making a clear statement that the US would not intervene in conflicts it didn't understand, in countries peripheral to its national interests. Despite letters from Congress expressing support for executive action, the US remained passive, and the killings continued. And yet, while we don't fully understand the religious and regional complexities of the Middle East, we nonetheless invest tremendous national resources in the region. Rwanda wasn't the first time America failed to intervene in response to clear human rights violations. Between 1975 and 1979, over 2 million people died from starvation, forced labor, and political executions under the Khmer Rouge in Cambodia. Despite the magnitude of the slaughter, the United States chose to forgo intervention. Some argue that the decision to abstain was colored by recent involvement in the Vietnam War. But the US didn't even use "soft power" to help address the atrocities. The United States could have spoken out, condemned the killings, and drawn attention to them on the international stage; instead, it did not denounce the Khmer Rouge until 1978, three years later. Today, the United States is stepping up its involvement overseas, but it's doing so in all the wrong ways. While American politicians publicize our actions in the Middle East as being part of our larger duty
as a world power, US interventions are motivated by US interests, not humanitarian concerns. American politicians refuse to own up to this reality; the Arab Spring only heightened America's efforts to save face overseas. The United States has been passive on the issues that matter most to the populations of Middle Eastern nations, which have been overshadowed by our direct security interests. The transparency of US motives has made America even more feared and hated in the region than before. Unfortunately, the United States doesn't seem to have learned its lesson in the Middle East, still preferring to secure oil flows and other security interests over engaging directly with the region's people. Unsurprisingly, the United States' lowest approval ratings come from the Middle East and South Asia. Drones, for instance, have recently been pitched as the "more humane way" to conduct warfare, as they
allegedly target terrorists with greater precision and lower risk to US troops. This optimistic rhetoric is undercut by the true impact of drone warfare on perceptions of the United States in South Asia and the Gulf. Haykal Bafana, a Yemeni-Singaporean international lawyer, says that US drones are perceived as "the boogeyman" in Yemen, and he questions the efficacy of drones in advancing US interests and security if the United States throws away its principles and morals along the way. Yes, the United States is now compared to the scary monster that hides under children's beds. Recent US efforts in Egypt and Syria have only increased international skepticism of American motives in the region. Back in 2011, the ousting of President Mubarak led Egyptians to perceive Americans as intrusive in their domestic affairs; even before that, a 2008 Pew poll showed that only 22 percent of Egyptians had a favorable view of the United States, down from 30 percent in 2006. Today, American interventions that are out of step with stated American purposes continue to fuel American unpopularity abroad. The United States' global leadership in recent years has been less than exceptional. We must align our actions with our national rhetoric; otherwise, the image of America as a moral and effective superpower will continue to fade from international memory. Naomi Duru is a freshman in the College of Arts & Sciences. She can be reached at firstname.lastname@example.org.
The Beginning of the End of the Middle: An S.O.S. from the Middle Class Hannah Waldman | Illustration by Simin Lim
“It’s not that we don’t care, we just know that the fight ain’t fair, so we keep on waiting, waiting on the world to change,” croons John Mayer in what has become an anthem for the millennial generation. Let’s face it: as college students, we often shrug our shoulders at big issues, assuring ourselves that our nation’s problems are inevitable simply because they can feel too daunting to fully comprehend, much less to address through activism. After four years of “recovering” from the Great Recession, we still regard the United States’ economy with a tired acceptance that the rich are getting richer, the poor are getting poorer, and there is nothing we can do about it. The Occupy movement attempted to address this issue through superficial activism. But by blaming Wall Street and emphasizing the role of the wealthy, the movement aligned itself with petty class warfare, overlooking the root cause of our economy’s anemic recovery: the dwindling power of the middle class. Why revitalize the middle class? In the new documentary Inequality for All, former Secretary of Labor and University of California, Berkeley professor Robert Reich explains that middle-class individuals comprise the core of consumer spending, yet earn far less than in the past when adjusted for inflation. According to a study by the Economic Policy Institute, the years 1947 to 1979, known as the Great Prosperity, witnessed a 119 percent increase in wages. After this period, though, prices kept rising while worker compensation stagnated, eventually falling below its 1979 peak. Had the federal minimum wage followed the trend in average productivity since 1968, it would have reached $21.72 per hour by 2012—nearly triple the current $7.25. To show that strengthening the middle class is not a zero-sum game, Inequality for All features an interview with Nick Hanauer, venture capitalist and CEO of the Pacific Coast Feather Company.
Hanauer remarks that no matter how much money he has, he can only sleep on one pillow. Like countless other businesses, the Pacific Coast Feather Company relies upon the diffuse wealth of middle class consumers, not the
concentrated wealth of the upper class. According to the Industrial Production Index, the wealthiest Americans spend only 5 percent of their assets on commodities, investing the bulk of their earnings in hedge funds and private equity and thus contributing a smaller portion of their income to the economy than their middle- and lower-class counterparts. We can no longer consider the upper class the center of the economic universe. In the 2012 election campaigns, Romney and Obama both made sweeping statements promising their support to “Middle America.” However, as long as politicians continue to frame the wealthy as “job creators” entitled to tax exemptions, our government’s policies contradict basic principles of supply
and demand. Large corporations, not unlike Hanauer’s, create jobs only if consumer demand—driven largely by the middle class—supports such growth. To address the backwardness of our current policies, Reich explains, we must shift the conversation surrounding the US economy by looking to different models of capitalism. During the Great Prosperity, wages kept pace with rising prices, the government incentivized education through the GI Bill, and workers organized through unions that gave them the power to influence their compensation. Unions, organized to protect the interests of the middle and lower classes, played a pivotal role in the health of the US economy. But unions today hold only a shadow of their former power. President Reagan’s public defeat of the Professional Air Traffic Controllers Organization instigated a nationwide assault on labor unions and marked the beginning of the end of the middle class, as workers lost the collective bargaining power required to demand fair wages. While unionized workers made up over one-third of all laborers during the 1950s, the Bureau of Labor Statistics reports that they comprised only 11.3 percent of the workforce in 2012, down from 11.8 percent in 2011. Even so, unionized workers earn, on average, $943 per week, whereas laborers not represented by unions average $742 per week. In Inequality for All, Reich calls for a return to the American dream through a society that encourages social mobility. Students can advance this goal by supporting businesses that allow unionized labor, advocating for tax reform, and informing themselves about the larger implications of income inequality. As students and voters, we must see ourselves as participants in, not spectators of, the greater economic system. No matter how complex the issue of economic inequality may seem, we as college students should start acting on it now.
If everyone keeps waiting on the world to change, it never will. Hannah Waldman is a sophomore in the College of Arts & Sciences. She can be reached at email@example.com.
Talking Points
13 billion
Dollars JP Morgan paid to the US government as a settlement in its dispute over mortgage bond sales.
Civilians killed by US drone strikes in Pakistan in the past decade, according to an October UN report.
Dollars spent by Bishop Franz-Peter Tebartz-van Elst renovating his residence and other church buildings in Limburg, Germany; he has since been suspended by Pope Francis.
Homes destroyed by forest fires in Australia over one week in October.
Projected end date of the Syrian Civil War, according to a Washington Post review.
Minimum number of people that have signed up for new healthcare plans amidst the glitch-plagued rollout of the Obamacare websites.
“Let’s be honest, we eavesdrop too. Everyone is listening to everyone else. But we don’t have the same means as the United States, which makes us jealous.”
–Bernard Kouchner, former French foreign minister, on accusations of excessive US surveillance (Photo: World Economic Forum [CC BY-SA 2.0])
“The next year, unlike previous years, is really the year of decision [on whether or not to bomb Iran].”
–Amos Yadlin, former head of the Israeli Military Intelligence Directorate (Photo: Assaf Shilo, Israel Sun [CC BY-SA 3.0])
“It is a law to regulate protests, not to ban them.”
–Hani Abdel Latif, Egyptian Interior Ministry spokesman, on a proposed law that would ban public sit-ins and require protestors to register protests with the Interior Ministry, on pain of a harsh fine or up to three years’ imprisonment (Photo: Flickr user wisegie [CC BY 2.0])