Perceptions (Vol. 3, No. 1)


PERCEPTIONS 3.1

Our Brave New World
Published by the Temple University Undergraduate History & Social Sciences Association


PERCEPTIONS VOLUME 3, NUMBER 1 SPRING 2017

OUR BRAVE NEW WORLD

© 2017 Temple University Undergraduate History & Social Sciences Association
All Rights Reserved

Cover image: "Welcome To Our Brave New World"
Collage on panel, 29 x 60 inches, 2008
Copyright Scott P. Ellis
Image courtesy of Headbones Gallery

Scott P. Ellis was born in Colborne, Ontario, in 1970. He has chosen to regurgitate popular media and propaganda and serve up his own visual dish of worldly interpretation.


TABLE OF CONTENTS

Letter from the Editor ... 4
Credits ... 5

Section I: Reflecting On Our Past ... 6
Disability in Early Islamic Society & Medieval Europe ... 7
  Chris Rumbough, Senior Political Science & African American Studies Major, History Minor
Activity/Passivity in Dream of the Red Chamber: Homoerotic Sensibility & Sexual Hegemony in the Mid-Qing Dynasty ... 11
  GVGK Tang, Senior History & Sociology Major, LGBT Studies Minor
Horace Pippin's Supper Time: A Reflection on Mid-Century Black American Life ... 14
  Gabriel Jermaine Vanlandingham-Dunn, Senior Africana Studies Major
How Espionage Affected the Image of the United States During the Cold War ... 16
  Joseph E. Swadlow, Junior Secondary Education & History Major
"Mittler zwischen Hirn und Händen" ("Mediator Between Brain & Hands"): Women and Proto-Fascist Ideology in Metropolis ... 20
  Evron Clay Hadly, Sophomore History & German Major, Anthropology Minor
Jamaica: The Irish-Caribbean Connection ... 25
  Sabrina Wallace, Sophomore History Major, Political Science Minor

Section II: Understanding Our Present ... 30
Intimacy & Sexuality for Adolescents & Adults with Autism Spectrum Disorder ... 31
  Sydney Victoria Butler, Senior Human Development & Community Engagement Major
Discourse & Power: Comparative Analysis of 2016 Campaign Rhetoric ... 36
  Hazim Hardeman, Senior Strategic Communication Major, Sociology Minor
Conflict Resolution & Reconciliation in Post-Genocide Rwanda ... 40
  Nancy Dordal, Junior Anthropology Major, Spanish & Chemistry Minor
The Flatiron Building: An American Icon & the Original American Skyscraper ... 44
  William Kowalik, Junior American Studies Major, Art History Minor
Going Nuts Over Betel-Chewing: How the Areca Nut Has Started a Cultural War in South-East Asia ... 51
  Alexander Voisine, Junior Global Studies & Spanish Major

Section III: Envisioning Our Future ... 56
The Evolution and Future of the Power of the Federal Government of the United States ... 57
  Christopher Chau, Junior History & Political Science Major, French Minor
The Meaning of Semen in Society, Pornography, & Reproduction ... 60
  Shali Pai, Junior Sociology of Health Major, Healthcare Management & Biology Minor
Beyond the West ... 65
  Armon Fouladi, Junior Actuarial Science Major, History & International Business Administration Minor
Marriage Equality: A Controversial and Ill-Considered Goal ... 70
  Asia Kopcsandy, Sophomore Women's Studies Major, Sociology & French Minor
How Colonial Rule and Nationalism Inform the Understanding of Mali's Fragile State ... 74
  Amanda Morrison, Freshman Global Studies & Strategic Communication Major, Spanish Minor

Instructions for Submission & Contact ... 79


LETTER FROM THE EDITOR

Dear Reader,

In this issue, Our Brave New World, we capture the current political climate. In a time of immense change and uncertainty, we continue to reflect on our past, seek to understand our present, and aspire to envision our shared future. While reading the following works, we hope you will too.

After a seven-year hiatus, Perceptions undergraduate research journal is hot off the presses. We couldn't have done it without our amazing volunteer editors and a pool of over two dozen submissions. The sixteen wonderful papers featured here cover a diverse array of topics, and have allowed us to come back in full force.

In her keynote presentation at the 1981 National Women's Studies Association Conference, Audre Lorde said, "When we turn from anger we turn from insight, saying we will accept only the designs already known, deadly and safely familiar." These Temple students have challenged us to look beyond the scope of what is known and to seek meaning from perspectives and experiences outside of our own. Enjoy!

GVGK Tang Editor-in-Chief, Perceptions



CREDITS

Editor-in-Chief
GVGK Tang, Senior History & Sociology Major, LGBT Studies Minor

Associate Editors
Courtney DeFelice, Senior History & Political Science Major
Christopher Chau, Junior History & Political Science Major, French Minor
Armon Fouladi, Junior Actuarial Science Major, History & International Business Administration Minor
William Kowalik, Junior American Studies Major, Art History Minor

Assistant Editors
Maggie Dekker, Senior Sociology Major
Eli Siegal, Junior History Major, Geography & Urban Studies Minor
Sabrina Wallace, Sophomore History Major, Political Science Minor



I

REFLECTING ON OUR PAST

Disability in Early Islamic Society & Medieval Europe
Chris Rumbough, Senior Political Science & African American Studies Major, History Minor

Activity/Passivity in Dream of the Red Chamber: Homoerotic Sensibility & Sexual Hegemony in the Mid-Qing Dynasty
GVGK Tang, Senior History & Sociology Major, LGBT Studies Minor

Horace Pippin's Supper Time: A Reflection on Mid-Century Black American Life
Gabriel Jermaine Vanlandingham-Dunn, Senior Africana Studies Major

How Espionage Affected the Image of the United States During the Cold War
Joseph E. Swadlow, Junior Secondary Education & History Major

"Mittler zwischen Hirn und Händen" ("Mediator Between Brain & Hands"): Women and Proto-Fascist Ideology in Metropolis
Evron Clay Hadly, Sophomore History & German Major, Anthropology Minor

Jamaica: The Irish-Caribbean Connection
Sabrina Wallace, Sophomore History Major, Political Science Minor


Disability in Early Islamic Society & Medieval Europe
Chris Rumbough

Questions about the political and economic dimensions of disability as a social phenomenon are often neglected; usually, they are dismissed with vague, ahistorical assertions that the oppression and impoverishment of disabled people – referred to as ableism – is somehow an automatic or universal element of human societies. This is not founded in reality. While the oppression of disabled people is extremely old and widespread, it is neither universal nor inevitable. Rather, structures of ableism both influenced and were influenced by specific cultural, political, and economic contexts throughout the world. This paper investigates how the economic and cultural standing of disabled people in early Islamic/Middle Eastern society differed from that of disabled people in medieval Christian/European society. While both civilizations were dominated by abled people and relegated disabled people to positions of dependence and poverty, Middle Eastern/Islamic social relations between the abled and disabled were more positive and accommodating than those present in European society.

The general impoverishment of disabled people has been widespread in both Europe and the Middle East for their entire known histories; as such, most histories of disabled people tend to be histories of charity (Stiker 1999; Ghaly 2010). However, the general framing of disabled people as objects of charity is somewhat simplistic in terms of historical discussion; it posits charity as essentially apolitical, which is insufficient for understanding the most important dynamic of abled-disabled social relations for much of premodern history. It is more informative to look at the cultural and material functions of charity, in order to understand how and why specific forms of alms evolved. For example, many hospices created in Europe in the fourteenth and fifteenth centuries, which had the goal of providing aid to pilgrims, the poor, and the sick, refused to admit people with permanent and visible disabilities like amputations or blindness because of the incurability of their situation (Stiker 1999). In the Middle East, by contrast, Islamic jurisprudence mandated that disabled poor people were more deserving of aid than similarly poor abled people (Ghaly 2010). This contrast is an example of how "charity" is not the universal and apolitical concept it is sometimes taken to be.

In Europe, Christian culture generally considered aid for disabled people to be a means by which the rich providers of aid could attain salvation; despite internal criticism and attempts at reform, this attitude remained endemic to the culture throughout the Middle Ages (and, arguably, still exists in Western culture today) (Stiker 1999). In essence, then, charity in the European context could arguably be seen as a sort of extraction of labor – the recipient of charity provides salvation and, in return, is given material aid which acts as a sort of "wage" for the spiritual self-aggrandizement they provide to the rich donor. The disabled or sick or unemployed person, then, is not outside the labor force so much as they are a culturally specific service worker, providing spiritual aggrandizement in exchange for a wage of alms. The permanently, visibly disabled person who cannot be visibly/performatively cured or bettered by alms is not capable of providing cultural/religious aggrandizement to the rich, and as such was not considered to have earned the wage of charity. In the Middle East, by contrast, Islamic jurisprudence considers the provision of aid to disabled people to be a communal or governmental duty – the modern jurist Yūsuf al-Qaraḍāwī argues that Muslim zakāt, the wealth designated for the poor, was the first organized system of state-provisioned welfare in the world (Ghaly 2010). Because it is considered a duty of society rather than a favor bestowed on the poor, its structure and function differ from the European analog; the disabled people who receive charity are not expected to provide aggrandizement in return and, as such, there is no cultural need for the recipient of zakāt to be visibly cured. This is an example of how seemingly apolitical notions such as "charity" have intertwined material-cultural dimensions which are highly significant to uncovering the obscured history of disabled people.

Similar to how European culture created the role of recipients of charity as manufacturers of salvation, cultural concerns informed several other facets of disabled people's economic standing in the Middle East and Europe. In both regions, there arose the figure of the "fool," a person sometimes valued in the upper echelons of society. In Europe, the king's or prince's fool was a figure who, by virtue of visible physical disability (generally dwarfism or kyphosis/hunchback), was rendered non-threatening enough to provide entertainment and perhaps insight through mockery, satire, and irony (Stiker 1999). However, people with such physical disabilities were unlikely to find work elsewhere in medieval Europe, and they would likely have been turned away from many sources of alms for the same reason as other permanently and visibly disabled people. Therefore, it is more accurate to consider the position of "fool" a means of extracting labor and value from a subordinated and disenfranchised population than to interpret it as evidence that this population was particularly valued or protected. The fool's position was based in the fact that their population was so thoroughly disenfranchised that there was no threat in them mocking positions of power; this position is a reflection of alienation rather than special status.

An analogous figure, the "holy fool," did exist in the Middle East, but did not become a widespread cultural fixture until much later – by the Ottoman period, between the sixteenth and nineteenth centuries, holy fools were ubiquitous (Scalenghe 2014). The holy fool centered on mental rather than physical disability. This figure was rendered non-threatening by virtue of madness rather than visible physical disability, and was credited with divine inspiration and even miracles rather than satire and courtly entertainment (Scalenghe 2014). Of course, it would be inaccurate to declare the existence of the holy fool to be evidence for the general equality or liberation of the mad population. Only under a very specific set of circumstances could a mad person win the relatively protected and accepted status of being recognized as a holy fool; the vast majority of mentally disabled and ill people had no reprieve from their disenfranchisement. Despite the widespread cultural impact of holy fools, mentally disabled and ill people could not hold any official position in the mosque or state, both in early Islamic society and in the later Ottoman Empire (Ghaly 2010).

The other main instance of the cultural significance of disability creating a niche for disabled labor is the case of blind muezzins in the Middle East. The muezzin is the worker designated to announce the hour of prayer, and traditionally would climb to the top of a minaret to ensure they could be heard (the invention of the loudspeaker has changed this for some mosques). However, this practice raised concerns about invasion of privacy – the vantage point of the top of the minaret allowed the muezzin to see into neighboring houses' private areas. For this reason, there developed a rather broad consensus among jurists that blind people were the most qualified to hold the position, as they were literally immune to the dangerous potential for sin presented by the opportunity to spy on the townspeople (Ghaly 2010). This stands in contrast to roles like that of the European fool, as it is a position based on valuing the unique capabilities of disabled people rather than on disabled people's perceived helpless inability to threaten power structures. There is seemingly no direct analog for the preferential status of the blind muezzin in European culture; blind people experienced some cultural significance in Christian Europe because of their prominence in Gospel stories, but this only fed into prevailing systems of charity rather than creating any unique opportunities for them (Stiker 1999). Still, the preferential status blind people in the Middle East enjoyed for the position of muezzin likely made little overall impact on their welfare; issues of class, gender, and education would have restricted the potential number of blind people who could qualify for this position to a small fraction, leaving the rest firmly situated in the prevailing system of poverty and charity.

This issue leads to the topic of general cultural differences regarding perceptions of disabled people. There are some surprising intersections in this area – most interestingly, the role of Greek physiognomy, which enjoyed widespread popularity among early Islamic jurists (Ghaly 2010). Greek physiognomy – the authoritative sources of which had largely been translated into Arabic and begun enjoying widespread circulation and respect among scholars by the ninth century – was an early science that held that the spiritual characteristics of people could be deduced by studying their physical appearance (Ghaly 2010). The science appealed to many jurists as an extension and exploration of the nature of firāsa (a term which, broadly speaking, refers to intelligence and discernment, particularly of a mystic or spiritual sort) (Ghaly 2010). This science had a negative outlook on visibly disabled people, associating their disabilities with imagined spiritual corruption. Interestingly, however, internal debate among Islamic jurists seems to reflect an active resistance to the importation of Greek authors' negative attitudes towards disabled people. Even as some jurists' writings were warning the populace to beware anyone with "physical defects," there was a sizeable opposition – a school of thought which argued Greek physiognomy to be an unscientific and sinfully presumptuous perversion of genuine firāsa (Ghaly 2010). This is an interesting contrast to Europe's own involvement with physiognomy; the science did not gain its full prominence in Europe until the Enlightenment, long after its popularity had waned in the Middle East, but there was little influential resistance among European scholars to the racist, classist, and ableist ideas that physiognomy and phrenology were used to support (Rattansi 2007).

Further similarities in the cultural depictions of disability center on the role of suffering in theology. Both medieval Christian and early Islamic literature seem truly fascinated with the role of disability as a mystical and spiritual phenomenon. In European/Christian literature, the figure of the leper is near-mythic in proportion; the leper is the archetypal recipient of saints' and martyrs' blessings and is often the figure through which Christ is revealed (Stiker 1999). This had a direct material result; many leper colonies became more similar to religious orders or monasteries than to ghettos of the poor, as they were highly secluded but their needs were often substantially met (Stiker 1999). While stories regarding leprosy existed in the Qur'an as well as the Gospel, the fascination with leprosy as the archetypal suffering was not as strong in Islamic literature and culture, and as such the treatment of lepers was not so far removed from the treatment of other disabled and sick people in the Middle East as it was in Europe (Stiker 1999; Ghaly 2010). Setting aside the particular case of leprosy, however, both cultures were fascinated with the nature of disability and suffering, which they considered to be inherently linked concepts. Scholars and theologians of both cultures generally interpreted disability as an inherently evil and degraded state of being, which presented a challenge to their belief in an omnipotent God of perfect benevolence. Neither society arrived at any unified consensus regarding what God's plan and intent was in creating disabled and ill people. In Europe, however, the cultural uncertainty surrounding disabled people coalesced into fear in a way that Islamic cultural views never seemed to – at least, not to the same extent. In the wake of the Black Death, disabled people – again, specifically lepers – were blamed in fourteenth-century European discourse for spreading the plague (Stiker 1999). This trend continued to the end of the Middle Ages, with disabled and poor people increasingly being seen as criminal and dissolute elements – a cultural trend which became a lasting fixture in the West, as the history of eugenics will attest (Stiker 1999). The Middle East seems to have avoided developing an analogous trend, even though it also suffered from the plague; this may be because European colonialism incentivized European societies' embrace of scientific racism, ableism, and classism to an extent that was unrivaled in the Middle East.

The overall trend, in every area of analysis, has been that early Islamic society was more progressive in its social relations between abled and disabled people than European society was at the same time. Economically, the systems of charity in the Middle East were more focused on the wellbeing of the disabled recipients; in Europe, charity was used as a means of extracting social capital and spiritual self-aggrandizement, and had little interest in permanently disabled people. Culturally, there was greater debate and resistance surrounding scientific ableism among Islamic jurists than there would be among European scholars. This assessment is not intended to romanticize the history of the Middle East; there, as in Europe, disabled people were generally impoverished and excluded from all walks of economic and political power. It would be dishonest to use the evidence gathered here to argue that disabled people were not oppressed, or that they enjoyed meaningful equality, in the Middle East. A more reasonable interpretation of the evidence, however, is that were it not for European imperialism intervening dramatically in the fate of the Middle East both before and after the fall of the Ottoman Empire, the cultural and material conditions might have favored the emergence of disabled people's organizing and activism in the region long before any such movements emerged in Europe. It would not be surprising if new research into primary sources were to find evidence of early disabled activism in the Middle East. Islamic jurists recognized "disabled people" as a group and made commitments to their wellbeing (at least on paper) far earlier than their European neighbors did, and social programs intended to benefit disabled people were older and more advanced in the Middle East. As such, the cultural and material conditions in the region were somewhat more favorable for the emergence of disabled consciousness and mobilization than they were in Europe. Setting aside such hypotheticals, however, it is certainly reasonable to say that the evidence does show that the legacy of Islamic jurisprudence possesses historical value and liberatory potential for disabled people – potential that the orientalism and cultural imperialism of the West are attempting to erase.

References

Borsay, A. (2005). Disability and social policy in Britain since 1750. New York: Palgrave Macmillan.

Ghaly, M. (2010). Islam and disability: Perspectives in theology and jurisprudence. New York: Routledge.

Lewis, B. (2006). A mad fight: Psychiatry and disability activism. In L. J. Davis (Ed.), The disability studies reader (2nd ed., pp. 3-16). New York: Routledge.

Rattansi, A. (2007). Racism: A very short introduction. Oxford: Oxford University Press.

Scalenghe, S. (2014). Disability in the Ottoman Arab world, 1500-1800. New York: Cambridge University Press.

Stiker, H.-J. (1999). A history of disability. Ann Arbor: The University of Michigan Press.



Activity/Passivity in Dream of the Red Chamber: Homoerotic Sensibility & Sexual Hegemony in the Mid-Qing Dynasty
GVGK Tang

One evening during the prosperous reign of the Qianlong Emperor, government officials and wealthy young men gathered in the lush garden of a Beijing estate. The air was sweet and balmy – filled with shouts and boisterous laughter of drunken delight. Xue Pan, the boorish playboy of the Jia clan, longingly eyed Liu Xianglian, a handsome young actor, from across the spacious courtyard. As Liu was leaving the party, Xue ran to catch up with him. "Where are you off to?"1 Xue began, grabbing Liu by the arm. "If you leave, the fun will leave with you!" Concealing his disgust, Liu answered coyly, "Is that so?" Xue smirked, "Meet me by the bridge beyond the northern gate and I'll show you." Liu accepted his invitation with a plan in mind. Later that night, as he waited for Liu to arrive, Xue was struck from behind and fell to the ground. When he opened his eyes, he was shocked to find Liu standing above him, an iron hammer in hand. Xue cried out, "Why did you agree to come here – just to give me a beating?" "Now do you know what kind of man I am?" Liu responded angrily, delivering another blow to Xue's torso. "I now know that you're respectable!" Xue replied with a groan.

An outsider may read this synopsis of a chapter from Cao Xueqin's Dream of the Red Chamber and assume that Liu had taken offense at Xue's assumption that he would be interested in another man. Indeed, an older translation of this exchange concludes with Xue proclaiming "I now know you're straight!"2 – mistaking zhengjing for something that referred to sexuality rather than respectability. Little did Xue know that Liu was a young aristocrat who had turned to acting after falling on hard times. In fact, Liu was not insulted by the insinuation that he was a "homosexual," but, rather, that he was an actor-prostitute willing to perform the passive role in a sexual encounter – submitting himself to someone so crude and of a lesser status. Such a supposition threatened his already tenuous sense of masculinity and social standing.

With the continued integration of queer theory into a globalized chronology of historical change, scholars have been forced to reconcile the Euro/Americentric essentialisms of "homosexual" identity and its Western-oriented narratives with historiographies of sexuality in late imperial China. The field of queer studies itself has tended to center solely on sexual desire and its sociocultural identifications. However, historians have begun to interrogate the contextual variability of what constitutes so-called "queerness" – or "non-normative" sexual acts, behaviors, and orientations. For this reason, the homoeroticism illustrated in literary works such as Dream of the Red Chamber must be viewed as a kind of "'sensibility' that overlaps with but is not necessarily confined to … sexuality itself."3

During the mid-eighteenth century, Dream of the Red Chamber was penned in the vernacular – making it more accessible to the literate segments of China's population. Well-received at the time of publication, it is still considered a Chinese literary classic and has even inspired its own field of study – "Redology." The semi-autobiographical nature of the work makes it a rich primary source not only for its literary and thematic value, but also for its illustration of aristocratic family life during the Qing dynasty. Dream of the Red Chamber exemplifies contemporary representations of homoeroticism in poetry and fiction. However, this so-called "queerness" is not necessarily indicative of a direct opposition to Qing-era "anti-sodomy" legislation (or, as some historians argue,4 "anti-rape" legislation). Rather, such literary manifestations of homoeroticism may be read as a co-option of the "queer friendly" culture established by the preceding Ming dynasty and a conflation with (or, sexualization of) Neo-Confucian gendered hegemonies.

Dream of the Red Chamber upheld Qing-era social/sexual norms. For example, the character of Xue Pan acts as a comic foil for the work's other characters, whose homoerotic proclivities are much more respectable. This so-called respectability engages a conventional active/passive sexual dichotomy, as age and social strata were the primary means of assigning acceptable role fulfillment. In other words, younger men and men of a lower "class" (e.g., actor-prostitutes) were expected to perform the passive role in same-sex sexual activity, while older men and those of a higher "class" (e.g., aristocrats) typically performed the active role. Indeed, the Qing state's primary sexual precept was "the impenetrability of all 'decent' people."5 As such, this sexual hegemony combined multiple dimensions of status.

Qing sexual norms were grounded in gendered societal valuations, demonstrating that sexuality, indeed, acted as a nexus of power. Desires were regulated, while statuses were constructed and (dis)empowered. Xue Pan's humiliating encounter with Liu Xianglian is juxtaposed with that of the sophisticated, well-liked protagonist Jia Baoyu (his cousin) and another young male actor.6 The finesse of Jia's seductions (of both men and women) is contrasted with Xue's continued bawdy arrogance. Xue eventually becomes jealous and reports Jia's affair with the actor to his strict Confucian father, Jia Zheng. Jia senior castigates his son – again, not for his same-sex relationship, but (1) for pursuing too low a class of commoner and (2) for choosing this particular young man, who is already the favorite of a high-ranking official who could threaten the family. In turn, the father figure is a stand-in for the draconian Qing state and its Neo-Confucian ideologies.

Current historiographic conceptions of sexuality are based on a Foucauldian model of power dynamics that determine what constitutes "normal" and "abnormal" sexual experience. Meanwhile, the dawn of China's last empire saw the prioritization of societal etiquette as a means of specifying what was and was not sexually "appropriate" or "couth" – as illustrated by Dream of the Red Chamber. The manner in which the Qing state designated and promoted sexual propriety is well evidenced in discussions of active and passive, and, therefore, dominant and subordinate (gendered, "classed") sexualities – which were allotted differing resources, freedoms, and influence. Immersed in the programmatic revitalization of Neo-Confucianism, the Qing government adapted existing sexual narratives and gendered hegemonies of penetration.7 Ming-era homoerotic culture was repressed, with "sexual relations between men in the late imperial era [viewed, in part] as acts that profoundly destabilized the gendered social hierarchy by treating some men (the penetrated) like women."8 By requiring interpersonal conduct to conform to government regulation, the Qing state became the ultimate authority on public morality and "social engineering."9

___

1. Cao Xueqin, The Dream of the Red Chamber, trans. H. Bencraft Joly (Adelaide: eBooks@University of Adelaide Library, 2014), accessed October 20, 2016, https://ebooks.adelaide.edu.au/c/cao_xueqin/c2359h/complete.html.

2. Bret Hinsch, Passions of the Cut Sleeve: The Male Homosexual Tradition in China (Berkeley: University of California Press, 1992), 196.

3. Cuncun Wu, Homoerotic Sensibilities in Late Imperial China (New York: RoutledgeCurzon, 2004), xiii.

4. Matthew Sommer, Sex, Law, and Society in Late Imperial China (Stanford: Stanford University Press, 2000), 116.

5. Giovanni Vitiello, The Libertine's Friend: Homosexuality and Masculinity in Late Imperial China (Chicago: University of Chicago Press, 2011), 128.

6. Xueqin, Red Chamber.

7. Vivien Ng, "Ideology and Sexuality: Rape Laws in Qing China," The Journal of Asian Studies 46 (1987): 57.

8. Matthew Sommer, "The Penetrated Male in Late Imperial China: Judicial Constructions and Social Stigma," Modern China 23 (1997): 140.

9. Vivien Ng, "Homosexuality and the State in Late Imperial China," in Hidden from History: Reclaiming the Gay & Lesbian Past, ed. Martin Duberman, Martha Vicinus, and George Chauncey, Jr. (New York: Penguin Books, 1989), 88.

References

Hinsch, Bret. Passions of the Cut Sleeve: The Male Homosexual Tradition in China. Berkeley: University of California Press, 1992.

Ng, Vivien. "Homosexuality and the State in Late Imperial China." In Hidden from History: Reclaiming the Gay & Lesbian Past, edited by Martin Duberman, Martha Vicinus, and George Chauncey, Jr., 76-89. New York: Penguin Books, 1989.

Ng, Vivien. "Ideology and Sexuality: Rape Laws in Qing China." The Journal of Asian Studies 46 (1987): 57-70.

Sommer, Matthew. "The Penetrated Male in Late Imperial China: Judicial Constructions and Social Stigma." Modern China 23 (1997): 140-180.

Sommer, Matthew. Sex, Law, and Society in Late Imperial China. Stanford: Stanford University Press, 2000.

Vitiello, Giovanni. The Libertine's Friend: Homosexuality and Masculinity in Late Imperial China. Chicago: University of Chicago Press, 2011.

Wu, Cuncun. Homoerotic Sensibilities in Late Imperial China. New York: RoutledgeCurzon, 2004.

Xueqin, Cao. The Dream of the Red Chamber. Translated by H. Bencraft Joly. Adelaide: eBooks@University of Adelaide Library, 2014. Accessed October 20-21, 2016. https://ebooks.adelaide.edu.au/c/cao_xueqin/c2359h/complete.html.



Horace Pippin's Supper Time: A Reflection on Mid-Century Black American Life
Gabriel Jermaine Vanlandingham-Dunn

My grandparents often spoke about their childhoods when I was younger. From them, I learned a lot about the lives of Black folk in the 1940s-50s. On family trips to North Carolina, they would point out fields where relatives were said to have been lynched and/or buried due to the racist activity of the Ku Klux Klan. I learned about the history of our economic displacement and how there were few decent jobs for Black women and men. I learned about how they survived some of the hardest times in history for Black folk in the southern states. So when I recently came across Horace Pippin's Supper Time (1940), it immediately reminded me of them, their struggles, and the importance of Black folk telling their own stories, whether orally or visually.

Pippin's Supper Time is not a well-known painting. In Horace Pippin: Negro Painter in America (The Quadrangle Press, 1947), the piece is called Cabin Interior (listed as being produced in 1941). Now a part of the Barnes Foundation Collection (acquired by Albert C. Barnes in 1941), it is dated 1940. Such discrepancies may seem like small details, but it is important to look at Black artwork with a non-traditional "art world" lens. Many Black artists from the past were overlooked significantly during their lifetimes, only to become torchbearers of the Black experience after their death. This issue is exemplified by the difficulty of finding any scholarly articles (recent or past) on Supper Time, or on Pippin's work not associated with his time in the military or his depictions of religious figures. However, the visual theme of Supper Time, Pippin's religious devotion, and his time as an Army man do add a layer of understanding to the stories that my grandparents gave to me. Born in West Chester, Pennsylvania in 1888 and raised in Goshen, New York, Horace Pippin described his earliest encounters with art as plenty painful:

When I was seven I began to get into trouble. It happened this way. In spelling, if the word was dog, stove, dishpan, or something like that, I had a sketch of the article at the end of the word. And the results were, I would have to stay in after school and finish my lesson the right way. This would happen frequently and I just couldn't help it. The worse part of it was, I would get a beating when I got home, for coming home late, regardless what I were kept for. (Pippin, 1947)

When viewing Pippin's work for the first time, the viewer might be drawn in by the peculiar nature of the subject matter and the line work. The initial response might be what one would say of Jean-Michel Basquiat's paintings – that racist, problematic word primitive. Many of his works paint an eye-grabbing narrative, and Supper Time is no different. Pippin's work purposefully reimagined real situations concerning Black life, even those which he himself, or his family, experienced.

Pippin created a body of paintings, burnt-wood panels, and drawings in which he revised not only his own but his family's marginal and distorted existence within official records. With no interest in creating revisionist vignettes for white consumption, Pippin as a memorialist-witness undertook the missing work of commemoration, not only for his life but for Black family histories more generally. (Bernier, 2015)

In works such as Supper Time, Domino Players (1943), Giving Thanks (1942), Sunday Morning Breakfast (1943), and Harmonizing (1944), Pippin does what many writers of the Harlem Renaissance attempted to do: give voice to the societally-based normalization of conditions facing Black folk after World War I (in which he fought and was partially disabled in 1918) and during the Great Depression. Being shot affected Pippin's ability to do any kind of heavy work, including his ability to lift his right (drawing) arm up to his head. After figuring out how to steady his right arm with his left, he eventually began making oil paintings in 1928. With his oils, Pippin captured not only aspects of my grandparents' hard-lived youth, but also their Northern experience starting in the 1950s.

Visually, even at its small size (12" x 15 ⅛"), Supper Time immediately brings back memories of my grand-aunt Gertrude's home in Gastonia, North Carolina. The interior of the home almost matches the skin tone of the family shown (possibly because the work is an oil painting on burnt-wood panel). The various shades of brown upon the door, walls, tables, chairs, floor, and family dominate the paint from one corner to the next. Small, thin shades of white occupy the area around the door, giving the impression that the probably small home was drafty when the wind blew. Laundry, the little girl's clothing, the mother's headscarf and apron, tableware, and what appears to be a snow-filled window are all white as well. The two striking colors are the mother's blue dress and the father's pinkish shirt. The mother's position is similar to that of the mother in Pippin's Sunday Morning Breakfast, yet the body and facial features are not as clear in Supper Time. Her arm is outstretched, delivering something to the table. My grand-aunt Gertrude's home was tiny in the same manner as Pippin's snapshot of Black life. The construction of the home depicted here, and in my memory, gives the impression that this family could have lived on a small dirt road in the backwoods or in a segregated part of a Northern state, like Pennsylvania.

A detail that stands out in the painting is the mother's wristwatch. This element might reflect Pippin's own life after marrying and trying to find work to support his family. Because of his injury he could not be as active as men were expected to be, so he did small things to assist his wife, who ran a laundry service. The "luxury" of a wristwatch for a member of a poor family might symbolize his wife's role as the breadwinner, or a gift from a white patron. As revealed in novels written by Black women during the twentieth century, the theme of the help receiving gifts is common. Black women often had no other choice but to work for wealthy families as servants, a practice that still persists today (and also includes Black men).

Supper Time is a reminder of the forgotten marginalization of Black people during the Great Depression. Pippin, born little more than two decades after slavery was abolished, managed to contribute to a dialogue about the hardships of the past, acknowledging America's African survivors. This country was built on the backs of those rarely credited or compensated for their "contributions." Supper Time, like the Blues, is a beautiful piece of Black art that can be enjoyed and cherished; yet it conveys a message that is important for all to know. It exists because of the hardships faced by Black people and our ability to survive – and survival tactics come in many forms. Survivalist nature should not be confused with primitivism; the latter term is for oppressors.

References

Bernier, Celeste-Marie, and Horace Pippin. "Chapter 2: Don't Tell Me How to Paint." Suffering and Sunset: World War I in the Art and Life of Horace Pippin. Philadelphia: Temple UP, 2015. 105. Print.

Rodman, Selden. Horace Pippin, a Negro Painter in America. New York: Quadrangle, 1947. Print.

Stein, Judith E. "Chronology." I Tell My Heart: The Art of Horace Pippin. New York: Pennsylvania Academy of the Fine Arts, in association with Universe Pub., 1993. 186-88. Print.



How Espionage Affected the Image of the United States During the Cold War
Joseph E. Swadlow

During the Cold War, the United States government relied heavily upon espionage to gain information that would help protect the country from nuclear war with the Soviet Union. The topic was so relevant that it made its way into popular culture; in the film Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, for example, the Soviet ambassador uses secret cameras to take pictures of the American war room. On more than one occasion, espionage-related incidents during the Cold War damaged the image of the United States, hurting the country strategically in both domestic and foreign affairs. There were embarrassments, such as the U-2 incident with pilot Francis Gary Powers, which made the government look devious and untruthful. The Americans also had to deal with Soviet spies such as Alger Hiss and the Rosenbergs. The way the government handled those situations permeated everyday life and fueled many Americans' fear of communism.

The threat of espionage was used as a political tool during the Cold War by influential people like Senator Joseph McCarthy of Wisconsin, who took advantage of the public's fear of communism to gain power. McCarthy fabricated a list of United States government officials who were said to be communists, which fueled the Red Scare and caused Americans to become very paranoid about the communist menace infiltrating their society. Although many of McCarthy's claims about government officials and celebrities have been proved false or exaggerated, the case of Alger Hiss, which began in 1948, gave him and the Red Scare some credibility in the eyes of the American public.

In 1948, Whittaker Chambers accused Hiss of being a spy for the Soviet Union. Notably, Chambers admitted that he himself had been a part of the Communist Party of the United States, and he claimed to have worked with Hiss on sending the Soviets government documents during the 1930s. Chambers had been a member of a communist underground organization called the Ware Group and had worked at the communist newspaper The Daily Worker before later becoming an editor at Time magazine. By 1938, however, he had abandoned communism, and in 1948 he went to the House Committee on Un-American Activities to give information about Alger Hiss.

Hiss was an intelligent and youthful New Dealer, which fit Joseph McCarthy's narrative that anyone could be a communist. At first, Hiss denied that he even knew Chambers and sued him for libel. As more information came out about Chambers and Hiss's relationship, it became obvious to the House Un-American Activities Committee that Hiss was lying, and the committee arranged another meeting with him. Chambers continued his accusations, telling the committee that he and Hiss had been spies for the Soviets, and he produced as evidence many secret government documents that Hiss had stolen. The most notable documents were known as the "Pumpkin Papers" because Chambers had hidden film of the papers inside a pumpkin at his farm. Hiss was tried twice; the second jury convicted him on two counts of perjury. Because the statute of limitations on espionage had run out, he could be convicted only for lying about knowing Chambers and giving him government documents, not for spying.

Hiss's prison sentence was five years, but he was released early for good behavior. Throughout his life he tried to overturn his conviction, but was unsuccessful. He had a few theories about how he had been framed, although recent research suggests that Hiss was in fact guilty of being a communist spy. While many citizens saw Hiss as a traitor, he also drew sympathy from some liberals who thought he was innocent; the same could not be said for Chambers. People on both sides of the political spectrum saw Chambers as a traitor to their interests and as someone who was not loyal to his friends because he had informed on Hiss. The Alger Hiss case validated American concerns that domestic communist spies could infiltrate the United States government. After that case, the threat of Soviet espionage was tangible in the United States, and everyone was aware of it (Berresford, 2008).

Possibly the most infamous case of espionage during the Cold War was that of Julius and Ethel Rosenberg during the 1950s. Paranoia about communist spies had spread quickly in the United States, and the government took the harsh measure of punishing the Rosenbergs by execution, which made it look rather callous. The Rosenbergs' troubles began when the FBI questioned Ethel's brother, David Greenglass, who had worked on the atomic bomb project, about being a Soviet spy. To deflect the FBI's interest in him, Greenglass told them that Julius was a spy who had sent information on the atomic bomb to the Soviets. While the FBI thought that Ethel knew about her husband's actions and could possibly have been involved in them, they arrested her mainly to pressure Julius into confessing his espionage and giving up the names of more communists. Julius did not confess, but the Rosenbergs were found guilty anyway and were electrocuted in 1953. Ethel's execution was botched the first time; because she did not die immediately, her death looked very inhumane, especially because she had not been deeply involved in the spying that her brother and husband were a part of. The Rosenberg case provoked a strong public reaction and decidedly affected the image of how the American government was handling Soviet espionage during the Cold War (Markowitz, 2013).

Political leftists started to believe there was a communist witch hunt going on in the United States. Fittingly, Arthur Miller's play The Crucible, which premiered in 1953, the same year the Rosenbergs were executed, told the story of actual witch hunts that occurred in Salem, Massachusetts during the 1690s. Miller wrote the play after he himself was accused of being a communist; he believed that people were being treated unfairly by these accusations, so he reflected what he thought America's image was in his play. While there were many people in America who sided with the Rosenbergs and felt that the government was becoming too extreme in how it dealt with espionage, other Americans felt no such sympathy. One example was Lucy S. Dawidowicz, a writer for Commentary, a Jewish-American, anti-Stalinist magazine. On January 1, 1952 – before the Rosenbergs' execution – she published an article entitled "'Anti-Semitism' and the Rosenberg Case: The Latest Communist Propaganda Trap," in which she argued that the United States had done the right thing by convicting the couple because they had gone against their country and helped the Soviets by giving them secret atomic information. The article went on to argue that people who thought the Rosenbergs were innocent, or that the United States government was too harsh, were only feeding Communist propaganda, and that Americans should not give in to the anti-American campaign the communists were waging. In Dawidowicz's view, the attempt to paint the United States government as anti-Semitic for convicting the Rosenbergs was absurd, because they were spies and they were guilty (Dawidowicz, 1952). The country was split on the issue: while some people thought that the Rosenberg case made the United States look bad, others thought it made the United States look strong.

Another espionage-related mishap during the Cold War was the 1960 U-2 incident. The CIA used spy planes that were meant to fly over the Soviet Union and take pictures. Francis Gary Powers was piloting one of these U-2 spy planes on May 1, 1960 when he was shot down by a missile over the Soviet Union. Powers had been trained to kill himself if he was ever shot down, but he did not do so and was taken captive by the Soviets. Powers cooperated with the Soviets, which made President Dwight Eisenhower look like a fool, especially considering that he had lied about knowing anything about Powers or the U-2 planes. The government had prepared a cover story claiming that the aircraft was a NASA plane doing weather research – a fabrication confirmed by secret government documents declassified long after the incident. The United States government did not know at the time that Powers was still alive and in Soviet custody. The spying made Eisenhower look bad, but the cover-up made things even worse, greatly angering Nikita Khrushchev, the Soviet Premier, who even demanded an apology from Eisenhower. This incident destroyed most of the progress that had been made between the United States and the Soviet Union (Nathan, 1975).

Espionage had a considerable influence on the image of the United States during the Cold War. Spies were very involved in creating an atmosphere of anxiety in the United States, especially when people like Alger Hiss and Whittaker Chambers, Julius and Ethel Rosenberg, and Francis Gary Powers made their way into the public eye. Espionage affected United States society precisely because of its covert nature. While not as destructive as actual war, it created a war zone within society, as espionage was so pervasive that ordinary people did not know whom to trust, and some even turned against each other.

References

Berresford, John W. "The Grand Jury in the Hiss-Chambers Case." American Communist History 7, no. 1 (June 2008). America: History and Life with Full Text, EBSCOhost (accessed April 27, 2016).

"Cover plan to be used for downed U-2 flight." Office of the Staff Secretary, Box 15, May 2, 1960. https://www.eisenhower.archives.gov/research/online_documents/u2_incident.html

Dawidowicz, Lucy S. "'Anti-Semitism' and the Rosenberg Case: The Latest Communist Propaganda Trap." Commentary 13 (1952): 41. http://search.proquest.com.libproxy.temple.edu/docview/1290193563?acc

Markowitz, Andy. "Rosenberg, Julius (1918-1953), and Ethel Rosenberg (1915-1953)." In Culture Wars in America: An Encyclopedia of Issues, Viewpoints, and Voices, edited by Roger Chapman and James Ciment. London: Routledge, 2013. search.credoreference.com.libproxy.temple.edu/content/entry/sharpecw/rosenberg_julius_1918_1953_and_ethel_rosenberg_1915_1953/0.

Nathan, James A. "A Fragile Détente: The U-2 Incident Re-examined." Military Affairs 39, no. 3 (October 1975): 97-104. America: History and Life with Full Text, EBSCOhost (accessed April 28, 2016).

Reklaitis, George. "Hiss, Alger." In Encyclopedia of Intelligence & Counterintelligence, edited by Rodney P. Carlisle. London: Routledge, 2005. http://search.credoreference.com.libproxy.temple.edu/content/entry/sharpint/hiss_alger/0

Rife, Jamie. "Powers, Francis Gary." In Encyclopedia of Intelligence & Counterintelligence, edited by Rodney P. Carlisle. London: Routledge, 2005. http://search.credoreference.com.libproxy.temple.edu/content/entry/sharpint/powers_francis_gary/0

"State Department press release #249 concerning U-2 incident, May 6, 1960." State Department, May 6, 1960. https://www.eisenhower.archives.gov/research/online_documents/u2_incident/5_6_60_No249.pdf


"Mittler zwischen Hirn und Händen" ("Mediator Between Brain & Hands"): Women and Proto-Fascist Ideology in Metropolis
Evron Clay Hadly

Metropolis (1927) remains one of the most influential and scrutinized science-fiction films of all time. Past scholarship has analyzed the problematic ending to the film's conflict between labor and capital, while other works have looked at how the film constructs and presents the relationship between women, sexuality, and technology. However, these two analytical approaches have often remained separate and fallen short of articulating a more cohesive argument regarding the actual ideological message of the film. This paper seeks to further investigate the proto-fascist elements that other scholars have analyzed and uncovered in this work. I argue that the proto-fascist message of Metropolis lies in the fact that the film places women as simultaneously central to and outside of the class conflict. This relationship is depicted – both visually and on a textual level – through the character of Maria. Therefore, the proto-fascist elements of Metropolis lie not only in its depiction of class conflict, but in how the film portrays the relationship between women and that conflict.

The film is based on a novel written by Thea von Harbou (1888-1954), which was published in 1926. However, work on the concepts behind Metropolis had begun earlier in the decade. Fritz Lang (1890-1976) – von Harbou's then husband and future director of the film version of Metropolis – had first been inspired to create this story after a visit to New York City in 1924.1 Lang, who had been trained as an architect and an actor before turning to directing, was fascinated with trying to (re)create a futuristic city, and Metropolis was the result of this aesthetic vision. However, Lang was just as horrified by the technology displayed in these "future" cities as he was fascinated by it. Again, this tension is reflected in the film itself. The German film theorist Anton Kaes notes that Metropolis creates "images that fetishize technology even as they display its cataclysmic power."2 It is Lang's seemingly contradictory representation of technology that creates the main aesthetic tension of the film. On a purely visual level – in the set design and composition of shots – Metropolis is representative of the relationship between Expressionism and Neue Sachlichkeit (New Objectivity). Neue Sachlichkeit was a reaction against Expressionism; it embraced technology and modernity rather than shunning them. The push-and-pull between these two aesthetic movements dominated Weimar artistic and intellectual spaces in the 1920s. Andreas Huyssen, in his classic interpretation of the film, argues that both these movements are represented in the film, one as utopian and the other as dystopian. The fetishized images of technology are representative of the "machine-cult" of Neue Sachlichkeit, while the images of technology as repressive and oppressive are representative of a more Expressionist approach.3 Another film theorist, R. L. Rutsky, takes Huyssen's critique further, arguing that both visions of technology presented in Metropolis are dystopian, and that only through mediation can these two conflicting, dystopian views of technology be brought to heel.4 Understanding the complicated relationship that the film has with technology is especially important because of the role the gendered Maschinenmensch plays in the climax of the film.

___

1. Anton Kaes, "Metropolis (1927): City, Cinema, Modernity," in Weimar Cinema: An Essential Guide to Classic Films of the Era, ed. Noah Isenberg (New York: Columbia University Press, 2009), 174.

2. Ibid., 187.

3. Andreas Huyssen, "The Vamp and the Machine: Technology and Sexuality in Fritz Lang's Metropolis," New German Critique 24-25 (Autumn 1981-Winter 1982): 223.

4. R. L. Rutsky, "The Mediation of Technology and Gender: Metropolis, Nazism, Modernism," New German Critique 60 (Autumn 1993): 3-4.

It is also important to take into account the very specific political context under which Metropolis was produced. German historian Thomas Mergel describes the Weimar Republic as: “[…] a political experiment whose existence was due less to the strength of the republican idea than to the weakness of the regime it succeeded. The revolution of 1918-19 was an upheaval, merely in political terms; it changed the parameters of the political system, but not […] those of the social order.”5

It follows, then, that Weimar society was full of deep-seated contradictions, especially in the public and political spheres. At the same time that some of the most progressive European legislation was being passed in regards to women’s rights, citizenship laws, etc., many parts of the country were further entrenching themselves in conservative and traditional values. It is this presence of underlying conservatism that would eventually allow fascist ideology to take hold in the 1930s. Furthermore, as Mergel argues, it was the high expectations placed on the regime, along with the structural and social limits of progress, that enabled such a drastic shift.6 This tension between progress and tradition was represented in public discourse and is why so many images we have of the Weimar Republic present a conflicting view. For the purpose of this paper, there are two anxieties of modernity that are important to focus on in the context of Metropolis. Firstly, it is important to note the level to which the German public was aware of the real possibility of a proletarian revolution and the role that awareness played in public discourse. The Russian Revolution had occurred only ten years prior to the premiere of Metropolis, and a failed socialist uprising had occurred in Germany itself in 1919.7 Anxieties about the revolutionary Left came from both the democratic center and the far right. Secondly, the rise of the concept – and threat – of die Neue Frau (the New Woman) is another example of these conflicting public discourses. Historian Barbara Hales presents this definition of the concept: “the term ‘New Woman’ […] referred to the independent woman who was participating in the workforce in large numbers.”8 It is important to note that the New Woman posed a threat to society not only because she was “independent” but because she was making decisions about her sexuality outside of the control of men. Thus, the term held a sexual connotation.

It is within this socio-historical context that the ideological message of Metropolis is developed. However, rather than arguing that Metropolis is some kind of premonition of fascism, this paper treats the film as a cultural text that is both informed by and informs the culture that produces it. Metropolis depicts “the modernist dimension in fascism and the fascist dimension in modernity.”9 In other words, Metropolis is not a fascist film (in that it was not produced by an explicitly fascist regime), but it does reveal the elements of society that the National Socialist German Workers’ Party would later be able to take hold of and manipulate for its specific ideological purposes. Metropolis’ message is one of a

___ 5. Thomas Mergel, “High Expectations – Deep Disappointment: Structures of the Public Perception of Politics in the Weimar Republic,” in Weimar Publics/Weimar Subjects: Rethinking the Political Culture of Germany in the 1920s, ed. Kathleen Canning et al. (New York: Berghahn Books, 2010), 194. 6. Ibid., 192. 7. Kaes, “Metropolis,” 186. 8. Barbara Hales, “Taming the Technological Shrew: Woman as Machine in Weimar Culture,” Neophilologus 94 (2010): 302. 9. Kaes, “Metropolis,” 174.



glossy compromise; the film wants the audience to agree to the concept of “equality” under fascism. This goal is achieved by introducing the concept of a “mediator” needing to exist between labor and capital. What is most dangerous about Metropolis is this “mediatory logic”10 and how women figure into that constellation. In Metropolis (and under fascism), women occupy a contradictory relationship to the state and the preservation of capital. On one hand, women are integral to the functioning and propagation of the state – as mothers and as workers in the “domestic” sphere – yet they are expected to exist completely outside of the state-making process. Additionally, women who do not conform to these domestic roles are seen as a threat to the state, and even those who do conform to those roles are seen as necessitating control.

Four scenes from Metropolis stand out as representative of the politics of the overall film, on both a visual and a narrative level. All four of these scenes involve a different depiction of Maria, reflecting the different roles women hold in relation to the broader themes of the film, such as technology, sexuality, and labor. In addition, these scenes proceed from various points in the narrative of the film and mark a transformation of Maria’s character. While at the beginning of the film she may pose a threat to the structures of power, by the end she has been subdued and completely removed from the state-making process.

The audience’s first introduction to Maria comes when she interrupts Freder in the pleasure gardens of Metropolis. She emerges from between the frames of two huge doors, surrounded by the children of the workers. Immediately, this imagery invokes a certain message about Maria’s character. She is presented as the good, virginal mother figure in comparison to the female prostitutes of the pleasure garden. Her introduction immediately forces Freder to stop in his tracks. He asks her what she is doing there, and she states that she is showing his “brothers” (the children) how the people above ground live. Fitting with the classic Oedipal reading of the film, cultural critic Gabriela Stoicea argues that, in this moment, “[Freder] is unsure about how to respond to Maria – as a lover, as a would-be father figure to the children, or as a son.”11 Regardless of the role that Maria takes on in relation to Freder, she serves as the catalyst for his awakening: prior to her appearance, he is unaware of the suffering of the workers beneath Metropolis. The second scene of significant narrative importance, involving Maria and her relationship to the workers’ rebellion, is the scene in which Rotwang and Fredersen spy on her preaching to the workers in the catacombs. Maria calls for a mediation between the workers and the overseers of the machines, which foreshadows the ending of the film. However, Fredersen rejects this ideology of mediation that he will later come to accept. But why? Huyssen argues that it is because the threat that Fredersen perceives has nothing to do with a workers’ revolution but, rather, with his fear of emotion, of nurturing, and of women not under his control.12 So, in light of the film’s ending, this scene frames the issue with the workers’ rebellion not as that of rebellion, but as that of women behaving “out of turn.”

The third scene sequence is perhaps the most important – the real Maria is captured and an evil, vamp-like robot doppelganger takes her place, seducing the workers and upper classes alike and wreaking chaos throughout

___ 10. Rutsky, “Mediation of Technology and Gender,” 19. 11. Gabriela Stoicea, “Re-Producing the Class and Gender Divide: Fritz Lang’s Metropolis,” Women in German Yearbook 22 (2006): 21-42. 12. Huyssen, “The Vamp and the Machine,” 229.


the entire city. This Maschinenmensch does exactly what Fredersen wants her to do, which is to destroy and subvert the machinations of the workers’ revolt. The Robot Maria remains subservient to the men who created her until she runs amok and begins to lead the workers below ground to destroy everything. At the very end of this sequence, she is captured and burned at the stake. The Robot Maria, like the real Maria, is acceptable to men only so long as she remains within their control.13 If she deviates from the path that they lay out for her, then she must be punished. In addition, this sequence of destruction presents two very interesting concepts: first, it attempts to convince the audience of the corrupting quality of unchecked female sexuality. The Robot Maria is not merely destructive; she is an explicitly sexual being who seduces the workers of Metropolis into destroying their own city and homes. Second, this scene sequence presents a very particular view of the masses. Not only are the masses visually represented as possessing a kind of mechanical sameness throughout the entire film, but, as cultural theorist Stefan Jonsson argues, in these scenes with Robot Maria, Metropolis articulates a very clear point: “[…] given the definition of the masses as opposed to individuality and given the view of individuality as the support of culture and knowledge, the conclusion follows automatically; a representation of the masses as disorderly and destructive, as an agent of leveling passion in need of discipline and guidance.”14 In other words, the masses are susceptible to corruption and require some kind of controlling factor – a mediator – to

preserve order. This view also supports Rutsky’s claims about the drive for “wholeness” represented in Metropolis and its fascist implications.15

Maria makes a very short appearance at the very end of the film. While in the beginning of the film there might have been a chance for her to take on the role of the “mediator,” the presence of the Robot Maria has completely foreclosed that possibility. Maria is returned to her maternal status, and Freder now supplants her as the mediator between his father (capital) and the foreman (labor).16 This mediation, again, brings up the tricky concept of “wholeness.” Through mediation, labor and capital are able to make amends with one another. In addition, this mediation is imbued with particular “feminine” qualities. It is, after all, a “heart” which must mediate between the head and the hands. But the mediation does not occur only between labor and capital; rather, as Rutsky argues, the mediation includes an overarching ideology, which is the mediation between rationality and emotion, femininity and masculinity.17 However, Maria – the only named female character of the film – exists completely outside of this paradigm. This further emphasizes the gender politics outlined earlier in this essay: under fascism, women are both integral to the functioning and (literal) propagation of the state, but are outside any of its decision-making processes. Furthermore, Rutsky argues that the mediation between these ideological concepts must take place through an individual who “combines the will of the father (the ‘skull of steel’) with the eternal-feminine spirit and

___ 13. Hales, “Taming the Technological Shrew,” 313. 14. Stefan Jonsson, “Authority Versus Anarchy: Allegories of the Mass in Sociology and Literature,” in Crowds and Democracy: The Idea and Image of the Masses from Revolution to Fascism (New York: Columbia University Press, 2013), 63. 15. Rutsky, “Mediation of Technology and Gender,” 11. 16. Hales, “Taming the Technological Shrew,” 313. 17. Rutsky, “Mediation of Technology and Gender,” 7.



emotions of the mother (the ‘soft heart’).”18 This is why the mediation at the end of Metropolis must occur through Freder and cannot occur through Maria. In addition, Freder’s ability to embody both of these contradictory, gendered concepts, creating a sense of natural wholeness, is what gives the film its proto-fascist bent, beyond even its problematically envisioned resolution of the class conflict.

In undertaking this analysis, I have hoped to develop a more nuanced and critical approach to this important cultural text. By uncovering and reflecting on the proto-fascist gender and class politics of the film, I have aimed not only to articulate a more cohesive position on the film as it existed in its particular cultural moment, but also to further discussion of its impact on modern texts. As a seminal work of science-fiction film, art, and literature, Metropolis continues to have relevance to contemporary discourses.

___ 18. Ibid., 21.

References
Hales, Barbara. “Taming the Technological Shrew: Woman as Machine in Weimar Culture.” Neophilologus 94 (2010): 301-306.
Huyssen, Andreas. “The Vamp and the Machine: Technology and Sexuality in Fritz Lang’s Metropolis.” New German Critique 24-25 (Autumn 1981-Winter 1982): 221-237.
Jonsson, Stefan. “Authority Versus Anarchy: Allegories of the Mass in Sociology and Literature.” In Crowds and Democracy: The Idea and Image of the Masses from Revolution to Fascism, 61-67. New York: Columbia University Press, 2013.
Kaes, Anton. “Metropolis (1927): City, Cinema, Modernity.” In Weimar Cinema: An Essential Guide to Classic Films of the Era, edited by Noah Isenberg, 173-191. New York: Columbia University Press, 2009.
Mergel, Thomas. “High Expectations – Deep Disappointment: Structures of the Public Perception of Politics in the Weimar Republic.” In Weimar Publics/Weimar Subjects: Rethinking the Political Culture of Germany in the 1920s, edited by Kathleen Canning, Kerstin Barndt, and Kristin McGuire, 192-210. New York: Berghahn Books, 2010.
Metropolis. Directed by Fritz Lang. 1927. Berlin: Kino International, 2010. Streaming.
Rutsky, R. L. “The Mediation of Technology and Gender: Metropolis, Nazism, Modernism.” New German Critique 60 (Autumn 1993): 3-32.
Stoicea, Gabriela. “Re-Producing the Class and Gender Divide: Fritz Lang’s Metropolis.” Women in German Yearbook 22 (2006): 21-42.





Jamaica: The Irish-Caribbean Connection Sabrina Wallace

In the mix of indigenous, Spanish, West African, South Asian, and Middle Eastern cultures, it may be difficult to note the remnants of a surprising Irish influence on the island of Jamaica. Today, the impact of Irish immigration in the seventeenth century can be seen in names: the Jamaican parishes of St. Andrew and St. Catherine, the communities of Irish Town and Dublin Castle, and popular surnames like Burke, McCarthy, and O’Connor.1 With recent popular history books such as Sean O’Callaghan’s To Hell or Barbados (2000) or Jordan and Walsh’s White Cargo: The Forgotten History of Britain’s White Slaves in America (2007), the Irish slave narrative has become a point of contention among writers and historians who debate the difference between slavery and servitude.2 In studying the circumstances of Irish immigrants to Jamaica, a push-pull dynamic is apparent, encompassing both involuntary capture and voluntary indentured servitude. The unique Irish-Jamaican colonization connection established throughout the seventeenth century encapsulates the poor treatment the Irish received under English rule as second-class citizens; at the same time, the ability of Irish immigrants to advance socially despite indentured servitude and enslavement demonstrates the privileges afforded to Irish immigrants in Jamaica due to their white racial identity.

Irish presence in Jamaica begins with the English colonization of the Americas, particularly Barbados and the other West Indies islands. In the early 1600s, Irish Catholics began crossing the Atlantic to mainland American colonies – their cost of travel paid through a contract of indentured servitude lasting up to seven years. These emigrants may have been motivated to leave home for the New World by the recent Tudor conquest of Ireland and the resulting wars, rebellions, and pressure to conform to an Anglo-Protestant culture.3 However, the West Indies eventually became the focal point of Irish emigration in the seventeenth century as “Irish-born governors of the Caribbean islands encouraged their fellow countrymen to emigrate and…the predominant southern Irish trade routes brought Catholics to colonies dominated by plantation agriculture.”4 As British colonization of the West Indies began in the early 1600s, Barbados was settled by the English in 1627 with the help of indentured servants (Irish included) and was established as an important port and base for English colonial territories in the Caribbean.5

While British colonization advanced abroad, Ireland was entering Cromwellian control. Oliver Cromwell’s arrival in Ireland in 1649 brutally ended the period of Confederate Ireland that had followed the Ulster rebellion of 1641; his forces swiftly put down resistance, ushering in the

___ 1. Rob Mullally, “One Love: The Black Irish of Jamaica,” The Wild Geese, January 19, 2013, accessed December 1, 2016, http://thewildgeese.irish/profiles/blogs/one-love-the-black-irish-of-jamaica-part-1-of-3-to-hell. 2. Liam Hogan, “The Myth of ‘Irish Slaves’ in the Colonies,” University of Limerick, Academia.edu, October 31, 2014, accessed December 1, 2016, https://www.academia.edu/9475964/The_Myth_of_Irish_Slaves_in_the_Colonies. 3. James E. Doan, “The Irish in the Caribbean,” ABEI Journal: The Brazilian Journal of Irish Studies 8 (2006): 105, September 9, 2011, accessed December 1, 2016, http://www.academia.edu/5074224/The_Irish_in_the_Caribbean. 4. Ibid. 5. Ibid.


“imposition of godly rule” in Ireland.6 Following his success at the Siege of Drogheda, Cromwell wrote of the Irish Royalists: “when they submitted, their officers were knocked on the head; and every tenth man of the soldiers killed; and the rest shipped for the Barbadoes” – thus beginning the first wave of involuntary Irish migration to the West Indies.7 Through the Act of Settlement (1652) and the Act of Satisfaction (1653) under Cromwell, Catholic lands were seized and many Irish Catholics were pushed into the poorer, more isolated lands of Connacht.8 Royalist soldiers, Catholic priests, teachers, and bards were taken as political prisoners.9 Irish wanderers with no land – the poor and the homeless – “were legally defined by the state as social undesirables and transported to the West Indies to help meet the insatiable demand of sugar planters for labor.”10 Between 1648 and

1655, over 12,000 Irish political prisoners were sent to Barbados.11 The arrival of Irish laborers (both volunteer indentured servants and those forced into bondage) coincided with British capture and control of Jamaica. Cromwell’s plan to seize the Spanish West Indies was known as the ‘Western Design’; when the British failed to take control of Santo Domingo in Hispaniola, they turned to Spanish-controlled Jamaica as an alternative concession to bring back to Cromwell.12 Taking control of Jamaica in 1655 with relative ease against the Spanish, the British realized “they lacked workers to exploit their conquest.”13 Soon, volunteer and imprisoned Irish laborers from the surrounding islands of Barbados and the Leewards came to work the sugarcane fields of Jamaica. In its very beginnings, the population of Jamaica was a white majority (3,000 whites and 500 blacks in 1660) since “the Dutch and Portuguese dominated the slave trade in the early seventeenth century and most white landowners in Barbados and the neighboring islands were unable to purchase slaves of African origin.”14 It is at this time that the laboring Irish of Jamaica may be most closely associated with slaves; in 1667, an account by the English adventurer John Scott revealed that the Irish were “poor men, that are just permitted to live,…derided by the Negroes, and branded with the Epithite of

___ 6. Thomas Bartlett, Ireland: A History (Cambridge: Cambridge University Press, 2010), 128. 7. “For the Honourable William Lenthall, Esquire, Speaker of the Parliament of England: These,” Oliver Cromwell to William Lenthall, September 17, 1649, in The Oliver Cromwell Website, accessed December 1, 2016, http://www.olivercromwell.org/Letters_and_speeches/letters/Letter_index.htm. 8. Bartlett, Ireland: A History, 130. 9. Rob Mullally, “One Love: The Black Irish of Jamaica.” 10. Hilary McD. Beckles, “A ‘riotous and unruly lot’: Irish Indentured Servants and Freemen in the English West Indies, 1644-1713,” The William and Mary Quarterly 47, no. 4 (1990): 503-22, doi:10.2307/2937974. 11. Rebecca Tortello, “The Arrival of the Irish,” Pieces of the Past, December 1, 2003, accessed December 1, 2016, http://old.jamaica-gleaner.com/pages/history/story0058.htm. 12. Rob Mullally, “One Love: The Black Irish of Jamaica.” 13. Ibid. 14. Ibid.


white slave.”15 Irish laborers worked twelve or more hours a day under the hot sun, without proper clothing or shoes, and under the scrutiny of English overseers; the poor conditions for the working Irish were further exacerbated by religious differences with Anglo-Protestant masters that “increased the abuse.”16 It is no surprise that after experiencing or hearing about the working conditions in Jamaica and across the Caribbean, “many native Irish associated emigration to the New World with banishment and slavery.”17 With greater difficulty attracting Irish and British servants, by 1670 the West Indies were again facing a white labor shortage. The following letter by the governor of Barbados in 1667 represents the “typical English opinion” on the implications of a white labor shortage while maintaining a prejudice against Irish servants:18

There yet remains that I acquainte your Lordships with the great want of servants in this island, which the late war hath occasioned. If labour fayles here, His Majesty’s customes will at home; if the supply be not of good and sure men the saifety of this place will always be in question; for though there be noe enemy abroad, the keeping of slaves in subjection must still be provided for. If your Lordship shall offer a trade with Scotland for transporting people of that nation hither and prevent any excess of Irish in the future, it will accommodate all the ends propounded.19

Despite the second-class treatment that the Irish faced (as in other Caribbean colonies and at home in Ireland), Irish laborers and servants still fared better than the black chattel slaves who were becoming the majority in Jamaica after the Royal African Company was given a British royal charter and the English West Indies slave trade was formalized in 1672.20 In the late seventeenth century, “even former servants could do well,” as evidenced by the Irish making up 10% of landowners in Jamaica in 1670, with some even owning and renting out slaves of their own.21 In 1729, 20% of the Jamaican colonial assembly was composed of men with Irish surnames, further evidence of the opportunity for upward mobility among Irish servants who had fulfilled their indenture contracts.22 Aiding the social mobility of Irish servants, Jamaica adopted Slave and Servant Acts in 1664, modeled on the Barbados Slave Code of 1661, which legalized the system of colonial white supremacy in Jamaica to the benefit of Irish servants and laborers.23 Between the time these slave codes were

___ 20. "British Involvement in the Transatlantic Slave Trade.," The Abolition Project, 2009, accessed December 1, 2016, http:// abolition.e2bn.org/slavery_45.html. 21. Doan, “The Irish in the Caribbean,” 109.

___

22. Nini Rodgers, “The Irish in the Caribbean 1641-1837: An Overview,” Irish Migration Studies in Latin America, November 11, 2007, accessed December 1, 2016, http:// www.irlandeses.org/0711rodgers2.htm.

15. John Scott quoted in Kerby A. Miller, Emigrants and Exiles: Ireland and the Irish Exodus to North America (New York: Oxford University Press, 1985), 144. 16. Doan, “The Irish in the Caribbean,” 107.

23. Edward B. Rugemer, “The Development of Mastery and Race in the Comprehensive Slave Codes of the Greater Caribbean during the Seventeenth Century, The William and Mary Quarterly, 3rd ser., 70, no. 3 (July 2013): 15, accessed December 1, 2016, http:// barbadoscarolinas.org/PDF/Making %20Slavery%20English%20by %20Rugemer.pdf.

17. Ibid. 18. Hilary McD. Beckles, "A ‘riotous and unruly lot,’” 509. 19. Francis Willoughby quoted in Hilary McD. Beckles, "A ‘riotous and unruly lot,’” 509. 27


enacted and a census taken in 1673, the black population in Jamaica had increased from 15% to over 50% of the total population.24 In 1681, whites in Jamaica were outnumbered by Africans 3 to 1.25 Between 1690 and 1713, the white population in Jamaica decreased by 3,000 as “the growth of a slave-based economy steadily diminished economic opportunities for freed servants in the West Indies; thus, during the 1700s most Irish Catholic emigrants journeyed to the mainland colonies” of North America.26 The Slave and Servant Codes of 1664 defined the system of slavery in Jamaica, “lifting the stature of servants while depressing that of slaves.”27 In an effort to attract more white servants (who were eventually replaced by the exponentially growing slave trade), protections were extended: white Irish servants were allotted more food rations and clothing, and a ban against whipping white servants was instituted.28 In addition, the Jamaican Slave and Servant Codes did something novel in legislating servitude: “previous Servant Acts had consistently used the term ‘Christian’ to refer to European indentured servants, but Jamaica’s 1681 Servant Act dropped the term ‘Christian’ and added the word ‘white,’” definitively racializing the system of slavery in Jamaica and defining the distinction between black slaves and the more privileged white servants, the Irish included, even as they continued to face economic inequity compared to their English oppressors.29

Throughout the seventeenth century, the position and nature of Irish emigrants in Jamaica had changed: “in 1678 the majority of these Irish people may have been servants, bonded and free, but in 1729 they had disappeared either by dying, emigrating elsewhere or becoming smallholders.”30 In 1833, with the abolition of slavery in Jamaica, there was a brief resurgence in Irish immigration and indentured servants heading to the West Indies. In an episode of repeating history, Ireland again feared Cromwellian-style enslavement; with slaves now free, demand for labor in the lowlands of Jamaica resulted in bounties for European immigrants and “the institution of ships like the Robert Kerr, known as ‘man-traps’ and sub-agents who wandered into quiet Irish towns and attracted people with the promise of free passage, high wages and the hope of bettering their lives.”31 Ultimately, though, this second wave of Irish immigration to Jamaica failed, and the labor demand was instead met by the Maltese, free African-Americans, and Asians.32 In the time span from Jamaica’s initial English colonization in 1655 to the last shipment of Irish emigrants in 1841, it is estimated that somewhere between 30,000 and 80,000 Irish people were brought to Jamaica.33


Much like Irish immigration patterns to North America, Irish immigration to the West Indies, particularly to Barbados and Jamaica, ebbed and flowed with push and pull factors like Cromwellian rule in Ireland, demand for white workers, and the development of the English slave trade. Some of the Irish immigration was voluntary, as Irish men and women signed indentured servitude contracts for a chance at a better life. Other emigrants were captured, deemed political prisoners or lowly Irish undesirables by the English gentry, and forced into labor. Together, these people travelled across the Atlantic and spread the Irish diaspora to Jamaica, leaving a faint but uniquely Irish cultural trace marking their place in history.

___ 24. Ibid., 16. 25. Ibid., 18. 26. Doan, “The Irish in the Caribbean,” 112. 27. Edward B. Rugemer, “Mastery and Race,” 19. 28. Ibid. 29. Ibid. 30. Nini Rodgers, “The Irish in the Caribbean 1641-1837: An Overview.” 31. Rebecca Tortello, “The Arrival of the Irish.” 32. Ibid. 33. Ibid.



References
Bartlett, Thomas. Ireland: A History. Cambridge: Cambridge University Press, 2010.
Beckles, Hilary McD. “A ‘riotous and unruly lot’: Irish Indentured Servants and Freemen in the English West Indies, 1644-1713.” The William and Mary Quarterly 47, no. 4 (October 1990): 503-22. Accessed December 1, 2016. doi:10.2307/2937974.
“British Involvement in the Transatlantic Slave Trade.” The Abolition Project. 2009. Accessed December 1, 2016. http://abolition.e2bn.org/slavery_45.html.
Doan, James E. “The Irish in the Caribbean.” ABEI Journal: The Brazilian Journal of Irish Studies 8 (2006): 105-16. September 9, 2011. Accessed December 1, 2016. http://www.academia.edu/5074224/The_Irish_in_the_Caribbean.
“For the Honourable William Lenthall, Esquire, Speaker of the Parliament of England: These.” Oliver Cromwell to William Lenthall. September 17, 1649. In The Oliver Cromwell Website. Accessed December 1, 2016. http://www.olivercromwell.org/Letters_and_speeches/letters/Letter_index.htm.
Hogan, Liam. “The Myth of ‘Irish Slaves’ in the Colonies.” University of Limerick. Academia.edu. October 31, 2014. Accessed December 1, 2016. https://www.academia.edu/9475964/The_Myth_of_Irish_Slaves_in_the_Colonies.
Mullally, Rob. “One Love: The Black Irish of Jamaica.” The Wild Geese: The History of The Irish Worldwide. January 19, 2013. Accessed December 1, 2016. http://thewildgeese.irish/profiles/blogs/one-love-the-black-irish-of-jamaica-part-1-of-3-to-hell.
Rodgers, Nini. “The Irish in the Caribbean 1641-1837: An Overview.” Irish Migration Studies in Latin America. November 11, 2007. Accessed December 1, 2016. http://www.irlandeses.org/0711rodgers2.htm.
Rugemer, Edward B. “The Development of Mastery and Race in the Comprehensive Slave Codes of the Greater Caribbean during the Seventeenth Century.” The William and Mary Quarterly, 3rd ser., 70, no. 3 (July 2013): 1-34. Accessed December 1, 2016. doi:10.5309/willmaryquar.70.3.0429.
Tortello, Rebecca. “The Arrival of the Irish.” Pieces of the Past. December 1, 2003. Accessed December 1, 2016. http://old.jamaica-gleaner.com/pages/history/story0058.htm.



II

UNDERSTANDING OUR PRESENT

Intimacy & Sexuality for Adolescents & Adults with Autism Spectrum Disorder
Sydney Victoria Butler, Senior Human Development & Community Engagement Major

Discourse & Power: Comparative Analysis of 2016 Campaign Rhetoric
Hazim Hardeman, Senior Strategic Communication Major, Sociology Minor

Conflict Resolution & Reconciliation in Post-Genocide Rwanda
Nancy Dordal, Junior Anthropology Major, Spanish & Chemistry Minor

The Flatiron Building: An American Icon & the Original American Skyscraper
William Kowalik, Junior American Studies Major, Art History Minor

Going Nuts Over Betel-Chewing: How the Areca Nut Has Started a Cultural War in South-East Asia
Alexander Voisine, Junior Global Studies & Spanish Major



Intimacy & Sexuality for Adolescents & Adults with Autism Spectrum Disorder Sydney Victoria Butler

Intimacy and sexuality are two taboo subjects seldom discussed openly. When these subjects involve individuals with Autism Spectrum Disorder (ASD), the discussion is even more unwelcome to society. In understanding intimacy and sexuality for individuals with ASD, it is crucial that the facts, myths, and realities be understood as well. Awareness of these aspects will open the door to accepting that intimacy and sexuality are possible and attainable for individuals with ASD. Before examining research on the presence of intimacy in individuals with ASD, I must define what intimacy is. Laurence Steinberg, a professor and researcher on adolescence, defines intimacy as “the psychosocial domain concerning the formation, maintenance, and termination of close relationships” (2014, p. 317). For most typically developing adolescents, intimacy is first experienced in infancy through attachment to the mother. As infants grow into adolescents, they begin to understand cognitively the idea of friendships and so on. Some theorists believe that attachment as an infant is an indication of future intimacy with people (Steinberg, 2014, p. 322). For individuals with ASD this attachment may be interrupted or normal at birth, but as the adolescent grows into adulthood these attachments, or lack thereof, may not influence intimacy as strongly as theorists believe. For individuals with ASD, intimacy may be different, but not because of a lack of interest in the subject. While some may assume that individuals with ASD are disinterested in the idea of being intimate with another person in a friendly, romantic, or sexual relationship, there is research that proves otherwise.

In a 2008 study by Mueller, Schuler, and Yates, 18 participants were interviewed about their experiences of their inner worlds and their recommendations for social supports for individuals like themselves. Each participant had previously been diagnosed with Asperger’s or ASD and was at least 18 years of age. Ages at diagnosis ranged from 16 to 58, while ages at the time of participation ranged from 18 to 62. The participants included both women and men with various levels of education. In this study, 15 participants expressed “a longing for greater emotional intimacy—both romantic and otherwise” (Mueller et al., 2008, p. 179). This finding is contrary to what some may believe about individuals with ASD or disabilities in general. Wedmore describes another similar study, conducted by Bauminger, Schulman, and Agam in 2004, in which 16 adolescents with HFASD were compared to 16 average adolescents in their “perception of friendship, friendship qualities, lack of social relationships, and self” (2011, p. 12). Similar to the findings of the first study, it was shown that despite the difference in how close friendships are perceived by individuals with HFASD versus individuals without it, these relationships are just as important to them (Wedmore, 2011, p. 12).

In addition, there are numerous studies on the topic of adults with ASD or intellectual disabilities and the desire for and interest in intimacy. One of these is a 20-year longitudinal study by Meghan A. Farley and ten other researchers, conducted in 2009 with 41 parents of individuals with ASD on the status of their children’s intimate relationships, if any (Wedmore, 2011, p. 13). Another study included a questionnaire given to 76 individuals in 2006 by Eline M. Sieblink and three others on the “knowledge, attitudes, experience, and needs” of individuals with intellectual disabilities (Sieblink et al., 2006). The Farley et al. study found that “of the sample [of parents] whose children were not in romantic relationships, 44% believed that their son or daughter would like to be in a romantic relationship” (Wedmore, 2011, p. 13). The Sieblink study found that some participants expressed that they want to pursue a “committed relationship without immediate need for a sexual relationship while others revealed the desire to have a sexual relationship without the need for a steady relationship” (Wedmore, 2011, p. 13). Whether a committed relationship is desired without sexual intimacy, or sexual intimacy is desired without a committed relationship, the desire for some form of intimacy is real.

In a 2011 study conducted by Haley V. Wedmore, 9 participants, including 5 individuals with ASD, 1 significant other of an individual with ASD, and 3 caregivers of individuals with ASD, were surveyed on what makes intimacy work, what is challenging, and how intimacy is experienced. Wedmore’s findings about ‘Autism Characteristics’ that make intimacy challenging were that some of the participants reported being overstimulated, overwhelmed, and unable to process too much information at once simply due to their diagnosis, which caused a rift when intimacy entered the picture (Wedmore, 2011, p. 67). There were certain stimuli that would cause a participant to become overwhelmed, thereby interfering with the process of developing intimacy if that stimulus were present. For one participant, who also struggled with anorexia, this stimulus was food (Wedmore, 2011, p. 67). For another participant, physical touch presented a sensory issue for her boyfriend with ASD. The participant expressed that her boyfriend was hypersensitive, which caused some areas of the body to produce no physical reaction and others to be painful if touched (Wedmore, 2011, p. 69). In addition, one participant expressed that he had trouble understanding the cues of his significant other because of social communication deficits, and being attentive to her needs because of his fixation on tasks (Wedmore, 2011, p. 70). In discussing what individuals with ASD were attracted to, in the category of how intimacy is experienced, respondents expressed that there were particular aspects of their partner that they admired, such as a specific shirt because of the images on it. One participant enjoyed that her partner was well informed about Latin and was a hard worker (Wedmore, 2011, p. 78). No matter how it happens, individuals with ASD do have these experiences and feelings.

From these studies it is clear that intimacy is present in the lives and romantic relationships of people with ASD. Further evidence will show that this desire can also manifest as sexual behavior with others.

In addition to being viewed as not intimate, adolescents and adults with ASD are also viewed by society as not being sexually attracted to other individuals, or as not having sexual desires. Since it has been established that individuals living with ASD are capable of desiring intimacy, it is important to move further and show that they also have sexual desires. To begin, Steinberg describes sexuality as including an understanding of sex organs, an awareness of sexual arousal, and the identification of sexual orientation (Steinberg, 2014, p. 351). Wedmore writes that the reason some individuals with ASD might be seen by society as asexual or lacking sexual desire is because of caregivers. She describes that “some caregivers are not supportive of the desire of people with ASD to date or to experiment sexually with other people. Out of protection, caregivers may avoid the topic of sex or expressively set rules against it” (Wedmore, 2011, p. 13). This idea that caregivers want to protect their loved ones from any negative aspects of exploring sexuality is not formed with bad intent, but it does deprive individuals of the right to act as whole human beings. Another reason for society’s view of individuals with ASD is the lack of sexual education provided to this community. Wedmore also notes that, according to a study done by Kalyva (2009), only 12.5 percent of teachers reported that they would feel confident providing sex education to children with Autism (Wedmore, 2011, p. 14). Wedmore’s findings about societal messages and pressures in her 2011 study were that several participants felt that individuals without ASD made sexual talk and experiences awkward for them by operating under the notion that individuals with ASD should not engage in or be associated with sex (2011, p. 77). This combined protection from caregivers and lack of sex education from both caregivers and teachers leaves individuals with ASD without a proper understanding of their sexuality.

The myths that have surfaced over the years about Autism and sexuality have affected individuals with ASD by shaping their ideas of societal assumptions about them, ranging from “…we don’t have feeling like ‘normal’ people. —Wolfie (@wolfie74)” to “…we’re incapable of feeling love or higher emotions at all. —TG (@outoutout)” to “…our people can’t, won’t, don’t want to or will never have a sexual or romantic relationship, so why teach consent? —Mand Hoskins (@Mandlovesgeeks)” (Autism Now & ASAN, 2013, p. 106). These are not only myths amongst individuals without ASD; individuals with ASD know that society holds these assumptions. In turn, these negative views towards the sexuality of individuals with ASD can lead to individuals expressing themselves in inappropriate manners, with “an increased risk of sexually perpetrating, being sexually perpetrated against, or practicing unsafe sex” (Wedmore, 2011, p. 14). Wedmore also writes, “Gougeon (2009, p. 277) argues that the current sexuality education curriculum or rather, the ‘ignored curriculum of sexuality,’ promotes sexual incompetence and leads to exclusion, legal issues and denial of full citizenship/inclusion for people with disabilities” (2011, p. 15). It would be beneficial to both individuals with ASD and individuals without ASD if sexuality education curriculums were taught to individuals with ASD in order to promote a healthy understanding and practice of sexuality.

In the Sieblink et al. study of 2006, the four categories of results found from the 28-question questionnaire were “Sexual Knowledge”, “Sexual Attitudes”, “Sexual and Relational Experience”, and “Sexual and Relational Needs”. The findings for the “Sexual Knowledge” questions were that 93% of participants were aware that women have the possibility of pregnancy after sex with a man, 76% were aware of the risk of developing an STD from sexual activity, 59% could identify a condom in a picture, and only 51% could identify a drawing of an individual masturbating (Sieblink, 2006, p. 289). In the category of “Sexual Attitudes”, participants had a positive attitude towards heterosexual kissing, hugging, and intercourse (although slightly less positive for intercourse), a close to negative attitude towards homosexuality, and a fairly neutral attitude towards “impersonal sexual activities”, which were defined as masturbation, watching adult movies, and using prostitution (Sieblink, 2006, p. 289). In terms of “Sexual and Relational Experience”, the mean percentage of participants with kissing, hugging, and boyfriend/girlfriend relationship experience was in the mid to high 70s. The mean intercourse experience percentage was 46% for men and 62% for women. For “impersonal sexual activities”, the masturbation mean was 70% for men and only 32% for women, the mean percentage for watching adult movies was 2% higher for both men and women than the masturbation mean, and the prostitution mean was 32% for men and 0% for women (Sieblink, 2006, p. 289). Lastly, in measuring the “Sexual and Relational Needs” of the participants, the mean percentage for kissing was in the high 80s, followed by intercourse, followed by masturbation (Sieblink, 2006, p. 290). The mean percentage for hugging was around the same as for kissing, and having a boyfriend or girlfriend was the highest need overall.

It is evident from these findings that these individuals had little knowledge of advanced sexual content; positive attitudes towards sexual activities; a moderate to high level of experience with basic displays of intimacy such as hugging, kissing, and relationships; and a moderate to high need for sexual and relational intimacy with a boyfriend or girlfriend. The desire for, experience with, and need for sexual and relational behaviors among individuals with ASD demonstrates that there truly is a need for proper sexuality education to remedy the lack of knowledge of advanced sexual content that the Sieblink study found.

Van Bourgondien, Reichle, and Palmer conducted a study in 1997 with 89 adults with ASD who lived in group homes throughout North Carolina. The participants were randomly selected from a total of 142 group homes in North Carolina that served individuals with ASD or individuals with developmental disabilities. The caregivers who responded were given a questionnaire about the sexual behaviors of the individuals under their care. This study found that all of the participants were engaging in sexual activities, with masturbation being the most prevalent behavior. The results showed that 68% of the participants masturbated. Of that 68%, 66% achieved an orgasm and 16% showed frustration at the lack of success from masturbation. Van Bourgondien et al. write, “The signs of frustration reported by staff members included indicators such as anxiety, agitation, or aggression toward self or others when their masturbation was interrupted as well as sounds of frustration and repeated attempts to go to the bedroom or bathroom” (1997, p. 5).

While it may be difficult for some individuals with ASD to feel pleasure from sexual activity, this is not true for all. The Autism NOW Center and the Autistic Self Advocacy Network wrote a handbook for people with ASD on sexuality and relationships. In this handbook the notion that orgasms experienced by individuals with ASD are different from those experienced by individuals without ASD is questioned. It is noted, “Sex and intimacy between couples where one or both partners have ASD is highly influenced by the sensory system. Sensations that trigger a strong reaction, whether painful or pleasurable, tend to be amplified. This has been proven in day-to-day life activities, and it can easily apply to orgasms” (Autism Now & ASAN, 2013, p. 45). The handbook then goes on to show that it is possible for individuals with ASD to create an environment where an orgasm can be achieved effectively despite the associated challenges. Some of the sensory aspects that need to be accounted for include aroma, air temperature, lighting, bedding, body moisture, and the frequency and intensity of background noise (Autism Now & ASAN, 2013, p. 45). Even with the results of Van Bourgondien et al., and the challenges associated with achieving and experiencing pleasure, both of these sources as well as the Wedmore study of 2011 demonstrate that while the process may be different for individuals with ASD, it is possible and it happens.

To conclude, individuals with ASD have been shown to have the ability and desire to be intimate, romantic, and sexual. The finer aspects of these experiences, such as sensory processing, experiencing arousal or orgasm, processing information in romantic relationships, and being touched, happen differently for some individuals with ASD, but they are definitely present. As these topics are discussed more in association with individuals with ASD, they will become normalized in society. If myths are dispelled, one day there will be space for individuals with ASD to access sexual education and to be accepted when engaging in sexual behaviors or romantic relationships.


References
Autism NOW & Autistic Self Advocacy Network (2013). Relationships & Sexuality: A Handbook for and by Autistic People. Washington, DC: Elesia Ashkenazy & Melanie Yergeau.
Mueller, E., Schuler, A., & Yates, G. B. (2008). Social Challenges and Supports from the Perspective of Individuals with Asperger Syndrome and Other Autism Spectrum Disabilities. Autism (Sage Publications and the National Autistic Society), 12(2), 173-190.
Sieblink, E. M., De Jong, M. D. T., Tall, E., & Roelvink, L. (2006). Sexuality and People with Intellectual Disabilities: Assessment of Knowledge, Attitudes, Experiences and Needs. Mental Retardation, 44(4), 283-294.
Steinberg, L. (2014). Adolescence. New York: McGraw-Hill.
Van Bourgondien, M. E., Reichle, N. C., & Palmer, A. (1997). Sexual Behavior in Adults with Autism. Journal of Autism and Developmental Disorders, 27(2).
Wedmore, H. V. (2011). Autism Spectrum Disorders and Romantic Intimacy. Graduate Theses and Dissertations, Paper 10143.



Discourse & Power: Comparative Analysis of 2016 Campaign Rhetoric Hazim Hardeman

The phrase “mere rhetoric” in commentary on political discourse has always seemed intellectually lazy and sometimes utterly irresponsible. Even in this political environment where incendiary and sophomoric comments abound, the discourse of political figures should not be dismissed offhandedly. This sentiment is informed by a perspective that recognizes that, in large part, linguistic expressions contain a plethora of different meanings. One way to move beyond the dismissal and debasement of language is to accept the idea that any metalinguistic commentary must proceed from the premise that language is never neutral. This means, essentially, that language, which is acquired through socialization, is not impervious to the values, norms, beliefs, prejudices, and ideologies of the society from which it was developed. Russian philosopher Mikhail Bakhtin renders this proposition axiomatic when he writes in lucid prose, “it is not, after all, out of a dictionary that the speaker gets his [or her] words” (Bakhtin, 1934, p. 482). Instead, it is from the mouths and minds of others that one acquires language. Marked by sociocultural features, certain linguistic expressions are valorized while others are debased in a given social context. This is to say that power, ubiquitous and variegated, pervades linguistic expression. Scott Kiesling delineates this in his sociolinguistic analysis of fraternities, where he lays out a typology of power (defined as the ability to modify action) including physical, economic, knowledge, structural, nurturant, demeanor, and ideological power (Kiesling, 1997, p. 410). With the framework I have set, I will perform a critical analysis of the discourse of the two 2016 presidential candidates from the Democratic and Republican parties, Hillary Clinton and Donald Trump, with an eye out for foundational linguistic anthropological concepts such as language ideology, indexicality, and practice, and their relation to Kiesling’s typology of power.

In what was perhaps the most shocking moment and a cause célèbre of the 2016 campaign season, business magnate turned presidential candidate Donald Trump launched a forceful, xenophobic diatribe against Mexicans. At a rally, Trump proclaimed from his pulpit that when Mexico sends immigrants to the U.S., “they’re bringing drugs, they’re bringing crime, they’re rapists” (CNN, 2015). He continued by supposing that “some” may be good people (CNN, 2015). This was characteristic of Trump’s campaign rhetoric – not necessarily the outright xenophobia, but the persistent “otherization” of marginalized groups. Critical to this is his anaphoric use of the word “they’re.” By repeating “they are,” Trump creates, and emphasizes, the distinction between us and them – good and bad. Rhetorical theorist Kenneth Burke calls this an antithesis type of identification, whereby the speaker identifies persons as enemies of the audience (Borchers, 2006). This linguistic practice by Trump is pernicious and shot through with power, as it (re)constructs a society in which Mexicans live under alterity as a result of his sweeping and wanton demonization.

Palestinian intellectual Edward Said, in Representations of the Intellectual, warns against the general, collective language spewed by politicians. Said posits that we, them, and they are all watchwords employed by the politician who seeks to “consolidate ‘our’ identity as beleaguered and at risk” (Said, 1994, p. 32). Although Trump’s linguistic practice perpetuates this feeling of being beleaguered and at risk, he did not create it; he merely exploited it. Trump recognized the pervasive fear and economic vulnerability facing many Americans and harnessed it into an assailing ideological power. As Kiesling states, “ideological power is the process whereby ways of thinking about the world are naturalized into a community’s behavior” (Kiesling, 1997, p. 410). The community behavior that Trump


appeals to is the tendency to impute individual social failings to the encroachment of minority groups, who supposedly take ‘American’ jobs and erode the country’s cultural and moral fabric, rather than to structures of domination. Some may see this as simple demagoguery, but the point is that there must be an existent ideology that the demagogue exploits to modify action. In Trump’s case, the action being modified is compelling: frightening a segment of the American body-politic to the ballot box. Despite its unethicality, Trump’s use of ideological power has made his linguistic practice efficacious.
Former U.S. Secretary of State and 2016 Democratic presidential nominee Hillary Clinton’s pedigree as a polished politician lent itself to very different linguistic practices as she vied for the presidency. However, the Clinton campaign did not elude controversy. During a fundraising event in Charleston, S.C., Black Lives Matter activist Ashley Williams interrupted the fundraiser by unfurling a banner that read “We have to bring them to heel” (Capehart, 2016). This phrase is from Clinton’s infamous 1996 speech at Keene State College (where she called black youth who repeatedly commit crimes “super-predators”) in support of the 1994 Violent Crime Control Act, which spurred policy that overwhelmingly plagued the black community with hyper-aggressive policing and draconian sentencing laws. Responding to Williams’s obstruction, Clinton insisted that she would address the issue while ignoring Williams and sticking to her talking points. After the brazen activist was escorted out, Clinton, with a disconcerting insouciance, said: “now, back to the issues.” This remark by Clinton is a linguistic practice that reinforces the disempowerment of blacks in the political process. Perhaps Clinton was annoyed by Williams’s politics of obstruction, but the secretary’s remarks betray an insensitivity to the fact that blacks historically and contemporaneously have had their voices silenced in the mainstream political process. Therefore, obstruction often is the only viable means to have our voices heard. Failure to

recognize this causes Clinton to become a participant in black disempowerment and disenfranchisement through her linguistic practice. Moreover, Clinton’s comment indexes whiteness; meaning, she lays out in plain view her privileged social status as a white woman. Of course, the presidential hopeful did not do this explicitly, but as Kiesling points out in his work on whiteness, linguist Michael Silverstein shows that indexicality is multifarious; the indexing does not have to occur in the speaking moment, but can occur in relation to larger sociopolitical ideologies (Kiesling, 2001). The ideology being indexed here is one that regards the political concerns of blacks with apathy. These comments make plain the fact that Clinton’s propinquity to power has removed her from the concerns of many black folks. This is evidenced by Clinton’s desire to hastily move on and get “back to the issues” when confronted and called to take accountability for one of the most pernicious policy decisions for which she was a catalyst. All in all, Clinton’s remarks before a predominantly white audience were well received. This is because she draws on what Kiesling calls structural power – the power of a place within a structure. In this instance, Clinton’s place is defined by her positionality as a white woman in a white-dominated society and a white-dominated arena in politics (granted, a mainly male one). This positionality from which Clinton garners power also distances her from the concerns of black folks, as she, from her privileged perch, becomes the arbiter of which issues are salient and which are not. This is the ultimate expression of power, as former secretary Clinton’s indexing of whiteness allows her to define the political concerns of blacks as irrelevant. Heretofore I have focused on some of the more controversial comments made during the 2016 presidential campaign season but, to be sure, the less invidious comments have also been revelatory. Take for instance Trump’s seemingly innocuous


reminder to former Florida Governor Jeb Bush during a debate that “this is a country where we speak English” (CNN, 2015). Undergirding this comment is a language ideology that presumes linguistic diversity is a threat to American identity, as if different languages could not possibly coexist. Here, Trump also indexes an ideological sentiment that recoils at the notion of a pluralistic democracy. Or, consider Secretary Clinton’s claim that one of the biggest tests facing the incoming president is whether or not they “can deliver results that improve people’s lives” (C-SPAN, 2016). This is part and parcel of Clinton’s emphasis on pragmatism, a political philosophy she avowedly adheres to, branding herself as “a progressive that likes to get things done” (The Washington Post, 2013). By extolling the virtues and importance of pragmatism, Clinton engages in a linguistic practice that primes voters to consider the ability to get things done as an important criterion when heading to the polling booth. In Kiesling’s typology of power this would be knowledge power, as Clinton draws on a long tradition of American pragmatism, which some have argued is a quintessentially democratic ideal (Chipman, 1911).

In all, I have endeavored to show why language, particularly political discourse, should not be summarily dismissed. The controversial statements, which have seldom been subjected to rigorous analysis, show best just how high the stakes are. In conclusion, consider Trump’s inflammatory and nefarious proclamations that he would “bomb the shit out of [ISIS]!” and “take out their families” (Time, 2015). More than articulating what would be a blatant war crime, Trump’s remarks are a linguistic practice that perpetuates a patriarchal norm of using violence as the primary modality of social recourse. The point of Kiesling’s framework of power is to show that in a given social context certain linguistic expressions are conferred power as a result of broader ideological structures. This means that the larger society is complicit in Trump’s (and Clinton’s) rhetoric. Only in a hyper-militaristic society could these comments be parlayed into more votes. And only in a society that struggles to eradicate xenophobia, racism, and sexism can Trump’s xenophobic, racist, and sexist discourse be politically viable. Trump is merely a mouthpiece for our nation’s underbelly. Understanding this, hopefully we will begin to seriously scrutinize the rhetoric of politicians, realizing that in doing so we are scrutinizing society, too.

References
Bakhtin, M. (1934). Discourse in the novel. In S. Blum, Making sense of language: Readings in culture and communication (p. 482). New York: Oxford University Press.
Borchers, T. (2006). Rhetorical theory: An introduction. Long Grove, Illinois: Waveland Press, Inc.
Capehart, J. (2016, February 25). Hillary Clinton on ‘superpredator’ remarks: ‘I shouldn’t have used those words’. The Washington Post. Retrieved from https://www.washingtonpost.com/blogs/post-partisan/wp/2016/02/25/hillary-clinton-responds-to-activist-who-demanded-apology-for-superpredator-remarks/
Chipman, W. (1911). Pragmatism and politics. Proceedings of the American Political Science Association, 8, 189. Montreal: American Political Science Association.
CNN. (2015, June 18). Trump's outrageous Mexico remarks.
CNN. (2015, September 16). Trump: 'This is a country where we speak English'.
C-SPAN. (2016, March 30). Hillary Clinton Campaign Event in Harlem, New York.
Kiesling, S. F. (1997). Power and the language of men. In S. Blum, Making sense of language: Readings in culture and communication (p. 410). New York: Oxford University Press.
Kiesling, S. F. (2001). Stances of whiteness and hegemony in fraternity men's discourse. Journal of Linguistic Anthropology, 11(1), 101-115.
Said, E. (1994). Representations of the Intellectual. New York: Random House, Inc.
Time. (2015, December 2). Donald Trump Says He’d ‘Take Out’ Terrorists’ Families.
The Washington Post. (2013, October 13). The moment when Hillary Clinton won the first Democratic debate.


Chipman, W. (1911). Pragmatism and politics. Proceedings of the American Political Science Association . 8, p. 189. Montreal: American Political Science Association. CNN. (2015, September 16). Trump: 'This is a country where we speak English'. CNN. (2015, June 18). Trump's outrageous Mexico remarks. C-SPAN. (2016, March 30). Hillary Clinton Campaign Event in Harlem, New York.

38


Kiesling, S. (2001). Stances of whiteness and hegemony in fraternity men's discourse. Journal Of Linguistic Antropology, 11(1), 101-115. Kiesling, S. F. (1997). Power and the language of men. In S. Blum, Making sense of language: Readings in culture and communication (p. 410). New York: Oxford University Press. Said, E. (1994). Representations of The Intellectual. New York: Random House, Inc. Time. (2015, December 2). Donald Trump Says He’d ‘Take Out’ Terrorists’ Families. The Washington Post. (2013, October 13). The moment when Hillary Clinton won the first Democratic debate.

39


Conflict Resolution & Reconciliation in Post-Genocide Rwanda Nancy Dordal


Rwanda, a central African country, is no stranger to ethnic tension and violence. The 1994 genocide is remembered as one of the bloodiest episodes in modern history: almost one million people were killed over one hundred days. That averages out to about ten thousand deaths a day. How does a nation reconcile this type of atrocity? How do neighbors who may have killed each other’s families live next to each other in peace? The United Nations set up the International Criminal Tribunal for Rwanda (ICTR) in order to hold those guilty of facilitating the violence accountable. Other cases were tried in national courts, both domestically and internationally. The gacaca court system was also adapted from its traditional use as a mechanism of mediation. ‘Gacaca’ means ‘grass’ in Kinyarwanda, referring to the open-air setting of these courts (Nyseth Brehm et al. 2014, 336). In this system, perpetrators answered for their crimes in front of a public audience and community elders. These courts provided both punitive and restorative justice for individuals and aimed to reconcile the community as a whole. This paper focuses on the effects, both positive and negative, of the ICTR and the gacaca system.
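The daily figure is a back-of-the-envelope average of the approximate totals above, not a precise count; it follows from simple division:

$$\frac{1{,}000{,}000\ \text{deaths}}{100\ \text{days}} = 10{,}000\ \text{deaths per day}$$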

After mass violence such as that of the 1994 Rwandan genocide, there often occurs a period of transitional justice focused on reconciliation and accountability to stimulate a collective healing process. The Rwandan government and the international community devised several types of tribunals for trying those accused of war crimes during the genocide. The United Nations created the ICTR, the International Criminal Tribunal for Rwanda, with a set-up similar to that of the Nuremberg Trials after World War II. Some génocidaires were tried in foreign national courts, such as those of Switzerland, while others were tried in Rwandan national courts. These efforts, however, could not effectively and efficiently process the thousands of criminal cases. Thus, the Rwandan government adapted the traditional gacaca courts for post-genocide use beginning in 2001, seven years after the genocide. Rwanda, like many countries that experience atrocities, resorted to a focus on restorative as well as punitive justice to garner public reconciliation.

The Rwandan genocide of 1994 was the culmination not only of a bloody civil war, but also of years of tension between ethnic groups in Rwanda. In pre-colonial Rwanda, Tutsi monarchs governed the Hutu, Twa, and Tutsi through a feudal system. The Twa are known to be a pygmy people, while the origins of the differences between the Hutu and Tutsi are still debated to this day. Scholars generally agree that differences between the Hutu and Tutsi were exacerbated by the arrival of German and Belgian colonists in the late nineteenth and early twentieth centuries, respectively. The colonists deemed the Tutsi the superior ethnic group because of their apparent European-like features (Buckley-Zistel 2006, 135). In 1959, tensions culminated in a violent overthrow of the Tutsi elite, resulting in the exile of many Tutsi people. The Rwandan Patriotic Front, comprised of many Tutsi exiles, invaded northern Rwanda in 1990, initiating the civil war. After a ceasefire was broken in April 1994 with the downing of the president’s plane, Hutu government officials orchestrated the genocide in which nearly one million people, predominantly Tutsi, were killed in a period of one hundred days. Though government elites coordinated the genocide, the killings were carried out by many ordinary citizens. Rwanda, both its government and its people, was left with the task of reconciling a divided country and holding hundreds of thousands of génocidaires accountable.

The United Nations’ creation of the International Criminal Tribunal for Rwanda aimed to address the need for a punitive response to the genocide by the international community. Because of the scale of the tribunal, mostly high-ranking génocidaires were tried. This triggered major criticism of the ICTR: it could not be reconciliatory for the Rwandan people if the tribunal was so far removed from Rwanda and the victims of the genocide. Carla de Ycaza writes, “The ICTR, unlike the localised grassroots-based gacaca courts, is seen by many Rwandans as a means of developing international law, due to the fact that the court has adopted a primarily western legal approach, and takes place remotely in Tanzania, away from the affected population in Rwanda” (2010, 16). The United Nations had been trying to establish an international court since the end of World War II, when the term “genocide” was coined. From the viewpoint of the Rwandan people, the UN used the Rwandan genocide as an opportunity to create such a court instead of focusing its efforts on reconciling the broken country of Rwanda.

On the other hand, gacaca courts function with the cooperation of the entire community. Traditionally, the courts were used in Rwanda for lesser crimes such as theft and provided a means of restorative justice. The accused sat before an inyangamugayo, a judge, and a jury of his peers in a public space. This decentralized approach encouraged audience participation during the trial itself. The courts took confessions and the category of the crime into account (Nyseth Brehm et al. 2014, 337). A person accused of theft was more likely to receive a lenient punishment if he or she confessed early. The purpose of these courts was to restore and maintain social harmony through truth-telling. Physical evidence of the crime came second to witness testimony. Thus, confessions of the accused facilitated the restoration of collective harmony. The gacaca system that was established after the genocide adopted principles similar to those of the traditional courts, but was fundamentally a new system. The adapted gacaca courts for post-genocide cases needed to be more punitive in order to hold the génocidaires accountable and give retribution to victims. The adapted courts still took place in public with the involvement of the entire community; however, the performance of the trial proceedings followed a Western approach. The local nature of these courts still emphasized collective reconciliation over individual retribution, but interpretations of the events of the genocide and the tensions leading up to it could vary across the country, allowing for a collective memory of the local community instead of the nation as a whole.

Human rights expert Sigall Horovitz, however, argues that the ICTR did provide reconciliation for the Rwandan people by initiating the abolition of the death penalty in Rwanda. In 1994, Rwanda, then a non-permanent member of the UN Security Council, was the only member to vote against the creation of the ICTR, because the tribunal would bar capital punishment as a sentence for the guilty. Years later, in 2007, Rwanda officially abandoned the death penalty “to satisfy the ICTR’s referral conditions” (Horovitz 2016, 520). Once Rwanda abolished the death penalty, genocide cases could be transferred from international courts back to Rwanda. Horovitz also interviewed experts familiar with the genocide and found that the abolition of the death penalty offered some reconciliatory effects. Death row prisoners and their families were given hope of one day being set free. This hope fueled some prisoners’ confessions and apologies, offered in order to be released at an earlier date. Horovitz noted that even the victims of the genocide saw the abolition of the death penalty as a measure of reconciling the country because of the happiness it gave the families of prisoners (2016, 531).

The gacaca courts were largely successful after the genocide, as they provided a more efficient mechanism for addressing war crimes than the international courts. Almost two million cases were tried in the gacaca system between 2001 and 2012, when the system officially ended (Nyseth Brehm et al. 2014, 339). Gacaca courts also offered solutions for the reintegration of perpetrators into Rwandan society as well as reparations for victims. This system of transitional justice has been hailed by scholars as successful because it addressed many of the complexities that arise when a nation faces atrocities perpetrated by its own against its own. Hollie Nyseth Brehm et al. look at the gacaca courts as an opportunity for innovation in transitional justice efforts. The courts merged punitive and restorative justice, traditional and modern judicial approaches, and community and state authority.

One of the major criticisms of the gacaca system stems from its basis in truth-telling. Because of the emphasis on confession rather than empirical evidence, silencing and coercive efforts are rampant forms of corruption in restorative justice systems (Doughty 2014, 784). The effectiveness of gacaca courts also varied depending on the community and the integrity of the inyangamugayo. In some parts of the country, ethnic divides have intensified because of bias towards the government (Ephgrave 2015, 181). There have been accusations that courts ignored war crimes allegedly committed by the Rwandan Patriotic Front, the ruling party after the genocide. Anthropologist Bert Ingelaere also discovered through interviews that many blamed each other for the drawbacks of the gacaca system: “We often heard the remark that the own group—Hutu or Tutsi—was ready to embrace the procedures installed by the state to foster reconciliation and live by the principles of unity propagated from above, but that the other party—Tutsi or Hutu—did not understand these ideas or was not ready or willing to do so” (2009, 517).

Though Rwandans currently live in coexistence with one another, cases such as those brought up by Ingelaere highlight the divide that is still present. The divide is only bridged by a collective ‘chosen amnesia,’ a term coined by Susanne Buckley-Zistel, whereby Rwandans actively choose to forget the atrocities of the 1994 genocide committed by their own neighbors. This approach, though functional in the short term, also dangerously ignores the factors that led up to the civil war and genocide in the first place.

Gender-based war crimes were another major pitfall of the gacaca courts. Historically, crimes against men and women have differed dramatically during periods of mass atrocity, including the Rwandan genocide. Women face more sex-based crimes such as sexual assault and rape. Karen Brounéus explains that in the context of the Rwandan genocide, “sexual violence was systematically used by Hutu extremists towards Tutsi women and girls as a method of war, not only to inflict pain and humiliation but also to spread HIV—and thus ensure the end of the Tutsi people” (2008, 55). These types of sexual crimes are often extremely traumatic for the victim, not only because of the nature of the crime but also because of the stigma attached to it. Victims of these crimes had to relive their trauma through its retelling during the gacaca courts. Nicole Ephgrave encountered many women who experienced the same symptoms, such as fainting or shaking, during their testimonies (2015, 182). Because of the public nature of the court system, victims of sexual war crimes also faced retaliation from the families of their perpetrators sitting in the audience. The reconciliatory nature of the gacaca courts, which proved effective for many lesser categories of crime, failed the women who were forced to give testimony in front of their entire community. This finding is consistent with much of the research on restorative justice: it often forces victims of sexually-based crimes to relive their trauma and fails to adequately punish the perpetrators.

The Rwandan genocide of 1994, following the civil war, left a fragmented country with ethnic divides to reconcile and bridge. The specific atrocities needed to be addressed nationally and internationally in unprecedented ways. The genocide led to the formation of the International Criminal Tribunal for Rwanda as well as the adaptation of the traditional gacaca court system. In theory, these two systems implemented together provided a thorough transitional justice system; in practice, both courts have faced heavy criticism for their respective deficiencies. The response to the Rwandan genocide serves as an example that traditional mechanisms of justice can be adapted to carry out restorative justice. However, as explained above, restorative justice often fails the needs of sex-crime victims, and thus there is a need for nationally and internationally based punitive tribunals as well.

References

Brounéus, Karen. “Truth-Telling as Talking Cure? Insecurity and Retraumatization in the Rwandan Gacaca Courts.” Security Dialogue 39, no. 1 (2008): 55-76.
Buckley-Zistel, Susanne. “Remembering to Forget: Chosen Amnesia as a Strategy for Local Coexistence in Post-Genocide Rwanda.” Africa 76, no. 2 (2006): 131-150.
De Ycaza, Carla. “Performative Functions of Genocide Trials in Rwanda: Reconciliation through Restorative Justice?” African Journal on Conflict Resolution 10, no. 3 (2010): 9-28.
Doughty, Kristin C. “‘Our Goal Is Not to Punish but to Reconcile’: Mediation in Postgenocide Rwanda.” American Anthropologist 116, no. 4 (2014): 780-794.
Ephgrave, Nicole. “Women’s Testimony and Collective Memory: Lessons from South Africa’s TRC and Rwanda’s Gacaca Courts.” European Journal of Women’s Studies 22, no. 2 (2015): 177-190.
Horovitz, Sigall. “International Criminal Courts in Action: The ICTR’s Effect on Death Penalty and Reconciliation in Rwanda.” The George Washington International Law Review 48, no. 3 (2016): 505-547.
Ingelaere, Bert. “‘Does the Truth Pass across the Fire without Burning?’ Locating the Short Circuit in Rwanda’s Gacaca Courts.” The Journal of Modern African Studies 47, no. 4 (2009): 507-528.
Nyseth Brehm, Hollie, Christopher Uggen, and Jean-Damascène Gasanabo. “Genocide, Justice, and Rwanda’s Gacaca Courts.” Journal of Contemporary Criminal Justice 30, no. 3 (2014): 333-352.


The Flatiron Building: An American Icon & the Original American Skyscraper William Kowalik

The modern American city extends upward; glass and steel skyscrapers pierce the sky, each one taller than the last. This phenomenon had to start somewhere. While scholars do not dispute that the world’s first skyscraper, the Home Insurance Building (built 1884), was erected in Chicago, much of the origin story of the “original American skyscraper” is not even related to Chicago. The story of this quintessential, original American skyscraper takes place in New York, at the intersection of Fifth Avenue, East Twenty-Third Street, and Broadway. The Flatiron Building, while significantly dwarfed by neighboring buildings, still commands a presence in the modern vertical city. Since it opened in 1902 as an ornate veneer covering a modern steel skeleton, the Flatiron Building has symbolized the progress and growth of the modern city at the dawn of what is known as the “American century.” The Flatiron Building prevails as a negotiation between the past and modernity. Classical architectural ornamentation adorns the building, but in a new way. Decoratively, it is traditional; functionally, it is completely progressive. In fact, the Flatiron holds the distinction of being the first building in New York to be called a skyscraper, a legacy of architectural design that began with a twenty-two story building in 1902 (Peisch 24).

The cultural significance of the Flatiron as a recognizable and iconic building is largely attributed to its triangular shape, and to the public’s vocal reception of it. Plenty of triangular buildings exist, so what makes this building so special? “...the Flatiron remains timeless, impervious to fad, and always willing to put one of its three faces forward for the new photographer who sees it for the first time” (Kreitler x). To understand the Flatiron Building, the circumstances of its origin, which are paramount to its iconic status, must first be explained. Much of the essence and identity of the Flatiron Building can be attributed to a man who was not alive to oversee its planning or construction, and to a city other than New York. The Flatiron Building was the vision of Harry Black, president of the Fuller Company (Alexiou 35). While the company’s namesake founder and Black’s father-in-law, George Fuller of Chicago, did not live to see the construction of the Flatiron, it was Fuller who pushed the envelope of architectural innovation, making the Flatiron Building possible (5). It was his company that, after Fuller’s death, was responsible for its construction. George Fuller loved tall buildings (3). Harry Black turned Fuller’s architectural enterprise into what The New York Times called “the largest construction company in the world to date” (35). Prior to 1892, the heavily regulated but outdated New York City Building Code would not have allowed for the construction of a skyscraper like the Flatiron Building (15). Black himself represented a traditional rags-to-riches story; the “self-made man” was born poor in Canada in 1863, eventually moved to Chicago to find work, and ultimately made his way to the top (22).

During its operation, the George Fuller Company (under Fuller and, later, Harry Black) would construct a host of significant buildings around the United States. To anyone other than an architectural historian, most of these buildings would not necessarily stand out, with the exception of a select few. Despite this, the Fuller Company was responsible for many of the most important buildings of the early twentieth century. Besides the Flatiron, they included the Trinity Building, the original New York Times Building, The Plaza Hotel, the Bellevue-Stratford Hotel, The Willard Hotel, Macy’s on Herald Square, Marshall Field’s in Chicago, and the original Pennsylvania Station (“Prominent Buildings” 10-11).


The peculiar plot of land that the Flatiron Building would soon call home was the result of the intersection of the city’s grid (planned in 1811) and Broadway, which predates the grid and cuts diagonally through it (Kimmelman). Harry Black carefully chose the unique location of the Flatiron, at the intersection of two of the most prominent streets in New York, Broadway and Fifth Avenue (Alexiou 40). The small triangular plot was purchased for $815,000 in May 1901. Newspapers proclaimed that it was the most valuable piece of land in the world (40). A great location needed a great building. Black hired Daniel Burnham to design the Flatiron. Hailing from Chicago, Burnham had been the literal and figurative architect of Chicago’s 1893 Columbian Exposition, the momentous event that ushered in the grandeur of the City Beautiful (43).

“From the beginning, Burnham wanted a building that would stand out. Burnham believed that whatever difficulties would come with the Flatiron project would be worth it, because he would finally have a building in New York. Not just a building, but a skyscraper – to serve as an advertisement for not just the Fuller Company but himself, too” (Alexiou 47).

In 1988, The New Yorker commented:

“Many years passed before the skyscraper came to be generally accepted here as a symbol of beauty. It is therefore not surprising that one of the first skyscrapers to win the admiration of New Yorkers—The Fuller Building, of 1903, soon to be nicknamed the Flatiron Building—was the handiwork of an outsider, the Chicago architect Daniel Burnham” (Kreitler 60).

Burnham, whose massive practice meant that he could not directly oversee every project, appointed Frederick Dinkelberg, a respected architect, to oversee the building’s design (Alexiou 48). In Dinkelberg’s design, the triangular building would not come to harsh, sharp points, but to gracefully curved corners, with façades reminiscent of the three basic components of a column (51-52). The context of the Flatiron’s ornate and sparkling white limestone and terracotta façade, a stark difference from the design of the older low-rise city, is a product of the City Beautiful Movement. Aimed at beautifying and cleaning up the “dirty and dangerous” city, the movement took cues from traditionally inspired European designs and combined them with modern construction methods.



The City Beautiful prioritized and monumentalized public space, “creating a dialogue between law and architecture, not so much about specific architectural styles, but about buildings in the context of the planned city—expanding the regulatory authority for ‘civic beautification’” (Revell 38-40). Despite the skyscraper’s growing prominence in the city, to Revell the notion of the skyscraper stood in direct contradiction to the City Beautiful Movement (40). Just a decade before, Jacob Riis had stunned the New York upper class with his images of deplorable tenement life on the lower end of Manhattan (Alexiou 39). The Chicago School, pioneered by individuals like Daniel Burnham and Louis Sullivan, was key in the transition of American architecture upward and forward. It established the skyscraper and simplified it into a series of connected elements – a modernist ideal that eventually led to the Bauhaus influence on Mid-century Modern and the International Style. This aesthetic served as a contrast to the ostentatious Beaux-Arts style. Make no mistake; the Flatiron had always been designed to be different, as “Beaux-Arts architecture represented the antithesis of Chicago’s raw originality” (19).

Throughout American history there have been attempts to search for and to establish an American style of architecture. It is very much this mission that has shaped the nature and practice of architecture in the United States ever since. The earliest search for the “American style” yielded the Greek Revival, first manifested in Latrobe’s Bank of Pennsylvania in Philadelphia. The Gothic and Victorian styles of the nineteenth century each presented their own contributions to this new American architectural dialect. However, each of them still recalled their European ancestors, and none was truly original beyond making modern alterations. Cass Gilbert’s Woolworth Building, in the Gothic Revival style (unique for a skyscraper), was dubbed the “Cathedral of Commerce.” The Flatiron too was inspired by the European influences of the French and Italian Revivals. It was only the pure elemental simplicity and organization of the Chicago School that first introduced a truly American style. Joanna Merwood-Salisbury appropriates the philosophy of the Chicago School, “form follows function,” to reflect the capitalist influence in the development of the skyscraper as “form follows finance” (Merwood-Salisbury 25). The Flatiron’s iconic shape, a product of maximizing space, is a representation of Louis Sullivan’s ideal. Roger Shepherd asserts that architectural publications of the time did not proclaim a uniquely American architectural style, but instead sought to “strip a structure of superfluities and to allow the building to clearly declare itself” (Shepherd 31). To Dinkelberg, the notion that an American architectural style could be created that would break entirely with the European tradition was preposterous (Alexiou 49).

Despite the extraordinary significance of the Flatiron Building, my research indicates that there is little scholarly work related to the building beyond an exclusively architectural perspective. The Flatiron Building as an American icon deserves a broader scholarly approach to its architecture and, more significantly, to the genesis of its identity as “the American skyscraper,” its associations with popular culture, and the rise of the modern city. Architectural historian Roberta Moudry, in response to the concept of the multilayered facets of building cities, states: “This considerable topical and analytical overlap serves as a hinge between studies that are rooted in varied historical subdisciplines. This cross-disciplinary conversation, however, is enabled by a consistent cultural approach to the architectural history of the city” (Moudry 9).

The public’s reception of the Flatiron Building was mixed; it was met with intrigue, disgust, and plain fear. The building and the space it occupies “has been as graphically described as a stingy piece of pie” (Wight 66). Many people referred to the Flatiron as “Burnham’s Folly,” concerned that the building was not safe and, given its shape, could fall down (“Lower Fifth”). A September 17, 1903 article from The New York Times describes the damaging winds caused by the Flatiron, which seriously damaged other buildings and injured pedestrians:

“As far as the down-town section of the city is concerned the storm wrought its worst havoc in the vicinity of the big Flatiron Building at Madison Square. Yesterday’s high wind gave that odd structure its first opportunity to show what it could really do under the most favorable conditions in the way of causing discomfiture to traffic and a general disturbance of existing conditions all about it, and it readily sustained the reputation it has gained…Anything less heavy than a street car that came within the zone dominated by the triangular structure was blown away” (“Furious Gale”).

Fig. 1. “Well I’ll be blowed,” postcard, 1905. From Alexiou, Alice Sparberg. Flatiron. New York: St. Martin’s, 2010, p. 153.

Fig. 2. “The Flatiron,” Edward Steichen, 1904, printed 1909. The Metropolitan Museum of Art, New York.

In a 1915 review of the building in the Architectural Record, Peter Wight asserts, “It is a great pity that the architect should have chosen to build on this very odd site an ordinary tall building…and thus have produced a very commonplace and conventional skyscraper” (Wight 69). Wight’s assessment of Burnham’s design choices would prove to be the minority view, as the uncertainty over the building waned and love for its peculiarities and unmatched uniqueness grew. When the public first saw the building take shape, they referred to it as the Flatiron because the building’s shape and plot of land were reminiscent of an old clothes iron; thus it was given its infamous and iconic moniker, the Flatiron Building. This upset Black, whose attempt at branding the building as the Fuller Building, as advertising for the company, had failed. The public’s nickname stuck (Alexiou 106). Even in a Fuller Company document highlighting the range of their projects, it is listed as “Fuller (Flatiron) Building” (“Prominent Buildings” 10). Regardless of which name was used, Black was most certainly able to tout the modernity and conveniences of the Fuller (or Flatiron) Building. The building, outfitted with electricity, would have its own steam and electric plants and six Otis hydraulic elevators. There would be little use for stairs in a building that was meant to express the capabilities of 1902 (Alexiou 123). Despite the public’s concern for the safety of this new and different building, it was, in fact, capable of withstanding “four times the wind force that would ever actually come up against it” (149). The winds created by the building’s shape were, however, quite powerful. The building’s tip, or “prow,” was where the worst of the winds could be felt, resulting in the unfortunate circumstance of women’s long skirts blowing upward, exposing their legs and undergarments while providing nearby men a glimpse.

“By then [1904] it was long common knowledge that if you walked by the Flatiron on a windy day, at the very least your skirts were going to end up over your head, and your legs exposed. It was now all the rage for men of a certain character to loiter at the foot of the Flatiron, expressly to catch a glimpse of feminine flesh” (151).

Think of the image of Marilyn Monroe standing above a subway grate, which produced a similar phenomenon. As the story goes, police officers would yell at the men standing on Twenty-Third Street to beat it, or skidoo—giving rise to the once-popular American phrase “23 skidoo” (152). The era of the early skyscrapers, the 1880s through the 1910s, sent a clear message about the state and future of American progress, albeit one far from perfect. In 1893, following the 1890 census, the American West and the American frontier were declared settled, conquered, and closed, ending an integral chapter of Manifest Destiny as the American people looked for new ways to conquer the “untamed” landscape. Given the rise of the skyscraper, the new Manifest Destiny was to conquer height. The skyscrapers that followed were part of a competition over who could reach the sky first, each grander and taller than the next. This race began with the construction of the very first skyscraper in Chicago. The skyscraper was born in Chicago, but perfected in New York. It is this connection to Manifest Destiny and American expansionism, and not its cultural ties alone, that allows us to consider the Flatiron Building among other American icons. The Flatiron Building, by virtue of being among the first skyscrapers, and the first in New York, merits consideration and position as an American icon. “It was precisely the Flatiron’s fluid nature—from every angle, it looked different—that made people love it so. The building embodied New York...” (Alexiou 126). Other unique features on the inside of the building make it both inefficient and endearing. The building that Harry Black intended as the prominent headquarters of the Fuller Company, in the fashionable Manhattan neighborhood around Madison Square, continues to enthrall tenants; it is currently occupied by Macmillan Publishers. The Flatiron has gendered bathrooms, which are small in size and located on every other floor—likely a last-minute or later addition to include a space for female employees (Stapinski).

Many people have loved the Flatiron Building since its creation, despite its initial mixed and skeptical reviews. Today, it is best recognized not by its real-life, physical occupation of space, but as the building seen in photographs and movies; in popular and visual culture. Alfred Stieglitz was enraptured by the Flatiron Building, capturing it on film on a snowy day in 1902. Stieglitz’s photo-secessionist movement sought to elevate photography as art – art as something beautiful, not simply documentary in nature. When he captured the Flatiron, he captured something that he saw as beautiful (Alexiou 153-154). Edward Steichen and Alvin Coburn, other distinguished photographers of the same movement, would also capture the silent majesty and gracefulness of the Flatiron Building at night. It made sense that the modern building representing the modern city would be rendered in the modern method of capturing moments in time. It is in these representations that we see the image of the Flatiron Building shift from Burnham’s Folly to the icon we know today. Like most icons, it was the people who adopted it as an iconic image of New York, and an iconic skyscraper, not the work of those who were behind it – Black, Burnham, and Dinkelberg. It was also the people who gave the Flatiron its name. If it had not received the nickname the Flatiron Building, and had remained the Fuller Building as intended, it is unlikely that it would hold as much meaning as it does today. Matthew Shear, Executive Vice President of St. Martin’s Press, said of the Flatiron: “It’s a little quirky sometimes, and I think that’s a good thing” (Stapinski). Martin Kemp defines an icon as having “achieved wholly exceptional levels of recognizability and has come to carry a rich series of varied associations for very large numbers of people across time and cultures, such that it has to a greater or lesser degree transgressed the parameters of its initial making, function, context, and meaning” (Kemp 3).

The Flatiron Building, along with many other skyscrapers, has come to represent the visual identity of New York. Many contemporaries of the Flatiron Building have been lost or forgotten. Only historians will remember the World Building and Park Row. The Singer Building was lost. The Metropolitan Life Insurance Company tower was stripped of its decorative façade in a mid-century attempt to modernize the building (Shepherd xi). However, the Flatiron Building has endured. In 1966, the Flatiron Building was designated a New York City landmark by the city’s Landmarks Preservation Commission, one of the city’s first such designations, guaranteeing the building protection from demolition.

New Yorkers’ consciousness of the Flatiron waned as the years went on, especially once the Fuller Company moved, at the onset of the Great Depression, into a newer, larger Art Deco building more representative of the era’s tastes and preferences (Alexiou 250-251). This is no longer the case. The Flatiron has remained a relevant icon and a relevant part of the New York streetscape, though dwarfed by the now much taller skyline, and it remains associated with New York and the birth of the skyscraper. The March 9, 2015 cover of The New Yorker features the Flatiron, a cartoon of the ever-present icon. A new building in SoHo at 10 Sullivan Street takes the now-famous form of the Flatiron (Chaban); it is said that imitation is the sincerest form of flattery. The most exciting and significant change in the building’s history has still yet to come. In 2009, the Italian developer Sorgente purchased a controlling interest in the Flatiron with the intention of turning it into a hotel. Surely, this will be the greatest alteration to occur at the one-hundred-fourteen-year-old building. One thing is certain: as the building changes, just as the city around it changes, its status as an icon will ebb and flow along with it. Iconic from the moment it was conceived, the Flatiron serves as a crucial point of reference for the birth of the American skyscraper and for the evolution of the modern city.

References

Alexiou, Alice Sparberg. The Flatiron: The New York Landmark and the Incomparable City That Arose With It. New York: Thomas Dunne-St. Martin’s Griffin, 2010. Print.
The American Skyscraper: Cultural Histories. Ed. Roberta Moudry. New York: Cambridge UP, 2005. Print.
- Revell, Keith. “Law Makes Order: The Search for Ensemble in the Skyscraper City, 1890-1930.” 38-62.
Architecture and Capitalism. Ed. Peggy Deamer. New York: Routledge, 2014. Print.
- Merwood-Salisbury, Joanna. “The First Chicago School and the Ideology of the Skyscraper.” 25-47.
Chaban, Matt. “Flatiron Building, Admired but Rarely Copied, Inspires Developers.” The New York Times 4 Apr. 2016. NYTimes.com. Web. 20 Apr. 2016.
“Furious Gale Lashes the City and Harbor: Buildings Damaged and Many Craft Wrecked.” The New York Times 17 Sept. 1903. ProQuest Historical Newspapers. Web. 16 Apr. 2016. <http://search.proquest.com.libproxy.temple.edu/hnpnewyorktimesindex/docview/96280139/Person_Flatiron_narrow/1?accountid=14270>.
Kemp, Martin. Christ to Coke: How Image Becomes Icon. Oxford: Oxford UP, 2012. Print.
Kimmelman, Michael. “The Grid at 200: Lines That Shaped Manhattan.” The New York Times 2 Jan. 2012: A1.
Kreitler, Peter, comp. Flatiron. Washington: American Institute of Architects, 1991. Print.
“Lower Fifth Avenue before the Flatiron Building.” Ephemeral New York, 17 June 2011, ephemeralnewyork.wordpress.com/tag/burnhams-folly/. Accessed 29 Jan. 2017.
Peisch, Mark. The Chicago School of Architecture: Early Followers of Sullivan and Wright. New York: Random House, 1964. Print. Columbia University Studies in Art History and Archaeology 5.
Prominent Buildings Erected by the George A. Fuller Company. New York: George A. Fuller Company, 1910. Internet Archive. Web. 22 Apr. 2016.
Shepherd, Roger, ed. Skyscraper: The Search for an American Style. New York: McGraw-Hill, 2003. Print.
- Wight, Peter. “Architectural Appreciations No. II: The ‘Flatiron’ or Fuller Building.” From Daniel Hudson Burnham and His Associates, July 1915. 64-69.
Stapinski, Helene. “A Quirky Building That Has Charmed Its Tenants.” The New York Times 25 May 2010. NYTimes.com. Web. 21 Apr. 2016.


Going Nuts Over Betel-Chewing: How the Areca Nut Has Started a Cultural War in South-East Asia Alexander Voisine

The chewing of betel, a popular stimulant composed of the psychoactive areca nut, betel leaves, and lime, is a ubiquitous cultural practice in South-East Asia. The widespread popularity of betel chewing is reflected by its unanimous adoption by the various religious and cultural groups of South-East Asia; it is a tradition that spans religions, languages, and geography. Betel chewing has a long history in South-East Asia and occupies an important space in the historical interaction of the various nations of the region. It has played an intricate role in many facets of the dynamics of South-East Asia, functioning as a form of sustenance, being associated with magic and religious rites, serving as a universal currency, and being treated as a valuable good worthy of use in ceremonial sacrifices to the gods. Though pervasive and still relevant in contemporary South-East Asian life, betel chewing has lost a considerable amount of the influence it had in the years before globalization. Changing attitudes towards betel chewing amongst South-East Asian youth and growing health concerns have gradually begun to attenuate its prevalence in South-East Asian society. While it will ultimately be up to the peoples of South-East Asia to determine whether their adherence to cultural tradition is more important than their health, the contemporary movement away from betel chewing is undeniably a harbinger of either a rejection or a sharp re-definition of the role that it plays in South-East Asian society.

The markedly rapid transition of betel chewing from an esteemed cultural practice to its current status as an unhealthy addiction associated with the lower classes can be attributed to the imposition of both Western and dominant Eastern ideas. Classified by the United States FDA as “containing a poisonous or deleterious substance which may render it injurious to health” due to its links to oral cancer, betel chewing has raised serious health concerns tantamount in severity to those associated with tobacco use (Wagner 1). As a result, many South-East Asian countries have passed public health policies designed to curb the use of betel, but in so doing have also created policies that pose a considerable threat to betel chewing as a cultural practice. With betel chewing comes a heated debate between tradition and public safety, culture and modernization, and an ever-changing discourse informed by technology that links health practices, scientific progress, and cultural standards across all regions of the world. The loss of betel chewing as a cultural practice is unique because it would not be entirely negative, given that it would result in a decrease in the diseases that the practice causes.


Betel chewing is deeply ingrained in the history of South-East Asia, dating back to at least 504 BC, the earliest reference to betel found in South-East Asian literature (Rooney 14). Its exact origins are unknown, but researchers have attributed the habitual chewing of betel to Buddhism, where it began to take the form of a cultural and religious practice. It then spread to India, Vietnam, Burma, Thailand, Cambodia, Malaysia, Sumatra, and Bali, and later to Indonesia, Madagascar, Nepal, and Papua New Guinea (“Alive and Spitting” 1). The diffusion of betel chewing throughout South-East Asia resulted in a plurality of community-specific adaptations as each region, culture, and religion adopted it into the context of its own customs and traditions. It has been found in Vietnamese folklore, Cambodian oral tradition, Indian literature, Persian texts, and chronicles of Chinese exploration in Vietnam, all in forms specific to the linguistic and cultural tradition of the region, but nonetheless present and significant (Rooney 14-15). Though each of the many communities in South-East Asia engaged with betel chewing in distinct ways, it was unanimously accepted into every community it reached as a powerful substance worthy of deep veneration and respect. The euphoric and physiologically “magical” effect produced by the areca nut contained in betel came to be associated with the supernatural. It was consequently integrated into religious practices, both as an object of sacrifice and as a way of seeking a deeper, drug-induced connection with the gods. In short, it stimulated the body in such an inexplicable way that it came to be viewed as a divine experience. Due partially to its strong ties to religion, betel chewing “prevailed amongst the South-East Asian royalty” and was often used as a “social denominator and a symbolic element for solidifying relationships amongst royalty in the region” (Rooney 7-8). Betel chewing served as a universal form of communication between the nobles of different kingdoms and existed both as social capital and as an element of proper etiquette when dealing with important individuals. Its association with royalty, its strong presence in a variety of regions in South-East Asia, and its value as an indicator of social status made betel chewing a profoundly important cultural practice in the history of the region.

The significance of betel chewing lies in the fact that it spans the myriad religious, political, and cultural institutions of South-East Asia. In fact, it seems to be an exception to Samuel Huntington’s theory that culture is largely determined by civilizations comprised of the main religions of the world; betel chewing as a cultural practice exists in the geographical regions where Islamic, Hindu, Confucian, and African civilizations are positioned (Huntington 24-25). Betel chewing is thus a cultural practice that exists at the intersection of four of Huntington’s supposedly divided and conflict-ridden civilizations. It also functioned historically as a form of monetary and social currency between different regions, given its immense cultural value. For some, betel was as valuable as food: “The Thais ‘prefer to go without rice or other food rather than to deprive themselves of the betel,’” noted Nicolas Gervaise, a French visitor in the seventeenth century (Rooney 1). In Indonesian culture, chewing betel was a “social necessity for every adult in society,” and refusing it was “esteemed a deadly insult” (Reid 531). It played an important role in relations between South-East Asian cultures and was often used as a gift to kings or leaders of different kingdoms. Betel chewing’s significance transcended religion, language, and regional civilizational differences, which is a testament to its salience in South-East Asia and its smooth diffusion to surrounding regions.

So why has betel chewing transitioned away from its historical significance towards its current status as a dying and unhealthy habit? South-East Asia’s long history of colonization and interaction with the West and other Asian empires shifted many cultural traditions and practices in different South-East Asian regions, but the entry of Western and North Asian ideologies in particular had a similar effect on betel chewing in all South-East Asian communities. In the 18th and 19th centuries, South-East Asia was controlled primarily by European empires, with the British in India and Myanmar, the Dutch in Indonesia, the French in Laos, Cambodia, and Vietnam, the Americans in the Philippines, and the Portuguese in East Timor. Policy designed by European colonists during this period reflected an imposition of Western values on the cultures of South-East Asia, serving “primarily to link South-East Asia more firmly to a world economy dominated by the West and to introduce Western ideas and technologies” (Lockard 118). With respect to betel chewing, the cultural clash was immediately evident. An English colonist, Mary Cort, wrote in a travel journal that betel chewing is “an unhygienic, vile, ugly, and disgusting habit” (Rooney 5), and many other Europeans shared the same impression. Betel chewing leaves the mouth with a thick red residue that resembles blood and, when spat, leaves stains. To the colonists, who did not understand the cultural significance of betel chewing, it was yet another backwards tradition that could only be adjusted through a process of civilization. Soon enough, as Western culture was imposed on South-East Asians and industrialization solidified economic ties with the West, the disgust for betel chewing began to be absorbed into indigenous populations. In Thailand especially, the impact of Western values on betel chewing culture is stark: “Younger Thais, many of whom have been educated abroad and have inculcated Western ideas, find betel chewing no longer socially acceptable” (Rooney 66). The long process of Western acculturation that coincided with modernization in South-East Asia has profoundly mutated the tradition of betel chewing, but intra-Asian influence cannot be overlooked when analyzing the roots of these changes. Japan also viewed betel chewing as an unhealthy practice and banned it during its fifty-year occupation of Taiwan (Tacon 1). Japan expanded aggressively into South-East Asia in the early 20th century, and with its invasion came a reinforced condescension towards betel chewing as a cultural practice. Though “Japanese control shattered the mystique of Western colonialism and invincibility,” Japan viewed betel chewing in much the same way that the West did, further dissolving the sanctity of the cultural practice (Lockard 145).

The biggest threat to betel chewing culture in contemporary South-East Asia appears to come from within South-East Asia itself, as doctors and public health officials become aware of the carcinogens involved in the betel chewing tradition. In 2008, a campaign called “Voices of Tobacco Victims” was launched in India by Dr. Pankaj Chaturvedi with the intention of advocating “more stringent control in India” of betel chewing, which is known as gutka or paan masala in Hindi (Seervai 1). Additionally, in 2016, Aung San Suu Kyi “asked chief ministers to take action to make people cut down on their habit of chewing [betel] nuts and spitting out the dark juice in public places,” reflecting Myanmar’s shifting position on betel chewing (Tun 1). In Indonesia, cigarette smoking, though an arguably less healthy alternative, has “almost completely replaced the chewing of betel among male Indonesians in the last century” (Reid 529). The increase in access to technology has given South-East Asians the tools to better comprehend the negative effects of betel chewing; a simple Google search yields numerous warnings about its links to oral cancer, heart disease, and liver disease. Medical technology has also empowered doctors and researchers to analyze the links between betel chewing and disease and to dissuade patients from engaging in habitual betel chewing. Technology has given South-East Asians more control over their health by presenting the physiological realities of betel chewing, something that was not as readily accessible before the burgeoning of technology in the region. While betel chewing is seen by most medical professionals as a harmful practice, many South-East Asians are still deeply attached to it as both a cultural relic and a means of economic security. In Papua New Guinea, “an estimated two-thirds of the population is involved in the growing, distribution, or use of the nut,” and large percentages of its citizens “rely on the daily sale of the nut to survive” (Cassey 8). Most other South-East Asian countries exhibit similar levels of attachment to betel chewing. Despite efforts to educate the public about its detrimental effects, the practice has actually increased in some urban areas, such as Mumbai (Rooney 66). There appears to be some resistance to the campaigns against betel chewing in certain parts of South-East Asia, perhaps indicating that it will persist in contemporary South-East Asia as a cultural practice. Nevertheless, as medical technology increasingly informs public health policy and as South-East Asian nations modernize and develop more advanced health care systems, the balance of power will likely tip in favor of central governing agencies, which are less defensive of betel chewing. Supra-territorial organizations like the World Health Organization have also spoken out against betel chewing, and the United States FDA issued an Import Alert in 2014 calling for a “refusal of admission of betel nut in conventional foods” (Wagner 2). The global consensus against betel chewing is sure to continue to unravel the threads of its position in South-East Asian society.

Cultural loss is often conflated with a sense of disrespect for less dominant cultures. The forces of acculturation, modernization, and even cultural appropriation take elements from less powerful cultures and mutate them into forms that are in line with the values and practices of the dominant culture. Colonization and occupation have cast a malignant shadow on the idea of cultural loss because they have historically forced certain cultures into extinction. Betel chewing, however, inhabits an interesting space in cultural loss discourse. Similar to self-immolation in India and facial scarring in African villages, betel chewing is a harmful cultural practice. When members of a culture are shown that they need not adhere to the hazardous traditions of their culture, that there are other cultures that do not follow the same customs of self-harm, they are essentially given the important choice of determining for themselves what they feel reflects their culture. In South-East Asia, betel chewing remains a widely respected tradition, but amongst the younger generation it is viewed as old-fashioned and embarrassing (Rooney 67). The beauty of culture is that it changes, and as cultures and traditions coalesce in the ever-globalized world, certain practices are maintained and others are lost. Betel chewing appears to be a diminishing practice, and as it fades, it may lose the meaning it has held in South-East Asia’s past. Betel chewing’s ability to extend beyond civilizations, to act as a common and relatable ritual in the region, and to serve as a purveyor of the South-East Asian tradition of sharing and communicating is quite notable. It shows how a regional practice can become a cross-cultural phenomenon. Betel chewing may be waning as a cultural practice, but the South-East Asian tradition that lies behind it, that of sharing enjoyable experiences and communicating cross-culturally, will likely persist.



References

“Alive and Spitting.” Gulf News, Al Nisr Publishing LLC (27 February 2009), United Arab Emirates.
Cassey, Brian. “Chewing Over a Betel Ban.” The Sydney Morning Herald (9 November 2015), Sydney, Australia.
Huntington, Samuel P. “The Clash of Civilizations?” Foreign Affairs (Summer 1993).
Lockard, Craig A. South-East Asia in World History (1st ed.). Oxford University Press (2009), New York, NY.
Reid, Anthony. “From Betel-Chewing to Tobacco-Smoking in Indonesia.” Journal of Asian Studies, Vol. XLIV, No. 3 (May 1985), Ann Arbor, MI.
Rooney, Dawn F. Betel Chewing Traditions in South-East Asia (1st ed.). Oxford University Press (1993), Kuala Lumpur, Malaysia.
Seervai, Shanoor. “India’s Battle to Ban Chewing Tobacco.” The Wall Street Journal (12 June 2013), New York, NY.
Tacon, Dave. “They’re Dressed to Thrill but It’s the Betel Nut That Stimulates.” The Sydney Morning Herald (11 March 2012), Sydney, Australia.
Tun, Zarni. “Myanmar Government Issues New Guidance for Reducing Betel Nut Chewing.” Radio Free Asia, translated by Khet Mar, written in English by Roseanne Gerin (14 June 2016), Washington, DC.
Wagner, Roberta. “FDA Response to Resolution 2013-02.” Department of Health and Human Services, Association of Food and Drug Officials (24 March 2014), College Park, Maryland.


III

ENVISIONING OUR FUTURE

The Evolution and Future of the Power of the Federal Government of the United States
Christopher Chau, Junior History & Political Science Major, French Minor

The Meaning of Semen in Society, Pornography, & Reproduction
Shali Pai, Junior Sociology of Health Major, Healthcare Management & Biology Minor

Beyond the West
Armon Fouladi, Junior Actuarial Science Major, History & International Business Administration Minor

Marriage Equality: A Controversial and Ill-Considered Goal
Asia Kopcsandy, Sophomore Women's Studies Major, Sociology & French Minor

How Colonial Rule and Nationalism Inform the Understanding of Mali's Fragile State
Amanda Morrison, Freshman Global Studies & Strategic Communication Major, Spanish Minor



The Evolution and Future of the Power of the Federal Government of the United States Christopher Chau

Since the ratification of the United States Constitution in 1787, the power and authority of the federal government have grown tremendously. After replacing the weak and ineffective government originally created by the Articles of Confederation, the Constitution became a guiding force for the future development of the United States for many years to come. With its profound influence over the average American’s life, its power continues to expand to this day. At the time of its inception, the powers of the federal government were mostly concentrated in Congress. Article I, Section 8 lists the specific duties and responsibilities of Congress, such as the ability to coin money, regulate commerce, and declare war. This was unprecedented, as the previous government under the Articles of Confederation had been denied these powers, which were reserved to the states. The Constitution also stated that Congress had the authority to make all laws “necessary and proper,” as well as the power to levy taxes to provide for the “common defense and general welfare of the United States.” These clauses, because they were extremely vague and ambiguous, invited further interpretation by the Supreme Court, which allowed the government to expand its power even further. Notably, McCulloch v. Maryland in 1819 disputed the federal government’s authority to establish a national bank. The Marshall court, under Chief Justice John Marshall, ruled in the government’s favor, arguing that chartering a national bank was permissible under the elastic clause because it was in the United States’ best interests. This also established that states had no power over the federal government and reaffirmed the Supremacy Clause of Article VI, Clause 2, which states that the Constitution is the “supreme law of the land”: state laws cannot overrule federal law, and all must obey its authority.

After the Civil War, new amendments were added to the Constitution to protect the civil liberties of the newly freed slaves. The Thirteenth, Fourteenth, and Fifteenth Amendments sought to incorporate African Americans into society. During the period known as Reconstruction, the former Confederate states were excluded from this process to ensure that the new laws were being carried out. This was unprecedented at the time, since it marked the first time that the federal government had a free hand to intervene in what would normally be considered state affairs protected by the Tenth Amendment. States could no longer deprive any citizen of life, liberty, or property without due process of law, and all were to be treated with equal protection under the law. The Fourteenth Amendment eventually became the key precedent regarding civil rights and equality among citizens. Nearly one hundred years later, the issue of civil rights sprang up again in the United States over the Jim Crow laws of the southern states, which not only allowed for but promoted legislated segregation. In 1954, Brown v. Board of Education of Topeka overruled the earlier Plessy v. Ferguson doctrine of “separate but equal” facilities in public schools. To enforce the ruling, President Eisenhower sent in the National Guard to ensure that integration was implemented. The subsequent Civil Rights Act of 1964 banned segregation in public accommodations, resting on the Commerce Clause of Article I, Section 8, and the Voting Rights Act of 1965 allowed the federal government to intercede in state and local elections.

In the early days of the United States, the nation was largely composed of small, independent farmers. The economy was mostly family based, and there was little need for federal intervention. During the colonial period, Great Britain had managed the thirteen colonies under an economic system known as mercantilism, which revolved around government intervention in the market through high tariffs and taxes in an attempt to garner as much wealth as possible for the mother country. The colonists were distrustful of this system, so after independence they wanted a completely free hand in managing the market. As the nineteenth century progressed, industrialization took hold of the country and new factory workers relocated to the cities. Urban areas quickly swelled with people, and there was a stronger need for the government to provide sanitation to control the spread of disease, law enforcement to address rising crime, fire departments, and routes for commerce such as roads, canals, and railroads. The transcontinental railroad was built partly through eminent domain, as the federal government took land from private owners and conveyed it to the companies constructing the line. As life became more industrialized, workers were often at the mercy of bosses who could dismiss them for no reason whatsoever or crowd them into extremely unsafe working conditions for long hours at very low wages. Unions were organized so that workers could act collectively to initiate change, and in the early twentieth century the federal government was once again called upon to intercede to protect the rights of workers. With legislation such as the Clayton Antitrust Act and the Federal Trade Commission Act of 1914, the Progressive Era of the Roosevelt, Taft, and Wilson administrations saw a rapid expansion of governmental authority to regulate big business and keep trusts from becoming too powerful.

Although the 1920s after World War I saw a shift toward less government power, the Great Depression demanded that the government help lift the country out of its economic misery. In what became his New Deal, Franklin D. Roosevelt provided relief to the rapidly growing ranks of the unemployed and the elderly with the Social Security Act of 1935 and tried to prevent another economic catastrophe with the Glass-Steagall Act, which forced the separation of investment and commercial banking in order to protect the public's accounts in commercial banks. In addition, the National Labor Relations Act permitted the federal government to mediate labor disputes.

By the twenty-first century, most jobs had moved from the agricultural and manufacturing sectors to the service sector, which incorporates a wide range of occupations, including retail, healthcare, education, entertainment, software, finance, and law. Since promotion to higher-income jobs requires education, the government has had to invest more of its budget in schooling. Social attitudes have also changed as people live longer lives, creating a need for increased government investment in healthcare and in people's personal lives. Programs such as Medicare and Medicaid, enacted in the Social Security Amendments of 1965, have remained steadily popular, with approval ratings of 77 and 68 percent respectively in an April-May 2015 survey by the Kaiser Family Foundation.

Over more than two centuries, the U.S. federal government has grown into one of the most powerful forces in the lives of its citizens. The ambiguity of its powers in the Constitution allowed the interpretation of its duties to reach far beyond the document's literal meaning. As social attitudes change, the federal government will only grow more powerful in order to better serve the needs of its people. Regardless, society will continue to move forward, and the government will need to foster that advancement as it always has throughout history. If its power wanes, the country will lack the cohesion to continue forward and may regress to the fragmented confederation that existed before the Constitution. The framers intended the Constitution to be a flexible and changeable document, one that can be fitted to society's needs, whatever they may be, and, hopefully, to the betterment of the United States and its people.



References

Bailey, Thomas A., David M. Kennedy, and Lizabeth Cohen. The American Pageant: A History of the Republic. 11th ed. Belmont, CA: Wadsworth Publishing, 1997. Print.

DiNunzio, Mario. Great Depression and New Deal: Documents Decoded. ABC-CLIO, 2014. Web. 28 Sept. 2016. <http://www.myilibrary.com.libproxy.temple.edu?ID=643938>

Greenberg, E. S., & Page, B. I. (2014). The Struggle for Democracy (12th ed.). Boston: Pearson.

Mangan, Daniel. "Medicare, Medicaid popularity high: Kaiser." CNBC.com. NBCUniversal, 17 July 2015. Web. 2 Oct. 2015.

Wilentz, Sean, and Jonathan H. Earle. Major Problems in the Early Republic, 1787-1848. 2nd ed. Boston, MA: Houghton Mifflin, 2007. Print.



The Meaning of Semen in Society, Pornography, & Reproduction Shali Pai

Semen is defined as "the viscid, whitish fluid produced in the male reproductive organs, containing spermatozoa." Cum, jizz, splooge, swimmers, nut, seed, wad, spunk, baby batter: this substance goes by many names and has carried much meaning over the span of history, but that meaning always seems to change. We view semen as life-giving, but sometimes we see it as dirty. Of course, that all depends on the situation, and on when (or where) it comes out, literally. It is undeniably important in society, culture, pornography, and reproduction. This essay explores the history of semen, how we think of it now, and the different roles it plays in different situations. For example, semen at a crime scene is viewed differently from semen held in a sperm bank or semen found in a porn video. Most importantly, this essay examines why we let this bodily fluid of many names have so much power and meaning in today's world and media.

Semen has been an integral part of life not only because of its power to create but also because of the power it holds in various arenas and the way its meaning has shaped our society and the role of men. In "Sexual Symbolism, Religious Language and the Ambiguity of the Spirit," Ralph Norman explores how semen has been invoked again and again through euphemisms in religious and historical texts. He writes, "In the seventeenth century, the word 'spirit' stood euphemistically for semen…Donne also drew on renaissance ideas of metempsychosis which allowed him to view sperm as something physically connected with the spirit of a man, and potentially associated with the Holy Spirit himself" (Norman 233). On this reading, even the framework of Catholicism symbolizes the importance of masculinity through its allusions to the "Father, Son, and Holy Spirit," in which the Holy Spirit stands in for man's sperm.

The sexual connotation behind the "spirit" persisted in the writings of famous men of the Elizabethan age, not least William Shakespeare. Also referred to as a "divine seed," sperm was seen as a magical substance capable of travelling through the entire body. During orgasm, ejaculation was considered the expulsion of one's soul, justifying why men were so exhausted after sex. Ejaculation was regarded as an almighty and seemingly uncontrollable action, comparable to the power of the Holy Spirit. This idea would explain why we remain complacent about the trope of men's insatiable sex drive. It is very easy to dismiss male sexual misconduct with the phrase "boys will be boys" because we have been conditioned to think the male libido, or semen itself, is so strong and uncontrollable that whoever is on the receiving end must be submissive.

Norman also mentions how the "reproductive potential of sperm" was credited for male skill and prowess in writing, although it was mainly white males who were educated in the skill of writing during this period. He suggests that males' ability to write was thought to stem from their ability to produce sperm. This idea of semen giving men certain advantages over women persists in various cultures and ultimately helps reinforce gender roles.

This sexual double standard has been discussed many times: a man with many sexual partners is seen as manly and desirable, but a woman with many sexual partners is considered slutty, loose, easy, and less feminine. This may stem from the idea that it is a man's right to have sex with as many women as he can because sperm is biologically cheap in comparison to the number of eggs the average woman can produce. Sperm is, by nature, far easier to acquire than an egg. Any time a man ejaculates he has the potential to give life, but a woman is fertile only in very specific windows of ovulation and does not need to experience orgasm to get pregnant. On a very simple biological level, only the man has to experience sexual pleasure to orgasm and deposit his semen in a female for impregnation to occur. The woman's pleasure and satisfaction are simply optional in the pursuit of creating a child, whereas a man's pleasure is mandatory because without it we would not have semen.

Semen is viewed as immensely valuable in many different cultures around the world. For example, in traditional Chinese medicine, semen is regarded as a form of sexual energy called "jing," and the ejaculation of semen is seen as "energy suicide" (Mandal 1). The idea was that every time you exhausted your fixed amount of this energy through things like sex, stress, or drugs, you were shortening your life. To acquire more "jing," one was supposed to do positive things like studying, exercising, and praying (Mandal 1).

Perhaps this was just a way to convince youths to put their time into good things like education, health, and religion, but by grouping sex with negative things like stress and drugs, sex develops a negative connotation. In Chinese culture, semen is seen as something very precious, not to be wasted on casual sex or on sex purely for pleasure. "Wasting semen" also includes having sex out of wedlock or any sex without the intention of creating life; ejaculation, that "energy suicide," is an action to be completed solely for the sake of creating life. This enforces the ideas of purity and sacredness associated with sex without pleasure, and of women only being able to have contact with semen if the purpose is life-giving (Mandal 1). Chinese culture values semen for its functional purpose, whereas the culture of Papua New Guinea sees semen's worth as much more symbolic.

In Papua New Guinea there is another meaning of semen that results in a very distinctive cultural practice. Dr. Ananya Mandal writes in her article "Semen and Culture" that "the tribes believe the semen of older men can bestow manliness and wisdom to younger men and for this the men need to fellate their elders to receive their authority and powers" (1). Once again, semen is regarded as sacred, but only when ingested by younger men in a non-sexual way. In contrast to modern-day American pornography, where ingesting semen is seen as vulgar and degrading, these young men in Papua New Guinea view it as a coming-of-age ceremony.

While writings of the past glorify and praise men for their ability to produce semen, biological texts, remarkably, prove to be no different. In a review of the book Sperm Counts, Edwin Davies writes, "Sperm is anthropomorphized in order to give it traditionally masculine characteristics – describing them as assertive or competitive… heroic, active figures overcoming obstacles to achieve a goal. The egg, meanwhile, is overwhelmingly depicted as being female, and remains passive" (1). The value placed on semen as powerful and in control at a microscopic level transfers into standard pornography, where scenes end with the man ejaculating either inside the woman or somewhere on her person, signifying that she has been conquered and that, now that the man has finished, sex is over. By this logic, a man is justified in his sexually aggressive acts by the overwhelming power of semen. This transfers into social constructs that cast the male as superior to the inferior woman, who must be reminded of her place through degradation by the mighty semen.

To further expand on the meaning of semen in pornography, we should look to what Tracy Clark-Flory regards as "the defining aesthetic of modern porn," otherwise known as the "money shot": the moment when the male porn performer ejaculates onto a partner. In her article "Explaining the 'Money Shot,'" she expands on how the money shot originated and how it has affected current society on a psychological level. There is an "implied degradation of the phallus 'spitting' on the woman's face – that part of the body which is most associated with one's individual dignity and personality…Moore sees the money shot as a way of 'marking' a partner as 'territory or property'" (1). Semen, then, is not only seen as a valuable seed; it is also a way to dominate, mark, and render less pure whomever it lands on. Semen, once a mark of knowledge in traditional literature and thought, now represents a mark of dominance, especially when in contact with a woman. This duality of the sacred and the profane, depending on semen's context, is most blatant in the environment of pornography.

A more extreme version of the so-called money shot is bukkake, a type of sexual activity in which several men ejaculate on another person, a fetish that originated in Japanese pornography in the mid-1980s. In Gail Dines's article "Porn, Syphilis and the Politics of the Money Shot," Bill Margold, a veteran porn actor and producer, is quoted saying, "The most violent we can get is the cum shot in the face. Men get off behind that, because they get even with the women they can't have. We try to inundate the world with orgasms in the face" (Dines 1). In this scenario, semen on a woman's face symbolizes violence and aggression. Ejaculating on a woman's face is a primal way of indicating that she has been conquered and the man has "won." It should be noted that scenes like the money shot and bukkake end only once the male has ejaculated, with absolutely no regard for the pleasure of the woman or receiver of the semen. In fact, it is preferred that the woman get minimal pleasure, since being ejaculated on is seen as her punishment.

Clark-Flory's most striking point is that the ejaculation of semen onto a partner "is not merely a carnal fantasy; it is also an emotional one – a fantasy of 'unconditional acceptance' in which the female – or male, presumably – seems to say 'I exist wholly for you. I will never reject you. You cannot disappoint me'" (1). When a man ejaculates on his partner, then, while the act symbolizes dominance, there is also a level of acceptance that comes with openly receiving a man's semen. There is a strange romanticism in the idea of getting consent from the one you love or care for to do something degrading to them. This acceptance is sought after by men, so it is undoubtedly frustrating when a man cannot "perform" or properly ejaculate during sexual intercourse. Why does so much of a man's identity rest on his sexual performance? Men who suffer from erectile dysfunction feel emasculated by the condition. It is clearly a serious matter for many men and for our society, and a reason for a man to feel lesser or abnormal, even though a Huffington Post Healthy Living article reported that, "according to the Cleveland Clinic, as many as 52 percent of men experience erectile dysfunction, with it affecting 40 percent of men age 40, and 70 percent of men age 70" (Chan 1).

In January of 2016, news was released about a possible birth control option for men, described as a "tiny, switchable valve" that, once implanted, "prevents sperm from entering the semen stream." That same month, BuzzFeedVideo released a video titled "Would You Use a Switch That Turns off Your Sperm?" in which they asked several men on the streets of L.A. whether, given the opportunity, they would implant this switch as a form of birth control. Only one of the six men agreed to it, with the rest saying that it just "wasn't for them" or seemed "weird." One of the men, asked if he would use a switch to turn his sperm on and off, said, "No. Not at all. Yeah I don't want anyone messing with my jawn. My jawn is private…I like the idea, just not for me." While his response was met with laughter from the interviewer, the statement alone shows regard only for his own release and pleasure, with little thought of the woman's preference or safety. While these answers can be rationalized, it is hard to deny the unfairness of the burden of preventing pregnancy falling upon the woman. Society is more comfortable letting men run free with their powerful semen while women must be mindful and more educated on things like birth control and protection, because sex is still seen as a male-driven activity.

Men as a whole are generally not as educated about their reproductive systems as women are. For example, many men do not realize that semen and sperm are two different things: even with a switch to shut off their sperm, they would still ejaculate fluid when they climax. There are two problems with this way of thinking. The first stems from a lack of education on male reproduction and the fact that male fertility and sexual pleasure are so intertwined that men would rather run the risk of getting a woman pregnant than not be able to ejaculate fluid when they climax. The second problem is the belief that sex is not completed until the man has ejaculated semen. The mindset that lets the man worry solely about his own pleasure while the woman worries about everything else is enforced by the idea of letting "boys be boys" because of the seemingly insatiable sex drive that stems from their powerful semen.

There are plenty of other everyday situations in which semen takes on different meanings. Semen in a sperm bank is seen as a valuable and expensive resource for women or couples who want a child. Semen at a crime scene is vile and damning evidence of what is often sexual violence, and the people investigating are disgusted by the acts of an evil man. Semen found in a hotel room under black light is seen as disgusting and dirty, the assumed result of sex between an impure man and woman. The list of examples goes on, but the ultimate point is that the fluidity of semen's meaning has helped support a duality of good versus evil semen and, as a result, good people versus bad people.

As we become more educated about society and the roles of both men and women, we must make sure to learn what brought us here. The value placed on semen has had a hand in molding the gender role of men and, as a result, the gender role of women. It is ironic that while semen is viewed as pure and life-giving, when it comes in contact with a woman, the woman is regarded as sullied and impure. There is an undeniable trade-off with sex in which a woman cannot win. Even if she does everything "right" by traditional standards and waits until marriage to have sex only for the sake of having a child, she will still be seen as used and less valued because she has come in contact with semen and is therefore no longer virginal and pure. This double standard is amplified by history, pornography, literature, cultural practices, and general mindsets all over the world. The contrast of pure semen versus impure women continues to affect us today. The gender roles we still employ derive from an amalgamation of many factors, but it is clear that the value placed on something as simple as a man's semen has made its mark.

References

BuzzFeedVideo. "Would You Use A Switch That Turns off Your Sperm?" YouTube. January 19, 2016. Accessed February 16, 2016. https://www.youtube.com/watch?v=ivBS8cWvxfc.

Capogrosso, Paolo, Michele Colicchia, Eugenio Ventimiglia, Giulia Castagna, Maria Chiara Clementi, Nazareno Suardi, Fabio Castiglione, Alberto Briganti, Francesco Cantiello, Rocco Damiano, Francesco Montorsi, and Andrea Salonia. "One Patient Out of Four with Newly Diagnosed Erectile Dysfunction Is a Young Man—Worrisome Picture from the Everyday Clinical Practice." The Journal of Sexual Medicine 10, no. 7 (2013): 1833-841. doi:10.1111/jsm.12179.

Chan, Amanda L. "Erectile Dysfunction May Affect 1 In 4 Men Under 40 Seeking Treatment, Study Suggests." The Huffington Post. June 11, 2013. Accessed April 26, 2016. http://www.huffingtonpost.com/2013/06/11/erectile-dysfunction-young-men-age40-younger_n_3405085.html.

Davies, Edwin. "BIONEWS - Book Review: Sperm Counts - Overcome by Man's Most Precious Fluid." BioNews. May 28, 2012. Accessed February 26, 2016. http://www.bionews.org.uk/page_146174.asp.

Dines, Gail. "Porn, Syphilis and the Politics of the Money Shot." The Guardian. 2012. Accessed April 27, 2016. http://www.theguardian.com/commentisfree/2012/aug/28/porn-syphilis-money-shot-condoms.

Mandal, Ananya, MD. "Semen and Culture." News-Medical.net. September 13, 2013. Accessed April 27, 2016. http://www.news-medical.net/health/Semen-and-Culture.aspx.

Norman, Ralph. "Sexual Symbolism, Religious Language and the Ambiguity of the Spirit: Associative Themes in Anglican Poetry and Philosophy." Theology & Sexuality 13, no. 3 (2007): 233-56. doi:10.1177/1355835807078258.



Beyond the West Armon Fouladi

There is no denying that the American film industry is not wholly American. With growing film industries in most countries and the rise of globalization, it is impossible for something as big as Hollywood to be uninfluenced by foreign film. Many of the industry's biggest names, past and present, are immigrants, such as Roman Polanski and Hugh Jackman. Many films are set overseas, and many more are filmed there as well. A multitude of essays show how foreign media has influenced American films. Yet two regions are constantly overlooked and marginalized: the Middle East and Africa. There are two reasons for this: relatively low immigration from these regions to America, and the negative view the West holds of the regions themselves. The immigration factor is apparent in the statistics. Since 1960, Africa has never accounted for more than 5% of total U.S. immigration (MigrationPolicy), while immigrants from the Middle East have figured prominently only since 1970, jumping from 200,000 to 1.5 million over a 30-year period (CIS). By comparison, the Americas (excluding the United States) have accounted for 20 to 50% of immigration since 1960 (MigrationPolicy). The less quantitative reason, Western negativity towards Africa and the Middle East, will be addressed throughout this paper. I intend to examine the roots of that negativity, how American film has portrayed the Middle East, and how it has portrayed Africa.

The Western negativity towards these regions has two intellectual roots: ancient Western writers and Orientalism. Ancient Romans and Greeks began the tradition of negative views towards Africa and the Middle East with writers like Herodotus (484-425 BCE) and Strabo (64 BCE – 24 CE), who described Africa as infertile and belittled its population. These writings, whose inaccuracies would not be corrected until late in the 20th century, were the beginning of the negative views that would mark attitudes towards the Middle East and Africa, and Western reverence for these early "classical" works perpetuated them. The second root of the negativity American film directs at these regions is "Orientalism," a term coined by Edward W. Said to describe Western views of the Middle East (which many then referred to as the Orient). Said describes Orientalism as "a distribution of geopolitical awareness into aesthetic, scholarly, economic, sociological, historical, and philological texts; it is an elaboration not only of a basic geographical distinction…but also of a whole series of 'interests' which … it not only creates but also maintains" (12). Said also ties this movement to the superiority complex the West holds with respect to the Middle East and Africa, pointing out that "Orientalism depends [on its] flexible positional superiority, which puts the Westerner in a whole series of possible relationships with the Orient without ever losing him the relative upper hand" (7). These underlying intellectual roots dominate Western thought and shape the views Hollywood holds of the Middle East and Africa, which in turn shape public perception of these regions.

One of the most covered areas in the Middle East is Egypt, specifically Egypt during the period of antiquity. Ancient Egypt is special because it was quasi-adopted into Western society. The Ancient Greek historian Herodotus (484-425 BCE) claimed that much of Classical Grecian society was derived from Egypt, including its religion (and, by implicit proxy, its culture). This claim, along with the long period during which Egypt was under Ptolemaic and Roman control (332 BCE – 641 CE), accounts for the Western "adoption" of Ancient Egyptian society, and it results in Ancient Egypt being treated slightly better than its contemporaries. A good example of this is Mankiewicz's Cleopatra (based on the writings of several Roman historians, namely Plutarch and Appian), which shows Cleopatra VII during the Roman Civil Wars and the takeover of Egypt by the Romans.

This is the closest one will see in this region to a "native" (Cleopatra) being considered an equal to a colonizing European. The portrayal of Arabs is positive, as there is "Creative cinematography, realistic costumes, and authentic Roman and Egyptian settings [to] provide spectacular sights. The scenario also refers to Roman politics and Egyptian contributions, noting that Rome's greatness came about, in part, because of Egypt's richness of corn, wheat, gold, and jewels" (Shaheen, 152). A more typical example of how Egyptians are treated is seen in 1932's The Mummy, in which an Ancient Egyptian priest is inadvertently resurrected and begins terrorizing a group of archaeologists while trying to resurrect his dead lover. This is expanded upon in the 1999 remake, though several plot points are changed and the depiction of Egyptian antiquity is more misconstrued than in the original. The 1999 film is also more racist towards Arabs and Egyptians than its 1932 counterpart: according to Jack Shaheen, the 1932 Mummy doesn't offer any real negative stereotypes of native Arabs (358).


In contrast, the remake features Egyptians more prominently; however, they "appear as hostile, sneaky, and dirty caricatures" (359). The reason for the difference in treatment is partly political. Although Arab stereotypes are the oldest stereotypes still used in film (Shaheen, 1), they truly exploded in the 1970s, becoming Hollywood's go-to stereotypes. Shaheen refers to this as "The New Anti-Semitism," noting that Arabs are technically Semites and that "many of the films directed against Arabs were released in the last third of the twentieth century, at a time when Hollywood was steadily and increasingly eliminating stereotypical portraits of other groups" (12). Even films that do not really deal with Arabs sometimes have anti-Arab slurs and stereotypes stuffed into them, with 120 films featuring such "cameos" between 1980 and 2001 (Shaheen, 33). Semmerling offers an explanation for this: he claims that the rise of OPEC and Hezbollah terrorists in the 1970s, coupled with America's defeat in Vietnam and stagflation, resulted in Americans projecting their frustrations and anger onto Arabs, making them scapegoats and villains (7-21). Prior to the 1970s, villainous Arabs usually took the form of slavers or sheiks who sat lazily on the throne and were often oversexed (Shaheen, 27). From the 1970s onward, however, Arabs have become almost the go-to bad guys, with sheiks and terrorists playing the villains in numerous films. Four films that involve Arab villains are examined here: Disney's Aladdin (1992), Russell's Three Kings (1999), Friedkin's Rules of Engagement (2000), and Eastwood's American Sniper (2014). The first of these, Aladdin, serves as Disney's interpretation of the tales from 1001 Nights. Shaheen notes that the film contains several racist elements, especially the fact that all the heroes are anglicized while the others are shown as "ruthless, uncivilized, caricatures" with "large, bulbous noses and sinister eyes" (57). Also of note is that the film does not take place in Baghdad, a real place known as a center of culture, but in Agrabah, a mystical backwards kingdom (Shaheen, 57), and that Arabic names are mispronounced while the Arabic writing in the background is actually nonsensical scribbles (Shaheen, 58).

Three Kings tells the story of a group of U.S. soldiers at the end of the Gulf War who go looking for a stolen cache of gold and wind up helping a group of Iraqi refugees escape custody. There is disagreement over whether this film is actually racist. Shaheen, who was brought in as a consultant (518), says the film shows multiple sides of the Iraqi people, presenting dignified Iraqi rebels helping Americans and respecting Islam, which many films do not (519). Semmerling, however, disagrees. He says that "Arabs are still portrayed as threatening to America" (125) and cites a torture scene in the film (151) as undermining the film's intention of presenting humane Arabs.


Semmerling writes, "Arabs are still totalized into an exploitable Otherness. They are appropriated as pawns … so that Americans can defend and reassert the myth for themselves….The Arabs in Three Kings are used to help the audience legitimate and feel better about regenerating its militaristic national self through the classic war genre" (161-2). This shows how a film that features sympathetic Arabs can still be construed as racist.

Conversely, both critics agree that Rules of Engagement exemplifies the racist treatment of Arabs. The film, which deals with the court-martial over a rescue mission in Yemen, is widely considered anti-Arab. Shaheen calls it "one of the most anti-Arab scenarios of all time" (434). He cites the 15-minute rescue mission scene as one of the main offenders, from the unruly and uncivilized behavior of the local Yemenis to Samuel L. Jackson yelling "waste those motherfuckers!" to the marines, who then proceed to shoot 83 people in the crowd (434-5). Semmerling concurs with Shaheen's observations, saying that the film is a conservative attack on multiculturalism (169). Both note that this is a prime example of how the Arab is made the villain.

This is similar to American Sniper, a film adaptation of Navy SEAL sniper Chris Kyle's autobiography. Rania Khalek writes that the film "valorizes American military aggression while delivering Hollywood's most racist depiction of Arabs in recent memory" (Khalek). While toning down the racism that permeates Kyle's original book (Khalek), the film is still highly anti-Arab. Even if one ignores the fact that Arabs are almost always referred to as "savages," there is still the fact that almost every single Arab character is a killer connected to Al-Qaeda, from a friendly resident who turns out to have a cache of weapons to an enforcer known as "the Butcher." The one Arab character who does not try to kill Chris or his men is a man known, unceremoniously, as "The Sheik," who agrees to sell out the aforementioned Butcher for $100,000. He is then promptly killed by the Butcher, who also murders his son by drilling a hole in his head. Almost nothing is done by anyone in the film to portray Arabs as anything other than terrorists.

The other region to be examined is Africa, in particular Sub-Saharan Africa. Africa's colonization gained steam at the Berlin Conference of 1884, during which the major European powers divided up Africa amongst themselves with little to no regard for the Africans already living there. Many of the initial stereotypes and archetypes that dominated stories about Africa were being formed just as film was being invented (Cameron 11). This, coupled with the centuries-old societal racism that pervaded America, helped create a foundation for films about Africa that would persist for most of the 20th century and still has a profound impact. Several archetypes emerged in fiction about Africa, the most important being:

● The White Hunter – A white man (oftentimes a big game hunter) who has lived in Africa for some time and is knowledgeable about the area and its people. A prime example, and the basis, of this archetype is Allan Quatermain, who first appeared in the book King Solomon's Mines by H. Rider Haggard (1885). A connected archetype is The Explorer, which functions as a parody of the White Hunter.

● The Jungle Lord/Queen – A white man or woman raised in the African wilderness by either wildlife or an African tribe. The best example of this archetype is Tarzan, created by Edgar Rice Burroughs.

● The True-Blue White Woman – A white woman who is not of Africa but a tourist. This archetype often serves as a damsel in distress and is frequently the love interest. Well-known examples include Jane Porter from the Tarzan franchises.



● The Good African – Usually a native who is either subservient or friendly and helpful to the white characters of the film. A noted example is Umbopa from King Solomon's Mines.

● The Savage/Hostile Black – The antithesis of the Good African. This archetype has shifted over time from violent indigenous tribes to black revolutionaries targeting whites (Cameron 113-4).

Other important features of the portrayal of Africa in American film include a lack of geographical accuracy and the idea of lost civilizations, which arose mostly because so much about Africa was still unknown at the time (Cameron 11). While many of the archetypes and features have faded over time, especially the White Queen and the Noble Savage, they can still be seen in two recent films: Disney's Tarzan (1999) and Zwick's Blood Diamond (2006). The earlier of these, and the one bearing the most hallmarks of a colonialist film, is Tarzan. Despite Disney's best efforts to distance the film's content from its racist and imperialist roots, many of the archetypes and hallmarks still pervade it. These include geographical nonsense (a jungle next to an ocean), a True-Blue White Woman (Jane Porter), a White Hunter (Clayton), the Explorer (Professor Archimedes Q. Porter), as well as the quintessential Jungle Lord himself, Tarzan. Also worthy of note is a mid-movie montage in which Jane, Porter, and Clayton take turns teaching Tarzan various aspects of Western culture, reflective of the missionaries and government teachers who came into Africa during the imperialist period, as well as the way Disney wrote Tarzan and his animal friends into recognizable familial archetypes (Lin, 36). All this shows, according to Lin, that while "Disney's animated films appear to champion multicultural tolerance and understanding, they merely reinforce Western essentialist constructs of the noble savage, and present an image of a universal human identity while neglecting any deeper differences and nuances between cultures" (40). Tarzan is very indicative of how many films tend to treat Africa.

Blood Diamond, despite focusing on serious issues concerning Africa, still uses many archetypes that originated during the imperialist era. Danny Archer (Leonardo DiCaprio) is a modernized, disillusioned version of the White Hunter, while Maddy Bowen (Jennifer Connelly) is an updated True-Blue White Woman. Interestingly, the main African character, Solomon Vandy (Djimon Hounsou), does not fall neatly into either of the two African archetypes, even though some critics think Vandy fits the "Good African" mold (Hasian/Anderson/Wood, 235-6). There is evidence contrary to this analysis, as there are multiple scenes where Vandy clashes with Archer, which runs against the "Good African" form; the archetypes that fall to Africans are instead fulfilled by minor characters (like the bartender and the rebels). Despite this, there is still a subconscious vein of white paternalism throughout the movie: it is only through the help of the two principal white characters (Archer and Bowen) that the principal black character (Vandy) is able to rescue his family and achieve his goals.

This paper has demonstrated the intellectual roots of Western views of the Middle East and Africa, how these views have been translated into American films, and how little impact films from these regions have had on American cinematic production. Clearly, this has hurt the image of both regions in cinema and in the public mind. There is no way of knowing whether this will change any time soon, but one can hope that these regions become a bigger part of American cinema.



References

Camarota, Steven A. "Immigrants from the Middle East." Center for Immigration Studies. Center for Immigration Studies, Aug. 2002. Web. 13 Apr. 2015.

Cameron, Kenneth M. Africa on Film: Beyond Black and White. New York: Continuum, 1994. Print.

Galindo, Brian. "19 Things You Might Not Know About "Aladdin"." BuzzFeed. BuzzFeed, 31 July 2013. Web. 13 Apr. 2015.

Khalek, Rania. ""American Sniper" Spawns Death Threats against Arabs and Muslims." The Electronic Intifada. N.p., 22 Jan. 2015. Web. 13 Apr. 2015.

Lin, Ng S. "Disney's Brand of Boutique Multiculturalism." N.p., n.d. Web.

McKerrow, Raymie E., Marouf Hasian, Jr., Carol W. Anderson, and Rulon Wood. "Cinematic Representation and Cultural Critique: The Deracialization and Denationalization of the African Conflict Diamond Crises in Zwick's Blood Diamond." JSTOR. NYU Press, n.d. Web. 13 Apr. 2015.

Said, Edward W. Orientalism. New York: Vintage, 1979. Print.

Semmerling, Tim J. "Evil" Arabs in American Film: Orientalist Fear. Austin: U of Texas, 2006. Print.

Shaheen, Jack G., and William Greider. Reel Bad Arabs: How Hollywood Vilifies a People. New York: Olive Branch, 2009. Print.

"U.S. Immigration Trends." Migrationpolicy.org. N.p., 23 Jan. 2013. Web. 13 Apr. 2015.



Marriage Equality: A Controversial and Ill-Considered Goal Asia Kopcsandy

Over a century ago, Karl Kertbeny coined the term "homosexual" to describe someone who feels same-sex attraction (Meem 32). Since then, it has been no easy task to establish LGBTQ folk as world citizens who deserve the same freedoms as straight people. The struggle continues, but not without victories that have restored what can sometimes be a withering hope. Just over a year ago, on June 26th, 2015, same-sex marriage became legal in the United States. The response of the American public and media was mixed, as one would expect from a nation so split ideologically. While there was much talk of liberal pro-marriage-equality and contrasting conservative anti-marriage-equality beliefs, further-left LGBTQ folk who rejected marriage equality as the paramount goal of the LGBTQ rights movement were underrepresented.

The twentieth and early twenty-first centuries have been marked by human rights struggles on a global scale. Among and intertwined with these movements is the fight to have same-sex marriages recognized. In the 1970s, when the Netherlands became the first nation to legalize limited rights for LGBTQ couples, Maryland became the first U.S. state to specifically ban same-sex marriage ("Gay marriage around the world"). The next several decades were fraught with victories and losses for LGBTQ people, the world simultaneously progressing and regressing. In 1996, the same year that Greenland and Iceland legalized registered partnerships, U.S. President Bill Clinton signed the Defense of Marriage Act, banning the federal government from recognizing same-sex unions (Meem 81). Though opposition continued, the year 2000 marked a pivotal positive change that would spread internationally: the Queen of the Netherlands signed the first same-sex marriage bill into law that December ("Gay marriage around the world"). In the coming years, other nations such as Belgium, Canada, Spain, South Africa, and Argentina would do the same, and U.S. states passed bills directly opposing DOMA. In 2015, the U.S. Supreme Court ruled in Obergefell v. Hodges that same-sex marriage must be recognized by all states, declaring marriage equality a right and causing national celebration and uproar (Strangio).

A great number of factors made this event possible. Gay historian George Chauncey writes that marriage has changed in four main ways that have made it more palatable to straight people and more urgent to obtain for LGBTQ people. First, "the freedom to choose one's marital partner became a fundamental civil right" (Chauncey 60), as evidenced by Loving v. Virginia, which struck down bans on interracial marriage in 1967. Second, gender roles within heterosexual marriages have become less strictly defined, making these relationships somewhat more gender-neutral or egalitarian both in the eyes of the law and in daily life. Third, marriage became a "primary nexus for the allocation of" a number of state and private benefits such as tax breaks, welfare, and insurance (Chauncey 71). Finally, between the many denominations that have embraced marriage equality and the fact that marriage is a fairly secular institution in the United States, the power of any Christian denomination to deny marriage rights to an individual has greatly diminished.

The media wasted no time in covering the historic event of American marriage equality. Liberal coverage of the ruling was decorated with gay iconography, using the language of empowerment and triumph over odds through hard work. The American Civil Liberties Union's blog post included a photo of a rainbow flag in the shape of the U.S. overlaid with the words "Love Wins!" (Strangio). It recounted the reflections of the justices and debunked reasons for the prohibition of same-sex marriage, arguing that the "kids need a mother and a father" argument is unfounded. Towards the end, Strangio offered words of affirmation and validation, finishing with a wish to celebrate in the midst of the continued struggle for full LGBTQ rights.

Conservative news sources were predictably disapproving of the ruling, for a variety of reasons. One common justification was rooted in Christianity, or, more specifically, God's supposed condemnation of non-normative sexuality. Pastor Jeffress wrote for Fox News that "Friday's Supreme Court decision represents a collective shaking of our fists in God's face saying, 'We don't care what You say about life's most important relationship. We know best'" (Jeffress). He worried about the implications for churches and Christian businesses that would no longer be allowed to deny services to LGBTQ people. Another source of conservative anger was that the decision was made by the Supreme Court Justices, who are not elected by the people and therefore, the argument goes, do not represent them. Columnists such as Fox's Cal Thomas wrote on this, forgetting that this is not the first time the Supreme Court has taken civil rights matters into its own hands before the American public was quite ready; Brown v. Board of Education, Loving v. Virginia, and Roe v. Wade were similarly historic and transformational cases.

However, one population is routinely underrepresented in both liberal and conservative responses to the ruling: LGBTQ folk who are indifferent to or outright disagree with it. Without critical analysis, these people may seem like mere victims of internalized homophobia. While that may be true for some, many do not fall into this category. There are plenty of LGBTQ individuals who love themselves and are committed to the liberation of their people but disagree with the focus on marriage rights. In an anonymously published leaflet from 1990 titled "Queers Read This," the authors articulate a radical stance that is not uncommon today: "Being queer means leading a different sort of life. It's not about ... being assimilated. ... It's about being on the margins, defining ourselves" (Anonymous Queers 1-2). They articulate that queer people do not, or should not, want to assimilate into the dominant straight culture; they want to be liberated on their own terms instead of being allowed to join or participate in institutions they know are oppressive. They want to retain their difference from straight people and straight culture while also receiving fair treatment, a concept that appears alien to many liberals and conservatives who think the only possible goal of the LGBTQ movement is total equality. The Anonymous Queers know that equality under an unjust system is not a worthy goal; rather, the goal is to achieve liberation from the system. Radical queers also tend to have other very pressing issues on their minds, which will be discussed in the final section of this paper.

Marriage may be a significant victory, but LGBTQ Americans have a great deal of turmoil and struggle to look forward to in the near future. President Donald Trump and Vice President Mike Pence, combined with a Republican-controlled House of Representatives, a Republican-controlled Senate, and several soon-to-be-open Supreme Court seats, spell trouble for the rights of all marginalized people. While Donald Trump has been inconsistent in his views on marriage equality, Mike Pence is staunchly opposed to the LGBTQ movement, having voted several times in favor of restricting LGBTQ rights and claiming homosexuality is a choice that will lead to "societal collapse" ("2016 Presidential Candidates on LGBTQ Rights"). Pence also believes in the power of conversion therapy for young LGBTQ people, a practice that is psychologically abusive and has been discredited by the medical establishment (Stack). It will be quite easy to push discriminatory laws through Congress, just as the Supreme Court will most likely take a very conservative turn.

In addition to a wholly conservative government, the American left is feeling a profound backlash from what is commonly referred to as the "alt-right," the modern neo-Nazi, white supremacist wing of American politics. The alt-right movement is primarily composed of white men who feel victimized by the social progress of the last century and wish to reverse it to ensure their continued privilege. While the movement operates primarily online, it would be a mistake to underestimate the power of its bigotry and hatred toward women, people of color, LGBTQ folk, immigrants, the disabled, and other marginalized peoples. Without a strong and cohesive resistance movement, America may fall into the throes of fascism, threatening the progress achieved thus far and the very lives of the oppressed.


With this looming cloud in mind, there is no shortage of action to take. We may now have the right to marry whomever we want and raise children, but this decision can easily be overturned. Moreover, in the eyes of many LGBTQ folk, especially those further to the left than liberals, it is not nearly the most pressing issue. Homelessness is a serious problem for LGBTQ people, given the realities of housing discrimination and the fact that intolerant families may kick out their LGBTQ children. LGBTQ youth as a whole are twice as likely to become homeless, and homeless LGBTQ youth are almost three times more likely to commit suicide (National Coalition for the Homeless). LGBTQ people in general need easier access to mental health services, but this is especially urgent for transgender people, because doctor approval is currently the only path to gender affirmation surgery and hormone replacement therapy (Teich 56), which helps trans people deal with dysphoria. These issues of homelessness and mental health can be matters of life or death for some LGBTQ people, something the marriage equality movement simply cannot address. While it is clear why same-sex marriage is desirable, the essentialization of the LGBTQ movement as the fight for marriage equality marginalizes queer folk who are not upper-class and white.

The current world climate of LGBTQ rights is, in a word, unprecedented. Though the new administration feels like an impending dark storm, we by no means inhabit a "calm" before said storm: LGBTQ people have never been able to relax when it comes to the struggle for liberation worldwide, and this goal has not yet been achieved. Behind all the rainbow flags and pride parades is a people tired but unwilling to give up. We can only hope that our strength does not waver in the face of new and frightening obstacles, and that we will one day gain the respect we deserve.

References

"2016 Presidential Candidates on LGBTQ Rights." Ballotpedia. N.p., n.d. Web. 04 Dec. 2016.

Anonymous Queers. Queers Read This. New York: n.p., 1990. Print.

Chauncey, George. Why Marriage?: The History Shaping Today's Debate over Gay Equality. New York: Basic, 2004. Print.

"Gay Marriage around the World." BBC News. N.p., 23 Apr. 2013. Web. 04 Dec. 2016.

Jeffress, Robert. "Gay Marriage: Why Supreme Court Got It Wrong." Fox News. FOX News Network, 26 June 2015. Web. 04 Dec. 2016.

Meem, Deborah T., Michelle Gibson, and Jonathan Alexander. Finding Out: An Introduction to LGBT Studies. Los Angeles: Sage, 2014. Print.

National Coalition for the Homeless. LGBTQ Homelessness. Washington, DC: National Coalition for the Homeless, June 2009. PDF.

Stack, Liam. "Mike Pence and 'Conversion Therapy': A History." The New York Times. The New York Times, 30 Nov. 2016. Web. 04 Dec. 2016.

Strangio, Chase. "June 26th: A Historic Day for Equality." American Civil Liberties Union. ACLU, 26 June 2015. Web. 04 Dec. 2016.

Teich, Nicholas M., and Jamison Green. Transgender 101: A Simple Guide to a Complex Issue. New York: Columbia UP, 2012. Print.

Works Consulted

Thomas, Cal. "You've Been Warned, America, Gay Marriage Is Just the Beginning." Fox News. FOX News Network, 30 June 2015. Web. 04 Dec. 2016.



How Colonial Rule and Nationalism Inform the Understanding of Mali's Fragile State Amanda Morrison

The Sahel region of Africa has been home to seemingly endless military and political conflict for over half a century. This geopolitical region encompasses twelve nations that are prone to drought, food shortages, corrupt governments, and terrorism; Sahelian nations are also home to five million malnourished children. Although all twelve countries struggle with instability and food insecurity, the Sahelian nation of Mali is perhaps the most unstable. After Mali ranked high on the Global Vulnerability Index and was classified as a fragile state by the Organization for Economic Cooperation and Development in 2014, the international community turned its attention specifically towards this country.1 Yet even though the instability in Mali has increased in recent years, the vulnerability of the nation is nothing new.

Mali's fragile state has deep-seated roots, stemming back as early as 1312 and the rule of Mansa Musa, and continuing through French colonial rule from 1892 to 1960. Instability persisted through the dismantling of the Mali Federation in 1960 and through an attempt at democracy that ended suddenly with an extremist coup d'état of Mali's governing body in 2012.2 The conflicts created through the process of decolonization left Mali "one of the least developed and most food insecure countries…6th lowest in the world."3 Even though Mali now receives over one billion U.S. dollars of developmental aid each year, it remains unstable due to the presence of radical Islamist insurgent groups.4 The terrorist coup in 2012 halted democratic and anti-terrorism progress in northern Mali, and more recently, a 2015 Al Qaeda attack in the capital of Bamako left twenty-seven people dead.5 Thus, colonial rule and nationalism perpetuated underdevelopment and the growth of terrorism in Mali; colonialism catapulted the West African country into a state of instability that now affects the international community. This paper will examine the historical roots of instability in Mali in order to understand the social, cultural, economic, and political factors that helped create Mali's current unstable government and infrastructure.

When Mali gained independence from France in June of 1960, it did so without a war or overt fight for separation. Papers were simply signed, and the Mali Federation—made up of both Mali (then called French Sudan) and Senegal—became its own governing body, free from French rule. While the lack of conflict over Malian independence might seem like a benefit at face value, it actually set Mali up for failure. Not only did Senegal choose to leave the Mali Federation only two months after it was established, but the lack of unity over Mali's independence left the country without a national identity. In fact, scholars argue that this separation between Mali and its former French colonial ruler signified that the new nation was "standing up to nationalists favoring immediate independence" and revoking the idea of a nationalist state.6 Furthermore, "as anticolonial struggle gathered momentum, it necessarily adopted as the unit of self-determination the colonial territory."7 This means that independent Mali emerged mainly as a product of anti-colonialism and a response to other African countries that had already started to rebuild after decolonization. The sudden Malian independence signified a disconnect between the new social and cultural identity of the state and what some of its inhabitants wanted that identity to be. This social and cultural conflict plunged Mali into an identity crisis that still exists today. Owing to the disagreements between the Tuareg people—ethnic groups living in northern Mali—and Islamist Malians, present-day government conflicts are rarely solved, infrastructure continues to crumble, and terrorism is rampant.8 The lack of a cohesive Malian identity, coupled with the Tuareg's desire to secede and form "their own country in the North, which they call Azawad," creates great sociocultural tension.9 Due to these continued tensions, Mali is unable to progress in other areas.

Unfortunately, sociocultural factors are not the only negative influences on stability in Mali. Economic struggles have plagued this West African country in recent years, and thus the international community funnels aid money to governmental and nongovernmental organizations there every year. The roots of Mali's economic problems can be traced back to French colonialism. The colonial government instilled in Malians the mentality of a conventional production system; as a result, when the country became independent in 1960 and needed to restructure its economy, many citizens opposed industrialization. Baz Lecocq explains further: "The modernization of the economy could only be successful if the mentality of the rural population could be transformed from a backward traditional outlook on production and society to a modern rational one."10 Lecocq argues that Mali was doomed to economic failure from the beginning of decolonization. Economic instability, however, is not the only issue; economic success also depends on food security. Since Mali ranks so poorly on international food security assessments, there is an increased risk of continued economic decline in the coming years.11 Maximo Torero, the division director of the Markets, Trade and Institutions Division at the D.C.-based International Food Policy Research Institute, furthered this idea when he commented on how food insecurity harms a nation's economy:

repercussions of colonialism and contested nationalism. When Mali became its own state in August of 1960, the country’s new “leadership sought legitimization and security through sanctifying their territory as embodying nationality.”14 Mali struggled to get on its own two political feet after decolonization because of nationalist tensions and the sudden creation of an independent government. Although socialism was the dominating political force in Mali throughout the latter half of the 20th century, democracy made a successful appearance in the 1990s and stretched through the early 2010s. Yet just when things were looking up, the political conflicts between the democratic government and the Tuareg rebels caused democracy to come crashing down. Ban Kimoon, former Secretary-General of the United Nations, toured the Sahel region of Africa in 2013 and explained the situation in Mali by pointing out that, “Despite progress made towards re-establishing constitutional order in Mali, the past two years has witnessed a military coup d’état, fighting between Government forces and Tuareg rebels, and the seizure of the north by the Islamists.”15 In this statement, Ki-moon is commenting on the fragile foundation of Malian government following French colonial rule and how it failed to help the progression of democracy in this already unstable country. Additionally, Faith Karimi and Erin Burnett of CNN discussed the recent political climate in Mali that allowed extremism and anti-democratic ideals to take root. They

“Without stable and long-lasting food security, there will be a continued negative effect on human capital and this will raise government fiscal costs, with negative consequences on government public spending. This also will lead to stagnated economic growth in the long term.”12

Since food insecurity has such a negative impact on economic stability in Mali, regional and global efforts are needed to combat this food crisis. While food insecurity seems like a problem that could easily be solved by the international community, the conflicting nationalisms in Mali often prohibit this progress. A RAND Corporation analysis in 2013 concluded that, “all sources of revenue, including international aid, become objects of intense competition.”13 Unfortunately, theft and terrorism have a strong potential to corrupt monetary aid provisions because of the competition between nationalist Tuaregs, other Malians, and even terrorist groups such as Al Qaeda in the Islamic Maghreb (AQIM). Finally, it is impossible to discuss Mali’s current terrorism crisis, along with its failures in government and infrastructure, without also discussing the political __ 12. Torero, Maximo. "Food Security Brings Economic Growth - Not the Other Way Around." International Food Policy Research Institute (2014). Accessed December 14, 2016. https://www.ifpri.org/ blog/food-security-brings-economic-growthnot-other-way-around.

___ 14. Gifford, Prosser, and W. M. Roger Louis, ed. Decolonization and African Independence: The Transfers of Power, 1960-1980. 5. 15. United Nations News Centre. "Africa’s Vast Sahel Region Threatened by Terrorism, Organized Crime, Ban Warns." Last modified December 12, 2013. Accessed December 12, 2016. http://www.un.org/apps/news/ story.asp?NewsID=46726#. WFMlYrYrLLY.

13. Pezard, Stephanie, and Michael Shurkin. "Toward a Secure and Stable Northern Mali: Approaches to Engaging Local Actors." RAND Corporation (2013). Accessed November 19, 2016. http://www.rand.org/ pubs/research_reports/RR296.html. 76


noted, “The Tuareg rebels took advantage of the power vacuum and seized some parts of the north. They have always wanted independence, and have staged several rebellions since the 1960s.”16 Their rhetoric shows the persistence of the problems created during the period of decolonization in Mali. Thanks to the lack of a national social, cultural, and political identity in Mali, Tuareg rebels were able to destabilize progress and unseat democracy in 2012.

After examining numerous sources and scholarly research, it is evident that colonialism and nationalism have profoundly inhibited stability in Mali, both in the past and in the present. To move Mali towards a brighter future of food security and national unity rather than terrorism and instability, Mali's government will need to appease Tuareg rebels in the North while also rebuilding infrastructure in the South. However, evidence shows that Mali should not be working alone to return to a democratic state. A New York Times article explains,

"While it draws scant attention from the Western media, the Sahel-North Africa region is actually more important than Afghanistan to the vital interests of Western powers. North Africa provides energy security for Europe with its vast oil and natural gas deposits, along with maritime security in the Mediterranean. Governments in the region have the potential to foster democratic change in post-authoritarian states."17

Although this analysis underscores the importance of Western support for Malian stability, global humanitarian efforts are also needed. Other research concluded that, "Given that the instability in the north was a key factor leading to the breakdown of Malian democracy, it seems clear that more pro-active donor behavior in this region could have helped to avert the crisis."18

If global investments can lessen the effects of social, cultural, economic, and political conflicts in the region, they will in turn improve the state of Mali's government and infrastructure. With international security and counterterrorism efforts on the line, it is in the best interest of the international community to provide aid and direction for security and stability in Mali.

___

16. Karimi, Faith, and Erin Burnett. "A Ticking Time Bomb: What's Behind the Instability in Mali?" CNN (2012).

17. Crocker, Chester A., and Ellen Laipson. "The Latest Front in a Long War." The New York Times (2013). Accessed November 19, 2016. http://www.nytimes.com/2013/03/08/opinion/global/the-sahel-is-the-latest-front-in-a-long-war.html.

18. Van de Walle, Nicolas. "Foreign Aid and Malian Democracy." Research and Communication on Foreign Aid (2014). Accessed December 14, 2016. http://recom.wider.unu.edu/article/foreign-aid-and-malian-democracy.


References

Arieff, Alexis. Crisis in Mali. Issue brief. Congressional Research Service. Federation of American Scientists (2013). Accessed November 30, 2016. https://fas.org/sgp/crs/row/R42664.pdf.

Crocker, Chester A., and Ellen Laipson. "The Latest Front in a Long War." The New York Times (2013). Accessed November 19, 2016. http://www.nytimes.com/2013/03/08/opinion/global/the-sahel-is-the-latest-front-in-a-long-war.html.

Gifford, Prosser, and W. M. Roger Louis, eds. Decolonization and African Independence: The Transfers of Power, 1960-1980. Yale University, 1988.

Global Humanitarian Assistance: A Development Initiative. "Mali." Accessed December 10, 2016. http://www.globalhumanitarianassistance.org/countryprofile/mali/.

Karimi, Faith, and Erin Burnett. "A Ticking Time Bomb: What's Behind the Instability in Mali?" CNN (2012). Accessed December 14, 2016. http://www.cnn.com/2012/10/21/world/africa/mali-q-a/.

Lecocq, Baz. Disputed Desert: Decolonisation, Competing Nationalisms, and Tuareg Rebellions in Northern Mali. The Netherlands: Koninklijke Brill NV, 2010.

Michigan State University. "Mali: History." Accessed December 10, 2016. https://msu.edu/user/staatz/university_of_mali/Background.htm.

Pezard, Stephanie, and Michael Shurkin. "Toward a Secure and Stable Northern Mali: Approaches to Engaging Local Actors." RAND Corporation (2013). Accessed November 19, 2016. http://www.rand.org/pubs/research_reports/RR296.html.

Torero, Maximo. "Food Security Brings Economic Growth - Not the Other Way Around." International Food Policy Research Institute (2014). Accessed December 14, 2016. https://www.ifpri.org/blog/food-security-brings-economic-growth-not-other-way-around.

United Nations News Centre. "Africa's Vast Sahel Region Threatened by Terrorism, Organized Crime, Ban Warns." Last modified December 12, 2013. Accessed December 12, 2016. http://www.un.org/apps/news/story.asp?NewsID=46726#.WFMlYrYrLLY.

Van de Walle, Nicolas. "Foreign Aid and Malian Democracy." Research and Communication on Foreign Aid (2014). Accessed December 14, 2016. http://recom.wider.unu.edu/article/foreign-aid-and-malian-democracy.

Wagner, Meg, Cameron Joseph, and Rich Schapiro. "American Aid Worker Among 27 Dead After Al Qaeda Militants Storm Mali Radisson Blu Hotel." New York Daily News (2015). Accessed December 10, 2016. http://www.nydailynews.com/news/world/shooting-reported-mali-hotel-article-1.2441291.

World Food Programme: Fighting Hunger Worldwide. "Mali." Accessed November 14, 2016. https://www.wfp.org/countries/mali.


INSTRUCTIONS FOR SUBMISSION Interested in being featured in our next issue? Send us an email at tuhssa@gmail.com! Get published in Temple's one and only peer-reviewed undergraduate history & social sciences research journal! We’re looking for anything related to history, sociology, political science, economics, anthropology, geography/urban studies, global studies, Africana studies, American studies, Asian studies, Latin American studies, & gender/sexuality/women's studies! Submissions should be 1000 words (or four pages) minimum, Times New Roman, 12 pt. font, double spaced. Thank you for your interest! – Perceptions Editorial Board

CONTACT
Website: temple.sites.edu/tuhssa
Facebook page: facebook.com/189316056936
Facebook group: facebook.com/groups/72543762899/
Twitter: twitter.com/tuhssa
OwlConnect: temple.collegiatelink.net/organization/tuhssa
Instagram: instagram.com/tuhssa


