ISRF Bulletin Issue XVII: The Past in the Present


Edited by Dr Rachael Kiddey





First published September 2018. Copyright © 2018 Independent Social Research Foundation.


TABLE OF CONTENTS

EDITORIAL
FROM THE DIRECTOR OF RESEARCH
EPIDEMICS AND GLOBAL HISTORY
SOCIAL SCIENCE AND IMPERIAL PROJECTS
HOW CAN WE TALK POLITICS ACROSS THE POST-2016 DIVIDE?
PROVOCATIONS FOR A MATERIAL INTERNATIONAL LAW
CRISIS: WHERE HUMAN NATURE MEETS CULTURAL CRITIQUE


EDITORIAL
Dr. Rachael Kiddey, ISRF Academic Editor

Welcome to this, the seventeenth, issue of the ISRF Bulletin. Those familiar with the Bulletin will know that it usually consists of a number of short articles produced by ISRF Fellows, often with responses from Academic Advisors. In this issue, we are also pleased to include two articles by Fellows of the Max Planck Institute for the History of Science, with which we have partnered this year for the annual ISRF Workshop.

What happens when social scientists and historians meet and talk? This was the intellectual impetus for the theme of the sixth ISRF Annual Workshop, which this year will be held in Berlin under the title ‘Relating Pasts and Present: History of Science and Social Science’. For historians (and archaeologists), what constitutes knowledge and how (and by whom) it is produced is always specifically historically situated, while social scientists, from anthropologists to psychologists, remind us that there is always also a spatial or environmental element to knowledge. People across time and space cannot be expected to think or know in the same ways, and by looking at how things change in historical perspective we shed fresh light on global transformations more widely. For example, Edna Bonhomme (Fellow of the Max Planck Institute for the History of Science) discusses this matter in relation to attitudes to health and healing in the Middle East (see her article, this issue). To the philosopher, of course, observations that ‘things change’ point to the future being different again, which leads to the conclusion that reflexivity must be a vital part of social science’s methodology.

From outright racist interpretations of the bodies and cultures of ‘colonial subjects’ to the pernicious denigration of native peoples’ resistance to colonialism, Martin Thomas (this issue) is right to draw our attention to the uncomfortable truth that social science was itself deeply implicated in the Western imperial project.




Not least, many early social anthropologists and ethnographers were simultaneously colonial administrators who used (and abused) scientific language and practices to justify the oppression of, and violence towards, colonial peoples. However, with the exception perhaps of Frantz Fanon,1 most philosophers of social science writing in the mid-twentieth century would not have considered social science (or social scientists) at fault, which serves to further illustrate the importance of continually re-evaluating history itself, revisiting what we know to reassess how it is understood from professional and popular perspectives.

From the Troubles in Northern Ireland to the fatalities following the most recent election in Zimbabwe, we see what happens when debate is exhausted and situations become interminably polarised: unrest and violence. The true value of re-evaluating what we think and why is well unpacked in relation to identity politics by Sherrill Stroschein (this issue). Identity, a standard research subject for social scientists of all types for well over a century, is an extremely powerful contributor to politics. Stroschein argues convincingly that the reason is an ‘ideological and identity overlay’, whereby one’s identity as NOT something is far more important to protect than any change that might come from voting differently. To cite a recent U.K. example, despite the fact that Cornwall (a rural county in the south west of the U.K.) is due to receive €1000 per capita in European Union (EU) funds between 2014 and 2020, the county voted overwhelmingly to leave the EU during the 2016 Brexit referendum. Contrary to patronising explanations that 56% of Cornish voters voted Leave because they did not understand the vote or because they are parochial, global economics and international law may actually have played bigger roles in the result. For example, with the exception of seasonal tourism, trades, and low-paid retail jobs, there is almost no work in the county. Cornwall’s traditional jobs were in mining, agriculture, and fishing, industries that have come under immense pressure from EU regulations for many decades, while successive London-centric U.K. governments did little to address regional wealth disparities. One interpretation of the Cornish Brexit result is that it was important for voters to send a strong message of disagreement and disgruntlement to those faceless international lawyers in Brussels and politicians in Whitehall.

1. Fanon, F. (1968). Black Skin, White Masks. s.l.: MacGibbon and Kee.



Far from white sand beaches and idyllic cottages, the hugely under-employed, poverty-ridden Cornish interior is, in many ways, a county-scale example of what Jessie Hohmann (this issue) describes as an ‘object of international law’. As she makes perfectly clear, for all the highbrow, intellectual puff of contemporary law and politics, the ‘real’ (material, physical) world is where most people live, and the ways in which this relates to governance is precisely what Hohmann’s ISRF-funded project will unpack.

To some extent we can see the ‘material turn’ that has pervaded the humanities and social sciences recently as a form of ‘crisis’ that has contributed importantly to all of the critiques of Western, post-Enlightenment thinking touched on above. The term ‘crisis’ is, though, itself at times a tool of social criticism (see Schmidt, this issue), an ‘instrument of rule’ used to control people and delimit who is considered eligible to contribute to democratic debate.2 In her article here, Susanne Schmidt (Fellow of the Max Planck Institute for the History of Science) traces the history of the midlife crisis to the ancient concept of ‘critical’, ‘climacteric’ years or ‘periods of transition’. Some of Schmidt’s illustrations are included in this issue.

Finally, I am about to enter a period of transition myself as I reluctantly finish my role as ISRF Academic Editor in order to take up a British Academy Postdoctoral Fellowship on ‘Migrant Materialities’ at the School of Archaeology, University of Oxford. It has been a great pleasure working for the Foundation and I hope to remain a keen colleague as it grows and blooms in new directions. For now, I very much look forward to welcoming you all to the sixth ISRF Annual Workshop, an opportunity for friends and Fellows to ‘talk down the discourse’, to borrow Sherrill Stroschein’s excellent phrase; a chance to discuss, debate, and deliberate on some of the most pressing concerns of our day in full, interdisciplinary perspective.

2. See also Agamben, G. (2005). State of Exception. Chicago/London: University of Chicago Press.


FROM THE DIRECTOR OF RESEARCH
Dr. Louise Braddock

The ISRF, as I noted in the first Bulletin of the year, is entering its second decade. Not given to gestures, we do not envisage a celebratory event. But the fact, and the marker, of the passage of a decade naturally prompt reflection on how things have gone so far. Evidently, in terms of research done that might not otherwise have been done, our Fellows tell us, and their work shows, that something has indeed been achieved. The question I find myself returning to is whether and how the ISRF’s stated aim of promoting new ways of thinking, even if one does not quite know what one means by this, is being realised. Now at one level this is an impossibly grandiose ambition anyway, and assessing its realisation is not obviously do-able in any interesting way, but it does put down a marker for the ISRF’s interest in non-instrumentalised reflective thought which, even if not exactly a new mode of thinking (being indeed extremely old), is now seen (again) by many as being at risk of extinction in its home environment, the academy.

Against this background, at the ISRF we aim to maintain a hopeful blend of the ideal of an intellectual ethic with a realistically achievable output, in, among other media, the Bulletin. This in-house endeavour has been taken forward by three editors to date, now to be followed by a fourth when Lars Cornelissen takes over the role from Rachael Kiddey in November. We thank Rachael for her four years’ contribution to the life of the Bulletin, and for the particular interest that her own field of archaeology has brought to it. We equally look forward to the new ideas that Lars will bring from his own interests, in critical theory and continental philosophy among others, and including the work of Foucault.




Foucault writes of his hope for a ‘criticism that would try not to judge but to bring an oeuvre, a book, a sentence, an idea to life… It would multiply, not judgements but signs of life’.1 This is a good way to characterise what we hope for at the ISRF, and for which the Bulletin might be seen as a vehicle. The articles in this issue are exemplary in this respect; in their combination of expert knowledge and reflective criticality they also recall another, distinctive medium of critique, the essay.

In 2014 we instituted an annual series of ISRF Essay Prizes and were able to award them, year on year, to a senior academic, a graduate student, an independent scholar, and a feminist economist who remarked to me that, as she was habitually attacked for her views, it was nice to receive a prize for them instead. On two occasions, however (one being this year), we have been unable to award a prize. One reason may be that the work of writing something ‘extra’ in today’s climate is a distraction for scholars obliged, or disciplined, to focus on career advancement. Another reason may be that the essay is not a favoured, or indeed any longer a recognised, medium for reflective thought alongside the opportunities offered by the blog, the ‘long read’, or the imperative to engage a wider public more disposed to tweeting than reading. Going resolutely against these trends, we are currently, as advertised at the end of this Bulletin, running a competition for the ISRF’s own essay prize; not in conjunction with a journal as in previous competitions, but as a standalone award which provides the winner (if a submission of sufficient merit is received) with a tidy sum for research-funding activities (that can be used in part to fund childcare). When compared to writing a small research grant proposal, the work of writing the essay is not likely to prove more time-consuming, is probably more likely to yield a pecuniary result, and is very likely more intellectually rewarding. It is also open to any scholar at any stage of a career, or none.

1. Quoted by Judith Butler in Butler, J. (2005). Giving an Account of Oneself. New York: Fordham University Press, p. 44.


EPIDEMICS AND GLOBAL HISTORY
The Power of Medicine in the Middle East
Dr. Edna Bonhomme, Postdoctoral Fellow, Max Planck Institute for the History of Science

In August 2018, the World Health Organization reported that cholera had infected 120,000 people in Yemen. Survivors and victims alike have had to endure varying degrees of the symptoms: diarrhoea, dry mouth, low blood pressure, muscle cramps, and death. Situated at the southern tip of the Arabian Peninsula, Yemen is known for its aromatic Mokha coffee and centuries of coastal trade, and most recently as the target of a Saudi-led war. The political tensions of the Middle East have generated a particular kind of crisis whereby hundreds of thousands of people are subjected to a double bind: the tragedy of war and the upsurge of an epidemic.

Vibrio cholerae, the bacterium that causes cholera, was first isolated in the 1800s, and its cure has been known since the following century. The disease is one of many that have been harbingers of epidemics in the Middle East. Yet that history is connected to a global transformation in medicine, sanitation, and capitalism since the early nineteenth century. The current cholera epidemic in Yemen can be understood in a vacuum, with a myopic account focused on disease incidence and prevalence, or it can be interpreted through a broader lens, one that considers the various historical, political, and commercial actors that shape medicine and health. The history of modern medicine in the Middle East is inseparable from the “global” insofar as these practices have a wide geographical and conceptual reach.



Recent discussions in the history of science have problematized and disrupted the term “global” along the lines of the local vs. the global, centre vs. periphery, and “Western” vs. “non-Western.” Medicine and health also feature in this discourse insofar as the priorities that have driven the provision of healing and therapeutics are dynamic, dialectic, and material. Medical practices and their epistemology expand across space and time and feature within global and globalizing processes, especially as epidemics cross borders and enter into realms of war; yet there is a prehistory that shows a dynamic and complex environment for health and medicine.

Arab Christian, Jewish, and Islamic disease cosmology was traditionally linked to overlapping discourses about divine will, infection and death. ‘Adwā was synonymous with infection, meaning that the disease could be transmitted through a vector or directly from the source. In the premodern period, contagion was a contested concept whereby disease transmission was attributed to anthropogenic forces, corrupt air, divine intervention, the environment, and evil spirits. Within the matter of practice, Jalāl al-Dīn Abū al-Faḍl ʿAbd al-Raḥmān ibn Abī Bakr al-Suyūṭī remarked in his Ṭibb al-Nabawī (Medicine of the Prophet) that “Every plague (ṭāʿūn) is an epidemic (wabā’), but not every epidemic (wabā’) is a plague (ṭāʿūn).” In his Tadhkirat, Da’ūd ibn ʿUmar al-Anṭākī (d. 1599), a sixteenth-century Syrian Christian physician, argued that ṭāʿūn (bubonic plague) was

“a speedy moldy carnage that appears in such flails and armpits. It is called pestilence because of how inseparable these features are. Otherwise, they are two common special effects. In fact, they are a vesicle like broad beans and the rotten blood increases its component.”1

His use of the image of broad beans visually dovetails with the structure of buboes. Additionally, the pustules that emerge from pierced buboes can generate a liquid containing blood. The intellectual dynamism between the tenth and fourteenth centuries set the stage for broader and more popular notions about the medical, religious, legal, and moral milieu in the Middle East and North Africa. What was more distinct was considering the ways that therapeutics played a role in healing people. Overall, domesticated plants played an important role in medieval Arab pharmacology and in the remedies applied directly to boils.

1. Da’ud al-Antaki, Tadhkirat uli’l-albab wa’l-jami’ li’l-’ajab al-’ujab.



Herbs, ointments and spices were seen as both preventive and curative agents for the plague and were part of the political economy of therapeutics. Al-Suyūṭī (d. 911 AH/1505 CE) noted that people used violet-infused ointments to prevent the buboes from spreading to other portions of the body. Violet was considered a curative agent well into the eighteenth century, according to the Egyptian ʿulamāʾ ʿAbd al-Muʿṭā al-Sahalāwī in his plague treatise. Doctors, pharmacists, and bloodletters, who were connected to local merchants, would serve as mediators who advocated for particular herbs and provided them to their willing, and sometimes dying, patients.

Material culture also featured in therapeutics during epidemics. Amulets, incantations, and inscriptions were curative agents used by non-elites to deter the plague and other epidemics. These methods were part of a broader corpus that was often labelled magic or popular medicine, and they can be understood within the context of material culture and warding off the evil eye or djinn. Magical practices could be seen as distinct from prayers. That is, prayers and inscriptions were primarily based on the many names of God, al-asmāʾ al-ḥusnā, which have mystic properties. Ibn Ḥajar advised reciting al-Kursī (the seat) and the subḥānallāh (glory to God) in treating the plague victim; one could recite them over the course of three consecutive nights as a preventive measure. The main Qurʾānic verses that were believed to cure the plague included sūrah Yūnus, sūrah al-Anʿām, and the fātiḥah of the Qurʾān. Similarly, Ibn Haydūr recommended writing Qurʾānic prayers on paper and attaching those texts to the wall to prevent the bubonic plague from entering a household. In Shams al-maʿārif al-kubrā, ʿAlī al-Būnī (d. 622 AH/1225 CE) recommended magic squares, cabalistic letters, and talismanic signs for the same purpose.

To what extent does this pre-modern history inform us about the cholera epidemic in Yemen today, or any epidemic? How do broader regimes disrupt the possibility of using traditional medicine and/or providing curative agents? Reflecting on the historical congruence of traditional and religious medical practices can sharpen our analysis by providing a “critical and people-centred approach both to and within global health.”2

2. Biehl, J. (2016). Theorizing global health. Medicine Anthropology Theory, 3(2), 127-142.



At the same time, it also allows us to problematize the term “global” in history. Yemen is a place, yet its history is connected to a set of global occurrences, including colonialism, structural adjustment programmes, and most recently, war. Disease proliferation and access to treatment do not depend on Yemen alone but are central to broader discourses concerning the continuous wars in the Middle East, the mobility of travel, and sanitation regimes. At the core of the cholera crisis are social and political events that could easily be resolved, as the World Health Organization has recommended, by terminating the war. Yet that has not happened, even though the current “outbreak is the most serious on record.”3 The war has exacerbated the epidemic insofar as hospitals have been targeted by airstrikes,4 medical supplies have not been allowed to enter the country safely, and health practitioners are working under harrowing conditions. Outbreaks kill, but what is needed is the development of global health policies that provide care and humanity to those suffering from an epidemic.

3. Associated Press (2018, August 30). ‘UN says 120,000 suspected cases of cholera in Yemen’. Associated Press. Retrieved from https://abcnews.go.com/Health/wireStory/120000-suspected-cases-cholera-yemen-57503453
4. Fahim, K. (2016, January 10). ‘Hospital Aided by Doctors Without Borders Is Bombed in Yemen’. The New York Times. Retrieved from https://www.nytimes.com/2016/01/11/world/middleeast/hospital-aided-by-doctors-without-borders-is-bombed-in-yemen.html


SOCIAL SCIENCE AND IMPERIAL PROJECTS
Professor Martin Thomas, Professor of History, University of Exeter; ISRF Mid-Career Fellow 2015-16

It might seem unsurprising that Western imperialists, their governments and supporters harnessed science and technology to advance the cause of empire. The means by which they did so are increasingly analyzed by historians through the prism of globalization, which frames the mechanics of scientific advancement in the context of transnational networks, the migration (voluntary or forced) of people, and the diffusion of knowledge as new technologies proliferated worldwide. At a more practical, but no less significant level, certain scientific achievements, from tin canning to viral prophylaxis, from steam ships to machine guns, have been singled out as particularly crucial to empire-builders, especially in the long nineteenth century of so-called ‘high imperialism’, which ended in 1914.

It is perhaps more unsettling to remind ourselves that social science, too, became integral to Western imperial projects. More than that, certain branches of the social sciences were, from their inception, deeply implicated in colonialism. Sometimes they offered academic validation for it. At other times leading social scientists worked directly with state authorities to contain or even repress anti-colonial opposition within particular territories. In this context, the social scientific villains of the piece have typically been identified as first-generation social anthropologists, ethnographers, and, latterly, select groups of social psychologists. They are variously accused of first developing, then applying, ideas of scientific racism to colonial subject peoples and of pathologizing manifestations of anti-colonial protest as evidence of mental disorder or collective psychosis. This article offers a few snapshots of these processes at work.



In the case of the French Empire, after the First World War ideas of how best to administer dependent territories emerged from the confluence of three factors. First was the professionalization of the colonial service. Second was the surging popularity of the social sciences within French academia. And third was the belief, shared by bureaucrats and social scientists, that ethnography was a uniquely colonial discipline with scientific precepts that would enable officials not just to administer dependent peoples but to understand them as well.1 Those individuals who personified all three elements were best placed to put the new thinking into practice. Leading ethnographers boasted extensive colonial experience. Perhaps the most influential, Maurice Delafosse, was a former director of political affairs in the federal government of French West Africa. Another West Africa veteran, Henri Labouret, made ethnography integral to the curriculum of the École Coloniale, the college for trainee empire administrators on the avenue de l’Observatoire in Paris. Delafosse and Labouret persuaded other long-serving officials in French Africa that ethnology and its close cousin social anthropology were bedrocks of successful colonial government.2 Their chief disciple was Georges Hardy, appointed to head the École Coloniale in 1926.3 Hardy’s innovation was to marry these ‘colonial sciences’ with practical courses of instruction: a programme of social scientific ideas translatable into administrative practice. Officials trained in Hardy’s methods venerated ethnographic ‘fieldwork’ as a prerequisite for sound policy choices.

It was not that simple. Ethnography came loaded with presumptions and prejudices in regard to colonized societies and their supposedly limited ability to cope with economic modernization. Industrial diversification, urbanization and the spread of waged labour were thus interpreted as socially destabilizing, even morally wrong. Puritanical, ascetic Islam was dangerous and ‘un-African’, heterodox Sufism supposedly more malleable and tolerant.4

1. Alice L. Conklin, ‘The new “ethnology” and “la situation coloniale” in interwar France,’ French Politics, Culture and Society, 20:2 (2002), 29-48.
2. Emmanuelle Sibeud, Une science impériale pour l’Afrique? La construction des savoirs africanistes en France 1878-1930 (Paris: EHESS, 2002), 257-72; Gary Wilder, The French Imperial Nation-State: Negritude and Colonial Humanism between the Two World Wars (Chicago: University of Chicago Press, 2005), 58-61.
3. Hardy’s earlier work in Morocco helped shape his ideas: Spencer D. Segalla, ‘Georges Hardy and educational ethnology in French Morocco, 1920-26,’ French Colonial History, 4 (2003), 171-190.



Party politics and European-style jury trial, both predicated on adversarial argument, were, the ethnographers insisted, too much for African minds to handle.5 Needs and wants were better articulated through traditional means: customary law (although officials remained hazy about what this was), chiefly courts, and village elders.6 Scientific colonialism, in other words, revealed as much about its practitioners’ beliefs as about those of their colonial subjects.

The number of anthropologists roaming colonial Africa was much smaller than the ranks of other scientifically trained personnel (agronomists, medical specialists and engineers) who filled colonial administrations after the Second World War. But the anthropologists were perhaps more influential in determining the actions of governments.7 Eager officials pointed to anthropological studies of ‘tribal custom’, local ‘folklore’ and ‘authentic tradition’ to justify colonial tutelage as a work of social conservation.8 No matter that the sheen of academic objectivity legitimized policies that typecast Africans in particular ways, consigning them to a pre-modern status in which industrialization, advanced education and gender equality became foreign-borne ills to be avoided.9 Others see baser motives in this ‘politics of retraditionalization’.10 Stripped of cultural baggage about remaking colonial societies in the French image, this ‘scientific colonialism’ signified a turn towards low-cost, high-extraction administration.11 At its heart was the ‘bargain of collaboration’ with local elites: the chiefs, mandarins, and village elders who made the system work.

4. Ministère des Affaires étrangères, Paris, série K Afrique 1918-1940, sous-série Affaires musulmanes, vol. 9, K101-2, ‘Les populations musulmanes de l’Afrique Occidentale et Equatoriale Française et la politique islamique de la France.’
5. Ruth Ginio, ‘Colonial minds and African witchcraft: interpretations of murder as seen in cases from French West Africa in the interwar era,’ in Martin Thomas (ed.), The French Colonial Mind: Mental Maps of Empire and Colonial Encounters (Lincoln, NE: University of Nebraska Press, 2012), 58-61.
6. Benoît de l’Estoile, ‘Rationalizing colonial domination: anthropology and native policy in French-ruled Africa,’ in Benoît de l’Estoile, Federico Neiburg and Lygia Sigaud (eds), Empires, Nations, and Natives: Anthropology and State-Making (Durham, NC: Duke University Press, 2005), 44-7.
7. Conklin, ‘The new “ethnology”,’ 29-46.
8. Benoît de l’Estoile, ‘Rationalizing colonial domination’, 49-54.
9. Helen Tilley and Robert J. Gordon (eds), Ordering Africa: Anthropology, European Imperialism, and the Politics of Knowledge (Manchester: Manchester University Press, 2007), 6-9.
10. Frederick Cooper, Colonialism in Question: Theory, Knowledge, History (Berkeley: University of California Press, 2005), 144.
11. Alice L. Conklin, ‘“Democracy” rediscovered: civilization through association in French West Africa (1914-1930),’ Cahiers d’Etudes Africaines, 145:37 (1997), 59-60.



The bargain preserved their titles and limited legal and tax-raising powers. They upheld rural order and furnished the authorities with revenue, labour and military recruits in return. The ‘science’, in other words, was more rhetorical than real.

With French democracy restored in 1944-5 after four years of Nazi occupation, the new political leadership in Paris made unlikely imperialists. Most were ideologically left-of-centre. Several leading government figures had been imprisoned for resistance activities. Some, like Marseille Mayor Gaston Defferre, built powerful regional political networks that exploited their proud records of resistance. Other discrete groups, Jean Monnet’s dirigiste planners and Pierre Mendès France’s political economists among them, were technocratic modernisers. Their reformist sympathies translated into a commitment to mobilize state resources to develop colonial economies, improve welfare provision, and raise living standards. Most significant to us here, even the community of French colonial anthropologists, although bitterly divided in their responses to the racial discrimination of the wartime Vichy state, shifted after the war in tune with the newly-founded United Nations Educational, Scientific and Cultural Organization (UNESCO), which, in 1950, formally condemned scientific racism, dismissing received wisdom about hierarchies of civilization as a pernicious myth.12 But any notion that France had repudiated ‘scientific colonialism’ would soon be belied by its repression of anticolonial groups from Vietnam to Algeria.

To illustrate the point, let’s look briefly at a somewhat forgotten 1947 rebellion against French colonial rule on the island of Madagascar. There, the French ethnographer and social psychologist Octave Mannoni, a long-serving official in Madagascar’s colonial administration, depicted the notoriously brutal repression of the Malagasy revolt as a form of ‘theatrical violence’. Mass killings of villagers and novel forms of murder, such as the dropping of victims from aircraft, were demonstrative acts intended to restore order to the minds of an indigenous population whom Mannoni considered psychologically dependent on the unflinching discipline of external authority.13

12. Alice L. Conklin, In the Museum of Man: Race, Anthropology, and Empire in France, 1850-1950 (Ithaca, NY: Cornell University Press, 2013), chapter 7, 327-31.



Not surprisingly, Frantz Fanon, the Martiniquan psychiatrist who famously championed the emancipatory potential, as much mentally as culturally, of revolutionary anti-colonialism, found Mannoni’s views repugnant. In his 1952 work Black Skin, White Masks, Fanon excoriated Mannoni, who by this point had written a best-selling book purportedly explaining the Malagasy rebels’ mental processes. In Prospero and Caliban: The Psychology of Colonization, Mannoni reduced their actions to a caricature of psychological dependency. According to Mannoni, the leaders of Madagascar’s anti-colonial opposition, the Mouvement Démocratique de la Rénovation Malgache (MDRM), were sufficiently educated to recognize their reliance on French tutelage but were insufficiently mature to achieve genuine autonomy, psychologically or politically. The great majority of their followers, in this reading, cleaved to the MDRM because they felt betrayed by a weak wartime administration, whose firm guiding hand they craved. Fanon rightly exposed the racial stereotyping at the core of Mannoni’s interpretation. He took issue with Mannoni’s unwillingness to concede that the Malagasy were reacting rationally against decades of economic exploitation and cultural denigration.14 The point, though, is that French government officials would continue to cite Mannoni’s work over Fanon’s for years to come.

France was by no means alone among the ranks of imperial powers in using scientific language and, to a degree, social scientific method and findings to justify continuing colonial rule. As the pace of decolonization quickened after the Second World War, British colonial Africa offered striking, and disturbing, instances of this mobilization of social science to serve imperial interest. Take Kenya, for instance. There, Jomo Kenyatta’s fledgling Kenya African Union (KAU) gained a stronger foothold in the late 1940s. Political space also opened up for the KAU’s radical offshoot, the Mau Mau (literally translatable as the ‘greedy eaters’ of chiefly elders’ authority). KAU leaders struggled to bridge the cultural and generational divides between their rural supporters in Kikuyuland and the more confrontational trade union activism and youth politics of post-war Nairobi.15

13. Jock McCulloch, Colonial Psychiatry and ‘the African Mind’ (Cambridge: Cambridge University Press, 1995), 99-104; Jacques Tronchon, L’insurrection malgache de 1947 (Paris: Karthala, 1986), 74-9.
14. Alice Bullard, ‘Sympathy and Denial: A Postcolonial Re-reading of Emotions, Race, and Hierarchy,’ Historical Reflections, 34:1 (2008), 124-8.
15. John Lonsdale, ‘KAU’s cultures: imaginations of community and constructions of leadership in Kenya after the Second World War,’ Journal of African Cultural Studies, 13:1 (2000), 109-22.



Even so, what startled the colonial administration most about the emergence of the non-violent KAU was its apparently ‘pan-tribal’ (for which we might read ‘national’) basis of support.16 What alarmed them about Mau Mau was precisely the reverse: its sectarian violence and its secretiveness.17 These two features coalesced in British official minds thanks to highly sensationalist reports from field ethnographers working under the aegis of the colonial authorities. During the early 1950s, accounts flooded into district and central administrative offices of Mau Mau oathing ceremonies in which tens of thousands in Central Kenya pledged support, often in small groups and frequently under duress.18 Equally effective as instruments of political mobilization and social discipline, oathing ceremonies drew on Kikuyu religious practice. Earlier dramatic increases in the numbers making oaths of allegiance to the KAU were instrumental to the efforts of younger, Nairobi-based militants to usurp the party’s established leadership of rural elders typified by Kenyatta and another senior Kikuyu chief, Koinange wa Mbiyu. By 1951 the party had acquired a younger, more militant aspect.19 Where declarations of support for the KAU were conventionally political and limited in number, Mau Mau oathing was ritualized and conducted on an enormous scale.

Exaggerated, vulgarized accounts of these newer oathing ceremonies became staples of settler conversation and British press accounts. Their garishness sought to demonize Mau Mau by proving the movement’s backwardness, its deviancy and its cruelty. Numerous reports from district administrators, officially sponsored ethnographers and government-appointed psychologists interpreted oathing through the cosmologies of early modern witchcraft.

16. Bruce Berman, Control and Crisis in Colonial Kenya: The Dialectic of Domination (Oxford: James Currey, 1990), 322-5.
17. John Lonsdale, ‘Mau Maus of the mind: making Mau Mau and remaking Kenya,’ Journal of African History, 31:3 (1990), 393-421.
18. Daniel Branch, Defeating Mau Mau, Creating Kenya: Counterinsurgency, Civil War, and Decolonization (Cambridge: Cambridge University Press, 2009), 36-9.
19. David M. Anderson, Histories of the Hanged: The Dirty War in Kenya and the End of Empire (New York: Norton, 2005), 11-12, 28-30, 39-43.



Ceremonies, which often involved animal sacrifice and the eating of raw goat meat, were interpreted in highly sexualized terms as frenzied acts of satanic depravity.20 New initiates, estimated to number around ninety per cent of the population in parts of Kenya’s Central Highlands, were thereby represented as having been duped. Either they were coerced into compliance or they fell into a trance-like state in which all reason and inhibition was lost. Not surprisingly, a propaganda war soon developed over the meaning and validity of Mau Mau oaths, and of the movement they endorsed. If, as British official statements insisted, followers of Mau Mau had succumbed to a form of collective psychosis, corrective treatment rather than colonial reform was what was required. So-called ‘counter-oathing’ ceremonies became a central plank of counter-insurgency strategy. Theatrical public recantations were organized in which Mau Mau detainees ceremonially repudiated their earlier vows, sometimes kissing a male goat’s foot, spitting and then spurning Mau Mau allegiance. Such performances, instrumental to British ‘rehabilitation’ of their Kenyan captives, perpetuated the idea that Mau Mau was closer to a cult than a political quest for ‘land and freedom’, the movement’s core slogan.21

The inclination among British colonial administrations to employ cultural anthropologists, ethnographers and, above all, social psychologists in order to compile ‘scientific’ evidence of subject peoples being misled or otherwise radicalized by violent extremists echoed the preoccupation among British, French and other European police and security agencies at the turn of the twentieth century with crowd psychology. These earlier presumptions that ‘the mob’ could be pathologized as an organic mass prone to manipulation and, by extension, to counter-manipulation drew inspiration from the French social psychologist Gustave Le Bon’s influential work, the Psychology of Crowds [La psychologie des foules (1895)].

20. Ronald Hyam, Britain’s Declining Empire: The Road to Decolonisation, 1918-1968 (Cambridge: Cambridge University Press, 2006), 189-90.
21. Branch, Defeating Mau Mau, 2-3, 24, 40-52; Caroline Elkins, ‘Detention, rehabilitation and the destruction of Kikuyu society,’ in Atieno Odhiambo and John Lonsdale (eds.), Mau Mau and Nationhood (Oxford: James Currey, 2003), 191-226.



While it is tempting to consign Le Bon’s thinking to the anxieties of fin de siècle societies confronting industrialization and poorly regulated urbanization, it is worth recalling that his basic contention, namely that crowds of demonstrators behaved as a collective in scientifically predictable ways, continued to inform strategies of riot control in the European empires to the last days of decolonization. When combined with abiding racist stereotypes about the emotional, unintellectual, and consequently apolitical behaviour of some colonial subject communities, the results were poisonous.

During a wave of late 1950s demonstrations against the twin authority of the British Crown and the white Rhodesians who directed the Central African Federation of Southern Rhodesia (Zimbabwe), Northern Rhodesia (Zambia) and Nyasaland (Malawi), ‘on the spot’ colonial officials watched with a mixture of shock and surprise as the people of Nyasaland mobilized against British and white Rhodesian domination. Why? Historian Megan Vaughan highlights the persistence among colonial health professionals, psychologists, magistrates and police of crudely racist stereotypes about Africans, their mental acuity, and their supposed lack of initiative. Commenting on the incidence of suicide, or rather its presumed absence, in Nyasaland, Vaughan notes that ‘“Africans” were generally held to be a happy-go-lucky “race” of people with few cares in the world.’ They were alleged to attribute any worries they did have to the malign influence of others, ‘via the medium of witchcraft or the intervention of spirits. African people’, so the argument went, ‘did not suffer from introspection and guilt, and so one rarely encountered depressive illness among them.’22 These layers of prejudice and lazy thinking about Africans lacking political conviction or much capacity for self-reflection were gradually stripped away by well-organized public protests coordinated by Hastings Banda’s Nyasaland African Congress. Once again, social psychology was used, or rather misused, to infantilise colonial subjects and to delegitimize political opposition as a manifestation of mental disorder or the malign influence of ‘outsiders’.

In this brief survey of colonialism’s manipulation of social science, let’s turn finally to the Portuguese Empire, from the late 1920s until the Carnation Revolution of 1974 under the yoke of António de Oliveira Salazar’s dictatorship.

22. Megan Vaughan, ‘Suicide in late colonial Africa: the evidence of inquests from Nyasaland,’ American Historical Review, 115:2 (2010), 387.



Salazar’s regime always depicted its colonial administration as a ‘scientific occupation’, supposedly informed by the rational study of dependent peoples, the maximization of their economic potential, and benevolent, albeit authoritarian, government.23 Where the Portuguese empire differed was in its stubborn adherence to this ideology of domination after 1945, at a time when other imperial states were adopting strategies of modernization and greater political inclusion in an effort to assuage international criticism and breathe new life into their empires. Undeterred by decolonization’s momentum elsewhere, the Salazarist regime developed an entire political vocabulary to justify continued Portuguese control of African territory. The categorizations employed to describe the empire’s component territories and peoples were refined by social scientists, most notably the Brazilian sociologist Gilberto Freyre. As in the French case, black African colonies were re-designated as ‘overseas possessions’. In Freyre’s conception, the colonial power, Portuguese colonial settlers, and the colonized peoples formed a single ‘pan-Lusitanian’ community linked by shared language, acquired European customs, and the imposition of Portuguese civil and criminal law. The very designations ‘Angolan’, ‘Mozambican’ or ‘African’ were declared outmoded. Adopting Freyre’s idea of ‘Lusotropicalism’, Salazar’s dictatorship insisted that the entire empire was composed of Portuguese, albeit of various colours and aptitudes and with markedly different political rights and economic opportunities.

The lived experience for black Africans was very different. The Salazarist colonial state was, if anything, more stringent in its application of rigid racial categories than the other European empires whose practices it derided. Designation of the terms ‘civilisado’ and ‘indigena’ was formally codified in two decisive legislative instruments: the Colonial Act of 1930 and the Organic Charter of the Portuguese Empire, promulgated in 1933. Taken together, these laws set out the juridical framework for differential rights and legal punishments in Portugal’s African territories, which remained in place for decades.

23. Omar Ribeiro Thomaz, ‘“The good-hearted Portuguese people”: anthropology of nation, anthropology of empire,’ in Benoît de l’Estoile et al. (eds), Empires, Nations, and Natives, 69.



In Portuguese Africa, as in other regions of the global South living under European imperial authority, social science was exploited, whether to delegitimize popular protest, to dismiss political grievances as evidence of mental disorder, or to pathologize entire communities as prone to collective psychosis. Little wonder that fieldworkers in disciplines as diverse as cultural anthropology and social psychology have been keen to put distance between their methods and objectives and those of some of their disciplinary forebears in the formerly colonized regions of the global South.



HOW CAN WE TALK POLITICS ACROSS THE POST-2016 DIVIDE?
Dr. Sherrill Stroschein, Reader in Politics, University College London; ISRF Mid-Career Fellow 2017-18

Due to recent events, I refurbished my MSc Democracy courses this year. In fact, I demolished them, tore out the walls and the floor, and had to rebuild from scratch. The year sequence now starts with a session on how we talk about politics in the current age. This is the session that worries me most.

Hard conversations about politics are important to keep a divided democracy together. Moving outside of my theoretical understanding of such conversations, I spent some time this summer engaging in conversations of disagreement, to get the hang of it. Given my research focus on societies divided along ethnic or religious lines, I thought it would be easier than it was. Something has increased the distance between those I used to know and myself: a process of polarisation. This piece considers polarisation in some theoretical and practical terms. Can we talk across our divided identity camps after the divisive votes of 2016 in the US and the UK? We have to try.

The attacks on the status quo brought by the 2016 elections tend to have two lines of explanation: economics versus culture or ideology. In my field of Political Science, several analysts have chosen to focus on one or the other, engaging in heated debates over which matters more: a) economic disadvantage, or b) culture/ideology, including potential racism. In the midst of this fray, there is some research indicating how economics and culture might interrelate.



The sociologist Susan Olzak outlined how conflict can emerge between racial and ethnic groups when they perceive that they are competing for resources, in The Dynamics of Ethnic Competition and Conflict (1992). More recently, Katherine Cramer’s book The Politics of Resentment (2016) describes how people in rural Wisconsin resentfully perceive that they have less access to resources than those in cities, in spite of their hard work. She finds that the idea of being deserving, yet not receiving, is a crucial driver of their support for elites they think will shake up the system. The notion of what one deserves is about more than just resources; it is a claim that society is unjust and must be uprooted. It is an ideology of change.

We could find ways to solve the economic issues, with the right combination of policies and the welfare state. But in the US, such solutions are frequently rejected by those whom they would benefit, because of the ideological and identity overlay. One’s identity as NOT a “liberal,” NOT a “leftist,” is far more important. Similarly, for those opposed to the Trump regime, their identity as NOT a “Trumper”, NOT a “redneck” or “hick”, or NOT from a “red state” is also very important. These are the camps of the new polarised America. In Britain, there are different labels with similar polarisation dynamics: “Remoaner” versus “Brexiter,” never the twain shall meet. These identity camps are not something that can be solved or resolved with policies. And the more we try to discuss across the divide, the more entrenched it seems to become.

The dynamics of polarisation between identity camps are well-known in my field of ethnic politics. Left unattended, they can spiral into Bosnias, or the type of protracted violence of “The Troubles” in Northern Ireland. Such violence can start with attacks by the unhinged on MPs and journalists, encouraged by the hyped-up discourse that is part of polarisation processes. The only way out is to start to talk down the discourse. This is not easy, but we have to try given where the alternatives lead.

Where and how would we start to do this? Given my discussions this summer, I think there are some ways forward, none of them easy. Imagine you are trying to re-engage with an old friend in the camp less frequented by academics, the Trumper or Brexiter camp.



The person you used to know is still there, and still loves to talk about their kids or sport or another space that you have in common. Start there, in that common space. Re-establish that you have a shared space. You might need to retreat there when the going gets tough.

Then: listen. This is what Cramer did for her insightful study. What motivates people to take the position they care about? Are there any lines at all you could yourself imagine? In the case of Brexit, I realised that I agreed with a friend that the workload at our (quite different) jobs had become worse over the years, and that indeed, there were some in the office who made us look bad by working weekends or uncomplainingly accepting the worsening of work terms. In both of our offices, those people were other Europeans from countries with a reputation for hard work. It becomes possible to see how someone might vote for Brexit in hopes of improving their terms of work. But it could also lead to a discussion that tries to uncover the real reason for a higher workload. Could there be a managerial source? Would the managerial stance actually change with Brexit? These are conversations we have not really had, and we could be better off if we did.

The active listening should also diagnose where the red lines are, and one should know where one’s own red lines are. I learned mine this summer during the migrant family separations in the US. I have been in conversations where I have had to apologise and say that I have to stop there because I cannot discuss that particular topic. It is too painful. Stopping is a better alternative to flying into a rage (which can happen later, alone). Similarly, friends have their own red lines. One friend cannot accept criticism of Trump. Invoking Trump in a negative way will end any hope of a productive conversation. However, it turns out that there is still plenty to talk about while leaving Trump out of the discussion. In fact, for a “lefty” American like me, it produces better thinking and diagnosis of the actual details to be more precise and discerning about what is happening and how or why. Focusing only on Trump is, well, lazy; there are plenty of individuals to blame for those policies aside from just him. Navigating red lines can force better thinking, which can only be a good thing.

There will be disagreements. These can be handled with respect, the way they used to be in the days before polarisation.



But practicing respect is far more difficult online than offline, which is where technology comes in. I have had these discussions both online and offline, and they are vastly different. Sometimes we forget precisely why, because online engagement is something to which we have become accustomed. It is worth remembering why and how it differs. Discussing politics via Facebook or Twitter is not at all the same as discussing it with a friend face-to-face. An email discussion with a friend, one-on-one, will not look vastly different from any other form of one-on-one conversation, just with some potential for the long-winded rants so at home in email. A social media discussion is more like having the discussion on a crowded train or in a classroom, in which the presence of others means there will be some element of performance. That is where trouble lies. Under conditions of severe polarisation, asserting one’s identity perpetually with strangers is a real thing, one that causes potential static in attempts to have a one-on-one debate with a friend. It is as if a stranger on that crowded train takes a side in your debate, uninvited, or a gang joins in to rally to your friend’s defence. It took me a bit of time to realise that this is a hobby for some, a kind of liberal-bashing or Trumper-bashing that is a sport in online media. Those situations have no hope, and are best avoided. All they do is feed polarisation and blood pressure. We can do better, and we must.

How will the class go? In writing this I realise that any attempt to record it and put it on the web would make it fail, and fail utterly. There can be no Lecturecast, no podcast, no Tweeting, no crowds watching us from the outside. We will have to talk to each other in the old-fashioned way, with just us in the room, nobody else for whom to perform. It will be strange for all of us, as we have become accustomed to discussions that cross walls via the internet. We are used to informing, promoting, the world outside coming in. But somehow that is feeding the polarisation. And stopping it has to start somewhere.



PROVOCATIONS FOR A MATERIAL INTERNATIONAL LAW1
Dr. Jessie Hohmann, Senior Lecturer in Law, Queen Mary University of London; ISRF Early Career Fellow 2017-18

International law is generally defined as the set of rules which are agreed upon by sovereign states, and which bind them in their relations with each other. It binds ‘the state’, which often appears as a closed entity, with a unitary intention, will and authority that can be clearly located and known. Making this international law is often presented as the preserve of statesmen (rarely women) and a supporting cast of technocrats who work busily behind the scenes, dotting the i’s and crossing the t’s on overly complex and wordy treaties, while, front of house, world leaders shake hands under elaborate diplomatic protocols, smiling (or stern and strong as occasion dictates) for the cameras. It appears to ‘happen’ far away and high above the lives of ordinary individuals, to be remote, abstract, and untouchable. Yet it has long been clear that international law is deeply implicated in, and has striking effects on, the lives of ordinary individuals, as this famous cartoon of European leaders ‘carving up Africa’ at the 1885 Berlin Conference suggests.

Image caption: European powers carve up Africa at the Berlin Conference of 1885. Source: zz1y, Journal L’Illustration (CC licence).

1. This piece comes out of my ISRF research on the Objects of International Law, and will form part of a future monograph on A Material International Law.



Many international lawyers (including those very technocrats mentioned above, as well as those working in international legal practice and in academic institutions as scholars) are deeply committed to international law’s relevance and success. However, international law has traditionally been considered in terms of its normative and regulatory frameworks; its topic areas, subjects and politics. For international lawyers, the interpretation of rules, the intention of states, and issues of enforcement and effectiveness have been dominant. These have often been approached in highly abstract ways, in a technical language, and through practice and scholarship that revolves around the production and interpretation of text.

In recent years, however, international law scholars, myself among them, have begun to turn their attention to international law’s materiality, and to question how international law is inserted into and experienced in everyday life: how do international legal regimes and rules structure subjectivity, sexuality, the home, food production, or movement within and across borders?2 This work can be seen as a part of a broader ‘material turn’ in the humanities and social sciences. But what provokes this turn for international lawyers, and why at this moment? In positing a social history for the ‘science’ of international law,3 I will suggest here that we can identify a number of factors, phenomena playing out across the globe today, that can be argued to underlie the turn to materiality, objects and the everyday life of international law, and which point to the pressing need to consider international law as a material practice and project. Here, I will point to three factors.

2. See for eg the contributions in Hohmann & Joyce (eds), International Law’s Objects (OUP, 2018) (forthcoming); Kapur, Gender, Alterity and Human Rights: Freedom in a Fishbowl (Elgar, 2018); Tzouvala, ‘Food for the Global Market: The Neoliberal Reconstruction of Agriculture in Occupied Iraq (2003-2004) and the Role of International Law’ (2016) 17(1) Global Jurist; and Eslava, Local Space, Global Life: The Everyday Operation of International Law and Development (CUP, 2015).
3. On the presentation of international law as a science, see Lassa Oppenheim, ‘The Science of International Law: Its Task and Method’ (1908) 2(2) American Journal of International Law 313 and also Anne Orford, ‘Scientific Reason and the Discipline of International Law’ (2014) 25(2) European Journal of International Law 369.



A Response to Abstraction

First, the turn to materiality and the everyday is a response to the abstraction of international law, and to the experience of it as remote and unaccountable to everyday individuals, even while it impacts on them heavily, affecting the availability of jobs, pay and working conditions, mobility, and access to food and medicine. Recent world events, notably the election of Donald Trump to the US presidency and the UK’s vote to leave the European Union (‘Brexit’), testify to the experience of international law as both invasive and unwanted. Trump has portrayed international law as detrimental to the US national interest and pledged to withdraw from a number of international agreements and organisations, including on climate change, trade, and human rights, in order to ‘put America first.’ The unusual prominence of international law in the 2016 presidential race prompted the high-profile American Society of International Law to dedicate a series of ten one-hour video discussions, by former senior government officials, to the implications of US policy decisions for ‘a range of vital international legal issues.’4 This initiative signals, first, the threat that Trump is seen to represent to international law and its established order (and establishment). It is also, second, a form of presentation that at least gestures towards a need for popular engagement, even if it remains ultimately oriented to elites. The ‘Brexit’ campaign, too, played on powerful themes of ‘taking back control’ from an unaccountable and technocratic European Union, and of hardening borders to resist the insertion of international law into ordinary people’s lives. Brexit and the election of Donald Trump have forced international lawyers to consider the deep animosity to international law and its perceived unaccountability, and to bring ourselves back down to the material level where it is experienced.

Deflating the ‘Distended’ Human

We need, wrote the experimental Marxist novelist Sergei Tret’iakov, a ‘cold shower’ for our narrative pretensions, which place a ‘distended’ human hero at the centre of all action, blotting out the roles of objects, things and other processes.5 Many proponents of the material turn in the humanities and social sciences have stood under this frosty deluge.

4. Tellingly, perhaps, these discussions can be found at https://www.asil.org/trump (accessed 16 Aug 2018).
5. Sergei Tret’iakov, ‘The Biography of the Object’ (2006) October 118, 57, 61.



The material turn is motivated at least in part by a growing recognition that a view of the world that focuses too heavily on the human as the cardinal subject and agent is, first, incomplete and partial, and, second, has sanctioned human dominance and destruction of other entities, including the planet. The idea of the human as an active subject, exercising agency over a passive realm of objects, is increasingly recognised as a product of Western, post-Enlightenment thinking: it is a worldview, not a truth. And, under careful analysis, the binary categories of object/subject, nature/culture and mind/matter on which it rests cannot be sustained.6 This should not come as a surprise to lawyers, including international lawyers, for we are in the business of granting and denying subjectivity. The classical ‘object’ theory of international law, which casts a long shadow over the discipline, holds that the state is the only subject, while human beings are mere objects, occupying similar positions to beasts, ships, or territory.7 Might law recognise natural objects as subjects? Recent (national) initiatives to grant legal personhood to rivers, mountains and other natural objects might be initial steps in this regard. On the other hand, human beings have only recently gained international subjectivity and direct protection of their rights, partly through international human rights law, and this protection remains tenuous and fragile. For this reason many international lawyers might resist efforts to break down the categories of non-human object and human subject.8 At the very least, however, the material turn, with its recognition of the entanglements and intra-actions between all entities, should prompt us to consider the relationship between legal and ontological categories of object and subject, and to ask how law participates in their construction and maintenance – and how it can also be used to challenge and resist them.
6. See, for example, Davies, ‘Material Subjects and Vital Objects – Prefiguring Property and Rights for an Entangled World’ (2016) 22(2) Australian Journal of Human Rights 37; Latour and Woolgar, Laboratory Life: The Construction of Scientific Facts (Princeton UP, 1979); Callon, ‘Some Elements of a Sociology of Translation: the Domestication of the Scallops and the Fishermen of St Brieuc Bay’ (1984) 32 The Sociological Review 196.
7. See George Manner, ‘The Object Theory of the Individual in International Law’ (1952) 43 American Journal of International Law 428, and further discussion in Jessie Hohmann, ‘The Lives of Objects’ in Hohmann and Joyce (eds), International Law’s Objects (OUP, 2018).
8. See, for example, the discussion in Therese Murphy, ‘AIDS’ in Hohmann and Joyce (eds), International Law’s Objects (OUP, 2018).



The Yamuna River at Agra, India, was given legal rights akin to those of a person by a lower court, before the Indian Supreme Court reversed the ruling. Harming the river would have been akin to harming a person, and the legal initiative was aimed explicitly at protecting the highly polluted river from further contamination. Photo by the author, 2007.

These steps are also significant because international law has served as a handmaiden to human desires to exploit and dominate the objects of the world, which have been posited as simply there for the taking. This is evidenced in international law’s imperial history, which relied on doctrines that erased whole peoples so that dominant powers could exploit them and their lands; in its foundations in the protection of private investors’ property rights; and in its efforts – notably in trade regimes such as the WTO – to purify itself of social, environmental and political concerns.9 The global challenges of climate change, and our reliance on toxic products and processes of production, prompt a rethinking of these suppositions and of international law’s modus operandi.

9. See, respectively, Antony Anghie, Imperialism, Sovereignty and the Making of International Law (CUP, 2004); Martti Koskenniemi, ‘Empire and International Law: The Real Spanish Contribution’ (2011) 61 University of Toronto Law Journal 1; Andrew Lang, ‘Purse Seine Net’ in Hohmann and Joyce (eds), International Law’s Objects (OUP, 2018).



The Digital Turn: Another provocation for a material international law lies in the increasing digitisation of the world. Vast amounts of data and information exist, but they appear intangible, disembodied, even ghostly. On the one hand, digitisation offers access to new materials. Huge archives and repositories of scholarly and historical materials, government documents (whether leaked or authorised) and images are available at the tap of a few keys or the swipe of a finger. These materials are open in ways that dusty archives and Old World libraries can never be, but something is lost, too. Many materials are no longer available to be touched, held or viewed ‘in the flesh’. We lose the opportunity to smell, feel or hear these materials, and our experience of them is poorer and thinner as a result.10

The first page of Western Treaty No 8 (1899) is freely available as a digital document from the Library and Archives Canada website, making it widely accessible. But it is available only as a digital document: we cannot experience it as a material document ‘in the flesh’. Source: Library and Archives Canada. See further ‘The Treaty 8 Typewriter: Tracing the Roles of Material Things in Imagining, Realising and Resisting Colonial Worlds’ (2017) 5(3) London Review of International Law 371.

10. See the discussion in Jessie Hohmann, ‘The Treaty 8 Typewriter: Tracing the Roles of Material Things in Imagining, Realising and Resisting Colonial Worlds’ (2017) 5(3) London Review of International Law 371.



New information materials and technologies offer new opportunities for international legal regulation and governance. For instance, biometric data promises states and international organisations control over migration and refugee flows.11 But such technologies, and dematerialisation more broadly, are also threatening. How will the old laws of war regulate cyber-attack? International law is profoundly territorial, but ‘where’ is data, such that it can be regulated by this territorially bound law? These are pressing questions, but they also point to a deeper and more fundamental issue, one that must be addressed. The dematerialisation of the world around us prompts us to consider what ‘the real’ is, what objects, things and materiality are, how the real is constituted or denied in or by law, and how the answers will shape how we experience and know the world around us.

11. See Fleur Johns, ‘Data, Detection, and the Redistribution of the Sensible in International Law’ (2017) 111(1) AJIL 57.


CRISIS: WHERE HUMAN NATURE MEETS CULTURAL CRITIQUE Dr. Susanne Schmidt Postdoctoral Scholar, Freie Universität Berlin

One might not have expected the history of the midlife crisis to begin with a feminist bestseller. A favorite gendered cliché, the midlife crisis evokes the image of an affluent, middle-aged man speeding off in a red sports car with a woman half his age at his side. He leaves behind his wife and children; yet he - not they - is in “crisis.” Because most tales and treatises about the midlife crisis centre on men, you might be misled into thinking that it has nothing to do with women’s lives. For example, in his recently published book Midlife (2017), the MIT philosopher Kieran Setiya looks at the topic from a philosophical perspective. He declares gender differences irrelevant and even draws on the work of Simone de Beauvoir. Yet for all intents and purposes, Setiya presents the quest for self-knowledge as an endeavor that concerns primarily men: the author himself (who experienced a crisis at the age of 35) and the great men of philosophy, from John Stuart Mill to Arthur Schopenhauer. When he reads Leo Tolstoy, the moral philosopher is interested in Count Vronsky, not the title heroine Anna Karenina. Others declare the male midlife crisis to be a “myth” and a lame excuse for selfish behavior. They ridicule the insignia of reinvention - the expensive car, fancy dress, and adventurous holiday trips. Sometimes, shoring up stereotypes is a means of self-defense, where the symptoms are exaggerated to hold a diagnosis at bay. Yet many a satire testifies to the bitter experience of those left behind, or to what Susan Sontag called the “double standard of aging,” according to which the social pathology of midlife afflicts women much more than men.



Still, neither experts nor affected parties, advocates or critics, ask where the concept of the midlife crisis comes from. What has almost entirely dropped out of sight is that the midlife crisis was initially a feminist idea, which became popular at the height of the women’s movement in the 1970s. Back then, “midlife crisis” described how men and women in their thirties and forties abandoned traditional gender roles: women re-entered the world of work, while their husbands stepped in to help at home. This was how the New York journalist Gail Sheehy defined the midlife crisis in Passages: Predictable Crises of Adult Life (1976), the bestseller with which the midlife crisis entered popular culture and social science in the United States and abroad. Passages was based on interviews with 115 women and men, most of them white and educated, many dissatisfied with their lives. Sheehy introduced the term “midlife crisis” - coined by the Canadian psychoanalyst and management consultant Elliott Jaques in the 1950s, but not well known in psychology or among a broader public - to describe how her contemporaries reappraised their lives. Around the age of thirty-five, when, in a typical middle-class setting, the last child was sent off to school, women asked: “What am I giving up for this marriage?” “Why did I have all these children?” “Why didn’t I finish my education?” “What good will my degree do me now after years out of circulation?” “Shall I take a job?” or “Why didn’t anyone tell me that I would have to go back to work?” Sheehy’s men went through a midlife crisis too, but in a different way. While women negotiated trading the roles of at-home wife and mother for a career, men were disillusioned with the world of work. Turning forty, they experienced a period of dissatisfaction. Sometimes their careers stagnated or they even lost their jobs - this was the period right after the oil crisis and the stock market crash of 1973. But success was no safeguard. Sheehy spoke to an internationally acclaimed New York architect who, at the height of his career, felt depressed and inane. Another interviewee quit a prestigious position in Washington, DC for a lousy job in real estate which allowed him to live with his family in Maine. He told Sheehy: “I’ll stay home and take care of the kids. I really mean it. I adore children. And to tell you the truth, at this time in my life, I would just love to paint houses and build cabins.”



The New York journalist Gail Sheehy’s Passages: Predictable Crises of Adult Life (1976) made the midlife crisis popular in the 1970s. Sheehy’s best-seller is remembered for its Milton Glaser cover: a rainbow-colored flight of stairs. Bold colors and jumbo letters marked its publication as an event while also signifying seriousness.

The “Sexual Diamond.” The graphic designer Barbara Nessim supplied Sheehy’s New York magazine article on the male and female life courses with illustrations of rhombi organized around pools and open space. She compared this midlife crisis diagram to a baseball field. © Barbara Nessim, 1976/2016.




Daniel Levinson’s The Seasons of a Man’s Life (1978) was marketed as “the basis for Passages” although, published two years after Sheehy’s best-seller, it challenged her concept of the midlife transition. © Ballantine, Penguin Random House.

An economic image of the midlife transition: the rise and fall of professional men’s wages and home time across the life cycle. Image from Gary S. Becker and Gilbert Ghez, The Allocation of Time and Goods over the Life Cycle, New York: National Bureau of Economic Research, 1975.




Despite its popular resonance, the midlife crisis was controversial in academic research. The sociologist Janet Giele, who pointed to the impact of social and historical factors, was not the only one to challenge the idea of a universal turning point. Figure from Methods of Life Course Research, ed. by Janet Z. Giele and Glen H. Elder, Thousand Oaks, CA: Sage, 1998.

A good dozen years after Betty Friedan’s The Feminine Mystique (1963), Sheehy wed the “crisis in women’s identity” to David Riesman’s critique of conformity in The Lonely Crowd (1950). “Midlife crisis” was a new name for women’s discontent with the domestic ideal and men’s alienation from the world of work. Critically acclaimed and very widely read, Passages brought the midlife crisis to a wider popular audience. It remained on American best-seller lists for two years, selling millions of copies. The New York Times called it a “revolution in psychological writing,” Ms. magazine praised Sheehy’s analysis of gender and identity, and social scientists spoke of a “damn serious book.” In Library of Congress surveys in the 1980s and ’90s, readers voted Passages among the ten books that had influenced their lives most.



Sheehy’s feminist concept of the midlife crisis demonstrates the use of “crisis” as a tool of social criticism. Though often brushed aside as an indeterminate or sensational catchword, “crisis” remains a key term in social science and the public sphere, as its ubiquity and resonance suggest. As an ambiguous, versatile concept, crisis can be a political instrument as much as an analytic category. Designating sites of intervention, it is known to legitimate the rule of experts. Giorgio Agamben is not the only one to note that the contemporary notion of a perpetual, endless crisis - just like a state of emergency - is an instrument of rule, which deflates the right to democratic participation. In contrast, the feminist construct of the midlife crisis challenged the status quo, illustrating the critical potential of crisis declarations.

Crisis is typically understood as a diagnostic category. Its etymological origins are attributed to Hippocratic medicine, where the term described the critical days or hours when a fever breaks and a disease changes for the better or worse. Depending on the kind of illness, a crisis was thought to occur at specific hours, on specific days, or during specific weeks, even years. According to the ancient theory of “critical” or “climacteric” years - on which the idea of the midlife crisis built - human life proceeded by steps of seven (or, sometimes, nine) years; every seventh (or ninth) year was an “annus climactericus.” These periods of transition brought sudden shifts in constitution. In children and youths, the climacteric years were seen as steps toward maturity: in the seventh year children grew permanent teeth, in the fourteenth they entered puberty. But every change also implied danger, and among the elderly, the profound change which their bodies underwent in climacteric years could pose lethal risks. Numerous works explained that people often died in a climacteric year. The deadliest of all was the sixty-third, the so-called “annus climactericus maximus” or “androklas” (man-killing), with the forty-ninth the slightly less dangerous “small climacteric year.”

However, as the historian Reinhart Koselleck pointed out, in classical Greek “krísis” was first and foremost a political term. Used in governmental, military, and juridical contexts, it implied a normative “critique” as much as a descriptive “crisis.” It meant not only “quarrel,” “fight,” and “divorce,” but also “decision.” It was in this sense that Thucydides applied the word to the battles that ended the Greco-Persian Wars.



But “crisis” also meant “decision” in the sense of reaching a verdict or judgment, what is today meant by criticism. Designating a transition, “crisis” described the period in which a decision was due but not yet taken. From this meaning of reason, reflection, and critique, the term acquired political significance. To identify a crisis was tantamount to calling for change, reform, or revolution.

The biological and social understandings of crisis were fused. The metaphor of the body organism had been applied to the community since antiquity and became central in the nineteenth and twentieth centuries. As Hannah Arendt noted in On Revolution, modern organic theories of society, which saw the multitude of the nation, people, or society in the image of one supernatural organism, used the notion of biological necessity to call for social change. In the 1970s, when the women’s movement drew attention to the political relevance of seemingly personal issues, often associated with body and mind, “crisis” was a ubiquitous term. Linguists who tracked its proliferation in the media, counting more than 200 “crisis” compounds in international newspapers and magazines in 1979, identified it as an emotional as much as an economic and political concept.

Psychology, the human science that did most to define “crisis” in the twentieth century, distinguished between a pathological understanding of crisis as maladaptation and a positive understanding of crisis as a developmental step. For the post-war analyst Erik Erikson, famous for describing the “identity crisis” of adolescence, “crisis” connoted a normative turning point, not a catastrophe. Though the psychological, developmental notion of crisis fuelled Sheehy’s observations, the comparison with Erikson also illuminates a key characteristic of crisis as a critical concept. The psychologist described a cyclical crisis - an anthropological constant repeated over the course of generations, thus serving to maintain societal institutions, as Erikson expressly emphasized. In contrast, Sheehy’s “midlife crisis” was transformative by design. Passages carried different messages for readers of different generations. A call for change for readers beyond thirty, the midlife crisis constituted a cautionary tale for younger women and men, whom Sheehy advised to forgo established role patterns, thus preventing the “predictable crisis.” The call to end traditional gender roles was only partially successful.



Today, the working woman seems to have replaced the at-home wife and mother whose “midlife crisis” Sheehy described. Yet the incompatibility of personal and professional requirements persists, triggering a new kind of midlife crisis. “The typical male midlife crisis tends to hit out of the blue and take men by surprise,” remarks the journalist Hanna Rosin in The End of Men (2012), “but for women it’s been lingering there all along.” In their thirties, many professional women are exhausted by the competing demands of work and children - now encountered in the reverse order. Everything looks right on paper. They are climbing the corporate ladder, perhaps on the verge of promotion, up against one or another glass ceiling. Now the crisis is: how to have a life beyond the job? The female midlife crisis has not disappeared. It has just changed.




ISRF Flexible Grants for Small Groups Competition (FG5)
Scholars from within Europe are eligible to apply as Principal Investigator (PI) to convene a small research group of 2-8 scholars (which may include graduate students). The PI should hold a PhD and will normally have a permanent appointment at an institution of higher education and research. Applications may be made by those whose sole or principal post is a part-time equivalent. Independent scholars with an academic affiliation may also apply. The award will enable researchers to conduct a research collaboration. The ISRF aims to support independent-minded researchers to do interdisciplinary work which is unlikely to be funded by existing funding bodies, and therefore encourages innovative proposals involving novel groupings of researchers, especially in interdisciplinary areas involving overseas collaborators, normally with a maximum of two researchers from any one university. The ISRF wishes to support innovative research which breaks with existing explanatory frameworks so as to address afresh empirical problems with no currently adequate theory or investigative methodology. Innovation may also come from controversial theoretical approaches motivated by critical challenge of incumbent theories. Interdisciplinarity in the generation of new investigative initiatives may be achieved by combining and transforming empirical methods and theoretical insights from the social sciences. Projects ranging across the breadth of the social scientific disciplines and interdisciplinary research fields are welcome, and relevant applications from scholars working within the humanities and the natural sciences are also encouraged.
Award Value: Up to £5,000 (or €5,500)
Submission Deadline: 2nd November 2018
More Information: http://www.isrf.org/fg5/


ISRF Essay Prize Winners 2014-2017

2014: THE RESEARCH INVESTIGATOR AS INSTRUMENT ACROSS THE HUMAN SCIENCES
The ISRF and the Journal for the Theory of Social Behaviour awarded the 2014 ISRF Essay Prize in Social Theory to Professor Kenneth J Gergen (Swarthmore College) for his essay ‘From Mirroring to World-Making: Research as Future Forming’.
http://www.isrf.org/funding-opportunities/essay-competitions/social-theory-2014/

2015: WHAT IS THE PLACE OF CARE IN THE ECONOMY?
The ISRF and the Cambridge Journal of Economics awarded the 2015 ISRF Essay Prize in Economics to Professor Julie A. Nelson (University of Massachusetts, Boston) for her essay ‘Husbandry: A (Feminist) Reclamation of Masculine Responsibility for Care’.
http://www.isrf.org/funding-opportunities/essay-competitions/economics-2015/

2016: AUTONOMY & ORGANISATION
The ISRF and Organization Studies awarded the 2016 ISRF Essay Prize in Organisation Studies to Simon Stevens (Loughborough University) for his essay ‘Life and Letting Die: A story of the homeless, autonomy, and anti-social behaviour’.
http://www.isrf.org/funding-opportunities/essay-competitions/organisation-studies-2016/

2017: SOCIAL THEORY
The ISRF and the Journal for the Theory of Social Behaviour awarded the 2017 ISRF Essay Prize in Social Theory to Dr Dominic Holland (Nottingham Trent University) for his essay ‘The Nature of the Political Reconsidered’.
http://www.isrf.org/funding-opportunities/essay-competitions/social-theory-2017/


The 2019 ISRF Essay Competition
Interdisciplinarity: The New Orthodoxy?
The ISRF intends to award research funding of €5,000 for the best essay on the topic ‘Interdisciplinarity: the new orthodoxy?’ This is a topic, not a title; authors are free to choose an essay title within this field. Essays may address any topic, problem or critical issue on or around this theme. The successful essay will be intellectually radical and articulate a strong internal critique of existing views. Writers should bear in mind that the ISRF is interested in original research ideas that take new approaches and suggest new solutions to real-world social problems. The winning author will be awarded a prize of €5,000 in the form of a grant for research purposes. It is intended that this award will be made to the author’s home institution, although alternative arrangements may be considered for Independent Scholars. The winning essay, and any close runners-up, will be accepted for short-format presentation at the 2019 ISRF Annual Workshop (expenses for attendance at which will be covered by the ISRF) and for publication in the ISRF Bulletin; authors may be asked to make some corrections before publication. The winner will also be able to visit The Conversation UK for a day, to see how the news site operates behind the scenes and to spend some one-on-one time with Josephine Lethbridge, the ISRF-funded Interdisciplinary Editor, discussing their research, its potential news angles and how best to draft a pitch, with the possibility of writing an article should an idea be agreed upon.
Submission Deadline: 31st December 2018
More Information: http://www.isrf.org/2019EssayPrize


This issue features: Edna Bonhomme, Jessie Hohmann, Susanne Schmidt, Sherrill Stroschein, Martin Thomas

