Anatomy of Cultural Recursion


INQUIRE MAGAZINE

WHERE ACTIVISM TAKES ROOT

CONFORM


Editors-in-Chief: Devin McDonald, Craig Draeger
Co-Presidents: Lauren Sampson, Ralph Yeung
Contributors: Lauren Sampson, Ralph Yeung, Courtney Reeve, John Wilson, Craig Draeger, Devin McDonald
Photography: Devin McDonald, Craig Draeger

inquireblog.blogspot.com | inquirepublication.com
All citations are available online. Images supplied.

Contents
Notes From the Editors - 4
Grassroots Gourmet - 6 (Lauren Sampson)
Healthcare Through Time - 8 (Ralph Yeung)
Canadian Freedom For Libya - 10 (Courtney Reeve)
Democratizing Mediocrity - 12 (Craig Draeger)
Bespoken For - 14 (Devin McDonald)
Quintessence of Desk - 16 (Devin McDonald)
Blowin' In the Wind - 19 (John Wilson)
Flattening of the Artistic Horizons - 21 (Craig Draeger)


Haggis Fritters


The Anatomy of Cultural Recursion

Notes From the Editors

Craig Draeger

Arts and Science, 2013

On August 13 of this year, Fidel Castro—father of the Cuban revolution—celebrated his 85th birthday. On the same day, crowds tore apart cities in the United Kingdom as riots entered their third week. I was struck by the confluence of these two events, because they poignantly symbolize a shift in the Western countercultural ethos. From the 1950s all the way through the 1980s, it was fashionable among those who considered themselves radical rebels to express “solidarity” with communist states or socialist agitators. At their core, these displays were about showing support for “the enemies” of one’s own mainstream Western culture. And make no mistake, communism was set up—for propaganda purposes—as the “societal menace” of countries like the United States. However, since the collapse of the Soviet Union, there has been a seismic shift among countercultural advocates in free Western countries toward a different flavour of rebellion: anarchism. There is no such thing as an anarchist state, so it is not a matter of solidarity. And this trend is clearly not for lack of modern-day societal enemies: nobody is running off to join the Taliban or holding solidarity marches for Iran’s autocrats. There are even still authoritarian states on both sides of the spectrum—although aging figures like Castro, Kim Jong-Il and Belarus’ Alexander Lukashenko offer little to get excited about. It would be foolhardy to think that counterculture is exclusive to the political left, moreover, because a simultaneous shift


has occurred among rebels on the right. Just as left-wing fascination with Mao and Guevara has abated lately, the right has abandoned figures like Pinochet and Franco in favour of movements like the TEA (Taxed Enough Already) Party in the United States, which espouses essentially the same anti-authority, minarchist rhetoric and borderline belligerent behaviour as anarchist movements. The common impression is that cultural rebellion is a backlash against whatever is seen as “mainstream,” but I would argue that this is an incomplete analysis. Counterculture is, by its very nature, a backlash against society—the way it’s arranged, and who benefits from it. This means counterculture can take two forms: either a reflection of our own society’s flaws (in backlash), or “support” for our society’s enemies (in solidarity). The influence of anti-authority causes has been ultra-pervasive in the West over the last half-decade: from Wikileaks and the net neutrality movement to the Black Bloc at Toronto’s G20 Summit, and the London and Vancouver riots this year. What does it say about our own society that its internal critics are calling now for more negative liberty, rather than less? It seems to indicate that we’ve taken a step too far toward statism (left or right, sometimes both at the same time). It’s hard to say which direction we’re heading next, but it’s safe to assume that our situation—both the mainstream culture of Western societies and the counterculture rebels who stand to oppose it—will not remain static.


“What does it say about our own society that its internal critics are calling now for more negative liberty, rather than less?”

“It is no longer about the content of the rebellion but rather about the race to cultural contrarianism”

Devin McDonald

Arts and Science, 2013

University is a ripe time for rebellion. Away from parents, the abundance of alcohol and drugs, the ability to do what you want. This, to many, is the essence of the university experience: the rebellion from the tutelage of your parents and the drafting of your own new order. The baby boomers certainly embraced the concept with their embattlement against the establishment. But where does our rebellion lie today? It seems that every conceivable rebellion has become so clichéd, so packaged and branded, as to strip it of its rebellious status. Is it possible that we live in a post-rebellion world? Perhaps most demonstrative of our disenchantment with rebellion is the hipster phenomenon. I find myself living in some strange cultural schism wherein many of the things I enjoy doing are branded as hipster, yet the irony (I recognize the contextual relevance of the use of this word) is that I am most often declared a hipster by people who seem to possess equally “hipsteresque” attributes. Yes, I like obscure music; yes, I wear skinny jeans; I even listen to music on anachronistic rotating grooved disks. The combination of these things somehow attracts the scorn of my peers. Yet I cannot self-identify as a hipster, as it seems a core tenet of hipsterdom is self-denial, lest I fall into a hipster self-identifying paradox. A 2009 New York Magazine article by Mark Greif traces the genealogy of the hipster. He links the derogatory and accusatory nature of the word with the idea of apriorism; a hipster finds value not in the inherent attributes of an item, product or trend but in its exclusivity. Thus cultural value becomes a rat race to find the next big thing, only

to reject it once it reaches critical mass. Hipster culture has an inherent distrust of its own members; it is wary of “hangers-on” to past fads. The now-hallmark phrase of the hipster, “this is so mainstream,” is demonstrative of internal suspicion toward followers rather than genuine contributors to the culture. This cultural apriorism has led to a seeming rebel extremism; it is no longer about the content of the rebellion but rather about the race to cultural contrarianism. This seems due in part to the pace at which our rebellion is sold back to us by the establishment. The corporate capacity for cool-hunting allows retailers to bring fads from back-alley bands to suburban malls in ever-shortening spans of time. This short production cycle lessens the time in which early adopters can garner a feeling of exclusivity from a product. The adoption of the trend du jour is no longer necessarily a mark of your cultural sophistication but rather a mark of your weekly attendance at Urban Outfitters. I use the example of hipster culture because it is representative of the crisis that exists within the cultural conception of rebellion. Rebellion has become less a cultural commodity and more simply a commodity. How do we rebel when the establishment is shoving counterculture down our throats? How do you rebel against your parents when they regale you with stories of their own anti-establishmentarianism? This issue of Inquire attempts in part to reflect on the changing role of rebellion in our society and on the forms in which it still holds efficacy. Is rebellion still relevant in the post-industrial world in which we live?



Grassroots Gourmet Lauren Sampson

Arts and Science, 2012

In the popular imagination, the buy-local movement is the purview of bohemians and foodies, individuals who are earth-conscious or who can afford to be choosy about where their food, furniture and household amenities are produced. The Canadian populace at large accepts and indeed prides itself on its cosmopolitanism – digital cameras from Japan, cell phones from the UK, jeans from the Dominican Republic, silverware from China, oranges from Brazil and petroleum from Nigeria. Owning goods from around the world seems both inevitable and a mark of globalism, of partnership with a larger world economy. With the near-full-scale outsourcing of manufacturing to countries like India and China, is it possible to subsist without products crafted abroad? And yet in the past five to ten years, the local purchasing movement has gone mainstream. For example, in 2007, the Ontario government launched a “Buy Ontario” marketing strategy (contributing approximately $12.5 million), which included a consumer awareness campaign designed to increase interest in and demand for local foods across the province, expand the Foodland Ontario program to more fresh food products and boost the Savour Ontario program in fine and vacation dining establishments. Books like The 100-Mile Diet: A Year of Local Eating and Animal, Vegetable, Miracle spend weeks at the top of bestseller lists, and dozens of new farmers’ markets have opened across Canada, with annual sales reaching $1.03 billion. Independent business networks, which work to educate customers about the value of local buying and give independent owners a unified voice in government and media, have doubled in number since 2005, with approximately 25,000 small firms in the United States


belonging to a business alliance promoting local shopping. In British Columbia, 60% of the province’s local municipalities are banding together to fight a proposed Comprehensive Economic and Trade Agreement (CETA) with the European Union, which officials believe will damage their communities’ efforts to promote economic development through local procurement and local hiring. What has prompted this groundswell of support for local infrastructure? In the first place, it appears the recession has incited business owners and local governments to promote buy-local efforts to help insulate them against the worst of the downturn. Simply put, the economic argument behind buy-local campaigns is that spending at local stores, rather than at national chains or online, promotes local economies, as those firms are more likely to buy from community suppliers and hire local service providers for managerial, accounting and warehousing needs. Additionally, profits stay with local owners, who in turn spend in the community, rather than flowing to out-of-state or out-of-province shareholders. More jobs also mean an increased commercial tax base, crucial for any city facing enormous budgetary deficits, like Toronto. But more than anything else, the buy-local movement is evidence of a public shift towards environmental thinking, manifested in green consumerism. Defined as a form of political or ethical consumption characterized by a concern for the environmental impact of products and producers, the philosophy first entered the public sphere in the early 1990s, when the term “sustainable development”


started to gain traction. The release of Al Gore’s documentary film “An Inconvenient Truth” in 2006 reignited worldwide interest in the environment by emphasizing the very real evidence that global warming was already occurring and threatening the welfare of flora and fauna around the world. Environmentalism quickly became the preeminent political topic of the day. In 2007, the environment overtook healthcare as the issue Canadians considered most important, while that same year, more than 50% of Canadian consumers reported that environmental issues influenced their consumption habits. In response, companies have attempted to portray themselves as operating with consideration for the environment, often going as far as “greenwashing”, or using green marketing to falsely portray a corporation’s policies or products as environmentally friendly. Support for buy-local movements can be mapped onto environmentalism’s trajectory from grassroots philosophy to political platform. As fears surrounding climate change mounted, it began to seem dangerously extravagant to spend 36 calories of fossil fuel energy transporting one calorie of California lettuce to New York when food travels, on average, 1,500 miles from farmer to consumer. Buying food from community farms keeps these “food miles” (which are on average 27 times higher for grocery store items than for local foods) to a minimum. Green consumers argue that local and regional food systems avoid the carbon dioxide, sulphur dioxide, particulate matter and other pollutants released when packaging, refrigerating and transporting fresh foods over a considerable distance. In fact, in recognition of the widespread support the movement has garnered, major food corporations are attempting to tap into buy-local enthusiasm.
Hellmann’s mayonnaise launched a website that questions whether cheap imports are “worth” their substantial threat to Canadian family farms, while Loblaws aired commercials featuring farmers taking part in its “grown close to home” campaign. Even Walmart, the largest grocer in the US, has been officially encouraging managers to buy produce grown within 450 miles of its distribution centers and, in 2008, committed to spending at least $400 million on purchasing local food. But is the buy-local movement truly evidence of a new politico-intellectual zeitgeist, an example of grassroots efforts effecting mainstream political change? Or is it simply a half-hearted attempt to capitalize on citizen fear? Perhaps more importantly, is the buy-local movement even truly effective for either the economy or the environment? Some argue that green consumerism serves only to mask concerns, concealing its market-driven origins with layers of moral righteousness. It does little to attack the root of the

problem: over-consumption. “Green” strategies are then by nature hypocritical and oxymoronic, as the purpose of eco-friendly thinking is ostensibly to establish how consumerism itself is problematic. Green should denote consuming less, not consuming differently. Moreover, the buy-local movement can simplify or ignore the global implications of its policies. The movement is strongest in Europe, where it got its start in the 1990s, and Canadian products are suffering for it, with exports of agricultural and fishing products declining 7.2% in January 2011 and wheat in particular falling 15.4%. Additionally, because “local” doesn’t necessarily equate to “sustainable”, a product’s actual environmental friendliness remains a mystery. Florida or California residents purchasing locally grown rice are ultimately still consuming food grown in a heavily irrigated desert, at astronomical environmental cost. Indeed, in 2007, the American Council for an Energy-Efficient Economy calculated that the cost of getting an average car from Tokyo to California was between 1,000 and 1,800 pounds of carbon dioxide emissions, a surprisingly low number that mirrors the amount that same car will produce every month for the entirety of its driving life (indicating again that consumption itself is the central issue). In a political climate steeped in paranoia and insularity, the danger is that the buy-local movement will become (or already is) jingoism, a means of rejecting foreignness in favour of good old-fashioned, home-grown products. Perhaps that’s the real zeitgeist the movement is heralding – an era of backlash against globalization, the complex, amorphous cause of a worldwide financial crisis that seems to indicate that if economies stand together, they must fall together.



Healthcare Through Time:

From A Doctor’s World To A Patient’s World

Ralph Yeung
Arts and Science, 2011

Medicine has always been an art form designed to be enjoyed by those who seek its rewards; undeniably, the patient is the ultimate beneficiary of any practice of medicine. As with all forms of art, however, it takes a particular type of artist to perform well, and because medicine is certainly not something any person can do, doctors from ancient ages to modern times have been accorded a certain status and prestige within society. As such, medicine has historically been regarded as a prestigious, elite club for those of tremendously high social status. It is tempting to lend credence to this argument by examining the image of doctors in history – the stereotypically well-dressed, well-spoken gentlemen depicted in paintings of the Ether Dome in Massachusetts come to mind. However, status is not derived from how people within an elite club perceive themselves but rather from how those outside the club perceive them. Consider patients in the 1960s and it becomes evident that doctors were of an elevated social rank: male and female patients alike were dressed to impress, right down to the tie and pearl necklace, as if they were going to church. Visiting a doctor was a religious experience, and doctors naturally fit into the role of the all-knowing, all-powerful healer of any ailment. Their opinion was typically considered fact and their word was to be taken without question. For all intents and purposes, doctors were gods and their patients very willingly worshipped them. In fact, to question the advice of a doctor was a sign of such tremendous disrespect that it could be considered grounds for a doctor to refuse a patient or otherwise label them troublesome. In the latter instance,


patients who showed significant distrust of the advice of a physician were thought to have overly anxious personalities or even hypochondria. This type of mentality was not exclusive to doctor-patient relationships but also inherent in relationships between physicians and other health care professionals. Within the system, doctors were viewed by other health care professionals as the ultimate decision makers, whose word on a dispute or an issue was considered final. The very terminology we use when referring to doctors accentuates their authority over other staff: “The doctor ordered an x-ray of your chest.” Yet who could argue that they were not highly knowledgeable, that their skills and knowledge did not have an obvious, direct consequence on a patient’s health? As a result of the stigma attached to patients who had an ounce of doubt about their doctors’ opinions, and the fact that staff naturally viewed doctors as the ultimate decision makers, medicine not too long ago was a doctor’s world, designed, administered and controlled by doctors and their directives. To be fair, for the patient, obeying the doctor is the easiest mentality to adopt when forming a doctor-patient relationship. It is most comfortable for the patient, especially when they are feeling particularly vulnerable, to have someone take charge of their well-being and to relinquish any responsibility. Outside of this rationale, where else could a patient gather information pertinent to their ailments outside of a doctor’s office? The physician was the obvious, easiest conduit to information and relief from illness and disease. As a result of these factors, the doctor-knows-best mentality still pervades modern medicine. But to many of us, the idea that a doctor’s word is to be trusted without question, that his decisions are final and that no other health care professional can offer a differing opinion without jeopardizing their working relationship, seems preposterous. Returning to a patient perspective to examine what healthcare is like in the modern age, we can attribute a shift in how patients view their doctors to the wealth of information newly available to the patient. Whereas in the past the easiest source of information was the doctor, a patient’s easiest source of information now is Dr. Google. The very term “Dr. Google” can make most doctors cringe because of the copious amounts of false information available on the internet, but the silver lining is that it has made information much easier to access. As a result, patients can have a second opinion, or several opinions, before seeing a doctor in the first place. The increased availability of medical information, regardless of whether or not that information is true, was the first test of medicine’s ability to adapt.

Pair the patients’ newfound knowledge with research suggesting that the ultimate secret to better health lies in long-term changes to patients’ lifestyles, and suddenly the efficacy of the “treat what surfaces” model so crucial to the practice of medicine seems mediocre at best. Indeed, the best treatment a physician can give is to suggest and guide a patient through lifestyle changes that will naturally lead them towards better health, either by reducing the chances of certain illnesses and/or by improving the efficacy of the body’s many systems. This paradigm of better lifestyle choices being the best medicine demands that a patient be knowledgeable about and responsible for their own health, naturally lending support for patients to research their condition before and after speaking with a physician.

There is also a paradigm shift in the hierarchy of health care. In fact, the idea is to not have a hierarchy at all. Although it is understood that a physician has a tremendous wealth of knowledge, it is also evident to both doctors and other health care workers that no one professional can know everything that could help treat a patient. As a result, the expertise of many other professions can be equally required in order to give the best care to a patient. Given that fact, the hierarchical structure previously (and still actively) used in health care can be detrimental to the patient: information is not shared amongst health care professionals but is rather all channeled towards a single responsible person. It is also a natural tendency of this old system to put the patient at the bottom of the hierarchy, but as mentioned earlier, a patient must become more responsible for their own body in order to achieve better overall health. In this way, modern health care must include the patient in the health care team in order to allow the patient to acquire responsibility. As a result, the patient becomes more involved, becomes more knowledgeable and has the resources of multiple professionals and multiple opinions in order to form a care plan with their best interests in mind. Whereas in the past medicine was a doctor’s world because of doctors’ monopoly, whether real or perceived, on medical knowledge, modern healthcare revolves around the patient because of the realizations that patients have the greatest control over their own health and that no single decision maker can formulate as ideal a plan as a team of health care professionals.

Thus came about the current paradigm in medicine and health care in general, which reflects what this writer perceives as healthcare’s essential form: a patient-centered, interprofessional, team-based, short- and long-term care package that encompasses the patient’s best interests. It is not an easy task and it is not for everybody. Thus, medicine remains an art form that requires the skills of many brilliant artists in order to benefit those who seek its rewards, as it has always been intended to be.



Canadian Freedom For Libya

Courtney Reeve

Arts and Science, 2013

How has Canada responded to the Libyan civil war? “One either believes in freedom or one just says they believe in freedom,” announced Prime Minister Stephen Harper when speaking of Canadian involvement in Libya last March. “The Libyan people have shown by their sacrifice that they believe in it. Assisting them is a moral obligation upon those of us who profess to believe in this great ideal.” Freedom is a food Canadiana serves for breakfast. As the quintessential idea of a personal and national identity, the Canadian bacon sizzles next to the sunny-side-ups. Table-talk of freedom describes liberation and the power to resist. It is an ideal that aligns connoisseurs with the people and their human rights. Freedom is owned by those who speak of it, who can fight for it. Especially by those who don’t have it. But when it has become the ideal we all believe in, what really are we talking about? Could Harper be prescribing a Canadian freedom for Libya? It is clear what kind of freedom Libyan rebels keep fighting for: sovereignty from


Gaddafi. Following in the footsteps of the January Tunisian and Egyptian revolutions that overthrew Ben Ali and Mubarak, Libyan rebels have demanded that Muammar Gaddafi step down and end his 41-year-long regime. On February 17-18, approximately 600 human rights activists organized a protest against poor housing conditions. The Libyan army opened fire, killing dozens of people and sparking much larger protests against Gaddafi’s government. Since March of this year, the Libyan rebel movement has decried regional disparities in the distribution of oil revenue benefits, gross human rights violations including the 1996 Abu Salim prison massacre in Tripoli, Gaddafi’s concentration of power and government corruption. After the rebels’ cries for international support, the UN Security Council approved a “no-fly zone” over rebel-held regions, explicitly excluding on-the-ground occupation of Libyan territory. Execution of this strategy has been passed to the US and NATO, and 350 international military aircraft have been deployed to protect civilians from Gaddafi’s air


forces. On the ground, rebels have formed the National Transitional Council (NTC) to represent and unite anti-Gaddafi forces across Libya as they continue to advance on the pro-Gaddafi capital city, Tripoli. More than 30 nations worldwide have recognized this council as the legitimate governing authority of Libya, fighting for justice and the freedom to choose its own leader. This recognition has resulted in the release of frozen Libyan government assets, amounting to millions of dollars for the NTC. To the rebels, their freedom is defined by their power to resist corrupt rule and to elect a leader who acquires power rightfully and exercises power rightfully. This freedom distinctly contrasts with the moral ideal of freedom the head of Canada’s elected government uses to win over Canadian support. According to Stephen Harper, intervening in Libya is a moral obligation rooted in the ideal of freedom. Foreign Minister John Baird cited Canadian involvement in Libya as a central influence in its commitment and continued action in securing “a brighter future for a free Libya.” In August, Canada donated another $2 million in humanitarian aid, bringing its total assistance for victims of rape to $10.6 million. Canada also has a substantial military presence in Libya, including 7 fighter jets, refuelling aircraft, transports and 440 military personnel, all in its effort to assist the rebels to freedom. Yet Harper’s endorsement of Canadian involvement in Libya is embedded with a religious moralism that prescribes a particular freedom. This “freedom” is weighed by Canadian ideals and moral high ground, and aligns itself with the multicultural, peacekeeping Canadian identity that is already sustaining 10 years of military intervention in Afghanistan. From this standpoint, Canada’s championed freedom is inspired by notions of right and wrong, rather than by a return of agency to the Libyan people.

It is under the auspices of this Canadian freedom that Harper, akin to E.E. Cummings’ description “as freedom is a breakfastfood,” assumes a freedom for Libya. The meaning of sovereignty from Gaddafi is lost when Canada prescribes morality to a struggle of people’s choice, agency and power. “Long enough and just as long” as freedom is attached to morality, Harper’s freedom will continue to be with us at breakfast.

A Brief Libyan History:
1969 - King Idris is removed in a coup by Colonel Gaddafi.
1981 - Over the Gulf of Sirte, the US shoots down two Libyan aircraft in what Libya claimed to be its territorial waters.
1986 - Libya is alleged to be involved in the bombing of a Berlin disco.
2003 - Libya abandons plans to develop nuclear weapons.
2006 - The US restores full diplomatic ties with Libya.
2008 - The auctioning of oil licenses sees the return of US energy companies for the first time in 20 years.
February 2011 - The arrest of a human rights activist ignites protests which spread across the country. The Gaddafi government uses air attacks against protestors.
March 2011 - The UN Security Council approves a no-fly zone to prevent the use of military aircraft against civilians.
August 2011 - Rebel forces take the Libyan capital Tripoli and Gaddafi is forced to flee.



Democratizing

Mediocrity

Craig Draeger
Arts and Science, 2013

The emergence of mass education has been one of the most revolutionary developments in civic life over the past hundred years. Publicly provided education, in particular, has evolved from a radical concept (famously espoused by Marx as the tenth “plank” of his Communist Manifesto) into a pillar of the modern welfare state. This “democratization of education,” as it’s been called, has had many effects, but one that’s seldom examined is its impact on the concentration of power in society.

Thinking of practical power often calls to mind images of political influence or financial opulence. But as the French sociologist Michel Foucault rightly observed, power is not a tool wielded through financial or political avenues alone; relations of power also exist between doctor and patient, between military and citizenry, and between teacher and student. I would further charge that they exist between the professional and the general labourer, and between the educated and uneducated citizen.

This power dynamic has always existed, but it does not necessarily create oppressive hierarchies; in fact, the specialization of labour has numerous benefits. However, the distinction of holding a university degree (or even a college diploma) has been significantly degraded by education policies that aggressively promote expansion at the expense of quality and value.

This can be traced back to its origin: primary and secondary education has been state-provided in North America since at least the end of the nineteenth century, but its post-secondary equivalent has existed in an entirely different state. While the growth in post-secondary capacity was once driven organically by the need for new types of leaders in society—in business, politics, the sciences, etc.—it is now being contrived by central planning for reasons of political expediency. The American “G.I. Bill” was the first major instance of an artificially rapid expansion in access to post-secondary education in North America. By sending millions of young Americans to universities, it helped to midwife everything from the civil rights movement to “free love,” driving social change in the United States (for better or worse).

Universities in Canada were brought under the public umbrella throughout the nineteenth and twentieth centuries, to the extent that only a handful of small private institutions are still operating. The province of Ontario, in particular, has in place an official policy goal of 70 percent post-secondary certification among its citizens. This trend toward greater admittance has created an “educational inflation” which serves to diminish the economic currency of an individual degree, but its effects are yet to become fully apparent.

Political operatives like to pontificate about “accessible education” and its supposed economic dividends. While these claims are dubious in and of themselves, they also disregard the psychological and sociological impact of the lower academic standards and procedures necessitated by inflationary policy. The flattening of education by state action has been, inadvertently, a method of exerting a power structure that caters to the economic middle class at the expense of others, thereby degrading meritocratic stratification. It is emblematic of a society and a political structure that increasingly steals opportunity from the poor and wealth from the rich to cater to the whims of this aforementioned class.


“State control of education has been a tool for the diffusion of individual power and the assertion of centralized power structures”

It would be unduly conspiratorial to speculate that there has been a concerted effort to diffuse individual ambition through state education. But as history has shown us, malfeasance is more often the result of incompetence than malevolent will. It is undeniable that the expansion of educational capacity at the post-secondary level has resulted in a significant shift in societal stratification which has bloated the middle class. Whereas the growth of educational capacity has been sold as a natural expansion of the welfare state, it actually represents a perversion of our social structure. While university used to be a place designed for tomorrow’s leaders to interact and widen their horizons, it is now little more than a sandbox for political projects and social engineering.

The theme of this edition of Inquire is "cultural recursion," so let's tie it back: any society that so ardently suppresses the self-confidence of its future leaders has dismantled its own future. Our public education policies are unsustainable for more than their mere economic ramifications; they also serve to weaken the threads of our social entente. By subordinating individual greatness to a collective average, we are breeding conformity of aspiration and mediocrity of achievement. If we continue on this present course, it is not clear what our end state will be. But as Frederick the Great once observed, "an educated people can be easily governed." Or controlled.

If we are to avoid the consequences of the unsustainable social structure foisted upon us, we must start by recognizing that state control of education has been a tool for the diffusion of individual power and the assertion of politically advantageous power structures.



BeSpoken For

Devin McDonald Arts and Science, 2013

Why are you wearing that suit? Do you like it? Do you feel trapped? Did you sling it on last minute before your interview for that post-grad job? Atop your shoulders sits what feels like excessive padding. Your neck is on the verge of strangulation from a tie that seems all too apt a metaphor for the corporate schlep into which you're about to dive. The suit was acquired two weeks before your high school prom at the local Moores. The salesman ushered you into an ill-fitting black three-button ensemble with assurances that black goes with everything. Your mother, overjoyed at seeing you all dressed up, is quick to give approval. You remain apathetic; this garment is an obtrusive formality, an anachronistic holdover soon to go the way of fax machines and pagers. The modern work world is shaped by the hoodied Zuckerbergs, not the Brooks Brothered Kennedys of days old.

"[The suit] is an obtrusive formality, an anachronistic holdover, soon to go the way of fax machines and pagers"

The suit has become much more than a garment. Few pieces of cultural iconography are more representative of the status quo; the suit embodies hierarchy and authority. A man in a suit can be many things, or perhaps less of a person, transformed into simply "the man." Popular culture holds at its core the rejection of the suit; few sitcom scenes seem more typical than a husband coming home to loosen his tie and discard his jacket, and with it a day of answering to monolithic authority.

Clothing holds an important place in society and has long been indicative of the place its wearer holds in society or, perhaps more aptly, the place the wearer aspires to be. Eighteenth-century Frenchmen abandoned knee breeches as representative of their revolutionary alignment. Wearing a white shirt until relatively recently was demonstrative of your aristocratic status; you could wear white without fear of dirtying the cloth because you did not perform physical work. Clothing has also had enormous repercussions on gender equality, from the now proverbial burning of bras to, more subtly, the advent of women wearing pantsuits, an implicit aspiration to roles traditionally held by suit-wearing males.

Our school teachers and mothers attempted to instill in us the notion that what we wear and the way we look isn't what matters; it is what is inside that truly counts. Not to suggest they were lying, but that platitude seems to bear little resemblance to the reality of adulthood. Clothes, whether consciously or subconsciously, are indications of where their owner wants to be. Many saw Mark Zuckerberg, the world's youngest billionaire, strut onto a stage full of suited men wearing nothing more than jeans, a t-shirt and a hoodie. Zuckerberg is the embodiment of the "what's on the inside matters" culture. Ironically, people fail to recognize that even Zuckerberg wore a suit every day for a year just to prove that he could. Zuckerberg recognizes the importance of the suit as a symbol, whereas many hold the naive belief that their abandonment of care for their appearance is akin to inciting a rebellion against the establishment; yet it transforms them into another indistinguishable grunt in the casual revolution.

The suit, not for the first time in history, is now the subject of counterculture. Though it is not counterculture in the



contrarian sense of punks or the sexual revolution; it seems more a reclamation of fashion for men. Long assumed to be relegated to girlfriends, fashion is being reclaimed by men and placed more centrally in our lives. Blogs like Put This On, A Continuous Lean and The Art of Manliness detail fashion basics with a focus on fit and quality. The Art of Manliness's bold moniker states "reclaiming the lost art of manliness." The site goes further than diatribes about how many breaks you should have in your chinos; recent articles include "How to make your own canoe paddle" and "The importance of trusting men in your circle." The site is plastered with black and white photos of grandfathers and early twentieth century ads. Its focus is reclaiming a gender that is otherwise plagued by apathy towards its role in the world.

"[Once] the epitome of conformity it is now a rebellion against an attitude of apathy"

The rebellion of the suit lies in the idea that it represents; for the same reason that the suit was once the epitome of conformity, it is now a rebellion against an attitude of apathy. To put on a suit in the morning, to take care that it fits

and that its numerous articles match, is to draw attention to one's role in the world. It draws attention to the idea that one cares about the way others perceive them. Fashion is without doubt in many ways self-centred, but equally it evades the solipsism of the casual revolution. To care about your appearance is to suggest that you care what the people around you think; their opinion of you is central to your concern. Rebellion is inherently reactionary; you cannot rebel in a vacuum. The typical rebel is antithetical to the mean of society and thus earns the scorn of the establishment. Yet what happens when the rebellion manifests itself within the establishment? Wearing a suit is a statement about one's willingness to engage the establishment rather than abandon it.



The Quintessence of Desk

Devin McDonald Arts and Science, 2013

When I wake up in the morning I use a Dieter Rams designed coffee maker, probably dating from the eighties. Its filter tray has to be caressed into place, it lacks any additional functionality like a timer, and there is a perfectly functional coffee maker in the cupboard that is not yet twenty years old and includes the aforementioned functionality. Even so, I plan on giving away the coffee maker nouveau as soon as possible, to avert the possibility that my housemates will return the Dieter Rams to the cupboard in exchange for its more juvenile cousin. At this point, I'm sure many of you find yourselves perplexed as to why I would choose to use something on the verge of collapse when I have a perfectly good alternative on hand, and even go so far as to scuttle my housemates' plans to modernize the kitchen wares. The reason I am obsessed with this caffeine-spurting anachronism is that I consider it a classic; everything about it is simple and unimposing, whereas its counterpart is crudely formed and somewhat jarring in its shape.

Imagine Neanderthals had not become extinct but had developed alongside us, as if we had an evolutionary cousin with all the same functionality but none of the sentience. Then, out of nowhere, they turned into drip coffee makers. Who would you rather spend time with: the sentient coffee maker or the Neanderthal coffee maker?

Why am I so desperate to idolize a piece of so-called cultural quintessentialism? I am not alone; the blogosphere is a flurry of romanticized photos of mid-century furniture and their accompanying buzzwords: classic, timeless, essential. Bloggers talk about design and fashion as if they are the pinnacle of human development, as if we have transcended the stain of time. But can this really be possible? Why, all of a sudden, are we able to create designs capable of evading the fate of anachronism? This seems wholly implausible as a conclusion. Nobody looks at 18th century thrones and declares them still relevant today. Was there something mysterious that happened in the last century that allowed us to end the Sisyphean cycle of design? As I think it is silly to declare the last century mysteriously exceptional, the purpose of this article will be to explore the phenomenon of quintessentialism: more specifically, the dialogue between technology and design, how the two come to inform each other, and how they influence the consumer experience of the product.

I would like to take two iconic pieces of furniture as the basis for my examination of the progress of design: the Eames chair and the Barcelona chair. Both made use of the most advanced construction techniques of their times, and both are still perceived to be incredibly modern despite being 55 and 83 years old respectively (the Eames chair was first released in 1956 and the Barcelona chair in 1928). When I reflect on the 1950s, I think of drive-in movies and housewives, quintessential anachronisms rather than designs that transcend the mark of time.

I would like to make the assertion that technology is inextricably connected to what we perceive as modern. This may at first seem odd considering the aforementioned dates of release. My assertion is that technology, as regards design, has not moved the finished product forward, but has focused on the manufacturing and distribution of the consumer good itself. The design of the products that the vast majority of us use has stagnated because we no longer



focus on the development of the item itself but rather on its ubiquity. Consider first the example of the simple men's dress shirt. Though there are a plethora of colours, cuts, and customizations, its essence has seen very little evolution in a very long time. The basic design elements of the shirt have not improved, nor has the overall durability or quality of the item. In fact, I would venture to say that the dress shirt worn by the average man today is of lesser quality than the same shirt he would have worn fifty years ago. In the first half of the 20th century, the average worker had very few sets of clothes: little more than a set of work wear and a set for church. Due to the narrow scope of his wardrobe, the clothes had to be of high quality. The blue contrast-collar shirts worn by fictional power broker Gordon Gekko in the 80s classic Wall Street are, in fact, a holdover from the replaceable collar. As the collar is one of the parts of the shirt that takes heavy wear, collars were made to be replaceable, and given that the shirt's fabric wasn't available many years after its initial production, the replacements were just white. The style is now, rather ironically, referred to as the "banker collar" rather than the "factory worker's collar," as history might suggest. The point being that shirts were made to last, and designed with the intent to be functional and repairable. Today, one is more likely to have ten shirts from H&M that last one season than two shirts that last long enough to host numerous collars. Mass production has given us the quantity associated with wealth, yet we lost the quality somewhere along the way.

The second example previously mentioned was the Eames lounger and the Barcelona chair. Though their names might not ring any bells, it is likely that most people would recognize the iconic forms of these two pieces of furniture. The Barcelona chair was designed by Ludwig Mies van der Rohe and produced for Germany's pavilion at the International Exposition of 1929 in Barcelona. The simplistic chrome x-frame design with tufted leather cushions has inspired innumerable copies and homages. At the time, the x-frame design made use of new welding techniques, and the chrome frame was seen as futuristic. Equally groundbreaking was Eames'

“Was there something mysterious that happened in the last century that allowed us to end the Sisyphean cycle of design?”



use of bent plywood as the frame for their lounge chair. The modernism of a curved shape with medium brown veneers was indicative of mid-century design.

"Our western society is heralded for its material riches, but how wealthy is a society that trades form and function for quantity?"

Though there are many notable exceptions, my point is that the vast majority of the things we use from day to day are ill-designed and sought for their low prices rather than their longevity or thoughtful construction. Our western society is heralded for its material riches, but how wealthy is a society that trades form and function for quantity? Have we grown any wealthier when that wealth is built on the principle of disposable goods? The goods we so covet seem more like the cheap rip-offs of the designs on which they are based.

This upswing in the popularity of terms like "quintessential" or "classical," and the seeming trend toward a modernist neoclassicism, is perhaps a symptom of the quantity-over-quality mentality that has come to dominate our commercial society. We have come to swim in disposable materialism so much that the allure of something which lays claim to being timeless is enchanting. For the record, I think there will come a time when even the classic lines of said furniture will look dated; but in the meantime, I don't mind attaching myself to the momentary romanticism that things can transcend time.

Dieter Rams: "As little design as possible"


Blowin' In the Wind

John Wilson Arts and Science, 2013

There's nothing new or contentious in suggesting that by studying a culture's entertainment we can learn much about the culture itself. After all, as Plato probably did not say, you can learn more from a man in an hour of play than a year of conversation. Despite this possibly obvious observation, in this piece I'd like to offer an examination of today's popular music as a means for drawing some worrisome, if not scary, conclusions about the state of contemporary culture: specifically, that today's music reflects a kind of dissatisfaction and apathy about living in today's world. Before I begin in earnest, let me try to mollify those of you who may be shaking your heads and offering early objections. Yes, I recognize that popular music is a form of entertainment, and should generally be treated with less gravity than other subjects. And yes, it is mass produced for profit and as such might not be perfectly amenable to serious normative considerations. However, it is important to remember that people are willing to part with both their free time and significant funds to enjoy it. It follows that a large cohort of society finds some value in the consumption of today's pop music, and thus it is worth some sober thought. What kind of widespread value can be found in popular music? Perhaps the most obvious answer is that it accurately expresses shared feelings or desires: the lyrics authentically speak to the collective listener's experience, offering empathy in melody. History seems to lend its weight to this possibility: the earliest appearance of the blues, the precursor to rock and roll, was born out of slave gatherings in the Mississippi Delta

prior to abolition. Later, widespread dissatisfaction with the moral and political landscape of the sixties was put to verse by Dylan and Baez. More recently, pop of the nineties vocalized female empowerment through the lyrics of acts like TLC and the Dixie Chicks. In all of these instances, the music which attained popularity was that which spoke to the hearts of the people or sang out their collective feelings. If this kinship in feeling, what I've termed empathy in melody, is historically what lends music its popularity, then an examination of today's hits should provide some insight into the collective emotional state of contemporary culture. It is here that the worries begin to appear. For the sake of argument, and to best serve our earlier premises, our attention will be directed towards the most mainstream and widely consumed music, as reported by recent Billboard charts. Let's start at the top: the number one song for several weeks running is Katy Perry's party anthem "Last Friday Night," a four-chorded, guitar-driven romp through the artist's semi-recollected memories of the previous evening. As she wakes up the morning after, Perry surveys the damage to her home ("Chandelier is on the floor"), notes the locations of incapacitated revellers ("DJ's passed out in the yard"), and suggests what might have transpired ("Yeah I think we broke the law"), all before insisting that this "blacked out blur" is worth repeating next week. Perhaps more troubling than the sorry, sordid subject matter is the tone in which Perry reports it. The narrative voice is almost indifferent, apathetic even, towards the consequences: "Pictures of last night ended up online / I'm screwed / Oh well."



This apathetic tone is reminiscent of another hit from several years ago, John Mayer's "Waiting on the World to Change." Though it covers more serious subject matter, including political commentary and distaste for the power of the media empire which "owns the information," Mayer's single is not a call to action. As the title suggests, the song advocates inaction for his peers until the world is theirs: "One day my generation / Is gonna rule the population / But we keep waiting, waiting on the world to change." Unfortunately for Mr. Mayer, political terms are short, and the lifespan of idealism in contemporary politics is even shorter. Any meaningful change will take more than a brief rule by a group who lie in wait for their time. It will come from a group of active, determined people who continually press on despite the potentially Sisyphean nature of the task that Mayer enunciates. Perhaps there's the rub: our generation is educated enough to observe and eloquently report the problems of today's society, but we collectively do little to change them. And when things don't change, as is often the case with inaction, we retreat to escapism, to partying, to the blacked out blur of Perry's hit. Oh well. If our first hypothesis about empathy in melody is correct, then we can draw some unnerving conclusions here. If Perry's hit is popular because it expresses a shared feeling, namely alcoholic abandon and apathy towards the consequences, I would argue that our culture needs some sort of value reinvigoration or adjustment. Similarly, if Mayer's single speaks to us because we are dissatisfied with the world we live in and see inaction as a legitimate response, our culture needs an infusion of determination or drive. However, perhaps shared expression of feeling is not the only manner in which society finds value in popular music. Another possibility is that the act of consuming the music satiates a collective desire or need.
This alternative seems to fare no better in light of the music already discussed. If we apply this idea to Perry’s hit, one could argue that the strong desire to escape everyday responsibilities, to totally let loose, is the driving force behind its popularity. Do today’s listeners need to get blackout drunk and forget the world? Similarly, do the listeners who supported Mayer’s single thirst for inaction and ambivalence? These possible answers look even worse when juxtaposed with earlier popular hits. The United States needed Dylan to speak out against their political system, and to point out that the times were a-changing – this seems far nobler than our current hits, which paint a poor picture of both artist and consumer. Now, it may be that the first two considerations are not the


case for today’s music popularity, and that the music is enjoyed purely because of its aesthetic appeal. The catchy melody or contagious backbeat of a tune leads to it being driven up the charts. However, if this is the reason for mass appeal, I would argue that we are still left with a worrying picture of the culture supporting it. It does not dodge the value issue for two reasons. Firstly, we are shifting our focus from the lyrical to the musical, making value determination impossible. Secondly, if that is the case, we are practicing conspicuous consumption. We are worshipping the plumage and not the bird. After the above considerations, it seems to me that today’s popular music tells us something quite worrying about those consuming it. If any of the three possibilities for value determination are correct, or if some combination of the three is responsible, we are still faced with an unnerving picture of our time. Apathy, indifference, inaction, and a desire to escape are rampant. The reasons for this may go deeper than the bounds of this piece allow, but I hope that for our sakes, the answers to our issues are not blowing in the wind.


The Flattening of the Artistic Horizon

Craig Draeger Arts and Science, 2013

It seems like we spend a lot of time asking what defines art, or what makes one artistic creation superior to another. But the study of aesthetic value alone is insufficient for a full understanding of the impact of creative expression in our world, because it must also address how different forms of art are transmitted and what this entails for our broader culture. Aesthetics are inextricably linked to economics and sociology, and looking for the connections and relationships therein will lend a greater understanding of culture as a whole, especially from a historical perspective.

If we look to the origins of art (whether one chooses to start as early as cave drawings or as late as the Renaissance), it's clear that they were local in nature. Until recently, the evolution of art followed an uninterrupted path from the narrow to the broad, or the local to the national. After the advent of the printing press, the transmission methods that followed for other forms of media (radio, film, television, and the internet) helped bring the world a lot closer to home for each of us. Laying aside the written word, there are basically three types of artistic content: audio-only, visual-only, and audio-visual blends.

Let's start with audio content. While internet piracy has altered the profit margins of the professional music industry, it's not the only way the web has changed the business. Since the proliferation of home computers in the 1990s, it has been getting cheaper and easier to be a home musician. We even celebrate the living room Liberace, holding up artists like Justin Bieber, who was originally discovered through home-recorded YouTube postings, as a symbol of the industry's accessibility and meritocracy. I would suggest that this accessibility actually represents a cultural recursion toward a localized way of presenting one's own art: everyone is entitled to their own YouTube profile, and thus their own artistic

locality, with or without broad attention.

Television represents a very different paradigm. Although the emergence of Netflix and of web personalities (who are often hosting what amounts to serialized television programming) has undermined the system, TV still exists squarely within a capitalist structure and is subject to market forces as such. If a network program is not financially successful, it will be summarily cancelled.

Cinema occupies an intriguing middle ground between the flattened landscape of music and the market-driven nature of television. With "prosumer" electronics on the market, and the spread of high-quality editing software, it has become remarkably easy to film and produce your own movie. David Lynch shot his acclaimed feature Inland Empire on a standard definition, store-bought camera. However, the relationship between film production and distribution remains a major factor in determining the success of any given creation. While low-budget, creative-driven projects like The Blair Witch Project have gone on to major financial success on the back of a profit-driven model, the system does not lend itself well to the cause of creative expression before the bottom line.

Nonetheless, I don't seek to discuss the ethics of consumer-driven creativity, but rather to expose the flattening of the artistic landscape and its repercussions. We can see that the fundamental difference between film and television is not just the serialized nature of the latter, but the way its method of transmission interacts with financial structures to produce different outcomes for consumers. Advances in technology have allowed home musicians and filmmakers to practice their craft, so serialized programs will not be far behind.



Returning to music for a moment, isn't there an apparent quality control issue with a flattened landscape for musical artists? What assurances are there that an open transmission system like the internet will be able to maintain a high level of content quality? The beauty is that it doesn't have to bear that burden. The internet has been criticized for drawing out the worst in us by allowing access to our basest desires, but it has also allowed promotion for artists who would not otherwise have been discovered. The critical point is that the internet need not force a race to the bottom in music or any other form of media, despite the naysayers. If music, unshackled from a capitalist production and distribution framework, hasn't universally collapsed into abysmal quality, then other industries shouldn't be afraid to follow the same model.

“People are continuously cool hunting: searching for the latest fad in the arts, and looking to display it proudly.”

It is also worth examining visual art, which is even easier to pirate than audio content through JPEG file sharing and hosting. Why hasn't the open availability of Salvador Dali paintings on the web hampered the sales of Dali prints? Because people are not paying for the right to view the image on their computer; they're paying to display the print in their home or workplace. They're paying to make it theirs. Many artistic industries are finally realizing what should have been apparent to them a long time ago: that the medium is the message, and that they'll need to make the delivery method, just as much as the content itself, relevant to the consumer in order to survive. People are continuously cool hunting: searching for the latest fad in the arts, and looking to display it proudly. Only when they're able to tap into the great cultural recursion toward the re-localization of art will artistic industries be able to attain a sustainable business model.

For the full Inquire archive visit: inquireblog.blogspot.com


Now Accepting Submissions for the November Issue:

Power in Society and Culture

This issue examines the topic of rebellion in our world: the question of what constitutes rebellion today. We feel that what ought naturally to follow rebellion is the question of what role power plays in our society. Who holds it? Who ought to hold it? In both its abstract and concrete forms, where do we find power in our lives?

Now accepting applications for the 2011/2012 school year Inquire staff for the following positions: Editors, Copy Editor, Graphic Design, Layout Editor, Marketing Initiatives, Finance.

Please send all applications and submissions to: copresident@gmail.com


September 2011

