Bizarre Tales of Forgotten Legend

Page 1

STARRING…



1814 ⟵⟶ 2014

Subsumption . . . . 4
Obsession . . . . 11
Reconstitution . . . . 16
Power . . . . 22
Arrogance . . . . 24
Duality . . . . 28
Fate . . . . 32
Re-Occurrence . . . . 37


Thank God! this is the country where bribery can do anything, and we are well-supplied with money. – Jonathan Harker

But you must remember that I am not as you are. There is a poison in my blood, in my soul, which may destroy me. – Mina Harker



No Life Stories Rob Horning July 10, 2014 thenewinquiry.com/essays/no-life-stories

If media scholar Mark Andrejevic’s recent book, Infoglut: How Too Much Information Is Changing the Way We Think and Know, is right, the explicit appearance of Big Brother serves only to conceal the possibly more troubling fact of his ultimate indifference. To the degree that they have access to the devices we use to mediate our relation to everyday life, companies deploy algorithms based on correlations found in large data sets to shape our opportunities — our sense of what feels possible. Undesirable outcomes need not be forbidden and policed if instead they can simply be made improbable. We don’t need to be watched and brainwashed to be made docile; we just need to be situated within social dynamics whose outcomes have all been modeled as safe for the status quo. It’s not: “I see what you are doing, Rob Horning, stop that.” It’s: “Rob Horning can be included in these different data sets, which means he should be offered these prices, these jobs, these insurance policies, these friends’ status updates, and he’ll likely be swayed by these facts.” This form of control is indifferent to whether or not I “believe” in it. I can’t choose to opt out or face it down with a vigilant skepticism. I can decide to resist — I can toss my smartphone in a lead bag and refuse all social media — but these decisions won’t affect the social

infrastructure I am embedded in. At best, this will register only as a marker of quirky nonconformity. It may only supply consumerism with new symbols of rebellion to market. Since the data of average people establishes the backdrop statistics of normality, being a compliant citizen makes you essential to surveillance rather than exempt from it. This level of surveillance is thus no longer intended to enforce discipline, as in a panoptic model of social control. Instead it aspires to sink into the hum of technologically mediated everyday life. You get watched not because you’re a “person of interest” but because you’re probably already perfectly ordinary. Surveillance has aspired to become as ordinary as the population it seeks to document. Social media, smartphones, wearable trackers like Fitbit, and other interlocking and ubiquitous networks have made surveillance and social participation synonymous. Digital devices, Andrejevic notes, “make our communication and information search and retrieval


practices ever more efficient and frequent” at the cost of generating a captured record of it. Surveillance thereby is linked not with suspicion but with solicitude. Each new covetable digital service, he points out, is also at once a new species of data collection, creating a new set of background norms against which to assess people.

Thanks to the services’ ongoing proliferation, it has become increasingly inconvenient to take part in any social activity — from making a purchase to conversing with friends online to simply walking down the street —

that doesn’t leave permanent data traces on a privately owned corporate server. The conveniences and connectivity interactive technologies promise normalize what Andrejevic calls “digital enclosure” — turning the common space of sociality into an administered space in which we are all enlisted in the “work of being watched,” churning out information for the entities that own the databases. The transformation of social behavior into a valuable metacommodity of marketing data has oriented communication technology toward its perpetual collection. We become unwitting employees for tech companies, producing the data goods for the companies’ true clients: companies who can process the information. With surveillance shaped and excused as interactive participation,

Humans are controlled by lust, or fear that we will be. Lust can make people do terrible things. We aren't afraid of monsters hiding in the dark; we're afraid of ourselves. Vampires don't fear their lust or fight it. They never worry about the consequences because there aren’t any. They feel no guilt. Dracula was everything the Victorians had liberated themselves from. He threatened to undermine it all and force them backwards. David Dvorkin


we experience it not as a curtailment on our privacy so much as information overload: Each demand for more information from us comes joined with a generous provision of more information to us. Much of the surveillance apparatus provides feedback, the illusion of fair exchange. For instance, we search Google for something (helping it build its profile of what we are curious about, and when and where) and we are immediately granted a surfeit of information, more or less tailored to instigate further interaction. What is known about us is thereby redirected back at us to inform us into submission. Purveyors of targeted marketing often try to pass off these sorts of intrusion and filtering as a kind of manufactured serendipity. “In the world of database-driven targeting,” Andrejevic argues, “the goal is, in a sense, to pre-empt consumer desire.” This is a strange goal, given that desire is the means by which we know ourselves. Hoping to anticipate our desires, advertisers and the platforms that serve ads work to dismantle our sense of self as something we must actively construct and make desire something we experience passively, as a fait accompli rather than a potentially unmanageable spur to action. Instead of constructing a self through desire, we experience an overload of information about ourselves and our world, which makes fashioning a coherent self seem impossible without help. If Big Data’s dismantling the intrinsic-self myth helped people conclude that authenticity was always an impossibility, a chimera invented to sustain the fantasy that we could consume our way to an ersatz uniqueness, that

would be one thing. But instead, Big Data and social media foreground the mediated, incomplete self not to destroy the notion of the true self altogether but to open us to more desperate attempts to find our authentic selves. We are enticed into experiencing our “self” as a product we can consume, one that surveillance can supply us with. The same is true of the self. The vast pool of data about us, combined with the social media that circulate and archive it, makes it hard to escape the sense that any partial representation of ourselves is not the whole truth, is somehow “inauthentic” and easily disprovable with more data … Individual doubt about being able to process all the information about ourselves, stoked by media entities capable of overloading us with information that will seem relevant to us, provokes a surrender to the machines that can. Instead of inventing a dubious, distorted, inauthentic life story to make sense of our choices, we can instead defer to something that supposedly can’t be faked: data. Big Data benefits by persuading us that we are the least trustworthy processors of data about ourselves. The degree to which we believe our own life stories are unreliable, to others and to ourselves, is the degree to which we will volunteer more information about ourselves to data miners for processing.

Insatiability is not a state of excessive desire — it’s not wanting it all. It takes no account of substance. Instead it breeds a bottomlessness. A gaping hole. Some call it a wound. Ana Cecilia Alvarez

~~~~~~~~~~~~~~~~

This is why defying the prevailing currents of ideology at the individual level doesn’t constitute meaningful resistance. We are no longer controlled through learning ideological explanations for why things happen and what our behavior will result in. Instead, we are constructed as a set of probabilities. A margin of noncompliance has already been factored in and may in fact be integral to the containment of the broader social dynamics being modeled at the population level. Expanding the scope of individual agency doesn’t disrupt the control mechanisms enabled by Big Data; it may in fact be a by-product of that control’s efficient operation. We are ostensibly free to believe what we want, but then, Andrejevic argues, “Freedom consists of choosing one’s own invented version of history invoked for the purposes of defending the individuating logic of market competition.” Since Big Data lumps people together on the basis of the statistical implications of actions they would never bother to consciously correlate, they are left to essentially do what they please within confines they cannot perceive. So we may feel liberated from indelible typecasting by our consumer choices — by liking a certain kind of music or wearing a certain sort of

Subsumption (continues on 8)


Subsumption (continued) clothes. Self-conformity is not necessary to identity in the age of the malleable archive, the generative database.

~~~~~~~~~~~~~~~~

Big Data promises a politics without politics. The trust necessary to ratify explanatory narratives is displaced from the seemingly intractable debate among competing interests and warped into a faith in quasi-empirical mechanisms. Yet the idea that a higher level of objectivity exists at the level of data is itself a highly biased conception, a story told to abet capitalist accumulation. If unexplained correlations are politically or commercially actionable — and in capitalist society, profit arbitrates that — they will be deployed. The correlations that pay will be true, the ones that don’t will be discarded. Profit becomes truth. (This is especially true of prediction markets, in which truth is literally incentivized.) Making money becomes the only story it is possible to convincingly tell.

~~~~~~~~~~~~~~~~

Far from neutral or objective, data can be stockpiled as a political weapon that can be selectively deployed to eradicate citizens’ ability to participate in deliberative politics.

Understanding why outcomes occur becomes unnecessary … as long as the probabilities of the correlations hold to make accurate predictions.

~~~~~~~~~~~~~~~~

As sociologists Kate Crawford and Danah Boyd point out, Big Data “is the kind of data that encourages the practice of apophenia: seeing patterns where none actually exist, simply because massive quantities of data can offer connections that radiate in all directions.” The kinds of “truths” Big Data can unveil depend greatly on what those with database access choose to look for.

~~~~~~~~~~~~~~~~

This plays out not only with events but also with respect to the self. Just as politics necessarily requires interminable intercourse with other people who don’t automatically see things our way and who at best acknowledge alternate points of view only after protracted and often painful efforts to spell them out, so does the social self. It is not something we declare for ourselves by fiat. I need to negotiate who I am with others for the idea to even matter. Alone, I am no one, no matter how much information I may consume.

~~~~~~~~~~~~~~~~

The work of selfhood is difficult, dialectical, requiring not only continual self-criticism but also an awareness of the degree to which those around us shape us in ways we can’t control. We must engage them, wrestle with one another for our identities, be willing to make the painful surrender of our favorite ideas about ourselves and be vulnerable enough to become some of what others see more clearly about us. The danger is that we will settle for the convenience of technological work-arounds and abnegate the duty to debate the nature of the world we want to live in together. Instead of the collective work of building the social, we can settle for an automatically generated timeline and algorithmically generated prompts for what to add to it. … If coherent self-presentation that considers the needs of others takes work and a willingness to face our own shortcomings, collaborating with social surveillance and dumping personal experience into any and all of the available commercial containers is comparatively easy and fun. It returns to us an “objective” self that is empirically defensible, as well as an exciting and novel object for us to consume as entertainment. We are happily the audience and not the author of our life story.

~~~~~~~~~~~~~~~~

By trading narratives for Big Data, emotions are left without a basis in any belief system. You won’t need a reason to feel anything, and feeling can’t serve as a reliable guide to action. Instead we will experience the fluctuation of feeling passively, a spectator to the spectacle of our own emotional life, which is now contained in an elaborate spreadsheet and updated as the data changes. You can’t know yourself through introspection or social engagement, but only by finding technological mirrors, whose reflection is systematically distorted in real time by their administrators. x



I feel, indeed, that I have made the treasure
Of human thought and knowledge mine, in vain;
And if I now sit down in restful leisure,
No fount of newer strength is in my brain:
I am no hair’s-breadth more in height,
Nor nearer to the Infinite.

Fear not that I this pact shall seek to sever!
The promise that I make to thee
Is just the sum of my endeavor.
I have myself inflated all too high;
My proper place is thy estate:
The Mighty Spirit deigns me no reply,
And Nature shuts on me her gate.
The thread of Thought at last is broken,
And knowledge brings disgust unspoken.
Let us the sensual deeps explore,
To quench the fervors of glowing passion!
Let every marvel take form and fashion
Through the impervious veil it wore!
Plunge we in Time’s tumultuous dance,
In the rush and roll of Circumstance!
Then may delight and distress,
And worry and success,
Alternately follow, as best they can:
Restless activity proves the man!

I feel thee draw my heart, absorb, exhaust me:
Thou must! thou must! and though my life it cost me!




THE DISRUPTION MACHINE

What the gospel of innovation gets wrong Jill Lepore June 23, 2014 newyorker.com/reporting/2014/06/23/140623fa_fact_lepore

Beginning in the 18th century, as the intellectual historian Dorothy Ross once pointed out, theories of history became secular; then they started something new — historicism, the idea “that all events in historical time can be explained by prior events in historical time.” Things began looking up. First, there was that, then there was this, and this is better than that. The 18th century embraced the idea of progress; the 19th century had evolution; the 20th century had growth and then innovation. Our era has disruption, which, despite its futurism, is atavistic. It’s a theory of history founded on a profound anxiety about financial collapse, an apocalyptic fear of global devastation, and shaky evidence.


In his 1997 book, The Innovator’s Dilemma, Clayton M. Christensen argued that, very often, it isn’t that executives make bad decisions; it’s the velocity of history, and it isn’t so much a problem as a missed opportunity, like a plane that takes off without you, except that you didn’t even know there was a plane, and had wandered onto the airfield, which you thought was a meadow, and the plane ran you over during takeoff. Ever since The Innovator’s Dilemma, everyone is either disrupting or being disrupted. There are disruption consultants, disruption conferences, disruption seminars. This fall, the University of Southern California is opening a new program: “The degree is in disruption,” the university announced. “Disrupt or be disrupted,” the venture capitalist Josh Linkner warns in a new book, The Road to Reinvention, in which he argues that “fickle consumer trends, friction-free markets, and political unrest,” along with “dizzying speed, exponential complexity, and mind-numbing technology advances,” mean that the time has come to panic as you’ve never panicked before. Larry Downes and Paul Nunes, who blog for Forbes, insist that we have entered a new and even scarier stage: “big bang disruption.” “This isn’t disruptive innovation,” they warn. “It’s devastating innovation.”

Things you own or use that are now considered to be the product of disruptive innovation include your smartphone and many of its apps, which have disrupted businesses from travel agencies and record stores to mapmaking and taxi dispatch. Much more disruption, we are told, lies ahead. Christensen has co-written books urging disruptive innovation in higher education, public schools, and health care. His acolytes and imitators, including no small number of hucksters, have called for the disruption of more or less everything else.

~~~~~~~~~~~~~~~~

The rhetoric of disruption  —   a language of panic, fear, asymmetry, and disorder — calls on the rhetoric of another kind of conflict, in which an upstart refuses to play by the established rules of engagement, and blows things up. Don’t think of Toyota taking on Detroit. Startups are ruthless and leaderless and unrestrained, and they seem so tiny and powerless, until you realize, but only after it’s too late, that they’re devastatingly dangerous: Bang! Ka-boom! Think of it this way: the Times is a nation-state; BuzzFeed is stateless. Disruptive innovation is competitive strategy for an age seized by terror. Every age has a theory of rising and falling, of growth and decay, of bloom and wilt: a theory of nature. Every age also has a theory about the past and the present, of what was and what is, a notion of time: a theory of history.

Theories of history used to be supernatural: the divine ruled time; the hand of God, a special providence, lay behind the fall of each sparrow. If the present differed from the past, it was usually worse: supernatural theories of history tend to involve decline, a fall from grace, the loss of God’s favor, corruption.

Most big ideas have loud critics. Not disruption. Disruptive innovation as the explanation for how change happens has been subject to little serious criticism, partly because it’s headlong, while critical inquiry is unhurried; partly because disrupters ridicule doubters by charging them with fogyism, as if to criticize a theory of change were identical to decrying change; and partly because, in its modern usage, innovation is the idea of progress jammed into a criticism-proof jack-in-the-box.

The idea of progress  —  the notion that human history is the history of human betterment — dominated the world view of the West between the Enlightenment and the First World War. It had critics from the start, and, in the last century, even people who cherish the idea of progress, and point to improvements like the eradication of contagious diseases and the education of girls, have been hard-pressed to hold on to it while reckoning with two


World Wars, the Holocaust and Hiroshima, genocide and global warming. Replacing “progress” with “innovation” skirts the question of whether a novelty is an improvement: the world may not be getting better and better but our devices are getting newer and newer. The word “innovate”  — to make new  — used to have chiefly negative connotations: it signified excessive novelty, without purpose or end. Edmund Burke called the French Revolution a “revolt of innovation”; Federalists declared themselves to be “enemies to innovation.” George Washington, on his deathbed, was said to have uttered these words: “Beware of innovation in politics.” Noah Webster warned in his dictionary, in 1828, “It is often dangerous to innovate on the customs of a nation.” The redemption of innovation began in 1939, when the economist Joseph Schumpeter, in his landmark study of business cycles, used the word to mean bringing new products to market, a usage that spread slowly, and only in the specialized literatures of economics and business. (In 1942, Schumpeter theorized about “creative destruction”; Christensen believes that Schumpeter was really describing disruptive innovation.) “Innovation” began to seep beyond specialized literatures in the 1990s, and gained ubiquity only after 9/11. The idea of innovation is … progress stripped of the aspirations of the Enlightenment, scrubbed clean of the horrors of the twentieth century, and relieved of its critics. Disruptive innovation goes further, holding out the hope of salvation against the very damnation it describes: disrupt, and you will be saved.

Disruptive innovation as a theory of change is meant to serve both as a chronicle of the past (this has happened) and as a model for the future (it will keep happening). The strength of a prediction made from a model depends on the quality of the historical evidence and on the reliability of the methods used to gather and interpret it. Historical analysis proceeds from certain conditions regarding proof. None of these conditions have been met.

~~~~~~~~~~~~~~~~

Christensen’s sources are often dubious and his logic questionable. His single citation for his investigation of the “disruptive transition from mechanical to electronic motor controls,” in which he identifies the Allen-Bradley Company as triumphing over four rivals, is a book called The Bradley Legacy, published by a foundation established by the company’s founders. This is akin to calling an actor the greatest talent in a generation after interviewing his publicist.

~~~~~~~~~~~~~~~~

In the preface to the 2011 edition of The Innovator’s Dilemma, Christensen reports that, since the book’s publication, in 1997, “the theory of disruption continues to yield predictions that are quite accurate.” This is less because people have used his model to make accurate predictions than because disruption has been sold as advice, and because much that happened between 1997 and 2011 looks, in retrospect, disruptive. Disruptive innovation can reliably be

Obsession (continues on 14)

Two Sentences About Getting Older and Working on the Web Frank Chimero May 03, 2014 frankchimero.com/blog/two-sentences-about-getting-older-and-working-on-the-web/

It is time to come clean: pretty much everything in a modern web stack no longer makes sense to me, and no one explains it well, because they assume I know some fundamental piece of information that everyone takes for granted and no one documented, almost as if it were a secret that spread around to most

everyone some time in 2012, yet I somehow missed, because — you know — life was happening to me, so I’ve given up on trying to understand, even the parts where I try to comprehend what everyone else is working on that warrants that kind of complexity, and now I fear that this makes me irrelevant, so I nestle close

to my story that my value is my “ideas” and capability to “make sense of things,” even though I can’t make sense of any of the above — but really, maybe I’m doing okay, since it’s all too much to know. Let the kids have it.

x


Obsession (continued) seen only after the fact. History speaks loudly, apparently, only when you can make it say what you want it to say. The popular incarnation of the theory tends to disavow history altogether. “Predicting the future based on the past is like betting on a football team simply because it won the Super Bowl a decade ago,” Josh Linkner writes in The Road to Reinvention. His first principle: “Let go of the past.” It has nothing to tell you. But, unless you already believe in disruption, many of the successes that have been labeled disruptive innovation look like something else, and many of the failures that are often seen to have resulted from failing to embrace disruptive innovation look like bad management.

Christensen has compared the theory of disruptive innovation to a theory of nature: the theory of evolution. But among the many differences between disruption and evolution is that the advocates of disruption have an affinity for circular arguments. If an established company doesn’t disrupt, it will fail, and if it fails it must be because it didn’t disrupt. When a startup fails, that’s a success, since epidemic failure is a hallmark of disruptive innovation. (“Stop being afraid of failure and start embracing it,” the organizers of FailCon, an annual conference, implore, suggesting that, in the era of disruption, innovators face unprecedented challenges. For instance: maybe you made the wrong hires?) When an established company succeeds, that’s only because it hasn’t yet failed. And, when any of these things happen, all of them are only further evidence of disruption.

Disruptive innovation as an explanation for how change happens is everywhere. Ideas that come from business schools are exceptionally well marketed. Faith in disruption is the best illustration, and the worst case, of a larger historical transformation having to do with what happens when the invisible hand replaces the hand of God as explanation and justification. Innovation and disruption are ideas that originated in the arena of business but which have since been applied to arenas whose values and goals are remote from the values and goals of business. People aren’t (widgets). Public schools, colleges and universities, churches, museums, and many hospitals, all of which have been subjected to disruptive innovation, have revenues and expenses and infrastructures, but they aren’t industries in the same way that manufacturers of hard-disk drives or truck engines or dry goods are industries. Journalism isn’t an industry in that sense, either. Doctors have obligations to their patients, teachers their students, pastors their congregations, curators the public, and journalists their readers — obligations that lie outside the realm of earnings, and are fundamentally different from the obligations that a business executive has to employees, partners and investors. Historically, institutions like museums, hospitals, schools, and universities have been supported by patronage, donations made by individuals or funding from church or state. The press has generally supported itself by charging subscribers and selling advertising. (Underwriting by corporations and foundations is a funding source of more recent vintage.) Charging for admission, membership, subscriptions and, for some, earning profits are similarities these institutions have with businesses. Still, that doesn’t make them industries, which turn things into commodities and sell them for gain.

to one for whom having is the main form of relatedness to the world, ideas that cannot easily be pinned down are frightening — like everything else that grows and changes, and thus is not controllable. Erich Fromm

~~~~~~~~~~~~~~~~

It’s readily apparent that, in a democracy, the important business interests of institutions like the press might at times conflict with what became known as the “public interest.” That’s why, a very long time ago, (the news media) established a wall of separation between the editorial side of affairs and the business side. “The wall dividing the newsroom and business side has served The Times well for decades,” according to the Times’ Innovation Report, “allowing one side to focus on readers and the other to focus on advertisers,” as if this had been, all along, simply a matter of office efficiency. But the notion of a wall should be abandoned, according to the report, because it has “hidden costs” that thwart innovation.

~~~~~~~~~~~~~~~~

Vox Media, a digital-media disrupter that is mentioned ten times in the Times report and is included, along with BuzzFeed, in a list of the Times’ strongest competitors (few of which are profitable), called the report “brilliant,” “shockingly good,” and an “insanely clear” explanation of disruption, but expressed the view that there’s no way the Times will implement its recommendations, because “what the report doesn’t mention is the sobering conclusion of Christensen’s research: companies faced with disruptive threats almost never manage to handle them gracefully.”

Disruptive innovation is a theory about why businesses fail. It’s not more than that. It doesn’t explain change. It’s not a law of nature. It’s an artifact of history, an idea, forged in time; it’s the manufacture of a moment of upsetting and edgy uncertainty. Transfixed by change, it’s blind to continuity. It makes a very poor prophet. The upstarts who work at startups don’t often stay at any one place for very long. (Three out of four startups fail. More than

Obsession (continues on 31)



When I found so astonishing a power placed within my hands, I hesitated a long time concerning the manner in which I should employ it. Although I possessed the capacity of bestowing animation, yet to prepare a frame for the reception of it, with all its intricacies of fibres, muscles, and veins, still remained a work of inconceivable difficulty and labour. I doubted at first whether I should attempt the creation of a being like myself or one of simpler organization; but my imagination was too much exalted by my first success to permit me to doubt of my ability to give life to an animal as complex and wonderful as man. The materials at present within my command hardly appeared adequate to so arduous an undertaking; but I doubted not that I should ultimately succeed. I prepared myself for a multitude of reverses; my operations might be incessantly baffled, and at last my work be imperfect: yet, when I considered the improvement which every day takes place in science and mechanics, I was encouraged to hope my present attempts would at least lay the foundations of future success. Nor could I consider the magnitude and complexity of my plan as any argument of its impracticability. It was with these feelings that I began the creation of a human being. As the minuteness of the parts formed a great hindrance to my speed, I resolved, contrary to my first intention, to make the being of a gigantic stature; that is to say, about eight feet in height, and proportionably large. After having formed this determination, and having spent some months in successfully collecting and arranging my materials, I began. (Ch 3)


Living In The Mess Mandy Kahn 2011 Reprinted from Collage Culture

The PERFECT CIRCLES and DORIC LINES of the ancient World are buried at Pompeii, and anyway, flawless things make the people beside them more cracked and ragged. Better to show off the broken thing, the jarring thing, a mess of the purposely mismatched. At the very least it makes us feel like we’re telling the truth.

susceptible — embarrassingly delicate in travel, for example — the first in any boating party to turn green. And maybe that’s why I started to feel in my gut the effects of discrepancy in the new millennium, as much as I loved the new millennium — as much as I wanted everything to be threadbare and disintegrated, as much as I chose to live in intentional discordance: still. I wanted to choose the mess, and became increasingly fearful the mess had chosen me.

child’s mourning, triumph over missing; the new version moved listeners to tears in a way the first never had. That extra dimension is the complex, layered province of collage: it’s the tension, the depth, the feeling and meaning that comes from juxtaposition, and it’s the realm, I later realized, I was pretty much living in. I’d noticed Natalie Cole’s turn in part because it was rare. But ten years later, songs made from the cut-up parts of other songs were so prevalent you’d be hard-pressed to notice. Or rather, you’d be hard-pressed to remember it hadn’t always been that songs were made by sampling other songs. And anyhow, so much had changed since 1991 that the trend towards borrowing was barely a blip on the new screens that were everywhere — in our offices, in our pockets and purses, even embedded in the backs of the plane seats in front of us, in the dashboards of our cars.


Enough time in the randomness and patterns make themselves apparent  —even there, in the place from which order was banished.

I wasn’t interested in patterns but there they suddenly were, and then they were everywhere, which made them invisible.

Always with me was a book of Rauschenberg posters, which I loved, though it was hard to decipher why. Here was a woman’s HAIRBRUSH, here was a pack of MARATHON RUNNERS, here was a sitting room DRAWING OF SHEEP, here was A GUSTY WHEAT FIELD.

These cuttings, seemingly unrelated, were glued in a pattern I found inexplicable — if it were a math problem I’d call it unsolvable — and though no part felt inherently beautiful I found beauty in the resulting combination, which was maybe a visual beauty, maybe the dug-up beauty of an idea.

Like everyone else, I spent the first (years) of the new millennium

making nests of cast-off things, of pieces culled from eras past I wove together without pattern — my own hairbrush and marathon runners and line-drawn sheep and windy clumps of wheat.

And then one day I looked around and said, I’m living in a Rauschenberg poster, and the song that’s playing is Rauschenberg too, and the book of poems on my bedside table, found poems made from scraps of extant material, is one also — and that’s when I started to feel unsettled. Sometime during elementary school I saw a television program about motion sickness. This was of special interest to me as I’ve always been queasy on long drives, on whale-watching trips, in films with hand-held camerawork. We feel nauseous, the program explained, when our brain receives clashing messages — when the body says I’m bouncing furiously while the eyes say I’m on steady waters, for example. Some people are more sensitive to this discrepancy than others. I’ve always been unusually

When Natalie Cole’s iteration of the pop song "Unforgettable" became a considerable Billboard hit in 1991, I thought it was a cheat. She hadn’t even written the song, I argued — she’d done nothing more than add a vocal track to a song of her father’s — she’d simply squeezed a hit out of a hit. Value, I argued, lay in original creation. The collage she’d made was a lower form of art — a trick — a deeply American show of, well, laziness and greed.

Years later I came to understand the genius in Natalie Cole’s stroke. Her vocals added an entire dimension to a listener’s experience, a dimension created by the relationship between the song’s components. After all, her father, Nat "King" Cole, had passed away in ’65. That added dimension was furnished to bursting with filial piety, a

What could possibly be dangerous about borrowing from other times — times we knew inherently, times we collectively missed? What could be less dangerous than a pack of ardent runners, a hairbrush and a somehow-familiar sheep? By then I’d been a teacher and knew how hard it was to get students to understand tone. And that extra dimension a collage creates, the invisible but palpable and complex dimension, is a realm of tone entirely. To make sense of a collage the viewer must guess with what tone its elements have been joined. But tone is subtle and an ear for it is hard for some to cultivate. Without such an ear, works of


collage become noise or unfixed puzzles. The early 2000s saw a tone-important trend in the arrival of the ironic moustache, a favorite at the shows I was frequenting. The first group to wear it — a tiny number, a handful at best — joined it with a certain heavy smirk. The two — smirk and moustache — were never seen apart, because without the smirk

For those who wore it, the ironic moustache was a stroke of ultimate brashness, a thing they knew was on the surface ugly but intended to make meaningful, even attractive, by marrying it to an idea. It was worn, said the smirk, to make you laugh, or guffaw, or as a conversation piece, or to indicate confidence: I can wear something ugly on my face and still be impossibly attractive. That smirk was a negative sign placed before the

a college sweatshirt to demonstrate an earnest love of a school — let’s say a +6 — then the wearing of that same sweatshirt to mock the college-loving becomes a -6. Tone will let an onlooker know whether to plot that sweatshirt’s value on the positive or negative side of zero on the number line, whether it’s worn to mean the thing itself or its opposite. Tone is all-important, then — an absolute imperative for under-

of tone-important revivals — revival after revival, coming faster and faster, until they were piling up, happening concurrently, creating a kind of retro salad in which tone was drowned, meaning was forgotten, and things became only a parade of undesignated symbols, which is why I loved this decade, and which is why it came to worry me.

The unity of a text is not in its origin, it is in its destination; but this destination can no longer be personal: the reader is a man without history, without biography, without psychology; he is only that someone who holds gathered into a single field all the paths of which the text is constituted. Roland Barthes

the moustache became its own grotesque opposite, its own photo negative. The moustache was exceptionally shaggy — an exaggeration, if one was possible, of the version popular in the 1980s among television and porn actors. It was worn with neither reverence nor nostalgia, though some of our fathers and uncles had once displayed it with pride. This was not a way to align ourselves with our fathers and our uncles. It was a way to separate ourselves, because its tone, dictated by the smirk, was wholly inscrutable to many, and so to many the moustache was puzzling or repellent or taken as its opposite, a tribute. That moustache, on some level, became the locked door to a speakeasy many would neither locate nor have the credentials to enter.

moustache itself, transforming the ugly to its opposite, but only for those who understood. As the moustache disseminated, the smirk it carried dissipated. The next wave of wearers, and certainly the next after those, may have only been able to explain the moustache by saying it was retro, it was cool. Once it was popular, it stopped being ugly and therefore stopped being funny. Without the smirk it was a plain, straight thing, worn for normal reasons. Though their absolute value is the same, -6 and +6 are mathematical opposites. The negative sign tells us whether to plot a value on the number line towards the positive side or into the land of negatives. Tone, similarly, tells us where to plot a value. For example: If we ascribe a random number to the wearing of

standing, for the accurate communication of meaning. What happens when tone can’t be determined by a viewer? In that case, the thing might be wrongly ascribed an opposite meaning to what had been intended — mathematically, -6 might be read as +6. But even more often, things are left uncategorized — I don’t know what it means, it’s confusing, I don’t have time to figure it out — and everything, whether meant to be positive or negative, becomes its absolute value — a number of paces to move without the map that says which way to go — the thing itself, but meaningless — a word, say, written in cuneiform — a symbol. There Is Danger In A Landscape Of Meaningless Symbols. The first decade of the 21st century saw a record number

How Different Is The Ironic Moustache From Much Of Conceptual Art? Each relies on the beauty of an idea to advance its project, and charts its course at times by way of ugliness. Faced with the conundrum of preference posed by poet Wallace Stevens, both conceptual art and moustache choose the latter: I do not know which to prefer, The beauty of inflections, Or the beauty of innuendoes. The blackbird whistling, Or just after. I say “collage culture” because the cutting and pasting of extant things has replaced the act of original creation in the new millennium as the favored creative method. Musicians and artists, designers and writers, jewelry makers and interior decorators

Reconstitution

(continues on 20)


Reconstitution

(continued)

employ the method of collage — a method that once felt foreign and daring — with such incredible frequency now that nobody notices. Sometimes the fashions of entire decades were revived, sometimes just pieces — the ’50s prom dress became a rage, and ’60s leather Beatle boots, ’80s shoulder pads and ’40s swimsuits. The past, it seemed and increasingly felt, was being ransacked: rummaged so its component parts could be joyfully cut from their contexts, joined with other things and assigned a different tone. In essence, collaged.

When my parents divorced, my dad retained our family house. Far into the nineties, it kept its ’70s form: sunken living room, orange carpeting, drapes in an oversized floral pattern, giant mirrors, wallpaper in brown and tan. I loved that house, even when it fell from fashion; I remember a classmate saying that visiting felt like entering a time warp.

I’m bouncing from the body — we feel physical symptoms. I speak to Johannes Burge, a postdoctoral researcher at the Center for Perceptual Systems in the Department of Psychology at the University of Texas at Austin. I ask whether collage in fashion or interior design or music might affect the way one feels. If an outfit tells the brain, for example, the year is 1980 and the year is 1955 concurrently, or a room does, might that send the body into a sort of does-not-compute mode, resulting in a nausea similar to seasickness? As far as he’s aware, no one has ever tested for this.

being manufactured, were being boxed and shipped, were on the bodies of high school freshmen who didn’t know us. But the question remains: Why did we turn to the past? Those parties were full of art school students, filmmakers, musicians, graphic designers: creative minds of every possible stripe. Why didn’t they invent a style so singular their fingerprint would be indelible? Wasn’t there anything anyone wanted to say?


The New Year’s Eve that welcomed the millennium was fraught with heavy fear. Computers weren’t made to handle the date change, went common lore: they might all crash and New Year’s Day devolve to temporary anarchy. People prepared for power outages, shortages of water. They pulled out large amounts of cash, should banking systems go. As the date transitioned there was an audible roar at the party I attended, as a roomful of revelers waited to see whether the power would cut out. It didn’t.

Corporations had scrambled to fix computer glitches ahead of time. And though anarchy didn’t arrive when people feared it would, it did arrive, at least in boisterous spirit. Y2K had started a decade that looted the closets of history, that picked through the sounds and styles of the past like the strictures of time no longer applied, that borrowed and bore with a kind of electric frenzy — till it was harder and harder to tell by sight what year we were living in.

My mom’s house was similarly a product of its era: walls in baby blue and mint, down-stuffed couches, pastel flowers, lace. Those were decades when home decor was carefully matched — not only in style and color and pattern, but also in era. My mother would never have let a mid-century love seat cross her ’80s threshold. And my father, who knew nothing about design, would still have somehow known it was taboo to switch my mother’s couch with his. These aesthetic rules were so entrenched that violating them would have made a person appear unhinged. ~~~~~~~~~~~~~~~~

From a 2008 article entitled “Sydney’s (Best) Houses,” the home profiled features “an oriental coffee table and oversized black chinoiserie cupboard, French neoclassic Louis dining chairs, African sculptures, midcentury chairs.” Such a mix, in the perfect ’80s, would have made its owner an oddball. Now anything is fair game for plundering and incorporation — any point of origin, any moment in time.

~~~~~~~~~~~~~~~~

Los Angeles harbors the means of creative production. Ideas that are born here are transmitted to the rest of the world with staggering speed  —we have the movie cameras, the music studios, the concrete factories downtown ready to multiply a pair of pants by twenty thousand. This city fuses wings onto our notions, so that before we’ve made up our minds entirely, they’ve flown from us.

I’ve been told no cultural movement starts in a single place, which I don’t disbelieve. The internet serves as a kind of cosmic consciousness, conjoining us all, and it’s silly to lay claim to something as big as a trend.

What I Can’t Get Out Of My Mind Is The Concept Of Motion Sickness.

That said, it felt like we were at the point of germination, the place where the cell was splitting, and there was giddiness to it, a whiff of agency or power.

When the body gets two dissimilar pieces of information — I’m stationary from the eyes and

Someone decided to dress up for the party in something strange and then clothes were

But borrowing was saying — it was bestowing favor, it was creating an in-crowd of those who could judge tone, it was expressing the abandon and rowdy spirit inherent in plundering. And so the revivals came faster and faster, till everything was being revived, it felt, at once — all moments in our collective past were alive again, concurrently, and suddenly we could pick and choose between them. The dance floor at the Vine Bar was alive with ’60s fringe, ’80s shoulder pads and ’30s waistcoats. And for the record, this had nothing to do with drugs — who needed them? The walls of time, for us, were already porous. Before it arrived, I assumed the millennium would bring new trends in culture, immediately after burning the old trends down. I was wrong. It didn’t burn — it looted. I’d been trained by the former century, when every decade wanted to be new. Just as a mob at Comiskey Park burned disco records with audible glee, each decade of the twentieth century involved a thorough housecleaning and a methodical reinvention. Fifties strictures and rules of decorum were burned in the love fires of the sixties Haight, for example. The


excesses of the ’20s were denounced through the prudence of the ’30s. Like teenagers, each generation labored to become itself, and defined itself in exact opposition to what had come before, as if the opposite was by definition the most rebellious. But Nothing Feels More Tired Than A Predictable Rebellion.

past, but on fast-forward —here was the song that brought everyone onto the dance floor at our sixth grade prom, here was the song my stepmother played ad nauseam on cassette — but as soon as a melody swam to the surface another one would clobber it. I was chasing every memory, but fruitlessly, exhaustively: sometimes getting grip-

Like a drug trip, however, we can’t choose where we go. And like a drug trip, there’s risk: what’s accessed within us can be wonderful but it can also be sinister. The human brain buries things for reasons. A hypnotist approaches his work carefully; Gillis seems almost brash. “I like these songs to be familiar so you either love them or hate

Johannes, auditory processing is faster … but most of the processing done at this speed is subconscious. Researchers, when testing for visual perception, will flash an image on a screen for a slice of time — let’s say 30 milliseconds — and ask a volunteer whether he saw anything. At that speed, he’ll say he didn’t. Then researchers will

It may be that the incomplete story, the particle, the fragment, is now the preferred unit of information in our culture, and lack of place is more useful for presenting these fragments than to fix them into sentences or grids. Frances Butler After a century of reinvention, perhaps nothing seemed more revolutionary than borrowing, using, valuing —than crediting the past, even worshiping it — and in doing so, throwing off the hampering need to invent. When we’d tired of burning, we looted. And I suspect someday we’ll burn our loot. Better known as Girl Talk, Gregg Gillis is a mash-up artist; his craft lies in the manipulation of chunks of extant songs. His first album, Night Ripper, uses more than 100 samples, and his follow-up Feed the Animals quotes upwards of 300. He’s especially frenetic when playing live; egged on by an audience, he’s been known to quote an album’s worth of samples in a handful of minutes. Notable is the incredible familiarity of the songs Gillis chooses to use: each one such a radio favorite its notes are etched in our collective history. The first time I heard Feed the Animals I felt I was on a children’s ride through the halls of my own

ping chunks but never getting wholes. Apparently a reporter at The Montreal Gazette had a similar experience: “The effect is as dizzying as it is intoxicating. You barely have time to react to one combination of associations before it is replaced by another, and another, and another.” Which, as it turns out, is intentional. Gillis tells London’s The Guardian, “I always wanted to use recognizable elements and play with people’s emotional, nostalgic connections with these songs. … For me the biggest challenge is to make transformative new music out of these huge musical parts that anyone can recognize.”

them, that they have some kind of memory I can manipulate,” Gillis tells The Daily Telegraph.

And so suddenly, not just parts of songs but parts of associated memories are being looped, sped, scratched and stacked together. What could be more hallucinatory? What could be more surreal?

I ask Johannes Burge whether listeners can process the samples in a Girl Talk song at the pace they occur, even if their conscious minds can’t keep up. He sends me an article from Nature Neuroscience that makes me think they can.

Used to be that drugs, or hypnotism, or psychotherapy were required to plumb those depths.

I’ve seen Gillis play; no one gets people dancing faster. As he carries the crowd from sample to sample, between places we remember, all of us pressed together and dancing as hard as our bodies will go, there’s a feeling of shared excitement, and it’s intoxicating. There’s a feeling that we’re headed somewhere, and we’re going there together, and good or bad, we’re willing: we’ve signed on for the duration. ~~~~~~~~~~~~~~~~

Our system processes visual data with 75% accuracy in fewer than 30 milliseconds, the article explains. And, says

force him to choose between two possible images as the thing he saw. If 75 of 100 volunteers choose the correct image, it is said we process visual data at 30 milliseconds with 75% accuracy. But still, the volunteer will think he hasn’t seen anything. Johannes … calls the samples Girl Talk uses “jingles.” When a song comes on the radio that stirs a particular memory, that memory has the chance to bloom in the conscious, to be pondered. We might spend the time it plays wistfully revisiting, turning over a moment or a feeling in the mind. The trouble with a Girl Talk song is the speed at which it happens — faster than the conscious mind but slow enough for the subconscious. Gillis layers samples so we get old songs concurrently, and so memories, as quick as they come, are braided into rope — rubbed against each other for ferocity, for friction. We leave feeling elated or dazed or numbed or shaken or raw, oblivious to the specifics of

Reconstitution

(continues on 38)


Sequenced in the USA
A Desperate Town Hands Over Its DNA
Amanda Wilson July 21, 2014 www.psmag.com/navigation/business-economics/sequenced-u-s-desperate-town-hands-dna-85232

Eleven years ago, the community of Kannapolis, about 25 miles northwest of Charlotte, was the scene of the largest single layoff in North Carolina history. All in one day, some 4,300 locals — a tenth of the town’s population — lost their jobs when the textile mill at the center of town closed its doors.

in a metabolic chamber while their vital signs are monitored — sometimes for $30 to $250 and up. In exchange for their participation, contributors (also) receive a $10 gift card to Walmart.

there, that raw material —that blood and urine —can be distributed to researchers to be analyzed and rendered into the form most useful for molecular science: into ones and zeroes. Into data.

Many Kannapolites will tell you, however, that they’re not participating for the gift card. At a predominantly African American church that recruiters visited in 2012, two women told me they took part in the study for the good of their grandchildren and future generations. At a car dealership near Kannapolis where 30 employees gave samples to the MURDOCK Study, two managers told me they were just doing what they could to boost the town’s new biotech economy. One said he hoped 10,000 people would work at the research complex someday, and that they would all drive cars from his dealership. “We are trying to support it in the only way we can.”

If the promise of personalized medicine bears fruit, this data — gene sequences, protein levels, and the various other codes that can be translated from human tissue  —could lead to major advances in public health. It could also be worth a huge amount of money.


At the time of the layoff, as many as half of the mill workers — average age of 46  —hadn’t graduated high school. Many had no computer skills. Some had never worked a mouse or ever driven on an interstate. So why is Kannapolis lucky? Because of what didn’t happen after the shutdown. Because the mill didn’t become a derelict ruin at the heart of town. Because the local economy didn’t entirely collapse from the inside out. Because a new Walmart opened. (“That’s where a lot of people went to work,” says Kevin Eagle, a local artist and third-generation Kannapolite.) And most of all, because an elderly Los Angeles billionaire named David H. Murdock stepped in to transform Kannapolis into a mecca for biotechnology and life sciences research. ~~~~~~~~~~~~~~~~ Today, the workers who pull up each morning at the site of the

former mill are more likely to be genomics scientists from China than blue-collar Kannapolites with a high school education. According to the campus developer, the complex should eventually create 5,000 jobs in the area; so far it has created 600, most of them highly skilled positions that require advanced degrees. In other words, although the research complex occupies the same physical space as the mill once did, it does not constitute the center of daily life in Kannapolis quite the way the mill did. Except, perhaps, in one respect.

Today’s Kannapolis does not offer as many good blue-collar jobs as it used to —unemployment still hovers at 10% —but it does provide plenty of opportunities for locals to serve as human research subjects. At any given time, Kannapolites can take part in studies of nutrition, longevity, or cognitive or physical performance; eat a standardized diet; or spend time

So far, over 10,000 people have joined the study. All their samples reside in a white, 40,000-square-foot warehouse on the edge of town, called a biorepository, or biobank. Once

~~~~~~~~~~~~~~~~

To translate its own discoveries into products for market, Duke has created a spinoff company called  —  cutely, for a venture anchored in a former manufacturing town — the Biomarker Factory. The spinoff is a joint venture with LabCorp, one of the largest clinical lab testing firms in the U.S. In the partnership, Duke brings biological material to the table —20% of the samples from the MURDOCK Study  —and LabCorp brings cash: $24 million. The goal of the new company is to rapidly turn discoveries from the MURDOCK

How much will the average citizen share in the prosperity of the town’s new biotech economy writ large?


Study into intellectual property and lab-test products. If those products go to market and are successful, it will no doubt be good for Kannapolis; one way or another, some portion of the profits will probably filter through the town’s economy. But it’s not exactly clear how. A generation ago, the average Kannapolite derived a sizable share of the prosperity generated by the mill. How much will the average Kannapolite share in the prosperity of ventures like the Biomarker Factory, and of the town’s new biotech economy writ large? The recent transformation of Kannapolis presents a kind of parable for larger transformations in the American economy. Kannapolis, like much of America, now has an economy in which the good jobs go to highly skilled specialists, and prosperity trickles down through the service economy to less-skilled workers. An economy in which the quirky notions of billionaires can mobilize huge sums of talent and capital for speculative ends. And one in which major firms create value by harvesting the personal data of ordinary people — whether from a sample of their blood or the details of their Facebook page — and then monetize that data in proprietary maneuvers that those people barely comprehend. x


With such precautions the courtiers might bid defiance to contagion. The external world could take care of itself. In the meantime it was folly to grieve, or to think. The prince had provided all the appliances of pleasure. There were buffoons, there were improvisatori, there were ballet-dancers, there were musicians, there was Beauty, there was wine. All these and security were within. Without was the "Red Death."


This is Probably a Good Time to Say That I Don't Believe Robots Will Eat All the Jobs

Marc Andreessen June 13, 2014

blog.pmarca.com/2014/06/13/this-is-probably-a-good-time-to-say-that-i-dont-believe-robots-will-eat-all-the-jobs

One of the most interesting topics in modern times is the "robots eat all the jobs" thesis. It boils down to this: Computers can increasingly substitute for human labor, thus creating unemployment. Your job, and every job, goes to a machine. This sort of thinking is textbook Luddism, relying on a "lump-of-labor" fallacy — the idea that there is a fixed amount of work to be done. The counterargument to a finite supply of work comes from economist Milton Friedman: human wants and needs are infinite, which means there is always more to do. I would argue that 200 years of recent history confirms Friedman's point of view. If the Luddites had it wrong in the early 19th century, the only way their line of reasoning works today is if you believe this time is different from those Industrial Revolution days. That either: (a) There won't be new wants and needs (which runs counter to human nature); or (b) It won't matter that there are new wants and needs, most people won't be able to adapt to, or contribute/have jobs in, the new fields where those new wants and needs are being created.

While it is certainly true that technological change displaces current work and jobs (and that is a serious issue that must be addressed), it is equally true, and important, that the other result of each such change is a step-function increase in consumer standards of living.

As consumers, we virtually never resist technology change that provides us with better products and services even when it costs jobs. Nor should we. This is how we build a better world, improve our quality of life, better provide for our kids, and solve fundamental problems. Make no mistake, advocating for slowing technological change to preserve jobs equals advocating for the punishment of consumers, and stalling the march of quality of life improvements. So how then to best help those who are buffeted by producer-side technology change and lose jobs they want to keep?

First: Focus on increased access to education and skill development, which will increasingly be delivered via technology. Second: Let markets work (this means voluntary contracts and free trade) so that capital and labor can rapidly reallocate to create new fields and jobs.

Third: Create and sustain a vigorous social safety net so that people are not stranded and unable to provide for their families. The loop closes as rapid technological productivity improvement and resulting economic growth make it easy to pay for the safety net.

With these three things in place, humans will do what they always do: create things that address and/or create new wants and needs.

The Flip Side Of Robots Eating Jobs. What never gets discussed in all of this robot fear-mongering is that the current technology revolution has put the means of production within everyone's grasp. It comes in the form of the smartphone (and tablet and PC) with a mobile broadband connection to the Internet. Practically everyone on the planet will be equipped with that minimum spec by 2020.

What that means is that everyone gets access to unlimited information, communication and education. At the same time, everyone has access to markets, and everyone has the tools to participate in the global market economy. This is not a world we have ever lived in. Historically, most people — in most places — have been cut off from all these things, and usually to a high degree. But with that access, with those tools in the hands of billions, it is hard to believe that the result will not be a widespread global unleashing of creativity, productivity and human potential. It is hard to believe that people will get these capabilities and then come up with … absolutely nothing useful to do with them. And yet that is the subtext to the "this time is different" argument that there won't be new ideas, fields, industries, businesses, and jobs. In arguing this with an economist friend, his response was, "But most people are like horses; they have only their manual labor to offer…" I don't believe that, and I don't want to live in a world in which that's the case. I think people everywhere have far more potential.

There is a consequence to a growing robot workforce. Everything gets really cheap. The main reason to use robots instead of people to make something is when the robot can make it less expensively. The converse is also true. When people can make something that costs less than what robots can make, then it makes economic sense to use people instead of robots. This is basic economic arbitrage at work. It sounds like it must be a controversial claim, but it's simply following the economic logic. Suppose humans make Widget X profitably at a $10 price to the consumer, but robots can make X at a $5 price to the consumer. Economics drive X to be made entirely by robots, and consumers win. But then imagine the owner of the robots cranks the price of X to the consumer up to $20. Suddenly it's profitable for humans to make X again; entrepreneurs immediately start companies to make X with humans at the $10 price again. Robots eat jobs in field X. What follows is that products get cheaper in field X, and the consumer standard of living necessarily increases. Based on that logic, arguing against robots eating jobs is equivalent to arguing that we punish consumers with unnecessarily higher prices. Indeed, had robots/machines not eaten many jobs in agriculture and industry already, we would have a far lower standard of living today.
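(A minimal sketch, in Python, of the arbitrage logic in the paragraph above; the function and its name are invented for illustration, and the only figures used are the $10, $5, and $20 prices from the essay.)

# Hypothetical toy, not Andreessen's model: production of Widget X simply
# goes to whichever producer can offer it to consumers more cheaply.
def who_makes_widget_x(human_price: float, robot_price: float) -> str:
    return "robots" if robot_price < human_price else "humans"

print(who_makes_widget_x(human_price=10.0, robot_price=5.0))   # robots: consumers pay $5 instead of $10
print(who_makes_widget_x(human_price=10.0, robot_price=20.0))  # humans re-enter at $10 if robot owners overcharge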

Just as increases in consumer goods prices disproportionately hurt the poor, holding back on robots eating jobs would also disproportionately hurt the poor. The same logic applies to trade barriers (import tariffs): These disproportionately hurt poor consumers by inflicting higher consumer goods prices. Therefore, with rare exceptions, there won’t be states where robots eat jobs and products get more expensive. They almost always get cheaper.

A Recessionary Interlude. Progressive and smart economist Jared Bernstein has explored the productivity puzzle of robots eating all the jobs (or not). He points out that productivity growth was up 1% last year, and has averaged 0.8% since 2011. But what he really focuses on is the smooth trend that tracks through the numbers. The trend suggests that the pace of productivity growth has decelerated since the first half of the 2000s. That begs an important question that the robots-are-coming advocates need to answer: why is a phenomenon that should be associated with accelerating productivity allegedly occurring over a fairly protracted period in which the [productivity] trend in output per hour is going the other way? My own take: we're still coming out of a severe macroeconomic down cycle, the credit crisis, deleveraging, and the liquidity trap. The prevailing pessimistic economic theories — the death of innovation, the crisis of inequality and yes, robots eating all the jobs — will fade with recovery.

(For bonus points, identify the other tech-driven economic force that could explain low productivity at a time of great tech advancement. My nomination: tech-driven price deflation, which lowers prices and reduces measured GDP and productivity while boosting consumer welfare.) Thought experiment: Imagine a world in which all material needs are provided for free by robots and material synthesizers. Housing, energy, health care, food and transportation — all delivered to everyone for free by machines. Zero jobs in those fields remain.

A planet of slackers you say? Not at all. Rather than nothing to do, we would have everything to do. Curiosity, artistic and scientific creativity have full rein resulting in new forms of status-seeking (!). Imagine 6 or 10 billion people doing nothing but arts and sciences, culture and exploring and learning. What a world that would be. The problem seems unlikely to be that we’ll get there too fast. The problem seems likely to be that we’ll get there too slow.

Andreessen completely ignores that capital investment in automation has a profit motive, not a humanitarian one.

Stick with me here. What would be the key characteristics of that world, and what would it be like to live in it? For starters, it's a consumer utopia. Everyone enjoys a standard of living that kings and popes could have only dreamed of. Since our basic needs are taken care of, all human time, labor, energy, ambition, and goals reorient to the intangibles: the big questions, the deep needs. Human nature expresses itself fully, for the first time in history. Without physical need constraints, we will be whoever we want to be. The main fields of human endeavor will be culture, arts, sciences, creativity, philosophy, experimentation, exploration and adventure.

Utopian fantasy you say? OK, so then what’s your preferred long-term state? What else should we be shooting for, if not this? Finally, note the thought experiment nature of this. Let’s be clear, this is an extrapolation of ideas, not a prediction for the next 50 years! And I am not talking about Marxism or communism, I’m talking about democratic capitalism to the Nth degree. Nor am I postulating the end of money or competition or status seeking or will to power, rather the full extrapolation of each of those.

Why Do I Believe That? First, robots and AI are not nearly as powerful and sophisticated as I think people are starting to fear. With my venture capital and technologist hat

Arrogance (continues on 31)


From the nature of my life, advanced infallibly in one direction and in one direction only. It was on the moral side, and in my own person, that I learned to recognise the thorough and primitive duality of man; I saw that, of the two natures that contended in the field of my consciousness, even if I could rightly be said to be either, it was only because I was radically both…


The "Digital Native," a Profitable Myth

Jathan Sadowski July 9, 2014

thebaffler.com/blog/2014/07/the_digital_native_a_profitable_myth

Technology buzzwords, although annoying, often seem innocuous enough. They're just catchy and trite enough to bleed into common usage, but just misleading and obfuscatory enough to discourage deep analysis. Two of the most widespread buzzwords and phrases to escape the tech world and infiltrate our general lexicon are the couplet "digital native" and "digital immigrant."

Unlike many buzzwords that buttress the latest, supposedly world-saving, innovations, their origins — and the definitions that have stuck with them — can be clearly traced to one point: A 2001 article written by the education consultant Marc Prensky. The article exists only to coin these dichotomous labels. And based on the article's astronomically high number of citations — 8,748 and growing, as indexed by Google, upon my latest check — it's safe to say Prensky succeeded.

According to Prensky, the arrival of digital technologies marked a "singularity" that changes everything so "fundamentally" that there's "absolutely" no turning back. This singularity has caused a schism in the population, especially in Prensky's realm: education reform.

"Students today," Prensky wrote in 2001, "are all 'native speakers' of the digital language of computers, video games and the Internet." By contrast, "Those of us who were not born into the digital world but have, at some later point in our lives, become fascinated by and adopted many or most aspects of the new technology are, and always will be compared to them, Digital Immigrants." (The typical birthdate cited for "digital native" status is 1980 and after.) Of course, Prensky admitted, some immigrants are going to adapt to their digital environment better than others; but compared to the natives they will always, to some degree, retain an "accent." Prensky wrote as if "digital" is the name of a country you're either born into with citizenship, or try desperately to enter with your virtual visa.

For any buzzword, we should ask what assumptions and generalizations using it obscures and who benefits from its propagation. These two particular labels are prime subjects for inquiry. In brief, they overlook socioeconomic differences, which exist within the younger generations, and do so in a way that creates lucrative business opportunities for education gurus. Sweeping statements about technological aptitude overshadow the actual differences in how — and how well — people use digital technologies. Proponents assume a natural, generational baseline where one doesn't necessarily exist.

When we take a look at the data and research, however, it becomes clear that the great divide between "digital natives" and "digital immigrants" is a puff of smoke — one that obscures the actual differences that other factors (like socioeconomic status, gender, education, and technological immersion) play in digital proficiency.

An academic article from The British Journal of Educational Technology last year, which critically examined the discourse around "digital natives," found that "rather than being empirically and theoretically informed, the debate can be likened to an academic form of a 'moral panic.'" The authors found that the commonly made claims are largely under-researched or just plain wrong when compared to the data. For instance, computer and web access, as well as activities like creating content on the Internet (keeping a blog or making videos), are in no way universal, and in some cases apply only to a usually affluent minority of the so-called "digital generation."

The problem here is not just that we turn people into caricatures when we paint them as a monolithic group of "digital natives" who are more comfortable floating in cyberspace than anywhere else. The larger issue is that, when we insist on generalizing people into a wide category based on birth year alone, we effectively erase the stark discrepancies between access and privilege, and between experience and preference. By glancing over these social differences, and just boosting new technologies instead, it becomes easy to prioritize gadgets over what actually benefits a diverse contingent of people. And those skewed priorities will be to the detriment of, say, less well-off groups who still lack the educational resources necessary to learn basic reading and writing literacy skills.

Even though claims about "digital natives" and "immigrants" don't rely on much, if any, empirical data or robust theory — and only gain legitimacy by stoking a sense of "moral panic" — there's plenty at stake for the cottage industry of education technology (aka, Ed-Tech) consultants. After all, if those dumb "digital immigrant" teachers don't do something now, then how can they possibly educate the savvy "digital natives"? Ed-tech proprietors are ready to jump at the chance to consult and sell schools new learning techniques and technologies — such as video games, which is supposedly the medium most familiar and engaging to "digital natives." That's all fine, except for the fact that video games are not so universally played in the first place — there's a large gender gap, for instance — and there's inadequate understanding among educators about how to actually use video games to foster learning rather than just entertainment. But of course the originators and biggest perpetuators of the "digital native" discourse — the ed-tech gurus and hucksters — aren't trying to make well-researched, academic claims. What matters most is that educators, school administrators, and parents believe there's a drastic divide in need of a bridge. And that bridge is usually built with expensive seminars, consulting fees, and technologies.

One reason that so many people are so ready to fall for the rhetorical devices that give "digital native" and "digital immigrant" sticking power is that we're all already primed to grant sanctified status to "digital" — to separate this current phase of development out from the broad history of technology. I wonder, did people talk about "industrial natives" and "industrial immigrants"? Were kids who grew up working in factories just "industrial natives" who were at home amongst the machines?

The "digital native" and "digital immigrant" buzzwords can describe the world only insofar as they describe a world that would benefit the Ed-Tech profiteers. As with any other fetishized innovation, it's worth keeping in mind that our initial introduction to (and understanding of) new technologies tends to come directly from the very people who stand to reap the profits from them. That alone is reason enough to be skeptical. x

Arrogance (continued from 27)

on I wish they were, but they're not. There are enormous gaps between what we want them to do, and what they can do.

What that means is there is still an enormous gap between what many people do in jobs today, and what robots and AI can replace. And there will be for decades.

Second, even when robots and AI are far more powerful, there will still be many things that people can do that robots and AI can't. For example: creativity, innovation, exploration, art, science, entertainment, and caring for others. We have no idea how to make machines do these.

Third, when automation is abundant and cheap, human experiences become rare and valuable. It flows from our nature as human beings. We see it all around us. The price of recorded music goes to zero, and the live music touring business explodes. The price of run-of-the-mill drip coffee drops, and the market for handmade gourmet coffee grows. You see this effect throughout luxury goods markets — handmade high-end clothes. This will extend out to far more consumers in the future.

Fourth, just as most of us today have jobs that weren't even invented 100 years ago, the same will be true 100 years from now. People 50, 100, 150, 200 years ago would marvel at the jobs that exist today; the same will be true 50, 100, 150, 200 years from now.

We have no idea what the fields, industries, businesses, and jobs of the future will be. We just know we will create an enormous number of them. Because if robots and AI replace people for many of the things we do today, the new fields we create will be built on the huge number of people those robots and AI systems made available. To argue that huge numbers of people will be available but we will find nothing for them (us) to do is to dramatically short human creativity. And I am way long human creativity. x

Obsession (continued from 14)

nine out of ten never earn a return.) They work a year here, a few months there — zany hours everywhere. They wear jeans and sneakers and ride scooters and share offices and sprawl on couches like Great Danes. Their coffee machines look like dollhouse-sized factories. They are told that they should be reckless and ruthless. Their investors, if they're like Josh Linkner, tell them that the world is a terrifying place, moving at a devastating pace. "Today I run a venture capital firm and back the next generation of innovators who are, as I was throughout my earlier career, dead-focused on eating your lunch," Linkner writes. His job appears to be to convince a generation of people to learn remorselessness. Forget rules, obligations, your conscience, loyalty, a sense of the commonweal. If you start a business and it succeeds, Linkner advises, sell it and take the cash. Don't look back. Never pause. Disrupt or be disrupted.

x


What is it, what nameless, inscrutable, unearthly thing is it; what cozening, hidden lord and master, and cruel, remorseless emperor commands me; that against all natural lovings and longings, I so keep pushing, and crowding, and jamming myself on all the time; recklessly making me ready to do what in my own proper, natural heart, I durst not so much as dare? Is Ahab, Ahab? Is it I, God, or who, that lifts this arm? But if the great sun move not of himself; but is as an errand-boy in heaven; nor one single star can revolve, but by some invisible power; how then can this one small heart beat; this one small brain think thoughts; unless God does that beating, does that thinking, does that living, and not I. By heaven, man, we are turned round and round in this world, like yonder windlass, and Fate is the handspike.



Why Futurologists Are Always Wrong
And Why We Should Be Sceptical of Techno-Utopians

Bryan Appleyard April 10, 2014

newstatesman.com/culture/2014/04/why-futurologists-are-always-wrong-and-why-we-should-be-sceptical-techno-utopians

In his book The Future of the Mind, the excitable physicist and futurologist Michio Kaku mentions Darpa, the American Defense Advanced Research Projects Agency, the body normally credited with creating, among other things, the internet. It gets Kaku in a foam of futurological excitement. "The only justification for its existence is … " he says, quoting Darpa's strategic plan, "… to 'accelerate the future into being'." This isn't quite right (and it certainly isn't literate). What Darpa actually says it is doing is accelerating "that future into being," the future in question being the specific requirements of military commanders. This makes more sense but is no more literate than Kaku's version. Never mind; Kaku's is a catchy phrase. It is not strictly meaningful — the future will arrive at its own pace, no matter how hard we press the accelerator — but we know what he is trying to mean. Technological projects from smartphones to missiles can, unlike the future, be accelerated and, in Kaku's imagination, such projects are the future.

Meanwhile, over at the Googleplex, the search engine's headquarters in Silicon Valley, Ray Kurzweil has a new job. He has been hired to "work on new projects involving machine learning and language processing."

~~~~~~~~~~~~~~~~

OK, Kurzweil doesn't say he is religious but, in reality, his belief system is structurally identical to that of the Southern hot gospellers who warn of the impending "Rapture," the moment when the blessed will be taken up into paradise and the rest of us will be left to seek salvation in the turmoil of the Tribulation before Christ returns to announce the end of the world. Kurzweil's idea of "the singularity" is the Rapture for geeks — in this case the blessed will create an intelligent computer that will give them eternal life either in their present bodies or by uploading them into itself. Like the Rapture, it is thought to be imminent. Kurzweil forecasts its arrival in 2045.

Kaku and Kurzweil are probably the most prominent futurologists in the world today. They are heirs to a distinct tradition which, in the postwar world, has largely focused on space travel, computers, biology and, latterly, neuroscience.

Futurologists are almost always wrong. Indeed, Clive James invented a word — “Hermie” —  to denote an inaccurate prediction by a futurologist. This was an ironic tribute to the cold war strategist and, in later life, pop futurologist Herman Kahn. It was slightly unfair, because Kahn made so many fairly obvious predictions —mobile phones and the like — that it was inevitable quite a few would be right.

Even poppier was Alvin Toffler, with 1970's Future Shock, which suggested that the pace of technological change would cause psychological breakdown and social paralysis, not an obvious feature of the Facebook generation. Most inaccurate of all was Paul R. Ehrlich who, in The Population Bomb, predicted that hundreds of millions would die of starvation in the 1970s. Hunger, in fact, has since declined quite rapidly. Perhaps the most significant inaccuracy concerned artificial intelligence (AI). In 1956 the polymath Herbert Simon predicted that "machines will be capable, within 20 years, of doing any work a man can do" and in 1967 the cognitive scientist Marvin Minsky announced that "within a generation … the problem of creating 'artificial intelligence' will substantially be solved." Yet, in spite of all the hype and the dizzying increases in the power and speed of computers, we are nowhere near creating a thinking machine.

Such a machine is the basis of Kurzweil's singularity, but futurologists seldom let the facts get in the way of a good prophecy. Or, if they must, they simply move on. The nightmarishly intractable problem of space travel has more or less killed that futurological category and the unexpected complexities of genetics have put that on the back burner for the moment, leaving neuroscientists to take on the prediction game. But futurology as a whole is in rude health despite all the setbacks.

Why? Because there's money in it; money and faith. I don't just mean the few millions to be made from book sales; nor do I mean the simple geek belief in gadgetry. And I certainly don't mean the pallid, undefined, pop-song promises of politicians trying to turn our eyes from the present — Bill Clinton's "Don't Stop Thinking About Tomorrow" and Tony Blair's "Things Can Only Get Better." No, I mean the billions involved in corporate destinies and the yearning for salvation from our human condition. The future has become a land-grab for Wall Street and for the more dubious hot gospellers who have plagued America since its inception and who are now preaching to the world.

Take the curious phenomenon of the Ted talk. Ted (Technology, Entertainment, Design) is a global lecture circuit propagating "ideas worth spreading". It is huge. Half a billion people have watched the 1,600 Ted talks now online. Yet the talks are almost parochially American. Some are good but too many are blatant hard sells and quite a few are just daft. All of them lay claim to the future; this is another futurology land-grab, this time globalised and internet-enabled.

~~~~~~~~~~~~~~~~

Astrophysics run on the model of American Idol is a recipe for civilisational disaster.

Benjamin Bratton, a professor of visual arts at the University of California, San Diego, thinks the Ted style evades awkward complexities and evokes a future in which, somehow, everything will be changed by technology and yet the same. The geeks will still be living their laid-back California lifestyle because that will not be affected by the radical social and political implications of the very technology they plan to impose on societies and states. This is a naive, very local vision of heaven in which everybody drinks beer and plays baseball and the sun always shines.

The reality, as the revelations of the NSA's near-universal surveillance show, is that technology is just as likely to unleash hell as any other human enterprise. But the primary Ted faith is that the future is good simply because it is the future; not being the present or the past is seen as an intrinsic virtue.

Bratton, when I spoke to him, described some of the futures on offer as "anthrocidal" — indeed, Kurzweil's singularity is often celebrated as the start of a "post-human" future. We are the only species that actively pursues and celebrates the possibility of its own extinction.

~~~~~~~~~~~~~~~~

For it is clear from The New Digital Age (written by Google's executive chairman, Eric Schmidt) that politicians and even supposedly hip leaders in business will have very little say in what happens next. The people, of course, will have none. Basically, most of us will be locked in to necessary digital identities and, if we are not, we will be terrorist suspects. Privacy will become a quaint memory. "If you have something that you don't want anyone to know," Schmidt famously said in 2009, "maybe you shouldn't be doing it [online] in the first place." So Google elects itself supreme moral arbiter.

Tribalism in the new digital age will increase and "disliked communities" will find themselves marginalized. Nobody seems to have any oversight over anything. It is a hellish vision but the point is that it is all based on the assumption that companies such as Google will get what they want — absolute and unchallengeable access to information.

~~~~~~~~~~~~~~~~

One last futurological, land-grabbing fad of the moment remains to be dealt with: neuroscience. It is certainly true that scanners, nanoprobes and supercomputers seem to be offering us a way to invade human consciousness, the final frontier of the scientific enterprise. Unfortunately, those leading us across this frontier are dangerously unclear about the meaning of "scientific."

~~~~~~~~~~~~~~~~

Yet neuroscience — as in Michio Kaku's manic book of predictions — is now one of the dominant forms of futurological chatter. We are, it is said,

on the verge of mapping, modeling and even replicating the human brain and, once we have done that, the mechanistic foundations of the mind will be exposed. Then we will be able to enhance, empower or (more likely) control the human world in its entirety. This way, I need hardly point out, madness lies. The radicalism implied, though not yet imposed, by our current technologies is indeed as alarming to the sensitive as it is exciting to the geeks. Benjamin Bratton is right to describe some of it as anthrocidal; both in the form of “the singularity” and in some of the ideas coming from neuroscience, the death of the idea of the human being is involved. If so, it is hard to see why we should care: the welfare of a computer or the fate of a neuroscientifically-specified zombie would not seem to be pressing matters. What matters is what our current futurologies say about the present. At one level, they say we are seriously deluded. As Bratton observes, the presentational style of Ted and of Gladwell involves embracing radical technologies while secretly believing that nothing about our own cherished ways of life will change; the geeks will still hang out, crying “Woohoo!” and chugging beer when the gadgets are unveiled. x

It seems to be the fate of man to seek all his consolations in futurity. The time present is seldom able to fill desire or imagination with immediate enjoyment, and we are forced to supply its deficiencies by recollection or anticipation. – Dr Johnson, 1752




dialectic re-occurrence


Reconstitution (continued from 21)

what we’ve undergone. Things happen to us but without us, because of their speed — they slip right past the part of us we think of as ourselves. ~~~~~~~~~~~~~~~~

Noticing takes time, plus a desire, plus an inclination, especially in a world where there is too much to see: To consider something deeply a dozen other things must be concertedly ignored.

Imagine a pair of girls working on an assembly line at a bottling plant. When they interact, they miss a bottle, and once they've missed the first, all goes to bedlam; bottles pile up like something out of vaudeville. The machine is bigger — it chooses the pace. Trying to keep up turns into slapstick.

The 21st century is bigger and faster than we are. To consider a single bottle, we must be content to miss many others, which feels irresponsible: we've been trained to keep up. Concentration makes us different to the point of eccentricity. And So I Started To Feel Eccentric, And To Appear Eccentric.

A friend said, "I don't mind that you fail to answer my calls; when we're together, you ignore the calls of others and give me your complete attention." This approach has been hard to explain to my parents. My mother answers the telephone no matter when it rings. She is trying to keep up, bottle after bottle: if she misses one, she knows the conveyor belt won't pause.

I've taken to reading fiction from the American 1950s, that foreign place where living rooms match and poem-lines are written from scratch, not borrowed; where small things happen with heavy resonance because a person, if he is so inclined, has the time to interact with them, to determine their weight or decide to lend them such. How strange collage art must have seemed back then, how exciting, how beautiful, which is how the 1950s feel to me. The refrain of the contemporary world seems to be: I didn't realize he meant it as a joke. He was being serious. She was actually mad. She meant it ironically — I can't derive tone from an email.

Emails are written quickly, so that parts of grammar are sacrificed, the parts of grammar that help us determine how a message should be interpreted; and they're consumed quickly, so the time required to gel their meaning is truncated.

We send and receive an incredible volume of correspondence, but what of it? Wouldn't a fraction of the number of messages, prepared and consumed with care, ultimately succeed at communicating more? Aren't We Just Decorating Our Screens With Meaningless Symbols, Filling Our Time With The Scanning Of Them, And Loading Our Conveyor Belt With Useless Bottles?

Said Rauschenberg, "The artist's job is to be a witness to his times." But when artistic techniques escape the museum, artist and times become jumbled; movements and counter-movements, past and present weave themselves to a thing so big it becomes invisible, so intricately sewn its stitches disappear, and which, a huge collage itself, is a different thing at different distances, until the museum is everywhere, until there aren't museums. Without our best attention, the mess can own us. We own the mess when we consider it, like collages, from various distances, with our most thoughtful scrutiny, the kind we save for art. x


