April 8, 2015


Est. 1948

Volume 67 | Issue 9

the pace press thepacepress.org

Jon Huntsman would’ve been President if our democracy weren’t broken KOBE Y. JACOBS Staff Writer As the din surrounding the 2016 primaries pipes up, it gives us a chance to reflect on the structural deficiencies in our politics. When you hear complaints of gridlock in Washington and of partisanship grinding the wheels of our 200-plus-year-old democracy to a halt, please remember that our politicians did not suddenly decide to wreak havoc on a legislative process that had worked smoothly to that point. This is a structural deficiency in the system, and it can be altered. The explanation you often hear is that voters are becoming more polarized as media outlets become more numerous and, thus, specialized (and by specialized I mean partisan). You will hear people lament the days when there were three broadcast channels and Walter Cronkite reported the straight facts because, as the central source of news for most Americans, he had no choice. Granted, this is a perfectly legitimate argument––I have made it myself. But there is no going back to that: Fox News and MSNBC, or their likenesses, are here to stay as polarizing forces. What is important is not just that they are polarizing, but whom they are polarizing. The central argument of the book “Culture War?: The Myth of a Polarized America” is that, despite the data indicating increasing polarization, it is what the authors call the “political class” that is polarized. This political class refers to officeholders, political journalists, and anyone who “constitutes the public face of politics in contemporary America.” Thus, the political class and not the

continued on PAGE 3



before the word existed

Monica Lewinsky owns it. continued on PAGE 8


INSIDE @thepacepress




Yes, again: White officer shoots and kills unarmed black man Page 5

What Kendrick Lamar’s album does differently

What “The Dress” really showed us about pop culture. Page 8


Page 6







THE PACE PRESS EDITORIAL BOARD
Erick Mancebo, Editor-in-Chief
Christian Gomez, Executive Editor
Marc Saggese, Advertising Manager
Maximilliano Onofre, News Editor
Melissa Vargas, Arts Editor
Courtney Michelle Johnson, Features Editor
Brianda Agramonte, Distribution Manager
Dr. Stephanie Hsu, Faculty Advisor

The Pace Press is the student newspaper of Pace University’s New York City campus. It is managed and operated entirely by members of the student body listed above. The Pace Press welcomes guest editorials and letters from students, faculty, administration and staff. The Pace Press reserves the right to not publish any submitted material, both solicited and unsolicited. All submissions must include the author’s full name and contact information.

The Pace Press
41 Park Row, Rm. 902
New York, NY 10038
thepacepress.org
editor@thepacepress.org
Copyright 2015

DR. LEE EVANS Faculty Contributor I recently showed my Fundamentals of Music class at Pace University a one-hour Young People’s Concert film on the subject “What Is an Interval?,” narrated by Leonard Bernstein and featuring the New York Philharmonic Symphony Orchestra under his baton. By way of introduction to the film, I asked my class of twelve students if they knew who Bernstein was. Not one student recognized his name! But when I told the class that among his many accomplishments he had composed the music for the Broadway show West Side Story, I finally detected a slight stirring of recognition. This revelation got me thinking that in their previous schooling these young college-age students had been ill-served when it came to music education, or perhaps I should say its glaring absence from many high school curricula. It seems that somewhere along the line certain educational planners came to the unfortunate conclusion that music education was not at all crucial to one’s intellectual and emotional development. Did it all boil down to money, I wondered? The arts, you see, have always been among the first subjects to be shortchanged in a financial crunch.

AN AESTHETICS COMPONENT TO THE CORE CURRICULUM It has been the wisdom of enlightened educational experts over many decades that there should be an aesthetics component to the core curriculum – music, art, or some other cultural area – because it was deemed vital to students’ intellectual growth and maturity that they be exposed to and learn about at least one area of the incomparable cultural heritage of our world. These experts also believed that a college education should not merely be a training ground for future jobs, but an essential opportunity for students to develop their intellectual and aesthetic capabilities in order to bring greater depth and meaning to their lives.
I feel that students have a responsibility to themselves to enhance their life experience by learning about, developing insights into, and acquiring an appreciation of history’s great musical art, in order to further themselves intellectually and aesthetically, and then to pass on the benefits of that growth to their children so that their world will be enriched as well.

POP CULTURE Many of today’s young men and women are steeped in pop culture and pop music, often to the almost total exclusion of any other musical genre. Moreover, they were deprived of meaningful aesthetic education in the lower grades by inadequate school systems and budget cuts that downplayed, or even virtually eliminated, arts programs in pre-college educational institutions throughout the country. Important composers and musicians have enriched the human condition with their magnificent creations and innovative musical ideas. Among them have been such highly significant figures as the American-born Leonard Bernstein, known to many as “Lenny.”

WHO WAS LEONARD BERNSTEIN? Leonard Bernstein (1918–1990) was the leading musical icon of the latter half of the 20th century, a protean musician and the most consummate teacher of music worldwide. He chose to attend Harvard as an undergraduate, rather than study at a music conservatory, in order to receive a broader education that would include philosophy, linguistics and more. It was not until his graduate studies that he entered the Curtis Institute in Philadelphia,

one of the world’s leading music conservatories, and it was there that he studied conducting with Fritz Reiner, one of history’s most famous and respected symphony orchestra conductors. David Hurwitz, author of the book Bernstein’s Orchestral Music (Amadeus Press, 2011), says of Bernstein: “His personality was so huge, so voracious and so very public, that it’s quite easy to lose the music in considering the man.” Anyone watching his many Young People’s Concerts television broadcasts, in which he taught various aspects of music from classical to jazz, will surely come away sharing that assessment. His six classic Norton Lectures at Harvard in 1973, still available for purchase on DVD under the title The Unanswered Question, also strongly bear out Hurwitz’s statement.

NEW YORK PHILHARMONIC SYMPHONY ORCHESTRA Bernstein was appointed Assistant Conductor of the New York Philharmonic Symphony Orchestra in 1943. Assistant conductors must learn the music for each concert as thoroughly as the conductor does, in order to be ready to step in at the last minute should the regular conductor become indisposed, which hardly ever happens. One day that same year, however, conductor Bruno Walter became ill at the eleventh hour, and Bernstein was asked to fill in – without rehearsal! – on a nationally broadcast concert from NYC’s Carnegie Hall. Bernstein made a sensational debut that brought the house down, and the event became the featured story on the front page of the New York Times the next day. His subsequent brilliant career in music was assured, and in short order Bernstein became America’s leading musical figure. He was eventually appointed the official conductor of that august orchestra, serving in that capacity from 1958 to 1969 and as Laureate Conductor from 1969 to 1990. Overall, he conducted 1,244 concerts and made more than 200 recordings with the orchestra.
BERNSTEIN AS COMPOSER By general consensus, Bernstein’s most successful original music was written for the theater, rather than in larger abstract forms such as his three symphonies, though they too are theatrical in quality. Bernstein himself says, in his preface to the score of his Symphony No. 2: “If the charge of ‘theatricality’ in a symphonic work is a valid one, I am willing to plead guilty. I have a deep suspicion that every work I write, for whatever medium, is really theater music in some way…” One of the best film scores ever written was Bernstein’s score for the 1954 film On the Waterfront, starring Marlon Brando, Rod Steiger and Lee J. Cobb. His 1956 operetta/Broadway musical Candide and his 1957 West Side Story were Broadway musical milestones. The current Broadway revival of his musical On the Town is a huge hit. Other noteworthy works recommended for listening, in order to develop a better familiarity with his oeuvre, include his 1971 Mass: A Theater Piece for Singers, Players and Dancers, his 1949 Prelude, Fugue and Riffs for clarinet and jazz ensemble, and his 1965 Chichester Psalms for choir, male alto and orchestra.

END NOTE Bernstein’s father was very much against his son becoming a professional musician. Lenny nevertheless persisted and soon became the world’s most famous and respected musician. In later years, his father was cheekily asked, “Well, what do you think now of your son and his great success?” He replied: “How did I know he would become LEONARD BERNSTEIN?”

Lee Evans, Ed.D., is a professor of music at NYC’s Pace University. He is the author of over 100 music books in the U.S., 38 in Japan, and 2 in the former Soviet Union. His most recent book is the acclaimed Crash Course In Chords (Hal Leonard Publishing), a foundation theory and performance workbook for pianists and non-pianists alike. For additional information, please visit www.leeevansjazz.com.





How campaign finance and the primary system drive our elections to partisan extremes KOBE Y. JACOBS Staff Writer

continued from FRONT PAGE “mass public” are the ones who are polarized; the people they polarize are the extreme voters, not the “mass public.” (If you are interested, the book supports this with a great deal of empirical data.) How then does the mass public, which is more moderate than we imagine, elect an extreme political class? That is where the structural deficiency comes in. The answer is Jon Huntsman. I am aware that the majority––if not the totality––of my readership has no clue who this is, but bear with me and you will come to learn of his importance. Huntsman is the former Governor of Utah, former U.S. ambassador to China, and a 2012 Republican primary candidate. You have never heard of him because he received a very small share of the vote and lacked the funding to continue his race––a point we shall return to. What is his importance? In the most objective sense possible, I will argue he is the ideal presidential candidate for America. (Before you accuse me of advertising my personal preferences, I will have you know I say this despite the fact that he and I do not share the same politics.) His policy positions––though certainly right of center––are fairly moderate. Following his 2012 run, he became a national leader of No Labels, a group founded to promote commonsense, nonpartisan solutions to public policy problems. Furthermore, he faithfully served as ambassador even though the president––President Obama––was of the opposing party. He thus has a proven track record of commonsense, nonpartisan thinking, which is exactly what the country needs amidst our partisan crisis. (Additionally, he is fluent in Mandarin, making him well positioned to improve diplomatic relations with the world’s other economic superpower.) So how does he answer the question posed at the end of the last paragraph?

He represents a well-qualified, moderate candidate who, in a well-functioning democracy, should have done well enough in the last election for you to know who he is. Because this is what he represents, I can use him to show that our political system is structurally skewed toward creating a partisan political class. Though there are more elements of our system than I have space to discuss, I shall pick two: campaign finance and the primary system. I mention campaign finance first because it comes first chronologically. It is what is referred to as the “invisible primary,” or the money primary. The authors of The Party Decides define it as “a long-running national conversation among members of each party coalition about who can best unite the party and win the next presidential election.” This process revolves around elite donors, because candidates who don’t have the support of the establishment (or “party coalition”) donors won’t make it very far. (This has become far more pronounced since the Citizens United decision essentially eliminated legal limits on political donations. In 2012, just 132 Americans––0.000042 percent of the population––donated sixty percent of all super PAC money. These are the party elites we refer to in the invisible primary.) In this system, well-qualified moderates like our Jon Huntsman are disqualified before the race even begins. Extreme donors––and you have to be pretty extreme to be willing to donate such large sums to a presidential candidate––won’t push for a candidate they see as willing to compromise with the opposition. They are also looking for someone who will appeal to extreme voters, because that is who the primaries attract. Which brings us to our next point: the primary (or caucus) system, as is, attracts only the most fervent supporters of a political party. This is who candidates need to appeal to in order to even have a chance at reaching the general election.
Because the first states are the most important (if you do not succeed in the first few states you are doomed, because donors will pull support from a candidate who does not show early promise), extreme candidates fare better early on. This maintains the vicious cycle of donors skewing candidates toward the extremes. (Not to mention that this gives undue influence to voters whose states vote early, which results in a process called front-loading: states competing to move their primaries to the earliest date.) Therefore, if the hypothesis of the book mentioned above, Culture War?, is correct, and the “mass public” is more moderate than we believe, we end up with a polarized political class because the early stages of our nomination process––the invisible primary and the actual primaries (or caucuses)––are structurally designed to present the mass public with extremist choices rather than moderates like Jon Huntsman. Jon Huntsman, who represents the kind of moderate we need to unite our country through commonsense policy and compromise, is widely unknown because the system is built to brush aside people who do not satisfy the extreme donors and voters. So how might we remedy this? If the problem is that the extreme political class presents the moderate mass public with extremist options, then we need to create a system that engages the mass public early on––that is, before they have no choice. Of the various proposals to amend campaign finance, the Grant and Franklin Project promoted by Harvard Law professor Lawrence Lessig (a quick search online will turn up more information) offers the greatest chance of engaging the mass public. Because each citizen is allotted the same amount of money in the form of a voucher, the plan incentivizes politicians to engage as many ordinary citizens (i.e., the “mass public”) as possible in order to raise enough money. If they do not engage both the polarized voters and the mass public, they risk leaving large sums of money untapped––something no politician would do.

Pace Press pitches for writers are now available online! Faster, more efficient. Visit: thepacepress.org/online-pitches/
With respect to the primary system, might I suggest a single, nationwide primary (to remove the undue influence of the first states) that pools all candidates into an alternative-vote (i.e., instant-runoff) system. This would incentivize voters to take risks on candidates who do not seem widely supported by the establishment, because their votes would not go to waste if their candidate of choice did not win; each vote would simply be reallocated to the voter’s second choice. The great irony is that polarization is likely to block any proposal that would restructure our system to reduce polarization and gridlock. Constitutional amendments demand a high standard of consensus under either of the two avenues available. If Congress currently has trouble building consensus on formerly routine procedures, it is certainly not in a position to build consensus on major democratic reforms. The other option, a constitutional convention, or Article V convention, is an untested and loosely defined method with equally or more stringent thresholds. As it turns out, in order to fix the system we must work through the broken system. However, in the words of political journalist Ezra Klein, “Pessimism shouldn’t be considered fatalism. And impossible fights have been won before.”

DISCLAIMER: These opinions are expressed by contributors (students, faculty, administration and staff) to The Pace Press. They are solely those of the individual writers and do not reflect the opinions of The Pace Press, the members of The Pace Press staff or Pace University. The Pace Press is not responsible for, and expressly disclaims all liability for, damages of any kind arising out of the use of or reliance on any information contained in this section.





Jon Stewart replacement faces immediate backlash SARAH HARTZELL Editorial Intern


When Jon Stewart announced in February that he would be leaving “The Daily Show” after sixteen years at the helm, viewers immediately began speculating about who would fill the shoes of the revered late-night host. Of all the names thrown around, the one chosen was certainly not who most people expected: Trevor Noah, the South African comedian who joined the show in December. Noah has appeared on the show only three times, significantly fewer than other “Daily Show” correspondents like Samantha Bee, Jason Jones and Jessica Williams, all of whom were rumored to be Stewart’s successor. Though relatively unknown in the United States, Noah is a well-known comedian in his home country of South Africa and abroad. He has hosted a variety of radio and television programs, including his own late-night show, “Tonight with Trevor Noah.” Since coming to America, he has appeared on “Late Show with David Letterman” and “The Tonight Show with Jay Leno”––the first South African to do so––before becoming a contributor for “The Daily Show.” He has also had a stand-up special on Showtime, “Trevor Noah: African American.” Though he has appeared on “The Daily Show” only a handful of times, his commentary has made headlines every time: he has criticized America’s reaction to the Ebola crisis, Boko Haram and competitive chess players from the perspective of a South African national. Since the announcement was made on March 30, however, Noah has made headlines for less positive reasons. Several of the comedian’s tweets, some dating as far back as 2009, have been making the rounds online for their anti-Semitic and misogynistic content. Noah responded to the controversy by tweeting, “To reduce my views to a handful of jokes that didn’t land is not a true reflection of my character, nor my evolution as a comedian.” Comedy Central also released a statement standing by its choice: “Like many comedians, Trevor Noah pushes boundaries; he is provocative and spares no one, himself included.
To judge him or his comedy based on a handful of jokes is unfair. Trevor is a talented comedian with a bright future at Comedy Central.”

Noah’s appointment is the latest in a major shakeup of the late-night lineup, making him the sixth new host in just over a year. Stephen Colbert left his Comedy Central show “The Colbert Report” in December to take over for Letterman on “Late Show.” Colbert’s time slot was filled by “The Daily Show” alum Larry Wilmore with “The Nightly Show.” Wilmore’s appointment was significant, as it made him the first person of color to host a major late-night program since Arsenio Hall. Noah’s continuation of this progress is significant in its own right. “He is a great comedian and it’s well-deserved,” says Steve Hines Williams. “But it’s also a note of the change in our culture. We have two African-Americans doing nightly shows back-to-back.” The rising diversity on television is undeniable, and the trend has certainly become apparent in Comedy Central’s lineup. “The Nightly Show” regularly discusses race and other controversial issues and features a diverse panel of guests on each show. While “The Daily Show” appears to be bringing a change in perspective as it enters its landmark 20th season, the same level of diversity does not seem to be extending to network talk shows. Despite the number of hosting changes, all five late-night shows on ABC, CBS and NBC are still hosted by white men. Though many viewers clamored for a female replacement for Leno, Letterman, Colbert or Stewart, none of the openings was filled by a woman. Nevertheless, Noah’s appointment as the third host of “The Daily Show” has drawn mixed reactions from fans. “I’d never heard of him before it was announced that he was a replacement,” says freshman Nina Tandilashvili. “I was sort of confused and unsure what made him qualified, especially as someone rather unknown. I read more about him and found him to be a really interesting guy who can probably bring a more worldly perspective on things... I’m looking forward to him taking over.” Not all fans are so optimistic, though.
“No one can replace Jon Stewart and do what he did,” Koula Von Hoppe said. A date has not yet been set for Noah’s debut, but it is likely he will take over in late 2015 or early 2016.

Pace Law School launches tuition-matching program NATALIE CONDRILLO Editorial Intern Earlier this month, Pace Law School launched a tuition-matching program, the first of its kind in the nation. A legal education has become a major investment, as law degrees come with heavy price tags. Pace Law, the University’s own law school in White Plains, is among the more affordable options, yet still comes at a steep cost. The program will give law students access to a more affordable legal education, saving them tens of thousands of dollars, and takes effect for the 2015–2016 academic year. The school has also placed a freeze on tuition, meaning costs will not rise at all next year. Currently, tuition at Pace Law is around $45,000. Higher education in general is an expensive route to pursue: law schools in the New York area can charge over sixty thousand dollars in tuition alone, even more for a private education.

The move to lower rates also reflects the fact that fewer students are eager to enter the legal field. The appeal of law school in general is currently very low, tarnished by heavy debt and scarce, competitive jobs. Many schools have already lowered LSAT score requirements, offered online courses and implemented other incentives in order to attract more applicants. Pace Law School has also increased financial aid, scholarships and grants, making a continuous effort to turn a once-in-a-lifetime legal opportunity into an affordable one. The school also offers loan forgiveness for graduates who work in public service. “The affordability of education has become a critical issue and this program is a unique approach to making a first-rate legal education more accessible to students across the country,” declared Dean David Yassky.






Officer charged with murder in shooting death of man

The New York Times ERICK MANCEBO Editor-In-Chief


A video that surfaced Tuesday night showed a white North Charleston, S.C., police officer shooting and killing an unarmed black man as he ran from the officer. The micro-blogging service Twitter erupted with links to the video Tuesday night as the officer—Michael T. Slager, 33—was charged with murder, his earlier version of events having been deemed false. Slager had previously said he “feared for his life because the man had taken his stun gun in a scuffle after a traffic stop on Saturday,” according to the New York Times. The video instead shows a brief physical interaction, the suspect—Walter L. Scott, 50—running, and the officer pointing his gun and firing eight shots. As Scott falls to the ground, Slager picks up something from the ground near the initial point of interaction and drops it near Scott’s body. Police reports also indicate that officers performed CPR and first aid on Scott, another point disputed by the video evidence. The shooting, of course, follows a year of extremely high-profile

and volatile cases of white officers killing black men, with varying responses: from the militarized and violent clashes in Ferguson, Mo., following Mike Brown’s death, to the relatively peaceful protests against the non-indictment of the NYPD officer whose illegal chokehold resulted in the death of Eric Garner, a New Yorker. North Charleston Mayor Keith Summey announced the murder charges at a Tuesday night press conference, saying, “...If you make a bad decision, don’t care if you’re behind a shield or just a citizen on the street, you have to live by that decision.”

Selfie-obsessed generation can’t tell whether selfie in front of tragic explosion site is inappropriate EMMA TAUBENFELD Contributor The headline “Village Idiots” filled the front page of last Sunday’s New York Post, publicly and unapologetically shaming the tourists who decided that taking a selfie at the East Village explosion site, where two people died, was appropriate. The scathing reports and attacks on those who seemingly lost all sense of respectability in a time of tragedy continued throughout the week on social media, as those who kept posting selfies from the explosion site incurred the wrath of New Yorkers. With the rise of oversharing social media sites such as Snapchat and Instagram, the selfie has become a pervasive part of our culture. The word “selfie” has even recently become a permanent addition to the English dictionary. It has become socially acceptable to take selfies anywhere, from school to your favorite coffee shop to your living room couch. But these selfies are more than just a spark for outrage; they are the intersection of our sharing-driven social environment with our fascination with—or desensitization to—gripping and tragic news. To the generation that watched the Twin Towers burn and crumble from their elementary school classrooms, it seems that anywhere is suitable for a selfie, even tragic settings such as funerals or memorials. The same gas explosion in the East Village made an appearance on Snapchat’s New York City “story”—the app’s assemblage of user-generated content for a specific geographic area. Photos and videos of the fire, of police and fire vehicles, and of large crowds gawking at the scene with camera phones out filled the stream, arguably serving an important informational purpose.
But there is a fine line between documenting a tragic event and snapping a picture for one’s own shallow benefit, such as having a unique photo to upload to Instagram or an original shot to attach to a heartwarming Facebook post about the disaster. Funeral selfies have grown rapidly in popularity as people snap quick pictures of themselves with a loved one’s coffin or grieving relatives in the background.


There is even a blog on the popular blogging website Tumblr, entitled Selfies at Serious Places, featuring photos of people at the 9/11 memorial, at Holocaust memorials including the infamous Auschwitz, and even at Pearl Harbor. Some individuals seem to be confused about which circumstances are appropriate for taking pictures of themselves. Even President Obama was caught taking a selfie at the Nelson Mandela memorial with Danish Prime Minister Helle Thorning-Schmidt and British Prime Minister David Cameron. University student Cheyenne Larose says, “I don’t think people should be taking pictures at these places because these are places meant for respect and not for tourism. I see it all the time at the 9/11 memorial.” Taking selfies has been associated with narcissism, according to researchers. That does not mean that everyone who takes selfies has narcissistic personality disorder, but frequent selfie-takers have scored higher on measures of narcissism. In a way, we are objectifying ourselves. People who post many pictures of themselves have also scored higher on measures of antisocial traits. The social media sites that are supposed to connect people may actually be eroding society’s social skills, reducing interaction to one person and a screen. Taking photos at tragic events takes the focus away from those who are supposed to be honored and places it on the photographer.





Review: Kendrick Lamar changes hip-hop. Again. ALVARO GAMBOA Contributor With his sophomore release “To Pimp A Butterfly,” Kendrick Lamar paints a picture of the troubles of today’s society on the canvas he knows best. The 16-track body of work tackles hot-button issues such as racial discrimination, the institutionalization of young black males and police brutality; the album’s second single, “The Blacker The Berry,” was directly inspired by the killing of Trayvon Martin. “To Pimp A Butterfly” also opens a window onto Kendrick Lamar’s inner life, leading through dark thoughts that many people have but are afraid to ever make public. While many artists suffer sophomore slumps, Kendrick Lamar may just have made another classic: “To Pimp A Butterfly” has been met with universal acclaim, surpassing even Kanye West’s “My Beautiful Dark Twisted Fantasy” as the most critically acclaimed album of the decade. Alongside race and police brutality, the album takes on homelessness, Kendrick’s

meteoric rise to fame and riches, and the taboo subjects of self-loathing and suicide. Spoken word, funk, live jazz and elements of reggae are sprinkled brilliantly throughout the album. The sounds showcased are refreshing, as Kendrick introduces a new generation to genres they may be unfamiliar with. “It’s like OutKast meets NWA,” said sophomore Khabari Phillips. The album’s lead single “i” serves as an anthem of self-love and black power meant to uplift and motivate the masses. “u,” posed here as the opposite of “i,” lets us into Kendrick’s negative conscience as he battles with how his life has changed for the worse. Even in today’s society, admissions of low self-esteem remain taboo; rappers are expected to be full of bravado. Kendrick challenges that by letting his sentiments be known in a drunken rage as he sips from his bottle and berates himself and his actions. The spoken-word poem Kendrick threads throughout the album deals with the troubles he has faced up to this point and how he battled them. The battle is shown in spurts

as the outro of select songs and is the main purpose of “U.” He finally expands upon his poem and finishes it in the last track, “Mortal Man.” It is in “Mortal Man” that Kendrick has an impromptu interview with his idol and West Coast Legend, Tupac Shakur. Kendrick has always cited Tupac as an inspiration and many have made comparisons of the two. At the conclusion of the album, we have the revelation of the album’s title through spoken word delivered by Kendrick. The metaphor of the caterpillar (here Lamar) morphing into the butterfly after wanting more for himself and his city and the butterfly being exploited by others for its beauty is a wonderful allusion to the Harper Lee classic, “To Kill A Mockingbird.” Upon release, the album has received universal acclaim and heralded an instant classic. Whether or not To Pimp A Butterfly can withstand the test of time is an argument future generations can have for themselves. The album’s sole purpose is to be a mirror of society.


"Experimentation" debuts ASIA LETLOW Editorial Intern

Nicholas Mayfield is a University freshman who plans to pursue acting and directing. He wrote the show "Experimentation," produced by StandUp Productions, which premiered this semester at the Pace Performing Arts building. He spoke to The Pace Press about the experience of putting on the show.

The Pace Press: What was "Experimentation"? Was it a show or a monologue?
Nicholas Mayfield: "Experimentation" is a full-length new play that I began working on about two years ago.
TPP: What was the show about?
NM: Thomas, 17, has two crushes: one on the art of slam poetry and one on a boy named Matthew who (unfortunately) happens to be heterosexual. This is the story of how these two crushes intersect, and what stems from his little experiment.
TPP: What message were you trying to convey?
NM: One intent was to tell an LGBTQA story other than those that traditionally tell of discrimination or "coming out" stories.
TPP: What inspired you for this show?
NM: I wrote a monologue at a summer intensive and it just snowballed and snowballed into a play.
TPP: What inspired you to go into acting?
NM: I was in a drama summer camp as a kid, and from then I auditioned for Booker T. Washington High School for the Performing and Visual Arts, a performing arts high school in Dallas.
TPP: What is the significance of self-made and/or "one-man-band" shows?
NM: There is a dire need for new work that tells stories of today, and self-made shows provide a good format for that.
TPP: How did you perform it?
NM: We performed it as a fully blocked staged reading, using iDevices to keep constantly updated scripts.
TPP: What elements did you use in presentation? Was there an elaborate backdrop, or was the stage kept simple?
NM: We used four chairs and all black coats as costumes (the space was cold).





With “Glee,” a particular era of television says goodbye

watchingthewasteland.com SARAH HARTZELL Editorial Intern

All good things must come to an end, as a number of popular television shows have in the last few months. While the brunt of media attention has been focused on new shows like "Empire" and "How to Get Away with Murder," fan favorites have been bowing out, some gracefully and some not so gracefully.

After twelve seasons, CBS' "Two and a Half Men" aired its series finale on Feb. 19 to more than 13 million viewers. The show was no stranger to headlines, with former star Charlie Sheen attracting negative press for his erratic behavior, but the final episode drew heavily mixed reactions. In a self-aware twist, the episode poked fun at Sheen's eccentricity and at the wild success the show has had over the last twelve years. It attempted to explain Sheen's disappearance through a convoluted plot that resulted in his character being presumed dead. He was, in fact, not dead, and returned at the end of the episode only to be crushed by a falling piano. Sheen himself did not return for the episode, as only his character's back was shown. The episode ended with series creator Chuck Lorre breaking the fourth wall and saying Sheen's catchphrase, "Winning." After the episode aired, critics slammed it for being in poor taste and for failing to do justice to the existing cast and the twelve years of the show's history. Lines like "It's amazing you've made so much money with such stupid jokes" seemed to slight loyal fans in favor of dismissing criticism and dragging off-screen controversy on-screen. Credit where credit is due, though: "Two and a Half Men" never strayed from its identity and was unapologetic in its truly ridiculous exit.

While "Two and a Half Men" looked to its past for its finale, NBC fan favorite "Parks and Recreation" looked to the future for its final season. The seventh season made a time jump to 2017 and showed how the characters had progressed over the intervening three years. The whole season was widely regarded as the best yet, and the finale was no exception. Leslie Knope and company reunited to fix a swing and, in the process, flashed forward as far ahead as 2048 to reveal where life had taken the beloved characters. It provided healthy closure for each member of the ensemble cast while remaining open-ended enough to let the story live on (Did Leslie become president? Or was it Ben?). It maintained its signature sense of humor and still did justice to the effect that the show and its characters have had on audiences.

A formerly groundbreaking show, FOX's "Glee" had been slipping in ratings and acclaim for years when it was decided that season six would be its last. The shortened season reunited the glee club members at their alma mater to revive the club after their Broadway dreams were dashed. The two-part finale went both backward and forward in time to complete the stories of the main cast. The first part recreated the show's pilot and showed how the singers came to audition for the club. All of the original cast returned for the episode except Cory Monteith, who passed away in 2013, though his character was still given a storyline through editing and mentions by other characters. The second part, titled "Dreams Come True," showed exactly that: everyone lives happily ever after on "Glee," with families, Tony Awards and record contracts. For a show that had lost its way and its heart, the finale brought it back to its roots and gave fans a reason to remember why they liked it in the first place. If somewhat cliché, it kept the show's quirky and sentimental spirit alive.

The remaining weeks of the spring season will bring several other shows to a close, and it should be interesting to see how they handle their departures. HBO's series "Looking" was recently cancelled after its second-season finale, but the network has opted to air a farewell special to give closure to the series. "Mad Men's" final season premiered recently to huge hype, and expectations are high for when it comes to an end. Critics' favorites "Nurse Jackie" and "Justified" will bow out in the coming months after declines in ratings.






In TED talk, Monica Lewinsky speaks openly of past

theguardian.com NATALIE CONDRILLO Editorial Intern

Rewind to 1998: many of us were children then, some of us babies, but all can recall what happened between Monica Lewinsky and then-President Bill Clinton. Whether it draws sarcastic jokes, harsh judgment or a compassionate perspective, the affair between them is still discussed today on a global scale. Lewinsky spoke out after years of silence at a TED conference earlier this month. She opened up about the deep regret, shame and embarrassment that have plagued her life since the horrible mistake she made. At just 22 years old, Lewinsky interned at the White House and, she says, fell in love with her boss, President Clinton. In the late 1990s the Internet was a new, rapidly growing technology, mainly used for distributing information at an international level. Because of this, the indiscretions that took place were seen and commented on by virtually everyone, perhaps creating one of the first documented cases of so-called cyberbullying.

"Overnight, I went from a completely private figure, to a publicly humiliated one worldwide," Lewinsky said in her moving speech. Her reputation was instantaneously demolished by the time she was 24. She was branded a tramp, a whore, "that woman" and a slut. She was seen by many but actually known by very few, and this sexist branding followed her throughout her life. (It is worth noting that the Clinton family went on from the scandal virtually unscathed.)

Lewinsky then shifted the conversation to Tyler Clementi, an 18-year-old student at Rutgers University in New Jersey. His roommate secretly recorded Tyler being intimate with another man and posted it online for the world to see. After days of endless harassment, completely and utterly mortified, he jumped to his death. Lewinsky mentioned him because in 1998 she, too, was nearly humiliated to death. Her mother made her shower with the door open and sleep beside her at night, terrified that the incessant cruelty would take away her daughter's will to live.

Cyberbullying has become a permanent way of continuing abuse: everything you put on the Internet stays there forever. Reading what complete strangers have to say about you is a stabbing pain, not because we care who said it but because, when it is anonymous, we hear the words in our own voice. What people say about us in mass quantities at some point begins to be internalized. This makes the torment and judgment of public shaming even more difficult to cope with, which is why cyberbullying so often leads to suicide. Monica Lewinsky's TED talk was not only inspiring but also eye-opening. She challenged the public with her speech on the dangers of online harassment and slut-shaming. Shame is an industry in this country, and our culture feeds off gossip and humiliation. A concept we all learned as children, do not do to someone what you would not want done to you, lies forgotten by most adults.

Why "The Dress" isn't about the dress at all MARC SAGGESE Advertising Manager

At one point or another, everyone had a phone screen shoved in their face along with the question, "What color is this dress?" In what is certainly the Internet's most divisive photo of 2015, people were legitimately confused as to what the piece of clothing's colors truly were. Articles were written, celebrities tweeted their opinions, conversations were overheard, even television news networks reported on it; The Dress was truly inescapable for about 72 hours at the tail end of February. It was an Internet phenomenon of a scope and scale unmatched by anything that came before it, and a month later nothing has yet eclipsed it. Looking back, however, the most interesting part of The Dress was not the photo itself, nor the discussion around it, but how short its reign over pop culture lasted.

The Dress is the most extreme example of pop culture deciding to make something "a thing" and then immediately changing its mind. When The Dress became a phenomenon, seemingly everyone in the world was talking about it. Part of what kept that discussion going was the inherent conflict: it was an optical illusion that caused people to argue over which colors they saw. The other part of the discussion was about why it mattered. After a full week of fighting, the general consensus was that it really didn't. Once explanations had been posted about why people saw two different sets of colors, it lost its magic. Sure, some out-of-touch corporate social media accounts lagged behind, trying to capture the "teen" demographic by demonstrating an understanding of Internet culture. They posted the picture, but the response was lukewarm at best, and they eventually stopped. If anything, the case of The Dress demonstrates the absolute power of pop culture and the Internet: how they can elevate something so simple and uninteresting to a god-like level of cultural worthiness, and then immediately silence it so that it can never be popular again. Things like this have happened in the past, but never this quickly and forcefully. The picture really isn't about what colors are seen, but about how long they're seen for.

