The Princeton Financier
FALL 2013 | Volume 3 | Issue 1

Since the devastating Great Recession of 2008, we have seen a slow but marked recovery. The economy has grown a steady 2% annually, and employment has risen from a low of 129.3M in December 2009 to nearly 136.3M as of September 2013. Though the recovery has been gradual, its effects are undeniable. Perhaps the most exciting development has been the stock market. Stocks have not only recovered their recession losses but have soared to all-time highs in the past few weeks.

Recent talk has revolved around high-growth technology firms. The NASDAQ biotech index and Cloud Computing funds such as SKYY have returned 50% and 39% respectively so far this year, beating the S&P 500 by a wide margin. From social media to smartphones to 3-D printing to alternative energy, technology has exploded over the past year, and its future growth prospects have been at the forefront of investors' minds. This issue of The Princeton Financier focuses on this theme of high-growth technology.

This issue features writing from the Princeton Corporate Finance Club's (PCFC) Industry Insight Teams, dedicated staff writers, and prominent Princeton professors. Furthermore, the successful publication of this 5th issue of The Princeton Financier was made possible only through the tenacity of the PCFC officers. From diligent design and layout planning to vigorous fundraising and marketing campaigns, it took many divisions working together to generate this final product. We are extremely grateful for the continued support of our undergraduate contributors, leading academics, and industry professionals. If The Princeton Financier piques your interest, I hope you choose to explore some of the many other exciting opportunities PCFC has to offer as well.

On a final note, consistent with the theme of high growth potential expressed in this issue, the young yet burgeoning PCFC sees a bright future ahead. With a growing student officer team and increased reach on campus, the PCFC hopes to keep improving upon its current operations in future semesters. With your support, the backing of committed corporate sponsors, and a creative team with big plans for the future, the continued growth and success of the PCFC is assured. Thank you.

-Darwin Li

EDITOR-IN-CHIEF
Darwin Li '16

MANAGING EDITORS
Seth Perlman '14
Jeffrey Yan '16

DESIGN & LAYOUT
Michelle Molner '16
You-You Ma '16

CONTRIBUTORS
Ryan Azarrafiy '16, Dhruv Bansal '17, Kevin Chen '15, Hadley Chu '15, Julian He '14, Eric Huang '16, Christopher Huie '16, Sukrit Puri '16, Diana Turbayne '16, Chris Wu '14, Sherry Zhang '15

PCFC Board FALL 2013


PRESIDENT
Hannah Rajeshwar '14

MENTORSHIP
Dalia Katan '15

MARKETING
Michelle Molner '16

INDUSTRY INSIGHT
Sunny Jeon '14
Julian He '14
Alex Seyferth '14

FINANCE
Jason Nong '15

ALL CORRESPONDENCE MAY BE DIRECTED TO:
The Princeton Financier
0666 Frist Center
Princeton, NJ 08544

2   The Lay of the Cloud, by Eric Huang
4   Big Pharma's Gamble: Explaining the Addiction to Biotech, by Dhruv Bansal
6   Abenomics: Death by Taxes, by Sherry Zhang, Christopher Huie, and Sukrit Puri
8   All Hype or Something to Tweet About?, by Diana Turbayne and Kevin Chen
10  Engaging with Ecommerce: An Interview with Alain Kornhauser, interview by Hadley Chu
12  Liberty Global's Venture into Virgin Territory, by Julian He, Chris Wu, and Ryan Azarrafiy
14  Economics of the Internet, by Professor Swati Bhatt
17  Perspective from Inside the Fed: An Interview with Asani Sarkar, interview by Darwin Li and Hadley Chu

The Lay of the Cloud
By Eric Huang

What is Cloud Computing?

Traditionally, enterprise IT has been burdensome and expensive. Business software applications (e.g. Oracle), development platforms (e.g. C/C++), and hardware (storage, servers, and networks) all require teams of IT specialists to manage. Upgrades and repairs are major undertakings in terms of both human resources and capital. The Cloud is poised to revolutionize business IT by increasing efficiency and flexibility, and, in the process, it may well revolutionize the way we think about the Internet and personal computing as well.

Cloud Computing is still, fittingly, a nebulous term. Despite all the hype and attention it has been drawing recently, "confusion abounds when Cloud Computing is discussed, and the situation is getting worse, not better," says Daryl Plummer, an analyst at Gartner. Instead of seeking a single definition for the term, Cloud Computing can be better understood by examining its effects on IT, industry, and consumers. Cloud Computing allows files and applications to be accessed and used over the Internet rather than being hosted, stored, and processed on locally managed hardware. Companies involved with Cloud Computing provide Internet-based services that give users scalable, abstracted IT capabilities, including software, development platforms, and hardware. Reduced costs, speed, flexibility, and innovation are the primary advantages of Cloud Computing solutions over traditional IT.

The Cloud can be understood through the metaphor of an office building that is shared by multiple companies. Although each company may be involved in different industries, they all benefit by sharing the same basic utilities and common areas that the building provides. It is much more convenient and efficient for the companies to each rent a floor in the building than to each purchase their own separate office complex. Cloud Computing gathers substantial hardware and software resources and then allows businesses to access these resources remotely through the Internet, paying only for the amount they consume. In this way, Cloud Computing increases the flexibility and cost efficiency for businesses, as companies no longer need to make large capital investments in technology. Instead, they can “rent” services per their needs.

Taking a broad view of the Cloud Computing industry as it stands today, we find that the current services offered can be categorized into applications, platform and infrastructure, and other growing subsectors such as media and big data. The first segment, applications, is also known as Software-as-a-Service (SaaS). Software code and data are stored in the Cloud and then accessed over the Internet only when needed. Following the Cloud model, SaaS offerings are paid for on a per-use basis. SaaS has become common in enterprise applications for customer relationship management and collaboration tools. SaaS allows companies to cut down on their upfront costs, essentially spreading out what otherwise would be capital expenditures into smaller, ongoing operating costs. SaaS gives companies the flexibility to rapidly scale, upgrade, and deploy their software.

Infrastructure-as-a-Service (IaaS) provides hosting, development environments, and storage. Customers of IaaS do not need to be concerned with any hardware and are responsible only for the application being hosted. Some companies have also begun building platforms for third parties to develop their own applications. Platforms and IaaS are often offered together to developers, who use a company's development platform and then have their app hosted on the same company's Cloud. This segment of the Cloud is still relatively nascent.

Newer subsectors such as media and big data are also emerging as major portions of the industry. Media distribution through the Cloud is changing how users consume music and other streamed content. Internet media companies such as Hulu and Netflix seek to take advantage of rapid growth in connected device use and faster broadband network speeds, while large players such as Google and Apple seek to lock in users by building complete media ecosystems combining both online media and physical hardware. As the amount of digital information being generated and stored increases, the need for big data management grows with it. Database-as-a-Service is gathering momentum due to its inherent advantages in accessibility and scalability over


traditional databases. Cloud Computing can and will be implemented in almost every aspect of information technology.

Key Players

The Cloud Computing market boasts a variety of players, ranging from IT behemoths who have extended their offerings to include Cloud services (e.g. Amazon and Google) to relative newcomers who have set their focus on the Cloud (e.g. Salesforce and VMware). While some big names with even bigger bankrolls may seem to currently dominate the market, innovative startups have begun to create some real competition in these early stages of the Cloud Computing game. Some firms specialize in SaaS, platform, or IaaS offerings, while others have their fingers in multiple segments of the Cloud. Here we compare Amazon, the leader in the infrastructure segment of the Cloud, with Joyent, a leading contender trying to take away market share from its much larger competitor.

Amazon

Widely known for being a dominant online retailer, Amazon has also become a leader in the IaaS segment. Through its Amazon Web Services (AWS) business, Amazon has brought Cloud Computing on a global scale to emerging start-ups, corporate enterprises, and the public sector. AWS is a comprehensive Cloud platform that offers computer processing capacity, storage, content delivery, database, application, networking, payment, and deployment services. Notable customers include Netflix, Shazam, and Thomson Reuters. According to Gartner's 2013 report on IaaS providers, Amazon has reached a staggering level of dominance in enterprise-level computing services for big companies, with AWS generating annual revenue of about $3bn, almost 5% of Amazon's total revenues.

Joyent

Joyent is a high-performance Cloud infrastructure company that offers the only solution specifically designed to power real-time web and mobile applications. The company's hosting unit, JoyentCloud, is built to compete with Amazon's EC2 Cloud and offers high-performing platforms for application development. The company hosted Twitter in its early days, and features LinkedIn and Gilt Groupe as customers.

Industry Financial Landscape

With estimates of the total global software market approaching $1.1tn in 2016, Bank of America Merrill Lynch's (BofAML) report on Cloud Computing predicts that 20% of this spending will go to Cloud Computing services. The applications segment accounts for $112bn of this 20%. The most lucrative sub-categories within applications are likely to be collaboration tools ($30.7bn), email and office productivity applications ($20.7bn), and customer relationship management ($16.4bn). The IaaS market is estimated to be worth $60 to $70bn within the next 4 to 5 years. This prediction uses a $354bn overall infrastructure market estimate and assumes 20% penetration by Cloud Computing. The platform segment is estimated to be worth about $33bn.

While there is undoubtedly potential for the Cloud Computing market to become a major target for IT spending in the next few years, the industry has already seen strong growth in recent years. In 2012, revenue growth among 10 leading Cloud companies chosen by BofAML averaged 35.6%, while IT spending grew by only 2.1% over the same period. Workday and LinkedIn led the pack; the top three posted revenue growth of 103.6%, 86.2%, and 34.6% respectively. The same group of Cloud companies created about 67,200 jobs in 2012, a 48% expansion of their overall workforce.

With so much growth and potential for value, the Cloud has definitely started to attract the attention of investors. In the past 18 months alone, 13 Cloud Computing companies have gone public, and investors have enthusiastically bought in. Workday Inc., which debuted one year ago with a valuation of $9.5bn, has seen avid investors increase its market capitalization to over $14bn. One year ago, ServiceNow debuted with a market value of $2.17bn; the stock is now trading with a cap of $7.06bn.

Jumping on the Cloud Computing bandwagon, investment firms have created indices and funds to track the Cloud Computing industry. As the top public Cloud Computing companies hit the $100bn combined market cap milestone in July of this year, Bessemer Venture Partners created an index to track 30 large Cloud companies. Similarly, First Trust Portfolios L.P. created the First Trust ISE Cloud Computing Index Fund (SKYY) in 2011 as a tool for those seeking to invest in companies actively involved in the Cloud Computing industry. Although volatile, the fund has attracted over $100mn in the past two years and has rewarded investors with around 39% returns over the same period. On the private side, Cloud Computing has also sparked immense interest within the venture capital community, as demonstrated by the results of a study conducted by Deloitte and the National Venture Capital Association in August 2013. Their survey of 403 venture capital, private equity, and growth equity investors ranked Cloud Computing as the sector in which U.S. investors are most confident.

In an otherwise uncertain market, Cloud Computing is a sector brimming with opportunities. From pay-as-you-go enterprise applications and database services to media and data storage offered through the Internet, Cloud Computing continues to transform every interaction between businesses, consumers, and technology.
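The market-size estimates quoted above all follow the same back-of-the-envelope pattern: take an estimate of an overall IT market and apply an assumed Cloud penetration rate. A minimal sketch of that arithmetic (the figures are the BofAML estimates cited in this section; the helper function name is ours):

```python
def cloud_share(total_market_bn: float, penetration: float) -> float:
    """Projected Cloud revenue ($bn) as a penetration-rate share of an overall market."""
    return total_market_bn * penetration

# Software: ~$1.1tn total market estimated for 2016, 20% assumed to go to the Cloud
software = cloud_share(1100, 0.20)  # ~220 ($bn)

# Infrastructure: $354bn overall market, 20% assumed Cloud penetration
iaas = cloud_share(354, 0.20)       # ~70.8 ($bn), the top of the $60-70bn range cited
print(f"software: ${software:.0f}bn, IaaS: ${iaas:.1f}bn")
```

Note that the $112bn applications figure is BofAML's own breakdown of the roughly $220bn software total, not something this penetration arithmetic alone can reproduce.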


Big Pharma's Gamble: Explaining the Addiction to Biotech
By Dhruv Bansal

Four years ago, Aragon Pharmaceuticals started testing an anti-hormonal prostate cancer drug on mice. Four months ago, Johnson & Johnson bought the company for nearly one billion dollars. This incredible showing is no fluke: biotechnology companies are on fire. The average acquisition price for a biotech firm over the past seven years was $479mn. In the second quarter of 2013, biotech companies accounted for one billion dollars in IPOs and an additional billion dollars in mergers and acquisitions. But biotech companies rarely succeed: they often hinge on one product, don't offer returns for decades, and are subject to the whims of FDA regulations. Despite these negatives, investors have good reason to continue pouring money into the biotech industry.

Background

To understand why this is the case, we need to take a step back and look at broader changes in pharmaceuticals. Until recently, the pharmaceutical industry was a relatively quiet area. Big pharma: the name conjured an image of a dry, static field dominated by huge, glacial companies. Recently, however, a variety of new forces caused the ice to crack beneath the traditional market leaders. One of these forces was the population stagnation of the developed world, pharmaceuticals' principal market. American and European populations had started to plateau, and big pharma had already penetrated nearly every corner of the market. As a result, the sales of pharmaceutical companies leveled off. Adding to their woes, revenues from contemporary products declined as key drug patents expired and generic manufacturers jumped into the fray. The exclusive drugs that made big pharma rich, such as Pfizer's Lipitor, were open for

any company to make. To compensate, big pharma increased investment in research and development (R&D), driving up costs dramatically. The so-called "golden era" of big pharma was over.

The Market Roars to Life

Consequently, big pharma switched tactics. Instead of wasting money and time on R&D, firms began to buy biotech companies with great drugs of their own. In essence, they outsourced their R&D. Biotech companies develop promising drugs to advanced stages. Then, big pharma buys the companies or partners with them, gaining rights to the drugs. Big pharma spends less money with less risk, and biotech companies gain funds to complete research, gain access to the market, and potentially earn huge profits.

This win-win strategy immediately took off. One by one, large companies from Pfizer to Merck started partnering with or outright buying dozens of biotechnology companies. Preliminary data suggests it's working: a recent joint review by Deloitte and Thomson Reuters found that of the top twelve life science companies, ten demonstrated net increases in pipeline momentum from 2010 to 2012: that is, pushing more drugs through late-stage development, which would be nearly impossible through internal R&D.

Thus, the current biotech boom began. According to Nature Biotechnology, two-thirds of all pharma sales growth from 1995 to 2014 will have been due to such partnerships. Investors are pushing money into biotech companies, hoping to land the next big deal. According to the New York Times, share prices of the 16 biotech companies that have gone public so far this year are up 48% on average. Four of the ten top-performing stocks on the market are biotech firms. Six more biotech firms are considering going public this year. That said, the biotech industry is not without risk, as the story of Aveo Pharmaceuticals demonstrates.

Aveo's Illustrative Woes

Aveo Pharmaceuticals started off the way many other biotech companies did: as a research lab. Initially part of the Dana-Farber Cancer Institute, Aveo formally separated from the institute in 2002 to pursue its own goals. Aveo first researched mouse models' effectiveness in simulating human responses to cancer therapies, but quickly switched gears in 2006 once the firm, in partnership with Japanese research group Kyowa Hakko Kirin Co., discovered a compound called tivozanib that demonstrated strong potential in suppressing kidney tumors. Aveo fast-tracked trials of tivozanib and began to aggressively pursue commercialization and investment.

Four years later, Aveo roared onto the market with an $81mn IPO. A year later, Aveo


raised another $100mn as it entered a partnership with Japanese firm Astellas to market the drug in Europe. Tivozanib promised to be the most effective kidney cancer drug on the market, a promise confirmed by its clinical trials. As Aveo sailed into the FDA approval process, investors remained confident that the drug would be on the market within a few years. Almost immediately, however, Aveo landed in hot water. In August 2012, the company revealed that the FDA had expressed concern over tivozanib's clinical trial data, news that shocked investors. Nine months later, the FDA advisory panel recommended against approval of tivozanib, a decision that consigned the company to certain death. Shares immediately plummeted and had already hit rock bottom by the time the FDA formally rejected tivozanib.

Unbeknownst to investors, Aveo had ignored a secondary goal laid out by the FDA a year prior. Concerned about the overall reliability of Aveo's tests, FDA regulators privately pushed Aveo to demonstrate that tivozanib increased the lifespan of patients. Instead, Aveo structured its trials in an unusual manner: if patients felt that the control drug, Nexavar, was not effective enough, they could switch to tivozanib. Aveo argued that, because of this structure, too many patients switched over to the trial drug, leading to an artificially deflated lifespan figure for tivozanib patients. As a result, Aveo failed to meet the secondary standard. The FDA rejected this explanation, stating that because of the trial structure, it could not conclusively determine whether the benefits of the drug outweighed its costs. Furthermore, the FDA decided Aveo's trial data relied too heavily on patients from Eastern Europe, and concluded that the sample was not representative of the American population. Taken together, the FDA found the data charting tivozanib's success unreliable, and rejected the drug. What is crucial in all of this is that Aveo failed due entirely to factors opaque to investors.

Factors in Success

The dramatically different outcomes of Aragon and Aveo illustrate the extraordinary risk and volatility of the biotech market. Aragon, which hadn't even completed clinical trials, yielded huge profits for investors. Aveo, however, failed spectacularly after it had completed all clinical trials. Indeed, the fortunes of biotech companies depend heavily on two factors: big pharma's interest and the FDA approval process. FDA approval can be hard to predict, as the Aveo case demonstrates.

The process looks deeply into the trial data of the biotech companies. Companies are required to demonstrate conclusively both that the drug treats the problem it is designed to tackle and that it does so with minimal side effects. The FDA often strikes down trial data, dooming whole companies, on the basis of assumptions unknown to those outside the company. Aveo provides one example: there was no way for investors to know the composition of patient samples or the trial structure, or that these details would pose an issue for the FDA. As a result, investors use big pharma's interest as a gauge of potential success. However, big pharma's interest can be hard to assess before it is too late to invest. Epizyme, one of the most successful biotech firms on the market, has partnerships with GlaxoSmithKline, Celgene, and Eisai amounting to $125mn. Bluebird Bio, another sought-after firm, has a $75mn partnership with Celgene.

These partnerships often take the form of structured deals: an upfront payment with more compensation once the firm reaches certain research milestones. Aragon followed this path: Johnson & Johnson paid $650mn up front and allotted a further $350mn to be distributed as research progresses. This reduces the cost to the big pharma company in case something goes wrong in late clinical trials. In the end, most research partnerships lead to acquisition: again, this is the route Aragon took. In most cases, these research partnerships come after investors have already identified potential success stories. By the time big pharma has partnered with a biotech firm, it is often too late for new investors to gain substantially from the company.

Risks and Looking Forward

Risk epitomizes the biotech market today. Eager for rapid growth and huge returns, investors continue to turn their attention to the lucrative partnerships forming between biotech and big pharma. Yet only a tiny fraction of biotech companies succeed. More often than not, firms fall to disappointing trials, fatal side effects, and red tape. Darlings of today become trash of tomorrow. Hundreds of millions of dollars of investment vanish in an instant. Biotech is by no means a safe investment, but it can be an incredibly rewarding gamble.

How long the biotech industry will keep growing explosively remains in question, but there is no doubt that it is permanently reshaping the pharmaceutical market. The forces necessitating a shift to the model of R&D acquisition will persist. Big pharma needs the drugs biotech companies develop and supply in order to maintain a profit and a competitive edge. Biotech companies need big pharma to fund research and to generate profit through acquisitions or partnerships. This emerging codependency will only grow in importance as big pharma asserts its presence in developing countries and, in the end, will redefine the marketplace throughout the globe for years to come.
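The structured deals described above pair a guaranteed upfront payment with contingent milestone payments, so a deal's headline value overstates what the acquirer is certain to pay. A toy sketch of that structure (the $650mn/$350mn split is Aragon's, from the article; the two-milestone breakdown and the success probabilities are invented purely for illustration):

```python
def deal_value(upfront: float, milestones: list[tuple[float, float]]) -> tuple[float, float]:
    """Return (headline value, risk-adjusted expected payout), both in $mn.

    Each milestone is a (payment, probability-the-milestone-is-reached) pair.
    """
    headline = upfront + sum(pay for pay, _ in milestones)
    expected = upfront + sum(pay * prob for pay, prob in milestones)
    return headline, expected

# Aragon-style deal: $650mn upfront, $350mn tied to research milestones.
# The 150/200 split and the 50% probabilities are hypothetical.
headline, expected = deal_value(650, [(150, 0.5), (200, 0.5)])
print(headline)  # 1000 -> the "nearly one billion dollars" headline figure
print(expected)  # 825.0 -> the average payout under these assumed probabilities
```

The gap between the two numbers is exactly why big pharma favors this structure: if the drug fails in late trials, the acquirer's downside is capped at the upfront payment.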

ABENOMICS: DEATH BY TAXES
By Sherry Zhang, Christopher Huie, and Sukrit Puri

Japan's economic growth was once a remarkable spectacle. In the 1960s, Japanese industry achieved 10% growth rates, rebounding from the economic calamities of WWII, and the Japanese economy became the second largest in the world, behind the United States. By the early 1990s, income per capita in Japan equaled or surpassed that of most Western countries.

By the late 1990s, however, Japan had suffered economic setbacks that it still struggles to solve. From 1986 to 1991, the fast escalation of prices and overheated economic activity, coupled with a loose money supply and credit expansion, resulted in a bubble. In August 1990, the bubble popped, and by 1992 asset prices had plummeted to their lowest point. The government intervened to keep the economy afloat, running massive budget deficits to finance public works programs, but these programs were not enough to stimulate the economy. At the same time, the yen appreciated to a dangerous degree. When the yen strengthens against foreign currencies, it becomes less attractive for other nations to import goods from Japan; since the Japanese economy relies heavily on exports, it is unsurprising that the strong yen contributed to a GDP downturn. Japanese government debt, meanwhile, continued to rise steadily.

Citing its limited success, the Japanese government eventually abandoned its expansionary monetary policy. The new, stricter monetary policy pushed Japan into chronic deflation, a problem that Prime Minister Shinzo Abe now intends to solve. Abe's economic regime, which has come to be known as "Abenomics," consists of a variety of aggressive monetary, fiscal, and structural policies aimed at reflating the economy and weakening the yen. His most controversial policy concerns the Bank of Japan,

which Abe wants to transform from its independent past into a more direct tool of government policy. Unsurprisingly, this strategy quickly put Abe at odds with the central bank. Although at first the Bank of Japan resisted Abe's control, it eventually aligned with and implemented his plans to beat deflation. The Bank of Japan began by setting a new inflation target of 2%. The new governor of the central bank, Haruhiko Kuroda, plans to double the country's monetary base and holdings of Japanese government bonds. He also aims to merge the bank's quantitative-easing program into regular operations, increasing the monetary base at an annual pace of 60-70 trillion yen. Lastly, he plans to purchase Japanese government bonds (JGBs), exchange-traded funds (ETFs), Japan real estate investment trusts (J-REITs), and commercial paper (CP) and corporate bonds.

Recent statistics have shown moderate improvements in the economy. Public consumption has remained stable, while public investment and inflation expectations continue to increase. The consumer price index rose 0.4 percent in June, the economy grew 4.1 percent in the first fiscal quarter, and the yen has lost about 20% of its value against the dollar since late 2012. That said, there are risks that come with quantitative easing. Wages may not rise as fast as

inflation. Many small to medium-sized firms, which account for the majority of employment in Japan, do not have the capability to raise wages in a slow economy. Moreover, quantitative easing only adds to Japan's rising debt. Up until the mid-1990s, Japan's debt was at the high end of the spectrum but still comparable to that of other industrialized nations. This is no longer the case. Japan's public debt has ballooned to over twice the size of its GDP, making it the most indebted country in the world by that metric.

Despite this, however, the Bank of Japan decided to leave its monetary policy unchanged at the end of August 2013, which suggests that the Bank has found the aggressive easing measures successful enough to warrant their side effects. If the Bank insists on maintaining its easing measures, however, it needs a complementary plan to get the debt under control. Abe believes he has that plan. Kuroda has voiced his support for an increase in the national sales tax. He argues that pro-growth policies implemented by the Bank of Japan and Shinzo Abe's administration would offset the tax's potential ill effect on consumer demand, and that the increased tax revenue would help to rein in the rising debt. Under Abe's suggested plan, the sales tax would increase from 5 to 8 percent in April 2014 and from 8 to 10 percent in April 2015. Despite Kuroda's claim, however, government forecasts show that this would cause overall growth to slow to 1 percent during the next fiscal year. Moreover, some private sector economists predict that the planned tax increase would cause Japan to fall back


into a recession. At the same time, Abe needs to consider Japan's public debt, which has grown tremendously over these past two and a half years. A failure here would risk not only leaving the debt undiminished but also undoing the recent economic progress Abe's policies have generated. He needs to be careful that investors do not interpret his actions as uncertain or uncoordinated, which could happen if one year's solution invalidates the previous year's efforts. From a historical standpoint, a tax increase does not seem to be the most fruitful option. The last Japanese tax increase, in 1997, did not produce its

desired effects; it actually led to a recession and ultimately to the resignation of Prime Minister Ryutaro Hashimoto, who approved the increase. However, economists today believe that economic conditions back in 1997 were more turbulent: monetary policy was less accommodative, and the tax increase occurred during the onset of the Asian financial crisis. Currently, the Bank of Japan and Abe's administration have a great deal of confidence that Abenomics will succeed in countering the deflationary drag of the sales tax. Recent data show that Japan's economy has grown for three consecutive quarters, as Abe's reflationary policies bolstered household spending and drove down the yen. This ultimately increased the annual growth of exports by 3.8 percent from April to June. Moreover, the Bank's "tankan" survey for September showed that large manufacturers' sentiment has risen to its highest level in six years. With all of these positive signs, the Bank estimates that even with the sales tax increase, the economy will expand around 1.3 percent in the fiscal year beginning in April 2014, which would outpace the 0.7 percent growth projected in a recent Reuters poll. Hence, despite the risk of falling back into a recession, the Bank of Japan and Abe's administration think that increasing the sales tax is worth the gamble.
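The staged consumption-tax schedule discussed above translates directly into consumer prices. A quick sketch of that arithmetic (the tax rates are from the article; the 1,000-yen pre-tax price is an arbitrary example of ours):

```python
def with_tax(price_yen: float, rate_pct: float) -> float:
    """Price of a good after applying a consumption tax of rate_pct percent."""
    return price_yen * (1 + rate_pct / 100)

base = 1000  # hypothetical pre-tax price, in yen
for label, rate in [("today (5%)", 5), ("April 2014 (8%)", 8), ("April 2015 (10%)", 10)]:
    print(f"{label}: {with_tax(base, rate):.0f} yen")
```

Going from 5% to 10% over two years raises the final price of this example good from 1,050 to 1,100 yen, a hit of nearly 5% to consumers' purchasing power that Kuroda argues pro-growth policies will offset.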

All Hype or Something to Tweet About?
By Diana Turbayne and Kevin Chen
Piece written on November 4, 2013

With its upcoming IPO on November 6th, Twitter hopes to avoid repeating Facebook’s mistakes. Since Twitter’s September 12th IPO announcement, the financial world has been abuzz with anticipation concerning the social media heavyweight’s decision to go public. That said, a certain amount of skepticism has also accompanied the big news. As the most popular micro-blogging platform of all times, there is no question that Twitter has seen incredible growth since its founding seven years ago. At the same time, however, it is hard to ignore the recent big tech IPOs that left many investors disappointed and relatively empty-handed. Zynga, Groupon, and most notably Facebook all proved to be huge letdowns to their initial investors. Although there have also been many recent successful IPOs in the social media industry, such as LinkedIn, Pandora, and Yelp, it is in the best interest of investors to proceed with caution when Twitter does go public. As more details of Twitter’s initial public offering have become available, the main concern has been its lack of profitability. Although profitability neither guarantees nor excludes a successful IPO, it does mean that other factors like potential for growth will matter much more. In Twitter’s case, investors should be somewhat reassured. Like other non-profitable

companies that have had good results in the stock market, Twitter has put a major focus on growth in recent months. Despite Twitter’s widened losses in the first three quarters of this year, from $70.7mn to $133.8mn, revenues for the same period rose 106% to $422.2mn. According to Mike Gupta, Twitter’s CFO, these increases in revenue are just the beginning. While Twitter previously prioritized capital investment, the company is now shifting its focus to growing profits. With a bold profit margin target of 40%, Twitter’s co-founders are laying out plans to increase its user base drastically and change its approach to advertising. While the expected timeframe for hitting this target has not yet been announced, the ambitious goal itself may reassure investors worried about the company’s current lack of profitability.

Part of Twitter’s effort to appeal to investors can be seen in its recent push to monetize its platform. Its recent acquisition of MoPub, a mobile advertising exchange, will enable Twitter to sell valuable data to advertisers for use in retargeting. From its ability to track what is trending via hashtag use, what users tweet, and whom they follow, Twitter has unique insights into consumer preferences. Twitter has just begun to monetize messages by selling advertisements in the form of promoted tweets and trends that appear on users’ timelines. In addition, through its self-serve advertising platform, Twitter will draw in large brands as well as small businesses. The self-serve advertising option of scheduled tweets is sure to be a major pull for marketers and, in the long run, could bring in the revenue Twitter needs to reach its profitability goals. Along with its potential for growth in advertising, Twitter’s plans to broaden its international presence and user

base will also serve as a new source of revenue. Dick Costolo, Twitter’s CEO, believes there is an enormous opportunity to expand Twitter’s current user base of 230mn. By his estimation, only 10% of what he calls the worldwide “connected community” is presently active on Twitter. Although Twitter’s goals for growth are ambitious, its price expectations have been rather conservative. Unlike Facebook, whose aggressive initial share price had nowhere to go but down in the public market, Twitter is taking a more cautious stance. It plans to set its share price between $17 and $20. Anxious to avoid the

overvaluation that hurt Facebook’s offering, Twitter announced a $13.9bn top valuation, a good deal less than the widely anticipated $15bn. If Twitter can avoid the underwriting missteps and last-minute price increases that plagued Facebook’s IPO, it should be well positioned for a successful debut. Apart from concerns related to the recent government shutdown, timing also seems to be favorable for the IPO. While Twitter is clearly distinguishing itself from Facebook as it nears the offering date, it can only help that Facebook shares are finally recovering. After falling over 50% in the months following its IPO, Facebook has rebounded. Hopefully for Twitter, this will repair investor sentiment concerning tech IPOs just in time for its debut. In selecting the New York Stock Exchange for its primary listing, Twitter has differentiated itself from Facebook in yet another way. While its decision may have had nothing to do with Facebook’s launch on the NASDAQ (given that over the past couple of years there have been a fairly equal number of new listings on each exchange), it would not be surprising if Twitter took Facebook’s experience as an example here as well. To avoid the technical glitches that characterized Facebook’s IPO, the NYSE ran a full IPO simulation on October 26th in preparation for the Twitter debut. The simulation was used primarily to check two things: whether its systems could handle the volume of message traffic the IPO might generate, and whether order confirmations would

be promptly issued once the IPO took place. Twitter is aiming to eliminate as much investor anxiety as possible. Nevertheless, some hesitation among investors can be expected. With Twitter’s full financials not available until months after the offering, its lack of profitability, and the volatility guaranteed with any overhyped IPO, investors will probably test the waters before purchasing shares in significant size. Accounting for some hesitation on the part of investors, Twitter is optimistic about a possible take of $1.6bn from its sale of 70mn shares. While this initial $1.6bn offering is relatively small in comparison to past IPOs within the social media industry, Twitter is making a smart judgment call by not flooding the market with shares. Once Twitter actually collects the proceeds from its IPO, it will use the money to further its key goal mentioned above: generate greater

revenue through increased advertising to a wider user base in order to achieve positive net profit. Although it has already started to work toward this goal, gains from the IPO will give Twitter the capital it needs to actually make the goal achievable. Currently, Twitter obtains 87% of its revenue from advertising. The balance is derived from licensing agreements that provide other companies better access to the flow of activity on Twitter’s service. Once Twitter goes public, it will most likely look to become more aggressive about showing advertisements, thereby increasing its average revenue per user. In comparison to other social media companies such as Facebook and LinkedIn, whose advertising revenues are $1.58 and $1.53 per user respectively, Twitter averages only $0.64 in advertising revenue per user. Unlike those other companies, Twitter is much more focused on the mobile sector, selling 65% of its advertisements on smartphones and tablets (as opposed to Facebook’s 41%). Given the increasing relevance of the mobile sector, Twitter is well positioned to take advantage of it and continue its rapid rate of growth. As the Twitter IPO draws nearer, market hype makes it clear that investors are regaining confidence in social media companies. Twitter’s efforts to ensure that it avoids the mistakes of its predecessors have not gone unnoticed. It acknowledges its current lack of profitability, it has made sure that technical failsafes are in place, and it has set its price at a reasonable level. The way the company chooses to utilize its IPO proceeds will have a large effect on its future. If Twitter is able to follow the path to profitability that it has laid out for itself, its stock price should steadily increase along with its bottom line. All investors should certainly keep their eyes on the #TwitterIPO as the day approaches.
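The per-user comparison above boils down to a single metric, advertising revenue per user (ARPU). A minimal sketch of that arithmetic, using only the per-user figures cited in this article (the reporting period and user definitions are as stated above, and the "headroom" calculation is an illustrative assumption, not the article's claim):

```python
def arpu(ad_revenue: float, users: float) -> float:
    """Advertising revenue per user: total ad revenue divided by the user base."""
    if users <= 0:
        raise ValueError("user base must be positive")
    return ad_revenue / users

# Per-user ad revenue figures cited in the article.
facebook_arpu = 1.58
twitter_arpu = 0.64

# If Twitter merely matched Facebook's per-user monetization while
# holding its user base fixed, ad revenue would scale by this factor.
headroom = facebook_arpu / twitter_arpu  # roughly 2.5x
```

On the article’s figures, closing the gap with Facebook would mean roughly two and a half times the advertising revenue from the same 230mn users, which is the upside the monetization push is chasing.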


Engaging commerce with

Interview with Alain Kornhauser
Interview by Hadley Chu


Alain Kornhauser is a professor of operations research and financial engineering at Princeton. He received his Ph.D. from Princeton in 1971 and has been teaching there since. He is a member of many professional societies, has published extensively in scholarly journals, and has made numerous presentations at professional conferences around the world. This spring he is teaching a course on electronic commerce that explores both its technological evolution and its growing market reach.

What does the umbrella term e-commerce include?

I’ve evolved the course over the years, and I now say that it’s everything having to do with a machine. We carry around computing and communications devices, and we have them readily available to us. It’s basically the interactions with those devices that end up being electronic commerce. Commerce is really the delivery of a value proposition to individuals or entities. What this device is doing is delivering a value proposition to you, and that value proposition takes on many different forms. In some sense, it’s like walking into a store: a store has many different things available for you to consume. That store could be a movie theater; it does not necessarily have to be gizmos that you buy. You go into that movie theater or that store to be entertained, and you pay for that personal utility in one way or another. If you define that as commerce, then that broad definition, when brought to the electronic world by the devices you carry around, covers everything!

While we’re probably more familiar with the advent of e-commerce megastores in the mid-90s and early 2000s, when would you say e-commerce started?

The mid-1990s and early 2000s is when e-commerce got popularized. But when you think of the first communication that was done electronically, that process could also be thought of as the beginning. In some sense, the beginning of technology is the beginning of e-commerce. In another sense, you could call the beginning of e-commerce when people started using the phone to order take-out; anything that has to do with communication and the use of technology, really. But I would argue that the real beginning, in the conventional sense of e-commerce, is the advent and popularization of eBay and Amazon. That’s when people got really interested.

On the other hand, there are all the reasons that compel us to carry these devices around. For example, thinking about music or social media, it’s about allowing that gizmo to provide that connectivity and allowing it to do more things for you. That can also be considered the popularization and advent of e-commerce.

How did the 2000 crash affect e-commerce, and what changed after 2001?

The knee-jerk reaction is to say ‘of course things changed.’ The stuff that should have gone belly-up went belly-up, but if



you look at the pillars of what existed pre-crash, those guys are still around and doing great. Priceline, whose stock price has skyrocketed to over $1,100, and what has evolved into Netflix are doing fantastically. eBay and Amazon are very strong, with $60B and $170B market caps respectively. It is everything people were saying on the big rise up. We have achieved a lot today of what we were hoping for ten years ago. Does it have further to go? Of course it does. Is it leveling off? Of course it is, but there are other great big things soon to come.

What changes and developments have enabled these new trends?

Certainly, technology has been the enabler. We all knew we wanted a device on which we could carry everything. An all-purpose communications piece was important. Why carry a wallet anymore? There was a fundamental desire to have all our information in a convenient form. Being able to put that information somewhere and later retrieve it is the key, and that’s why it has been successful. It has not been jammed down our throats or come via government regulation.

Who are the most influential players in e-commerce today?

Of course, there’s Google on the information side. Nobody has really been able to compete with them in that process of going out and retrieving information. There are some competitors out there, but they are smaller in size, and it is hard for them to catch up due to less funding and less human capital compared to Google. Furthermore, Google has such an embedded head start on everyone else that it would take something brand new to take its place. It is not going to be a marginal change. Google was not a marginal change off of what was before. The infrastructure, the wires, and the Wi-Fi already existed and were provided so cheaply that Google’s takeoff was inevitable. It is the same thing that happened with Kodak: Kodak built an economy around pictures that didn’t exist before. All influential players have created new markets that completely


reshaped pre-existing markets. And we can expect that someone is going to create something else to do the same.

How has e-commerce been regulated, and how should it be?

One of the big forces behind the success of e-commerce was Congress’ decision not to put sales tax on transactions. If Congress had demanded that sales tax be collected on every eBay transaction and insisted that everyone using eBay fill out government forms, the website eBay’s creators put together would look nothing like it does today. There would have been such a significant hurdle placed on the market that people would not have jumped in. Because Congress placed no hurdle, it allowed for the development of a competitive market. This was a government incentive for development. The sales tax exemption was probably one of the biggest movers of e-commerce, but then of course so was the government’s investment in organizations like the National Science Foundation, which built the infrastructure and the fibers that allowed all these things to be created. Finally, deregulation of the telephone industry was enormously important. Without deregulation, we might have had to go to Europe or China to get anything even close to smartphones.

Has the rise of social media changed the way businesses and retailers do e-commerce?

We are all copycats. We are all voyeurs. We are all into ourselves and would like others to be into us as well. Social media feeds on these intrinsic desires. Have we as a society really changed because of all this? When you look at the fundamentals, it is hard to say.

How has e-commerce affected developing countries?


With regards to developing countries, the most important thing has been the introduction of wireless communications. Given the price-performance on these

devices, you can now offer accessibility to almost everybody. In terms of the social structure associated with this, is it easier to have an ‘Arab Spring’ because everyone is now organized? Or does it make it harder because people have access to so much diverse information that there is no way to herd them? Though both lines of thinking hold true, I think that, as a whole, it makes it harder. To me, wireless communication gives every individual a great deal of independence.

How do you see e-commerce changing in the next 10 or 15 years?

On the technological side, text-to-speech technology is still lacking. We as human beings are able to have a conversation with one another in which there is continuous feedback, and technology is still unable to replicate that kind of interaction. When you use Google, you put one thing in and get another thing out, but if you think about the delay that is going on there, it is nothing like the exchange you have with another individual over the course of a day. We are multiple orders of magnitude away from being able to replicate that. In addition, we will certainly get more gigabytes, higher-resolution cameras, and who knows what else! The fundamental notion of e-commerce is that it allows you to find the needle in the haystack. That is how Google search became so successful. That is how Amazon works: by bringing together rare events of buyers and sellers who otherwise would not find each other. In any transaction, what you are trying to do is extract the ‘sketchiness.’ If eBay did not have PayPal, no one would use it. PayPal is a sketchiness remover. If you want to think about a few things that make e-commerce work, sketchiness removers are essential. The other key component I see is the Good Housekeeping seal of approval. In the past, advertisers would put a seal of approval on products. The better you are able to do that, the more successful you will be. It is not new, but it is absolutely necessary.

Last June, the landmark acquisition of Virgin Media by fellow telecom and TV giant Liberty Global created one of the largest cable and broadband suppliers in the world. According to Reuters, the Liberty Global-Virgin Media deal is one of the ten largest cable deals in history and has created a dominant new player in the European market. Given the importance of the players and the significance of the rapidly growing and increasingly competitive European telecom industry, the deal is sure to have major repercussions on EMEA finance in the coming months. This article breaks down the details of this $16bn deal and the implications it will have on the media sector going forward. Background Before the acquisition, Liberty Global and Virgin Media were similar and complementary firms that primarily dealt in the provision of Internet, TV, and phone services to retail consumers in the UK and mainland Europe. Today, they are collectively the largest European provider of multi-service packages, such as the “triple play,” an industry term referring to a single bundle that contains

wireless broadband Internet, cable TV, and home phone packages within it.

Liberty Global was originally formed in the US in 2005 through the merger of US firms Liberty Media and UnitedGlobalCom. The resulting conglomerate is run by the former head of Liberty Media, American entertainment mogul John Malone. Virgin Media was founded one year later through the merger of two British companies, NTL and TeleWest. The resulting firm was then rebranded when it entered the fold of Richard Branson’s “Virgin” corporate umbrella. Virgin Media based its initial success on getting in early as an MVNO (Mobile Virtual Network Operator) and striking a good deal with its network operator partner. Early subscriber acquisition, based on innovative pricing strategies, was also instrumental to its early growth. In contrast, Liberty Global’s MVNO business in Europe has been sluggish and underdeveloped. In addition, Virgin Media has been better at selling to businesses than Liberty Global, generating 16% of its revenues from business customers, significantly higher than Liberty Global’s 6%. Virgin Media’s expertise in MVNO, bundle, and mobile sales and revenue generation is expected to help transform the new company.

The Deal Liberty Global’s motivations in entering last summer’s deal were twofold. First, the acquisition of Virgin increased Liberty Global’s market share and improved its business efficiency in mainland Europe by integrating Virgin Media’s existing customer base and competitive advantages. Liberty will also benefit from increasing economies of scale, which will reduce cost per customer and help Liberty negotiate on content. According to analysis by NJIT, the deal is estimated to add approximately 5mn customers and $180mn in combined synergies per year to the parent company. Second, this deal marks Liberty Global’s entrance into the tough but lucrative business of British cable, the largest market (by value) in Europe. With its established presence in the UK market, where content, rather than pricing, is king, Virgin Media boasts a 60% gross profit margin, significantly higher than the industry average of 40%. It is worth


noting that although Virgin Media enjoyed success over the past year, with shares rising 90% in FY 2012, it was forced to cut jobs last summer (pre-merger) due to the highly competitive nature of the cable industry. The deal closed June 7th, after approval from Virgin shareholders and clearance from European and American regulatory agencies. The total value of the acquisition is around $15.8bn and up to $23.3bn when viewed in terms of enterprise value. The deal netted Virgin shareholders a 24% premium on the stock price, with Liberty Global ultimately paying around $47.87 per share. This has made Liberty Global the world’s largest “multi-service operator,” or MSO, surpassing Comcast in terms of total users. Liberty Global now has 25mn customers and services 47mn households (with duplicates for multiple services) in 14 different countries. The deal gives Liberty Global nearly 5mn Virgin Media customers in the UK, where the Sky Network still maintains a 2:1 lead by market share. Together, the two firms generated around $17bn in revenue in 2012 and $7.5bn in operating cash flows. The deal occurred amidst a recent spate of M&A activity in the European cable industry, spurred by American interest in entering the sector. This is the case despite European broadcasters recently suffering from stagnation and falling margins.
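The premium arithmetic in the deal terms is straightforward to reconstruct. A quick sketch, assuming the 24% premium is measured against Virgin Media’s pre-announcement share price (the article does not state the reference price, so the implied figure below is an illustration, not a reported number):

```python
offer_price = 47.87  # per-share price Liberty ultimately paid, per the article
premium = 0.24       # 24% premium netted by Virgin shareholders

# Implied reference price before the premium was applied:
# offer = reference * (1 + premium), so reference = offer / (1 + premium).
implied_reference_price = offer_price / (1 + premium)  # about $38.60
```

The same one-line rearrangement works for any premium quoted against a known final price.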

The future for Liberty Global is indeed bright and intriguing. Strong demand for its digital cable-TV services, faster broadband, and triple-play bundled offerings are all catalysts for future growth. The triple-play customer base grew 13.8% in the first quarter of 2013 over the same quarter in 2012, and it is expected to grow even faster with the combined customer base. However, with higher expected rewards comes higher risk. After the deal, Liberty Global will venture into Europe’s biggest and most competitive telecom market. The UK has long been a stronghold for Murdoch’s media empire. It remains to be seen whether Liberty Global, a newcomer to the show, can compete with existing service providers.

Impact on the Industries The Liberty Global-Virgin Media merger will impact the cable and TV, broadband Internet, and telecommunication industries, as well as the market overall. First, the merger will tremendously affect the broadcasting and cable television sector, which is expected to grow 19.1% in Europe and 13.7% in the United Kingdom. The merged entity will dominate the broadcasting industry and will become a strong competitor to the British Sky Broadcasting Group, currently the leading cable-TV company in the United Kingdom. The deal will also greatly affect the Internet sector, covering both narrowband and broadband connections. Similar to the cable and TV sector, the Internet sector is expected to grow in both Europe and the UK. The projected level of growth from 2011 to


2016 is 41.2% in Europe and 25.9% in the UK. These numbers demonstrate that the merger will likely have a larger impact on the Internet industry than on the cable and TV sectors, consistent with the Internet sector’s less-established nature. In contrast, the telecom industry faces a potential decrease in size between 2011 and 2016: Europe is projected to see a 6.5% decrease and the UK a 12.7% decrease. Despite the merger, the telecommunications market seems to have reached a plateau in terms of growth and usage. It is, in many ways, a mature market, whereas the Internet industry is a relatively new sector with striking capacity for growth. This phenomenon demonstrates that in any market, a new technology has the potential to disrupt the rate of growth or even eliminate the market altogether. The decrease in the telecom market is likely a result of the further development of wireless technologies. Conclusion Behind this solid business deal between two media giants there lies a more personal story. By moving into the UK, John Malone hopes to challenge Rupert Murdoch for UK broadcasting control. Rupert Murdoch’s News Corp owns part of the Sky Broadcasting network in Britain. About ten years ago, John Malone acquired roughly 17 percent of Murdoch’s News Corporation, and now Malone continues to haunt Murdoch with his milestone acquisition of Virgin Media. It is likely that Malone has a personal as well as a business motivation behind the buyout. We will just have to wait and see who will rise as the king of telecommunications.


ECONOMICS OF THE INTERNET By Swati Bhatt Lecturer of Public and International Affairs


Swati Bhatt has been teaching at Princeton since 1992. Her research interests are industrial organization, applied microeconomics, and finance. From 2000 to 2007 she was Director of Student Programs at the Bendheim Center for Finance, covering both the Undergraduate Certificate in Finance and the Master in Finance. She greatly enjoys interacting with students; she has taught graduate and undergraduate courses in finance and undergraduate courses in industrial organization and intermediate microeconomics, and has supervised over 120 senior theses. She received her Ph.D. from Princeton in 1986, worked for the Federal Reserve Bank of New York from 1985 to 1990, and taught at the Stern School of Business at New York University from 1990 to 1992 before returning to Princeton’s Economics Department.

After a morning of hiking in the Colorado Rockies near the Maroon Bells, we decided to treat ourselves to lunch. Pulling out my iPad with 3G capabilities, I looked for restaurants in the area and found a nice place 20 miles down the road in Basalt, Colorado. Ultimately, the restaurant and I executed a trade thanks to the connectivity provided by this mobile technology. Significantly, this technology has connected buyers and sellers, so that information is instantly, continuously, freely, and ubiquitously available to all participants. I will use the term “Internet” to refer to the entire system of mobile, digital, two-way communication technology. The smartphone is no longer merely a communication device, but a computer, with all of the accompanying technological capabilities. Two major trends have been unleashed: mobile connectivity and big data. This Internet technology is driving the fourth industrial revolution (following

the steam engine, the assembly line or mass production, and the early stage digital revolution). In such a world, we can ask four questions: (i) How has the

organization and structure of markets changed? Is perfect competition, with zero profits, inevitable? Is there no longer a role for intermediaries? (ii) What has changed for highly connected industries such as media, communication, information, entertainment, education, and health care? (iii) What is the role of new industries, such as digital social networks? (iv) Finally, what are the implications for future technological innovation and intellectual property, Internet security, and privacy? I address each of these questions, in turn, below.

(i) Think of the economy as a vast network of interlinked highways, with firms and consumers as cities or nodes and the transactions between them as roads or links joining nodes. This networking framework allows for the possibility that interactions between participants need not be of equal measure. It also allows


for the formation of new links, as a result of two developments. First, product and price information is provided through social networking, mobile advertising, and search engines; and second, costs are falling, due to faster processing, easily available broadband networks, and Cloud Computing. As a result, richly connected networks lead to more trading opportunities and sharing of information, resulting in low profit margins. An economic system that permits the free formation of links reorganizes the power structure of the trading network,

away from the firm and intermediaries, empowering the individual consumer. Increasing digital literacy amongst consumers will enable them to enlarge the economic network by creating more links. These links create a power shift as consumers share product reviews and pricing details. Social networking sites reinforce this power shift by harnessing the web of links originating from each consumer’s page or node. A “like” on Facebook or a retweet can be noted by all the followers of a given individual, then by followers of those followers, and so on.

Market impediments, such as trade embargoes, entry or participation restrictions, ownership of scarce resources, and geography, can lead to imperfect connectivity or broken links. Key nodes can have considerable pricing power if they become the single link from a vast hinterland of nodes to the outer world, effectively acting as a gatekeeper. Importantly, gatekeeper nodes are also likely to have an informational advantage, since they bridge two hitherto disconnected worlds. For example, the links between San Francisco and Tokyo give powerful gatekeeper privileges to these key cities connecting geographically separate markets. San Francisco, connected to a whole other network in Tokyo, has access to an entirely new information set, which enhances its power over the Bay Area. Information gathered by this gatekeeper node, in the form of “big data,” can then be leveraged to provide customized products for specific customer groups at higher profit margins. As Hal Varian, chief economist at Google, explains, “Information technology allows for fine-grained observation and analysis of consumer behavior. This permits various kinds of marketing strategies that were previously extremely difficult to carry out.”

As firms and consumers become more interconnected, reputation effects dominate, particularly since the Internet allows for the “viralization” or rapid dissemination of negative information. Easley and Kleinberg, authors of Networks, Crowds, and Markets: Reasoning About a Highly Connected World, explain, “[C]onvince a small number of key people in the part of the network using [product] B to switch to [product] A, choosing these people carefully so as to get the cascade going.”

The Internet allows for the commoditization of back-office processes such as human resource management, accounting and finance, and supply chain management. New startup firms will be created to pick up this segment of economic transactions, and the ensuing competition among them will augment profits for the buyers of these services: front-office firms like Apple.


(ii) How are the dominant firms in the media, information, communication, and entertainment industries (Amazon, Google, Facebook, and Apple) impacting the market structure and organization of their respective industries? To the extent that some firms have products that require set-up costs or learning costs, it might be costly for consumers to switch between firms. If you are familiar with Google search, you might be unwilling to expend the learning cost associated with a new search engine or new search format. However, for a growing market, entrants might be able to crowd out incumbent giants, building upon existing technology to make transitions smoother. More likely, mobile commerce will have a dramatic impact on both incumbents and entrants. More important to us at Princeton is the question of online learning: specifically, massive open online courses (MOOCs). Does knowledge creation require physical interaction in real time? Can we integrate MOOCs with brick-and-mortar education? What is the role of a “Princeton University”? These questions have no easy answers, but we have to start a dialogue because the technology is with us and could acquire an agenda of its own. (iii) With connectedness via social networks we get global phenomena, so we need models that focus on collective outcomes. There are network effects, where the larger the network, the more valuable it becomes for individuals to join it, further enlarging the network. This could impact market tipping points, where a certain product or system becomes the dominant one. With network effects and tipping points, static models don’t work, so we need dynamic feedback loops, where the economy is modeled as an evolving, complex system. This has important implications for financial contagion effects and systemic risks, such as the financial crisis of 2008.
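The feedback loop described above, where each adoption makes joining more attractive to the next user, can be sketched with a simple threshold model of the kind used in the network-effects literature. This toy model and its numbers are illustrative assumptions, not drawn from the essay:

```python
def adopters_at_equilibrium(thresholds: list[int], seed: int) -> int:
    """Iterate the adoption feedback loop to a fixed point.

    User u joins once the number of existing adopters reaches
    thresholds[u]; users 0..seed-1 start out as adopters.
    """
    adopted = set(range(seed))
    changed = True
    while changed:
        changed = False
        for u in range(len(thresholds)):
            if u not in adopted and len(adopted) >= thresholds[u]:
                adopted.add(u)
                changed = True
    return len(adopted)

# Ten eager users (threshold 2) and ninety cautious ones (threshold 12).
thresholds = [2] * 10 + [12] * 90

small_seed = adopters_at_equilibrium(thresholds, seed=3)   # stalls at 10
large_seed = adopters_at_equilibrium(thresholds, seed=13)  # cascades to 100
```

Below the tipping point the network stalls; just past it, the same dynamics carry adoption to the entire population. This discontinuity is exactly why static models fail here and dynamic feedback models are needed.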

In the world of international diplomacy, revolution, and war, we might have a collective action problem, where a critical mass of people needs to undertake an action to generate benefits, as we saw in the "Arab Spring." An increase in communication and connectedness could build understanding across cultures, so it is possible that a firm such as Google could help mitigate, and perhaps eliminate, the problem of international terrorism.

(iv) Finally, we need to grapple with several policy issues. First, let's start with the notion of network neutrality: the ideology that the Internet must intrinsically be open and transparent, in equal measure, to all global citizens. The idea that some knowledge should be restricted, even for national security purposes, is abhorrent to believers in this ideology. Should all information be freely available? Second, we need a technology policy. We need to encourage true innovation, which is about adding value, either by solving old problems in new ways (as in Apple's new operating system, iOS 7) or, less fundamentally, by improving existing solutions (the iPhone 5s). We also need to encourage invention, or creating something new, such as the first iPad. Does current intellectual property law (governing copyright, patents, and trade secrets) advance or retard innovation? Third, the big data industry is predicated on gathering consumer information on search engines such as Google, on Facebook, and on sites like Amazon. There are real privacy risks here, since this data can be shared with third parties and could be subject to misuse. Consider the recent case of PRISM, the National Security Agency's collection of metadata in the United States, which allows the government to follow your travels, trace which individuals, families, or groups were communicating with one another, identify any social group, and determine its major actors. Is this a violation of individual rights to privacy? Is it justified in the interest of national security? These policy issues will be followed by many more as the digital revolution unleashes its possibilities. As in any revolution, there will be winners and losers.



Asani Sarkar is an Assistant Vice President at the Federal Reserve Bank of New York. He received the 2011 Western Finance Association (WFA) Pearson Award for the best paper on Financial Institutions and Markets for his paper "Stigma in Financial Markets: Evidence from Liquidity Auctions and Discount Window Borrowing During the Crisis". In the past, Dr. Sarkar has also held positions at Princeton University, Columbia University, and the University of Illinois at Urbana-Champaign. He is currently working on empirical evaluations of the Federal Reserve's liquidity provision programs, funding liquidity and market liquidity, and asset pricing. Dr. Sarkar has published numerous articles on the microstructure of equity, fixed income, and futures markets. He received his Ph.D. from the University of Pennsylvania. * The following interview represents the opinions of Asani Sarkar and does not represent those of the Federal Reserve Bank of New York or the Federal Reserve System.

Can you tell me a little about yourself and your work at the Federal Reserve?

I received a Ph.D. in economics from UPenn and started out as a professor at the University of Illinois at Urbana-Champaign (UIUC). I was there for a few years and then came to the Fed about 20 years ago. There are two strands to my work. The first looks at how effective the Fed was when it provided liquidity to banks during the recent crisis, in terms of the effects on the performance of the banks and the economy. The second is related to market structure, especially liquidity.

You recently won the 2011 Western Finance Association Pearson Award for the best paper on Financial Institutions and Markets. Tell me a little about that.

Sure. I was looking at the liquidity facilities that the Fed created before and during the crisis. A little background: traditionally, the Fed provides liquidity to banks via a facility called the discount window. As the name suggests, it is a window that banks can come to at any time to borrow short-term, usually overnight, funding. The problem is that there is a stigma attached to using the facility. The identity of a bank going to the window is supposed to remain hidden, but there appear to be ways in which identities can sometimes be revealed indirectly, and if people know that a bank has gone to the discount window, its counterparties or customers may think the bank has problems. To eliminate this stigma, the Fed created another facility called the Term Auction Facility, or TAF, which auctioned the funds. Because all banks were involved in acquiring funds together, individual banks would not be stigmatized. My research looked for empirical evidence that stigma existed at the discount window. We had a methodology that allowed us to show that stigma existed and to estimate its magnitude. We found that banks were actually willing to pay a premium to borrow funds from somewhere else, the market or TAF, rather than go to the discount window, which was a manifestation of stigma.

I know you've done a lot of work on the Lehman Brothers bankruptcy. What is the importance of looking into this?

As you know, in the crisis we had failures of big institutions, in particular Lehman Brothers. Many people think its failure really pushed the crisis to a much more severe level. Markets crashed globally. So why did the Lehman failure have such a far-reaching effect? Lots of companies fail, even big companies, and there is a resolution in bankruptcy court so that whatever is left over is divvied up among the various creditors. The failure of financial institutions is intrinsically different from that of nonfinancial institutions for two reasons. First, when banks fail, they fail very quickly. If there is a rumor that a bank is going to fail, people will quickly go to the bank and withdraw their deposits. Second, global financial institutions are highly connected to other parts of the economy. For example, dealers in CDS (credit default swaps) have many counterparties. If anything happens to the dealers, it may affect other groups such as insurance companies. The failure of Lehman created a domino effect of other failures in the economy. Because of this, policymakers have been trying to implement a good system of failure resolution for large, complex, global financial institutions, so that next time, if a Lehman-like institution goes bankrupt, we can have an orderly resolution without detrimental effects. I wanted to examine the resolution of the Lehman bankruptcy in detail, as a case study. What were the problems in resolving Lehman? Was the problem that it was global? That it was subject to different bankruptcy laws for different subsidiaries? Or was it because of the types of assets it had? That is the idea behind the project.

What conclusions and solutions did you reach?

We are actually publishing a report for the public early next year, and we had some interesting findings. For example, when a firm is going through bankruptcy proceedings, it still needs to be funded as it continues operating its business. Let's use Lehman as an example. Lehman went into bankruptcy, and a subsidiary of Lehman, the investment bank, was sold to Barclays. That was a very good outcome, because all the accounts of customers of Lehman's broker-dealer were transferred over to Barclays. But that sale took about a week, and during this week, parties had to ensure that the Lehman broker-dealer continued to operate its business. That was difficult because the nature of the broker-dealer business is that a lot of funding is needed on a daily basis to finance the inventory. If you are in distress, no one is going to give you credit. So there was a danger that the Lehman broker-dealer would actually have ceased to function before it could be sold to Barclays, which would have had huge negative consequences. What ended up happening was that during this one week, the Fed provided funding (against collateral) to the Lehman broker-dealer, which kept it going so that it could ultimately be sold to Barclays. This example clearly demonstrated to us that if you are trying to have a good way to resolve financial institutions, one of the things you have to worry about is where this funding is going to come from.

Can you describe the Repo 105 transactions at Lehman Brothers?

When Lehman was in distress and on the verge of bankruptcy, there was a lot of attention on the amount of debt it had on its balance sheet. If it reported a very high leverage ratio, that would scare off investors. Thus, it instituted a program it called Repo 105 internally.
Repo is a way of borrowing based on collateral: you sell an asset with the intention of buying it back, typically the next day. From an accounting point of view, you normally record the transaction as a secured loan, which means it will appear on your balance sheet as a liability. What Lehman did was, before it reported its earnings at the end of a quarter, treat the repo as a sale for accounting purposes (a treatment that was reversed shortly after the earnings report). As a result, at quarter-end, when it reported its numbers, it was able to temporarily show lower debt on its balance sheet. The accounting method is legal, but you are supposed to disclose that you are using it. Because Lehman did not disclose this practice, it gave the market the impression that its debt was lower than it actually was.

Are these tactics common among failing banks?

This in general is called window dressing, which companies use not just in cases of bankruptcy. For example, mutual funds have to disclose their holdings at the end of the year. Suppose they hold some small, risky stocks that they do not want people to know about. Often what they do is sell the stocks in December and buy them back in January. This might contribute to the so-called small-stock effect, where small-stock returns are lower in December and higher in January. That is a form of window dressing. In terms of bankruptcy, what firms like to do is look for ways to make their performance look better than it actually is. Lehman, for example, had a lot of bad assets on its balance sheet, so it came up with a proposal to put all the good assets into one company and leave the bad assets in another company, which it then tried to sell off. This was something it called Spinco. Lehman tried it, but it did not succeed. It was not illegal per se, but it was a way to manage the balance sheet. Another tactic that companies use in general, not just financial companies, is stretching out payments so that they can retain cash. Say you have to pay your supplier on a certain day, but you choose to delay payment another thirty or sixty days. You will be able to retain that cash on your balance sheet.

How has the Lehman Brothers episode affected public policy?
One of the important policy lessons from the crisis, and specifically from the failure of Lehman Brothers, was that we have to have a robust system for resolving complex global financial institutions. There are many policy initiatives to do that, either already implemented or well on the way to being implemented. There are two types of policies: those that make a bank safer and reduce its chances of going into distress, and those that address what should be done afterwards. In terms of pre-distress policies, one provision is enhanced oversight and supervision, where the Fed and other supervisory agencies are in the banks, looking at their books very carefully and thoroughly. This way, any lurking dangers will be discovered earlier rather than later. Another measure is the administration of stress tests. In these tests, the Fed gives the banks "stress" scenarios in which the economy and financial markets are doing poorly, and has each bank estimate the impact on its "stressed" balance sheet. Essentially, they are trying to estimate what the capital loss to these banks would be in such a distressed scenario, and to ensure that even under those circumstances the bank will have sufficient capital to withstand financial distress. The first stress test was done during the crisis, but there is now a policy of conducting these tests on a regular basis. In addition, large banks are now required to have a living will, or resolution plan, which they can implement quickly. Suppose that on a Friday a bank gets bad news and is forced to go into bankruptcy over the weekend. It is required to have a plan available whereby it knows exactly how it will be resolved by Monday. This is an ongoing process, and banks are drawing up plans and submitting them to the Fed and the FDIC for review. The final part of this is how the bankruptcy of large, complex financial institutions will be resolved. Lehman's resolution was done under the Chapter Eleven bankruptcy provisions that most companies use. Alternatives to Chapter Eleven have been proposed, specifically tailored to large, complex financial companies like Lehman, which have holding companies with many subsidiaries spread out across different countries.
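As a rough numerical illustration of the Repo 105 mechanics Dr. Sarkar describes above, consider the following sketch. The figures are entirely hypothetical and are not Lehman's actual balance-sheet numbers; the point is only how the sale treatment changes the reported leverage ratio.

```python
# Hypothetical illustration of Repo 105 accounting (all numbers invented).
# A repo recorded as a secured loan keeps the pledged securities (and the
# matching liability) on the balance sheet; recorded as a sale, the
# securities leave the balance sheet and the cash proceeds can be used to
# pay down other short-term debt, shrinking total assets at quarter-end.

def leverage(assets, equity):
    """Simple leverage ratio: total assets over equity."""
    return assets / equity

equity = 25.0        # $ billions, hypothetical
assets = 700.0       # $ billions, hypothetical
repo_amount = 50.0   # securities "sold" under Repo 105 just before reporting

# Standard treatment: transaction stays on the books as a secured loan.
reported_as_loan = leverage(assets, equity)               # 700 / 25 = 28.0x

# Repo 105 treatment at quarter-end: assets shrink by the repo amount.
reported_as_sale = leverage(assets - repo_amount, equity)  # 650 / 25 = 26.0x
```

With these invented numbers, the same firm reports 26x rather than 28x leverage, even though the transaction is reversed days after the earnings report. The treatment itself was legal; the problem, as the interview notes, was the lack of disclosure.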

Deutsche Bank

Where many different minds meet. At Deutsche Bank, a diverse culture is not just desirable: it’s essential. Collectively, our breadth of ideas, skills, and perspectives helps us to deliver better solutions for our clients every day.

Deutsche Bank Securities Inc., a subsidiary of Deutsche Bank AG, conducts investment banking and securities activities in the United States. Deutsche Bank Securities Inc. is a member of FINRA, NYSE and SIPC. Copyright © 2013 Deutsche Bank AG.

2013 Fall Princeton Financier  
