Technophilic Magazine -- Fall 2012

Page 1


[ McGill Chapter ]
Azfan Jaffeer / Committee Chair
Rachael Kim / Chapter Editor-in-Chief
David Bailey / Publications Director

[ Contributors ]
Surabhi Joshi, Georgi Kostadinov, Manosij Majumdar, Christopher Mitchell, Elisabeth Perron

[ Interviewees ]
Dr. Ronald Chwang, Dr. Eric Schadt

[ Advisory Board ]
Dr. David Lowther

[ Image Credits ]
• flickr.com/halderman/4545253985/
• flickr.com/opensourceway/4371001458/
• flickr.com/tueksta/839920747/
• gizmodo.com/19437193
• flickr.com/emsl/6297163307
• geeky-gadgets.com/wp-content/uploads/2012/07/ImWatch.jpg
• flickr.com/kryptyk/125244997

[ Sponsors ]

[ Technophilic Magazine Inc. ]
Technophilic is published by Technophilic Magazine Inc. every semester for McGill’s Engineering Undergraduate Society.
Robert Aboukhalil / Editor-in-Chief
Daisy Daivasagaya / Executive Editor
Jimmy E. Chan / Business Development
The opinions expressed herein reflect the opinions of their respective authors and may not reflect those of Technophilic Magazine Inc., McGill University or our advertisers. All articles here are licensed under the Creative Commons License. ISSN 1925-816X

Contents
Editor’s Desk: The Pursuit of Significance ... 3
Behind the Scenes: Startups & VCs ... 4
Is the Internet Made of Blue Cheese? ... 7
Copyright ... 8
Mathematics and its Infinite Wonders ... 11
Center Spread: The Internet Map ... 12
Pulling the Plug ... 14
Self-Teaching a Love of STEM: A Personal Tale ... 16
Q&A with Dr. Eric Schadt ... 18
Where ‘Them Girls At? ... 21
The Future of Wearable Gadgets ... 22
Jokes ... 23


/3

EDITOR’S DESK

The Pursuit of Significance by Rachael Kim

Like a roller coaster ride, research work in a lab has its ups and downs, but it is worth every second. Doing research means working on problems never investigated before and making discoveries that ultimately benefit society. Have you ever wondered why we stay relatively the same size and do not grow indefinitely? Our cells undergo programmed death—yes, planned suicide—called apoptosis. If you have a chance to sneak into a 200-level biology course, I can attest that the gruesome details of this suicide cascade can be covered in one lecture. An hour to describe how our cells die—easy! Little did I know that I had failed to appreciate the time and effort that go into every discovery. Only this summer did I start to realize how meticulous a scientist ought to be in order to produce appreciable results.

I will be honest: I got sucked into the research frenzy whirlpool among numerous pre-med students. In early May, I walked into the respiratory physiology lab filled with anxiety and excitement, not knowing what to expect. Luckily, I was given a small project of my own concerning nanotubes: thin tubes between cells that connect their cytoplasm. The tubes are microscopic, averaging about 30 μm in length (Fig. 1). The cytoplasmic connection allows the exchange of intracellular content. In fact, a recently suggested mechanism for the rapid spread of HIV involves nanotubes: it has been shown that the tubes help minimize the virus’s exposure to extracellular antibodies. The study of nanotubes is still in its infancy.

Figure 1. Nanotubes connecting cells could be a route for HIV transmission (Sowinski et al., 2008).

I examined the tubes between airway smooth muscle (ASM) cells and immune T-cells. My goal was to investigate genes potentially involved in the formation of the tubes projecting from the T-cells. The professor had a few candidate genes in mind—mostly growth factors. Our working hypothesis was that the ASM cells produced secretory factors, which would then bind to their receptors on the T-cells, stimulating tube growth. Using the qPCR machine, I measured the expression levels of the candidate genes in the ASM cells before and after they were co-cultured with the T-cells. A good chunk of my first two months was spent culturing the cells, learning the techniques and avoiding the minefield of blunders.

« I would recommend, to any student who is remotely interested in research, to give it a try. »

There was a great deal of repetition as well. Slowly but surely, I was able to gather my sample size of five and single out one gene, Fibroblast Growth Factor 2 (FGF2), for further investigation. After consulting a paper, I purchased a drug to block the receptor for FGF2, hoping to see a decrease in tube formation after administering it. To quantify the tubes, only the T-cells were stained with a non-specific green fluorescent dye; any green ASM cells would imply a tubular connection. After a week and a half of culturing and staining procedures, I looked at my slides with high hopes. To my great dismay, all the ASM cells were stained green, and it was impossible to quantify their number (Fig. 2). At first, I thought it must have been due to human error. Despite a month of tweaking the protocol, I could not keep the entire ASM population from being stained green. At this point, I became quite frustrated and exhausted.

Figure 2. Staining with a green fluorescent dye spread to all cells.

After several repeated failures, the grad student and I brainstormed possible explanations for the unforeseen spread of the dye: either the dye spread through the tubes among the ASM cells, or the T-cells underwent necrosis, releasing the dye into the cytoplasm. We decided to take another path to quantifying the tubes and began staining for apoptosis (Fig. 3). This new method offered an easier system for counting the tubes and assessing the viability of the T-cells.

Figure 3. Apoptosis stain.

Even though my project stands inconclusive, the summer internship was rewarding despite its challenges. I would recommend, to any student who is remotely interested in research, to give it a try. This summer gave me the most interaction with a professor I have had since entering university. Although the professor-student barrier still remains, it was a great opportunity to become acquainted with a professor. I dreaded the weekly lab presentations, in which I was required to communicate my findings; my results were regularly dissected under the critical eyes of the graduate students and the professor himself. In retrospect, however, the presentations gave me good experience in speaking to the scientific community.

The experiments rarely proceeded ideally, and I came to accept unexpected results as an opportunity to think of alternatives. Most importantly, even if a project does not yield useful results, it is imperative not to take the outcomes personally. I was hard on myself for the insignificant results and became discouraged by the lack of productivity impeding the project’s progress. With the new staining method implemented and the final stages of the project in sight, only time will tell whether the FGF2 gene is involved in nanotube formation ■
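The article does not spell out how qPCR readings are turned into "expression levels"; the standard approach is the ΔΔCt (Livak) method, which compares cycle thresholds (Ct) against a reference gene. Below is a hypothetical Python sketch; the gene roles and Ct values are invented for illustration, not taken from the author's experiment:

```python
# ΔΔCt (Livak) method: relative gene expression from qPCR cycle thresholds.
# A lower Ct means the machine detected the transcript earlier, i.e. more of it.

def fold_change(ct_gene_treated, ct_ref_treated, ct_gene_control, ct_ref_control):
    """Fold change of a gene of interest after treatment (e.g. co-culture),
    normalized to a reference (housekeeping) gene measured in the same samples."""
    d_ct_treated = ct_gene_treated - ct_ref_treated   # normalize treated sample
    d_ct_control = ct_gene_control - ct_ref_control   # normalize control sample
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)                              # assumes ~2x per PCR cycle

# Invented Ct values for a candidate gene before/after co-culture,
# normalized against a housekeeping gene:
print(fold_change(22.0, 18.0, 24.0, 18.0))  # 4.0, i.e. ~4x up-regulated
```

A fold change near 1 would mean the candidate gene's expression was unaffected by co-culture.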


RONALD CHWANG PHOTOGRAPH BY JIMMY E. CHAN, TECHNOPHILIC MAGAZINE


Q&A / 5

STARTUPS & VCs: BEHIND THE SCENES
Q&A with Ronald Chwang

Ronald Chwang is a seasoned entrepreneur and venture capitalist. After obtaining a B.Eng. from McGill in 1972 and a Ph.D. from USC in 1977, he worked several years at Acer before moving to the venture capital industry in 1988. He is now Chairman and President of iD Ventures America, a VC firm based in Silicon Valley. Last May, Dr. Chwang received an honorary degree from McGill’s Faculty of Engineering, and we got a chance to interview him before his commencement speech.

CAREER PATH

When you started your Ph.D., did you know you would go to industry upon graduating?

No. In fact, I did a Ph.D. because I wanted to figure out whether I wanted to pursue a career in academia or industry.

« So I had a decision to make: do I stay at Intel doing advanced product design, or do I take a big risk and join an early-stage startup and be unsure of what the future has in store for me a year down the road? I went for the startup »

By the time I started my research work in the mid-1970s, silicon integrated circuits (ICs) really took off, and I was immediately attracted by that direction. By the time I was writing my Ph.D. thesis, I knew that I would probably pursue a career in industry. My thesis was a very theoretical analysis of semiconductor device behavior, and I wanted to know how to do IC design.

One of the job offers I received was to do IC design at Bell Northern Research, so I went back from California to Ottawa, where I designed one of the components of the first digital switch machine.

What did you do after Bell-Northern?

Intel started a research center in Portland, Oregon and recruited a lot of the research scientists from Bell-Northern Research, so I decided to join Intel, where I got to work on their commercial products. I was there for about 6 years.

But in 1983-1984, Taiwan started to become a hub for the semiconductor industry, and they encouraged people with that background to go back to Taiwan and start semiconductor companies. So I had a decision to make: do I stay at Intel doing advanced product design, or do I take a big risk and join an early-stage startup and be unsure of what the future has in store for me a year down the road?

I went for the startup, which was called Quasel, as Chief Engineer. We wanted to build dynamic memory (DRAM), but I was very naïve at the time and didn’t realize that there are cycles to any industry: by the time we could produce our components, DRAM chips were selling for less than we could produce ours, so the company didn’t succeed. This was in 1986.

After that, I gathered a small team and started a small company that moved away from commodity components such as DRAM chips (because it’s a difficult market) and instead focused on programmable chips.

How did you go from that to becoming an investor?

Almost a year into that startup, one of the investors of Quasel, Acer CEO Stan Shih, offered to merge my company with Acer, where I ran the Acer R&D Labs. That’s when I started my corporate executive career. About six years later, I came back to Silicon Valley as President and CEO of Acer America, where I also got exposure to the field of investment because I felt that it was a rising trend.

In 1997, we set up Acer Technology Ventures (ATV), a venture fund that invested in very early-stage startups. That’s how I evolved from a corporate executive to an investor. And I’ve been an investor ever since.

What do you think made you successful?

I was willing to take risks because I wasn’t afraid of failure. I told myself, “If I fail, I will have learned something”.

STARTUPS & VCs

How does venture capital work from an investor’s point of view?

If I raise $100M from other investors, usually that money has to be invested in 5 to 7 years; you can’t sit on the money. So you have to use up the money during that period. How successful you are depends on how good a return you get. The successful VCs then start to get more money, so their size gets bigger and bigger. Some large Silicon Valley funds are now half a billion dollars; that’s a huge amount of money.

That’s a good thing, right?

It depends. Each VC has so-called venture partners, and a large VC may have 10 or 20 partners that look for companies to invest in. But if you have $500M overall to invest, each person needs to invest $25M in a short period. So what happens is that you tend not to invest in companies that require small amounts of money, which is bad news for early-stage companies.

Since VCs now generally look for very large deals, that has changed the original nature of the VC industry. Today, those early-stage startups are picked up by angel investors or seed-stage/early-stage funds. Those early-stage companies will also typically get more one-on-one mentorship.
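The deployment arithmetic in Dr. Chwang's answer can be made concrete in a few lines of Python. The figures below are the interview's illustrative numbers (a $500M fund, 20 partners, a 7-year investment window), not data about any particular fund:

```python
fund_size = 500_000_000  # "some large Silicon Valley funds are now half a billion dollars"
partners = 20            # a large VC may have 10 or 20 venture partners
deploy_years = 7         # the money has to be invested in 5 to 7 years

per_partner = fund_size // partners      # capital each partner must place
per_year = per_partner / deploy_years    # rough annual pace per partner

print(f"Each partner deploys ${per_partner // 1_000_000}M total, "
      f"about ${per_year / 1_000_000:.1f}M per year")
# Each partner deploys $25M total, about $3.6M per year
```

At that pace, writing many small seed-sized cheques is impractical, which is exactly the squeeze on early-stage companies the interview describes.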


6 / Q&A

But the way people get funding is constantly evolving. Outside of VCs, there are also crowdfunding websites such as Kickstarter that are good places to start.

What makes you invest in a startup?

Three things: idea, market and team. Beyond the idea, I consider whether it covers a reasonably large market. The other aspect I consider is the team. Are they passionate about what they do? Experience is not as critical for young entrepreneurs; most importantly, do they have the ability to learn quickly and build a solid team around them?

Do VCs fund new ideas or better approaches to the same idea?

Some investors prefer investing in companies that do things better, faster or cheaper, while others prefer companies that do things that have never been done before. I prefer the latter.

Do you prefer when the founder stays in the company once the investment is made?

It really depends. There are two kinds of founders: those who are very technical and only want to see their idea come to fruition, after which they want to return to doing more fundamental research. In that case, we can find someone who is capable of managing the company. Then there are founders who want to expand their knowledge of how to build up their business. When you start, ideally you’ll have a founder who has a technical background and who will eventually become a capable CEO. That’s the ideal case. But often, a company will be started by 2-3 individuals; among them, maybe one will be more technical and another more business-oriented. That person usually ends up running the company.

LOOKING BACK...

Have you ever invested in a company and later found out you made a big mistake?

Oh yes! Typically, VCs invest in a number of portfolio companies. Our business model is that, out of 10 companies, there will be a spectrum of success: a couple will be very successful, 2 or 3 will be good, and the rest will fail. And even the very successful ones are rarely overnight successes. Companies often find themselves redefining their business model along the way, and a good VC always tries to help companies find a way around those challenges.

« The opportunities of today are much broader than 30-40 years ago and I don’t think there is one definite path. I think what students need to do is to find out where their passions lie. »

Some undergraduate students are nearing graduation and aren’t sure whether they should go on to grad school or industry. Do you have any advice for them?

The opportunities of today are much broader than 30-40 years ago, and I don’t think there is one definite path. I think what students need to do is find out where their passions lie. For example, if you’re interested in doing research and experimentation, there are two paths. One is to continue on the academic path by going to graduate school, while the other is to conduct research in industrial labs. Today, the latter is less of an option; back in my day, there were many research institutions, like Bell Labs, RCA Labs and Xerox PARC, dedicated to tackling research questions. But today, this research is mostly attached to large corporations and has become more targeted at commercial applications, although a few companies, like Intel and Google, still do advanced research. Once you get into that industry environment, I think it’s difficult to go back to academia. On the other hand, if you do a postdoctoral fellowship and maybe even become an assistant professor, you probably still have the chance to go back to industry.

How do you remember your time at McGill?

I always felt like the time I spent at McGill was one of the best moments of my life. I really enjoyed the city, the school, the people I met and the professors I worked with. Later on in your life, you will find that what matters is not only what you learn in class and in exams, but also the people you meet and the culture you’re exposed to, because each of you comes from a diverse background and has a different way of thinking ■


/7

Is the Internet Made of Blue Cheese?
by Robert Aboukhalil

It’s often the simple questions that baffle us the most. For a society entrenched in technology, it seems silly to even entertain the question of what the Internet is made of. Why, of course, it’s… it’s made of my favorite websites? Although we are painfully aware of the times when we cannot connect to “the Internet”, what are we connecting to, exactly? Where is this Internet located, and what is it made of?

The short answer is that the Internet is really a network of billions of computers around the planet that are connected together. When you connect to the Internet, you’re simply joining this network of computers. Every computer in this network is assigned a unique address so that they can talk to each other. This address is called an IP address (IP stands for Internet Protocol). For example, Google.com’s IP address is “173.194.35.18”. You can type those numbers and dots in your browser and you will land on Google’s website (try it!).

From URLs to IP addresses

Memorizing telephone numbers is difficult enough that we shouldn’t have to memorize the IP addresses of websites. To solve this problem, there are computers on the Internet called DNS servers (DNS stands for Domain Name System) whose job it is to translate website URLs into IP addresses: when you type ‘google.com’ in your browser, your computer contacts DNS servers, which then redirect you to the computer identified by the IP address ‘173.194.35.18’. Think of it as getting into a taxi and asking to go to the train station when you don’t know the exact address; before driving you there, the taxi driver must translate your vague request into an address.

But what’s the formula for translating ‘google.com’ into ‘173.194.35.18’? In reality, there is none; it’s much simpler than that. The numbers are completely unrelated to the website itself; the DNS servers simply maintain tables that map domain names to IP addresses. If you enter an invalid website domain, the DNS server will not have the listing for that website and will contact other DNS servers to see if they can map the domain name you provided. After a while (in practice, only milliseconds), the DNS server gives up and returns an error message.

What about my IP address?

I mentioned earlier that all computers on the web are assigned an IP address. By that definition, your computer is also assigned such an address when you connect to the Internet. To find out yours, go to Google.com and search for “my IP address”.

« Your IP address is public information, much like the house number on your mailbox. Displaying your house address does not make the lock on your front door any less secure »

A common misunderstanding is that your IP address should not be shared with others; otherwise, the hackers will get you. The truth is that your IP address is public information, much like the house number on your mailbox. Displaying your house address does not make the lock on your front door any less secure. Similarly, your computer is set by default to refuse such connections from the outside—your doors are locked.

Is it ‘Internet’ or ‘World Wide Web’?

You may have noticed that I’ve used the terms ‘Internet’ and ‘web’ interchangeably. The sad truth is that those two terms mean very different things. The term ‘Internet’ refers to the network itself, whereas the ‘web’ is an application that runs over the Internet. The languages and conventions used to communicate over the Internet are called protocols. For example, the web uses the HTTP protocol, while you need the SMTP protocol to send e-mails.

Some history never hurts

The first use of the word ‘Internet’ was in 1974. In the early days, when the creators of the Internet thought about how best to implement this network, they opted for a design that favored flexibility. In his monumental 1988 paper “The Design Philosophy of the DARPA Internet Protocols”, David D. Clark explains the decision to adopt the ‘end-to-end principle’, which argues that the network should be kept as simple as possible by implementing functionality at the ends of the network, rather than making such functionality a defining part of the network itself and thereby restricting the Internet’s future growth. It was only in 1990 that Tim Berners-Lee and Robert Cailliau invented the World Wide Web (yes, it’s only been that long). Initially, it was meant as a medium for physicists to share papers and data, and it allowed browsing the information using ‘hypertext links’ ■
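The URL-to-IP translation described in this article is easy to observe for yourself. Here is a minimal Python sketch using the standard `socket` module, which asks your operating system's resolver (and, behind it, DNS servers) to do the lookup. Note that the address returned for a site like google.com varies by region and over time, so don't expect the exact numbers quoted above:

```python
import socket

def resolve(domain):
    """Translate a domain name into an IPv4 address by querying the
    system resolver, which in turn consults DNS servers."""
    return socket.gethostbyname(domain)

# 'localhost' is a reserved name that always maps to your own machine:
print(resolve("localhost"))      # 127.0.0.1

# For a real website, the answer depends on where and when you ask:
# print(resolve("google.com"))   # e.g. 173.194.35.18
```

If the name cannot be mapped, `gethostbyname` raises `socket.gaierror`, the programmatic equivalent of the error message the article mentions.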


8/

FEATURE ARTICLE

Copyright. by Manosij Majumdar

The copyright debate has become one of the foremost policy tangles of our time. What used to be an arcane legal matter has grown into an issue that affects wide swathes of the population. Copying used to be the province of the well-equipped few, but owing to the digitization of media and rapid advances in information technology, it is now a diffuse, democratic enterprise. There is no other legal infraction that can be committed with as little expense. Ctrl-C, Ctrl-V. Mischief managed.

A challenge of this magnitude obviously inspires an overwhelming response on the part of those who stand to lose from it. Multiple pieces of legislation seeking to defend and advance the interests of copyright-holders end up violating the civil rights of citizens, creating further resistance and animosity. One particularly delusional example is the Belgian music royalty collective, which tried to ban the reading of books to children in a library. While there is a case to be made for the rights of copyright-holders, it does not justify the heavy-handed approach taken so far, and while there is a monetary loss associated with piracy, it cannot be worth spying on citizens or limiting their access to information. Laws that permit Internet disconnection without trial or recourse essentially revoke a citizen’s right to a fair trial and are especially repugnant. Murderers and rapists and child molesters get a better deal than that. Since when is stealing a ninety-nine-cent song a more serious crime than taking or destroying a life?

Given the destructive stalemate that we find ourselves at, let’s look at the merits of both sides of the debate. Piracy does hurt artists and other rightsholders, but not to the extent advertised. Many people who pirate would not have paid for the work in question in any case: some because they lack the means, and others because they don’t deem that particular work worth the purchase.
They do not represent ‘lost sales’, because the sale would not have taken place anyway. Suing teenagers and grandmas creates an atmosphere of intimidation and may deter some novice distributors, but it only causes the more able ones to dig their heels in and seek better tools. The deterrence is temporary, but the escalation is permanent and more damaging in the long run.

The focus should instead be on people willing and able to pay who pirate instead. What factors are keeping them from paying money?

One factor is the expectation of ownership. Consumers buy media in new formats with the same set of expectations that they had with the old formats. A book can be lent, annotated, carried and read aloud to a group. So why should one not be able to lend a music file or play a movie in public? Computer files, of course, can also be copied, meaning you can enjoy a song at home and in your car after giving a copy to your friend. There is no single, exclusive instance to lug around. In economic terms, not only are the goods not rivalrous, the sharing mechanism is anti-rivalrous. Media companies’ expectation that consumers will interact with these files in a way artificially delimited by the law is optimistic at best and naïve at worst.

Attempts to stop the copying of files or restrict their mobility have resulted in annoyance for customers but no profits for the companies. No customer wants to be forced to buy a song for every device that might play it (her computer, her car, her wife’s computer ....), nor do they want to be frustrated at every turn when going about their life. Is putting up barriers really going to convince consumers of the rightness of the companies’ position and make them switch to legal sources and toe every onerous line? Experience would suggest not.

Convenience is another factor. The gallon of gall rightsholders have been pouring on their users often backfires, and badly. There are people willing to pay proxy services to scramble their IP so they can share and distribute files with impunity. They are willing to pay someone else so they don’t have to pay the studios. This may look like sheer bloody-mindedness, but it isn’t. It’s rational. After the limitations on sharing, copying, distributing, recording and playing placed on legally acquired files, is it any surprise that consumers would rather acquire them illegally, where they come with none of these obligations and truly feel owned rather than borrowed with the grudging assent of a feudal lord? The content industry has actually managed to make its products less attractive and convenient to buy, own and use than to steal and deal. This is poor business strategy and a failure of epic proportions.

/9

If the content industry, including music and movies (and ebooks), wants to compete against illegal copies of its works, it will need subtler methods than the sledgehammer of regulation and takedown notices. Its profits are important to it, and that’s understandable, but if those profits can only be preserved by spying on citizens and limiting civil rights, few people would choose the profits of one industry over the rights of all citizens. No, the industry needs to reform its business model and make its content easier to access and own legally.

Consider movies. A person wanting to see a new movie legally has to go to a theatre at a specified time, pay for a ticket, buy extortionately marked-up snacks and drinks, sit continuously for the duration of the show, then drive back home after it. If they have children, they have to find and pay a babysitter. If they have work or school the next day, they can’t make time today. The real price of a movie experience is much higher than the number on the ticket: it includes time, energy and stress. The illegal route is to download the movie and maybe hook up the computer to the TV. Then they can watch it on their own time, invite friends, pause, rewind, skip, discuss, be loud, go to the washroom.

What about discs, one might ask. Well, what about them? A movie comes out on DVD or Blu-ray months or years after the theatres, long after anybody is interested. A bootleg can be available on the Net the day after a movie comes out. A disc needs to be bought from a physical store, or ordered online, with the associated delay and bother. Every movie online can be found readily using one search field. A DVD only works in its own region. A pirated copy is a joy forever. The gap in convenience is huge.

The same is true for TV. Why would anyone want to shell out for a channel package just to watch the one show they want on one channel? The same goes for buying music. Buying music at a store is bothersome.
Amazon and iTunes make it somewhat easier, which is why consumers have willingly spent millions acquiring songs through them. In 2008, Apple’s iTunes was the number-one music retailer in the US (ahead of brick-and-mortar Walmart and Best Buy), and 15 billion songs had been sold on it by 2011. About half of US teenagers did not buy a single CD in 2007, and digital downloads accounted for 30% of all sales in January 2008. What would have happened to these sales without legal online downloads? Would the people who bought them have schlepped themselves to a store for discs, or downloaded them illegally? Convenience matters. Availability matters. Ownership matters.

Louis C. K. made a comedy special and put it online for $5, marketing it directly to his fans. He recouped his costs in twelve hours and made a handsome profit. Observe the beauty of the situation: his fans paid less than what a studio would usually charge them for such a video, and the artist made more than what a studio would have paid him for it. His own comments on the situation are worth reading.

« The content industry has actually managed to make its products less attractive or convenient to buy and own and use than to steal and deal. This is poor business strategy and a failure of epic proportions »

Studios like to dangle starving artists as a moral sop to guilt people into paying, and yes, not everyone has Louis C. K.’s fame or fan base, but what this shows is that the studios are becoming unnecessary (redundant would be another good word) to both sides of the equation. What studios need to realise is that neither consumers nor artists are beholden to them, their business model or their distribution networks anymore. They are free to communicate with each other directly, so the studios must adapt and innovate to stay relevant at all.

Why not release movies online at the same time as theatres, with access codes sold for a few dollars each? It’s not as if the studios get to keep the entire price of the ticket anyway. By cutting out middlemen like the theatres and distributors, they’ll be bringing home the same profits as before, maybe more. There will still be a theatre-going audience paying for the larger screen, while many casual viewers who would not have bothered to watch it except on a laptop on their couch will be captured.

Once a person buys a movie, it should be theirs to keep. For life. There may be strong copy protection, but no one who manages to circumvent it should be prosecuted. Studios should remove legal intimidation from their arsenal and limit themselves to a technological arms race, unassuming and impersonal. No system will ever be completely uncrackable, but it might be just enough to make the effort not worth it. At no point should a business sue its customers or support legislation that limits rights. Questions of morality aside, doing so is always counterproductive. Nobody roots for a bully.

Strong anti-piracy legislation in France has led to a drop in piracy, but has not eliminated it completely. I imagine it’s the less savvy pirates who have been scared away, and some of that drop in traffic is simply due to pirates going further underground. The more interesting fact is on the flip side: digital downloads in France are up, but despite the halving of piracy, conventional music sales are down. I know Hollywood doesn’t always prefer subtlety, but can they not see the writing on the wall?

If you continue to deny access to your products for months, then limit that access with geographical restrictions, then sic your lawyers on pre-teens who crack it, you will gain no sympathy and earn only distrust and distaste and disaffection, not profits.

So much for movies. What about software? Microsoft turned a blind eye to pirated copies of Windows early in its history because it knew doing so would make Windows ubiquitous and the network effects would assure its dominance. Clever, but not everyone can afford it. Smaller game and software houses are especially hurt by pirating, and they don’t always have the firepower to fight back. Lawsuits do happen, and the companies even win them, but not before spending valuable resources and losing goodwill.

Fortunately, there are better ways of giving an edge to paying users. Software can include support and value-added services, and games can include in-game purchases. Rovio, the creator of Angry Birds, has decided to forgo the anti-piracy litigation route entirely, recognising such efforts as “futile” and seeing piracy as a marketing opportunity. The point is that there are methods, both technology- and business-model-based, that achieve desirable results for all parties without causing ill feelings or requiring litigation, or worse, legislation. A dollar spent on game development has better returns than one spent hunting down the pirates of the last game. In the long run, these methods achieve more with less effort. They are efficient.

Ebooks are a more complicated matter. Books as a saleable commodity have a longer history than musical recordings,


let alone moving pictures. As Cory Doctorow puts it in the preface to his Makers, books have been owned for far longer than they have been published. Yet ebooks are not owned but merely, effectively, leased. Amazon, for example, was once able to pull an ebook from its collection. Not only did the title disappear from Amazon's online store, which is the equivalent of a brick-and-mortar store discontinuing a product, and perfectly cogent, but it disappeared from the Kindles of Amazon customers as well. Simply vanished. Here today, gone tomorrow, with all their careful annotations.

And which book was this? Nineteen Eighty-Four. Oh, beautiful irony. Imagine the power to make every copy of a book disappear from everywhere. Now imagine that power in the hands of a censor, or an extremist, or simply a malfunctioning piece of code or an inept employee. In a world where ever more books are electronic, the consequences would be devastating. Human civilization is built on the written word. The word 'Bible' literally means 'book' in Greek, sharing a root with 'bibliography' in English and 'bibliothèque' in French. The first word of the first verse of the Qur'an revealed to Mohammad was 'iqra': "read!" The most influential ideas in human history were put down in books: Principia Mathematica, The Origin of Species, The Wealth of Nations, The Communist Manifesto. People of my generation, the 'digital natives', still identify with books. We are the Potter generation (and we look down on the Twilight generation). Any technology that gives a few people the ability to pull the rug out from under human civilization deserves strict scrutiny and strong controls. An ebook should belong to the person it is sold to, with the seller having no ability to influence it after the sale. Ideally, it should be sold without the buyer's information being linked to it at all. Back-ups could be stored in an encrypted form with a user-generated key. Why would anyone buy legal copies? For one, they might come with discount codes for hard copies, or sequels, or related movies. Again, the providers have to realise that it's their duty to entice and convince the customer, and yes, that sounds like work, but that's what business is. The customer is free to download an illegal copy. It's even easier than for movies or music. The files are smaller. The entire Harry Potter series is a 7 MB file. It doesn't download; it practically apparates.


All this downloading and distributing may not be legal or ethical, but businesses hung up on that point are missing the point by a parsec. Copying and sharing is so straightforward and effortless that complaining about the law or morality is immature. One may as well complain about gravity. Admiral Yamamoto warned against Pearl Harbor by saying that fighting the United States would be like fighting the entire world. Here's a thought for the content industry, their lawyers and their pet legislators: fighting the Internet is exactly like fighting the entire world. And every day the Internet is growing, in places where your precious little intellectual property laws are mocked and ignored. It is becoming less manageable, more widespread, more multilingual. Adapt, or die, because consumers are not obliged to ensure your profits if you fail to construct a functioning business plan. It is your business, not theirs. It is no longer about who is right, but who is left. Arguing from a moral soapbox will leave you broke. This is a practical matter demanding practical solutions. Art existed for millennia before corporate houses exerted control over it, and it will outlast them, long after their own myopic, mulish denial of reality has turned them to dust ■




Mathematics and its Infinite Wonders by Surabhi Joshi

There are many of us who haven't had the most pleasant of experiences with mathematics. It is a subject dreaded by many, who avoid numbers like the plague. I recall, as a kid, being first exposed to the idea of 'x' and firmly believing that this letter had absolutely no place among numbers. Many of us go on to fear calculus, which becomes synonymous with boredom and worthlessness. The following comic, sent to me recently by a friend from calculus class, sums up what most of us feel about integration and differentiation:

There is nothing wrong in feeling this way. It is probably a little reassuring to note that several prominent personalities shared the same sentiment. Mark Twain wasn't too fond of statistical concepts: lies and damned lies are what he believed them to be (Frederick Mosteller, however, would argue that 'it is easy to lie with statistics, but it is easier to lie without them'). From the point of view of schools and universities, it is very easy to play the blame game and throw around phrases like 'it has to be taught better' or 'the curriculum needs to be reformed'. I agree the system and the way the syllabus is assigned might have their flaws, but the approach we take on an individual level has to be adjusted too. And fortunately, there are quite a few individuals and groups working towards the goal of instilling a sense of wonder and praise for this subject that is sadly starting to lose its popularity.

Alex Bellos is one such example. He is an author who published his first book on football and eventually realized that mathematics is his true calling (yes, his football fans were very disappointed). The next book he released has a lovely title and an endearing image that'll hopefully bring a smile to all the Casablanca fans. This book lives up to its name and navigates you through the exciting world of math.

Math: much more than numbers

Chemists will agree that their subject is more than just bonds, atoms and molecules. Chemistry is involved in the study of hormones and the manufacture of explosives. It is used to create dyes, cook delicious food and better our knowledge of what the matter in this universe is composed of. Love (or its illusion), wars, textiles and an understanding of our origins as well as the whole cosmos owe a lot to chemistry. Similarly, math is more than numbers and equations. Probability, geometry, symmetry and statistics are related to patterns, music, poker, and the ups and downs of the stock market (or their illusion). Math and the Mona Lisa are more connected than you think. Psychology and mathematical modeling are well acquainted too. A cardioid is not just a shape whose x and y coordinates are given by a dreaded combination of sines and cosines. It is a rare image seen in a conical container when the conditions are just right and the angle at which the light shines happens to equal the angle of the cone.
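For the record, that dreaded combination is short enough to write down: in polar form the cardioid is r = a(1 + cos θ). A minimal sketch in Python (the function name is mine, for illustration) converts it to the x and y coordinates mentioned above:

```python
import math

def cardioid_point(theta, a=1.0):
    """(x, y) on the cardioid r = a * (1 + cos(theta)),
    the heart-shaped curve seen as a light caustic in a cup."""
    r = a * (1.0 + math.cos(theta))
    return (r * math.cos(theta), r * math.sin(theta))

# Sampling theta from 0 to 2*pi traces the full curve: the far tip
# sits at (2a, 0), and the cusp pinches back to the origin at theta = pi.
points = [cardioid_point(2 * math.pi * k / 360) for k in range(361)]
```

Feed those points to any plotting tool and the coffee-cup shape appears.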

The book “Here’s Looking at Euclid”

Matt Parker is another brilliant entertainer who combines his knowledge and skills as a mathematician with his talent for comedy. He uses stand-up comedy to change the way we look at math and has won several awards as a comedian (or 'number ninja', as he often calls himself). Steven Strogatz, a professor at Cornell University, blogs about mathematics for the New York Times. David Lynch has embraced mathematics too, and was involved in an exhibition at the Fondation Cartier in Paris that finished this March.

A cardioid in a coffee cup

Many might still remain unconvinced and firmly believe that math is worthless to their career, jobs and life, and maybe it is. 'Pointless' is supposedly one of the most tweeted words at the moment, and 90% of those tweets tend to be students saying "this class is pointless". These claims might be true, but there is also something to be said for learning for the sake of learning (and that holds for any class, not just math). CONTINUED ON PAGE 14



Map of Submarine Cables

This map shows the underwater cables that are used to transport telephone communications and Internet data between all continents (except Antarctica).

Explore the interactive map at www.cablemap.info.


So the next time you browse through Alice in Wonderland, remember to derive pleasure from the nonsense literature* used in that masterpiece, much of which was hugely inspired by the conflicting mathematical concepts of the era.

And when you have to undertake a Fourier expansion for your next assignment, spend a few seconds to marvel at the omnipresence of those sines and cosines. If you get exposed to the famous Navier-Stokes equations, remember that many economists spend their careers on relations which at first seem restricted to those studying fluid mechanics and dynamics. This essentially means that if, say, an engineer with expertise in weather modeling meets an economist whose research deals with the analysis of macroeconomic growth and the modeling of economic relations, the two of them might find it very hard to talk qualitatively about their professions, but they will find common ground in the similar technical math that both are exposed to in their respective fields! Hugo Rossi said it best: mathematics is indeed an edifice, not a toolbox ■

* Of course, on the diametrically opposite side, there are truly nonsensical and immensely unfortunate ways in which some use numbers. Numerology is one example. Giving six hundred and sixty-six a bad reputation is another. Using numbers to quantify selected events, names and situations is nothing more than locating coincidences. The highly selective choices via which they try to find meaning or significance in actions, incidents and scenarios make their sample sizes skewed and their methodology unscientific.

References
• "Mathematics Is an Edifice, Not a Toolbox," Notices of the AMS 43, no. 10, October 1996.
• http://www.thegreatcourses.com/tgc/courses/course_detail.aspx?cid=1411
• "Alice's adventures in algebra: Wonderland solved". http://www.newscientist.com/article/mg20427391.600-alices-adventures-in-algebra-wonderland-solved.html
• "Creativity Meets Math at Fondation Cartier in Paris". http://intransit.blogs.nytimes.com/2012/01/02/creativity-meets-math-at-fondation-cartier-in-paris/
• "Voyages to uncharted territories". http://www.lablit.com/article/700
• "The Hidden Math Behind Alice in Wonderland". http://www.maa.org/devlin/devlin_03_10.html
• "Mathematical psychology". http://en.wikipedia.org/wiki/Mathematical_psychology
• "Here's Looking at Euclid". http://www.amazon.ca/Heres-Looking-Euclid-Surprising-Astonishing/dp/1416588256
• "Cardioid". http://en.wikipedia.org/wiki/Cardioid

Pulling the Plug by Georgi Kostadinov

What do you think when someone mentions the word "brain"? The word "internet"? The word "job hunting"? Connections.

Throughout human progress, connections have been among the most pronounced indicators of civilization. In fact, one could not speak of a society if there were no connections to define it: you can't have a society with only disconnected members. Society evolved when humans decided to group together in cities, and it evolved again when they decided to make connections between those cities. Greek and Roman societies are perhaps best characterized by their unprecedented ability to connect cities over unimaginable distances, both by land and by sea. Some of the roads and bridges built by the Romans to hold their empire together still stand today. But I won't be talking about the Romans, as interesting as they might be. There is much to discuss about connections. There's a whole industry devoted to telecommunications, and millions are spent each year studying ways to improve how we communicate. So let's focus on one small aspect of connections, that of communications, and concentrate on the evolution of cell phone technology. How did cell phones come about? How do they work? What's in store for them?

The idea of a mobile phone certainly wasn't new. If you're a fan of war movies, you must have seen radiophones used to communicate the movements of the enemy. Walkie-talkies were already widespread by the time the first mobile phone was developed, and there had also been mobile phones for use in cars. You know, those fancy ones you see in limousines. As is often the case with modern inventions, the mobile phone was not developed in isolation. The one to patent the portable mobile phone, however, was Dr. Martin Cooper of Motorola, in 1975. Kind of sad for Bell Labs, since most of the concepts were initially developed there. The first commercial cell phone network was put in operation in Tokyo, Japan, in 1979. The first in the US took about 4 more years.

Brief intermission. We all certainly have this unquenchable thirst to communicate, so why did it take so long? What was the difficulty? Many people fail to realize how hard it is to get spectrum. In actual fact, you can't just go around broadcasting on whatever frequencies you want. The use of frequencies is government regulated, and believe it or not, spectrum is a very scarce resource that costs quite a bit of money. Even in recent years, you can hear about spectrum allocation: in 2008, there was a lot of noise about it in India, and just recently, AT&T bought some for $1.9 billion. End of intermission.

The communication system used for cell phone communication in the early eighties was AMPS (Advanced Mobile Phone System). It was analog. The problem with analog is that every call takes up a band of frequency for its communication channel. So, if you have limited frequencies, there is a limited number of channels you can use. Frequencies are expensive, so if you run out of channels, you're stuck. You can't get more subscribers, so you can't make more money.
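The channel arithmetic behind that ceiling is stark. Here is a back-of-the-envelope sketch in Python (the 12.5 MHz and 30 kHz figures are the commonly cited US AMPS numbers, quoted from memory, so treat them as illustrative):

```python
def max_channels(band_hz: int, channel_hz: int) -> int:
    """With analog FDMA, each call occupies one fixed-width channel,
    so band width divided by channel width caps concurrent calls."""
    return band_hz // channel_hz

# Each US AMPS carrier held roughly 12.5 MHz per direction, divided
# into 30 kHz voice channels: on the order of 416 simultaneous calls.
amps_calls = max_channels(12_500_000, 30_000)
```

No matter how many subscribers sign up, an analog carrier cannot exceed that quotient without buying more spectrum, which is exactly the business problem described above.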


In the nineties, several new digital standards were established. As with most standards, the world couldn't make up its mind on which one to use. The three major ones were TDMA (time division multiple access), CDMA (code division multiple access) and GSM (Global System for Mobile Communications). CDMA uses a convoluted spread-spectrum encoding scheme whose roots go back to World War II, when the military developed the idea to prevent the enemy from intercepting transmissions. It spreads your data across the band using a pseudo-random code, does the same for everybody else's data with different codes, and since the codes don't line up, the signals don't overlap. These digital technologies are commonly referred to as 2G. Their use of spectrum is much more efficient than analog, and they allow for everyone's favourite, SMS, or text messaging. The next generation, 3G, uses a number of standards, but the most common one is W-CDMA, or wideband CDMA. Nothing much to say here: just another standard, with a few variations. The difference between 2G and 3G is mostly in the transmission speeds, plus some other characteristics that must be satisfied. Chances are your phone right now uses 3G.

The Motorola DynaTAC 8000X from 1983, one of the first "mobile" phones

Speaking of phones, there has been one major shift in recent years: the advent of the smartphone. The standards mentioned so far were mostly developed to accommodate voice traffic. This has changed, as people now use their phones to look at videos of cats. Transmission speed has become a bigger concern, but there is also another aspect. Voice calls usually use something called circuit switching, which requires a certain type of hardware. Data transfers usually use something called packet switching, which requires a different type of hardware. Can you spot the problem? Since the majority of traffic is now of the packet-switched type, there is strong incentive to migrate everything over to the same hardware. You've probably heard about voice over IP (VoIP). That's the general idea.

The newest arrival to the scene of cell phone standards is LTE. It's the 4G everybody's been talking about. The good thing about it is that a lot of people are saying they're going to use it. Who knows? The world might make up its mind this time. Another good thing about it is that it's very fast. We're talking about more than 100 Mbit/s here. And that's where we stand right about now.

Today, the number of our connections is growing at an exponential rate, and it's never been easier to talk to someone on the other end of the globe. We grew up in this new era, the era of cell phones and the internet. And maybe we've become spoiled. I was traveling to Boston a few weeks ago, and my cell phone had no coverage. Scary, isn't it? It's not something that would kill me, but the thought is nevertheless a bit troubling. What would happen if someone pulled the plug? ■

References
• "History of Cellular Phones". http://inventors.about.com/library/weekly/aa070899.htm
• "FCC approves AT&T's $1.9b purchase of 700MHz spectrum from Qualcomm". http://www.theverge.com/2011/12/22/2656283/att-qualcomm-700mhzlte-fcc
• "Spectrum controversy in all its shades". http://www.indianexpress.com/news/spectrum-controversy-in-all-its-shades/386043/
• "United States Frequency Allocations: The Radio Spectrum". http://www.ntia.doc.gov/files/ntia/publications/2003-allochrt.pdf
• "Federal Spectrum Use Summary". http://www.ntia.doc.gov/legacy/osmhome/spectrumreform/Spectrum%20Use%20Summary%20Master-06%2021%2010.pdf



GUEST ARTICLE

Self-Teaching a Love of STEM: A Personal Tale by Christopher Mitchell

Christopher Mitchell is currently pursuing a Ph.D. in Computer Science at NYU. Within the community of calculator programmers, he is a renowned aficionado. He recently published a book for beginners about programming the TI-83/84 graphing calculators, which you can order at manning.com/mitchell. At the age of five, I was obsessed with trains. One day, my mother took me to the New York Transit Museum, which was holding a workshop about electricity, complete with batteries, light bulbs, magnet wire, and compasses. From that day, I knew I wanted to study electrical engineering, and I began teaching myself about circuits, gadgets, and later, programming. I learned several programming languages, designed and built gadgets and hardware modifications, and eventually earned three degrees in electrical engineering and computer science. Relatively early in my programming and engineering career, I started doing some teaching of my own, first online, and later for continuing education and undergraduate classes. I developed a conviction that all students and fledgling coders and engineers deserve the same opportunities to explore and teach on their own that I was given. After my introduction to electronics at the Transit Museum, I sought out electronic design books, asked for Radio Shack's then-admirable Forrest Mims 130-in-One and 300-in-One kits, and taught myself about components, circuits, and even the rudiments of the underlying math. In early elementary school, I learned LOGO and toyed with QBASIC, but I continued to focus mostly on designing and building circuits. Once I received my first graphing calculator, however, my focus began to shift. When I was in seventh grade, I got a trusty TI-83 for Christmas. At first I thought it was little more than a fancy math tool, but as I began to use it more, I discovered it was something else entirely. I found that it had a program editor, and that by putting together a few commands, I could make the calculator do my bidding.
The concepts of programming were not entirely foreign to me, thanks to my earlier exploration of LOGO and QBASIC, but I began to build a much greater breadth and depth of knowledge as I worked with my calculator. I started with simple animations, creating programs that drew and erased characters to create primitive ASCII art. After seeing

a few games on friends’ calculators, I took faltering steps into what I later learned was reverse-engineering. I examined other programs, figured out how they worked, and used my new knowledge to improve and expand my own projects. I got involved in the international TI calculator programming and hobbyist community and published some of my projects. I fielded feedback, compliments, and criticism, and learned to grow as a person, a programmer, and even a marketer of my own work. The TI-BASIC language that I had learned was easy but powerful, and I learned to do a great deal with it. However, I was frustrated to notice that some of the programs I encountered seemed far more advanced and powerful than anything I could make. When I tried to view their source code with the calculator’s built-in editor, I was confronted by a sea of random symbols. I eventually learned that these programs were written in z80 assembly language, created on a computer and assembled into a form the calculator could understand. Over a summer, I began to work with the language, at first painstakingly typing out the hexadecimal for each opcode on my calculator, and later gaining access to a computer to use an assembler. By gradually honing my skills, I learned about the internals of processors, memory, and I/O, skills that matured into a love of low-level programming and hardware design as an undergraduate and graduate electrical engineer. I wrote a graphical shell for the TI-83+/84+ calculator, a mouse-based GUI library, games, a music and video player, and even a decentralized networking protocol. Looking back at all of my experiences with graphing calculator programming, I realize that it reinforced my enjoyment of working with circuits and taught me to enjoy hacking in the positive sense. I enjoy the challenge of making an extremely low-resource device do as much as possible, and pushing myself to complete projects that others might dismiss as impossible. 
Having taken myself from the simplest commands in BASIC to complex hand-coded z80 and x86 assembly, I decided I wanted to share my love of coding (and particularly calculator coding) with the masses. When I was still taking my early steps with TI-BASIC, I founded an online forum and community website called Cemetech ("KE-me-tek"). I used it to publish my own programs and projects, but also to gather skilled and beginner programmers alike, who learned from each other and began to post their own projects. To date, Cemetech has amassed about three thousand users and has incubated software and hardware projects for calculators, computers, embedded systems, and the web. I was invited to teach beginner and advanced Java programming courses for my alma mater's continuing education program. As a graduate student, I have twice taught my advisor's undergraduate students about operating systems, C and x86 assembly programming, and reverse engineering. Their Computer Systems Organization class challenges them to launch buffer-overflow exploits, implement and optimize their own malloc design, and reverse-engineer raw x86, among other labs. I enjoy the challenge of teaching them these low-level concepts, and the sense of accomplishment I feel when a student finally has a moment of understanding makes it worthwhile. I was therefore thrilled to be asked by Manning Publications to write a book about programming graphing calculators. "Programming the TI-83+/84+" is due in print this September, and is written to instill in readers young and old the same love of programming that I developed. I believe that self-education and early exposure to engineering and programming are vital to prompting a life-long love of these fields. Although I believe I had a predisposition towards technical fields and hobbies, the opportunities I was given fueled an early interest that matured as I aged.
In particular, without the ability to program my calculator constantly, whether at lunch, at home, or (perhaps unfortunately) during class and while walking to school, I doubt I’d have the love for and


intuition into programming that I now have. In chatting with many current and ex-calculator programmers, I have heard countless versions of my own story: the self-driven exploration of the calculator's features, the thirst to learn what made programs and games tick, the love of surmounting a good challenge. I think the burden lies on museums, libraries, and even technology companies to be good citizens and make such opportunities available to children and teenagers. A year and a half ago, I wrote an editorial criticizing Texas Instruments, which had taken a nearly Apple-esque position in locking down its new TI-Nspire graphing calculator. Native programming in assembly, C, or even TI-BASIC was impossible on the new calculator, a restriction aimed at placating teachers upset about students playing games in class. Only via third-party hacks could the device be unlocked, and with each new operating system (OS) release, TI squashed the existing unlock exploits. In my piece, I decried TI's attitude with the Nspire as astonishingly short-sighted; it yielded a calculator that would not allow students to explore programming, a stark contrast to the TI-83+/84+ series. Texas Instruments vociferously promotes STEM (Science, Technology, Engineering, and Mathematics) education, but requiring "jail-breaking" just to write usable programs on the calculators showed exactly the opposite attitude. Thankfully, my editorial and other negative press forced TI to partially reverse its decision, and it is now possible to write Lua programs on the calculator. Nevertheless, the Nspire continues to cater largely to a narrow view of the needs of teachers, rather than encompassing the equally important needs of the students who buy and use Texas Instruments' calculators. Perhaps you have a similar story of how you got into STEM fields, where you pushed yourself to learn from existing programs, from taking things apart, and from books.
Even if your knowledge of technology and engineering comes entirely from formal instruction in classes, I believe you can appreciate the value of earlier exposure to the fields in every form, from hackable, programmable gadgets to easy-to-access fora with free expert programming help. I encourage educational and commercial institutions large and small to press forward in educating younger generations and to give them the opportunity to make the same self-driven discoveries that many of us once made ■





Q&A Eric Schadt Eric Schadt is the Director of the Institute for Genomics and Multiscale Biology at Mount Sinai Hospital in New York City. He is also the Chief Scientific Officer at Pacific Biosciences. We sat down with him to discuss his impressive career path. Dr. Schadt went from pure mathematics and computer science to bio-mathematics. His research focuses on generating and analyzing big biological datasets to further our understanding of human disorders. Tell us about your background before college. I have a very odd background. I grew up in a very poor rural area, where education wasn’t really something that was promoted and then went into the military. Through that, I got into college and when I started, it was a very intellectual sort of exercise, trying to discover how smart I was and how far I could push myself. So the combination of CS and pure math was a very natural place to go to push myself.

You started out by studying Math and CS. What got you into biology?

My undergraduate degree wasn't so challenging, so I decided to do my graduate studies in pure math, which I view as one of the most conceptually difficult areas of study. I was going through that, but I always had an applied bent, and in pure math, it's doing math for math's sake: you're not encouraged to figure out whether what you work on would satisfy another area of study. So out of curiosity, I wanted to figure out how everything we're doing fits together, why we're here and so on.

At UCLA, once I reached Ph.D. candidacy in pure math, I made the jump to a bio/math dual Ph.D. program that had the right level of rigor. I didn't want to be a mathematical biologist of the type that are very good mathematicians but don't have a very deep understanding of the problems in biology or how to design their own experiments. I wanted to grasp that intuition; I wanted to think like a biologist.

Are you a biologist who does computation or a mathematician doing biology?

I view myself as a biologist who is heavily computational. Mathematicians I've worked with in the past would not respect at all what I'm doing now as being real math. That always stings me a little bit, because biology has classically not been a very quantitative science, so the kind of math that we do (e.g. Bayesian network reconstruction) is difficult, but it's not what a mathematician would view as the hardest thing!

Did you go to industry immediately after your Ph.D.?

Yes. What I saw while finishing up my Ph.D. was this revolution around technologies like gene chips (used to recognize DNA from samples being tested) and microarrays (used to measure gene expression levels). Many companies said they'd start generating data at massive scales, house it in big databases and mine it, and that was unheard of in biology.

So I was very interested in those technologies and looked around for how I could get access to them. Roche Biosciences was among the first to sign all these big deals with the companies making the technology, and they appreciated the fact that you needed someone much more mathematical to look at the data. So I joined Roche. It was perfect timing.

What attracted you to Roche?

They had access to technologies that none of the universities had because of the outrageous costs at the time. What also drove me to Roche were the big resources and the excitement of having to carry out the right experiments to show proof of concept. Because I was one of the first to apply statistical analysis to gene chips, I gained a certain degree of fame doing that, which caught the attention of the heads of Roche. But all of a sudden I was spending 50% of my time in meetings and fighting for why doing this is important instead of actually doing the science.

What did you do next?

I started talking to Rosetta, a startup focused on building the technologies behind gene chips. After a year and a half at Roche, I went to Rosetta because they were more focused on the science. Also, since it was a startup, there was no bureaucracy or politics. About a year and a half later, Rosetta was bought by Merck. They loved what I was doing and they invested very heavily in that arm for 5-6 years. We did lots of good science and published a lot of papers. By the time I left Merck, we were responsible for about half of all the new drug discovery programs, so we were also delivering on the business side. Why did you leave Merck? Merck eventually became limiting because, as we were learning more about building these Bayesian networks, we wanted to go to the next level. We told them what we thought the next step should be, but the price tag was about a billion dollars. That was too expensive for any one company to fund, so they were thinking more along the lines of turning this area into a pre-competitive space, where companies would be able to share all the data with each other. After a lot of discussion, the co-founder of Rosetta and I left Merck to found SAGE Bionetworks, a not-for-profit research center in Seattle. We focused on


open-access biology: how to facilitate sharing of big data, how to build models and validate them, and how to enable others to interact with those models. What did you do once at SAGE? Once SAGE was set in motion, I joined Pacific Biosciences (PacBio). The idea was that I would set up a new institute in the Bay Area that would focus more on data generation and model building. PacBio knew that going in, and they liked the idea of me spending 75% of my time doing research outside the company because it would cost too much to have that big of a research effort going on internally. And for 25% of the time, I would be the Chief Scientific Officer at PacBio.

«

This is the first time my heavy foot is in academia. I’ve seen both worlds for a long time and I think what academia offers is the ability to be your own CEO, grow out your own program and even though there are funding issues, you have much greater flexibility than in a company. You can make the kind of partnerships you need to leverage what is happening in industry and I view that as a more favorite path. But I will say that the industry path offers, especially to young investigators, clarity of purpose and focus. Going to a startup is an experience like no other. Unlike academia, where you’re able to float and your timelines aren’t so critical, in a biotech startup, you’re living six months to six months. You have money and you see the cliff of when the money will end and if

Biology has to become a more physics-like discipline. If biologists don’t do that, they’ll become irrelevant when Google, Amazon and other computer science powerhouses come in and do it before them.

Although we got offers from UCSF and Stanford to set up the institute there, we needed ~$100M to really make a go at that project and we were having trouble finding enough money to make that project more than just my lab and myself. As I expanded the search for money, I locked onto Mount Sinai because we found donors inclined to give the $100M to do this effort. What do you like best about Mount Sinai? Compared to Stanford and UCSF, Mount Sinai had a reduced bureaucracy. Here, there’s a CEO who runs the hospital and the medical school. It’s a command-and-control architecture that I’m used to from the business side, where it’s easier to see things get done than one where every decision needs a committee. And it is smack down in the middle of a medical center, which will allow us to impact decision making directly in the clinic. That was very attractive. Moving to the East Coast is not something I thought I would ever do but all the pieces fell together!

you should have an expectation of privacy around that. On the other hand, there are things like your face, which can be used to identify you but you have no reasonable expectation of privacy around your face. DNA used to be in the camp of the social security number: of course you want to keep your DNA protected because it defines who you are. What’s changing now, however, is that the technology is becoming so amazing that, in 10 years, sequencing your genome will become as easy as taking a photograph? When that happens, there will still be the personal identifiable issue but the expectation of privacy will go away, because how can you have expectation of privacy around something that is as easy as taking a photograph. That is the transition we’re in. Of course you can take steps to protect the information but there are limits to what kind of privacy you can expect. Educating the population and legislators about that is critically important. The next step is to make laws that prevent discrimination based on that data. What’s the biggest change you’d like to see in biology?

you don’t meet milestones, you’re going off that cliff and you’ll have to fire half the people in the company. That drives you to form bonds and work as a team to accomplish things far bigger than you could ever do. What you learn from that is invaluable. The other advantage that I learned at Merck is: If what you’re working on is in the critical path of a company, the scale of resources you can get to carry out your vision is an order or two of magnitude greater than what you can ever get funded to do in academia, especially if it’s something new and risky. What are the privacy issues with DNA sequencing becoming more popular? We always want to protect data that can personally identify us but there’s another component of that which is the expectation of privacy. For example, your social security number can identify who you are and

If biology wants to go to the next level in achieving an understanding of all the complex things we see, it needs to become much more quantitative and informationdriven. Biology has to become a more physics-like discipline. If biologists don’t do that, they’ll become irrelevant when Google, Amazon and other computer science powerhouses come in and do it before them. They won’t wait for the biologists to give them permission to analyze that data so if biologists aren’t there to work with them, they’ll be supplanted. I don’t think that’s extreme when you consider competitions where solving a biological problem gives you a $20,000 or $50,000 prize. If you look at who’s on the top of the leaderboard, none of them are biologists. Last time I checked, the top of the leaderboard was an accountant from Australia who knows nothing about biology ■

Did you know?

Do you have advice about choosing between academia and industry?

We accept articles year long.

I’ve always had a foot in academia and another in industry. Before joining Mount Sinai, however, the heavier foot was always in industry.

Send your articles to

articles@technophilicmag.com


/ 21

Where Them Girls At? by Elisabeth Perron I won’t lie, I chose engineering as my field of study half out of interest and half because of my desire to break the mold. As the years have gone by, I’ve been collecting both knowledge and experience, hoping to one day write some revealing exposé about women in engineering and cause the whole education system to reform. Well, I’m not there yet, but here’s what I’ve gathered so far... First, some numbers: Did you know more women than men attend university? In fact, women make up approximately 55% of the undergraduate students in Canada, yet only about 20% of these women will go into engineering or applied sciences (see http:// ewh.ieee.org/soc/es/Nov1999/10/BEGIN.HTM). That means 11% of the student body is female scientists and engineers. Now remembering that this 11% has to get split up between the various departments and fields… well, that means very few girls per class.

«

After hearing the facts, I asked the simple question, “But why?” There is a general consensus (but no proven research) that girls aren’t good at math. That science is for boys. That we should just stick to the

scientists all over Europe would be interested to have her work with them. Nope. Her only job offer was at a gas lamp factory. With a Ph.D.! Equipped with ridiculous amounts of perseverance and intelligence, she accomplished all her goals, from becoming a full-fledged professor of physics at the University of Berlin (the first woman to do so) and a famous researcher in Europe. But her journey was not an easy one. She was not allowed to attend classes at certain universities, it took years before her admittance to a prestigious research institute despite stellar work, and she was even basically robbed of a Nobel Prize when she discovered nuclear fission with Otto Hahn but only he received the coveted award. It’s a difficult story to hear, but quite inspiring,

began working at Xerox as an intern, and slowly rose to the top. Her biggest accomplishment is not a complete revolutionizing of engineering or an invention or a Nobel prize. Instead, she is the first African-American woman to head a Fortune 500 company. And also the first woman to replace a woman as CEO of a Fortune 500 company. So for that, we definitely give her a good clap. Emily Warren Roebling Ever heard of a little thing called the Brooklyn Bridge? The civil engineer who designed it, John A. Roebling, was injured shortly after beginning the construction (and eventually passed away because of complications). Before dying, he handed the project over

Women make up approximately 55% of the undergraduate students in Canada, yet only about 20% of these women will go into engineering or applied sciences. That means 11% of the student body is female scientists and engineers. »

other stuff. Not science! Not engineering! The problem actually begins much before high school students fill in their university applications. It’s in our homes, in our elementary/ junior high/high schools, and in our society. It’s a bit discouraging, knowing that the whole system can’t be improved just like that and that girls are going to believe these things, even if they’re not true. Obviously, I’m not famous enough yet to tweet about it and have the whole world react. But I prepared a little something for the girls: So ladies. To help motivate you, to help you tap into your inner feminist (and scientist!), to show you how you could change the world, I found a few inspiring women who fearlessly tackled applied sciences. Lise Meitner To kickstart the list, let me introduce Lise Meitner, who obtained a doctoral degree in physics in 1905, the second to do so at the University of Vienna. After completing such a prestigious program, you would assume

in the way she forced her way into the field of physics and never gave up. Also her cool factor goes up quite a bit when you find out element 109, meitnerium, is named after her. Stephanie Kwolek Stephanie Kwolek, a Polish-American chemist, invented poly-paraphenylene terephtalamide. Or Kevlar, if you don’t recognize the scientific name. She worked at DuPont, a famous chemical company, to raise money to go to medical school, but ended up liking it enough to drop her idea of being a doctor in favor of a career as a chemist. In the experiments her team conducted, waste was generated and discarded. But our girl here used her passion for science in convincing someone to test the waste. And boom, Kevlar was born. Ursula M. Burns I feel like by now, you’re all hoping for a more recent tale of girl power. Well meet Ursula M. Burns, Madam Chairman of Xerox Corporation. Our girl Ursula rocks a B. Eng. in Mechanical Engineering, as well as a Master of Science in the same field. She

to his son, Washington, who coincidentally enough became ill and developed a debilitating condition. In order to save the project, his wife Emily Warren Roebling stepped in to oversee the work. During her time as the technical leader, Emily studied everything from higher mathematics to cable construction to bridge calculations. Not only did she bring the project to completion and essentially become a civil engineer in the process, she fought to have her husband remain the official chief engineer on paper (thus giving him most of the credit). Next time you happen to be in New York, now you’ll be able to appreciate this bridge so much more knowing that a woman with no background in science tackled and completed successfully what many consider the most difficult civil engineering project ever. So next time you walk into your classroom, I recommend sitting beside a girl. She might just be the next great scientist/ engineer, and trust me, you’re gonna want in on that ■


22 /

FEATURE ARTICLE

The Future of Wearable Gadgets by Surabhi Joshi

Tablets and smartphones are everywhere. It is hard not to be linked to your email 24/7 if you own an iPhone, a BlackBerry, or an Android phone. However, things might get quite literally more intimate with the advent of ‘wearable devices’, which are perhaps the next big thing and may well replace the excitement that currently surrounds smartphones and tablets. A few of these gadgets, some already in use and others likely to become more common in the near future, are listed below.

Wristbands clad with sensors. Nike has released its ‘Fuelband’, which can track the calories you burn and the steps you take (via accelerometers). You can even use it as a watch. However, not all movements can be quantified by this wristband (yoga and weightlifting, for example). Also, it is not waterproof (only water-resistant), and there is no GPS. If all you desire is a simple form of motivation (once you start wearing it, you might be more inclined to climb the stairs than take the elevator, just to enjoy that inspiring increase in the number of steps) or a way to quantify your calorie budget, this gadget is a useful purchase. However, for fitness freaks (i.e. those who love to analyze the nutrition facts on the bottle of 1% milk they try not to drink every morning), the Fuelband might not provide an accurate measure of all the calories burned during your workouts, in which case it might be worth exploring other products or waiting for a new version (and maybe, as a bonus, you’ll get one with a better catchphrase than “Life is a sport. Make it count.”).

Smart sneakers. Adidas adizero F50 shoes feature an intelligent sneaker design that can track your workout statistics and speed. The outsole features a cavity that stores the miCoach SPEED_CELL™, which computes important yardsticks such as speed, average and maximum speed, distance and number of sprints, all of which are stored in on-board memory and can be viewed later on your laptop or tablet.
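As a rough illustration of how a sensor wristband can turn raw accelerometer readings into a step count (a toy sketch only, not Nike’s actual algorithm; the threshold values here are made up), a naive threshold-crossing counter might look like this in Python:

```python
import math

def count_steps(samples, threshold=1.2, rest=1.0):
    """Count steps in a stream of (x, y, z) accelerometer samples (in g).

    A step is registered each time the acceleration magnitude rises
    above `threshold`; the counter re-arms once it falls below `rest`,
    so one long spike counts as a single step.
    """
    steps = 0
    armed = True  # ready to count the next spike
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if armed and magnitude > threshold:
            steps += 1
            armed = False  # ignore the remainder of this spike
        elif magnitude < rest:
            armed = True
    return steps

# Two simulated strides: a quiet ~1 g baseline punctuated by two spikes.
walk = [(0, 0, 1.0), (0, 0, 1.5), (0, 0, 0.9),
        (0, 0, 1.0), (0, 0, 1.6), (0, 0, 0.9)]
print(count_steps(walk))  # prints 2
```

Real trackers do considerably more (filtering, adaptive thresholds, per-activity models), which is one reason a device like the Fuelband struggles with motions such as yoga that lack a clear periodic spike.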
Smartwatches. Android-based watches show vast potential in navigation, health, social networking, and media-related fields. Sony has built one such watch, which at the moment is strictly an accessory, bought by those who hate missing phone calls and/or desire a reasonably priced gizmo. Still, with improvements and added features, these are bound to become more common.

Coffee card. Starbucks has embraced the ‘cool factor’ by participating in trials of WIMM’s smartwatch, through which customers can make purchases using a bar code connected to their coffee account, a welcome micro-app that is also sure to intensify ‘the Starbucks effect’ among smartwatch-using coffee drinkers (The New Yorker’s definition of the Starbucks effect: a long line for a product you’re not sure you want).

Project Google Glass looks very promising, and the video released earlier this year created quite a sensation, letting us sample the future of communication, navigation and information.

Other examples of wearable devices include dual-focus contact lenses with data displays, LARK sleep sensors, phone call alerts via a vibrating tattoo, and fabric that reacts to emotions. It has been forecast that these types of wearable devices will become commonplace in the near future, especially once the major platforms (Google, Microsoft, Amazon, Apple, and Facebook) develop further interest. One can see the huge potential these wearable gadgets and embedded devices (such as flexible displays and surfaces that can be transformed into screens) will have in store for the future. Some might just offer the convenience of having all the desired features in one portable device, several might take entertainment to a whole new level (as proven by the announcement of SmartGlass), while others could prove extremely valuable in avoiding grave circumstances (for example, wristbands that can detect, and some day perhaps even predict, seizures) ■


JOKES / 23

XKCD Comics

by Randall Munroe, XKCD.com, CC License



