CIO | June 2014 | Wired World











AUTO

Hands Off the Steering
Volvo has begun testing cars that—with the push of a button on the steering wheel—take over the car’s acceleration, braking and steering.

DEVICES

Brainy Machines
Future computer systems could better tackle big data and autonomous systems. Researchers at Sandia National Laboratories are working on a computer that can handle real-world situations in real time while running on the same power as a 20-watt light bulb. The only “machine” that can handle those functions today is the human brain. That’s why scientists are trying to build a computer system that works more like a brain than a conventional computer.

“Today’s computers are wonderful at bookkeeping and solving scientific problems often described by partial differential equations, but they’re horrible at just using common sense, seeing new patterns, dealing with ambiguity and making smart decisions,” said John Wagner, cognitive sciences manager at Sandia National Laboratories, in a statement.

Scientists at Sandia, one of the US Department of Energy’s major research and development laboratories, are working on neuro-inspired computing as part of a long-term research project on future computing systems. Neuro-computing systems are expected to be much better suited to big data problems, which the US government, along with major enterprises, is working on. The systems should also be better at handling remote autonomous and semiautonomous systems that need greater, and different, computational power.

Today’s machines take a command from the program and data from memory to execute the command, one step at a time. The architecture of neuro-inspired computers, however, is expected to be fundamentally different: These future machines would be designed to unite processing and storage in a single network architecture. Scientists have tried to mimic the brain’s neural connections before, but there is much more excitement about this project because of the advances being made in the field.

— By Sharon Gaudin
— By Lucas Mearian

JUNE 15, 2014 | REAL CIO WORLD

VOL/9 | ISSUE/08


TRENDLINES

Volvo’s self-driving cars use radar and cameras to monitor traffic and infrastructure around the vehicle, and laser technology to monitor the surrounding environment. Each car also uses a private cloud map service covering the roads it travels so that the vehicle’s computer always has the latest data. Volvo Car Group’s “Drive Me” project expects to have 100 self-driving cars on the road, with the first prototypes already driving around the streets of Gothenburg, Sweden. The roads being used by the test vehicles are typical commuter arteries, including motorway conditions and frequent queues.

The prototype vehicles now being tested still require the driver to supervise the car’s performance, since the vehicles are in the test phase. “Our intention is that in the final product a driver can actually release the steering wheel without having to supervise, so that he or she can do something else with their time,” said Erik Coelingh, Volvo’s technical expert for active safety (braking, accelerating and steering).

Volvo said that what makes its autonomous vehicle project unique among others—such as Google’s self-driving cars—is that it involves all the key players: legislators, transportation authorities, a major city, a vehicle manufacturer and real customers. Gothenburg has about 500,000 people. Volvo’s customers will drive the 100 cars in everyday driving conditions on approximately 30 miles of selected roads in and around the city.

“That Volvo Cars’ hometown, Gothenburg, becomes the world’s first arena for self-driving cars in everyday driving conditions demonstrates both our technological leadership and Sweden’s dedication to pioneering the integration of self-driving vehicles,” Coelingh said. “We do analysis on traffic safety. We know from this analysis that almost all collisions that occur are caused by human error. If you automate driving, you take away the causes of many accidents and you can make traffic safer.”

The public pilot, Coelingh said, will provide Volvo with valuable insight into the societal benefits of making autonomous vehicles a natural part of traffic.






SOCIAL MEDIA

Archive Tweets for Safety
The National Archives revealed that it has started archiving tweets and YouTube videos published by UK central government departments from their official accounts. The new online social media archive contains over 7,000 YouTube videos and over 65,000 individual tweets from UK central government departments currently on Twitter and YouTube. The archived content covers major events in recent history, including the birth of the royal baby, Prince George of Cambridge; the London 2012 Olympic Games; the Queen’s Diamond Jubilee; and Andy Murray winning Wimbledon in 2013.

The executive agency of the Ministry of Justice said the announcement marks the culmination of a complex project to capture social media as part of the UK Government Web Archive and permanently preserve it as the official public record. The National Archives worked with the Internet Memory Foundation to develop tools capable of capturing records in their original published context before making them permanently accessible through the UK Government Web Archive.

Clem Brohier, interim chief executive and keeper at The National Archives, said it was “imperative” for the agency to develop systems to support social media archives because social media now plays an important part in government communications, with Twitter being used to clarify policy and YouTube to promote initiatives. Since 2003, over three billion items published online by the UK Government—including web pages, documents and interactive games—have been archived by The National Archives. The agency said it will continue to capture and archive tweets and video feeds from UK central government departments on a regular basis, and will develop the social media archive so that it is fully integrated into the rest of the UK Government Web Archive.

— By Sam Shead


INTERNET

When Fake News Makes News
A suspected Iranian hacker group seeded Facebook and LinkedIn with bogus profiles of attractive women and even created a fake online news organization to get digitally closer to more than 2,000 people it wanted to spy on. Once the group had befriended its targets through the fake profiles, it emailed them malicious links, according to a report titled “The Newscaster Threat,” released by iSight Partners, a security consultancy. The more than 2,000 people targeted included U.S. military members, U.S. lawmakers, journalists based in Washington, D.C., U.S. and Israeli defense contractors, and lobbyists for Israel, iSight said in its report, which it did not publicly release. The


group is suspected of being based in Iran, given its working patterns and the location of its command-and-control infrastructure, said iSight’s McBride. The hackers slowly bolstered fake but credible-looking online personas on social networks, said Steve Ward, iSight’s senior director of marketing. Profile photos, often of attractive women, were copied from images found elsewhere online. The credentials of some fake personas were also embellished on a fake online news website called “NewsOnAir.org,” which was still online as of Wednesday night. The site copied news stories from legitimate publishers such as Reuters, the BBC and the AP. The victims were usually receptive to the social media invitations after

seeing that the fake persona was already connected with existing friends. Although the group used some malware, its primary method for compromising victims was simply tricking them into divulging login credentials for Web-based services. The attackers would eventually approach a target with, for example, a message containing a link to a YouTube video. The victim would first be directed to a fake Google Gmail login page in an attempt to gather the person’s credentials before being redirected to the video. In other instances, the attackers spoofed the Web-based login pages of corporate e-mail systems.

— By Jeremy Kirk














Mike Elgan

CULTURE

Foursquare's new Swarm app, announced about a month ago, creates a feed of your family and friends based on groups. For example, it can tell you that Steve and Janet are about 500 feet away, and that John, Mary and Jerome are about a mile away. Swarm is similar to Facebook's Nearby Friends, which also launched recently.

Fudging the Numbers
Twitter recently rolled out a series of design changes to the Twitter.com website. Two of these changes replace specific numbers with approximate ones. Since the beginning, Twitter has displayed the number of tweets you've sent, the number of people you're following, the number of people following you and other information. Recently, however, it has started rounding these numbers down. For example, Twitter shows me that I've sent 23.8K tweets and have a very similar number of followers: 23.4K. Even more vague: Twitter now shows tweets in larger type if they've gotten more engagement. It grades on a curve, so "more engagement" means more than your other tweets, not more engagement than other users' tweets. How much more engagement? And how is engagement defined? Don't worry your pretty little head about it. It's deliberately vague information.
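Twitter doesn't publish its rounding rules, but the rounding-down behavior described above can be sketched in a few lines of Python. The function name and thresholds here are my own guesses, not Twitter's actual implementation:

```python
def abbreviate(n: int) -> str:
    """Truncate a count to one decimal place with a K/M suffix,
    e.g. 23,812 -> '23.8K'. A sketch of the style Twitter uses;
    the exact rules are not public."""
    for threshold, suffix in ((1_000_000, "M"), (1_000, "K")):
        if n >= threshold:
            # int(...)/10 truncates rather than rounds, so counts
            # are always rounded DOWN: 23,899 -> 23.8K, never 23.9K
            value = int(n * 10 / threshold) / 10
            # %g formatting drops a trailing .0 (2.0K -> 2K)
            return f"{value:g}{suffix}"
    return str(n)

print(abbreviate(23_812))      # 23.8K
print(abbreviate(23_407))      # 23.4K
print(abbreviate(512))         # 512
print(abbreviate(2_000_000))   # 2M
```

Rounding down rather than to the nearest value matters: a user's displayed count never appears to jump backward as the real count fluctuates.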

Hiding Details


Another trend is hiding the exact URL of the page you're looking at in your browser. I noticed some time ago that Apple's Safari for iOS replaces the actual URL in the address box with just the name of the website. When I'm looking at one of my columns on the Computerworld website (Computerworld is a sister publication of CIO) using any desktop browser, for example, the URLs the browser displays are pretty long and tend to contain information like the headline and the page number. Pretty standard. But using Safari for iOS on my iPad, the address box shows me not the URL, but simply (and vaguely) computerworld.com. Google is reportedly working on a similar feature for the desktop version of Chrome, in its beta-channel build code-named "Canary."
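The reduction Safari performs can be approximated with Python's standard library. This is a simplified sketch, not Apple's actual logic; real browsers also handle internationalized domains and highlight only the registrable part of the hostname:

```python
from urllib.parse import urlsplit

def display_location(url: str) -> str:
    """Reduce a full URL to the bare hostname, the way Safari for iOS
    presents the address box. A sketch: also strips a leading 'www.'."""
    host = urlsplit(url).hostname or url
    return host[4:] if host.startswith("www.") else host

url = ("http://www.computerworld.com/article/2476678/"
       "emerging-technology/some-long-headline.html")
print(display_location(url))  # computerworld.com
```

Everything after the hostname (path, headline slug, page number) is exactly the "de-humanizing package of information" the trend hides from the user.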

Why Vagueness Benefits Users In every case—Foursquare, Facebook, Twitter, Safari for iOS and Google Chrome "Canary"—the companies have access to perfectly specific data and could easily show it to you. But as a service to you, as a user benefit, they're presenting you with vague information in place of specific information. Why is vagueness a user benefit? Simple: Vagueness is humanizing.


I'll give you an example. People in real life don't say: "Wow! I just spent one-hundred and ninety-seven dollars and forty-two cents at Costco." They say: "Wow! I just spent a couple hundred bucks at Costco." People round numbers, guesstimate how long things will take, and speak in generalities. And they do it on purpose. Vague information is easier to receive and comprehend.

As technology grows more central to our lives, the specificity of information provided by the machines we use becomes a source of nagging stress. Companies are finding small and subtle ways to humanize technology by making the information presented to us vague, rather than specific.

In the cases of Foursquare and Facebook, the idea of broadcasting your exact location feels de-humanizing. But revealing your approximate location feels nice. I'm giving information in a human way: Couched in generality, as well as personal relevance. In the case of Twitter, rounding numbers and separating high-engagement tweets from lower-engagement ones by size subtly reduces the information overload of the wall of information on a Twitter profile. The trend toward showing only the basic name of a website, rather than the complete URL, as in Safari for iOS or Google

Chrome "Canary," has a security dimension to it. If the address box shows you the whole URL, your eyes are likely to glaze over and your attention won't focus on it. A URL is a de-humanizing package of information. A simple domain name, however, can show you at a glance whether you're at a real site or a spoofed one. Humanizing your Web location through vagueness allows your mind to engage with, and thereby benefit from, the location information presented.

I'm convinced that these examples are only the beginning. I think we'll see a growing and broad effort by technology companies of all kinds to introduce vagueness everywhere they can. When will vagueness become mainstream? And how far will companies take the trend, exactly? The answers are: Pretty soon and pretty far. If that's too vague for you, you're welcome. CIO

Mike Elgan writes about technology and tech culture. Send feedback on this column to editor@cio.in






Rob Enderle

INNOVATION

paid to get. Every year I watch massive amounts of money being spent to create innovation using a practice that virtually always kills it. The proper way to handle this process was created at IBM, not Dell, but Michael Dell, to his credit, saw the value and brought it to Dell. Dell is currently, in my opinion, doing the best job of getting value out of its acquisitions. And it is a relatively simple process: Identify what is of value inside the acquired firm (what you paid to get) and protect it. Layer on resources that enhance the firm, and don't ram the successful small firm into the bigger entity. This practice has resulted in a significant return on most every (I'm hedging because I don't know of one that failed) acquisition Dell has made since implementing it.

Skunk Works
You don't see this much anymore, but this is basically where a firm creates a separate entity, fills it with out-of-the-box thinkers and removes much of the compliance structure. The closest thing to this in the market that I know of (these efforts tend to be very secret) is EMC's Pivotal software effort. Rather than reinventing EMC, it spun out much of its software into a separate company, which could operate more like a startup. Dell took the whole company private, but we'll get to that in a bit. The idea is to remove the things that prevent innovation, and staffed correctly, skunk works projects tend to result in some amazing things. Mostly this process is used to create unique weapons; some of the most insanely wonderful fighter aircraft came out of programs like this.

I wasn't at the earlier competition, but I was at the final this week and was there when the winner was announced. The firm AnesthesiaOS (a solution for anesthesiologists that I think should be required by law) won the event over firms providing solutions for networking pharmacists, connecting home caregivers to critical information, a mini IBM Watson (not from IBM), and a solution that better connected patients to their own information. My personal favorite was Mana Health, which seemed to connect all of the information you'd need (including wearable devices) into a service that could improve your life. Oh, and perhaps the most innovative was the Blue Marble Game Company, which used games to improve both your health and the effectiveness of your caregivers. For a relatively small amount of money (the cost of a trial that would become a showcase for Dell and Intel technology), the sponsors got access to a massive amount of innovation that their firms will later benefit from.

Finding a way to continue to innovate like startups is critical to the long-term survival of most companies. Innovate or die should be a pretty powerful battle cry.

Innovation Day
This program, which Dell showcased this week with a focus on healthcare, pits a bunch of small companies against each other, all chasing the funding for a large-scale trial. In this case, the competition (which seems modeled after a reality show) starts in three cities, and two finalists are selected from each by segment experts, including representatives from both Dell and Intel (which co-funded the program). A seventh company is selected from the pool as a wild card. The presentations are analyzed based on capability and on the impact a large trial might have on each firm's success. The winner gets a funded large trial and, if that is successful, a springboard into being a full-sized and fully capable company in the segment (and the possibility of being further funded and acquired). I should point out that there really aren't any losers in the final set, because the judges tend to shepherd them personally toward success, just not as quickly.


Fixing the Problem
At the core of this issue, however, is the fact that innovation is hard to accomplish in big public companies. Dell is addressing part of this by going private, because the excess focus on quarterly results is certainly a big part of the problem. Finding a way for companies to continue to innovate like startups is critical to the long-term survival of most, and it would be well worth everyone's time to figure out fixes, perhaps using some of the methods I've identified, to offset the problem. Innovate or die should be a pretty powerful battle cry. CIO

Rob Enderle is president and principal analyst of the Enderle Group. Send feedback on this feature to editor@cio.in
















Case File | CEAT

business growth to handle the additional 60 percent data volume and 20 percent more ERP users. At the same time, the business required high availability and 100 percent uptime with faster processing power. And that depended solely on CEAT’s ERP system. The tales of the company’s sluggish

ERP didn’t really come as a rude shock to Bhalivade. He knew that the system was dragging the company’s business down, which is why he had already planned and budgeted for a system upgrade project. But Saha’s mail set the alarm bells ringing. “We thought if it has to be done

“Earlier, it took us 72 hours to execute the discount process. With in-memory, we are able to complete it within 10 minutes.” Niranjan Bhalivade, CIO, CEAT

then let’s do it now; the sooner the better,” says Bhalivade. But was upgrading server memory really the answer? For one, it wouldn’t be a self-sustaining move, and it wouldn’t shrink response times much. “If it took seven minutes for an invoice to be generated now, then it would take five minutes after the server upgrade. The time taken for generating an invoice would not reduce drastically,” says Bhalivade. Worse, after six months, it would require another upgrade, and that means the company would have to dole out more cash.

That wasn’t Bhalivade’s biggest problem. What he wanted to do was ambitious and bold: Cut the time to execute reports from seven minutes to 10 seconds. It was clear that a mere server memory upgrade wasn’t enough to do the deed. This prompted Bhalivade to look beyond the tried-and-tested, and that brought him to a fairly new but powerful technology: In-memory ERP. “We decided to take a big leap and run our ERP from memory itself. This means it doesn’t have to go back to the hard drive, search the data, process it, and then give it back to the user. It also ensures that all calculations happen on the fly,” he says. That’s exactly what CEAT wanted. An excited Bhalivade rolled up his sleeves. It was time to get down to business.
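The case study doesn't name CEAT's platform, but the disk-versus-memory trade-off Bhalivade describes can be sketched with SQLite, which supports both on-disk and pure in-memory databases. This is an illustrative toy, not the ERP system itself; the table name and sizes are invented, and on a small, freshly written dataset the OS file cache can mask much of the difference:

```python
import os
import sqlite3
import tempfile
import time

# On-disk table standing in for ERP invoice data (names/sizes invented).
path = os.path.join(tempfile.mkdtemp(), "invoices.db")
disk = sqlite3.connect(path)
disk.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, amount REAL)")
disk.executemany("INSERT INTO invoices VALUES (?, ?)",
                 [(i, i * 1.5) for i in range(100_000)])
disk.commit()

# In-memory copy: the whole table lives in RAM, so queries never
# have to go back to the hard drive to search the data.
mem = sqlite3.connect(":memory:")
disk.backup(mem)

for name, conn in (("disk", disk), ("memory", mem)):
    start = time.perf_counter()
    total, = conn.execute("SELECT SUM(amount) FROM invoices").fetchone()
    print(f"{name}: sum={total:.0f}, {time.perf_counter() - start:.4f}s")
```

At ERP scale the gap widens dramatically, which is why moving the working set into RAM, rather than buying a faster disk path, is what collapses a seven-minute report toward seconds.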

Picking Up Pace
Bhalivade wasn’t dispirited by the enormity of the project. In fact, the level of difficulty egged him on to prove his team’s mettle. That’s why he was undaunted by the fact that he had to migrate 1,000 users to in-memory ERP in just 100 days, or that the project had to go beyond borders to include users in Sri Lanka and Bangladesh. Bhalivade constituted a 33-strong project team and organized a three-day internal training program. His objective was to shrink the time taken for report execution from seven minutes to 10 seconds. He gave clear instructions to his team not to transport any program that took even 11 seconds in the pre-production system without his approval. It had to be




















CIO Career

conditions are ripe for conflict. But many IT leaders ignore the danger. CIOs who would never put a networking novice onto an important infrastructure project assign people with limited human dynamics know-how to projects where culture clash is likely. “Then we’re surprised there are so many problems,” McGee says. Dave Kelble, director of IT for the Abramson Center for Jewish Life, which provides residential and non-residential services to seniors at its 72-acre campus, considers organizational culture to be such an important topic that after getting an MBA in information systems, he went back for a master’s in organizational dynamics.

“It gives you a perspective you don’t get in business school or technical school,” he says. “I’ve found in the past the ROI calculations don’t necessarily get a project accepted. You have to work with people to implement new technology.”

‘That’s How We’ve Always Done It’ Whenever you hear phrases like “That’s the way it is around here” or “That’s how

we’ve always done it,” you’re dealing with corporate culture. Tread carefully: Cultural impulses aren’t always logical, and there’s always more to them than meets the eye. “I’ve been burned by culture occasionally,” says Stephen Balzac, president of consulting

firm 7 Steps Ahead and an adjunct professor of industrial organizational psychology at Wentworth Institute of Technology in Boston. Before his current roles, he spent 20 years as a software engineer, and that’s when he learned the hard way about corporate culture. In one memorable case, he was brought in to help a bioengineering firm revamp its operations. Begun in a garage, the company had grown quickly and now had large corporate clients. Its habit of releasing software rapidly and then fixing bugs as they cropped up had become a liability. “We had to turn into a professional software company,” Balzac recalls. The company’s leaders told him they felt their all-day meetings were a time suck. So Balzac set about replacing the meetings with other forms of communication. He was then asked, “Why are you getting rid of the meetings?” “Because you hate them,” Balzac replied. “But they work!” came the response. The company’s culture was so entrenched, Balzac realized, that even traditions that were unnecessary and unpopular couldn’t be removed without trauma. “I learned to back off a little,” he says. He instituted changes more gradually. And he gave management ample opportunity to try doing things the old way and confirm that it wasn’t working before introducing a change. Balzac sees culture as something akin to the body’s immune system: It accepts what it recognizes and rejects the unfamiliar, useful or not. “Think of Apple with John Sculley,” he says. “The whole company acted like it had a bad case of the flu.”

What Are Your Values?
In many cases, examining the culture will reveal the true values of the organization. At kCura, for example, the culture is “team-oriented and personal, and we don’t have a lot of politics,” says CIO Doug Caddell. A provider of e-discovery software, Chicago-based kCura has about 360 employees. It’s been growing rapidly, and Caddell says the company’s culture helps foster growth. “It’s a competitive advantage,” he says, “and we see that when we’re recruiting: kCura





Kelble was hired with a mandate to upgrade the Abramson Center’s IT architecture, something the center’s leadership knew was needed. “So far, even though it’s a non-profit and budgets are tight, they’ve listened to what I have to say,” Kelble says. In fact, he notes, “I’ve been here just over two years, and I’ve made more infrastructure changes that will be capitalized over three to four years than I did in the five years I was at the other company.” That has changed Kelble’s approach to the healthcare industry as a whole. “I take a longer view,” he says. “Although the other company was also in the healthcare field, I look at what’s happening in healthcare much more than I did before,

as well as what’s going to happen five and 10 years from now. I ask how I can build for the future.”

IT Faces a Cultural Challenge IT employees haven’t always been skilled at integrating with the culture of their organizations, experts agree. For one thing, at many companies, there’s a different culture in each business unit, location or functional department, and IT may well have a culture of its own. “IT professionals and business professionals look at things differently, which from time to time will result in a clash,” McGee says. “Good salespeople can be amazing at how they handle people and get stuff done,” says Joe McLaughlin, who worked in sales before

becoming vice president of IT at AAA Western and Central New York. “IT people are not that way.” In part, that’s because of the skills that brought them to technology in the first place. “IT deals with things that have no feelings,” Balzac says. “Because of that, it sometimes pulls in people who are more comfortable with things than they are with people.” Working in IT can magnify this effect. “You’re spending all your time with electrons and not emotions,” Balzac says. “Switching to dealing with people can require effort.”

Another problem is that learning about a company’s culture takes time. Many IT people, already overloaded, may feel they have few spare hours for the “soft” activity of exploring a company’s personality. But that’s a mistake, experts say. “Invest that time, certainly in the beginning, to get immersed in how the organization works,” Kelble advises. “Find the people who get things done, and find out how they do it. If the company has a picnic, don’t show up, grab your burger, and head back to your desk. Become part of it, and learn everything you can about how everyone else does their job.”

McLaughlin says that, for both yourself and your staff, one great way to absorb the company’s culture is to observe others doing their jobs. At AAA, he and the other top executives make a point of spending time in the call center and with the fleet. “You have to become a colleague with your peers. Go hang out in the retail store if yours is a retail operation,” he says. “As an IT person, you always can make the excuse that, ‘I’m here to see how the technology is working for you.’ All of a sudden, you learn things you never would have otherwise, just because you’re there.”

Those things are worth learning. “Culture is a difficult thing to grasp,” McLaughlin says. “There are cultures, and cultures within cultures. Call it whatever you want, but there’s a personality in an organization. If you try to go against it, you do so at your peril.” CIO

Minda Zetlin is a technology writer and co-author of The Geek Gap: Why Business and Technology Professionals Don't Understand Each Other and Why They Need Each Other.























ESSENTIAL technology

Seasons Old and New
"AI is becoming real," says Jackie Fenn, a Gartner analyst. "AI has been in winter for a decade or more but there have been many breakthroughs [during] the last several years," she adds, pointing to face recognition algorithms and self-driving cars.

"There was a burst of enthusiasm in the late 1950s and early 1960s that fizzled due to a lack of computing power," recalls Covington. "Then there was a great burst around 1985 and 1986 because computing power had gotten cheaper and people were able to do things they had been thinking about for a long time. The winter came in the late 1980s when the enthusiasm was followed by disappointment," and small successes did not turn into big successes. "And since then, as soon as we get anything to work reliably, the industry stops calling it AI."

In the "early days"—the 1980s—"we built systems that were well-constrained and confined, and you could type in all the information that the system would make use of," recalls Kris Hammond, co-founder of Narrative Science, which sells natural-language AI systems. "The notion was to build on a substrate of well-formed rules, and chain through the rules and come up with an answer. That was the version of AI that I cut my teeth on. There are some nice success stories but they did not scale, and they did not map nicely onto what human beings do. There was a very strong dead end."

Today, thanks to the availability of vast amounts of online data and inexpensive computational power, especially in the cloud, "we are not hitting the wall anymore," Hammond says. "AI has reached an inflection point. We now see it emerging from a substrate of research, data analytics and machine learning, all enabled by our ability to deal with large masses of data."

Going forward, "The idea that AI is going to stall again is probably dead," says Luke Muehlhauser, executive director of the Machine Intelligence Research Institute in Berkeley, California. "AI is now ubiquitous, a tool we use every time we ask Siri a question or use a GPS device for driving directions."

Deep Learning
Beyond today's big data and massive computational resources, a third factor is pushing AI past an inflection point: Improved algorithms, especially the widespread adoption of a decade-old algorithm called "deep learning." Yann LeCun, director of Facebook's AI Group, describes it as a way to more fully automate machine learning by using multiple layers of analysis, each layer comparing its results with those of other layers. He explains that previously, anyone designing a machine-learning system had to submit data to it, but not before they hand-crafted software to identify sought-after features in the data and also hand-crafted software to classify the identified features. With deep learning, both of these manual processes are replaced with trainable machine-learning systems. "The entire system from end to end is now multiple layers that are all trainable," LeCun says. (LeCun attributes the development of deep learning to a team led by Geoff Hinton, a professor at the University of Toronto who now works part-time for Google; LeCun was, in fact, part of Hinton's deep learning development team. Hinton did not respond to interview requests.)

$1Bn
The amount IBM plans to invest in artificial intelligence over the next few years.

Even so, "deep learning can only take us so far," counters Gary Marcus, a professor at New York University. "Despite its name it's rather superficial—it can pick up statistical tendencies and is good for categorization problems, but it's not good at natural language understanding. There needs to be other advances so that machines can really understand what we are talking about." He hopes the field will revisit ideas that were abandoned in the 1960s since, with modern computing power, they might now produce results, such as a machine that would be as good as a four-year-old child at learning language.

In the final analysis, "About half of the progress in the performance of AI has been from improved computing power, and half has been from improvements by programmers. Sometimes, progress is from brute force applied to get a one percent improvement. But the ingenuity of people like Hinton should not be downplayed," says MIRI's Muehlhauser.
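LeCun's point, that both the feature extractor and the classifier become trainable layers, can be illustrated with a minimal two-layer network in plain numpy. This is a toy sketch of end-to-end training on the XOR problem, not any system mentioned in the article; the hidden layer learns its own features instead of relying on hand-crafted ones:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: not linearly separable, so a hand-crafted single-layer
# classifier on the raw inputs cannot solve it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # feature layer (trainable)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # classifier layer (trainable)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(5000):
    h = np.tanh(X @ W1 + b1)       # learned features, not hand-crafted ones
    p = sigmoid(h @ W2 + b2)       # learned classifier on those features
    # Backpropagate the error through BOTH layers: the whole
    # system "from end to end" is trained at once.
    dz2 = p - y
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = (dz2 @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ dh; db1 = dh.sum(0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.1 * grad

print(np.round(p.ravel(), 2))  # predictions approach [0, 1, 1, 0]
```

Real deep learning systems stack many more layers and train on vastly more data, but the principle is the same: gradient descent adjusts every layer, replacing both manual steps LeCun describes.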

The AI Rush
If the spectacle of large corporations investing major sums in a technology is evidence that the technology has gone mainstream, future historians may say that AI reached that point in the winter of 2013-2014. In January, Rob High, vice president and chief technology officer of the Watson Group, announced IBM's plans to invest $1 billion (about Rs 5,800 crore) in AI over the next few years.





endlines | INNOVATION

Nanotech Kills Brain Cancer
— By Sharon Gaudin

Scientists at Johns Hopkins University are using nanoparticles as Trojan horses to deliver "death genes" that kill brain cancer cells surgeons can't reach. The nanoparticles are biodegradable and deliver genes that induce death in cancer cells but don't affect healthy cells. The so-called "death genes" kill a brain tumor without damaging healthy brain tissue, damage that is a common side effect of other cancer treatments, such as chemotherapy and radiation.

"In our experiments, our nanoparticles successfully delivered a test gene to brain cancer cells in mice, where it was then turned on," said Jordan Green, assistant professor of biomedical engineering and neurosurgery at the university's School of Medicine. "We now have evidence that these tiny Trojan horses will also be able to carry genes that selectively induce death in cancer cells, while leaving healthy cells healthy."

The treatment is focused on glioblastomas, the most lethal and aggressive form of brain cancer, and has been tested on mice but not on humans. Alfredo Quiñones-Hinojosa, professor of neurosurgery and a member of the research team, said the method could be used to battle different forms of cancer.


