

MAPPING COMPLEXITY
Editor: Jennifer Donovan
Design & Illustration: Michael Lewy
Writers: Grace Chua
Katie DePasquale
Greta Friar
Photography credits:
Photos of Jennifer Tang, Marija Ilic, LIDS Student Conference, LIDS Holiday Brunch, and LIDS
Thanksgiving Social by Francisco Jaimes. Photos of Omer Tanovic and the LIDS Commencement Reception by Jennifer Donovan. Photo of John Tsitsiklis by Donna Coveney.
Massachusetts Institute of Technology Laboratory for Information and Decision Systems
77 Massachusetts Avenue, Room 32-D608
Cambridge, Massachusetts 02139
http://lids.mit.edu/

A Message from the Director
It is my pleasure to introduce the latest issue of LIDS|All.
As I look back at the past year, I am excited to see the lab in the midst of milestones: this issue of LIDS|All marks its 15th year of publication; the 2020 LIDS Student Conference will be our 25th; and now, fall 2019, we are celebrating the lab’s 80th anniversary with a major event that explores the past, present, and future of Information and Decision Sciences (you can learn more at: lids80.lids.mit.edu). Finally, together with IDSS, LIDS is joining the newly-founded MIT Stephen A. Schwarzman College of Computing — a major initiative at the Institute that aims to accelerate pioneering research and innovation in computing — a development that keeps LIDS at the center of MIT’s world-leading investment in the future of computing and AI.
I am particularly proud of these milestones because they exemplify much of what makes LIDS special: our deep intellectual roots, the breadth of our reach, and the many ways LIDS research is part of burgeoning new fields, playing a role in everything from autonomous vehicles to future power grids, and more.
Most important, however, these events are a testament to the rich and supportive community that is a hallmark of life at LIDS. It is this community that continues to attract top students and researchers from around the world, and one that it has been our privilege to feature in these 15 years of LIDS|All. In the very first issue of the magazine, then-student Erik Sudderth (PhD ‘06) said of the lab: “There’s such an interesting group of people who are really excited about what they’re doing. If you ask them about some problem you’re having, they’ll probably have a bunch of clever ideas for you. And that’s really great.” I am happy to say the same holds true today.
You will see evidence of this cohesiveness and enthusiasm throughout the magazine, which this year features: graduate student Omer Tanovic, whose award-winning teaching is matched by his impressive work on power efficiency in wireless operations, as well as graduate student Jennifer Tang’s work on online machine learning. You will also read about the lab’s Senior Financial Assistant Carissa Prue; alum Jay Kuo’s Media Communications Lab at the University of Southern California (and his mentoring success); postdoc Vasileios Tzoumas’s vision of resilient autonomous systems intelligence; and Senior Research Scientist Marija Ilic’s forward-looking research on complex electric energy systems.
I hope you enjoy reading these highlights of the year, and learning more about the people and research that make LIDS the exceptional place it is today.
Sincerely,

John N. Tsitsiklis
ABOUT LIDS
The Laboratory for Information and Decision Systems (LIDS) at MIT, established in 1940 as the Servomechanisms Laboratory, currently focuses on four main research areas: communication and networks, control and system theory, optimization, and statistical signal processing. These areas range from basic theoretical studies to a wide array of applications in the communication, computer, control, electronics, and aerospace industries.
LIDS is truly an interdisciplinary lab, home to about 100 graduate students and post-doctoral associates from EECS, Aero-Astro, and the School of Management. The intellectual culture at LIDS encourages students, postdocs, and faculty to both develop the conceptual structures of the above system areas and apply these structures to important engineering problems.

ALGORITHMS THAT PROTECT
By Grace Chua
Picture this: It’s a hot, dry summer’s night. A lit cigarette butt; a spark. Suddenly, a rapidly spreading fire is making its way through the neighborhood. The smoke and heat are too dense for firefighters to navigate, so they deploy a team of drones to map the fire and find any people who may be trapped. The drones calculate an exploration plan, and fly into one of the buildings to begin their task. But as they scurry through the building, unexpected challenges occur: Some of their sensors are obscured by smoke. Others run their batteries down and return to home base. And others are forever lost in the fire, destroyed by falling objects.
“Ultimately, my vision is about resilient machines that can protect themselves and the people around them.”
An immediate question arises: Despite many of the drones being lost, can those that remain still scan the blaze? As the now-diminished team works through the area, another challenge is also apparent: the fire has created an environment they’ve never encountered before. How do the drones recognize where they are? That they are not visiting the same place again and again? And, importantly, how do they ignore misinformation — one apartment door
looks just like any other; corridors across floors seem the same — as they stitch together their picture of the blaze?
This is where the work of LIDS postdoctoral researcher Vasileios Tzoumas comes in. Together with his colleagues (LIDS professors Luca Carlone and Ali Jadbabaie, and University of Pennsylvania professor George J. Pappas, among others), Vasileios has developed seminal algorithms for resiliency against Denial of Service (DoS) failures (as in drones destroyed by falling objects) and for robustness against outliers (as in misinformation that results in inaccurate mapping).
In a broad sense, Vasileios is working toward a vision of resilient robots and other cyber-physical systems (CPS). CPS are physical systems, such as drones and self-driving cars, that are deeply intertwined with the software that controls them. Communicating between software, sensors, and actuators (components that physically move parts of the machine, e.g., opening a valve), these systems sense, process, and interact with the physical world — and they are doing so in increasingly sophisticated ways.
Vasileios, for his part, focuses on the resource-constrained tasks of search and rescue, navigation, and surveillance. Though his results are primarily relevant to the field of control and robotics, they have found applications in statistical learning and operations research as
well. In his research, Vasileios builds on fundamental methods in control theory and discrete (combinatorial) optimization. “When we design heterogeneous teams of robots, different combinations of failures have different effects; some failures can be more devastating than others,” he says. “To be resilient, we need to identify the worst combinations among all possible. And doing this in real-time, it’s hard.”
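To make concrete why finding the worst failures is hard, here is a minimal, hypothetical sketch (in Python, with made-up coverage values; nothing here is from Vasileios’s papers): the only obvious way to find the most damaging set of failures is to check every combination, and the number of combinations explodes as teams grow.

```python
from itertools import combinations

# Hypothetical coverage values (in arbitrary units) for a four-drone team.
coverage = {"A": 5.0, "B": 4.0, "C": 3.0, "D": 2.0}

def team_utility(team):
    # Toy model: coverage adds up, minus a crude penalty for overlapping views.
    return sum(coverage[d] for d in team) - 0.5 * max(0, len(team) - 1)

def worst_case_failures(team, num_failures):
    """Brute force: try every combination of drones that could fail and return
    the combination that leaves the surviving team with the least utility."""
    worst = min(combinations(team, num_failures),
                key=lambda failed: team_utility([d for d in team if d not in failed]))
    return worst, team_utility([d for d in team if d not in worst])

print(worst_case_failures(list(coverage), 2))  # -> (('A', 'B'), 4.5)
```

With four drones and two possible failures that is only six cases to check, but the count grows as "n choose k"; doing this worst-case reasoning in real time, for realistic team sizes, is exactly what makes the problem hard and what resilient algorithms are designed to sidestep.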
Vasileios is an animated speaker, his passion for his work coming through as he explains the importance of his research: Today, as robots and other CPS are put to work in a variety of failure-prone situations — such as search and rescue in disaster zones or self-driving vehicles in crowded cities — resilient autonomy will be increasingly necessary.
In addition to resiliency against failures and robustness against outliers, Vasileios says that resilience against attacks (coordinated misinformation) is the third domain relevant to his research. All three need to be integrated into a “resilient autonomous systems intelligence,” he says. Though the need for this kind of systems intelligence will certainly increase in the future, he argues that recent cyber-attacks, such as Wi-Fi attacks against self-driving cars, or the Stuxnet computer worm against sensors and actuators in nuclear facilities, have demonstrated the need for resilient autonomy right now. Overall, resilient autonomy has been recognized as an important issue at a national level: the National Institute of Standards and Technology lays out resiliency frameworks as part of public policy, and the National Academy of Engineering named resiliency (as security) against attacks and DoS failures one of the 14 Grand Challenges for Engineering in the 21st Century.
In recent papers, Vasileios and his colleagues addressed several of these resilient autonomy challenges. For example, in a paper on resilient exploration against DoS failures, the algorithm they developed plans trajectories for each drone to maximize the explored area in a way that also withstands multiple drone failures. Their paper shows the algorithm is more efficient than a brute-force method, and is as fast as established methods that ignore the possibility of failures. The algorithm was tested in small-scale drone deployments and in larger-scale computer simulations. In another paper, Vasileios and his colleagues provide a general-purpose algorithm suitable for outlier-robust mapping, among other applications. The algorithm was tested on several benchmark datasets to evaluate its use in real-world scenarios. “Ensuring real-time performance is of utmost importance, as one plans to move from theory to practice,” Vasileios says. Looking to the future, Vasileios plans to explore resilient autonomous navigation against cyber-attacks.
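To give a feel for the flavor of such methods (a generic sketch of a resilient greedy selection scheme, not the published algorithm), one common idea is to first reserve the few individually most valuable assignments as a buffer against worst-case losses, and then fill out the rest of the plan greedily:

```python
# Generic sketch of resilient greedy selection (illustrative only, not the
# algorithm from the papers): choose `team_size` assignments so that the plan
# still has value even if up to `alpha` of them fail.
def resilient_greedy(candidates, gain, team_size, alpha):
    """candidates: possible assignments (e.g., drone-trajectory pairs)
    gain(a, chosen): marginal value of adding assignment `a` to set `chosen`"""
    # Step 1: reserve the `alpha` assignments with the highest stand-alone value,
    # so losing the worst `alpha` elements cannot wipe out the whole plan.
    reserve = sorted(candidates, key=lambda a: gain(a, set()), reverse=True)[:alpha]
    chosen = set(reserve)
    # Step 2: ordinary greedy selection for the remaining slots.
    remaining = [a for a in candidates if a not in chosen]
    while len(chosen) < team_size and remaining:
        best = max(remaining, key=lambda a: gain(a, chosen))
        chosen.add(best)
        remaining.remove(best)
    return chosen

# Toy usage: assignments are rooms to scan; each new room scanned is worth 1.
rooms = ["kitchen", "hallway", "stairwell", "bedroom", "roof"]
print(resilient_greedy(rooms, lambda a, chosen: 0.0 if a in chosen else 1.0,
                       team_size=3, alpha=1))
```

The appeal of schemes in this spirit is that they run about as fast as plain greedy planning, which matches the article’s point that the resilient algorithm is as fast as methods that ignore failures.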
“Resiliency comes down to being able to sustain losses in the long run, and yet cross the finish line, making in the middle as many steps of recovery as necessary,” he says. “Similar to
a marathon run.” Fittingly, Vasileios is a marathon runner, hailing from the marathon’s birthplace, Greece, where he began his research career at the National Technical University of Athens, before heading to the University of Pennsylvania for a PhD in electrical and systems engineering. He can be found trotting along the Charles River or pounding the pavements of Cambridge; currently, he is training for an ultra-marathon in Arizona next year.
In his telling, his curiosity about control and resilient autonomy stems from his interest in human behavior. We make decisions every day, based on information we collect and plans we develop, he says. However, we all have limited time and information to develop our plans and make decisions. In other words, there are fundamental limitations to our capacity to solve a problem: time availability, quality of information, and, as a result, quality of plan. All of these limitations translate to autonomous machines: they need to plan in real-time; they need to reject misinformation; and in the end, they need to ensure they are following an effective plan.
It is this human element that underpins Vasileios’s research: “Ultimately, my vision is about resilient machines that can protect themselves, and the people around them,” he says.


UNDERSTANDING ONLINE MACHINE LEARNING THROUGH ICE HOCKEY
By Katie DePasquale
Jennifer Tang is an avid ice hockey fan who loves both playing and watching the sport. A PhD candidate in MIT’s Electrical Engineering and Computer Science Department and a member of the MIT Women’s Club Ice Hockey team, she was thrilled to have the opportunity to intern with the Boston Bruins (a team that made it to the 2018-2019 Stanley Cup Finals) in the summer of 2018. While there, she found that there is significant overlap between learning to play better hockey and her research on information theory and probability theory. This is because the kinds of questions that coaches want to ask about their players’ strengths and weaknesses (both identifying and addressing them) can be effectively examined using the same techniques that Jennifer uses to investigate online machine learning.
After a childhood spent first in Texas and then California, Jennifer attended Princeton for her undergraduate degree before coming to MIT and LIDS. She knew that she wanted to stay in school until she’d completed her doctorate rather than having a job first, so she took a cue from her undergraduate research, which was focused on information theory, and applied to programs with that in mind. When she arrived at LIDS, she started working with Prof. Yury Polyanskiy, saying, “He has very good insights into the research and good tips about what’s important.” Together they’ve been focusing on the theory behind online machine learning, a
method of machine learning that is applied to sequential or chronological data, in which the best predictor of future data is updated as each new data point is available (think: stock price prediction). “Right now learning problems are popular,” Jennifer says. “We’re looking at the question of online learning, which is a study where you, as the statistician or the person trying to make inferences about the data, have observations at every point in time. Given all the observations you see, you are trying to guess what the next observation might be.”
The guesses that Jennifer’s research considers are those that give a probability distribution over all the possible outcomes.
Probability looks at how mathematically likely it is that a certain thing will happen in a certain set of circumstances. Why you might want to give a guess in terms of probabilities of all the possible outcomes is simple to explain through Jennifer’s favorite lens of ice hockey. “In hockey,” she says, “there are some benchmarks that you want to be able to estimate pretty well. In particular something you might want to know is, when a certain player takes a shot, what is the probability that the shot will score a goal?”
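As a toy illustration of that setup (a textbook sequential estimator with made-up shot data, not Jennifer’s method), the predictor below outputs, before each shot, a probability that the shot will score, updating as outcomes arrive:

```python
def sequential_goal_probability(shots, beta=0.5):
    """Toy online predictor ("add-beta" smoothing): before seeing shot t, output
    the probability it scores, using only the t outcomes observed so far.
    shots: list of 0/1 outcomes (1 = goal), invented for illustration."""
    predictions, goals = [], 0
    for t, outcome in enumerate(shots):
        predictions.append((goals + beta) / (t + 2 * beta))
        goals += outcome
    return predictions

shots = [0, 0, 1, 0, 1, 0, 0, 1]   # hypothetical shot outcomes
print([round(p, 2) for p in sequential_goal_probability(shots)])
```

With beta = 1 this is Laplace’s classic rule of succession; with beta = 0.5 it is the Krichevsky-Trofimov estimator, which comes straight out of the universal data compression literature touched on below.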
Most of the research and papers currently available on learning problems focus on predicting specific outcomes. In the hockey problem, this would mean predicting whether a specific shot will score a goal or not. However, this prediction is not necessarily the most useful for evaluating players.
“There’s a lot of randomness that happens in hockey — it might be that the player made a shot that was very good, but maybe the goalie got lucky and saved it, or maybe the player made a poor shot but it got through. But from the perspective of someone trying to evaluate the player, a good benchmark is one that summarizes a player’s ability to score without any random factors like luck of the goalie. Using
probability accomplishes this.” The benefits of this sort of analysis are potentially huge for coaches and players, in addition to being of interest to a researcher, for what they reveal about the likelihood of a specific outcome.
Another way of understanding this is the difference between predicting whether a flipped coin will land on tails (a yes or no answer) versus giving a specific percentage of likelihood that it lands on heads or tails. “We’re trying to [recreate] a probability distribution as opposed to making a certain definite decision on something.”
Jennifer’s research focuses on approaching learning problems from an information theory perspective. “There are tools developed for information theory which can be used for prediction. Some of these tools come from data compression: the problem of figuring out how to compress a file so that instead of storing a million bytes on your computer, you can store it in something smaller, like a thousand bytes,” she says. “Information theory is not just about compression. It is also about communication. A classic information theory problem may ask the question: When trying to transmit data from a sender to a receiver, what’s the best information rate that you can send at? We can actually use techniques from these kinds of information theory problems to solve machine learning problems.” Machine learning uses algorithms and statistical models to get computer systems to complete tasks for which it’s difficult to give clear, detailed instructions. Instead, the systems have to use patterns to infer what should happen, and then carry out that specific task.
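One concrete version of that compression-prediction link, shown here as a generic worked example building on the toy predictor above (not a result from the research itself): the total log-loss of a sequence of probability forecasts, measured in bits, is exactly the ideal code length an arithmetic coder driven by those forecasts would need to compress the sequence.

```python
import math

def cumulative_log_loss_bits(outcomes, forecasts):
    """Total log-loss, in bits, of probability forecasts for 0/1 outcomes.
    Up to rounding, this equals the number of bits an arithmetic coder driven
    by the same forecasts would spend encoding the sequence."""
    bits = 0.0
    for outcome, p in zip(outcomes, forecasts):
        bits += -math.log2(p if outcome == 1 else 1.0 - p)
    return bits

shots = [0, 0, 1, 0, 1, 0, 0, 1]                  # same toy sequence as above
forecasts = sequential_goal_probability(shots)    # the predictor sketched earlier
print(round(cumulative_log_loss_bits(shots, forecasts), 2))   # about 9.5 bits
```

A better predictor spends fewer bits, which is why tools built for compression can be turned into tools for prediction, and vice versa.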
Because Jennifer is still in the middle of her research, there are many possibilities for where her work will go in the future. She relies on her LIDS office mates for discussion about various problems as they arise. “[They] can help clarify if there’s a part [of the work] that they know well that I don’t know well,” she says. She hopes that her outcomes will one day prove helpful in understanding logistic regression, which is often used in machine learning because it models how different variables can change the probability of outcomes. To return to the hockey example, she says, “If you observe a lot of data about the characteristics of shots hockey players take, such as where on the ice [the shot] was taken from, the angle from the net, and the distance from the net — you could perhaps answer the question of how much better a closer shot is compared to a farther one, or how much better a shot from the center of the ice is than a shot from a wide angle. There’s some underlying model that governs the probability a shot will go into the net depending on different features of the shot. We don’t know what that model is, but we would like to be able to predict that as well as possible.” The applications for this kind of analysis have significant potential, for hockey and far beyond.
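For readers who want to see what such a model looks like, here is a minimal logistic-regression sketch with entirely invented features and weights (not a model fitted to real shot data):

```python
import math

def goal_probability(distance_m, angle_deg, weights=(-0.5, -0.08, -0.03)):
    """Hypothetical logistic-regression model: probability a shot scores, given
    its distance from the net (meters) and its angle from the net's center line
    (degrees). The weights are invented for illustration, not fitted to data."""
    bias, w_distance, w_angle = weights
    score = bias + w_distance * distance_m + w_angle * angle_deg
    return 1.0 / (1.0 + math.exp(-score))   # logistic (sigmoid) link

print(round(goal_probability(distance_m=3, angle_deg=10), 2))    # close, head-on: ~0.26
print(round(goal_probability(distance_m=15, angle_deg=45), 2))   # far, wide angle: ~0.05
```

Learning such a model from data means estimating weights like these from many observed shots; understanding how well that can be done online is the kind of question the theory addresses.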

COOL WIRELESS
By Greta Friar
Omer Tanovic, a PhD candidate in the Department of Electrical Engineering and Computer Science, joined LIDS because he loves studying theory and turning research questions into solvable math problems. But Omer says that his engineering background — before coming to MIT he received undergraduate and master’s degrees in electrical engineering and computer science at the University of Sarajevo — has taught him never to lose sight of the intended applications of his work, or the practical parameters for implementation.
“I love thinking about things on the abstract math level, but it’s also important to me that the work we are doing will help to solve real world problems,” Omer says. “Instead of building circuits, I am creating algorithms that will help make better circuits.”
One real world problem that captured Omer’s attention during his PhD is power efficiency in wireless operations. The success of wireless communications has led to massive infrastructure expansion in the U.S. and around the world. As these networks and the volume of information they handle grow, they consume an increasingly hefty amount of power, some of which goes to powering the system as it’s supposed to, but much of which is lost as heat due to energy inefficiency. This is a problem both for companies such as mobile network operators, which have to pay large utility bills to cover their operational costs,
and for society at large, as the sector’s greenhouse gas emissions rise.
These concerns are what motivate Omer in his research. Most of the projects that he has worked on at MIT seek to design signal processing systems, optimized to different measures, that will increase power efficiency while ensuring that the output signal (what you hear when talking to someone on the phone, for instance) is true to the original input (what was said by the person on the other end of the call).
His latest project addresses the power efficiency problem by decreasing the peak-to-average power ratio (PAPR) of wireless communication signals. In the broadest sense, PAPR is an indirect indicator of how much power is required to send and receive a clear signal across a network. The lower this ratio is, the more energy efficient the transmission. Specifically, much of the power consumed in cellular networks is dedicated to power amplifiers, which collect low-power electronic input and convert it to a higher-power output. This ensures that the signal is robust enough to maintain adequate signal-to-noise ratio over the communication link. Power amplifiers are at their most efficient when operating near their saturation level, at maximum output power. However, because cellular network technology has evolved in a way that accommodates a huge volume and variety of information across the network — resulting in far less uniform signals than in the past — modern communication standards require signals with big peak-to-average power ratios. This means that a radio frequency transmitter must be designed such that the underlying power amplifier can handle peaks much higher than the average power being transmitted, and therefore, most of the time, the power amplifier is working inefficiently — far from its saturation level.
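For the technically curious, PAPR has a simple definition: the ratio of a signal’s peak power to its average power, usually quoted in decibels. The snippet below (a generic illustration with a synthetic multi-tone signal, unrelated to Omer’s algorithm) shows why modern multi-carrier signals have high PAPR: many tones occasionally line up in phase and produce a brief, large peak.

```python
import numpy as np

def papr_db(signal):
    """Peak-to-average power ratio of a baseband signal, in decibels."""
    power = np.abs(signal) ** 2
    return 10 * np.log10(power.max() / power.mean())

# Synthetic multi-carrier-style signal: 64 tones with random phases.
rng = np.random.default_rng(seed=0)
t = np.arange(2048) / 2048.0
signal = sum(np.exp(2j * np.pi * (k * t + rng.uniform())) for k in range(1, 65))
print(round(float(papr_db(signal)), 1))   # typically on the order of 8-11 dB
```

An amplifier sized for those rare peaks spends most of its time operating far below them, which is exactly the inefficiency described above.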
“I LOVE THINKING ABOUT THINGS ON THE ABSTRACT MATH LEVEL, BUT IT’S ALSO IMPORTANT TO ME THAT THE WORK WE ARE DOING WILL HELP TO SOLVE REAL WORLD PROBLEMS.”
“Every cell tower has to have some kind of PAPR reduction algorithm in place in order to operate. But the algorithms they use are developed with little or no guarantees on improving system performance,” Omer says. “A common perception is that optimal algorithms, which would certainly improve system performance, are either too expensive to implement — in terms of power or computational capacity — or cannot be implemented at all.”
Omer, who is supervised by LIDS professor Alexandre Megretski, designed an algorithm that can decrease the PAPR of a modern communication signal, which would allow the
power amplifier to operate closer to its maximum efficiency, thus reducing the amount of energy lost in the process. To create this system he first considered it as an optimization problem, the conditions of which meant that any solution would not be implementable, as it would require infinite latency. However, Omer showed that the underlying optimal system, even though of infinite latency, has a desirable fading memory property, and so he could create an approximation with finite latency — an acceptable lag time. From this, he developed a way to best approximate the optimal system. The approximation is implementable and allows tradeoffs between precision and latency, so that real-time realizations of the algorithm can improve power efficiency without adding too much transmission delay or too much distortion to the signal.
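The fading-memory idea behind that approximation can be illustrated with a standard signal-processing example (generic, and not Omer’s system): an ideal zero-phase low-pass filter needs to look infinitely far into the future, but because its response fades, keeping only a finite look-ahead, i.e. a finite latency, retains almost all of it.

```python
import numpy as np

# The ideal zero-phase low-pass filter has an impulse response (a sinc) that
# extends infinitely into the future, so implementing it exactly would require
# infinite latency. Because that response fades, a finite look-ahead window
# captures nearly all of it -- a miniature version of the latency/precision
# tradeoff described above. Cutoff and window sizes below are arbitrary.
cutoff = 0.05                                  # normalized cutoff frequency
n = np.arange(-100_000, 100_001)               # stand-in for the "infinite" response
h = 2 * cutoff * np.sinc(2 * cutoff * n)
total_energy = np.sum(h ** 2)

for lookahead in (5, 20, 80, 320):
    kept = np.sum(h[np.abs(n) <= lookahead] ** 2) / total_energy
    print(f"look-ahead {lookahead:3d} samples -> {100 * kept:5.1f}% of the ideal response")
```

In Omer’s setting the tradeoff is engineered far more carefully, but the principle is the same: a little latency buys a close approximation of an ideal that is otherwise unimplementable.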
Omer’s algorithm, along with improving power efficiency, is also computationally efficient. “This is important in order to ensure that the algorithm is not just theoretically implementable but also practically implementable,” Omer says, once again stressing that abstract mathematical solutions are only valuable if they conform to real-world parameters. Microchip real estate in communications is a limited commodity, so the algorithm cannot take up much space, and its mathematical operations have to be executed quickly, as latency is a critical factor in wireless communications. Omer believes that the algorithm could be adapted to solve other engineering problems with similar frameworks, including envelope tracking and model predictive control.
While he has been working on this project, Omer has made a home for himself at MIT. Two of his three sons were born here — in fact, the youngest was born on campus, in the stairwell of Omer and his wife’s graduate housing building. “The neighbors slept right through it,” Omer says with a laugh.
Omer quickly became an active member of the LIDS community when he arrived at MIT as well, a group for which he is grateful.
“At MIT, and especially at LIDS, you can learn something new from everyone you speak to. I’ve been in many places, and this is the only place where I’ve experienced a community like that,” Omer says.
As Omer’s time at LIDS draws to an end, he is still debating what to do next. On the one hand, his love of solving real world problems is drawing him towards industry.
On the other hand, Omer is not sure he could ever leave academia for long; he loves research and is also truly passionate about teaching. Omer, who grew up in Bosnia-Herzegovina, began teaching his freshman year of high school and has been teaching in one form or another ever since.
At MIT, Omer has taught both undergraduate and graduate level courses, winning prestigious awards for his teaching, including the MIT School of Engineering Graduate Student Extraordinary Teaching and Mentoring Award in 2018.
The magnitude of Omer’s love for teaching is clear when he speaks about working with students: “That moment when you explain something to a student and you see them really understand the concept is priceless. No matter how much energy you have to spend to make that happen, it’s worth it,” Omer says.
In communications, power efficiency is key, but when it comes to research and teaching, there’s no limit to Omer’s energy.

SHEDDING LIGHT ON COMPLEX POWER SYSTEMS
By Grace Chua
Marija Ilic, a Senior Research Scientist at LIDS, affiliate of the MIT Institute for Data, Systems, and Society, senior staff in MIT Lincoln Laboratory’s Energy Systems Group, and Carnegie Mellon University Professor Emerita, is a researcher on a mission: making electric energy systems future-ready.
Since the earliest days of streetcars and public utilities, electric power systems have had a fairly standard structure — for a given area, a few large generation plants produce and distribute electricity to customers. It is a one-directional structure, with the energy plants being the only source of power for many end-users.
Today, however, electricity can be generated from many and varied sources — and move through the system in multiple directions. An electric power system may include stands of huge turbines capturing wild ocean winds, for instance. There might be solar farms of a hundred megawatts or more, or houses with solar panels on their roofs that some days make more electricity than occupants need, some days much less. And there are electric cars, their batteries hoarding stored energy overnight. Users may draw electricity from one source or another, or feed it back into the system, all at the same time. Add to that the trend toward open electricity markets, where end-users like households can pick and choose the electricity services they buy depending on their needs. How should systems operators integrate all these while keeping the grid stable and ensuring power gets to where it is needed?
To explore this question, Marija has developed a new way to model complex power systems.
Electric power systems, even traditional ones, are complex and heterogeneous to begin with. They cover wide geographical areas and have legal and political barriers to contend with, such as state borders and energy policies. In addition, all electric power systems have inherent physical limitations. For instance, power does not flow in a set path in an electric grid, but rather along all possible paths connecting supply to demand. To maintain grid stability and quality of service, then, the system must control for the impact of interconnections: a change in supply and demand at one point in a system changes supply and demand for the other points in the system. This means there is much more complexity to manage as new sources of energy (more interconnections) with sometimes unpredictable supply (such as wind or solar power) come into play. Ultimately, however, to maintain stability and quality of service, and to balance supply and demand within the system, it comes down to a relatively simple concept: the power consumed and the rate at which it is consumed (plus whatever is lost along the way) must always equal the power produced and the rate at which it is produced.
Using this simpler concept to manage the complexities and limitations of electric power systems, Marija is taking a non-traditional approach: She models the systems using information about energy, power, and ramp rate (the rate at which power can increase over time) for
each part of the system — distributing decision-making calculations into smaller operational chunks. Doing this streamlines the model but retains information about the system’s physical and temporal structure. “That’s the minimal information you need to exchange. It’s simple and technology-agnostic, but we don’t teach systems that way,” she says.
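As a loose illustration of how little information that is, here is a hypothetical exchange record for each resource, together with the basic balance and ramp checks an operator might run (all names and numbers are invented for this sketch; this is not the DyMonDS specification):

```python
# Hypothetical per-resource summaries in the spirit of the minimal exchange
# described above: offered power, how fast it can ramp, and the energy behind it.
resources = {
    "gas_plant":    {"power_mw": 40.0, "ramp_mw_per_min": 5.0, "energy_mwh": 1000.0},
    "wind_farm":    {"power_mw": 22.0, "ramp_mw_per_min": 0.0, "energy_mwh": float("inf")},
    "rooftop_pv":   {"power_mw": 6.0,  "ramp_mw_per_min": 0.0, "energy_mwh": float("inf")},
    "ev_batteries": {"power_mw": 3.5,  "ramp_mw_per_min": 2.0, "energy_mwh": 12.0},
}

demand_mw, losses_mw = 70.0, 1.5
expected_rise_mw_per_min = 4.0   # say demand is climbing as evening approaches

supply_mw = sum(r["power_mw"] for r in resources.values())
ramp_mw_per_min = sum(r["ramp_mw_per_min"] for r in resources.values())

print(f"supply {supply_mw} MW vs demand + losses {demand_mw + losses_mw} MW")
print("balanced" if abs(supply_mw - demand_mw - losses_mw) < 1e-9 else "imbalance!")
print("enough ramp" if ramp_mw_per_min >= expected_rise_mw_per_min else "short on ramp")
```

The point of the sketch is only that a handful of technology-agnostic numbers per resource is enough for the operator’s basic bookkeeping; the real protocols Marija advocates would, of course, be far richer.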
She believes regulatory organizations such as the Federal Energy Regulatory Commission (FERC) and North American Energy Reliability Corporation (NERC) should have standard protocols for such information exchanges, just as internet protocols govern how data is exchanged on the internet. “If you were to [use a standard set of] specifications like: what is your capacity, how much does it vary over time, how much energy do you need and within what power range — the system operator could integrate different sources in a much simpler way than we are doing now.”
Another important aspect of Marija’s work is that her models lend themselves to controlling the system with a layer of sensor and communications technologies. This uses a framework she developed called Dynamic Monitoring and Decision Systems, or DyMonDS. The data-enabled decision-making concept has been tested using real data from Portugal’s Azores Islands, and since applied to real-world challenges. After so many years, it is fitting that her new modeling approach supports DyMonDS design, including systematic use of many theoretical concepts used by the LIDS community in their research.
One such challenge involved work on Puerto Rico’s power grid. Marija was the technical lead on a Lincoln Laboratory project on designing future architectures and software to make Puerto Rico’s electric power grid more resilient without adding much more production capacity or cost. Typically, a power grid’s generation capacity is scheduled in a simple, brute-force way, based on weather forecasts and the hottest and coldest days of the year, that doesn’t respond sensitively to real-time needs. Making such a system more resilient would mean spending a lot more on generation and transmission and distribution capacity, whereas a more dynamic system that integrates distributed microgrids could tame the cost. “What we are trying to do is to have systematic frameworks for embedding intelligence into small microgrids serving communities, and having them interact with large-scale power grids. People are realizing that you can make many small microgrids to serve communities rather than relying only on large-scale electrical power generation.”
Although this is one of Marija’s most recent projects, her work on DyMonDS can be traced back four decades, to when she was a student at the University of Belgrade in the former Yugoslavia, which sent her to the United States to learn how to use computers to prevent blackouts.
She ended up at Washington University in St. Louis, studying with applied mathematician
John Zaborszky, a legend in the field who was originally chief engineer of Budapest’s municipal power system before moving to the US. (“The legend goes that in the morning he would teach courses, and in the afternoon he would go and operate Hungarian power system protection by hand.”) Under Zaborszky, a systems and control expert, Marija learned to think in abstract terms as well as in terms of physical power systems and technologies. She became fascinated by the question of how to model, simulate, monitor and control power systems – and that’s where she’s been ever since. (Although, she admits as she uncoils to her full height from behind her desk, her first love was actually playing basketball.)
Marija first arrived at MIT in 1987 to work with the late Professor Fred Schweppe on connecting electricity technologies with electricity markets. She stayed on as a senior research scientist until 2002, when she moved to Carnegie Mellon University (CMU) to lead the multidisciplinary Electric Energy Systems Group there. In 2018, after her consulting work for Lincoln Lab ramped up, she retired from CMU to move back to the familiar environs of Cambridge. CMU’s loss has been MIT’s gain: in fall 2019, Marija will teach a course in modeling, simulation and control of electric energy systems, applying her work on streamlined models that use pared-down information.
Addressing the evolving needs of electric power systems has not been a ‘hot’ topic, historically. Traditional power systems are often seen by the academic community as legacy technology with no fundamentally new developments. And yet when new software and systems are developed to help integrate distributed energy generation and storage, commercial systems operators regard them as untested and disruptive. “I’ve always been a bit on the sidelines from mainstream power and electrical engineering because I’m interested in some of these things,” she remarks.
However, Marija’s work is becoming increasingly urgent. Much of today’s power system is physically very old and will need to be retired and replaced over the next decade. This presents an opportunity for innovation: the next generation of electric energy systems could be built to integrate renewable and distributed energy resources at scale — addressing the pressing challenge of climate change and making way for further progress.
“That’s why I’m still working even though I should be retired.” She smiles. “It supports the evolution of the system to something better.”

MULTIMEDIA AND MENTORING
By Grace Chua
This year, LIDS alumnus Chung-Chieh Jay Kuo reached a remarkable milestone: as of May 2019, his lab, the Media Communications Lab at the University of Southern California’s Viterbi School of Engineering, has graduated an astonishing 150 PhD students — an average of five a year in its 30-year history.
It’s not just the numbers that prove Jay’s methods are working. Lest anyone think quality is sacrificed for quantity, his students are snapped up by companies such as Facebook, Google, Apple, Samsung, Mediatek and Qualcomm. Others start their own companies. And just over a quarter are in academia around the globe.
It is an impressive group. Ever a master connector, Jay meets up with lab alumni whenever he travels, and alumni tend to stay in touch with each other. (Bringing this instinct to LIDS, he recently helped set up the LIDS alumni webpage, http://lids-alum.org/. The site, which launched less than a year ago, was conceived by Jay and LIDS director John Tsitsiklis — Jay’s own advisor — and serves as a virtual home for all LIDS alumni, who are invited to share their achievements and stories.)
Jay started the Media Communications Lab in 1989, two years after graduating from MIT with a PhD in electrical engineering. Initially, the lab carried out fundamental research on image compression — most notably, work that contributed to the core technologies behind international standards such as the JPEG and MPEG in widespread use today.
As multimedia needs have evolved, so too has the lab’s research. Its current suite of projects, for instance, includes collaborating with on-demand giant Netflix on research into high-quality wireless video streaming. This is in keeping with the trajectory of the last five years or so, as the lab has begun to explore computer vision and machine learning based on neural networks — systems that ‘learn’ to perform tasks based on analyzing sets of training examples. It’s a field that’s blossomed in the last few years. In 2015, for instance, AlphaGo, developed by artificial intelligence company DeepMind, became the first computer program to defeat a human professional player of the highly complex Chinese board game Go: evidence of just how powerful a neural network using deep learning can be.
Jay’s group is looking to go further: “We’re interested in understanding the fundamental theory of neural networks, including explanations of their capabilities and limitations,” he says. Most companies using neural networks, he believes, focus on their applications and treat them as a black-box tool — their interest is in whether the neural networks work rather than why. “I feel that this presents an excellent opportunity for academic research. We need to understand the reason why they work, and when and why they do not work properly.”
The power of neural networks, Jay says, is their potential to harness and exploit big data. But neural networks can fail. They can be attacked by introducing tiny perturbations, for instance,
and that one small change can make the program’s output completely different. Are there other, more robust ways of exploring and exploiting big data sets? Jay thinks it’s possible, and he’s developing a mathematically transparent approach to analyzing big data. “Rather than being a black box, it would be a white box,” he quips.
As the lab’s reputation and career placements have grown, so has the number of interested students. What’s the secret to advising and mentoring so many of them?
One factor is that Jay takes students from across the spectrum of engineering disciplines. “I can always find a common language with them, and that is mathematics. My advice to students is really be strong in mathematics — your interests can be flexible and very wide, but the fundamental training and fundamental discipline is centered around the math.”
JAY TAKES STUDENTS FROM ACROSS THE SPECTRUM OF ENGINEERING DISCIPLINES:
“I CAN ALWAYS FIND A COMMON LANGUAGE WITH THEM, AND THAT IS MATHEMATICS.”
The Media Communications Lab’s research, from neural networks to image processing standards, has drawn keen interest from industry: the lab has received research grants from some 70 companies all over the world. That in turn helps support students, many of whom find a home at those companies once they graduate.
Jay has found the sweet spot for industry-backed research is things that are “a few years ahead of companies’ own engineering departments”. He reaches out to large corporations, giving talks and reconnecting with former students working there, and thinking about companies’ needs. “I always look to find the intersection of the theory with good applications. For totally pure research you can write a grant application to the NSF, but I want to do things that are more application focused.”
Over the years, Jay has fine-tuned his system for supervising and mentoring students. Each student submits a research report regularly before group meetings, and he prioritizes which ones to meet each week to keep their research moving. “I am hands-on for most PhD students in their first 2-3 years, which will save them a lot of time wandering around without any direction; after they mature as researchers, I gradually become more hands-off,” he says.
But he also takes pains to build a nurturing lab culture. At the lab’s weekly seminars, guest speakers share not only their research but the how-to of being an effective researcher; they
also share job-hunting tips, advice on academic or industry career paths, and insights on technology trends.
The lab’s alumni sing Jay’s praises. Dr. Jing Zhang, now a postdoctoral researcher at Yale, graduated from USC in 2013. “Prof. Kuo is the most organized person I’ve ever seen in my life,” she says. At the same time, she adds, he’s warm and always approachable.
When she mentors students, she looks to him as a role model. “Something I learned from him is that you have to talk to students constantly, otherwise they’ll get lost.” For her own students, she set up a group chat in messaging app Slack so they can reach out whenever they have difficulties. “I don’t want them to dwell on a problem all week — I try and help them to keep moving.”
Ultimately, the way Jay approaches mentoring and lab culture boils down to values. Some principal investigators take a more competitive approach to training top-notch researchers. Jay believes it’s not a zero-sum game, and that there’s another way. “As a professor, my key role is to train students and make them better researchers and technical leaders. That means helping them build their strengths and capabilities. I put a lot of value on character as well — are they good citizens in society? Are they nice people?”
“Papers are a process to train the student,” Jay says, “but people are my product.”

Sound Bites: Carissa Prue

What do you do at LIDS?
My role here is Senior Financial Assistant, so that’s providing financial and proposal development support to our Administrative Officer. I help out with preparing and submitting grant proposals; monitoring account activity for our LIDS faculty and researchers while keeping accounts maintained and reconciled; working along with the administrative assistants to verify
purchases, approve travel reports, and process requests for payment — different financial tasks like that. But my work often overflows into other things in headquarters. I’ll help out wherever anyone needs help, like when students need a key for a room or someone needs help with the copier, those kinds of things. So really what I do at LIDS is kind of a little bit of everything when it’s needed. I love the variety. It’s never boring!
What drew you to financial work?
I had not worked in finance before I came to LIDS, but I enjoy learning new jobs and tasks. I felt like financial work was a good fit for me because I enjoy details and I enjoy when math works. And this role calls for a lot of detail-oriented work: making sure that payments go through for the correct amount, creating budgets, reconciling accounts. Making sure everything matches exactly is kind of fun for me. If something doesn’t match down to the penny, or if something just doesn’t make sense or add up, I can’t let it go until I figure it out. It’s a bit of an adventure figuring out where the discrepancy is. Once I find it, I feel accomplished, knowing I’ve done all I can to handle my responsibilities to the best of my ability.
What brought you to MIT and LIDS?
My husband and I actually relocated to Boston from Buffalo, New York. We do volunteer work and there was more of a need here than there was in the Buffalo area so we decided to relocate. Prior to that we lived in Asia for a short time. We were English teachers there — we lived half a year in Taiwan and then three years in China.
We came to Boston in September 2016. When we got here, I kept hearing that MIT was a great place to work, and eventually I was able to temp here. It was a few months after moving to Boston, in January of 2017, that I started in LIDS as a temp. I was made a permanent member of the staff in February 2017.
When you’re not at work, what do you do?
Because I work here part-time, I spend the majority of my time doing volunteer work. But outside of that, what I do for fun is spend time with friends and family. I love to travel, whether it is back home for a visit or to a new destination to experience new food, new languages or new cultures. I also really enjoy running. I’ve run a full marathon and a few half marathons. I just ran one in Boston (not THE Boston Marathon, though…even though that would be my dream!) and I’ll be doing the Buffalo marathon in May. I’m a fitness instructor at a gym, too. I teach a strength training class called Body Pump.
How do you like Boston and being at MIT?
My husband and I really enjoy Boston; it’s a great place. We’re enjoying all of the trails and paths around the Charles River and all of the beautiful scenery in all the types of weather. MIT has been great and LIDS has been awesome — my co-workers, especially. I don’t think I could have landed a better temp position back in 2017. I’m very aware of what a good place to work LIDS is, and I feel very grateful to have such a good group of people to work with here.




LIDS COMMENCEMENT RECEPTION
Congratulations to this year’s graduates!
PhDs received by:
M. Jehangir Amjad
Austin Collins
Chengtao Li
Qingkai Liang
Fangchang Ma
Ali Makhdoumi
Anuran Makur
Zelda Mariet
Sebastian Martin
Konstantina Mellou
Shayegan Omidshafiei
Anurag Rai
Valerio Varricchio
Yuhao Wang
Jianan Zhang
Hongyi Zhang
SMs or MEngs received by:
Jason Altschuler
Renardo Baird
Noam Buckman
Conleigh Byers
Erin Evans
Alireza Fallah
Xinzhe Fu
Siyi Hu
Abigail Katcoff
Muyuan Lin
Bai Liu
Kelvin Lu
Andrew Montanez
Manon Revel
Uma Roy
Thomas Sayre-McCord
Chandler Squires
Ihssan Tinawi
Vishrant Tripathi
Tori Wuthrich
Ruihao Zhu


LIDS Awards & Honors
Awards
A paper by Principal Research Scientist Audun Botterud was selected for presentation in the session on “Best Conference Papers on Power System Operations and Electricity Markets,” at the 2019 IEEE Power and Energy Society General Meeting.
Profs. Luca Carlone and Sertac Karaman, together with their collaborators, received the Best Student Paper Award at the 2019 Symposium on VLSI Circuits.
Alireza Fallah and Ian Schneider were named to the 2019 cohort of Siebel Scholars in recognition of their academic achievements, leadership, and commitments to addressing crucial global challenges.
Prof. Jonathan How was co-author on a paper that received Outstanding Student Paper Honorable Mention at the 2019 AAAI Conference on Artificial Intelligence (AAAI-19). Prof. How was also a finalist for the 2018 International Conference on Robotics and Automation Best Multi-Robot Systems Paper Award.
Prof. Ali Jadbabaie received a 2019 Multidisciplinary University Research Initiative (MURI) Award from the Directorate of Basic Research at the Department of Defense.
Profs. Ali Jadbabaie and Asu Ozdaglar were coauthors of a paper that received a Best Student Paper Award at the IEEE 2018 International Conference on Acoustics Speech and Signal Processing.
Prof. Patrick Jaillet and collaborators were finalists in the Production and Operations Management Society’s POMS-JD.com 2019 Best Data-Driven Research Paper Competition. Prof. Jaillet, together with colleagues, also won the ICAPS 2019 Best Applications Paper Award at the 29th International Conference on Automated Planning and Scheduling.
Prof. Thomas Magnanti was one of Singapore’s National Day Award recipients, given for his long-term work developing higher education in Singapore.
Omer Tanovic won the MIT School of Engineering Graduate Student Extraordinary Teaching and Mentoring Award. Omer is supervised by Prof. Alexandre Megretski.
Prof. Alexander Rakhlin received the Joseph A. Martore Award for Excellence in Teaching from MIT’s Institute for Data, Systems, and Society (IDSS).
Prof. Devavrat Shah, along with LIDS alums Shreevatsa Rajagopalan and Jinwoo Shin, received the ACM SIGMETRICS Test of Time Paper Award 2019, which recognizes an influential SIGMETRICS paper from 10-12 years ago.
Prof. Suvrit Sra received an NSF CAREER award, as well as an NSF TRIPODS+X grant.
Prof. Caroline Uhler was named a 2019 Simons Investigator in the Mathematics and Physical Sciences by the Simons Foundation.
Prof. Alan Willsky received the 2019 IEEE Jack S. Kilby Signal Processing Medal for contributions
to stochastic modeling, multi-resolution techniques, and control-signal processing synergies.
Prof. Moe Win and his group received a 2018 R&D 100 Award for “Peregrine,” a network localization and navigation prototype.
Honors
Prof. Guy Bresler was promoted to Associate Professor without Tenure in the department of electrical engineering and computer science, effective July 1, 2019.
Prof. Luca Carlone was elevated to IEEE Senior Member. Prof. Carlone was also elevated to AIAA Senior Member.
Prof. Sertac Karaman was granted tenure by the department of aeronautics and astronautics.
Prof. Asu Ozdaglar was appointed the School of Engineering Distinguished Professor of Engineering.
Prof. Pablo Parrilo was appointed the Joseph F. and Nancy P. Keithley Professor by the department of electrical engineering and computer science.
Prof. Suvrit Sra was promoted to Associate Professor without Tenure in the department of electrical engineering and computer science, effective July 1, 2019.
Prof. John Tsitsiklis received an honorary doctorate from Harokopio University, Greece.
LIDS Seminars 2018-2019
Weekly seminars are a highlight of the LIDS experience. Each talk, which features a visiting or internal invited speaker, provides the LIDS community an unparalleled opportunity to meet with and learn from scholars at the forefront of their fields. Listed in order of appearance.
Le Xie
Texas A&M University
Department of Electrical and Computer Engineering
Sanjay Shakkottai
University of Texas, Austin
Department of Electrical and Computer Engineering
Terry Rockafellar
University of Washington
Math Department
Ayfer Ozgur
Stanford University
Electrical Engineering Department
Christos Papadimitriou
Columbia University
Department of Computer Science
Abbas El Gamal
Stanford University
Department of Electrical Engineering
Benjamin Hobbs
Johns Hopkins University
Department of Environmental Health & Engineering
Saurabh Amin
MIT
Department of Civil and Environmental Engineering
Naomi Leonard
Princeton University
Department of Mechanical and Aerospace Engineering
Kuang Xu
Stanford University
Graduate School of Business
Department of Electrical Engineering
Salman Avestimehr
University of Southern California
Department of Electrical and Computer Engineering
Julien Hendrickx
UCLouvain
Mathematical Engineering Department
Gah-Yi Ban
London Business School
Management Science and Operations
Yoram Singer
Princeton University and Google
Computer Science Department
Google Brain
Youssef Marzouk
MIT
Department of Aeronautics and Astronautics
Mihaela van der Schaar
University of Cambridge
Department of Engineering



2019 LIDS Student Conference


The annual LIDS Student Conference is a student-organized, student-run event that provides an opportunity for graduate students to present their research to peers and the community at large. The conference also features a set of distinguished plenary speakers. The 2019 Student Conference marks 24 years of this signature lab event.
Student Conference Chairs
Alireza Fallah
Julia Gaudio
Ezra Tal
Chulhee (Charlie) Yun
Committee Members
Sarah Cen
Georgia Dimaki
Xinzhe (Roger) Fu
Joseph Gaudio
Igor Kadota
Jason Liang
Bai Liu
Konstantina Mellou
Oscar Mickelin
Sarath Pattathill
Tianyi Peng
Manon Revel
David Rosen
Maryann Rui
Tuhin Sarkar
Abhin Shah
Sohil Shah
Dennis Shen
Dogyoon Song
Igor Spasojevic
Rajat Talak
Omer Tanovic
Vishrant Tripathi
Vasileios Tzoumas
Cesar Uribe
Chenyang Yuan
Student Speakers
Anish Agarwal
Raj Agrawal
Jackie Baek
Anastasiya Belyaeva
Espen Flo Bodal
Enric Boix
Matthew S. Brennan
Andres Campero
Joao Cavalcanti
Ryan Cory-Wright
Arthur Delarue
Alireza Fallah
Joseph Gaudio
Xiaoyue Gong
Karthik Gopalakrishnan
Rupamathi Jaddivada
Dongchan Lee
Emily Meigs
Konstantina Mellou
Dheeraj M. Nagaraj
Sadra Sadraddini
Milad Siami
James Siderius
Dogyoon Song
Omer Tanovic
Antonio Teran Espinoza
Vishrant Tripathi
Jingwei Yang
Chulhee (Charlie) Yun
Jingzhao Zhang
Renbo Zhao
Panelists
Prof. Abhay Parekh
University of California, Berkeley
Prof. Devavrat Shah
MIT
Prof. Caroline Uhler
MIT
Dr. Kalyan Veeramachaneni
MIT
Plenary Speakers
Prof. Maria Florina Balcan
Carnegie Mellon University
Prof. Calin Belta
Boston University
Prof. Andrea Montanari
Stanford University
Prof. Abhay Parekh
University of California, Berkeley
LIDS Community
LIDS had a fantastic 2018-2019 academic year, full of intellectual engagement and achievement, as well as the many activities that make our community a special place. It is our students and postdocs who often play a key role in organizing these activities.
Through their work on different committees, LIDS students and postdocs have organized many great social events (including lunches, picnics, Friday afternoon snacks, and sports events), a popular weekly series of informal research presentations, and terrific mentoring events, including panels and alumni talks.
Our thanks to all of the students, faculty, and staff who made these a success! We’d like to thank here, in particular, the student and postdoc organizers:
LIDS Social Committee
Daniel Bernstein
Igor Kadota
Jason Liang
Alicia (Sun) Yi
Chenyang Yuan
LIDS Mentoring Committee
Anish Agarwal
Zied Ben Chaouch
Rupamathi Jaddivada

LIDS & Stats Tea Talks Committee
Rupamathi Jaddivada
Vishrant Tripathi
Cristian-Ioan Vasile
Yuhao Wang
LIDS Postdoc Committee
Vasileios Tzoumas
César Uribe



