CAN BLACK HOLES CREATE NEW UNIVERSES?
HUYGENS PROBE LANDS ON TITAN
ORIGINS OF THE UNIVERSE
ASKING THE ASTRONAUT
SPACECRAFT MAPS MOON'S GRAVITY
UNTRIED TECHNOLOGIES
CAN BLACK HOLES CREATE
NEW UNIVERSES? A Q&A WITH LEE SMOLIN
The universe may have been born inside a black hole, and the black holes in our own cosmos might be birthing new universes of their own, if one physicist's controversial idea about time is true. Black holes are gigantic cosmic monsters, exotic objects whose gravity is so strong that not even light can escape their clutches. They come in a wide variety of forms, from small stellar-mass bodies to the supermassive beasts that reside at the hearts of galaxies. When a star burns through the last of its fuel, it may collapse. For smaller stars, up to about three times the sun's mass, the collapsed core will be a neutron star or a white dwarf. But when a larger star collapses, it continues to fall in on itself, forming a stellar black hole.
Black holes formed by the collapse of individual stars are (relatively) small, but incredibly dense. Such an object packs three times or more the mass of the sun into a city-sized range, producing an immense gravitational pull on everything around it. Black holes consume the dust and gas from the galaxy around them, growing in size. Going against the standard view of most scientists, theoretical physicist Lee Smolin has suggested that time is real, rather than the illusion that Einstein's theory of relativity makes it out to be. Smolin, who's based at Canada's Perimeter Institute for Theoretical Physics, outlines the idea in his new book "Time Reborn" (Houghton Mifflin Harcourt, April 2013). SPACE.com caught up with Smolin recently to learn more about this theory and the real nature of time. SPACE.com: What does it mean for time to be real or not real? Isn't time obviously real? Smolin: In the physicist's conception of nature, as developed from Newton to Einstein, time becomes a secondary concept. It becomes replaced by a notion of computation, so that a process carried out in time and causing things to happen becomes modeled by a logical computation. Logic and mathematics are outside of time, and therefore if that modeling is completely accurate, time is unreal. For example, Einstein is famously quoted as saying that people who understand physics know that the distinction between the past, present and future is only a stubbornly persistent illusion. SPACE.com: When you say time is real, in contradiction to these ideas by Einstein and others, what does that mean? Smolin: First of all, that the experience we have of being in the present moment, which is one of a flow of moments succeeding each other, is not an illusion as Einstein and others asserted — it's the deepest clue we have as to the nature of reality.
Reality is structured as a series of moments, so that anything that is real is real in a moment of time, and if something appears to persist in time, that's because it's continually renewed in the moments of time, which are the reality of existence. Any truth about the world is a truth about the world within time — there are no timeless truths. And most importantly, there are no laws of nature that are outside of time. Everything changes, including the laws. SPACE.com: So how does this concept help us understand the laws of nature?
Smolin: The main reason why I advocate this new view of time is that it may make the laws of nature explicable. And by that, I mean the scientific answer to the question of why the laws of nature as we observe them have been selected to be what they are.
If the laws are timeless and eternal, then there's no way to explain the choice of laws. As far as we understand it, the laws might easily have been different in many different ways. The masses of the elementary particles might have been different, the strength of the forces might have been different, or there might have been altogether different elementary particles and forces. I believe, and this is the result of an argument which is carried out in the book, that the only way within science of explaining the laws of nature is if the laws of nature are the result of dynamical evolution in time. SPACE.com: How does your theory of the dynamical evolution of the universe work? Smolin: The idea is that the universe evolved in a way which is very analogous to natural selection in a population, say, of bacteria. To do this the universe needs to reproduce itself, and I took over an older idea by John Wheeler and Bryce DeWitt, who were pioneers of quantum gravity. Their idea was that black holes become the seeds of the birth of new universes. John Wheeler had already speculated that when this happens, the laws of nature are reborn in the new baby universe; he called it reprocessing the universe. What I had to add to make it work like a model of natural selection was that the changes passed from parent to child universe are very slight, so there can be an accumulation of fitness. This hypothesis leads to the conclusion that, assuming our universe is a typical member of this population of universes as it develops after many, many generations, the universe is going to be finely tuned to produce many black holes. That leads to the next hypothesis: if you change the laws, and the numbers that specify the laws, then typically you're going to make a universe that makes fewer black holes, and that's something that leads to predictions that can be tested. That's the theory that I call cosmological natural selection. SPACE.com: How would these universes pass on their traits to daughter universes? Smolin: At the level at which I propose this theory I didn't answer that question, just as Darwin had no idea how traits were inherited, because he didn't know anything about the molecular basis of genetics, which was only discovered with DNA. So I was able to make those predictions without specifying the microscopic basis of inheritance in cosmology. SPACE.com: Just how are new universes born inside black holes? Smolin: A star that collapses into a black hole very quickly squeezes down to infinite density and time stops — that's according to general relativity. And basically that moment when time stops is deferred by quantum mechanics, by quantum uncertainty, and rather than collapsing to infinite density, the star collapses to a certain extreme density, and then bounces back and begins to expand again. And that expanding star becomes the birth of a new universe. The point where time ends inside a black hole becomes joined to the point where time begins in a Big Bang in a new universe.
SPACE.com: Does this idea have testable predictions? Smolin: I did make two predictions which were eminently checkable by astrophysical and cosmological observations, and both of them could easily have been falsified by observations over the last 20 years, and both have been confirmed by observations so far. One of them concerns the masses of neutron stars and the prediction is there can't be a neutron star heavier than about twice the mass of the sun. This continues to be confirmed by the best measurements of the masses of neutron stars. The other prediction has to do with the cosmic microwave background radiation and the hypothesis of cosmological inflation. The observations of the Planck satellite are completely consistent with the version of inflation that cosmological natural selection supports. •
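Smolin's reproduce-with-slight-variation scheme is structurally an ordinary selection algorithm, which a toy simulation can make concrete. Everything specific below is invented for illustration: the single "parameter," the quadratic `fecundity` function standing in for a universe's black-hole count, and the mutation scale are assumptions of this sketch, not anything from Smolin's theory.

```python
# Toy "cosmological natural selection": universes reproduce through black
# holes, offspring inherit slightly mutated parameters, so parameters that
# yield more black holes spread through the population over generations.
import random

def fecundity(param):
    """Hypothetical black-hole count for a universe with this parameter.
    The peak at 3.0 is an arbitrary stand-in, not physics."""
    return max(0.0, 10.0 - (param - 3.0) ** 2)

def next_generation(population, rng, mutation=0.05, size=50):
    """Sample parents in proportion to black-hole count, then mutate slightly."""
    weights = [fecundity(p) for p in population]
    parents = rng.choices(population, weights=weights, k=size)
    return [p + rng.gauss(0.0, mutation) for p in parents]

rng = random.Random(0)
population = [rng.uniform(0.0, 6.0) for _ in range(50)]
for _ in range(30):
    population = next_generation(population, rng)
# Selection should drive typical parameter values toward the fecundity peak,
# mirroring Smolin's claim that a typical universe is tuned for black holes.
```

Note the parallel to his Darwin remark: the sketch needs no mechanism for *how* parameters are inherited, only that inheritance with small variation happens.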
HUYGENS PROBE LANDS ON TITAN
Europe reaches new frontier as probe touches down on moon of Saturn
After its seven-year journey through the Solar System on board the Cassini spacecraft, ESA’s Huygens probe has successfully descended through the atmosphere of Titan, Saturn’s largest moon, and safely landed on its surface.
Huygens data, relayed by Cassini, were picked up by NASA’s Deep Space Network and delivered immediately to ESA’s European Space Operations Centre in Darmstadt, Germany, where the scientific analysis is currently taking place.
The first scientific data arrived at the European Space Operations Centre (ESOC) in Darmstadt, Germany, this afternoon at 17:19 CET. Huygens is mankind’s first successful attempt to land a probe on another world in the outer Solar System. “This is a great achievement for Europe and its US partners in this ambitious international endeavour to explore the Saturnian system,” said Jean-Jacques Dordain, ESA’s Director General.
“Titan was always the target in the Saturn system where the need for ‘ground truth’ from a probe was critical. It is a fascinating world and we are now eagerly awaiting the scientific results,” says Professor David Southwood, Director of ESA’s scientific programme.
Following its release from the Cassini mothership on 25 December, Huygens reached Titan’s outer atmosphere after 20 days and a 4 million km cruise. The probe started its descent through Titan’s hazy cloud layers from an altitude of about 1270 km at 11:13 CET. During the following three minutes Huygens had to decelerate from 18 000 to 1400 km per hour. A sequence of parachutes then slowed it down to less than 300 km per hour. At a height of about 160 km the probe’s scientific instruments were exposed to Titan’s atmosphere. At about 120 km, the main parachute was replaced by a smaller one to complete the descent, with an expected touchdown at 13:34 CET. Preliminary data indicate that the probe landed safely, likely on a solid surface. The probe began transmitting data to Cassini four minutes into its descent and continued to transmit after landing for at least as long as Cassini was above Titan’s horizon. The certainty that Huygens was alive came as early as 11:25 CET, when the Green Bank radio telescope in West Virginia, USA, picked up a faint but unmistakable radio signal from the probe. Radio telescopes on Earth continued to receive this signal well past the expected lifetime of Huygens.
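The braking quoted above, 18,000 to 1,400 km/h in about three minutes, corresponds to a modest average g-load, as a quick back-of-the-envelope check shows. The uniform-deceleration assumption is mine for this estimate; the real entry profile was not uniform.

```python
# Back-of-the-envelope average deceleration during Huygens' entry phase.
G = 9.81  # standard gravity, m/s^2

def average_deceleration_g(v_start_kmh, v_end_kmh, duration_s):
    """Mean deceleration in g's, assuming (unrealistically) a uniform slowdown."""
    delta_v = (v_start_kmh - v_end_kmh) / 3.6  # km/h -> m/s
    return delta_v / duration_s / G

# 18 000 -> 1 400 km/h over ~180 s works out to roughly 2.6 g on average.
print(round(average_deceleration_g(18000, 1400, 180), 1))
```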
“The Huygens scientists are all delighted. This was worth the long wait,” says Dr Jean-Pierre Lebreton, ESA Huygens Mission Manager. Huygens is expected to provide the first direct and detailed sampling of Titan’s atmospheric chemistry and the first photographs of its hidden surface, and will supply a detailed ‘weather report’. One of the main reasons for sending Huygens to Titan is that its nitrogen atmosphere, rich in methane, and its surface may contain many chemicals of the kind that existed on the young Earth. Combined with the Cassini observations, Huygens will afford an unprecedented view of Saturn’s mysterious moon. “Descending through Titan was a once-in-a-lifetime opportunity and today’s achievement proves that our partnership with ESA was an excellent one,” says Alphonso Diaz, NASA Associate Administrator of Science. The Cassini-Huygens mission is a cooperation between NASA, the European Space Agency and ASI, the Italian space agency. The Jet Propulsion Laboratory (JPL), a division of the California Institute of Technology in Pasadena, is managing the mission for NASA’s Office of Space Science, Washington. JPL designed, developed and assembled the Cassini orbiter. “The teamwork in Europe and the USA, between scientists, industry and agencies has been extraordinary and has set the foundation for today’s enormous success,” concludes Jean-Jacques Dordain. •
ORIGINS OF THE UNIVERSE A LECTURE BY STEPHEN HAWKING
According to the Boshongo people of central Africa, in the beginning there was only darkness, water, and the great god Bumba. One day Bumba, in pain from a stomachache, vomited up the sun. The sun dried up some of the water, leaving land. Still in pain, Bumba vomited up the moon, the stars, and then some animals: the leopard, the crocodile, the turtle, and, finally, man. This creation myth, like many others, tries to answer the questions we all ask. Why are we here? Where did we come from? The answer generally given was that humans were of comparatively recent origin, because it must have been obvious, even at early times, that the human race was improving in knowledge and technology. So it can’t have been around that long, or it would have progressed even more. For example, according to Bishop Ussher, the Book of Genesis placed the creation of the world at 9 in the morning on October the 27th, 4004 BC. On the other hand, the physical surroundings, like mountains and rivers, change very little in a human lifetime. They were therefore thought to be a constant background, and either to have existed forever as an empty landscape, or to have been created at the same time as the humans. Not everyone, however, was happy with the idea that the universe had a beginning. For example, Aristotle, the most famous of the Greek philosophers, believed the universe had existed forever. Something eternal is more perfect than something created. He suggested the reason we see progress was that floods, or other natural disasters, had repeatedly set civilization back to the beginning. The motivation for believing in an eternal universe was the desire to avoid invoking divine intervention to create the universe and set it going. Conversely, those who believed the universe had a beginning used it as an argument for the existence of God, as the first cause, or prime mover, of the universe.
If one believed that the universe had a beginning, the obvious question was, what happened before the beginning? What was God doing before He made the world? Was He preparing Hell for people who asked such questions? The problem of whether or not the universe had a beginning was a great concern to the German philosopher Immanuel Kant. He felt there were logical contradictions, or antinomies, either way. If the universe had a beginning, why did it wait an infinite time before it began? He called that the thesis. On the other hand, if the universe had existed forever, why did it take an infinite time to reach the present stage? He called that the antithesis. Both the thesis and the antithesis depended on Kant’s assumption, along with almost everyone else’s, that time was absolute. That is to say, it went from the infinite past to the infinite future, independently of any universe that might or might not exist in this background. This is still the picture in the mind of many scientists today. However, in 1915 Einstein introduced his revolutionary General Theory of Relativity. In this, space and time were no longer absolute, no longer a fixed background to events. Instead, they were dynamical quantities that were shaped by the matter and energy in the universe. They were defined only within the universe, so it made no sense to talk of a time before the universe began. It would be like asking for a point south of the South Pole. It is not defined. If the universe was essentially unchanging in time, as was generally assumed before the 1920s, there would be no reason that time should not be defined arbitrarily far back. Any so-called beginning of the universe would be artificial, in the sense that one could extend the history back to earlier times. Thus it might be that the universe was created last year, but with all the memories and physical evidence to make it look much older. This raises deep philosophical questions about the meaning of existence. I shall deal with these by adopting what is called the positivist approach. In this, the idea is that we interpret the input from our senses in terms of a model we make of the world. One cannot ask whether the model represents reality, only whether it works. A model is a good model if, first, it interprets a wide range of observations in terms of a simple and elegant model, and second, if the model makes definite predictions that can be tested, and possibly falsified, by observation. In terms of the positivist approach, one can compare two models of the universe: one in which the universe was created last year, and one in which the universe existed much longer. The model in which the universe existed for longer than a year can explain things like identical twins, who have a common cause more than a year in the past. On the other hand, the model in which the universe was created last year cannot explain such events. So the first model is better. One cannot ask whether the universe really existed before a year ago, or just appeared to. In the positivist approach, they are the same.
In an unchanging universe, there would be no natural starting point. The situation changed radically, however, when Edwin Hubble began to make observations with the hundred-inch telescope on Mount Wilson in the 1920s. Hubble found that stars are not uniformly distributed throughout space, but are gathered together in vast collections called galaxies. By measuring the light from galaxies, Hubble could determine their velocities. He was expecting that as many galaxies would be moving towards us as were moving away. This is what one would have in a universe that was unchanging with time. But to his surprise, Hubble found that nearly all the galaxies were moving away from us. Moreover, the further galaxies were from us, the faster they were moving away. The universe was not unchanging with time, as everyone had thought previously. It was expanding. The distance between distant galaxies was increasing with time. The expansion of the universe was one of the most important intellectual discoveries of the 20th century, or of any century. It transformed the debate about whether the universe had a beginning. If galaxies are moving apart now, they must have been closer together in the past. If their speed had been constant, they would all have been on top of one another about 15 billion years ago. Was this the beginning of the universe? Many scientists were still unhappy with the universe having a beginning, because it seemed to imply that
physics broke down. One would have to invoke an outside agency, which for convenience one can call God, to determine how the universe began. They therefore advanced theories in which the universe was expanding at the present time, but didn’t have a beginning. One was the Steady State theory, proposed by Bondi, Gold, and Hoyle in 1948. In the Steady State theory, as galaxies moved apart, the idea was that new galaxies would form from matter that was supposed to be continually being created throughout space. The universe would have existed forever, and would have looked the same at all times. This last property had the great virtue, from a positivist point of view, of being a definite prediction that could be tested by observation. The Cambridge radio astronomy group, under Martin Ryle, did a survey of weak radio sources in the early 1960s. These were distributed fairly uniformly across the sky, indicating that most of the sources lay outside our galaxy. The weaker sources would be further away, on average. The Steady State theory predicted the shape of the graph of the number of sources against source strength. But the observations showed more faint sources than predicted, indicating that the density of sources was higher in the past. This was contrary to the basic assumption of the Steady State theory, that everything was constant in time. For this, and other reasons, the Steady State theory was abandoned. Another attempt to avoid the universe having a beginning was the suggestion that there was a previous contracting phase, but because of rotation and local irregularities, the matter would not all fall to the same point. Instead, different parts of the matter would miss each other, and the universe would expand again, with the density remaining finite. Two Russians, Lifshitz and Khalatnikov, actually claimed to have proved that a general contraction without exact symmetry would always lead to a bounce, with the density remaining finite.
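The "about 15 billion years" figure in the expansion argument is simple arithmetic: if a galaxy at distance d recedes at speed v = H·d, then at constant speed all galaxies coincided a time t = d/v = 1/H ago. A minimal sketch, where the Hubble-constant value of roughly 65 km/s per megaparsec is an illustrative assumption of mine, not a number from the lecture:

```python
# Hubble time: with v = H * d, every galaxy was "on top of" every other
# a time t = d / v = 1 / H ago, independent of which galaxy you pick.
KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7  # seconds in one year

def hubble_time_gyr(h0_km_s_mpc):
    """Return 1/H0 in billions of years, for H0 given in km/s/Mpc."""
    seconds = KM_PER_MPC / h0_km_s_mpc  # (km/Mpc) / (km/s) = seconds
    return seconds / SECONDS_PER_YEAR / 1e9

# An illustrative H0 of 65 km/s/Mpc gives roughly the quoted 15 billion years.
print(round(hubble_time_gyr(65.0), 1))
```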
This result was very convenient for Marxist-Leninist dialectical materialism, because it avoided awkward questions about the creation of the universe. It therefore became an article of faith for Soviet scientists. When Lifshitz and Khalatnikov published their claim, I was a 21-year-old research student looking for something to complete my PhD thesis. I didn’t believe their so-called proof, and set out with Roger Penrose to develop new mathematical techniques to study the question. We showed that the universe couldn’t bounce. If Einstein’s General Theory of Relativity is correct, there will be a singularity, a point of infinite density and spacetime curvature, where time has a beginning. Observational evidence to confirm the idea that the universe had a very dense beginning came in October 1965, a few months after my first singularity result, with the discovery of a faint background of microwaves throughout space. These microwaves are the same as those in your microwave oven, but very much less powerful. They would heat your pizza only to minus 271.3 degrees centigrade, not much good for defrosting the pizza, let alone cooking it. You can actually observe these microwaves yourself. Set your television to an empty channel. A few percent of the snow you see on the screen will be caused by this background of microwaves. The only reasonable interpretation of the background,
is that it is radiation left over from an early very hot and dense state. As the universe expanded, the radiation would have cooled until it is just the faint remnant we observe today. Although the singularity theorems of Penrose and myself predicted that the universe had a beginning, they didn’t say how it had begun. The equations of General Relativity would break down at the singularity. Thus Einstein’s theory cannot predict how the universe will begin, but only how it will evolve once it has begun. There are two attitudes one can take to the results of Penrose and myself. One is that God chose how the universe began for reasons we could not understand. This was the view of Pope John Paul. At a conference on cosmology in the Vatican, the Pope told the delegates that it was OK to study the universe after it began, but they should not inquire into the beginning itself, because that was the moment of creation, and the work of God. I was glad he didn’t realize I had presented a paper at the conference suggesting how the universe began. I didn’t fancy the thought of being handed over to the Inquisition, like Galileo. The other interpretation of our results, which is favored by most scientists, is that it indicates that the General Theory of Relativity breaks down in the very strong gravitational fields in the early universe. It has to be replaced by a more complete theory. One would expect this anyway, because General Relativity does not take account of the small-scale structure of matter, which is governed by quantum theory. This does not matter normally, because the scale of the universe is enormous compared to the microscopic scales of quantum theory. But when the universe is the Planck size, a billion trillion trillionth of a centimeter, the two scales are the same, and quantum theory has to be taken into account. In order to understand the origin of the universe, we need to combine the General Theory of Relativity with quantum theory.
The best way of doing so seems to be to use Feynman’s idea of a sum over histories. Richard Feynman was a colorful character, who played the bongo drums in a strip joint in Pasadena, and was a brilliant physicist at the California Institute of Technology. He proposed that a system got from a state A to a state B by every possible path or history. Each path or history has a certain amplitude or intensity, and the probability of the system going from A to B is given by adding up the amplitudes for each path. There will be a history in which the moon is made of blue cheese, but the amplitude is low, which is bad news for mice. The probability for a state of the universe at the present time is given by adding up the amplitudes for all the histories that end with that state. But how did the histories start? This is the origin question in another guise. Does it require a Creator to decree how the universe began? Or is the initial state of the universe determined by a law of science? In fact, this question would arise even if the histories of the universe went back to the infinite past. But it is more immediate if the universe began only 15 billion years ago. The problem of what happens at the beginning of time is a bit like the question of what happened at the edge of the world, when people thought the world was flat. Is the world a flat plate, with the sea pouring over the edge? I have tested this experimentally. I have been round the
world, and I have not fallen off. As we all know, the problem of what happens at the edge of the world was solved when people realized that the world was not a flat plate, but a curved surface. Time, however, seemed to be different. It appeared to be separate from space, and to be like a model railway track. If it had a beginning, there would have to be someone to set the trains going. Einstein’s General Theory of Relativity unified time and space as spacetime, but time was still different from space, and was like a corridor, which either had a beginning and end, or went on forever. However, when one combines General Relativity with quantum theory, Jim Hartle and I realized, time can behave like another direction in space under extreme conditions. This means one can get rid of the problem of time having a beginning, in a similar way to that in which we got rid of the edge of the world. Suppose the beginning of the universe was like the South Pole of the Earth, with degrees of latitude playing the role of time. The universe would start as a point at the South Pole. As one moves north, the circles of constant latitude, representing the size of the universe, would expand. To ask what happened before the beginning of the universe would become a meaningless question, because there is nothing south of the South Pole. The same laws of Nature hold at the South Pole as in other places. This would remove the age-old objection to the universe having a beginning, that it would be a place where the normal laws broke down. The beginning of the universe would be governed by the laws of science. The idea is that the most probable histories of the universe would be like the surfaces of bubbles. Many small bubbles would appear, and then disappear again. These would correspond to mini universes that would expand, but would collapse again while still of microscopic size.
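Feynman's rule described above, assigning each history an amplitude and adding the amplitudes before squaring, can be illustrated with a toy discrete calculation. The two-path setup and the amplitude values below are invented purely for illustration; a real path integral sums over a continuum of histories.

```python
# Toy sum over histories: the probability of going from A to B is the squared
# magnitude of the *sum* of the amplitudes of every path, so different
# histories can reinforce or cancel one another.

def transition_probability(amplitudes):
    """Add the (possibly complex) path amplitudes, then square the magnitude."""
    return abs(sum(amplitudes)) ** 2

# Two equal paths in phase interfere constructively ...
in_phase = transition_probability([0.5, 0.5])       # amplitudes add to 1
# ... while two paths of opposite sign cancel completely.
out_of_phase = transition_probability([0.5, -0.5])  # amplitudes add to 0
```

The blue-cheese moon of the lecture is just a history whose amplitude is so tiny that it contributes almost nothing to the sum.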
They are possible alternative universes, but they are not of much interest since they do not last long enough to develop galaxies and stars, let alone intelligent life. A few of the little bubbles, however, will grow to a certain size at which they are safe from recollapse. They will continue to expand at an ever-increasing rate, and will form the bubbles we see. They will correspond to universes that start off expanding at an ever-increasing rate. This is called inflation, like the way prices go up every year.
The world record for inflation was in Germany after the First World War. Prices rose by a factor of ten million in a period of 18 months. But that was nothing compared to inflation in the early universe. The universe expanded by a factor of a million trillion trillion in a tiny fraction of a second. Unlike inflation in prices, inflation in the early universe was a very good thing. It produced a very large and uniform universe, just as we observe. However, it would not be completely uniform. In the sum over histories, histories that are very slightly irregular will have almost as high probabilities as the completely uniform and regular history. The theory therefore predicts that the early universe is likely to be slightly non-uniform. These irregularities would produce small variations in the intensity of the microwave background from different directions. The microwave background has been observed by the MAP satellite, and was found to have exactly the kind of variations predicted. So we know we are on the right lines. The irregularities in the early universe will mean that some regions have slightly higher density than others. The gravitational attraction of the extra density will slow the expansion of the region, and can eventually cause the region to collapse to form galaxies and stars. So look well at the map of the microwave sky. It is the blueprint for all the structure in the universe. We are the product of quantum fluctuations in the very early universe. God really does play dice.
We have made tremendous progress in cosmology in the last hundred years. The General Theory of Relativity and the discovery of the expansion of the universe shattered the old picture of an ever-existing and everlasting universe. Instead, general relativity predicted that the universe, and time itself, would begin in the big bang. It also predicted that time would come to an end in black holes. The discovery of the cosmic microwave background, and observations of black holes, support these conclusions. This is a profound change in our picture of the universe, and of reality itself. Although the General Theory of Relativity predicted that the universe must have come from a period of high curvature in the past, it could not predict how the universe would emerge from the big bang. Thus general relativity on its own cannot answer the central question in cosmology: why is the universe the way it is? However, if general relativity is combined with quantum theory, it may be possible to predict how the universe would start. It would initially expand at an ever-increasing rate. During this so-called inflationary period, the marriage of the two theories predicted that small fluctuations would develop and lead to the formation of galaxies, stars, and all the other structure in the universe. This is confirmed by observations of small non-uniformities in the cosmic microwave background, with exactly the predicted properties. So it seems we are on our way to understanding the origin of the universe, though much more work will be needed. A new window on the very early universe will be opened when we can detect gravitational waves by accurately measuring the distances between spacecraft. Gravitational waves propagate freely to us from earliest times, unimpeded by any intervening material. By contrast, light is scattered many times by free electrons. The scattering goes on until the electrons freeze out, after 300,000 years.
Despite these great successes, not everything is solved. We do not yet have a good theoretical understanding of the observation that the expansion of the universe is accelerating again, after a long period of slowing down. Without such an understanding, we cannot be sure of the future of the universe. Will it continue to expand forever? Is inflation a law of Nature? Or will the universe eventually collapse again? New observational results and theoretical advances are coming in rapidly. Cosmology is a very exciting and active subject. We are getting close to answering the age-old questions. Why are we here? Where did we come from? •
THE ASTRONAUT AN INTERVIEW WITH JEFFREY HOFFMAN
Professor in the Department of Aeronautics and Astronautics at MIT, Jeffrey Hoffman flew on five shuttle missions as a NASA astronaut. He also served as NASA’s European representative for four years. Don Cohen spoke with him at his office at MIT. Cohen: You were a scientist first and then an astronaut. How did that come about? Hoffman: I’ve been interested in space since I was a little kid. When the first astronauts began to fly, I was excited by the idea of flying in space, but I had no desire to be a military test pilot. I took an astronomy course in college, found that I liked it, and went on for a PhD in astrophysics. I was most interested in high-energy astrophysics for two reasons. I liked the space connection, the fact that you had to go above the atmosphere. And, because we were looking at these wavelengths for the first time, you were almost guaranteed to make interesting discoveries. I was involved in the discovery and elucidation of the nature of X-ray bursts, work I did with Walter Lewin. Cohen: And you need to get above the atmosphere to study those wavelengths? Hoffman: Absolutely. For my PhD thesis, we flew gamma-ray detectors in balloons. That was before we realized that you can’t do gamma-ray astronomy from balloons: you need more exposure time. Now we do it from satellites, of course. Here at MIT we had our own SAS-3, a small university-class satellite, which we operated out of the control room at MIT. The commands went to Goddard to send up to the satellite, but it was our satellite, and we determined what commands should be sent. I was also project scientist for the high-energy X-ray experiment on the first high-energy astrophysics observatory, HEAO-1. I’ve always followed the space program.
When I read that NASA was going to need quite a few new astronauts for the shuttle and that they were looking for scientists, engineers, and doctors, not just for test pilots, I figured, “Why not have a go?” I put in my application and was fortunate enough to be selected in the first round. I was in the first group of shuttle astronauts that showed up for work in 1978. Cohen: What are the benefits of having a scientist in space? Hoffman: I think the most valuable thing was the work I did with scientists on the ground preparing experiments to go into space, being able to use my understanding of the environment of the shuttle in space to help them plan experiments. I worked with scientists in many different fields. Every time I got involved in a new project, it was like being a graduate student all over again, trying to understand what they were doing. I’ve always felt comfortable having a foot in both the science and engineering worlds, even when I was working
with sounding rockets and satellites. Being able to work in both disciplines is important. There are many scientists designing space experiments who really don’t appreciate the limitations and also some of the special opportunities they would have. When you’re doing laboratory science in space, the deeper your understanding of the experiment, the more likely you are to be able to recognize unusual results and take advantage of the serendipity that is a part of most laboratory science. That was very difficult in a Spacelab type mission because everything was so tightly programmed. You had to do what you were supposed to do and then shut down that experiment and go on to the next one. We didn’t have the luxury of turning people loose in the laboratory. NASA is very good at running missions: organizing an EVA [extravehicular activity] or planning for the visit of one of the supply ships. But a laboratory has to be run with flexibility, with rapid response. You need procedures, but you need the flexibility to know when to change things. That’s something I hope some day we’ll be able to get to in the space station. Cohen: How did your astronaut experience help scientists who had never flown? Hoffman: I was not the only one who felt it was important to get the astronaut perspective to scientists. There were a number of us—Franklin Chang-Diaz, Bonnie Dunbar, Rhea Seddon—who formed what was called the Science Support Group. We produced a movie where we went through some of the problems that people don’t understand—simple things like handling fluids, for example, because people didn’t plan on the unusual types of fluid behavior. Experiments can go awry because of that. And thermal control. Particularly in the early days, we lost a lot of locker experiments because of thermal problems. There is no density-driven convective cooling in weightlessness. Also things involving cabling could cause problems. Cables have a life of their own in space. 
You need to design systems so you can set them up and take them down without spending hours controlling all these things that are floating around. Little parts floating away can ruin your day. There are things you can do to avoid that, but you have to think of them beforehand.
Cohen: So, for various reasons, good communication between scientists and astronauts is important.
Hoffman: Right now, the system places as many barriers as possible between the scientists and the crew. During some of the older Russian missions—I think they still do it this way—it was a requirement that the scientists be there to talk with the crew. At least, that is what some of the Russian scientists told me. We don’t allow that. A scientist has to put in a request to get something to the crew, and that has to be sent to the PAYCOM [payload command], and the PAYCOM, who is not a scientist and may not have a deep understanding of the science, has to transmit that up to the crew. It’s not the way laboratories should work. Cohen: Like a game of telephone, where you lose the message in translation. Hoffman: You got it. People guard the air-to-ground loops very carefully. They don’t want people getting on who don’t have proper protocols. But frankly it’s easier to teach a scientist how to talk over the air-to-ground loops than it is to teach a contractor or someone working at the PAYCOM console to be a scientist. Verbal communication is part of it. The training is much more important. The amount of time the crew can spend getting to know the scientists and understand the science that is supposed to be done is far more critical than the conversation back and forth. Cohen: Do you think we’re getting better at using the space station for research? Hoffman: A lot of people are working very hard to increase the efficiency of research operations on the station. We’re only just starting the operations phase of the space station. It took years before we really learned how to operate the shuttle efficiently. We’re pushing in the right direction, but it takes time. There are cultural gaps that have to be bridged. I hope we can do it successfully. At the moment the crews are still overscheduled.
I think that maybe the biggest challenge that faces the space station program, at least from the scientific point of view, is transforming the station from a construction project into a flexible, working, scientific laboratory. Cohen: The construction is essentially finished … Hoffman: Yes, but the crew is still incredibly busy taking care of the station. People are trying to figure out how to get more crew time available, and not just time on orbit. When crews trained for Spacelab missions, they spent a lot of time with the scientists. In many cases, they were personally invested in the experiments, because they had spent time in the laboratories and they knew what the scientists were trying to achieve. I think that made a big difference in the success of many Spacelab experiments. Crews are so overwhelmed with training responsibilities now—just the basic stuff you’ve got to do in Russia, plus learning the European module, the Japanese module, robotics training, EVA training. The crew has basically been pulled out of the kind of science training that was a part of Spacelab missions, to the detriment of space station science. Cohen: What are the potential advantages of science in space? Hoffman: In many cases, weightlessness improves the precision with which you’re able to make measurements. I remember an experiment where one of the limitations in the lab was the pressure gradient in a fluid caused by gravity. In space, you have no pressure gradient, so you can get an order of magnitude improvement in the precision of the measurement. I think there are planned experiments for atomic clocks up in orbit. Because you don’t have atoms falling out of the field of view of the exciting lasers because of gravity, you can observe them for a much longer time, and that gives you better precision. The hope is that we’ll get maybe an order of magnitude improvement in our ability to measure time. Whenever we make an improvement in our ability to measure time, it ends up having technological spinoffs, GPS being the most obvious example. In other cases, you’re trying to look at phenomena which flat-out don’t exist on the ground. That’s probably where serendipity is going to be even more important. Cohen: What kinds of experiments would you personally like to see happen at the space station?
Hoffman: Telling time better is probably at the top of the list for me, if only because there have been so many benefits from telling time better in the past. Demonstrating the efficacy of the station as a useful investment for our country is probably going to come from biotechnology. I was in Houston last weekend at the International Space Medicine Summit. They announced that they are reactivating the bioreactor program, which I think has tremendous potential for health. If it turns out that this bioreactor research in orbit can lead to better vaccines and medicines and treatments, that’s the sort of thing that the public will really respond to. What goes on in laboratories doesn’t make the news, except when major discoveries are made. We hope there are going to be some significant discoveries from the space station. Cohen: And the advantage of having a bioreactor on the space station is what?
Hoffman: Suppose you have a bit of liver tissue. It starts to grow. As it gets bigger, it sinks toward the bottom. So you rotate the bioreactor so it’s at the top again. It’s continually falling through the liquid, and it continues to grow. The problem is, as it gets bigger and bigger you have to rotate faster and faster to counteract the settling forces. Eventually you build up shear forces, which will rip the material apart. So there’s a limit. In space, where you don’t have the settling, these three-dimensional tissue cultures can be grown much bigger. That’s been demonstrated. The original work was done up on the Mir station. They’ve actually seen vascularization of tissues; they’ve grown knee cartilage, liver cells, cancer cells. You can then use these to test drugs. If you can get good three-dimensional human tissue to test on, you could save one or two years in the development of a drug. At $100 million a year—my understanding is drug development can cost that much—that’s enough to finance experiments up in space, assuming that we can do them quickly. That’s part of the other challenge I mentioned before: turning the station into a working laboratory. If a pharmaceutical company or a research university comes up with something they’d like to test and they’ve got to wait three years in the queue, you’ve lost it. Cohen: Do you think the station has a role to play in future space exploration? Hoffman: I very much believe in the space station as preparation for long-duration spaceflight, and I hope we will take up that mantle again. Cohen: And fly to Mars some day? Hoffman: The more we learn about Mars, the more fascinating a place it is in terms of geological history, potential for biology, and resources. For long-term activities on Mars, we need to be able to do ISRU, in situ resource utilization. All explorers have lived off the land. The first time we go there, we’ll take everything we need, just like the first time we went to the moon, but for longer-term exploration we need to learn how to use the local resources. That’s absolutely critical. It makes a huge difference in terms of the ultimate cost as well, if you can make your own oxygen and rocket fuel. We need to do that first on the moon. There are differences between the moon and Mars, but would we really rely on surface operations that we’ve never tested out on another heavenly body the first time we go to Mars? I don’t think there needs to be a permanently manned moon base; I don’t want to see us build another space station there. Let’s remember that we can operate equipment on the moon telerobotically from the earth. The Mars rovers have to be pretty much autonomous, and when they run into problems, they have to shut down and wait for advice, whereas we can keep things running 24-7 if we want to on the moon, and periodically visit to set them up, make repairs, and do whatever you have to do for operations while they’re building up supplies. We need to do that before we are ready to go to Mars. We also need to develop and demonstrate the capabilities for deep-space travel. That’s where visits to asteroids come in, because you don’t have to land on them. We don’t now have the technological capability to do entry, descent, and landing on Mars with human-class vehicles. I think we can develop at least the entry capability with experiments in the upper reaches of the earth’s atmosphere, which I know NASA is thinking about, but we’ve never had successful demonstrations. So there’s a lot that has to be done before we go to Mars. •
SPACECRAFT MAPS MOON'S GRAVITY
NASA Solves Mystery of Moon's Surface Gravity
NASA's Gravity Recovery and Interior Laboratory (GRAIL) mission has uncovered the origin of massive invisible regions that make the moon's gravity uneven, a phenomenon that affects the operations of lunar-orbiting spacecraft. Because of GRAIL's findings, spacecraft on missions to other celestial bodies can navigate with greater precision in the future. GRAIL's twin spacecraft studied the internal structure and composition of the moon in unprecedented detail for nine months. They pinpointed the locations of large, dense regions called mass concentrations, or mascons, which are characterized by strong gravitational pull. Mascons lurk beneath the lunar surface and cannot be seen by normal optical cameras. GRAIL scientists found the mascons by combining the gravity data from GRAIL with sophisticated computer models of large asteroid impacts and known detail about the geologic evolution of the impact craters. The findings are published in the May 30 edition of the journal Science. "GRAIL data confirm that lunar mascons were generated when large asteroids or comets impacted the ancient moon, when its interior was much hotter than it is now," said Jay Melosh, a GRAIL co-investigator at Purdue University in West Lafayette, Ind., and lead author of the paper. "We believe the data from GRAIL show how the moon's light crust and dense mantle combined with the shock of a large impact to create the distinctive pattern of density anomalies that we recognize as mascons." The origin of lunar mascons has been a mystery in planetary science since their discovery in 1968 by a team at NASA's Jet Propulsion Laboratory in Pasadena, Calif. Researchers generally agree mascons resulted from ancient impacts billions of years ago. It was not clear until now how much of the unseen excess mass resulted from lava filling the crater or iron-rich mantle upwelling to the crust.
On a map of the moon's gravity field, a mascon appears in a target pattern: a bulls-eye with a gravity surplus, surrounded by a ring with a gravity deficit, which is in turn surrounded by an outer ring with a gravity surplus. This pattern arises as a natural consequence of crater excavation, collapse and cooling following an impact. The increase in density
and gravitational pull at a mascon's bulls-eye is caused by lunar material melted from the heat of a long-ago asteroid impact. "Knowing about mascons means we finally are beginning to understand the geologic consequences of large impacts," Melosh said. "Our planet suffered similar impacts in its distant past, and understanding mascons may teach us more about the ancient Earth, perhaps about how plate tectonics got started and what created the first ore deposits." This new understanding of lunar mascons also is expected to influence knowledge of planetary geology well beyond that of Earth and our nearest celestial neighbor. "Mascons also have been identified in association with impact basins on Mars and Mercury," said GRAIL principal investigator Maria Zuber of the Massachusetts Institute of Technology in Cambridge. "Understanding them on the moon tells us how the largest impacts modified early planetary crusts." Launched as GRAIL A and GRAIL B in September 2011, the probes, renamed Ebb and Flow, operated in a nearly circular orbit near the poles of the moon at an altitude of about 34 miles (55 kilometers) until their mission ended in December 2012. The distance between the twin probes changed slightly as they flew over areas of greater and lesser gravity caused by visible features, such as mountains and craters, and by masses hidden beneath the lunar surface. JPL, a division of the California Institute of Technology in Pasadena, Calif., managed GRAIL for NASA's Science Mission Directorate in Washington. The mission was part of the Discovery Program managed at NASA's Marshall Space Flight Center in Huntsville, Ala. NASA's Goddard Space Flight Center, in Greenbelt, Md., manages the Lunar Reconnaissance Orbiter. Operation of the spacecraft's laser altimeter, which provided supporting data used in this investigation, is led by the Massachusetts Institute of Technology in Cambridge. Lockheed Martin Space Systems in Denver built GRAIL. •
THE FATE OF THE JAMES WEBB SPACE TELESCOPE
NASA's next-generation space observatory promises to open new windows on the Universe — but its cost could close many more. It has to work — for astronomers, there is no plan B. NASA’s James Webb Space Telescope (JWST), scheduled to launch in 2014, is the successor to the Hubble Space Telescope and the key to almost every big question that astronomers hope to answer in the coming decades. Its promised ability to peer back through space and time to the formation of the first galaxies made it the top priority in the 2001 astronomy and astrophysics decadal survey, one of a series of authoritative, ten-year plans drafted by the US astronomy community. And now, the stakes are even higher. Without the JWST, the bulk of the science goals listed in the 2010 decadal survey, released this August, will be unattainable. “We took it as a given that the JWST would be launched and would be a big success,” says Michael Turner, a cosmologist at the University of Chicago, Illinois, and a member of the committee for the past two decadal surveys. “Things are built around it.” Hence the astronomers’ anxiety: the risks are also astronomical. The JWST’s 6.5-metre primary mirror, nearly three times the diameter of Hubble’s, will be the largest ever launched into space. The telescope will rely on a host of untried technologies, ranging from its sensitive light-detecting instrumentation to the cooling system that will keep the huge spacecraft below 50 kelvin. And it will have to operate perfectly on the first try, some 1.5 million kilometres from Earth — four times farther than the Moon and beyond the reach of any repair mission. If the JWST — named after the administrator who guided NASA through the development of the Apollo missions — fails, the progress of astronomy could be set back by a generation. And yet, as critical as it is for them, astronomers’ feelings about the JWST are mixed. 
To support a price tag that now stands at roughly US$5 billion, the JWST has devoured resources meant for other major projects, none of which can begin serious development until the binge is over. Missions such as the Wide-Field Infrared Survey Telescope, designed to study the Universe’s dark energy and designated the top-priority space-astronomy project in the most recent decadal survey, will have to wait until after the JWST has launched. “Until then, we’re not projecting being able to afford large investments” in new missions, says Jon Morse, director of NASA’s astrophysics division. And all the space telescopes currently operated by NASA and the European Space Agency will reach the end of their planned lifetimes in the next few years. Worse, the JWST’s costs keep growing. In 2009, NASA required an extra $95 million to cover cost overruns on the telescope. In 2010 it needed a further $20 million. And for 2011 it has requested another $60 million — even as rumours are swirling that still more cash infusions will be required (see ‘Cost curve’). Senator Barbara Mikulski (Democrat, Maryland), chairwoman of the government subcommittee that oversees NASA’s budget, responded to these requests in June by calling for an independent panel to investigate the causes of the JWST’s spiralling cost and delays, and to find a way to bring them to resolution. “Building the JWST
is an awesome technical challenge,” Mikulski says. “But we’re not in the business of cost overruns.” John Casani, chairman of Mikulski’s investigative panel and a former project manager for NASA’s Voyager, Galileo and Cassini missions, emphasizes that the panel is making suggestions, not decisions. Those will be up to NASA, which is expected to announce a budgetary plan incorporating the panel’s suggestions on 2 November. But in considering potential solutions for the JWST’s woes, Casani says that “everything will be on the table” — including, conceivably, scrapping instruments or otherwise downgrading the programme.
The Goldin Opportunity The first concept for a Hubble replacement emerged in 1989, when Hubble was still a year away from launch. Astronomers already knew that its vision would not quite reach back to the ‘cosmic dawn’, 500 million years after the Big Bang, when the first stars and galaxies formed. So a next-generation space telescope that could fill the gap seemed like the logical next step. In 1993, NASA asked a committee of astronomers, chaired by Alan Dressler of the Carnegie Observatories in Pasadena, California, to define what such a telescope would need. The new telescope’s mirror would have to be big to gather the dim light of those first galaxies. So the committee recommended that the primary mirror be at least 4 metres across. The telescope would also have to be cryogenically cold, because at any temperature higher than 50 kelvin, infrared heat radiation from the telescope itself would wash out the faint photons that the astronomers were looking for. “That was the science that propelled the whole thing,” says Dressler. Finally, it would have to operate far from Earth. At infrared wavelengths, this planet glows like a light bulb. So the committee recommended that the telescope be placed 1.5 million kilometres outside Earth’s orbit, at the second Lagrangian point (L2), where the combined gravitational pull of the Sun and Earth creates a region of stability. Any spacecraft at L2 will also lie in the shadow cast by Earth, making it easier to keep cool (see ‘The James Webb Space Telescope’). In December 1995, Dressler briefed NASA’s then administrator, Daniel Goldin, on the recommendations. Goldin was intrigued. He was shaking up NASA’s science programmes, pushing a ‘faster, better, cheaper’ strategy to deliver more capable and inspiring missions at lower costs. 
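As an aside not in the original article, the 1.5-million-kilometre figure for L2 can be recovered from a standard back-of-the-envelope balance: a spacecraft slightly beyond Earth must circle the Sun with Earth's angular velocity, and setting solar gravity plus Earth's gravity equal to the required centripetal force gives the usual Hill-radius approximation. A sketch of the estimate:

```latex
% Sketch: distance r from Earth to the Sun-Earth L2 point.
% To leading order in the Earth-to-Sun mass ratio m_E / M_sun,
r \;\approx\; R \left( \frac{m_{\mathrm{E}}}{3\, M_{\odot}} \right)^{1/3},
% where R = 1~AU is the Earth-Sun distance. With
% m_E / M_sun ~ 3 x 10^{-6}, the bracket is (10^{-6})^{1/3} = 10^{-2}, so
r \;\approx\; 1.5 \times 10^{8}\ \mathrm{km} \times 10^{-2}
  \;\approx\; 1.5 \times 10^{6}\ \mathrm{km}.
```

This matches the 1.5 million kilometres quoted for the JWST's operating station.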
Taking his cues from Silicon Valley and aerospace ‘skunkworks’ projects — small, highly autonomous ventures pursuing innovation within larger organizations — Goldin was pushing for miniaturization of bulky electronics, more off-the-shelf components, lower organizational overheads, and a continuous expansion of the technological boundaries with each mission. Dressler’s proposal seemed like a perfect opportunity to test that approach. Instead of a 4-metre telescope, Goldin asked, why not try one with a primary mirror 6–8 metres in
diameter? Some of the technology was in hand: NASA was developing the cryogenic infrared Spitzer Space Telescope with a 0.85-metre mirror made of beryllium, a metal that needs special handling — it corrodes skin at a touch — but is lightweight and keeps its shape through extreme temperature changes. That and other innovations could give the JWST a mega-mirror while reducing costs. As Goldin put it in a speech: “Let’s throw away glass. Glass is for the ground.” Some astronomers were dubious about initial cost estimates for the ambitious mission, which ranged from $500 million to $1 billion. But in the beginning, Goldin’s methods seemed to deliver: the first missions using the approach were wildly successful. Among them were 1997’s landmark Mars Pathfinder mission and its accompanying rover, Sojourner, and the 1998 Lunar Prospector mission that found evidence of water ice on the Moon. But they were followed in 1999 by the disastrous losses of the Wide-Field Infrared Explorer telescope and two planetary missions, the Mars Climate Orbiter and the Mars Polar Lander. This string of failures tarnished the agency’s reputation, and reminded everyone that ‘faster, better, cheaper’ was also riskier. By the end of Goldin’s tenure in 2001, NASA had already begun shifting back to its traditional, risk-averse and far more expensive strategy of exhaustive testing and extensive oversight. That shift would send the cost of the JWST soaring past the billion-dollar mark. The mirror diameter would be cut from 8 metres to 6.5 metres to help reduce costs. But in the meantime, as NASA carried out the many engineering trade-off studies and scientific working groups required to solidify the telescope’s design, a more insidious factor came into play: scientists started to pile on complexity. It happens with almost every major mission, says Peter Stockman, former head of the JWST mission office at the Space Telescope Science Institute in Baltimore, Maryland. 
“Everyone fears it will be the last opportunity in their scientific lifetime.” And there seemed little reason for restraint: in the 1990s, when the bulk of the design work was done, NASA’s astrophysics budget was projected to keep growing by a few per cent a year.
Stretched capabilities With each iteration, the JWST’s science objectives swelled. The core instrument package came to include a large-field-of-view near-infrared camera (NIRCam) and a multi-object near-infrared spectrograph (NIRSpec), primarily for investigating the earliest stars and galaxies; a general-purpose mid-infrared camera and spectrograph for observing dust-shrouded objects in the Milky Way; and a fine guidance sensor and tunable-filter imager to support the other three. These expanded capabilities would have to be supported by expensive and largely unproven technologies. The instruments needed extra-large, ultrastable infrared detectors. A five-layered membranous sunshield would have to be folded around the spacecraft before launch, then deployed in space to allow the telescope to cool to cryogenic temperatures. Unfurled, each layer would be about the same area as a tennis court. The primary mirror, too large to fit into any existing rocket fairing, would have to be assembled in 18 hexagonal, adjustable segments that would also unfold in orbit. Each segment would be painstakingly chiselled from beryllium, then coated with gold and polished. Arrays of electromechanical devices called
microshutters would allow NIRSpec to take spectra from up to 100 objects simultaneously, even if some of those objects were faint and lay next to brighter stars. Each individually controllable microshutter would be the width of a few human hairs, and NIRSpec would require more than 62,000 of them. In addition, every piece of technology in the spacecraft would have to be engineered to endure the violent vibrations of launch, the hard vacuum of outer space and the slow cool-down to cryogenic temperatures. The telescope’s optical surfaces, in particular, would have to survive all this while staying aligned to a precision of nanometres. And everything would have to perform nearly flawlessly for a minimum of five years, the baseline mission length. Small wonder, then, that NASA ended up spending almost $2 billion just on the JWST’s initial technology development. Nonetheless, the agency did not substantially cut any of the telescope’s capabilities to bring the costs back under control. Instead, it looked for partnerships, securing major contributions from the European and Canadian space agencies. NASA also maximized support for the project on Capitol Hill by awarding contracts for spacecraft components to a small army of companies and universities scattered through many congressional districts. Aerospace giant Northrop Grumman of Los Angeles, California, became the JWST’s prime contractor, under NASA’s Goddard Space Flight Center in Greenbelt, Maryland, which would manage the overall project. By the time the JWST passed its preliminary design reviews in spring 2008 and NASA had officially committed to building it, the project had been transformed from its comparatively modest ‘faster, better, cheaper’ origins into an audacious multibillion-dollar, multi-instrument mission spanning institutions, countries and continents.
Passing the Test For nearly a year now, engineering models of the JWST’s various components have been trickling into the clean room in Goddard’s Building 29 for testing. (The centre’s white-suited technicians can be seen at work on Internet ‘Webb-cams’.) Pieces of actual flight hardware are supposed to start arriving in the same room in spring and summer 2011. All of the JWST’s riskiest technologies have met their critical milestones and are on schedule for the 2014 launch. The most substantial challenge remaining before launch is to integrate and test the flight components to ensure that they function as a whole — and, of course, to do all that without exceeding the remaining budget. NASA’s traditional method is to ‘test as you fly’ — to operate the integrated flight hardware in conditions as close as possible to those it will experience in space. The problem is that the fully assembled telescope will be far too large to fit into any available thermal vacuum chamber. Just as the JWST’s scientific objectives required new technology, mission planners have had to devise entirely new protocols to test it. “With the JWST we have to do incremental modelling, building and testing, validating our model at each stage and then moving up to the next level of assembly,” says Phil Sabelhaus, the JWST project manager at Goddard. “We aren’t only testing — we’re also proving our ability to model correctly, which is how we will evaluate the JWST’s absolute performance on-orbit.” This
hierarchical assembly, testing and modelling is laborious and time-consuming, more like building several telescopes than one, and is a major contributor to the JWST’s remaining costs. So, unsurprisingly, it is one of the most probable targets for cost-cutting. “There are tests that are really essential to do, and tests that would be nice to do,” says Dressler. “With something of this magnitude, there is a natural tendency to double-check and triple-check, and maybe we can’t afford that.” On the other hand, he says, maybe they can’t afford not to: it was a decision to save money on testing that allowed a defect in Hubble’s primary mirror to go undetected until it was in orbit, nearly dooming the entire mission. The JWST’s supporters contend that, even with further budget overruns, the telescope will still break the historical cost pattern for large space telescopes. “Not even including its four space-shuttle servicing missions, Hubble cost $4 billion or $5 billion in today’s dollars just to build and launch,” Dressler notes. “Here we are, building a telescope that is almost seven times bigger, it is cryogenic, it is operating 1.5 million kilometres away, and it is costing the same amount as Hubble did, if not less. That is remarkable, and this is probably the biggest scale on which we will consider building such things in this country.” Even so, ambivalence still surrounds the JWST. Failure is not an option, either for NASA or for the astronomers it supports. Yet, in the face of flat or declining budgets, a dwindling docket of near-term astrophysics missions and rising public outrage over perceptions of runaway government spending, tough questions are inevitable. At a mid-September meeting of the agency’s astrophysics subcommittee, efforts to nail down just how many extra dollars lie between the JWST and its eventual arrival at L2 were met with silence. 
Until the announcement of a new budget and schedule, informed by recent panel reviews, that is the best answer anyone is likely to get. •
DESIGNED BY ANNA SCHOENBERGER