Beyond: Our Future in Space
Author: Chris Impey
The next step is to activate a hydrosphere: raise the temperature enough to allow liquid water on the surface. Although still inhospitable, these conditions would allow hardy pioneer organisms such as lichens, algae, and extremophile bacteria to become established. Their role is to prepare the regolith for photosynthetic organisms, and the microbes used would be engineered to be optimally suited to the job. If the heating is done with asteroid impacts, these first two steps might take two to three hundred years.
The last step is to add oxygen to the atmosphere. Since oxygen vigorously supports combustion, care would have to be taken to also add a buffer gas like nitrogen. Brute force would have to be used to import or create the initial oxygen needed for primitive plants, but once more advanced plants can propagate, they become the engine of oxygen production. It would take 500 to 1,000 years to make an atmosphere suitable for animals or humans.
Terraforming may be possible and it’s exciting at a technical level, but to see life breathed into the idea, we can turn to fiction. Kim Stanley Robinson wrote a science fiction trilogy in the mid-1990s about an overpopulated and dying Earth and the “First Hundred,” a pioneering group of Mars colonists. The books capture the ethical issues we’ll face if we go there, telling of the tensions between the Reds who prefer to leave Mars in its pristine state and the Greens who want to turn the planet into a second Earth.[24]
The storytelling is very entertaining, but the physical descriptions are beyond evocative; they’re mesmerizing. Who wouldn’t want to visit Mars after reading this excerpt from Red Mars, the first book in the trilogy: “The sun touched the horizon, and the dune crests faded to shadow. The little button sun sank under the black line to the west. Now the sky was a maroon dome, the high clouds the pink of moss campion. Stars were popping out everywhere, and the maroon sky shifted to a vivid dark violet, an electric color that was picked up by the dune crests, so that it seemed crescents of liquid twilight lay across the black plain.”
_______________________
Extending Our Senses
What if we could have the experience of space travel without actually making the journey?
The cost and difficulty of protecting fragile humans and sending them vast distances through space suggest that we should find a different way to explore space. To see an alternative, look at the evolution of video games.
Pac-Man was the most famous arcade video game of all time. Released in 1980, the game had the player steer a small colored icon through a maze eating dots. Pac-Man’s popularity eclipsed that of space shooter games like Space Invaders and Asteroids, and it’s been estimated that by the end of the twentieth century, ten billion quarters had been dropped in Pac-Man slots. In 2000, a new computer game came out in which the player could create virtual people, houses, and towns and watch their cartoon characters live their virtual lives. The Sims sold more than 150 million copies worldwide. If we think of how far video games came in twenty years, from the primitive graphics of Pac-Man to the cartoonish but quasi-realistic 3-D graphics of The Sims, imagine what another twenty years will bring. A hint of that came in 2014 with the release of the Oculus Rift, a gaming helmet that immerses a player in 3-D virtual reality.[1]
The best sense of the experience is the dramatic opening sequence of the 3-D movie Gravity.
The future of Solar System exploration may lie in telepresence, a set of technologies that allow a person to feel that he or she is in a remote location. Videoconferencing is one familiar and simple form of this technology. The market for projecting images and sound to connect meeting participants from around the world is growing 20 percent a year and is worth nearly $5 billion. Skype video calls now account for a third of all international calls, a staggering 200 billion minutes a year. Other examples include using robots with sonar to explore the ocean floor or robots with infrared sensors to explore caves. The robot provides the “eyes and ears” for an operator who doesn’t have to leave the comfort of an office or home.
When we “look” at Mars through the camera eye of the Curiosity rover or “sniff” the atmosphere with its spectrometer, we are using a form of telepresence. NASA has used red–green stereoscopic imaging on all of its recent rovers, but it missed a big chance to grab the public eye when it failed to build a 3-D high-definition video camera into the Curiosity rover in time for launch. Film director James Cameron had pitched the camera to give Earthlings a “you are there” immediacy as the rover trundled around the red planet.[2]
A lot has changed since the last Apollo astronaut walked on the Moon. At the time of the first Moon landing, real-time, complex decision making had to be carried out by people. Now, robots and machines have impressive capabilities, so they can be remotely controlled by scientists at great distances.
Planetary scientists have used remote sensing for some forty years. The twin Viking landers were designed to analyze samples of Mars soil for traces of microbial life. No cameras were included in the specifications, but Carl Sagan argued that images from the surface would engage the public. Besides, he noted mischievously, what if there are Martian polar bears and we miss them because we don’t take pictures? So cameras were added, and their images of stark desert vistas were immediately compelling to the public. Probes to the outer Solar System since then have “watched” the volcanoes on Io, “listened” to magnetic storms on Jupiter, “sniffed” the atmosphere of Titan, and “tasted” the icy geysers on Enceladus.
Telepresence implies something more than remote sensing; it’s a technology that allows someone to feel as if he’s in a remote location. The word was coined in 1980 by the American cognitive scientist and AI pioneer Marvin Minsky. He was inspired by a short story by science fiction author Robert Heinlein. The concept was further developed by Fred Saberhagen in Brother Assassin, from the Berserker series:
. . . it seemed to all his senses that he had been transported from the master down into the body of the slave-unit standing beneath it on the floor. As the control of its movements passed over to him, the slave started gradually to lean to one side, and he moved its foot to maintain balance as naturally as he moved his own. Tilting back his head, he could look up through the slave’s eyes to see the master-unit, with himself inside, maintaining the same attitude on its complex suspension.[3]
This level of control and verisimilitude is far off in space exploration, but we’re approaching it with the virtual reality of video games. The difference between gaming and science applications is that a video game tries to digitally re-create a real-world experience while science uses technology to digitally represent and transmit the real world.
Remote control of robots—often called telerobotics—is infiltrating life in surprising ways. Robots are used nowadays to defuse bombs, extract minerals from hazardous mines, and explore the deep sea floor. They also act as aerial drones and doctor’s assistants. They’re even beginning to be seen in the boardroom and the workplace. Many commercial robots look like vacuum cleaners with a screen on top and are no more than ventriloquist’s dummies; after the comical first impression, it’s disconcerting to realize that there’s a real person at the other end of the device. A striking recent example was a talk by Edward Snowden at the TED2014 conference.[4]
The controversial NSA whistleblower was in hiding somewhere in Russia, but he was represented on stage by a screen attached to two long legs that ended in a motorized cart. Snowden communicated with the moderator and turned toward the audience to answer questions; he could see and hear everything that was going on.
At a 2012 symposium on “Space Exploration via Telepresence,” held at NASA’s Goddard Space Flight Center, scientists rubbed shoulders with roboticists and technology entrepreneurs. A major topic was latency—the time it takes a robot to respond to commands and communicate results back to the operator. Latency is governed by the speed of light. In terrestrial applications, latency is essentially zero, but on the Moon it’s a couple of seconds, on Mars it ranges from ten to forty minutes, and to the outer Solar System it’s up to ten hours. This makes real-time communications impossible.
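The latencies quoted here follow directly from dividing distance by the speed of light. A minimal Python sketch makes the scaling concrete; the distances used are assumed round figures for illustration, not mission-specific values:

```python
# Round-trip signal latency at the speed of light.
# Distances are illustrative averages, not mission values.

C_KM_PER_S = 299_792  # speed of light in km/s

# Approximate distances from Earth in kilometers (assumed round figures):
DISTANCES_KM = {
    "Moon": 384_400,
    "Mars (closest)": 54_600_000,
    "Mars (farthest)": 401_000_000,
    "Neptune": 4_500_000_000,
}

def round_trip_latency_s(distance_km: float) -> float:
    """Time for a command to reach the target and the reply to return."""
    return 2 * distance_km / C_KM_PER_S

for body, d in DISTANCES_KM.items():
    t = round_trip_latency_s(d)
    if t < 60:
        print(f"{body}: {t:.1f} s")        # Moon comes out near 2.6 s
    elif t < 3600:
        print(f"{body}: {t / 60:.0f} min")  # Mars spans minutes to ~45 min
    else:
        print(f"{body}: {t / 3600:.1f} h")  # outer planets take hours
```

The Moon’s roughly 2.6-second round trip is awkward but workable for direct teleoperation; the tens of minutes to Mars and hours to the outer planets are what force robots toward autonomy.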
Astronauts on the International Space Station have tested the remote control of a mobile robot named Justin, which was developed by the German Aerospace Center.[5]
The robot has four-fingered hands, and astronauts control it with a sense of “touch”: haptic technology uses forces and vibrations to re-create tactile feedback at the operator’s end.[6]
To avoid latency, and to avoid the costs of going in and out of gravity pits, future explorers may control ground operations from Moon orbit or Mars orbit. NASA is testing a “blue collar” robotic miner that digs, fills and empties buckets, and can right itself if it falls. It would be part of the advance expedition to Mars to mine and build with local materials in preparation for the later arrival of astronauts. Meanwhile, the European Space Agency is developing a robotic exoskeleton so that astronauts can control a remote robot as if it were an extension of their body, and they’ve already tested a robot that can carry out simple tasks aboard the International Space Station (Figure 41).
Figure 41. Robonaut is a robotic humanoid development project conducted by NASA’s Johnson Space Center. This version, R2, was first used on the International Space Station in 2011. The torso can be positioned to help the crew with engineering tasks and extra-vehicular activities (EVAs).
The frontier of telepresence is its merger with artificial intelligence, a development foreseen by computer science pioneer Marvin Minsky in 1980.[7]
A robot doesn’t need to be just a remote extension of a human; it can process information and make its own decisions. This will be exciting, but it will raise fascinating moral and ethical questions, especially if these semiautonomous robots come into contact with each other.
Here Come the Bots
Richard Feynman was an iconic physicist who won a Nobel Prize for his work in quantum theory. His delight in understanding how nature worked was infectious. In 1959, he gave an influential lecture titled “There’s Plenty of Room at the Bottom,” in which he argued that the miniaturization of machines and computers still had a long way to go, and foresaw technologies that could one day manipulate matter at the scale of individual molecules and atoms.[8]
That day has finally arrived.
Nanotechnology involves scales of around a billionth of a meter. It’s disconcerting to think that our world might one day be run by robots too small to see, but the benefits could be enormous. We’re familiar with swallowing a pill to treat an illness or a disease. But what about swallowing a pill-size robot that can monitor our vital functions from inside and warn of impending problems? Or a pill that could release a thousand tiny molecular machines to combat microbes or regenerate bones or blood vessels? Feynman anticipated a time when we could “swallow the doctor.”[9]
Some people think nanotechnology is unnatural, but the research is often inspired by biology. A beautiful example is the bacterial flagellum, which propels bacteria through a liquid medium. It’s a molecular motor, complete with a propeller, universal joint, rotor, gears, and bushings.[10]
Cancer therapy is high on the list of medical applications, since the current treatments rely on drugs and radiation, which are blunt and often toxic tools. Nanobots would be able to move directly to a cancer site, distinguish between malignant and normal cells, and do treatment without side effects or damage to the immune system. The potential of using nanobots for drug delivery and regenerative medicine has galvanized medical researchers. Federal grants for applying nanotechnology to medicine now exceed $2 billion (Figure 42). Similar capabilities will be focused on the environment, where nanosponges can be used to clean up oil spills and neutralize toxic chemicals. These nanosponges could also increase the efficiency of oil extraction, avoiding adverse effects of fracking.
The US military is investing heavily in nanotechnology. Many of the programs are classified, but they include miniaturization of drones to the size of insects to conduct surveillance, the use of “smart motes” the size of grains of sand to monitor battlefields for toxic gases, and development of armor and protection for soldiers that can alter its structure at the molecular level.[11]
Figure 42. Nanobots or micromachines will be used increasingly in medicine to deliver drugs, repair damaged tissue, and fight diseases such as cancer. These same technologies can be used for exploration of planets and their moons.