Something is moving. Two robots sitting motionless in the dust have spotted it. One, a six-wheeled rover, radios to the other perched high on a rocky slope. Should they take a photo and beam it back to mission control? Time is short, they have a list of other tasks to complete, and the juice in their batteries is running low. The robots have seconds to decide. What should they do?

Today, mission control is a mere 10 metres away, in a garage here at Nasa's Jet Propulsion Laboratory (JPL) in Pasadena, California. Engineers can step in at any time. But if the experiment succeeds and the robots spot the disturbance and decide to beam the pictures back to base, they will have moved one step closer to fulfilling Nasa's vision of a future in which teams of smart space probes scour distant worlds, seeking out water or signs of life with little or no help from human controllers.
Nasa, along with other space agencies, has already taken the first tentative steps towards this kind of autonomous mission. In 1999, for example, Nasa's Deep Space 1 probe used a smart navigation system to find its way to an asteroid - a journey of more than 600 million kilometres. Since 2003, an autonomous control system has been orbiting our planet aboard Nasa's Earth Observing-1 (EO-1) satellite.
It helps EO-1 to spot volcanic eruptions and serious flooding, so the events can be photographed and the images beamed back to researchers on the ground. The idea is not to do away with human missions altogether. But since it is far cheaper and easier to send robots first, why not make them as productive as possible? Besides, the increasingly long distances they travel from home make controlling a rover with a joystick impractical. Commands from Earth might take 20 minutes to reach Mars, and about an hour to reach the moons of Jupiter.
The closest thing to a space robot with a brain is Nasa's pair of Mars rovers, Spirit and Opportunity, and their abilities are fairly limited. That the craft are still trundling across the red planet and returning valuable geological data is down to engineers at mission control fixing faults remotely. In fact, the rovers can do only simple tasks on their own, says Steve Chien, the head of JPL's artificial intelligence group.
Nasa would not want the rovers to record everything they see and transmit it all back to Earth; the craft simply don't have the power, bandwidth or time. Instead, the team at JPL has spent around a decade developing software that allows the rovers to analyse images as they are recorded and decide for themselves which geological features are worth following up. Key to this is a software package called Oasis - short for on-board autonomous science investigation system.
The idea is that before the rovers set out each day, controllers can give Oasis a list of things to watch out for. This might simply be the largest or palest rock in the rover's field of view, or it could be an angular rock that might be volcanic. There are also practical considerations to take into account. As they trundle around the surface, the rovers must keep track of whether they have enough time, battery power and spare memory capacity to proceed. So the JPL team has also created a taskmaster - software that can plan and schedule activities.
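The taskmaster idea described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration of resource-aware scheduling - the task names, numbers and greedy strategy are invented for the example, not drawn from JPL's actual planner:

```python
def schedule(tasks, time_left, battery, memory):
    """Greedily pick the highest-priority tasks that still fit the
    rover's remaining time, battery power and memory budget."""
    plan = []
    for task in sorted(tasks, key=lambda t: t["priority"], reverse=True):
        if (task["time"] <= time_left and
                task["battery"] <= battery and
                task["memory"] <= memory):
            plan.append(task["name"])
            # Deduct the resources this task will consume.
            time_left -= task["time"]
            battery -= task["battery"]
            memory -= task["memory"]
    return plan

tasks = [
    {"name": "photograph pale rock", "priority": 3, "time": 10, "battery": 5,  "memory": 20},
    {"name": "drive to ridge",       "priority": 2, "time": 40, "battery": 30, "memory": 1},
    {"name": "panorama",             "priority": 1, "time": 25, "battery": 15, "memory": 60},
]
print(schedule(tasks, time_left=60, battery=40, memory=70))
# The panorama is dropped: after the first two tasks there is too
# little time left for it.
```

A real on-board planner would also reschedule as conditions change - a drive that takes longer than expected eats into the budget for everything after it.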
The JPL team has now decided to take the next step: let the rover drive over to an interesting rock and deploy its sensors to take a closer look. To do this, Tara Estlin, a senior computer scientist and one of the team developing autonomous science at JPL, and her colleagues won't be using Oasis, however. Instead, they have taken elements from it and used them to create a new control system called Autonomous Exploration for Gathering Increased Science (Aegis). This has been tested successfully at JPL and is scheduled for uplink and remote installation on the rover Opportunity sometime in September.
Once Aegis is in control, Opportunity will be able to deploy its high-resolution camera automatically and beam data back to Earth for analysis - the first time autonomous software has been able to control a craft on the surface of another world. Though increasingly sophisticated, these autonomous systems are still a long way from the conscious machines of science fiction that can talk, feel and recognise new life forms. Right now, Dr Chien admits, we can't even really programme a robot for "novelty detection" - the equivalent of, say, picking out the characteristic shape of a bone among a pile of rocks - let alone give it the ability to detect living creatures.
In theory, the shape of a complex natural object such as an ice crystal or a living cell could be described in computer code and embedded in a software library. Then the robot would only need a sensor such as a microscope with sufficient magnification to photograph it. But just as a single measurement is unlikely to provide definitive proof of alien life, so most planetary scientists agree that a single robotic explorer, however smart, will not provide all the answers. Instead, JPL scientists envisage teams of autonomous craft working together, orbiting an alien world and scouring the surface for interesting science, then radioing each other to help decide what features deserve a closer look.
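The shape-library idea above can be made concrete with a toy sketch: store a numeric signature for each known shape, then report the library entry nearest to an observation. The signatures, features and threshold here are all invented for illustration - real shape description would be far richer:

```python
import math

# Hypothetical signatures: (elongation, symmetry, edge roughness).
SHAPE_LIBRARY = {
    "ice crystal": (0.2, 0.95, 0.1),
    "bone":        (0.8, 0.40, 0.3),
}

def classify(observation, threshold=0.2):
    """Return the library shape nearest the observed signature,
    or None if nothing lies within the match threshold."""
    best_name, best_dist = None, threshold
    for name, signature in SHAPE_LIBRARY.items():
        dist = math.dist(observation, signature)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

print(classify((0.75, 0.45, 0.25)))  # nearest library entry: "bone"
print(classify((0.5, 0.5, 0.9)))     # nothing close enough: None
```

The hard part, as Chien notes, is not the lookup but building descriptors robust enough that a bone among rocks actually lands near "bone" in the library.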
Back at JPL, the day's test of robot autonomy is almost complete. The two robots are running new software designed to improve co-ordination between craft. Part of the experiment is to see whether the robots can capture a photo of a moving target - in this case a small remote-controlled vehicle nicknamed Junior - and relay it back to "mission control". And it seems to work: the images from the two robots arrive. They include both wide-angle shots and high-resolution close-ups of Junior. Dr Estlin is pleased. As we stand in the heat, a salamander scuttles across a rock. I can't help wondering whether the robots would have picked that out. Just suppose the Mars rover had to choose between a whirling dust devil and a fleeing amphibian? Dr Chien assures me that the software would direct the rover to prioritise, depending on the relative value of the two. I hope it goes for the salamander.
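That prioritise-by-relative-value choice can be sketched as a one-line decision rule. The science scores and capture probabilities below are invented for the example - the point is only that the rover weighs value against the chance of actually getting the shot:

```python
def pick_target(targets):
    """Return the name of the target with the highest expected value,
    where expected value = science score x probability of capture."""
    best = max(targets, key=lambda t: t["score"] * t["capture_prob"])
    return best["name"]

targets = [
    {"name": "dust devil",  "score": 5.0,  "capture_prob": 0.9},
    {"name": "salamander",  "score": 50.0, "capture_prob": 0.3},
]
print(pick_target(targets))
# The salamander wins: an expected value of 15.0 beats 4.5, even
# though the dust devil is the easier shot.
```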
And if alien life proves half as shy, I hope the rover can act fast. www.newscientist.com