According to Stephen Hawking, the highly acclaimed British scientist, genetic engineering should be used to prevent human intelligence from being overtaken by that of computers. Moore's law famously states that computers double their performance every 18 months. Hawking claims that targeted genetic changes could increase the complexity of our DNA and literally "improve" human beings. In his view, this is the road we should take if we want biological systems to remain superior to electronic ones. Other researchers are also concerned about the future of humankind. Bill Joy of Sun Microsystems recently wrote an online essay for Wired magazine, titled "Why the Future Doesn't Need Us", prompting discussion about how far robotics, genetic engineering and nanotechnology might go in supplanting and overpowering humanity. How real is the danger that our computers might develop intelligence and take over the world? Do we really want to build machines that will have intellectual advantages over their creators?
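To see what the 18-month doubling cited above implies, a little arithmetic helps: one decade of doubling every 18 months works out to roughly a hundredfold performance increase. A minimal sketch in Python (the 18-month figure is the article's; everything else is illustrative):

```python
# Moore's law as cited in the article: performance doubles every 18 months.
# Purely illustrative arithmetic, not a claim about actual hardware trends.

def moore_factor(months, doubling_period=18):
    """Performance multiplier after a given number of months."""
    return 2 ** (months / doubling_period)

print(round(moore_factor(120)))   # one decade: prints 102, i.e. ~100x
print(round(moore_factor(18)))    # one doubling period: prints 2
```

Compounded over two decades the same rule gives a factor of more than ten thousand, which is why even modest per-period doubling drives the concerns the article describes.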
An earlier article gives a brief overview of projects aiming to build intelligent machines capable of communicating in natural language and acquiring new knowledge. In reality, researchers are still struggling to make intelligent machines that can learn on their own. Artificial Intelligence NV (Ai), an international company split between Boston and Tel Aviv, claims a breakthrough with a computer that is learning to talk like a toddler. In contrast to the traditional approach of using a set of hardwired rules and a vocabulary database to approximate human conversation, researchers at Ai follow Turing's idea of the child machine. This "developmental" approach becomes especially interesting when applied to hardware: an autonomous robot cannot hold all the details of the environment in which it is going to act. Following this idea, Professor Juyang Weng of Michigan State University recently introduced a new kind of robot: a machine that can develop its "mental skills" automatically through real-time interactions with its environment.
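The contrast between the two approaches can be sketched in code. The toy class below is not Ai's system or Professor Weng's algorithm, merely an illustration of the child-machine idea under the simplest possible assumptions: the agent starts with no hardwired rules and builds up stimulus-response associations incrementally, one interaction at a time, the way a teacher corrects a toddler.

```python
# A toy "child machine": no hardwired rules or vocabulary database.
# All names here (ChildMachine, respond, teach) are hypothetical, chosen
# only to illustrate incremental, interaction-driven learning.

class ChildMachine:
    def __init__(self):
        self.memory = {}  # stimulus -> learned response, empty at "birth"

    def respond(self, stimulus):
        # Recall a previously learned response, or admit ignorance.
        return self.memory.get(stimulus, "?")

    def teach(self, stimulus, correct_response):
        # Incremental update: each interaction refines memory in place;
        # there is no offline retraining over a fixed corpus.
        self.memory[stimulus] = correct_response

bot = ChildMachine()
print(bot.respond("hello"))   # prints "?" -- nothing learned yet
bot.teach("hello", "hi there")
print(bot.respond("hello"))   # prints "hi there"
```

The point of the sketch is the interaction loop: knowledge arrives through teaching events rather than being programmed in advance, which is what makes the approach attractive for a robot that cannot carry a complete model of its environment.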
Professor Weng and his team have already built a prototype developmental robot. SAIL (for Self-organizing Autonomous Incremental Learner) is a human-size robot that wanders the halls of Michigan State's Engineering Building, responding to touch, voice, and visual input.
Each of its two eyes is controlled by a fast pan-tilt head. Its torso has four pressure sensors to detect pushes and force. It has 28 touch sensors on its arm, neck, head, and bumper that let a human teach it how to act by direct touch. Its drive base is adapted from a wheelchair, allowing SAIL to operate both indoors and outdoors. The main computer is a high-end dual-processor, dual-bus PC workstation with 512 MB of RAM and an internal 27 GB three-drive disk array for real-time sensory-information processing, real-time memory recall and update, and real-time effector control.