
Robot Visions
Author: Isaac Asimov

This may not be possible in every case, and there will have to be innovative social initiatives to take care of those who, because of age or temperament, cannot fit into the rapidly changing economic scene.

In the past, advances in technology have always necessitated the upgrading of education. Agricultural laborers didn't have to be literate, but factory workers did, so once the Industrial Revolution came to pass, industrialized nations had to establish public schools for the mass education of their populations. There must now be a further advance in education to go along with the new high-tech economy. Education in science and technology will have to be taken more seriously and made lifelong, for advances will occur too rapidly for people to be able to rely solely on what they learned as youngsters.

Wait! I have mentioned robot technicians, but that is a general term. Susan Calvin was not a robot technician; she was, specifically, a robopsychologist. She dealt with robotic "intelligence," with robots' ways of "thinking." I have not yet heard anyone use that term in real life, but I think the time will come when it will be used, just as "robotics" was used after I had invented that term. After all, robot theoreticians are trying to develop robots that can see, that can understand verbal instructions, that can speak in reply. As robots are expected to do more and more tasks, more and more efficiently, and in a more and more versatile way, they will naturally seem more "intelligent." In fact, even now, there are scientists at MIT and elsewhere who are working very seriously on the question of "artificial intelligence."

Still, even if we design and construct robots that can do their jobs in such a way as to seem intelligent, it is scarcely likely that they will be intelligent in the same way that human beings are. For one thing, their "brains" will be constructed of materials different from the ones in our brains. For another, their brains will be made up of different components hooked together and organized in different ways, and will approach problems (very likely) in a totally different manner.

Robotic intelligence may be so different from human intelligence that it will take a new discipline, "robopsychology," to deal with it. That is where Susan Calvin will come in. It is she and others like her who will deal with robots, where ordinary psychologists could not begin to do so. And this might turn out to be the most important aspect of robotics, for if we study in detail two entirely different kinds of intelligence, we may learn to understand intelligence in a much more general and fundamental way than is now possible. Specifically, we will learn more about human intelligence than may be possible to learn from human intelligence alone.

The Robot As Enemy?

It was back in 1942 that I invented "the Three Laws of Robotics," and of these, the First Law is, of course, the most important. It goes as follows: "A robot may not injure a human being, or, through inaction, allow a human being to come to harm." In my stories, I always make it clear that the Laws, especially the First Law, are an inalienable part of all robots and that robots cannot and do not disobey them.

I also make it clear, though perhaps not as forcefully, that these Laws aren't inherent in robots. The ores and raw chemicals of which robots are formed do not already contain the Laws. The Laws are there only because they are deliberately added to the design of the robotic brain, that is, to the computers that control and direct robotic action. Robots can fail to possess the Laws, either because they are too simple and crude to be given behavior patterns sufficiently complex to obey them or because the people designing the robots deliberately choose not to include the Laws in their computerized makeup.

So far, and perhaps it will be so for a considerable time to come, it is the first of these alternatives that holds sway. Robots are simply too crude and primitive to be able to foresee that an act of theirs will harm a human being and to adjust their behavior to avoid that act. They are, so far, only computerized levers capable of a few types of rote behavior, and they are unable to step beyond the very narrow limits of their instructions. As a result, robots have already killed human beings, just as enormous numbers of noncomputerized machines have. It is deplorable but understandable, and we can suppose that as robots are developed with more elaborate sense perceptions and with the capability of more flexible responses, there will be an increasing likelihood of building safety factors into them that will be the equivalent of the Three Laws.

But what about the second alternative? Will human beings deliberately build robots without the Laws? I'm afraid that is a distinct possibility. People are already talking about security robots. There could be robot guards patrolling the grounds of a building or even its hallways. The function of these robots could be to challenge any person entering the grounds or the building. Presumably, persons who belonged there, or who were invited there, would be carrying (or would be given) some card or other form of identification that would be recognized by the robot, who would then let them pass. In our security-conscious times, this might even seem a good thing. It would cut down on vandalism and terrorism and it would, after all, only be fulfilling the function of a trained guard dog.

But security breeds the desire for more security. Once a robot became capable of stopping an intruder, it might not be enough for it merely to sound an alarm. It would be tempting to endow the robot with the capability of ejecting the intruder, even if it would do injury in the process, just as a dog might injure you in going for your leg or throat. What would happen, though, when the chairman of the board found he had left his identifying card in his other pants and was too upset to leave the building fast enough to suit the robot? Or what if a child wandered into the building without the proper clearance? I suspect that if the robot roughed up the wrong person, there would be an immediate clamor to prevent a repetition of the error.
