Fastolfe breathed deeply and sat back in his chair. "You hinted as much when you returned from Gladia's." He looked at Baley with a hint of savagery in his eyes. "Could you not have told me this 'key' you have at the start? Need we have gone through all - this?"
"I'm sorry, Dr. Fastolfe. The key makes no sense without all - this."
"Well, then. Get on with it."
"I will. Jander was in a position that you, the greatest robotics theoretician in all the world, did not foresee, by your own admission. He was pleasing Gladia so well that she was deeply in love with him and considered him her husband. What if it turns out that, in pleasing her, he was also displeasing her?"
"I'm not sure as to your meaning."
"Well, see here, Dr. Fastolfe - She was rather secretive about the matter. I gather that on Aurora sexual matters are not something one hides at all costs."
"We don't broadcast it over the hyperwave," said Fastolfe dryly, "but we don't make a greater secret of it than we do of any other strictly personal matter. We generally know who's been whose latest partner and, if one is dealing with friends, we often get an idea of how good, or how enthusiastic, or how much the reverse one or the other partner - or both - might be. It's a matter of small talk on occasion."
"Yes, but you knew nothing of Gladia's connection with Jander."
"I suspected - "
"Not the same thing. She told you nothing. You saw nothing. Nor could any robots report anything. She kept it secret even from you, her best friend on Aurora. Clearly, her robots were given careful instructions never to discuss Jander and Jander himself must have been thoroughly instructed to give nothing away."
"I suppose that's a fair conclusion."
"Why should she do that, Dr. Fastolfe?"
"A Solarian sense of privacy about sex?"
"Isn't that the same as saying she was ashamed of it?"
"She had no cause to be, although the matter of considering Jander a husband would have made her a laughingstock."
"She might have concealed that portion very easily without concealing everything. Suppose, in her Solarian way, she was ashamed."
"Well, then?"
"No one enjoys being ashamed - and she might have blamed Jander for it, in the rather unreasonable way people have of seeking to attribute to others the blame for unpleasantness that is clearly their own fault."
"Yes?"
"There might have been times when Gladia, who has a short-fused temper, might have burst into tears, let us say, and upbraided Jander for being the source of her shame and her misery. It might not have lasted long and she might have shifted quickly to apologies and caresses, but would not Jander have clearly gotten the idea that he was actually the source of her shame and her misery?"
"Perhaps."
"And might this not have meant to Jander that if he continued the relationship, he would make her miserable, and that if he ended the relationship, he would make her miserable? Whatever he did, he would be breaking the First Law and, unable to act in any way without such a violation, he could only find refuge in not acting at all - and so went into mental freeze-out. - Do you remember the story you told me earlier today of the legendary mind-reading robot who was driven into stasis by that robotics pioneer?"
"By Susan Calvin, yes. I see! You model your scenario on that old legend. Very ingenious, Mr. Baley, but it won't work."
"Why not? When you said only you could bring about a mental freeze-out in Jander, you did not have the faintest idea that he was involved so deeply in so unexpected a situation. It runs exactly parallel to the Susan Calvin situation."
"Let's suppose that the story about Susan Calvin and the mind-reading robot is not merely a totally fictitious legend. Let's take it seriously. There would still be no parallel between that story and the Jander situation. In the case of Susan Calvin, we would be dealing with an incredibly primitive robot, one that today would not even achieve the status of a toy. It could deal only qualitatively with such matters: A creates misery; not-A creates misery; therefore mental freeze-out."
Baley said, "And Jander?"
"Any modern robot - any robot of the last century - would weigh such matters quantitatively. Which of the two situations, A or not-A, would create the most misery? The robot would come to a rapid decision and opt for minimum misery. The chance that he would judge the two mutually exclusive alternatives to produce precisely equal quantities of misery is small and, even if that should turn out to be the case, the modern robot is supplied with a randomization factor. If A and not-A are precisely equal misery-producers according to his judgment, he chooses one or the other in a completely unpredictable way and then follows that unquestioningly. He does not go into mental freeze-out."
"Are you saying it is impossible for Jander to go into mental freeze-out? You have been saying you could have produced it."
"In the case of the humaniform positronic brain, there is a way of sidetracking the randomization factor that depends entirely on the way in which that brain is constructed. Even if you know the basic theory, it is a very difficult and long-sustained process to so lead the robot down the garden path, so to speak, by a skillful succession of questions and orders as to finally induce the mental freeze-out. It is unthinkable that it be done by accident, and the mere existence of such an apparent contradiction as that produced by simultaneous love and shame could not do the trick without the most careful quantitative adjustment under the most unusual conditions. - Which leaves us, as I keep saying, with indeterministic chance as the only possible way in which it happened."