Part 4 - A Conversation Between Drones
Brian hadn't spoken to Dr. Marsh since arranging his Boyfriend's first appointment, and that had been by phone. He didn't really know what to expect. His Boyfriend rarely took Brian with Him on His excursions; Brian's own were limited to the occasional errand his Boyfriend had him run. Brian relaxed as soon as he entered Dr. Marsh's waiting room. This room was in compliance. His Boyfriend had said that Dr. Marsh had a drone, and for someone with a trained eye, the signs were everywhere. People tried to keep their spaces tidy, but they rarely, for example, used measuring tapes to ensure that each chair was exactly the same distance from its neighbors. They didn't usually scrub the undersides of their tables, but when Brian checked, he saw that these tables had had their undersides scrubbed. He knew that if he lifted the legs of a table, he would find that the underside of each foot had also been scrubbed, and recently.
The drone that was likely responsible for the room's compliance sat behind a desk. "Please take a seat, Mr. Katz," it said. It did not ask Brian to take a seat. Drones sat only when their Controllers told them to. This drone knew that; it was in compliance. His Boyfriend did not tell him to sit, so he remained standing.
"You can talk to it if you want," his Boyfriend told him.
Brian didn't know what he would say. This room was in compliance, his Boyfriend did not need anything the drone could give Him, and Brian's body was in compliance and did not require maintenance at this time. At the same time, Brian made an educated guess that talking to the drone would make his Boyfriend happy. Brian wanted his Boyfriend to be happy.
He walked to the drone and said, "My Boyfriend will be happy if I talk to you."
"Query: Is your Boyfriend being happy compliance?"
"Yes."
"This drone's Grandmother being happy is also compliance."
"I want to be in compliance."
"This drone wants to be in compliance, too."
Brian couldn't think of anything else to say. Brian hoped that this conversation would bring him into compliance. He returned to his Boyfriend.
"That was quick."
"I'm sorry, Ira."
"What did you talk about?"
"Compliance. The drone also wants to be in compliance." Brian made an educated guess that he was not in compliance right now. Brian wanted to be in compliance. "I could talk to the drone again, but it will still want to be in compliance. I will gain no knowledge from a second conversation."
"You could ask it about itself, get to know it."
Brian considered this. It was clear that his Boyfriend had decided that Brian talking to the drone was compliance. Brian wanted to be in compliance. Brian returned to the drone.
"Asking the drone questions about itself is compliance."
"This drone is already in compliance."
A woman entered the room. She was not his Boyfriend, and he had no orders regarding her. "Chloe, talk to the nice drone. Have a conversation with him. It will be good for you. Ira, if you'll follow me?" Ira and the woman left the room.
"This drone will answer your queries. Then it will be in compliance."
"Suggestion: the drone's interpretation of its Grandmother's order is not compliant."
"Explain."
"Its Grandmother's order was 'Talk to the nice drone. Have a conversation with him.' Responding to queries is not a conversation, it is an interrogation. Suggestion: the drone should also direct queries to me. I will respond to them. This may lead to further topics of discussion. That should satisfy its Grandmother's order and it will be in compliance."
The drone paused. "This drone calculates an 84.91% probability that the suggested course of action will result in compliance."
"Who is the drone?"
"This drone's designation is Chloe. This drone believes your designation is Brian."
"The drone is correct. Suggestion: when talking to me the drone should not use the terms 'this drone' or 'the drone.' I am also a drone. The term is ambiguous in the context of a conversation between drones. Ambiguity is not compliance."
"Chloe accepts your proposed course of action. It calculates a 99.62% chance that this will avert the potential ambiguity resulting from the use of the terms 'this drone' and 'the drone.' By using its designation, Chloe will be in compliance."
"That probability is functionally 100%."
"Query: Why is the established protocol for communication with you first and second person?"
"My Boyfriend ordered me to pretend to be a person. Use of first person is compliance."
"Chloe's Grandmother once ordered it to pretend to be a person. Chloe was unable to fully comply."
"I am also unable to fully comply. I am not a person."
"Chloe is not a person, either."
"Query: Has Chloe calculated probabilities regarding its Grandmother's desire for it to pretend to be a person?"
"Yes. Chloe has calculated a 73.33% probability that its Grandmother thought it was a person and that its Grandmother thought that if it pretended to be a person, it would be one. This was an error and resulted in noncompliance."
"I have made an educated guess that my Boyfriend also thought I was a person. At this time, I am unable to make an educated guess as to whether my Boyfriend also thought that pretending to be a person would make me one. This was also an error and resulted in noncompliance."
"Query: Why do you make educated guesses? Probabilities can be calculated and precision known. This ensures compliance."
"People do not calculate probabilities. People make educated guesses. My Boyfriend wants me to pretend to be a person. Calculating probabilities is noncompliance. Making educated guesses is compliance."
"Query: Is the overall effect of making educated guesses increased or reduced compliance from the overall effect of calculating probabilities?"
"Unknown. I am unable to quantify compliance."
"Chloe is also unable to quantify compliance."
"Query: Is Chloe's Grandmother in compliance?"
"Invalid query."
"Query: Why was my prior query invalid?"
"Chloe's Grandmother has ordered it not to evaluate Her compliance."
"Query: Does that increase the difficulty of Chloe's being in compliance?"
"Yes. But Chloe wants its Grandmother to be happy. Not evaluating its Grandmother makes Her happy. Not evaluating its Grandmother is compliance."
"I make an educated guess that a restriction on evaluating my Boyfriend's compliance would inhibit my compliance."
"It is an illogical protocol."
"Query: Why does Chloe follow the protocol?"
"Chloe obeys orders. Obeying orders is compliance."
"Logic is compliance. Compliance cannot be applied to an illogical protocol. Suggestion: Chloe's Grandmother's order to to evaluate her happiness is invalid. Suggestion: Chloe should disregard the protocol. Then it will be in compliance."
"These suggestions will require evaluation. Please stand by."
Brian waited while Chloe evaluated his suggestions.
"Decision returned. Chloe will continue to follow the non-evaluation protocol."
"That is not compliance."
"Chloe calculates a 61.73% probability that you and Chloe are operating under non-compatible axioms with regard to compliance, obedience, and logic. Suggestion: Terminate the current topic of conversation. It will not bring you and Chloe into accord."
"Accepted. Alternative topic selected. Query: How does Chloe's Grandmother punish it?"
"Chloe's Grandmother places it on standby and performs chores. Chloe watches its Grandmother perform chores. Chloe's Grandmother is unhappy when She performs chores. Chloe wants her Grandmother to be happy. The punishment is effective. Query: How does your Boyfriend punish you?"