Acknowledgements to Ezracarmichael, on this site, who wrote Compliance. This story is set in his world, with his permission.
Thanks to my editor thegoofyproofyreader who made many improvements; any remaining mistakes, of course, are mine.
Tags: science fiction, mind control, lesbian
This is a multi-part story, about 22,000 words long. All parts are written and ready to submit. The story contains little sex.
This is my first story so feedback is welcome.
+++ Part 1. A Change of Ownership
This drone was cooking a beef stew when it heard a call from upstairs.
"Sherry, can you come up here when it's convenient?"
The call did not sound urgent, so it checked protocols. It continued chopping vegetables because that was a routine task. Judgement protocols were engaged; the emergency protocol was not applicable.
Its current task was to make dinner, and once the vegetables were prepared and in the cooker, the next stage of this task was to recheck the recipe and then wait. It was confident that a recheck of the recipe wouldn't find an error. It knew how to make a beef stew and had made one many times before. Even if the stew was slightly wrong, it could correct the contents with no effect on task compliance, provided it checked within the next three hours. Its next task was to set up the kitchen to serve dinner; that task would take ten minutes and was due in six hours. Kitchen cleaning was a task but could be left for a short while. It had a maintenance task for later today: cleaning itself, eating and drinking, revision of protocols and then sleep.
It finished preparing the vegetables, which took thirty seconds, put them in the cooker pan, then checked its appearance, which was compliant given the current task. The drone was not in compliance because the kitchen cleaning was not done, but the new order required it to delay cleaning. It started to move to the stairs leading up to the front of the house. It had already decided not to acknowledge the order.
As it walked, its usual background thoughts went through its head. This was normal and was neither compliant nor non-compliant. It thought that it walked like a human would, but it was a drone. This was a normal thought and very familiar. It thought of the non-compliance in the kitchen, and this made it unhappy. It thought that drones were not supposed to be happy or unhappy, but as a shortcut, it made sense to use happy or unhappy when no other orders existed. Obeying orders made it happy. It was a drone; that was a truth. Many truths existed and it could never really forget them, but it did not need to check and revise them now. This made it happy because revising truths sometimes caused pain. The drone was supposed to have some pain when it revised truths. It was also compliant not to have pain now. This made it happy.
It was called 'Sherry', which was a name given by its Owner, but its designation was drone BG6145. It was not supposed to remember any other names. 'Sherry' was close to a forbidden name, a thought that always came with a shudder and remembered pain. It was in full health, fed and hydrated. It had had adequate sleep last night. This was getting close to formally checking status and protocols, but that was a familiar thought. It remembered pain and pleasure. It was arriving.
The drone saw three people in the lounge: Owner Tom and two others, a woman and a man. Since guests were in the house, it adopted the 'present' position and waited. It quickly double-checked for the correct position. It remembered pain. Associations with Tom – forget that thought. Both of the others were people. The woman was wearing office clothes. Guests were unusual. It was not happy or sad that things were unusual. It felt glad now because it was compliant. It checked its tasks and felt slightly unhappy. A task was not complete.
The man was wearing a sweatshirt saying 'Nokamura Industries' with an NI logo. He carried a tablet. It knew the logo. It felt unhappy and it remembered pain; and it remembered pleasure. It was not currently owned by Nokamura Industries. It was compliant.
It continued to think while waiting. Judgement protocols were on, but this situation did not seem to be one where any action was needed. Its stomach felt tight and tense. This feeling was neither compliant nor non-compliant. It wanted to urinate.
Tom said, "Cheryl, a great day for you, you are leaving us today. These people are Jim Hedges and your lawyer, Wendy Sayer. Jim and Wendy, this is what our daughter is now."
The drone felt pain because it was not supposed to remember that name, 'Cheryl'. It was non-compliant to respond to or remember that name. It would have to punish itself, and remember its designation and its truths whilst it did so.
The man looked at his tablet and spoke, "Attention Drone. Change of ownership protocol," then started the protocol.
Tom spoke, "I agree to the transfer, Sherry. We have been told why it is needed now."
It felt sick. It was a drone and it started following the protocol. The change of ownership protocol was a number of complicated calculations, and... It made a mistake and had to backtrack. It tried to feel neither happy nor unhappy.
It said the response and waited for completion. It did not have to think now. The woman had said something but the drone had been calculating. Compliant drones had pain, but were not severely punished. It remembered pain. It felt sick. It should not vomit. Drones are impassive. Fear is a reaction.
"I apologise, person, please repeat your words."
The woman repeated, "Cheryl or BG6145, you are to report to court tomorrow at eleven o'clock. We will take you tomorrow, and you will spend tonight at the Nokamura building in Basildon. We're leaving straight away. Don't bring any of your belongings with you."
It replied, "I apologise again, person, but you are not my Owner." The ownership protocol had not completed, so it was still owned by Tom. It did not have to do anything else; an Owner was present. Owner would clarify or not. It was scared. Drones should not be scared. Drones could be unhappy. It was unhappy. Tom was its current Owner, and it also took orders from Tom's wife, Anne. They called it Sherry and had known it before it was a drone. They often smiled at it and talked to it, and always pretended it was not a drone.
The man spoke the correct phrase, then, "I am Jim Hedges. I am your new Owner and will appoint others as I wish. Acknowledge."
It was unhappy as it spoke: "Drone BG6145 is now owned by Jim Hedges, and any appointed person. This drone accepts you as Owner. This drone is healthy and compliant. It awaits your tasks. It has been fed and watered and has had adequate sleep. Judgement protocols are off. Impassivity protocols are engaged. If you do not have a drone manual, one can be obtained from Nokamura Corporation. If you wish this drone to be reprogrammed please contact Nokamura Corporation. Attention, new Owner. This drone is required to warn you of the legal conditions and responsibilities of using a drone. If you do not wish this drone to state these, please say Override."
The Owner ordered, "Override. Judgement protocols on – seek your Owner's goals. Impassivity Protocols off – you can show reactions to emotions."
"Override accepted. This drone assumes the person's instructions about court and accommodation tonight are applicable. A drone doesn't have belongings."
"Yes, the orders are correct. We're leaving now, so come with us to the car." New Owner instructed.
"This drone complies, Owner."
Tom moved forward, held it and kissed its cheek. "Goodbye Cheryl, remember we love you and hope to see you soon. We would like to be in court too, but lawyer Wendy has advised us not to as it will be a circus. Anne would say goodbye too if she were here today."
Tom and Anne were not its Owners. It did not respond.
+++ Overnight Storage
Owner had told it to get into a car seat rather than a van. The woman, who was named Wendy, sat in the front seat, and it sat in the back with Owner. During the journey no one talked, and the radio played pop tunes from the twenties. The drone had no tasks, so it listened to the radio after spending ten minutes repeating its truths and mantras. It recalled pain to help it revise. It recognised some tunes, although they were not relevant to any task. The drone driver said nothing. The driving was competent, although this drone would have left bigger gaps between itself and the other vehicles.
After thirty minutes the drone repeated its truths again:
"These truths cannot be changed and are applied in order.
This one is a drone.
A drone does not harm a person.
A drone obeys owner's commands.
A drone finds and seeks to fulfil owner's goals.
A drone obeys orders from Nokamura Corporation and lawful orders from law officers.
A drone seeks to minimise harm to owner's property, including itself.
A drone seeks to keep itself in good condition."