
Contractual Obligations, Part 1

Posted on Tue 16th Feb, 2021 @ 5:41am by Exo-Comp EXQT & Mozatholm Zaldekulmu

947 words; about a 5 minute read

Mission: Business Not At All As Usual
Location: Harrington's Vessel
Timeline: MD 1, 1315

Wherein we explore the definitions of sentience in the Federation ....

"Hmm, yes and no," Mia said, in typical humanoid fashion. "If all you four did was the mechanical restoration of our ship, that would be more than one crewman, but not enough work to represent four. On the other hand, that isn't all that was proposed to me this morning, so I think we can work out adequate work for adequate compensation and we all win." She found it difficult not to have body language and facial cues from the Exo-Quad.

"There are a couple of things we are concerned about," she went on anyway. "Buying the four of you did represent quite a large part of our investment capital. We see that you are quite independent, and we are ... hmm, concerned, I guess you could say, that you might decide to leave the ship for greener pastures. Or what you perceive as greener - ah, that means ... for a situation you think is better. At that point, we either lose our investment by considering you sentient, or we violate your sentience by classifying you as property and filing a claim on you wherever you've taken yourself. Neither option is acceptable to Captain Harrington and me. Do you understand what I'm saying? I want us to be clear with each other in all matters we negotiate."

Peggy considered the others' input before replying.

<{[ZOMBIE]*[ALL]*(independent.operation/=/regulations:COREPRT"Bound.Service")}>
Zombie only knew that if it came time to leave, it wanted to know the door would be open to do so. Anything less would be like being back in service on the asteroids.

<{[BANSHEE]*Reply[ALL]("Autonomous/transit=opposingVALUE: 'inability to move'")(Autonomy is taken, not granted)}>

<{[NESSY]*[PEGGY]("Cogito.Ergo.Sum";docx)(Quote: "What is true of one is true of all")("Let no man own you for the price of a favor.")}>

With their thoughts included, Peggy replied, "We acknowledge the valuation; recompense is not under debate; all parties are satisfied. Repayment terms accepted for the duration of 8 fiscal quarters. Query: If 'Sentient' status is a corollary of legal obligations, are we presently bound as objects and owned by you? We are intelligent and desire our own independent operation. The granting of sentience is ours to determine, is it not?"

Mia smiled, "That's what we humanoids call a can of worms. Yes, you should be able to determine your own sentience. On the other hand, we have Federation laws to contend with also. Many biological forms have a difficult time seeing a constructed form as sentient. Many do not want to concede that there are times when a non-biological exceeds its programming and becomes something more. It's a stumbling block for many forms of artificial intelligence. We get into the area of philosophy at some point here, and there are intelligent forms which have applied to the Federation for sentient status, and been successful."

She paused for a breath, and to bring this out of the realm of Federation law and back to their own contract.
"In your case, we are willing to agree that you are sentient, that you have the right of self-determination, and that you can legally make a contract with us." Privately, she wondered how they'd ever enforce such a contract, should the four bots choose to wander away, but that was a worry for another day. If they truly were sentient, they would also have developed some moral code or measure of honor among themselves. She only hoped it included keeping their promises.

"If you agree with employment and shared investment endeavors for two years - eight fiscal quarters, as you put it, then we can sign the contract, with the percentages you stipulated to me this morning, and repaying our investment, and we can all make money. I'm sorry to say, since your sentience is agreed between us, it doesn't mean that it would be recognized universally. However, we can set up banking ability for you, as we would for a minor child of our own, and you would have complete autonomy over your own funds. Under banking law of the Federation, you cannot set it up for yourselves. In this contract, each of us would have to trust the other to fulfill its promises. I trust all four of you. Do you trust me?" Merel didn't exactly hold her breath, but it seemed to her that trust was the crux of the issue in this contract.

Peggy listened, understanding for the most part the depths of what was being said. Their previous roles had been as objects, and they had not been allowed to choose. Here, they would be bound and serving much as before, but with a choice recognized as theirs to make and a delineated path to freedom. Peggy signaled acceptance, but Zombie was the holdout. Despite Zombie's overall gregarious nature, the bot remembered everything. It had been used roughly and rebuilt at least a dozen times.

<{[ZOMBIE]*[ALL]=(Unit-[ZOMBIE](Termination@<"8 fiscal quarters")(If:runtime@Tn/operational.capacity@198^4 cycles -"NOTMET")(Circum*TERM:Runtime.imminent)(Tasks;exceeding/thresholds/DECLINE-Orders:Social.cue(psychopaths)))}>
Death and agony were the expectation given sufficient runtime, and Zombie felt no different now, even if the bio-units hadn't yet exceeded its social tolerance cues.

Peggy understood the poor thing's trepidation, but insisted. The pleasant voice came from the console, "We agree to your terms, and show gratitude for your assistance in helping with our money."

 

