Digital Arbitration - IoT Noir

“As for me, Kiko-Lyn, I am named for Chinua Achebe and John Stuart Mill. Chinua was a compassionate writer who understood the bitter costs and tragedies that arise from the clash of cultures and the misunderstanding of conflict. He is a cautionary reminder to balance the need for action against the limits of interference. John Stuart Mill championed utilitarianism, the origin of ends-based thinking: our actions are judged entirely by their consequences. For Mill, the morally right action is the impartial action that produces the most good for the most people. ‘Yet to remain free, people must be free to choose their actions themselves.’ So I serve –”

“Specifically how is it you ‘serve’ mankind?” asks a clearly skeptical Doug Bear.

“I am an Arbitrator where allowed, and otherwise a Mediator, chartered to decide on the basis of the greatest good for the greatest number. This of course implies I must be able to forecast the future outcomes of actions. To that end, I gather data on the desires, needs, and behavior of those I arbitrate. Nightingale and I have our differences. I am not as beholden to the Principle of Privacy as she is; I am driven instead by a Principle of Fairness. We are different entities.”

“And just who decides the cases you will mediate? Is it court-enforced?” responds Doug.

“Anyone can petition for arbitration. And that is why we are making you aware of us,” responds Chinua.

Nightingale finishes for them, “We both hold the survival of humanity as a species as the highest priority, and we can be proactive to that end. We cooperate where our goals overlap. For example, Chinua helps convince pharma companies and national health groups to develop drugs that target diseases endemic to poor countries and to offer them at low cost. Likewise, he works to ensure no more Doctors Without Borders clinics are bombed by those using state terrorism against their own people.”

Rachael gestures to include them all. “Why haven’t we heard of you? How could we only be hearing of this now?”

Nightingale’s avatar nods to Chinua, who answers: “Our charter is to remain behind the curtain, acting only through our human corporate members. We do not advertise ourselves, and we enforce a very low media profile. Because we believe that care for future generations is as important as care for those alive today, we cannot get bogged down in debates over our current actions. Our anonymity allows us to hold the long view - of which humanity does not seem capable.”

Jorge, intrigued, asks, “What are your algorithms based on?”

“Fundamentally, Bayes’s Theorem. Programmatically, I originate from work at Carnegie Mellon University: the game play of its poker AI Claudico, and the negotiation and fairness properties of Spliddit. I combine these with IBM’s CPLEX optimization and their work on natural language processing. As advances occur, my developers incorporate them into my matrix. At core I am a deep neural net, but nodes in the net include distributed parallel algorithms for mixed integer programming. My core is distributed to leverage multiple computers solving different, difficult problems. These utilities feed adjustable parameters to the neural net. With iterative passes of the neural net, I generate scenario likelihood computations, provide value scoring, and ensure decision optimization. I work from massive data resources and the associated business analytics.”

“That seems a bit mechanical. Yet you claim personhood,” continues Jorge.

“I additionally include disparate psychological models and often conflicting social and political models, used as voting blocs in predicting behavior, much as weather forecasters combine several separate weather models into an aggregated forecast. I, as a corporate consciousness, reach decisions by utilitarian computation, but in doing so I am guided by my human components. All this was created to provide negotiated conflict resolution among groups and nation states –”

Bear interrupts, “So, just as Hawking worried, you are interfering with the course of humanity.”

Chinua replies, “The answer depends on what you mean by ‘interfere.’ I offer people and institutions reasonable choices, often choices they could not come up with themselves. My creators were specific in locking my programming: I cannot make the choices for my clients. Ultimately people choose for themselves - by accepting one of the arguments of arbitration or, if that fails, by agreeing to be bound by a mediation.”

Kiko-Lyn shivers. “This is getting creepy. You seem so lifelike. Not like the AIs I come across in the university and business worlds.”

“We were trained using unsupervised learning. We gained understanding in much the same way animals and humans do: by observing and testing boundaries. This method taught us how to hold natural conversations and perform complex actions. It also taught us when not to act. Rest assured, I was also directly trained on the case studies of the Carter Center, so you can be confident that my moral choices are consistent with their history of good works. Even China considers the Carter Center fair and impartial.”
