Fr. 238.00

Controllable Artificial Intelligence and the Future of Law

English · Hardback

Will be released 20.10.2025

Description


This book introduces the newly crafted concept of algorithmic dictatorship, which draws on the plethora of human biases that creep into algorithms and feed into automated decisions made at the expense of citizens' lives, freedoms, health, property, fair lending, and credit scoring. It sheds light on the many reasons why artificial intelligence should be both interpretable and controllable, as opposed to merely explainable. The reason is straightforward: the skewed data baked into biased algorithms (machine biases) spawns harrowing effects with which criminal justice has been grappling for a long time. Correspondingly, and perhaps unsurprisingly, law enforcement evinces biases along both gender and race lines. It is therefore no surprise that the computer-generated algorithms that propel predictive policing are often flagged as tools through which racial discrimination abounds. Nor should it be surprising that this sort of opaque algorithmic governance is the byproduct of an algorithmic dictatorship that threatens to crumble the foundations of the Rule of Law on which modern societies stand. This is one of the key takeaways of this book. Disturbingly, brain–computer interfaces are poised to be converted into tools for collecting and gauging thoughts, emotions, sentiments, and crime-related information that would otherwise be inaccessible to the prying eyes of governments, rogue nations, and unscrupulous actors. Much to our dismay, an eerily dystopian world is unfolding before our very eyes. This is the gist of transhumanism, a byproduct of convolutional neural networks built around deep learning and genetic algorithms, which will overhaul the current legal landscape beyond recognition. This book charts the path ahead, drawing firm boundaries to prevent jurisdictions from careening into the chaos of a genetic plutocracy.

List of contents

Part I: Has the pendulum swung too far? Towards an Algorithmic Dictatorship?
Chapter 1: Why you should be worried about a looming Algorithmic Dictatorship
Part II: The History of Artificial Intelligence
Chapter 2: Why do we need a Controllable Artificial Intelligence: a brief historical account
Part III: Critical Artificial Intelligence Studies (CAIS)
Chapter 3: Philosophy and Technology
Part IV: Algorithmic Dictatorship and the chilling effect on democracy
Chapter 4: The devilishly harmful virus of Algorithmic Dictatorship: is the Rule of Law backsliding beyond repair?
Part V: Algorithmic Common Good to oust Algorithmic Dictatorship
Chapter 5: The rich tapestry of remedies that comprise the Algorithmic Common Good

About the author

Hugo Luz dos Santos holds a PhD in Law (2019-2021). He is a University Professor at City University of Macau and a Fellow of the Forum for International Conciliation and Arbitration (FICA, Oxford, United Kingdom). He has published 32 books and authored over 120 papers. He has been awarded a Fellowship of the Royal Society of Arts (London, United Kingdom) in recognition of his outstanding contributions to the fields of justice, the rule of law, and policy worldwide.

