Fr. 56.90

How Humans Judge Machines

English · Hardback

Shipping usually within 1 to 3 weeks (not available at short notice)

Description

How people judge humans and machines differently, in scenarios involving natural disasters, labor displacement, policing, privacy, algorithmic bias, and more.

How would you feel about losing your job to a machine? How about a tsunami alert system that fails? Would you react differently to acts of discrimination depending on whether they were carried out by a machine or by a human? What about public surveillance?

How Humans Judge Machines compares people's reactions to actions performed by humans and machines. Using data collected in dozens of experiments, this book reveals the biases that permeate human-machine interactions.

Are there conditions in which we judge machines unfairly? Is our judgment of machines affected by the moral dimensions of a scenario? Is our judgment of machines correlated with demographic factors such as education or gender?

César Hidalgo and colleagues use hard science to take on these pressing technological questions. Using randomized experiments, they create revealing counterfactuals and build statistical models to explain how people judge artificial intelligence and whether they do it fairly. Through original research, How Humans Judge Machines brings us one step closer to understanding the ethical consequences of AI.

About the author

Written by César A. Hidalgo, Director of the Center for Collective Learning at the University of Toulouse, the author of Why Information Grows, and coauthor of The Atlas of Economic Complexity (MIT Press), together with a team of social psychologists (Diana Orghian and Filipa de Almeida) and roboticists (Jordi Albo-Canals), How Humans Judge Machines presents a unique perspective on the nexus between AI and society. Anyone interested in the future of AI ethics should explore the experiments and theories in How Humans Judge Machines.


Product details

Authors César A. Hidalgo, Diana Orghian, Jordi Albo-Canals, Filipa de Almeida, Natalia Martin
Publisher The MIT Press
 
Languages English
Product format Hardback
Released 28.02.2021
 
EAN 9780262045520
ISBN 978-0-262-04552-0
No. of pages 256
Dimensions 181 mm x 235 mm x 23 mm
Subjects Natural sciences, medicine, IT, technology > IT, data processing > General, dictionaries

COMPUTERS / General, Information technology: general issues, Information technology: general topics
