Fr. 226.00

Explainable Agency in Artificial Intelligence - Research and Practice

English · Hardback

Shipping usually within 1 to 3 weeks (not available at short notice)

Description

The book is a collection of cutting-edge research on the topic of explainable agency in artificial intelligence (XAI), including counterfactuals, fairness, human evaluations, and iterative and active communication among agents.

List of contents

1. From Explainable to Justified Agency
2. A Survey of Global Explanations in Reinforcement Learning
3. Integrated Knowledge-Based Reasoning and Data-Driven Learning for Explainable Agency in Robotics
4. Explanation as Question Answering Based on User Guides
5. Interpretable Multi-Agent Reinforcement Learning with Decision-Tree Policies
6. Towards the Automatic Synthesis of Interpretable Chess Tactics
7. The Need for Empirical Evaluation of Explanation Quality

About the author

Dr. Silvia Tulli is an Assistant Professor at Sorbonne University. She was awarded a Marie Curie ITN research fellowship and completed her Ph.D. at Instituto Superior Técnico. Her research interests lie at the intersection of explainable AI, interactive machine learning, and reinforcement learning.
Dr. David W. Aha (Ph.D., UC Irvine, 1990) is the Director of the AI Center at the Naval Research Laboratory in Washington, DC. His research interests include goal reasoning agents, deliberative autonomy, case-based reasoning, explainable AI, machine learning (ML), reproducible studies, and related topics.

Customer reviews

No reviews have been written for this item yet.
