Fr. 69.00

The Paradigm Shift to Multimodality in Contemporary Computer Interfaces

English · Paperback / Softback

Shipping usually within 1 to 2 weeks (title will be printed to order)

Description


During the last decade, cell phones with multimodal interfaces based on combined new media have become the dominant computer interface worldwide. Multimodal interfaces support mobility and expand the expressive power of human input to computers. They have shifted the fulcrum of human-computer interaction much closer to the human. This book explains the foundation of human-centered multimodal interaction and interface design, based on the cognitive and neurosciences, as well as the major benefits of multimodal interfaces for human cognition and performance. It describes the data-intensive methodologies used to envision, prototype, and evaluate new multimodal interfaces. From a system development viewpoint, this book outlines major approaches for multimodal signal processing, fusion, architectures, and techniques for robustly interpreting users' meaning. Multimodal interfaces have been commercialized extensively for field and mobile applications during the last decade. Research is also growing rapidly in areas like multimodal data analytics, affect recognition, accessible interfaces, embedded and robotic interfaces, machine learning and new hybrid processing approaches, and similar topics. The expansion of multimodal interfaces is part of the long-term evolution of more expressively powerful input to computers, a trend that will substantially improve support for human cognition and performance.

List of contents

Preface: Intended Audience and Teaching with this Book.- Acknowledgments.- Introduction.- Definition and Types of Multimodal Interface.- History of Paradigm Shift from Graphical to Multimodal Interfaces.- Aims and Advantages of Multimodal Interfaces.- Evolutionary, Neuroscience, and Cognitive Foundations of Multimodal Interfaces.- Theoretical Foundations of Multimodal Interfaces.- Human-Centered Design of Multimodal Interfaces.- Multimodal Signal Processing, Fusion, and Architectures.- Multimodal Language, Semantic Processing, and Multimodal Integration.- Commercialization of Multimodal Interfaces.- Emerging Multimodal Research Areas and Applications.- Beyond Multimodality: Designing More Expressively Powerful Interfaces.- Conclusions and Future Directions.- Bibliography.- Author Biographies.

About the author

Dr. Sharon Oviatt is well known for her extensive work in multimodal and mobile interfaces, human-centered interface design, educational interfaces, and communications interfaces. She was the recipient of the Inaugural ACM ICMI Sustained Accomplishment Award and a National Science Foundation Special Creativity Award for her pioneering research on mobile multimodal interfaces. She is also a member of the CHI Academy. Dr. Oviatt has published over 150 scientific articles in a multidisciplinary range of venues, including the computer, learning and cognitive, and linguistic sciences. She serves as an Associate Editor of the main journals and edited book collections in the field of human interfaces, and authored the chapter on "Multimodal Interfaces" in The Human-Computer Interaction Handbook, 3rd Edition. She was a founder of the International Conference on Multimodal Interfaces (ICMI), which became an annual ACM-sponsored international conference series under her guidance.

Dr. Philip R. Cohen is internationally known for his work in multimodal systems and artificial intelligence (intelligent agents and natural language processing). He is a Fellow of the Association for the Advancement of Artificial Intelligence and past President of the Association for Computational Linguistics. He has over 120 refereed publications in a wide range of venues, and is the recipient (with Prof. Hector Levesque) of an inaugural Influential Paper award from the International Foundation for Autonomous Agents and Multi-Agent Systems. Cohen is currently Vice President, Advanced Technologies at VoiceBox Technologies, Inc. His prior positions include Founder and Executive Vice President of Research at Adapx Inc., professor and co-director of the Center for Human-Computer Communication in the Department of Computer Science and Engineering at the Oregon Health and Science University, and Senior Computer Scientist and Director of the Natural Language Program in the Artificial Intelligence Center of SRI International.


Product details

Authors Sharon Oviatt, Philip R. Cohen
Publisher Springer, Berlin
 
Original title The Paradigm Shift to Multimodality in Contemporary Computer Interfaces
Languages English
Product format Paperback / Softback
Released 01.01.2015
 
EAN 9783031010859
ISBN 978-3-031-01085-9
No. of pages 221
Dimensions 191 mm x 13 mm x 235 mm
Illustrations XXII, 221 p.
Series Synthesis Lectures on Human-Centered Informatics
Subject Natural sciences, medicine, IT, technology > IT, data processing > Operating systems, user interfaces
