Fr. 199.00

Metamodeling for Extended Reality

English · Hardback

Shipping usually within 6 to 7 weeks

Description


This open access book, based on the author's 2024 dissertation, explores the challenges of metamodeling in the context of extended reality and emphasizes the need for new metamodeling concepts that can be effectively combined with extended reality technologies. The central question of this work is how metamodeling can be used "in" and "for" extended reality.
The book is structured in nine chapters: Chapter 1 introduces the topic by providing background information and outlining the research objectives, questions, methodology, and structure. Chapter 2 reviews the existing literature and developments in the field, covering various aspects of modeling, such as conceptual, enterprise, and metamodeling, as well as extended reality (XR), virtual reality (VR), augmented reality (AR), and the metaverse. Chapter 3 presents the generic requirements for metamodeling for augmented and virtual reality by systematically deriving use cases for combining AR and metamodeling. Chapter 4 then identifies specific requirements for integrating metamodeling with XR, such as coordinate mappings, visualization of model components, detection and tracking, context, and interaction. Chapter 5 introduces a new domain-specific visual modeling language for creating augmented reality scenarios, particularly within the context of metamodeling, before Chapter 6 outlines the conceptualization and design of a 3D-enhanced metamodeling platform for extended reality, detailing its structure, its components, and the interconnection of its elements. Chapter 7 presents the initial implementation of the platform's components, and Chapter 8 evaluates the newly introduced platform both theoretically and empirically. Finally, Chapter 9 concludes the book by aligning the results with the initial research questions, discussing limitations, and providing a summary and an outlook on further research.
This book is mainly aimed at researchers in conceptual modeling, especially those working on projects related to XR, VR, or AR, as the presented domain-specific modeling method for creating workflows for XR/VR/AR applications does not assume specific prior programming knowledge.

List of contents

1. Introduction.- 2. State-of-the-Art and Related Work.- 3. Derivation of Generic Requirements for Metamodeling for Extended Reality.- 4. Specific Requirements for Metamodeling for Extended Reality.- 5. ARWFMM: A Modeling Method as an Example for Knowledge-Based Virtual and Augmented Reality.- 6. M2AR: An Architecture for a 3D Enhanced Metamodeling Platform for Extended Reality.- 7. Prototypical Realization of the M2AR Metamodeling Platform.- 8. Evaluation of the M2AR Platform Prototype.- 9. Summary and Outlook.

About the author

Fabian Muff is a senior research assistant in the Digitalization and Information Systems Research Group at the Université Fribourg, Switzerland. His main research interests are in metamodeling, conceptual modeling and augmented reality (AR).

Product details

Authors Fabian Muff
Publisher Springer, Berlin
Languages English
Product format Hardback
Released 28.03.2025
EAN 9783031767616
ISBN 978-3-031-76761-6
No. of pages 258
Dimensions 155 mm x 18 mm x 235 mm
Weight 520 g
Illustrations XII, 258 p. 152 illus., 94 illus. in color.
Subjects Natural sciences, medicine, IT, technology > IT, data processing > IT

Virtual Reality, Software Engineering, Augmented Reality, Augmented Reality (AR), Open Access, software development, Virtual and Augmented Reality, Conceptual modeling, formal methods, Extended Reality, metamodeling, domain-specific modeling
