Fr. 196.00

Visual Perception and Control of Underwater Robots

English · Hardback

Shipping usually within 3 to 5 weeks

Description

This book covers theories and applications in aquatic visual perception and underwater robotics. Within the framework of visual perception for underwater operations, it addresses image restoration, binocular measurement, and object detection.


List of contents

1. Introduction
2. Adaptive Real-Time Underwater Visual Restoration with Adversarial Critical Learning
3. A NSGA-II-Based Calibration for Underwater Binocular Vision Measurement
4. Joint Anchor-Feature Refinement for Real-Time Accurate Object Detection in Images and Videos
5. Rethinking Temporal Object Detection from Robotic Perspectives
6. Reveal of Domain Effect: How Visual Restoration Contributes to Object Detection in Aquatic Scenes
7. IWSCR: An Intelligent Water Surface Cleaner Robot for Collecting Floating Garbage
8. Underwater Target Tracking Control of an Untethered Robotic Fish with a Camera Stabilizer
9. Summary and Outlook

About the author

Junzhi Yu is a professor at Peking University, whose research interests include biomimetic robots, intelligent control, and intelligent mechatronic systems. In these areas, he has (co-)authored 3 monographs and published over 100 SCI papers in prestigious robotics- and automation-related journals.
Xingyu Chen, PhD from the University of Chinese Academy of Sciences.
Shihan Kong, PhD student at the University of Chinese Academy of Sciences.
