Fr. 69.00

AI and Multimodal Services - AIMS 2025 - 14th International Conference, Held as Part of the Services Conference Federation, SCF 2025, Hong Kong, China, September 27-30, 2025, Proceedings

English, German · Paperback / Softback

Will be released 15.11.2025

Description


This book constitutes the refereed proceedings of the 14th International Conference on AI and Multimodal Services, AIMS 2025, held as part of the Services Conference Federation, SCF 2025, in Hong Kong, China, during September 27-30, 2025.
The 9 full papers included in this book were carefully reviewed and selected from 22 submissions. They focus on the development, publication, discovery, orchestration, invocation, testing, delivery, certification, and management of artificial intelligence (AI) and multimodal applications and services.

List of contents

Research Track
.- Theoretical Reconstruction of New Quality Productive Forces and the Shenzhen Paradigm of Intelligent Socialism.
.- Artificial Bee Colony Algorithm Based on Nonlinear Dual Search Strategy.
.- Hybrid BERT-BiLSTM Model for SQL Injection Detection: Potential Applications in Banking Information Systems.
.- AC-Net: An Adaptive Step-Size Low-Light Image Enhancement Method Based on Global Illumination Modeling.
.- Semi-Supervised Scene Text Detection based on Teacher-Student Scheme and Cascaded Hybrid Network.
.- Towards Fast-Slow Thinking in Conversational Emotion Recognition via Causal Prompting with Peak-End Rule.
.- 3D Path Planning for UAVs in Complex Environments Using an Improved Hybrid Genetic-PSO Algorithm.

Application and Industry Track
.- Research on Algorithms Based on Autoregressive Fusion Models.
.- A Microservice-Based Implementation of Chinese Conversational Digital Avatars Using NVIDIA ACE.
