Fr. 59.50

Speech Technology - A Theoretical and Practical Introduction

English · Paperback / Softback

Will be released on 31.03.2026

Description

In recent years, speech recognition devices have become central to our everyday lives. Systems such as Siri, Alexa, speech-to-text, and automated telephone services are built by people applying expertise in sound structure and natural language processing to create computer programmes that can recognise and understand speech. This exciting advancement has led to rapid growth in the number of speech technology courses added to linguistics programmes; however, there has so far been a lack of material serving the needs of students with limited or no background in computer science or mathematics. This textbook addresses that need by providing an accessible introduction to the fundamentals of computer speech synthesis and automatic speech recognition, covering both neural and non-neural approaches. It explains the basic concepts in non-technical language, provides step-by-step explanations of each formula, and includes practical activities and ready-made code for students to use, which is also available on an accompanying website.

List of contents

1. Overview
2. Speech
3. Finite-state language modeling
4. Statistical language models
5. Non-neural synthesis
6. Non-neural recognition
7. Neural nets
8. Neural synthesis
9. Neural recognition
10. Other technologies

About the author

Michael Hammond is a Full Professor in the Department of Linguistics at The University of Arizona. His work focuses on phonology, psycholinguistics, and computational linguistics. His notable publications include The Phonology of English (OUP, 1999) and Python for Linguists (CUP, 2020).
