Multimodal HCI Integration

Paper #: 1999-01-5509
Published: 1999-10-19
Citation: Vassiliou, M., Sundareswaran, V., Chen, S., and Wang, K., "Multimodal HCI Integration," SAE Technical Paper 1999-01-5509, 1999, https://doi.org/10.4271/1999-01-5509.
Pages: 7
Abstract:
A multipurpose test-bed for integrating user interface and sensor technologies has been developed, based on a client-server architecture. Various interaction modalities (speech recognition, 3-D audio, pointing, wireless handheld-PC-based control and interaction, sensor interaction, etc.) are implemented as servers, encapsulating and exposing commercial and research software packages. The system allows for integrated user interaction with large and small displays using speech commands combined with pointing, spatialized audio, and other modalities. Simultaneous and independent speech recognition for two users is supported; users may be equipped with conventional acoustic or new body-coupled microphones.
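
The abstract describes the client-server arrangement only in outline. The Python sketch below illustrates one way such an architecture could be organized, with each modality running as a server that publishes events and an integration client that fuses a speech command with the most recent pointing event. All class and function names here are hypothetical and are not taken from the paper.

# Illustrative sketch only: ModalityServer, SpeechServer, PointingServer and
# integrate() are hypothetical names, not the paper's published API.
import queue
import threading
import time

class ModalityServer(threading.Thread):
    """Wraps one interaction modality and publishes events to the integrator."""
    def __init__(self, name, out_queue):
        super().__init__(daemon=True)
        self.name = name
        self.out_queue = out_queue

    def publish(self, payload):
        self.out_queue.put({"modality": self.name, "time": time.time(), **payload})

class SpeechServer(ModalityServer):
    def run(self):
        # Stand-in for a wrapped commercial speech recognizer; emits one command.
        self.publish({"user": 1, "command": "select"})

class PointingServer(ModalityServer):
    def run(self):
        # Stand-in for a pointing tracker; emits a screen coordinate.
        self.publish({"user": 1, "target": (640, 360)})

def integrate(events):
    """Fuse each user's speech command with that user's pointing event."""
    speech, pointing = {}, {}
    for e in events:
        (speech if e["modality"] == "speech" else pointing)[e["user"]] = e
    for user, cmd in speech.items():
        if user in pointing:
            print(f"user {user}: {cmd['command']} at {pointing[user]['target']}")

if __name__ == "__main__":
    bus = queue.Queue()
    servers = [SpeechServer("speech", bus), PointingServer("pointing", bus)]
    for s in servers:
        s.start()
    for s in servers:
        s.join()
    integrate([bus.get() for _ in range(bus.qsize())])

In this sketch the queue stands in for whatever transport the test-bed uses between modality servers and the integration client; the fusion step simply pairs a spoken command with a pointed-at location for the same user.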