Move the couch where?: Developing an augmented reality multimodal interface
- Authors
- Irawati, S.; Green, S.; Billinghurst, M.; Duenser, A.; Ko, H.
- Issue Date
- 2006-10
- Publisher
- Institute of Electrical and Electronics Engineers Inc.
- Citation
- ISMAR 2006: 5th IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 183-186
- Abstract
- This paper describes an augmented reality (AR) multimodal interface that uses speech and paddle gestures for interaction. The application allows users to intuitively arrange virtual furniture in a virtual room using a combination of speech and gestures from a real paddle. Unlike other multimodal AR applications, the multimodal fusion is based on the combination of time-based and semantic techniques to disambiguate a user's speech and gesture input. We describe our AR multimodal interface architecture and discuss how the multimodal inputs are semantically integrated into a single interpretation by considering the input time stamps, the object properties, and the user context. © 2006 IEEE.
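- The fusion approach summarized in the abstract (pairing inputs by time stamp, then checking semantic compatibility) can be pictured with a small sketch. The Python below is not taken from the paper; it is a minimal illustration under assumed names, where SpeechInput, PaddleGesture, the 1.5 s fusion window, and the scene dictionary are all invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical input events: a recognized speech command and a paddle gesture,
# each carrying the time at which it was recognized.
@dataclass
class SpeechInput:
    command: str          # e.g. "move"
    target_type: str      # object type mentioned in speech, e.g. "couch"
    timestamp: float

@dataclass
class PaddleGesture:
    action: str                     # e.g. "point", "push", "tilt"
    pointed_object: Optional[str]   # scene object id under the paddle, if any
    timestamp: float

FUSION_WINDOW_S = 1.5  # assumed time window for pairing speech with a gesture

def fuse(speech: SpeechInput, gesture: PaddleGesture,
         scene_objects: dict[str, dict]) -> Optional[dict]:
    """Combine one speech input and one gesture into a single interpretation.

    Time-based step: the two inputs must fall within a short window.
    Semantic step: the object the paddle points at must match the object
    type mentioned in speech (a stand-in for using object properties and
    user context to disambiguate the inputs).
    """
    if abs(speech.timestamp - gesture.timestamp) > FUSION_WINDOW_S:
        return None  # inputs too far apart in time to belong together
    obj_id = gesture.pointed_object
    if obj_id is None:
        return None  # gesture did not resolve to any scene object
    obj = scene_objects.get(obj_id, {})
    if obj.get("type") != speech.target_type:
        return None  # semantic mismatch: paddle points at a different kind of object
    return {"action": speech.command, "object": obj_id}

# Example: pairing "move the couch" with a paddle point on object "couch_1".
scene = {"couch_1": {"type": "couch"}, "lamp_1": {"type": "lamp"}}
s = SpeechInput("move", "couch", timestamp=10.2)
g = PaddleGesture("point", "couch_1", timestamp=10.6)
print(fuse(s, g, scene))  # {'action': 'move', 'object': 'couch_1'}
```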
- URI
- https://pubs.kist.re.kr/handle/201004/81537
- DOI
- 10.1109/ISMAR.2006.297812
- Appears in Collections:
- KIST Conference Paper > 2006
- Files in This Item:
There are no files associated with this item.