Human Factors in Integration and Synchronization of Input Modes
- Study conducted by Oviatt to explore the multimodal integration and synchronization patterns that occur during pen- and speech-based human-computer interaction.
- Evaluate the linguistic features of spoken multimodal interfaces and how they differ from unimodal speech recognition interfaces.
- Determine how spoken and written modes are naturally integrated and synchronized during multimodal input construction.