Eyes-free interaction with aural user interfaces

Date
2015-04-11
Language
American English
Degree
Ph.D.
Degree Year
2015
Department
School of Informatics
Grantor
Indiana University
Abstract

Existing web applications force users to focus their visual attention on mobile devices while browsing content and services on the go (e.g., while walking or driving). To support mobile, eyes-free web browsing and minimize interaction with devices, designers can leverage the auditory channel. Although acoustic interfaces have proven effective in reducing visual attention, designing aural information architectures for the web remains challenging because of its non-linear structure. To address this problem, we introduce and evaluate techniques to remodel existing information architectures as "playlists" of web content, called aural flows. Aural flows for mobile web browsing are exemplified by ANFORA News, a semi-aural mobile site designed to facilitate browsing large collections of news stories. An exploratory study involving frequent news readers (n=20) investigated usability and the navigation experience with ANFORA News in a mobile setting. The initial evidence suggests that aural flows are a promising paradigm for supporting eyes-free mobile navigation on the go. Interacting with aural flows, however, requires users to select interface buttons, tethering visual attention to the mobile device even when it is unsafe. To reduce visual interaction with the screen, we also explore the use of simulated voice commands to control aural flows. In a second study, 20 participants browsed aural flows either through a visual interface alone or through a visual interface augmented by voice commands. The results suggest that voice commands cut the time spent looking at the device in half while yielding walking speeds, system usability ratings, and cognitive effort ratings similar to those obtained with buttons. To test the potential of aural flows in a more distracting context, a third study (n=60) was conducted in a driving simulation lab. Each participant drove through three driving scenarios of increasing complexity: low, moderate, and high. Within each driving complexity, participants were exposed to one of three aural application conditions: no device, voice-controlled aural flows (ANFORADrive), or an alternative solution on the market (Umano). The results suggest that voice-controlled aural flows do not affect distraction, overall safety, cognitive effort, driving performance, or driving behavior when compared to the no-device condition.

Description
Indiana University-Purdue University Indianapolis (IUPUI)