Abstract
We examine the use of auditory display for ubiquitous computing to extend the boundaries of human-computer interaction (HCI). Our design process is based on listening tests that gather free-text identification responses from participants. The responses and their classifications indicate how accurately sounds are identified and suggest possible metaphors and mappings from sound to human action and/or system status.
| Original language | English |
| --- | --- |
| Pages (from-to) | 36-44 |
| Number of pages | 9 |
| Journal | IEEE Multimedia |
| Volume | 12 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - Apr 2005 |