The accessibility of handheld mobile devices is a unique problem domain. Their small form factor constrains display size and places serious demands on user mobility. Existing assistive technology tackles these problems with bespoke solutions and text-to-speech augmentation, bulking out the device and forcing visual metaphors upon blind users. Stepping away from such "bolt-on" accessibility, this research revisits the processes by which user interfaces are designed, constructing a model of user interface development that allows the interface to adapt dynamically to individual user capability profiles. In doing so, it abstracts content meaning from presentation, mapping interaction metaphors both to categorized user capabilities within individual design spaces (visual, sonic, and haptic) and to relevant content meaning.