Sensors and Systems


If you are walking or cycling and don't want to (or are unable to) spend most of the time focusing on a screen, using a mobile device tends to be a frustrating experience. The same is true in bright sunlight, or if your eyesight simply isn't good enough to see every detail on the screen. The persistent problem of displaying information on the small screens of mobile devices is pushing the development of displays that make use of sound, gestures and the sense of touch. One problem is that these non-visual channels are often used only to enhance the visual channel, instead of being designed to stand on their own. Improved multimodal perceptualizations (presentations of information that engage senses other than vision) would make applications more accessible and easier to use in real mobile, navigational situations.

The HaptiMap EU project (FP7-ICT-224675) aims to make maps and location-based services more accessible by using several senses: touch, hearing and vision. HaptiMap is a four-year project that started in 2008 and is now at its halfway point.
HaptiMap will enable digital maps and mobile location-based services to be accessible to a wide range of users. Our strategy is threefold: to develop tools that make it easier for developers to add adaptable multimodal components (designed to improve accessibility) to their applications; to raise awareness of these issues through new guidelines; and to suggest extensions to existing design practices so that accessibility is considered throughout the design process.


[Figure: Paper maps can be enhanced multimodally.]

Multimodality is a useful addition to navigation applications, allowing information to be shifted from the relatively overloaded visual sense to hearing and touch. However, unless well designed, touch and sound feedback can be annoying. In HaptiMap we are looking at ways to employ touch and hearing effectively, to make map and navigation applications more useful and engaging as well as more accessible to users with impairments.
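To make the idea concrete, here is a minimal sketch in Java of one common way to move distance information from the eyes to the skin: mapping the remaining distance to a target onto the tempo of vibration pulses, so that pulses speed up as the user approaches. This is not project code; the class name and the interval and distance constants are our own illustrative choices.

```java
// Illustrative sketch (not HaptiMap code): map distance to a target onto
// a vibration pulse interval, so proximity can be felt rather than read.
public final class TempoMapping {
    // Pulse every 1.5 s when far away, speeding up to every 0.2 s when close.
    static final long FAR_INTERVAL_MS = 1500;
    static final long NEAR_INTERVAL_MS = 200;
    static final double FAR_DISTANCE_M = 200.0;

    /** Linear interpolation of pulse interval from distance (clamped). */
    public static long pulseIntervalMs(double distanceMeters) {
        double t = Math.min(1.0, Math.max(0.0, distanceMeters / FAR_DISTANCE_M));
        return (long) (NEAR_INTERVAL_MS + t * (FAR_INTERVAL_MS - NEAR_INTERVAL_MS));
    }

    public static void main(String[] args) {
        System.out.println(pulseIntervalMs(150.0)); // prints 1175
    }
}
```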

During the first year the project focused on user studies and on laying the foundations for the implementation work. In HaptiMap we use and advocate an iterative, user-centred design methodology in which end users are involved throughout the work process and designs and prototypes are tested repeatedly.

The main focus during this first year was the user studies. In order to get a full picture of the relevant issues we used a range of techniques, including questionnaires, interviews, focus group discussions, probe studies, workshops and contextual tests. In total, 221 users were involved in the first-year activities, of whom 83 were sighted, 72 visually impaired and 66 elderly. In addition, 188 users answered a web questionnaire.

Our studies of severely visually impaired, elderly and sighted users indicate broad similarities in the types of information all people need, although visually impaired users generally require a higher level of detail. We have explored both goal-directed navigation and more exploratory situations, since devices supporting navigation may be used in both scenarios. The current lack of a proper solution for pedestrian navigation was apparent throughout our studies.

[Figure: Appropriate sounds or touch feedback as the user points the device in different directions can be used for orientation and navigation (non-visual augmented reality).]
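The kind of pointing-based feedback described in the caption above can be sketched in a few lines. The Android snippet below is a hypothetical illustration, not HaptiMap code: it compares the compass azimuth with the bearing to a target and gives a short vibration whenever the device points roughly in the right direction. The class name and tolerance are our own assumptions.

```java
// Hypothetical sketch of pointing-based, non-visual feedback: vibrate
// briefly whenever the device is pointed roughly towards the target.
import android.content.Context;
import android.location.Location;
import android.os.Vibrator;

public class PointingFeedback {
    private final Vibrator vibrator;
    private static final float TOLERANCE_DEG = 15f;

    public PointingFeedback(Context context) {
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
    }

    /** Call with each new compass azimuth (degrees, 0 = north). */
    public void onHeading(float azimuthDeg, Location here, Location target) {
        float bearing = here.bearingTo(target);  // -180..180 degrees
        float error = Math.abs(normalize(azimuthDeg - bearing));
        if (error < TOLERANCE_DEG) {
            vibrator.vibrate(50);                // short "on target" pulse
        }
    }

    private static float normalize(float deg) { // wrap into -180..180
        while (deg > 180f) deg -= 360f;
        while (deg < -180f) deg += 360f;
        return deg;
    }
}
```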

All user groups stressed the need for better adaptation of information and device interaction to the pedestrian situation (preferably organized in layers to allow different levels of detail). There were also requests for greater use of additional sensory channels, as well as for hands-free use. This is well in line with the basic assumption underlying the HaptiMap project, and this work gave us a solid basis for the design and development effort.

Another important result from the first year is the HaptiMap user study guidelines, available through the project website. The guidelines were compiled to provide a common basis for the work in HaptiMap, but the document is also written to be valuable and useful for designers, developers and researchers outside the project.

During the second year we focused on interaction design research, and iteratively carried out user studies on how to perceptualize maps and navigational information. We have studied the design of visual map displays for people with limited eyesight, as well as designs that use only vibratory and/or sound feedback for displaying map features and guiding.

A special study was conducted to determine the different causes of getting lost, and another on the design and recognition of feasible vibration patterns for guiding using touch feedback alone. User scenario walks were conducted to inspire interaction design and discussions about map information content. Furthermore, a study in which blind and sighted users rode tandems together was conducted to establish what kind of information is required to involve the blind co-driver actively in the route travelled.

[Figure: Tactile or audio feedback as the user slides a finger over the display can be used to reveal object locations.]
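This kind of touch-based exploration can also be sketched in code. The Android snippet below is illustrative only (the class name and hit radius are our own inventions): it buzzes briefly whenever the exploring finger passes within a small radius of a point of interest that has already been projected to screen coordinates.

```java
// Sketch of touch-based map exploration (illustrative only): give a short
// vibration whenever the exploring finger passes near a point of interest.
import android.graphics.PointF;
import android.view.MotionEvent;
import java.util.List;

public class TouchExploration {
    private static final float HIT_RADIUS_PX = 40f;
    private final List<PointF> poiScreenPositions; // POIs projected to screen
    private final Runnable buzz;                   // e.g. triggers vibrator.vibrate(30)

    public TouchExploration(List<PointF> pois, Runnable buzz) {
        this.poiScreenPositions = pois;
        this.buzz = buzz;
    }

    public boolean onTouchEvent(MotionEvent e) {
        if (e.getAction() == MotionEvent.ACTION_MOVE) {
            for (PointF p : poiScreenPositions) {
                float dx = e.getX() - p.x, dy = e.getY() - p.y;
                if (dx * dx + dy * dy < HIT_RADIUS_PX * HIT_RADIUS_PX) {
                    buzz.run();                    // finger is over an object
                    break;
                }
            }
        }
        return true;
    }
}
```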

To strengthen our efforts to make developers and designers aware of the importance of tactile and audio feedback in mobile applications, we have also conducted interviews and video observations "in the wild" of how mobile applications are actually used.

In addition, we have continued our studies of how multimodality can be employed to improve navigation and wayfinding. Informed by the first-year user studies, we have looked at different scenarios in which users navigate, examined how multimodality can be effectively employed in them, and explored how these situations can be supported with current technologies.

We have found that low-fidelity tactile and auditory information can provide an effective way to communicate information about points of interest and landmarks in a manner that is both straightforward and unobtrusive. As a means of evaluating this we have released the PocketNavigator on the Android Market.
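As an illustration, one plausible low-fidelity encoding of guidance cues (our own example, not necessarily the exact scheme used by the PocketNavigator) is to map "turn left", "turn right" and "straight on" onto distinct vibration patterns:

```java
// Hypothetical low-fidelity turn encoding: two short pulses = turn left,
// one long buzz = turn right, one short buzz = straight on.
import android.os.Vibrator;

public class TurnPatterns {
    private final Vibrator vibrator;

    public TurnPatterns(Vibrator vibrator) { this.vibrator = vibrator; }

    /** headingErrorDeg: signed difference between walking and target direction. */
    public void signal(float headingErrorDeg) {
        long[] pattern;
        if (headingErrorDeg < -30f) {
            pattern = new long[] {0, 80, 120, 80}; // buzz, pause, buzz: left
        } else if (headingErrorDeg > 30f) {
            pattern = new long[] {0, 400};         // one long buzz: right
        } else {
            pattern = new long[] {0, 80};          // one short buzz: straight on
        }
        vibrator.vibrate(pattern, -1);             // -1 = play once, no repeat
    }
}
```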

This means that anybody can download it, which allows us to gather useful feedback and follow an iterative development cycle. What we learn is also being fed into the development of novel hardware that can be incorporated into accessories or clothing. These technologies will be combined with our findings on visual displays for users who are visually impaired but have some residual vision, as well as with work on ontologies that determine which features of the environment should be communicated. This will allow us to communicate the information users need discreetly and effectively.

A special software suite, the HaptiMap toolkit, has been developed that gives developers of mobile phone applications the ability to build multimodal programs that draw on our findings, so that mapping and location-based services can be used through the senses of sight, sound and touch. The HaptiMap toolkit is adaptable in two senses. Firstly, when built from source it adapts to the target platform's capabilities, recognising, for example, which motion sensors, sound players and screen sizes are available.
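The snippet below sketches what such capability detection can look like on Android. It is illustrative only and does not show the toolkit's actual API: it simply probes for a compass, an accelerometer and a vibrator, so the application can fall back to other channels when one is missing.

```java
// Generic capability probing of the kind described above (illustrative;
// this is not the HaptiMap toolkit's actual API).
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Vibrator;

public class Capabilities {
    public final boolean hasCompass;
    public final boolean hasAccelerometer;
    public final boolean hasVibrator;

    public Capabilities(Context ctx) {
        SensorManager sm = (SensorManager) ctx.getSystemService(Context.SENSOR_SERVICE);
        hasCompass = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD) != null;
        hasAccelerometer = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER) != null;
        Vibrator v = (Vibrator) ctx.getSystemService(Context.VIBRATOR_SERVICE);
        hasVibrator = v != null; // fall back to audio cues when absent
    }
}
```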

[Figure: An environment where mobile devices are used.]

Secondly, it is adaptable in use: for example, a Bluetooth GPS receiver may come into range or go out of range. The toolkit is cross-platform. It builds for the mobile platforms Windows Mobile 6.x, iPhone OS 3 and 4, Symbian 3rd and 5th editions, Android (via the JNI), Linux phones (such as OpenMoko), and Maemo 4 (Diablo) and 5 (Fremantle), and for the desktop platforms Windows (XP, Vista and 7), Linux and Mac OS X (10.5 Leopard and 10.6 Snow Leopard). It can be used for tasks as varied as route planning and selecting which map properties to display when a map is used on a mobile device.
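As a rough illustration of this kind of runtime adaptation (again, not the toolkit's actual API), an Android application can listen for location providers appearing and disappearing and switch between them:

```java
// Sketch of runtime adaptation: react to location providers coming and
// going, falling back to network positioning when GPS is lost.
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class AdaptiveLocation implements LocationListener {
    private final LocationManager manager;

    public AdaptiveLocation(LocationManager manager) {
        this.manager = manager;
        manager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 1, this);
    }

    @Override public void onLocationChanged(Location loc) { /* update position */ }

    @Override public void onProviderEnabled(String provider) {
        // A better provider (e.g. an external GPS) came into range: switch to it.
        manager.requestLocationUpdates(provider, 1000, 1, this);
    }

    @Override public void onProviderDisabled(String provider) {
        // Provider lost: fall back to network-based positioning.
        manager.requestLocationUpdates(LocationManager.NETWORK_PROVIDER, 1000, 1, this);
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
}
```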

To demonstrate the toolkit we have started to enhance existing location-based services with the accessibility features we have developed. We have proposed nine applications to demonstrate all facets of the toolkit. The demonstrators range from existing commercial applications, such as NAVTEQ Connect and GeoMobile's JUICY BEATS event guide, through newly developed research applications, to sample applications specifically targeted at developers. The first accessibility features can already be tested, for example in the previously mentioned PocketNavigator, which is available through the Android Market at the time of writing. More features are expected in spring 2011, when the demonstrators will be evaluated in different settings and with diverse user groups.

To support industrial design processes, we have chosen a twofold approach: we have participated in standardisation work in order to develop standards that support accessibility, and we have worked more concretely on different types of materials useful for developers and designers.
During 2010 we detailed our initial exploitation plans for the potentially commercial results of HaptiMap. The main commercial results will include extensions to the core toolkit and real-world demonstrator applications of navigation solutions.

Competitor details, user characteristics, market analyses and financial plans were generated during the year for both the HaptiMap toolkit and the commercial demonstrators (event guides). At this stage, with approximately half (24 months) of the project's lifetime remaining, it is clear that the huge growth in the LBS market makes HaptiMap a very timely initiative. We believe that the main deliverables of this project, the toolkit and the audio-haptic guidelines, can underpin a profitable business model when used by mobile LBS developers.

HaptiMap (Haptic, Audio and Visual Interfaces for Maps and Location Based Services) is a project which receives financial support from the European Commission in the Seventh Framework Programme, under the Cooperation Programme ICT – Information and Communication Technologies (Challenge 7 – Independent living and inclusion).


————————————————————

Charlotte Magnusson is Associate Professor at Certec, Division of Rehabilitation Engineering Research, Department of Design Sciences, Lund University, Lund, Sweden.

More Information: The HaptiMap project is coordinated by Lund University, Sweden. The other partners in the project are NAVTEQ, Siemens, BMT Group, CEA, ONCE, Finnish Geodetic Institute, University of Glasgow, OFFIS, Queen’s University, Fundacion Robotiker, Kreis Soest, Lunds Kommun and GeoMobile. You can read more about the project at the project website www.haptimap.org.
