New Project Starts

We have just started work on a new EC (CHIST-ERA) project called eGlasses, in partnership with a range of research groups across Europe in Poland, Austria, France, Switzerland and Luxembourg.

The eGlasses project is focused on the development of an open platform in the form of multisensory electronic glasses, and on the integration and design of new intelligent interaction methods using the eGlasses platform. This is an initial development focused on long-term research and technological innovation in perceptual and super-perceptual (e.g. heart rate, temperature) computing. It is an emerging technology that is also focused on the creation of mobile, perceptual media. Perceptual media refers to multimedia devices with added perceptual user interface capabilities. These devices integrate human-like perceptual awareness of the environment with the ability to respond appropriately. This is achieved by automatically perceiving an object's properties and delivering information about the object's status as the result of reasoning operations. For example, using the eGlasses it will be possible to control a device recognized within the field of view through an interactive menu associated with the identified device. Other examples include presenting the name of a recognized person, recognizing people with abnormal physiological parameters, protecting against possible head injuries, and so on.
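To make the perceive-reason-act idea above more concrete, here is a minimal sketch of recognising a device in the field of view and offering an interactive menu for it. The class, function and action names are purely illustrative assumptions, not part of the eGlasses platform.

```python
# Hypothetical sketch of the perceive -> reason -> act flow described above.
# None of these names come from the eGlasses project; they only illustrate
# the idea of attaching an interactive menu to a recognised object.

from dataclasses import dataclass


@dataclass
class RecognisedObject:
    label: str            # e.g. "thermostat", "television"
    confidence: float     # recognition confidence in [0, 1]
    actions: list[str]    # menu entries associated with this object


def build_menu(obj: RecognisedObject, threshold: float = 0.8) -> list[str]:
    """Return the interactive menu for an object, or nothing if the
    recognition is not confident enough to act on."""
    if obj.confidence < threshold:
        return []
    return obj.actions


# Example: a thermostat recognised in the field of view.
thermostat = RecognisedObject("thermostat", 0.93,
                              ["raise temperature", "lower temperature"])
print(build_menu(thermostat))   # ['raise temperature', 'lower temperature']
```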

The platform will use currently available user-interaction methods as well as new methods developed within the project (e.g. a haptic interface), and will allow further extensions that introduce next-generation user-interaction algorithms. A further goal of the project is to propose and evaluate new, intelligent user interactions that are particularly useful for healthcare professionals and for people with disabilities or at risk of exclusion, and to create and evaluate behavioural models of these mobile users.
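One way an open platform can stay extensible in this sense is a simple registry that new interaction methods plug into. The sketch below is an assumption about how that could look, not a design prescribed by the project; the registry and method names are hypothetical.

```python
# A minimal sketch of registering user-interaction methods on an open
# platform so that future methods can be added without touching existing code.

from typing import Callable, Dict

INTERACTION_METHODS: Dict[str, Callable[[str], str]] = {}


def register(name: str):
    """Decorator that adds an interaction method under a given name."""
    def wrapper(fn: Callable[[str], str]) -> Callable[[str], str]:
        INTERACTION_METHODS[name] = fn
        return fn
    return wrapper


@register("haptic")
def haptic_feedback(event: str) -> str:
    # Placeholder for a haptic interface such as the one mentioned above.
    return f"vibrate on {event}"


@register("eye_tracking")
def eye_tracking(event: str) -> str:
    return f"gaze selection for {event}"


# A later extension only needs to register itself under a new name.
print(sorted(INTERACTION_METHODS))   # ['eye_tracking', 'haptic']
```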

The main scientific and technological objectives of the project are to design and evaluate the following:

– eye-tracking hardware and algorithms for a user who is mobile in a noisy real-world environment,
– algorithms for perceptual media and for super-perceptual computing,
– methods for locating objects and guiding vision towards the identified objects,
– methods of interaction with users and objects (a menu of activities for the identified person or object),
– a haptic interface in the form of a peripheral proximity radar,
– methods for the recognition of the user's own gestures and of the gestures of an observed person,
– methods for context-aware behavioural studies,
– methods for reference applications.

The result of the project will be an open platform in the form of multisensory electronic multimedia glasses, together with a set of new methods for intelligent user interactions, especially in the context of perceptual media. A small illustration of the super-perceptual objective follows below.
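As a simple illustration of the super-perceptual objective, the sketch below combines an identified person with a physiological reading (heart rate) and flags values outside a plausible range. The thresholds and function names are assumptions for illustration only, not project outputs.

```python
# Hypothetical illustration of "super-perceptual" computing: pairing a
# recognised person with a heart-rate reading and flagging abnormal values.
# Thresholds are illustrative assumptions, not clinical guidance.

def is_heart_rate_abnormal(bpm: float, low: float = 50.0, high: float = 110.0) -> bool:
    """Return True if the measured heart rate falls outside an assumed normal range."""
    return bpm < low or bpm > high


def annotate_person(name: str, bpm: float) -> str:
    """Compose the overlay text the glasses might show next to a recognised person."""
    status = "abnormal heart rate" if is_heart_rate_abnormal(bpm) else "normal"
    return f"{name}: {bpm:.0f} bpm ({status})"


print(annotate_person("Person A", 72))    # Person A: 72 bpm (normal)
print(annotate_person("Person B", 128))   # Person B: 128 bpm (abnormal heart rate)
```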

About Rod McCall

Rod McCall is a researcher in the field of human-computer interaction, in areas such as augmented reality, mobile gaming, in-car systems and virtual environments. He works for the University of Luxembourg and has previously worked for Fraunhofer and Runtime Revolution. He is an advocate of new technologies only when they have some practical benefit for individuals or society.
