Eye Tracking

The innovation department for "Cognitive Services and Augmented Reality" (CSAR Lab) at e.solutions has developed the eye tracking platform Cockpit Vision for robust, high-performance use in vehicles. The modular platform supports a versatile range of applications, from monitoring to driver interaction. Tracking facial and eye features in real time not only provides the head pose and gaze direction in 3D, but also simultaneously captures changes in facial expression. In addition to cockpit interaction through the driver's gaze, the driver's condition itself can be measured, enabling accurate driver monitoring.

This video of our Cockpit Vision platform shows, as an example, how displays are activated and deactivated by the driver's tracked gaze.

Versatile Use Cases

Eye tracking, or gaze tracking, plays an essential role in human-machine interaction and enables new, versatile operating concepts and comfort functions. Practical use cases range from controlling Virtual or Augmented Reality and contactless interaction with displays in public spaces or with the TV at home to the intelligent cockpit in the car.

As an integral part of an intelligent cockpit, eye tracking enables a multitude of innovative functions in the automobile. In addition to contactless operation of general controls such as displays, fans and window regulators, the driver's state can also be determined by measuring attention.
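One simple way to derive an attention signal from tracked gaze is to check what fraction of recent frames the gaze falls on the road. The following minimal sketch illustrates that idea; the zone label, window size and threshold are illustrative assumptions, not the actual Cockpit Vision implementation.

```python
# Hedged sketch: attention estimation from a stream of gaze zones.
# ROAD_ZONE, WINDOW_FRAMES and ATTENTION_THRESHOLD are assumed values.
from collections import deque

ROAD_ZONE = "road"          # assumed label from a gaze-to-zone mapping
WINDOW_FRAMES = 150         # e.g. 5 s at 30 fps (illustrative)
ATTENTION_THRESHOLD = 0.7   # minimum fraction of frames on the road zone

class AttentionEstimator:
    """Keeps a sliding window of gaze zones and flags inattention."""

    def __init__(self, window=WINDOW_FRAMES):
        self.history = deque(maxlen=window)

    def update(self, gaze_zone):
        """Record the latest gaze zone; return True if the driver appears attentive."""
        self.history.append(gaze_zone)
        on_road = sum(1 for zone in self.history if zone == ROAD_ZONE)
        return on_road / len(self.history) >= ATTENTION_THRESHOLD
```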

Developing the Product

Research and development on gaze direction recognition at e.solutions focuses on automotive applications, which place high demands on reliability, performance and flexibility across the widest possible range of use cases. We meet these challenges with a multi-stage, iterative strategy of design, simulation, prototyping and evaluation.

With the help of a realistic 3D simulation of the cockpit, the driver and the driver's possible range of movement, a design can be selected from the multitude of possibilities, e.g. the position and strength of the IR illumination and the camera position, and optimized for a given coverage of viewing angles. The following video shows a simulation of these effects, as well as the influence of shadows and light reflections in the eye, for different groups of people with and without glasses.
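At its core, such an optimization repeatedly evaluates how well a candidate camera placement covers the driver's possible eye positions. The sketch below shows that kind of coverage check under simplified assumptions; all geometry values (movement box, camera pose, field of view) are illustrative placeholders, not the parameters of the real simulation.

```python
# Hedged sketch: fraction of sampled eye positions inside a camera's view cone.
import numpy as np

def angle_between(v1, v2):
    """Angle in degrees between two 3D vectors."""
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def coverage(camera_pos, camera_dir, fov_deg, eye_positions):
    """Fraction of sampled eye positions that fall inside the camera's field of view."""
    hits = 0
    for eye in eye_positions:
        to_eye = eye - camera_pos
        if angle_between(camera_dir, to_eye) <= fov_deg / 2:
            hits += 1
    return hits / len(eye_positions)

# Sample eye positions over an assumed 20 cm x 20 cm x 10 cm movement box (metres).
grid = np.stack(np.meshgrid(np.linspace(-0.1, 0.1, 5),
                            np.linspace(-0.1, 0.1, 5),
                            np.linspace(0.6, 0.7, 3)), axis=-1).reshape(-1, 3)

print(coverage(camera_pos=np.array([0.0, -0.3, 0.0]),
               camera_dir=np.array([0.0, 0.3, 0.65]),   # pointing towards the eye box
               fov_deg=60.0,
               eye_positions=grid))
```

Running this for many candidate camera and illumination poses and picking the configuration with the highest coverage is one plausible way to frame the design optimization described above.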

The most promising design is then implemented as a real prototype, validated, and subjected to further experiments and subject tests. For this purpose, we have built a portable yet functional cockpit simulator from the interior of a production vehicle, creating variable but realistic test and development conditions. New applications, such as our gaze direction recognition, can now be integrated and tested here. To this end, the cockpit has been extended with IR lighting, a camera and additional displays.

Our Cockpit Vision software now brings the eye tracking demonstrator to life, as the video at the top of the page shows. In one example application, the center display fades in and the passenger display fades out when triggered by the driver's gaze, protecting the driver from disturbing glare or distracting content on the passenger display while driving.
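Gaze-triggered display (de-)activation of this kind can be expressed as a small dwell-time state machine: a display only reacts once the gaze has rested on it for a short while, which avoids flicker from brief glances. The sketch below illustrates this; the dwell time and the set_display_brightness() helper are hypothetical stand-ins, not the actual Cockpit Vision interface.

```python
# Hedged sketch: dwell-time state machine for gaze-triggered display fading.
import time

DWELL_SECONDS = 0.5  # assumed dwell time before a display reacts

def set_display_brightness(display, level):
    """Hypothetical stand-in for the actual display control interface."""
    print(f"{display}: brightness -> {level}")

class GazeDisplaySwitch:
    def __init__(self, zone_name):
        self.zone_name = zone_name   # e.g. "center_display" (assumed zone label)
        self.gaze_since = None
        self.active = False

    def update(self, current_zone, now=None):
        """Feed the current gaze zone each frame; fades the display in or out."""
        now = now if now is not None else time.monotonic()
        if current_zone == self.zone_name:
            if self.gaze_since is None:
                self.gaze_since = now
            elif not self.active and now - self.gaze_since >= DWELL_SECONDS:
                self.active = True
                set_display_brightness(self.zone_name, 1.0)   # fade in
        else:
            self.gaze_since = None
            if self.active:
                self.active = False
                set_display_brightness(self.zone_name, 0.0)   # fade out
```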

To evaluate the accuracy of the gaze tracking, position markers were added to the setup and various experiments with test subjects were conducted. The following figure shows the distribution of the detected gaze positions across different drivers and poses for 9 different marker positions.
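An evaluation like this boils down to comparing detected gaze points against the known marker positions and summarizing the error per marker. The following minimal sketch shows one way to compute such statistics; the data layout is assumed for illustration only.

```python
# Hedged sketch: per-marker gaze error statistics from recorded test data.
import numpy as np

def gaze_errors(detected, markers):
    """detected: dict marker_id -> (N, 2) array of detected gaze points (metres);
    markers: dict marker_id -> (2,) true marker position (metres).
    Returns dict marker_id -> (mean error, std of error) in metres."""
    stats = {}
    for mid, points in detected.items():
        err = np.linalg.norm(np.asarray(points) - np.asarray(markers[mid]), axis=1)
        stats[mid] = (err.mean(), err.std())
    return stats
```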

Technology

Techniques for gaze direction recognition can be roughly differentiated by the required hardware and the resulting accuracy and ease of handling. Headset-based systems, usually used in user studies or VR/AR applications, allow relatively accurate tracking of eye movement because of the short distance between the eye and the eyewear-integrated camera, but they also require a tracking system that locates the headset in the room. Because of the extra glasses required, these systems are rather unsuitable in an automotive context. Tracking systems that work with a camera from a distance usually require head pose detection, but are much more versatile and unobtrusive to use. For robust use in cars, infrared (IR) light sources and IR-sensitive cameras are used to compensate for external lighting influences such as sunlight or darkness.

In addition, the positions of the external light sources and their reflections on the cornea of the eye can be used to determine the gaze direction more accurately than would be possible using the pupil and head position alone.
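A widely known variant of this idea is the pupil-centre / corneal-reflection (PCCR) approach: the vector from the glint (the reflection of the IR source on the cornea) to the pupil centre is mapped to a gaze point through a calibrated polynomial. The sketch below illustrates that general approach, not the specific method used in Cockpit Vision; the feature set and fitting procedure are one common choice.

```python
# Hedged sketch: pupil-centre / corneal-reflection gaze mapping with a
# second-order polynomial fitted from calibration samples.
import numpy as np

def pccr_vector(pupil_center, glint_center):
    """Pupil-minus-glint vector in image coordinates (pixels)."""
    return np.asarray(pupil_center, float) - np.asarray(glint_center, float)

def fit_mapping(vectors, screen_points):
    """Least-squares fit mapping pupil-glint vectors to screen coordinates."""
    v = np.asarray(vectors, float)
    x, y = v[:, 0], v[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, float), rcond=None)
    return coeffs

def estimate_gaze(vector, coeffs):
    """Map a single pupil-glint vector to an estimated screen point."""
    x, y = vector
    features = np.array([1.0, x, y, x * y, x**2, y**2])
    return features @ coeffs
```

In practice, the calibration samples would come from a short procedure in which the subject fixates known targets, such as the marker positions used in the accuracy evaluation above.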