Gaze tracking on HoloLens
Augmented reality user interfaces are an attractive solution for interacting with smart objects that lack a rich user interface of their own. An important challenge, however, is determining which smart device the user wants to interact with so that its UI can be shown. Gaze can tell the system what the user's current point of interest is, and eye movements can also serve as input to the system. In this project, we have explored how a combination of eye and head movements can provide intuitive input to an augmented reality user interface. Two novel interaction techniques that combine these modalities in different ways were developed and evaluated against the standard head-pointing mechanism on the HoloLens.
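To illustrate the general idea of fusing eye and head movements for target selection, the sketch below shows one plausible scheme: the eye-gaze ray (precise but jittery) picks a candidate smart object, the head ray acts as a coarse gate, and a dwell timer confirms the selection. This is a minimal, hypothetical sketch in Python for readability (HoloLens applications are typically built in Unity/C#); the class names, cone angles, and dwell threshold are illustrative assumptions, not the project's actual techniques.

import math
from dataclasses import dataclass

@dataclass
class Ray:
    origin: tuple     # (x, y, z) in world coordinates
    direction: tuple  # unit vector

def angle_to_target(ray: Ray, target: tuple) -> float:
    """Angle (radians) between the ray direction and the origin-to-target vector."""
    to_target = tuple(t - o for t, o in zip(target, ray.origin))
    norm = math.sqrt(sum(c * c for c in to_target))
    if norm == 0.0:
        return 0.0
    dot = sum(d * c / norm for d, c in zip(ray.direction, to_target))
    return math.acos(max(-1.0, min(1.0, dot)))

class DwellSelector:
    """Hypothetical dwell-based selector fusing eye-gaze and head-pose rays.

    Eye gaze nominates the candidate object; the head ray's wider cone
    filters out stray saccades far outside the head's field of view.
    """
    def __init__(self, dwell_s=0.8,
                 eye_cone_rad=math.radians(3),
                 head_cone_rad=math.radians(20)):
        self.dwell_s = dwell_s            # assumed dwell time to confirm
        self.eye_cone_rad = eye_cone_rad  # assumed eye-gaze tolerance
        self.head_cone_rad = head_cone_rad
        self._candidate = None
        self._since = 0.0

    def update(self, eye_ray: Ray, head_ray: Ray, objects: dict, now: float):
        """objects maps name -> world position; returns a name once dwell completes."""
        hit = None
        for name, pos in objects.items():
            if (angle_to_target(eye_ray, pos) < self.eye_cone_rad
                    and angle_to_target(head_ray, pos) < self.head_cone_rad):
                hit = name
                break
        if hit != self._candidate:   # gaze moved: restart the dwell timer
            self._candidate, self._since = hit, now
            return None
        if hit is not None and now - self._since >= self.dwell_s:
            self._candidate = None   # reset so the selection fires only once
            return hit               # dwell complete: show this object's UI
        return None

In use, each frame would feed the selector the current eye and head rays plus the tracked positions of nearby smart objects; whatever name it returns is the device whose UI the system should display. The two techniques evaluated in the project combine the modalities in their own ways, which this sketch does not attempt to reproduce.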