I am a 4th-year HCI PhD student.


My research interests include novel interaction techniques, wearable computing, tangible interaction, and AR/VR.

Site Content

RadarCat [UIST'16]


RadarCat is a small, versatile radar-based system for material and object classification that enables new forms of everyday proximate interaction with digital devices.

Video
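
For a flavour of the pipeline behind RadarCat, here is a minimal sketch, assuming radar frames are already reduced to fixed-length feature vectors; the synthetic data, feature dimensionality, and random-forest choice are illustrative assumptions, not the paper's actual implementation.

```python
# Toy material classifier over radar feature vectors (hypothetical data layout).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for real radar captures: one 64-dim feature vector per frame,
# labelled with the material the sensor was resting on.
materials = ["wood", "glass", "steel", "skin"]
X = rng.normal(size=(400, 64)) + np.repeat(np.arange(4)[:, None], 100, axis=0)
y = np.repeat(materials, 100)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```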

Solinteraction [IMWUT, UbiComp'19]


We employ radar sensing and machine learning to explore tangible interactions through a family of techniques we term Solinteraction.

Video
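
One practical ingredient of radar-driven tangible interaction is stabilising noisy per-frame predictions before acting on them. The sketch below shows a simple majority-vote smoother over a sliding window; the labels, window length, and frame stream are assumptions for illustration only.

```python
# Majority-vote smoothing of per-frame radar predictions (illustrative only).
from collections import Counter, deque

def smooth_predictions(frame_labels, window=15):
    """Yield the most common label over a sliding window of recent frames."""
    recent = deque(maxlen=window)
    for label in frame_labels:
        recent.append(label)
        yield Counter(recent).most_common(1)[0][0]

# Example: a noisy stream settles on the stable reading "2 tokens".
stream = ["1 token"] * 5 + ["2 tokens", "1 token", "2 tokens"] + ["2 tokens"] * 10
print(list(smooth_predictions(stream, window=7))[-1])  # -> "2 tokens"
```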

SWiM [CHI'17]


We propose and evaluate a novel design point: a tilt-based text entry technique, built on the gesture keyboard, that supports single-handed use.

Video
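
To illustrate the basic idea of steering a gesture-keyboard cursor by tilting the phone, here is a hedged sketch that maps pitch and roll to a cursor position over the keyboard; the gain, clamping range, and axis conventions are made-up values, and real word-gesture decoding is considerably more involved.

```python
# Map device tilt (pitch/roll, in degrees) to a cursor on a keyboard region.
def tilt_to_cursor(pitch_deg, roll_deg, width=1080, height=400,
                   max_tilt_deg=30.0):
    """Linear mapping: +/- max_tilt_deg of tilt spans the full keyboard."""
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))
    # Roll steers left/right, pitch steers up/down (illustrative convention).
    x = (clamp(roll_deg, -max_tilt_deg, max_tilt_deg) / max_tilt_deg + 1) / 2 * width
    y = (clamp(pitch_deg, -max_tilt_deg, max_tilt_deg) / max_tilt_deg + 1) / 2 * height
    return x, y

print(tilt_to_cursor(0.0, 0.0))     # centre of the keyboard
print(tilt_to_cursor(15.0, -30.0))  # tilted forward and fully to the left
```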

WatchMI [MobileHCI'16]


We present WatchMI (Watch Movement Input), which enhances touch interaction on a smartwatch to support continuous pressure touch, twist, and pan gestures, as well as their combinations.

Video
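
As a rough sketch of how watch movement might be turned into continuous input while the screen is touched, the snippet below integrates gyroscope rates into a twist angle and pan offsets; the axis conventions, units, and sampling rate are assumptions rather than the system's actual signal processing.

```python
# Integrate watch IMU readings into twist / pan values while a touch is held.
from dataclasses import dataclass

@dataclass
class WatchMIState:
    twist_deg: float = 0.0   # accumulated rotation about the screen normal
    pan_x: float = 0.0       # accumulated sideways tilt, arbitrary units
    pan_y: float = 0.0

def update(state, gyro_z_dps, tilt_x_dps, tilt_y_dps, dt, touching):
    """Accumulate gyro rates (deg/s) over dt seconds, only while touching."""
    if touching:
        state.twist_deg += gyro_z_dps * dt
        state.pan_x += tilt_x_dps * dt
        state.pan_y += tilt_y_dps * dt
    return state

state = WatchMIState()
for _ in range(50):  # 50 samples at an assumed 100 Hz
    update(state, gyro_z_dps=90.0, tilt_x_dps=10.0, tilt_y_dps=0.0,
           dt=0.01, touching=True)
print(state.twist_deg, state.pan_x)  # -> 45.0 5.0 (approximately)
```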

SpeCam [MobileHCI'17]


SpeCam is a lightweight surface color and material sensing approach for mobile devices that uses only the front-facing camera and the display as a multi-spectral light source.

Video
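
A minimal sketch of a SpeCam-style measurement loop, assuming hypothetical set_display_colour and capture_mean_rgb helpers: the display is filled with a sequence of solid colours and the front camera's mean response under each illuminant is concatenated into a reflectance fingerprint that a classifier could consume.

```python
# Build a SpeCam-style reflectance fingerprint (hypothetical I/O functions).
import numpy as np

ILLUMINANTS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]

def set_display_colour(rgb):
    """Placeholder: fill the phone screen with a solid colour."""
    pass

def capture_mean_rgb():
    """Placeholder: grab a front-camera frame and return its mean (R, G, B)."""
    return np.random.rand(3)  # stand-in for a real capture

def reflectance_fingerprint():
    """Concatenate the camera response under each screen illuminant."""
    samples = []
    for colour in ILLUMINANTS:
        set_display_colour(colour)
        samples.append(capture_mean_rgb())
    return np.concatenate(samples)  # 12-dim feature vector for a classifier

print(reflectance_fingerprint().shape)  # -> (12,)
```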

ItchyNose [ISWC'17]


We propose a sensing technique for detecting finger movements on the nose, using EOG sensors embedded in the frame of a pair of eyeglasses.

Video
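
Purely as an illustration of event detection on a single EOG channel, here is a naive flick detector that thresholds the signal's derivative; the threshold, sampling rate, and refractory period are invented numbers, and the real system's recognition is not reproduced here.

```python
# Naive flick detector on a single EOG channel (thresholds are made up).
import numpy as np

def detect_flicks(eog, fs=100, threshold=150.0, refractory_s=0.3):
    """Return sample indices where the EOG derivative spikes past a threshold."""
    diff = np.abs(np.diff(eog))
    events, last = [], -np.inf
    for i, d in enumerate(diff):
        if d > threshold and (i - last) / fs > refractory_s:
            events.append(i)
            last = i
    return events

# Synthetic trace: flat signal with two step-like artefacts from nose flicks.
eog = np.zeros(500)
eog[120:] += 400.0
eog[350:] -= 400.0
print(detect_flicks(eog))  # -> [119, 349]
```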

Mirror Mirror [CHI'16]


Mirror Mirror is a design system that combines Spatial Augmented Reality with a mirror display. Virtual garments are visible in the mirror reflection as well as on the body.

Video

Telepresence [ICAT-EGVE'15]


A novel mixed reality remote collaboration system that enables a local user to interact and collaborate with a remote user through natural hand motion.

Video

Project Zanzibar [CHI'18]


A flexible mat that locates, uniquely identifies, and communicates with tangible objects placed on its surface, and also senses a user's touch and hover gestures.

Find out more