Hui-Shyong Yeo

3rd year PhD student in Human-Computer Interaction, University of St Andrews, with Prof. Aaron Quigley

About Me

I am a 3rd year PhD student in the SACHI research group, supervised by Prof. Aaron Quigley. Before that, I was a researcher in the UVR Lab at KAIST with Prof. Woontack Woo. My research interests lie in exploring and developing novel interaction techniques that transcend the barrier between humans and computers. In particular, I am interested in topics such as gestural/tangible interaction, mobile/wearable computing, augmented/virtual reality, text entry and pen interaction. Currently, I am focusing on single-handed interaction techniques for my PhD dissertation. Beyond HCI, I am also interested in cloud computing and cloud storage.

Selected Projects

WatchMI [MobileHCI 16]

WatchMI supports pressure touch, twisting and panning gestures on an unmodified smartwatch.

RadarCat [UIST 16]

RadarCat can recognize surface materials and objects using Google Soli, enabling novel sensing and interaction.

SWiM [CHI 17]

SWiM supports single-handed text entry on both large and small devices, using tilt-based shape writing.

SpeCam [MobileHCI 17]

SpeCam can recognize surface materials using only the front-facing camera of a smartphone, enabling context-aware computing.

ItchyNose [ISWC 17]

ItchyNose uses finger movements on the nose to command a wearable computer, allowing users to respond to notifications quickly without distracting nearby colleagues.

Project Zanzibar [CHI 18]

Project Zanzibar is a flexible, portable mat that can sense and track physical objects, identify what they are, and allow you to interact through multi-touch and hover gestures.