I am a 3rd-year PhD student in the SACHI research group, supervised by Prof. Aaron Quigley. Before that, I was a researcher at the UVR Lab, KAIST, with Prof. Woontack Woo.
My research interests lie in exploring and developing novel interaction techniques that transcend the barrier between humans and computers.
In particular, I am interested in topics such as gestural/tangible interaction, mobile/wearable computing, augmented/virtual reality, text entry, and pen interaction.
Currently, I am focusing on single-handed interaction techniques for my PhD dissertation. Beyond HCI, I am also interested in cloud computing and cloud storage.
RadarCat is a small, versatile radar-based system for material and object classification which enables new forms of everyday proximate interaction with digital devices. We demonstrate high accuracy in classifying different types of objects in real time.
We present WatchMI: Watch Movement Input, which enhances touch interaction on a smartwatch to support continuous pressure touch, twist, and pan gestures, as well as their combinations.
An investigation of tilt-based gesture keyboard entry for single-handed text entry on large devices, building on the gesture keyboard (shape writing) paradigm, in which users articulate a word gesture by tilting the device.
A personal T-shirt design system combining Spatial Augmented Reality with a mirror to achieve high fidelity 3D feedback. In this way, augmented graphics are visible on the body, in the reflection, and on the background.
SpeCam is a lightweight surface color and material sensing approach for mobile devices which only uses the front-facing camera and the display as a multi-spectral light source.
Project Zanzibar is a flexible, portable mat that can sense and track physical objects, identify what they are, and allow you to interact through multi-touch and hover gestures.