Jiseong Gu, Jaehyun Han, Seongkook Heo, Sunjun Kim, Sangwon Choi, Geehyuk Lee,
Narae Lee, Woohun Lee

ScreenPad Technology

ScreenPad is a hover-tracking touchpad technology based on optical sensors. Common touchpads can recognize only the positions of touching fingers; ScreenPad, in contrast, senses the position and shape of hands even when they are not in contact with the touchpad surface. This capability extends the input vocabulary and may provide a better user experience. Moreover, because it uses an optical structure, ScreenPad can also sense non-conductive objects, acting like a 2.5D scanner.

ScreenPad's optical hover-tracking structure consists of two components: a set of phototransistors and an LED matrix. The LEDs, which serve as emitters, are turned on one by one in sequence to illuminate the thumb. All the phototransistors, which serve as receivers, are wired in parallel; together they act as a single planar sensor that measures the light reflected from the thumb. With this structure, we could extend the sensing range to a few centimeters above the surface.
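The scan cycle described above can be sketched as follows. This is a minimal illustration, not the actual firmware: the matrix size, function names, and hardware stand-ins are all assumptions. The key idea is that because the phototransistors are wired in parallel, each LED flash yields a single summed brightness value, and one full sweep of the matrix produces a reflectance map whose resolution equals the LED grid.

```python
ROWS, COLS = 8, 12  # assumed LED matrix dimensions (illustrative)

def set_led(row, col, on):
    """Stand-in for driving one LED in the matrix."""
    pass

def read_photosensor():
    """Stand-in for reading the parallel phototransistor bank (one ADC value)."""
    return 0.0

def scan_frame():
    """One sweep: fire each LED in turn and record the summed reflected light."""
    frame = [[0.0] * COLS for _ in range(ROWS)]
    for r in range(ROWS):
        for c in range(COLS):
            set_led(r, c, True)               # illuminate the hand from one point
            frame[r][c] = read_photosensor()  # reflected light seen by all sensors
            set_led(r, c, False)
    return frame
```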
Our first application was RemoteTouch, in which we explored the possibility of touch-screen-like interaction with a remote control in the TV-viewing environment, an indirect control setting. To simulate a direct control environment, we displayed a shadow of the user’s thumb on the screen. The shadow shows the thumb both while it hovers over the touchpad and while it rests on the surface, so users receive continuous visual feedback even when they are not touching the surface. Using this thumb shadow, users can touch the screen, press a button, flick through a cover-flow list, and draw simple strokes.
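One way to render such continuous feedback is to map each cell of the sensed reflectance frame to a shadow opacity: transparent when the thumb is far away, faint while it hovers, and solid once the intensity reaches a touch level. The thresholds and function below are hypothetical, chosen only to illustrate the mapping.

```python
def shadow_alpha(frame, hover_thresh=0.15, touch_thresh=0.85):
    """Map sensed intensities to shadow opacities in [0, 1].

    Below hover_thresh the shadow is invisible; at or above touch_thresh it
    is fully opaque; in between, opacity grows linearly with intensity.
    Threshold values are illustrative assumptions.
    """
    alpha = []
    for row in frame:
        alpha.append([
            0.0 if v < hover_thresh
            else 1.0 if v >= touch_thresh
            else (v - hover_thresh) / (touch_thresh - hover_thresh)
            for v in row
        ])
    return alpha
```

A hovering thumb thus appears as a translucent shadow that darkens as it approaches the surface, giving the user a sense of distance before contact.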
We explored another possibility with ThickPad, which brings the hover-tracking touchpad to a laptop environment. Utilizing the shape-sensing capability of ScreenPad, we designed a new type of gesture: area gestures. Area gestures may be easy and intuitive. For instance, a user may cover the left part of the touchpad with the left hand to switch temporarily to another application and use the right part of the touchpad to manipulate it; removing the left hand from the touchpad returns the user to the main application.
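Detecting such an area gesture can be as simple as checking what fraction of the cells in one region of the sensed frame exceed a reflectance threshold. The sketch below is an assumed detector for the left-half cover gesture; the threshold and coverage ratio are illustrative, not values from ThickPad.

```python
THRESHOLD = 0.5    # assumed reflectance level that counts as "covered"
COVER_RATIO = 0.8  # assumed fraction of cells required to trigger the gesture

def left_half_covered(frame):
    """Return True if most cells in the left half of the frame are covered."""
    half = len(frame[0]) // 2
    covered = sum(1 for row in frame for v in row[:half] if v >= THRESHOLD)
    total = len(frame) * half
    return covered / total >= COVER_RATIO
```

While this predicate holds, the system could route right-half touch input to the temporary application, and switch back when the hand is lifted.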
We are continuously improving the sensing accuracy and trying various form factors. We examined a remote-controller-like form factor in RemoteTouch and a laptop touchpad form factor in ThickPad. We are now examining larger form factors with additional components such as pressure sensors, accelerometers, and feedback actuators.


  • Seongkook Heo, Jaehyun Han, and Geehyuk Lee. 2013.
    Designing Rich Touch Interaction through Proximity and 2.5D Force Sensing Touchpad.
    In Proceedings of OzCHI 2013.
  • Jiseong Gu, Seongkook Heo, Jaehyun Han, Sunjun Kim, and Geehyuk Lee. 2013.
    LongPad: A TouchPad Using the Entire Area below the Keyboard on a Laptop Computer.
    In Proceedings of CHI 2013.
  • Jaehyun Han, Sangwon Choi, Seongkook Heo, and Geehyuk Lee. 2012.
    Optical Touch Sensing Based on Internal Scattering in a Touch Surface.
    Electronics Letters, 48(22).
  • Sangwon Choi, Jiseong Gu, Jaehyun Han, and Geehyuk Lee. 2012.
    Area Gestures for a Laptop Computer Enabled by a Hover-Tracking Touchpad.
    In Proceedings of the 10th Asia Pacific Conference on Computer Human Interaction (APCHI ’12). ACM, New York, NY, USA.
  • Sangwon Choi, Jaehyun Han, Sunjun Kim, Seongkook Heo, and Geehyuk Lee. 2011.
    ThickPad: A Hover-Tracking Touchpad for a Laptop.
    In Adjunct Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology (UIST ’11 Adjunct). ACM, New York, NY, USA.
  • Sangwon Choi, Jaehyun Han, Geehyuk Lee, Narae Lee, and Woohun Lee. 2011.
    RemoteTouch: Touch-Screen-like Interaction in the TV Viewing Environment.
    In Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems (CHI ’11). ACM, New York, NY, USA.


Press Release





Future Pads
ThickPad, UIST 2011
ScreenPad hover tracking technology
RemoteTouch, CHI 2011