Gesture-Based Information Control

Design Concept:

Sharing Information Between Devices Using In-Air Gesture-Based Control.

This project explored coordinating information flows across different devices through spatial, gesture-based controls in a spatially aware environment.

Design Problem

As technology begins to support more advanced and robust gesture detection, more natural ways of interacting with technology are becoming possible. One interaction my team and I believe would work especially well with gesture control is information sharing across multiple devices. Presently, sharing information across devices can be difficult, time-consuming, and tedious, or in some cases nearly impossible.

While an intuitive way to push information between devices with a mouse and keyboard would already be a major improvement, using gestures to directly "grab" and "drop" data files from device to device creates a more natural mapping. By physically performing an action that represents the operation, the user gains a more concrete understanding of what they are doing.


For this project, our team took advantage of a full-room sensor mesh that can track multiple users and electronic devices throughout the room. Using a configurable set of natural gestures, a user can perform gestures in the air to tell their computer to carry out certain operations. The main operation we focused on is the "grab/drop (or throw)" gesture for "grabbing" files from one device and "dropping" them onto another. The receiving device knows how to handle the specific type of file and acts upon it accordingly (for example, sharing a video file with a TV begins playing the file automatically). This one simple gesture opens up many new possibilities, not only making it simpler to share information between devices but also aiding collaboration between groups.
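To make the grab/drop flow concrete, here is a minimal sketch of how it might work in code: a session tracks what a user is "holding" between the grab and the drop, and the receiving device chooses an action based on the file type. All class names, device names, and type-to-action mappings here are illustrative assumptions, not our actual implementation.

```python
# Sketch of the grab/drop interaction: grab a file on one device,
# drop it onto another, and let the target act on the file type.
import os

# Assumed per-device actions keyed by file extension (illustrative).
TYPE_ACTIONS = {
    ".mp4": "play",     # e.g. a TV begins playback automatically
    ".jpg": "display",
    ".pdf": "open",
}

class GestureSession:
    """Tracks what a user is 'holding' between a grab and a drop."""

    def __init__(self):
        self.held_file = None

    def grab(self, path: str) -> None:
        # The user's hand now "holds" this file.
        self.held_file = path

    def drop(self, target_device: str) -> str:
        # Dropping with an empty hand does nothing.
        if self.held_file is None:
            return "nothing to drop"
        ext = os.path.splitext(self.held_file)[1].lower()
        action = TYPE_ACTIONS.get(ext, "store")  # default: just save the file
        result = f"{target_device}: {action} {self.held_file}"
        self.held_file = None  # the hand is empty again after the drop
        return result

session = GestureSession()
session.grab("vacation.mp4")
print(session.drop("living-room-tv"))  # living-room-tv: play vacation.mp4
```

The key design choice this sketch reflects is that the *receiving* device, not the sender, decides what to do with the file, so the same gesture produces sensible, device-appropriate behavior everywhere.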


  • Ubiquitous information sharing across devices: we want to make sharing your data across your devices simple and invisible, so that technology gets out of the way.
  • Potential gestures to investigate: swipe, grab, tap, slide, push (roughly the opposite of the grab, or a full-hand stop motion), gestures with different numbers of digits, throwing/tossing, pigtail/loop/cursive ‘e’.
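Since the gesture set is meant to be configurable, one simple way to picture that is a user-editable mapping from recognized gestures to operations. The gesture names below come from our candidate list; the operation names and the fallback behavior are assumptions for the sake of the sketch.

```python
# Illustrative user-configurable binding of gestures to operations.
GESTURE_BINDINGS = {
    "grab":  "pick_up_file",
    "drop":  "transfer_file",
    "throw": "transfer_file",   # throw/toss as an alternative to drop
    "swipe": "next_item",
    "push":  "cancel",
}

def operation_for(gesture: str) -> str:
    """Resolve a recognized gesture to the operation it triggers."""
    # Gestures the user has not bound are simply ignored.
    return GESTURE_BINDINGS.get(gesture, "ignore")

print(operation_for("grab"))     # pick_up_file
print(operation_for("pigtail"))  # unbound gesture: ignore
```

Keeping the bindings in plain data like this means users could rebind or disable gestures without any change to the recognition layer.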


Here are a few examples of the interaction and gesture based control concepts we developed for this project.

Tools Used:

  • Digital Camera
  • Sketching
  • iPad camera and image tracing app
  • Leap Motion (used for gesture testing)
  • Google Docs and Drawing software
  • Keynote