Active Reading Gestures was a project for my seminar on Novel Interaction Techniques. Building on prior research by my partner, we prototyped a system based on the natural gestures people make when reading and manipulating documents, designed to cooperate with the ways people already interact with them. I prototyped the system as a video, which was shown as a supplement alongside a presentation. A more detailed description of the system is available in our paper.
The second part of this project involved creating a system that incorporated this work...
Read# (pronounced "Read Sharp") is a new system for document interaction and active reading. Read# supports digital interactions synchronously with paper: a computer is invisibly integrated into the reading environment, where it naturally tracks and records a reader's gestures while minimizing the deliberate actions the reader must take to annotate and comprehend a text. Read# aims to bring the advantages of digital documents to physical ones. A video demo is available on YouTube, and the research paper is available here.