“Multi-modal Human-Computer Interaction” by Eric Rombokas, Ph.D. Postdoctoral Researcher, Department of Biology, University of Washington
Abstract: New sensory devices are enabling a fundamental shift in how we interact with technology, bringing interaction out of the screen and into the world using movements of the hands, face, and body. Cameras have long been explored as a means for this, and a revolution in depth-sensing cameras is making camera-based interaction better than ever. I will argue that we can go even further by exploiting other sensors simultaneously. We humans are naturally multi-modal, seamlessly blending the senses as we move, so it is natural to sense body movement not just visually, but also through the action of our muscles. I am working on sensing muscle activity through electromyography (EMG) to complement and extend gesture computing. This information provides an ideal counterpoint to the limitations of imaging, enabling richer physical interaction and potentially new insights into human movement and sensation.
“Sparse decision making: how to classify using very few sensors” by Bingni Brunton, Ph.D. Postdoctoral Researcher, Departments of Biology and Applied Mathematics, University of Washington
Abstract: I will speak about our recent work developing an algorithm that harnesses enhanced sparsity: the orders-of-magnitude reduction in the number of measurements required to classify a signal compared with reconstructing it. This sparse-sensors algorithm offers one answer to the question: given a fixed budget of sensors, where should they be placed to optimally inform decision making? I will argue that this perspective has applications to the development of sensory-motor neural technology, as well as to our understanding of how biological organisms process information about a complex environment.
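To make the "fixed budget of sensors" question concrete, here is a toy sketch of sensor selection for classification. This is not Dr. Brunton's sparse-sensors algorithm; it is a much simpler stand-in that ranks each candidate sensor by a per-coordinate Fisher discriminant score and keeps the top few, illustrating that a handful of well-placed measurements can suffice for classification even when full reconstruction would need many more. All data, dimensions, and the planted "informative" coordinates are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two classes of 100-dimensional signals that differ only in
# a handful of coordinates (the hypothetical "informative" sensors).
n_sensors, n_per_class = 100, 200
informative = [7, 23, 61]                    # planted ground truth (hypothetical)
X0 = rng.normal(0.0, 1.0, (n_per_class, n_sensors))
X1 = rng.normal(0.0, 1.0, (n_per_class, n_sensors))
X1[:, informative] += 2.0                    # class 1 is shifted on these sensors

def fisher_scores(Xa, Xb):
    """Per-sensor Fisher discriminant ratio: squared between-class mean
    separation divided by the summed within-class variances."""
    mu_a, mu_b = Xa.mean(axis=0), Xb.mean(axis=0)
    var_a, var_b = Xa.var(axis=0), Xb.var(axis=0)
    return (mu_a - mu_b) ** 2 / (var_a + var_b + 1e-12)

def select_sensors(Xa, Xb, budget):
    """Given a fixed budget of sensors, keep the most discriminative ones."""
    scores = fisher_scores(Xa, Xb)
    return np.sort(np.argsort(scores)[-budget:])

chosen = select_sensors(X0, X1, budget=3)
print(chosen)  # with this seed, recovers the planted informative coordinates
```

A greedy score like this ignores correlations between sensors; the appeal of sparsity-based formulations is that they optimize the sensor set jointly rather than one coordinate at a time.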