Tutorials

Title:
Computational Intelligence Based Human-Machine Interfaces – from Large Data Sets to 3D Touch and Thought Control

Tutorial Organizer:
Milos Manic, misko@ieee.org

Technical Outline of the tutorial
Technological advances in how we interact with machines, computers, smartphones, and tablets have introduced interaction paradigms that were inconceivable a decade ago. Humans command smartphones and tablets via touch, gestures, and voice; vehicles communicate with drivers through tactile feedback; and motion/depth-sensing devices enable interactive gaming across the Internet. This tutorial will illustrate projects ranging from commercially available devices (the Novint Falcon 3D touch device, the Emotiv EEG neuroheadset) to more expensive immersive visualization environments (Computer-Assisted Virtual Environments, CAVEs). It will also demonstrate how augmented reality (visual/tactile) simulators, such as a driving simulator, can be built. Examples drawn from funded projects will include:
·    Immersive visualization interaction – 3D visual and interactive data mining,
·    Brain activity monitoring – control of mobile robots via thought,
·    3D force-feedback tactile HMI – “drive by touch” teleoperation (control of mobile robots, remote welding of nuclear waste),
·    Augmented reality HMIs – tactile, immersive visualization driving simulators,
·    Rich displays – OpenGL-based, roll-up visualization on tablets (building energy management systems),
·    Depth perception input devices (Kinect) – motion recognition (a minimal code sketch of this idea follows the list).
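To give a concrete flavor of the hands-on material, the sketch below shows the kind of minimal motion-recognition logic used with depth-sensing input: classifying horizontal hand swipes from a stream of 3D hand positions, such as a Kinect skeleton tracker reports. It is an illustrative example only, not code from the tutorial; the class name and thresholds are hypothetical, and it uses only the Python standard library so it runs without any device attached.

# Illustrative sketch: classify left/right hand swipes from a stream of
# 3D hand positions (x, y, z, in meters). In a live demo these samples
# would come from the device SDK; here they are synthetic.
from collections import deque

class SwipeDetector:                         # hypothetical helper class
    def __init__(self, window=15, min_travel=0.25, max_drift=0.10):
        self.window = deque(maxlen=window)   # recent hand positions
        self.min_travel = min_travel         # required x travel (meters)
        self.max_drift = max_drift           # allowed y/z wobble (meters)

    def update(self, x, y, z):
        """Feed one sample; return 'left', 'right', or None."""
        self.window.append((x, y, z))
        if len(self.window) < self.window.maxlen:
            return None                      # not enough history yet
        xs, ys, zs = zip(*self.window)
        travel = xs[-1] - xs[0]              # net horizontal motion
        drifted = (max(ys) - min(ys) > self.max_drift or
                   max(zs) - min(zs) > self.max_drift)
        if abs(travel) >= self.min_travel and not drifted:
            self.window.clear()              # avoid immediate re-trigger
            return 'right' if travel > 0 else 'left'
        return None

# Usage: feed synthetic samples simulating a steady rightward swipe.
detector = SwipeDetector()
for i in range(20):
    gesture = detector.update(x=0.02 * i, y=0.0, z=1.5)
    if gesture:
        print('Swipe detected:', gesture)

The same sliding-window pattern generalizes naturally to other streaming inputs, such as force-feedback or EEG samples.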

Prerequisites
None

Target audience
Researchers and practitioners interested in utilizing low-cost, off-the-shelf HMIs, with applications in energy, rehabilitation, robotics, and gaming.