A Dynamic 3D Performance Space for Control of Virtual Musical Instruments (Note)

Andrew Hollenbach, Matthew Cox, Joe Geigel

The musician performs on the interactive projection stage.

Active Pathways: Using Active Tangibles and Interactive Tabletops for Collaborative Modeling in Systems Biology

Meghna Mehta, Ahmed Sabbir Arif, Apurva Gupta, Sean DeLong, Roozbeh Manshaei, Graceline Williams, Manasvi Lalwani, Sanjay Chandrasekharan, Ali Mazalek

Active Pathways is an active tangible and tabletop system that aims to support collaborative discovery and learning in biochemical modeling.

AnyLight: Programmable Ambient Illumination via Computational Light Fields

Yuichiro Takeuchi, Shunichi Suwa, Kunihiko Nagamine
We describe AnyLight, a "true" programmable lighting system based on the principle of integral imaging, which can change the way we think about artificial lighting.

Are Kindergarten Children Ready for Indirect Drag Interactions? (Note)

Vicente Nacher, Alfredo Ferreira, Javier Jaen, Fernando Garcia-Sanjuan

Kindergarten child performing the accelerometer task.

Ballumiere: Real-Time Tracking and Spherical Projection for High-Speed Moving Balls (Note)

Shio Miyafuji, Masato Sugasaki, Hideki Koike

We propose a new prediction method and a system for real-time tracking and spherical projection onto high-speed moving balls. This is an example of projection onto a ball thrown in the air.

CapCam: Enabling Rapid, Ad-Hoc, Position-Tracked Interactions Between Devices

Robert Xiao, Scott Hudson, Chris Harrison
We present CapCam, a system providing rapid, ad-hoc pairing between a smartphone and a capacitive touchscreen, enabling rich, zero-configuration, positionally tracked interactions between the phone and screen.

ControllAR: Appropriation of Visual Feedback on Control Surfaces (Note)

Florent Berthaut, Alex James Jones

ControllAR facilitates the appropriation of visual feedback on control surfaces. Here elements of the GUI of a music production application have been remixed and placed over the sensors of a MIDI controller.

CurationSpace: Cross-Device Content Curation Using Instrumental Interaction

Frederik Brudy, Steven Houben, Nicolai Marquardt, Yvonne Rogers

Based on the concept of Instrumental Interaction, CurationSpace allows users to interact with digital curation artefacts on shared interactive surfaces using personal smartwatches as selectors for instruments and content.

Differentiating in-Game Frustration from at-Game Frustration using Touch Pressure

Matthew K Miller, Regan L Mandryk
We conducted an experiment to determine whether touch pressure from game interaction can differentiate between motivating in-game frustration and disheartening at-game frustration. We can!

DIRECT: Making Touch Tracking on Ordinary Surfaces Practical with Hybrid Depth-Infrared Sensing

Robert Xiao, Scott Hudson, Chris Harrison
We present DIRECT, a touch tracking approach which achieves high accuracy using depth and infrared data from a commodity depth camera.

Effect of Touch Latency on Elementary vs. Bimanual Composite Tasks (Note)

Elie Cattan, Amélie Rochet-Capellan, François Bérard
Latency has been shown to have a negative influence on direct touch interaction in elementary tasks. We go one step further and show similar effects on a composite task using bimanual interaction.

EV-Pen: Leveraging Electrovibration Haptic Feedback in Pen Interaction

Qinglong Wang, Xiangshi Ren, Sayan Sarcar, Xiaoying Sun
We present an Electrovibration Pen (EV-Pen) that leverages electrovibration technology in pen interactions, providing a pen-on-paper feeling and supporting precise interaction.

Expanding Selection for Information Visualization Systems on Tablet Devices

Ramik Sadana, John T Stasko

A clutch-based modifier is introduced for augmenting standard multi-touch gestures to perform advanced selection within a tablet-based data visualization system.

Felines, Foragers, and Physicists: Supporting Scientific Outreach with Multi-Surface and Multi-Space Games

Victor Cheung, James R Wallace

We describe the design, development, and deployment of two scientific outreach games: Quantum Cats and Alice and Schrodinger's Excellent Adventure.

GlassHands: Interaction Around Unmodified Mobile Devices Using Sunglasses

Jens Grubert, Eyal Ofek, Michel Pahud, Matthias Kranz, Dieter Schmalstieg

GlassHands extends the input space around unmodified mobile devices using reflections in glasses.

In-Place-Ink: Toward More Direct Handwriting Interfaces

Jiseong Gu, Geehyuk Lee
We explored the potential of In-Place-Ink pen interfaces, where a user writes directly on a target text input area.

Ketsuro-Graffiti: an Interactive Display with Water Condensation (Note)

Yuki Tsujimoto, Yuichi Itoh, Takao Onoye


LayerFish: Bimanual Layering with a Fisheye In-Place

Andrew M. Webb, Andruid Kerne, Zach Brown, Jun-Hyun Kim, Elizabeth Kellogg

Example of bimanual interaction in LayerFish. The left-hand touch selects and holds a corresponding object in place, while the right hand drags to scroll the fisheye scene index.

Less Is More: Efficient Back-of-Device Tap Input Detection Using Built-in Smartphone Sensors (Note)

Emilio Granell, Luis A. Leiva

Back-of-device (BoD) interaction using a regular (unmodified) smartphone.

Miners: Communication and Awareness in Collaborative Gaming at an Interactive Display Wall (Note)

Ulrich von Zadow, Daniel Bösel, Duc Dung Dam, Patrick Reipschläger, Anke Lehmann, Raimund Dachselt

We present Miners, a collaborative game for a display wall using bimanual tangible+touch interaction. Our exploratory study finds high enjoyment and engagement, but also awareness and occlusion issues.

MultiLens: Fluent Interaction with Multi-Functional Multi-Touch Lenses for Information Visualization

Ulrike Kister, Patrick Reipschläger, Raimund Dachselt

We propose MultiLens, touch-enabled magic lenses including a widget-based approach with a novel drag-snap slider and continuous gestures to fluently manipulate multiple lens functions, parameters and the combination of lenses.

Multimodal Segmentation on a Large Interactive Tabletop: Extending Interaction on Horizontal Surfaces with Gaze

Joshua Newn, Eduardo Velloso, Marcus Carter, Frank Vetere
We propose two solutions for the design of interactive systems that utilise remote gaze-tracking on the tabletop: multimodal segmentation and X-Gaze, our novel technique for interacting with out-of-reach objects.

Physical-Digital Privacy Interfaces for Mixed Reality Collaboration: An Exploratory Study

Mohamad H Salimian, Derek Reilly, Stephen Brooks, Bonnie MacKay
We present a study exploring privacy behaviours in mixed reality collaborative environments. We consider two scenarios involving hiding and sharing blended physical-virtual documents around a tabletop, under two vertical display conditions.

Rapid Command Selection on Multi-Touch Tablets with Single-Handed HandMark Menus

Md. Sami Uddin, Carl Gutwin

Rapid selection with HM-Finger technique by chunking two sequential actions.

RootCap: Touch Detection on Multi-electrodes using Single-line Connected Capacitive Sensing

Masaya Tsuruta, Shuta Nakamae, Buntarou Shizuki


ShadowHands: High-Fidelity Remote Hand Gesture Visualization using a Hand Tracker

Erroll Wood, Jonathan Taylor, John Fogarty, Andrew Fitzgibbon, Jamie Shotton

ShadowHands: a novel technique for visualizing a remote user's hand gestures using a single depth sensor and a hand-tracking system.

Shared Interactive Music Experiences in Public Spaces: User Engagement and Motivations

Maximilian Müller, Nuno Otero, Marcelo Milrad
In two in-the-wild studies we identified specific (group) interactions with a collaborative music player. We propose features for further engagement with such systems and new ways of social interaction.

Talaria: Continuous Drag & Drop on a Wall Display (Note)

Hanae Rateau, Yosra Rekik, Laurent Grisoni, Joaquim Jorge
Example of a one-handed Take-off gesture: dragging starts on the screen and finishes in midair. Dragging continues as long as the user does not drop the dragged object by touching the screen with the non-dominant hand (NDH).

TapSkin: Recognizing On-Skin Input for Smartwatches

Cheng Zhang, AbdelKareem Bedri, Gabriel Reyes, Bailey Bercik, Omer T Inan, Thad E Starner, Gregory D Abowd

TapSkin Gestures

The Effect of Visual Distractors in Peripheral Vision on User Performance in Large Display Wall Systems

Anton Sigitov, Ernst Kruijff, Christina Trepkowski, Oliver Staadt, André Hinkenjann
We report on two studies that investigated the effect of task-irrelevant distractors in peripheral vision for the user working on a large display. We apply the results to mixed-focus collaboration.

Towards Road Traffic Management with Forecasting on Wall Displays

Arnaud Prouzeau, Anastasia Bezerianos, Olivier Chapuis

An operator checking the impact of an action on the traffic, only for a specific local area, with two simulations shown in a DragMagic.

UD Co-Spaces: A Table-Centred Multi-Display Environment for Public Engagement in Urban Design Charrettes

Narges Mahyar, Kelly J Burke, Jialiang (Ernest) Xiang, Siyi (Cathy) Meng, Kellogg S Booth, Cynthia L Girling, Ronald W Kellett

UD Co-Spaces (Urban Design Collaborative Spaces) is an integrated, tabletop-centered multi-display environment for engaging the public in the complex process of collaborative urban design.

UI Testing Cross-Device Applications

Maria Husmann, Michael Spiegel, Alfonso Murolo, Moira Norrie
We present XD-Testing, a library for verifying that applications distribute correctly across specific combinations of devices and that they behave as expected even when distributed across arbitrary combinations of devices.