• Platinum Pass
  • Full Conference Pass
  • Full Conference One-Day Pass
  • Basic Conference Pass
  • Student One-Day Pass
  • Experience Pass
  • Exhibitor Pass

Date/Time:
18 – 19 November 2019, 10:00 – 18:00
20 November 2019, 10:00 – 16:00
Venue: Great Hall 3&4 - Experience Hall (Foyer Level, Merivale St)


[CURATED] An Integrated 6DoF Video Camera and System Design

Description: Designing a fully integrated 360° video camera supporting 6DoF head motion parallax requires overcoming many technical hurdles, including camera placement, optical design, sensor resolution, system calibration, real-time video capture, depth reconstruction, and real-time novel view synthesis. While there is a large body of work describing various system components, such as multi-view depth estimation, our demo is the first to show a complete, reproducible system that considers the challenges arising when designing, building, and deploying a full end-to-end 6DoF video camera and playback environment. Our system includes a computational imaging software pipeline supporting online markerless calibration, high-quality reconstruction, and real-time streaming and rendering. Most of our exposition is based on a professional 16-camera configuration, which will be commercially available to film producers. However, our software pipeline is generic and can handle a variety of camera geometries and configurations. The entire calibration and reconstruction software pipeline, along with example datasets, is open-sourced to encourage follow-up research in high-quality 6DoF video reconstruction and rendering.

360Drops: Mixed Reality Remote Collaboration using 360 Panorama in 3D Scene

Description: 360Drops is a Mixed Reality remote collaboration system that enables a remote user to create multiple 360 panoramas in a 3D scene, which can be accessed and interacted with in different ways to communicate with a local user. This research investigates enhancements in performance and technique for remote collaboration.

AlteredWind: Manipulating Perceived Direction of the Wind by Cross-Modal Presentation of Visual, Audio, and Wind Stimuli

Description: We developed AlteredWind, a multisensory wind display system that manipulates users' perception of wind by integrating visual, audio, and wind stimuli. AlteredWind presents images of flowing particles and three-dimensional (3D) wind sounds together with the wind to induce visuo-haptic and audio-haptic cross-modal effects.

Biofeedback Interactive VR System Using Biological Information Measurement HMD

Description: We propose a biofeedback interactive VR system in which VR experiences change interactively according to the user's biological information, or the emotion estimated from it. To realize this system, we developed a respiration and pulse-wave measurement device that can easily be attached to various HMDs.

Brobdingnagian Glass: A Micro-Stereoscopic Telexistence System

Description: We propose a system that realizes the binocular perspective of a miniature human using a vibrating hemispherical mirror and a camera, in order to remove the lower limit of realizable scale. We reproduced binocular stereovision with an interpupillary distance of 1.72 mm, about 1/38 the scale of a human.

Co-Limbs: An Intuitive Collaborative Control for Wearable Robotic Arms

Description: We present 'Co-Limbs', an intuitive collaborative-control user interface for wearable robotic arms.

Demonstration of ThermAirGlove: A Pneumatic Glove for Material Perception in Virtual Reality through Thermal and Force Feedback

Description: We demonstrate ThermAirGlove (TAGlove), a pneumatic glove that provides concurrent on-hand thermal and force feedback by controlling the volume and temperature of the air pumped into airbags embedded in the glove. TAGlove can generate the thermal cues of different materials (e.g., copper, glass, urethane) and support users' material identification in VR.

Enhancing Suspension Activities in Virtual Reality with Body-Scale Kinesthetic Force Feedbacks

Description: This work presents a suspension kit that can suggest a range of body postures and thus enables various exercise styles for users in virtual reality. Users immersed in an exercise experience perceive active kinesthetic force feedback produced by the kit as they suspend their weight with arm exertion.

Hanger Drive: Driver Manipulation System for Self-Balancing Transporters Using the Hanger Reflex Haptic Illusion

Description: Self-balancing transporters are becoming popular as a medium-distance means of transportation for uses such as police patrols and sightseeing tours, and they are expected to gain further prevalence. In this study, we indirectly control the driving direction of a self-balancing transporter by controlling the motion of the user riding the vehicle.

Hapballoon: Wearable Haptic Balloon-Based Feedback Device

Description: The Hapballoon is a lightweight wearable device that can present various haptic sensations. It can present force feedback, especially for pinching and gripping objects, as well as temperature and vibration information to enhance the material feeling of the VR world.

Haptiple: A Wearable, Modular and Multiple Haptic Feedback System for Embodied Interaction

Description: We propose a multiple-haptic-feedback system called Haptiple, a wearable and modular system for embodied interactions based on a wireless platform. The system consists of vibrotactile, pressure, and thermal/wind modules that can be placed on multiple body parts such as the hand, wrist, ankle, and chest.

IlluminatedFocus: Vision Augmentation using Spatial Defocusing

Description: We propose IlluminatedFocus, AR glasses enabling depth-independent spatial defocusing of human vision. We show a system that switches between focused and defocused views independently in each area. We realize various vision-augmentation applications based on our method to show its potential to expand the application field of optical see-through AR.

Inclination Manipulator: Pitch Redirected Walking using Haptic Cues

Description: "Inclination Manipulator" manipulates the perception of spatial inclination by bridging the gap between the senses in the real world (somatosensory) and the VR space (vision), using a haptic cue to generate a rotational force component on the body. This enables virtually walking up a slope while on a flat surface in the real world.

Levitar: Real Space Interaction through Mid-Air CG Avatar

Description: We propose a system for becoming a CG avatar in real space by using mid-air imaging technology. The video captured from the mid-air image's position is presented to the user via the HMD, and the camera's gaze direction is synchronized with the user's head movement.

Licker: A Tongue Robot for Representing Realistic Tongue Motions

Description: We present Licker, a flexible tongue robot capable of mimicking human tongue motion. The aim of this robot is to foster social bonding, regardless of species, by licking.

Light'em: A Multiplexed Lighting System

Description: "Light'em" realizes multiplexing of lighting environments using an active-shutter system. The effect of visual stimuli on the desirability of an indoor environment varies for each individual, yet individuals' desired environments collide in a shared space. We propose a lighting-environment multiplexing system that simultaneously provides multiple, independently controllable lighting environments.

NavigaTorch: Projection-based Robot Control Interface using High-speed Handheld Projector

Description: We propose "NavigaTorch", a projection-based robot control interface that enables the user to operate a robot quickly and intuitively. The user can control and navigate a robot from a third-person viewpoint, using the projected video as visual feedback, by means of a handheld pixel-level visible light communication (PVLC) projector.

PinpointFly: An Egocentric Position-pointing Drone Interface using Mobile AR

Description: We propose PinpointFly, a novel AR-based egocentric drone-manipulation interface that increases spatial perception and manipulation accuracy by overlaying a cast shadow on the ground. We designed and implemented a proof-of-concept prototype using a motion tracking system, see-through AR techniques, and a programmable drone.

Polyvision: 4D Space Manipulation through Multiple Projections

Description: 4D space has always been a source of imagination and intellectual activity for human beings. With our novel visualization technique based on multiple 3D projections created in VR, Polyvision equips us with a sense of 4D, letting us explore high-dimensional data and mathematical objects in a totally new way.

PortOn: Portable mid-air imaging optical system on glossy materials

Description: PortOn is a portable optical system that forms mid-air images standing on a glossy surface such as a table or the floor. When placed on a flat surface, PortOn projects light to form a mid-air image at a position that is easy for the viewer to see.

ReFriction: Remote friction control on polystyrene foam by ultrasound phased array

Description: We propose a system that can remotely change the frictional feel of a polystyrene foam surface using an Airborne Ultrasound Tactile Display (AUTD). Since no device is needed on the target surface, the tactile display could be large or disposable. Furthermore, changing the surface friction of three-dimensional objects is also expected to be possible.

Simple is Vest: High-Density Tactile Vest that Realizes Tactile Transfer of Fingers

Description: We developed a high-density tactile vest that adopts 144 individually actuated eccentric-mass vibration motors and five Peltier elements, presenting the haptic sensations of the five fingertips to the back rather than to the fingertips, as a new haptic presentation method for objects in a virtual reality (VR) environment.

StickyTouch: An Adhesion Changeable Surface

Description: We propose StickyTouch, a novel tactile display that represents adhesive information on a surface. Adhesion control is achieved by a temperature-sensitive adhesive sheet whose temperature is locally controlled by Peltier devices arranged in a grid. We implemented a proof-of-concept prototype and propose example applications of the display.

SwarmCloak: Landing of a Swarm of Nano-Quadrotors on Human Arms

Description: SwarmCloak is a novel system for landing a fleet of four flying robots on a human's hands and forearms using light-sensitive landing pads with vibrotactile feedback. Two types of tactile displays with vibromotors are activated by the light emitted from an LED array at the bottom of each quadcopter.

Synesthesia Wear: Full-body haptic clothing interface based on two-dimensional signal transmission

Description: Synesthesia Wear is a novel full-body haptic interface that provides vibrotactile sensations to the whole body while allowing free movement in space, using two-dimensional signal transmission technology. We also demonstrate a spatial computing application of Synesthesia Wear as a future use in Mixed Reality.

Three-dimensional Interaction Technique Using an Acoustically Manipulated Balloon

Description: We use an acoustically manipulated balloon as a visual and tangible interface for the representation of a mid-air virtual object in a full-body AR/MR environment. The user can manipulate the physical balloon by manipulating the corresponding virtual object, and conversely, can manipulate the virtual object by physically manipulating the balloon.

TwinCam Go: Proposal of Vehicle-Ride Sensation Sharing with Stereoscopic 3D Visual Perception and Vibro-Vestibular Feedback for Immersive Remote Collaboration

Description: We propose and have developed a prototype of a vehicle-ride sensation-sharing system that enables a rider to remotely collaborate with a driver while receiving both 3D visual perception and vibro-vestibular sensation. A remote rider can collaborate with the driver via voice communication and perceive motion sensation via a wheelchair's movement.
