SIGGRAPH Asia 2019 award winners from various programs, recognizing individuals who have made exceptional contributions to their communities through research, teaching, service, or writing.


Computer Animation Festival Award Winners
BEST IN SHOW AWARD – Kids

Creator(s): Michael Frei, Mario von Rickenbach

Studio/Organization Affiliation(s): Some Shorts, Playables

Country(s) of Origin: Netherlands, Switzerland

Synopsis: KIDS is a game of crowds. The project consists of a short film, an interactive animation and an art installation. How do we define ourselves when we are all equal? Who is steering the crowd? What if it is heading in the wrong direction? Where does the individual end and the group begin? What is done by choice, and what under duress? KIDS was made using traditional 2D hand-drawn line animation in black and white. The animation was assembled, composited and choreographed using a game engine with a custom-made animation system in conjunction with physics simulations. The characters in a crowd behave much like matter: They attract and repel, lead and follow, grow and shrink, align and separate. They are purely defined by how they relate to one another, without showing any distinguishable features. KIDS is the second collaboration between filmmaker Michael Frei and game designer Mario von Rickenbach, after their project PLUG & PLAY. The project is co-produced by Playables, SRG SSR and Arte. The app is published by Double Fine Presents for mobile devices and computers.


JURY SPECIAL AWARD – Spring

Creator(s): Andy Goralczyk, Francesco

Studio/Organization Affiliation(s): Blender

Country(s) of Origin: Netherlands, Switzerland

Synopsis: Spring is the story of a shepherd girl and her dog, who face ancient spirits in order to continue the cycle of life. This poetic and visually stunning short film was written and directed by Andy Goralczyk, inspired by his childhood in the mountains of Germany. The Spring team used the development version of Blender 2.80 for the entire production, even before the software entered official beta. As with all of Blender's Open Movies, the entire production process and all its source files are shared on the production platform Blender Cloud.


BEST STUDENT FILM AWARD - The Ostrich Politic

Creator(s): Mohammad, Moïra

Studio/Organization Affiliation(s): Miyu Distribution

Country(s) of Origin: France

Synopsis: Ostriches carry on their daily activities with their heads buried, believing it is instinctive behavior. One day, however, research by phylogeneticist Dr. Kays proves otherwise.

Back to Top


Emerging Technologies Award Winners
BEST DEMO VOTED BY ATTENDEES - Enhancing Suspension Activities in Virtual Reality with Body-Scale Kinesthetic Force Feedbacks

Presenter(s): Yuan-Syun Ye, National Chiao Tung University, Institute of Multimedia Engineering, Taiwan
Hsin-Yu Chen, National Chiao Tung University, Taiwan
Liwei Chan, National Chiao Tung University, Taiwan

Description: This work presents a suspension kit that can suggest a range of body postures and thus enables various exercise styles of users in virtual reality. Users immersed in an exercise experience perceive active kinesthetic force feedback produced by the kit via suspending their weight with arm exertion.


HONORABLE MENTIONS
Hanger Drive: Driver Manipulation System for Self-balancing Transporter Using the Hanger Reflex Haptic Illusion

Presenter(s): Masato Kobayashi, The University of Electro-Communications, Japan
Yuki Kon, The University of Electro-Communications, Japan
Jianyao Zhang, The University of Electro-Communications, Japan
Hiroyuki Kajimoto, The University of Electro-Communications, Japan

Description: Self-balancing transporters are becoming popular as a medium-distance means of transportation for uses such as police patrols and sightseeing tours, and are expected to gain further prevalence. In this study, we control the driving direction of a self-balancing transporter indirectly by controlling the motion of the user riding the vehicle.


Licker: A Tongue Robot for Representing Realistic Tongue Motions

Presenter(s): Ryota Shijo, The University of Electro-Communications, Japan
Izumi Mizoguchi, The University of Electro-Communications, Japan

Description: We present Licker, a flexible tongue robot capable of mimicking human tongue motion. The aim of this robot is to foster social bonding, regardless of species, through licking.


BEST DEMO VOTED BY COMMITTEE - StickyTouch: An Adhesion Changeable Surface

Presenter(s): Yoshitaka Ishihara, Osaka University, Japan
Ryo Shirai, Osaka University, Japan
Yuichi Itoh, Osaka University, Japan
Kazuyuki Fujita, Tohoku University, Research Institute of Electrical Communication, Japan
Takao Onoye, Osaka University, Japan

Description: We propose StickyTouch, a novel tactile display that represents adhesive information on a surface. Adhesion control is achieved by a temperature-sensitive adhesive sheet whose temperature is locally controlled by Peltier devices arranged in a grid. We implemented a proof-of-concept prototype and propose example applications of the display.


HONORABLE MENTIONS
Enhancing Suspension Activities in Virtual Reality with Body-Scale Kinesthetic Force Feedbacks

Presenter(s): Yuan-Syun Ye, National Chiao Tung University, Institute of Multimedia Engineering, Taiwan
Hsin-Yu Chen, National Chiao Tung University, Taiwan
Liwei Chan, National Chiao Tung University, Taiwan

Description: This work presents a suspension kit that can suggest a range of body postures and thus enables various exercise styles of users in virtual reality. Users immersed in an exercise experience perceive active kinesthetic force feedback produced by the kit via suspending their weight with arm exertion.


Brobdingnagian Glass: A Micro-Stereoscopic Telexistence System

Presenter(s): Ryo Ito, The University of Tokyo, Japan
Leo Miyashita, The University of Tokyo, Japan
Masatoshi Ishikawa, The University of Tokyo, Japan

Description: We propose a system that realizes the binocular perspective of a miniature human, using a vibrating hemispherical mirror and a camera, in order to remove the lower limit of realizable scale. We reproduced binocular stereovision with an interpupillary distance of 1.72 mm, about 1/38 of human scale.


SwarmCloak: Landing of a Swarm of Nano-Quadrotors on Human Arms

Presenter(s): Evgeny Tsykunov, Skolkovo Institute of Science and Technology, Russia
Ruslan Agishev, Skolkovo Institute of Science and Technology, Russia
Roman Ibrahimov, Skolkovo Institute of Science and Technology, Russia
Luiza Labazanova, Skolkovo Institute of Science and Technology, Russia
Taha Moriyama, The University of Electro-Communications, Japan
Hiroyuki Kajimoto, The University of Electro-Communications, Japan
Dzmitry Tsetserukou, Skolkovo Institute of Science and Technology, Russia

Description: SwarmCloak is a novel system for landing a fleet of four flying robots on the human hands and forearms using light-sensitive landing pads with vibrotactile feedback. Two types of tactile displays with vibromotors are activated by the light emitted from an LED array at the bottom of each quadcopter.

Back to Top


XR Award Winners
BEST XR TECHNOLOGY VOTED BY ATTENDEES - Upload Not Complete

Presenter(s): Chin-Hsiang Hu, Pepperconrs Interactive Media Art Inc., Taiwan
Bing-Hua Tsai, PSquare Media Lab, Taiwan
Zhao-Qing Chang, Pepperconrs Interactive Media Art Inc., Taiwan

Description: Imagine an upload process in which you can see a virtual object in real space. You see the virtual object and feel its influence (wind and vibration); after passing through an upwardly extending tunnel, you enter a completely virtual space, but you never know whether the upload is complete.


HONORABLE MENTIONS
HyperDrum: Interactive Synchronous Drumming in Virtual Reality using Everyday Objects

Speaker(s): Ryo Hajika, The University of Auckland, New Zealand
Kunal Gupta, The University of Auckland, New Zealand
Prasanth Sasikumar, The University of Auckland, New Zealand
Yun Suen Pai, The University of Auckland, New Zealand

Description: HyperDrum leverages cognitive synchronization to create a collaborative music-production experience with immersive visualization in virtual reality. Participants wear an electroencephalography (EEG)-equipped head-mounted display to create music and a VR space together using a physical drum.


[Curated] Beyond the screen - Volumetric Displays from Voxon Photonics

Speaker(s): Ben Weatherall, Voxon Photonics, Australia

Description: Volumetric 3D data is fast becoming the gold standard in 3D interactive entertainment. Advances in volumetric capture technology have enabled entirely new digital experiences, including sport replay, music videos, gaming and advertising. Yet despite the technological advances in 3D content creation, the most common way to view 3D is still on a 2D screen. VR and AR have partially addressed this shortcoming, but the need to wear goggles or headgear creates a barrier between the user and the 3D experience. Voxon is seeking to remove that barrier, and in doing so is enabling human-centric 3D digital experiences that can be viewed from any direction with the naked eye. Using a unique volumetric rendering technology, the Voxon display can bring 3D digital assets into the physical world, and in doing so enable a new type of shared visual experience. To make this display technology a reality, Voxon had to develop the world's fastest real-time light engine, capable of rendering over 3 billion points of light per second. Ben Weatherall, Voxon's Unity Lead Programmer, will talk about Voxon's core technology, how to create content, and some of the unique aspects of volumetric imagery that set it apart from other types of media. He will also discuss important areas for future research in the volumetric display space.


BEST XR CONTENT VOTED BY ATTENDEES - Super Size Hero

Speaker(s): Till Sander-Titgemeyer, Filmakademie Baden-Württemberg, Animationsinstitut, Germany
Jiayan Chen, Filmakademie Baden-Württemberg, Animationsinstitut, Germany
Ramon Schauer, Filmakademie Baden-Württemberg, Animationsinstitut, Germany
Mario Bertsch, Filmakademie Baden-Württemberg, Animationsinstitut, Germany
Sebastian Selg, Filmakademie Baden-Württemberg, Animationsinstitut, Germany
York von Sydow, Filmakademie Baden-Württemberg, Animationsinstitut, Germany
Ihab Al-Azzam, Filmakademie Baden-Württemberg, Animationsinstitut, Germany
Verena Nomura, Filmakademie Baden-Württemberg, Animationsinstitut, Germany

Description: The player takes on the role of an overweight superhero trying to save the day. Wearing a tracked fat-suit, the player has to use their belly to prevent a bank robbery.


HONORABLE MENTIONS
Pumping Life: Embodied Virtual Companion for Enhancing Immersive Experience with Multisensory Feedback

Speaker(s): Jing Yuan Huang, National Taipei University of Technology, Taiwan
Wei Hsuan Hung, National Taipei University of Technology, Taiwan
Tzu Yin Hsu, National Taipei University of Technology, Taiwan
Yi Chun Liao, National Taipei University of Technology, Taiwan
Ping Hsuan Han, National Taipei University of Technology, Taiwan

Description: We present Pumping Life, a dynamic flow system for enhancing a virtual companion with multisensory feedback, which utilizes water pumps and a heater to provide shape deformation and thermal feedback. To showcase interactive gameplay with our system, we deploy it in a teddy bear in a VR game.


Lost City of Mer Virtual Reality Experience

Speaker(s): Gregory W. Bennett, Auckland University of Technology, New Zealand
Liz Canner, Astrea Media, United States of America

Description: Lost City of Mer is a virtual reality experience, combined with a smartphone app, that immerses players in a fantasy undersea civilization devastated by an ecological disaster caused by global warming. Harnessing the empathetic potential of VR, players are given agency over their personal carbon footprint in combating climate change.


Encounters: A Multiparticipant Audiovisual Art Experience with XR

Speaker(s): Ryu Nakagawa, Nagoya City University, Japan
Ken Sonobe, Nagoya City University, Japan

Description: What if we could make sound with physical objects using supernatural powers? We propose a multiparticipant audiovisual art experience using XR, in which participants can fire virtual bullets or beams at physical objects, which then create a sound and a corresponding virtual visual effect.

Back to Top


Real-Time Live! Winner
BEST REAL-TIME LIVE! - The AirSticks: An Instrument for Audio-Visual Performance Through Gesture in Augmented Reality

Presenter(s): Alon Ilsar, SensiLab, Monash University, Australia
Matthew Hughes, University of Technology Sydney, Technische Universität Berlin, Australia

Description: The AirSticks are a gesture-based audio-visual instrument. This latest incarnation combines spatially controlled sound design with a 3D game environment projected onto a transparent screen. This system allows for the composition of highly integrated audio-visual environments superimposed directly onto the performance area.

Bio: Alon Ilsar is an Australian-based drummer, composer, instrument designer and researcher at Monash University's SensiLab. He is the co-designer of a new gestural instrument for electronic percussionists, the AirSticks. Alon is researching the uses of the AirSticks in the field of health and wellbeing, making music creation more accessible to the broader community. Alon holds a PhD in instrument design from the University of Technology Sydney. He has played the AirSticks at Sydney's Vivid Festival, on Triple J's Like a Version and at NYC's MET Museum, in projects including Trigger Happy 'Visualised', The Hour, The Sticks, Tuka (from Thundamentals), Sandy Evans' 'Ahimsa,' Ellen Kirkwood's '[A]part', Kirin J Callinan, Kind of Silence (UK), Cephalon (US) and Silent Spring. He has played drums in Belvoir Theatre's 'Keating! the Musical,' Sydney Theatre Company's 'Mojo,' Meow Meow with the London Philharmonic, Bergen Philharmonic and Sydney Symphony Orchestras, Alan Cumming, Jake Shears and Eddie Perfect.

Matt Hughes is a researcher, artist, programmer and musician whose works dive deep into the connection between sight and sound. Currently a PhD student at UTS' Animal Logic Academy, Matt's research explores the implementation and implications of using augmented reality (AR) in live electronic music performance. His work with Alon Ilsar for the AirSticks won the 'Best Instrument' and 'Best Performance' people's choice awards at the 2019 Guthman New Musical Instrument competition at Georgia Tech.

Back to Top