Flocking around Water

Mar. 2017


Introduction

Flocking around Water is an interactive project built in the AlloSphere, a large-scale facility that creates an immersive environment inside it. The project is an audiovisual composition in which three groups of sound-generating agents flock within the environment and interact with the user. Its goal is to design an immersive user interface that blurs the boundary between interface components, which sit outside the scene's content, and the virtual objects within the scene. The three flocking groups serve not only as components of the visual animation but also as the controllers of the flocking animation and the surrounding sounds, respectively. This project was the final project for MAT 201B, Computing with Media Data, at UCSB.

Advisor: Karl Yerkes

Programming Language: C++

Tools: AlloSphere, PhaseSpace Gloves

Exhibition: MAT End of Year Show 2017, UCSB

This piece creates a virtual world in the AlloSphere containing three groups of flocking agents. Each group differs from the others in flocking behavior, color, and sound. The purplish cones generate the sound of a filtered closed hi-hat, the greenish cubes generate the chirping of crickets, and the small yellow tetrahedrons, which represent "water" in this scenario, move around with the sound of a running brook. The center of each group, its average position in space, is used as the sound-source position for spatial audio.
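The group center described above does double duty: it drives the cohesion rule of the flocking simulation and serves as the spatial sound-source position. A minimal C++ sketch of that idea, with only the cohesion rule shown (separation and alignment omitted for brevity); the `Vec3`, `Agent`, and function names here are illustrative, not the project's actual types (the real code builds on AlloSystem):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Minimal 3-D vector; the real project uses AlloSystem's vector types.
struct Vec3 { float x, y, z; };

Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct Agent {
    Vec3 pos;
    Vec3 vel;
};

// Average position of a group: used both for the cohesion rule and as
// the spatial sound-source position of that group.
Vec3 centroid(const std::vector<Agent>& group) {
    Vec3 sum{0, 0, 0};
    for (const Agent& a : group) sum = add(sum, a.pos);
    return scale(sum, 1.0f / group.size());
}

// One simulation step with only the cohesion rule: each agent steers
// toward the group center, then integrates its position.
void cohesionStep(std::vector<Agent>& group, float gain, float dt) {
    Vec3 c = centroid(group);
    for (Agent& a : group) {
        Vec3 toCenter = add(c, scale(a.pos, -1.0f));  // vector to the center
        a.vel = add(a.vel, scale(toCenter, gain));    // steer toward it
        a.pos = add(a.pos, scale(a.vel, dt));         // integrate position
    }
}
```

In a full boids implementation, separation and alignment forces would be accumulated alongside the cohesion term before integration; the centroid computed each frame would also be handed to the audio spatializer as that group's source position.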

The interaction is realized through a pair of PhaseSpace Gloves, modified by Tim Wood of the AlloSphere Research Group to make them fully functional in the AlloSphere. Pinching the left index finger and left thumb toward an agent picks that agent, and all other agents of the same kind stop flocking and instead seek the picked one's position. Once captured, the picked agent follows the movement of the left hand, trailed by its whole group. Pinching the right index finger and right thumb afterwards drags the picked agent and its group closer to the center of the AlloSphere, which increases the amplitude of that group's sound.
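The pick-and-follow behavior above can be sketched as two steps: a pinch picks the nearest agent within a grab radius, then each frame the picked agent tracks the hand while its group mates seek it instead of flocking. A hedged sketch under those assumptions; the type names, grab radius, and seek rate are placeholders, and the real project reads hand positions from the PhaseSpace glove tracking rather than a plain `Vec3`:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

Vec3 vsub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
float vdist(Vec3 a, Vec3 b) {
    Vec3 d = vsub(a, b);
    return std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
}

struct Agent {
    Vec3 pos;
    bool picked = false;
};

// A left-hand pinch picks the agent closest to the pinch point, as long
// as it lies within the grab radius. Returns -1 if nothing is in reach.
int pickAgent(std::vector<Agent>& group, Vec3 pinchPos, float radius) {
    int best = -1;
    float bestDist = radius;
    for (int i = 0; i < (int)group.size(); ++i) {
        float d = vdist(group[i].pos, pinchPos);
        if (d < bestDist) { bestDist = d; best = i; }
    }
    if (best >= 0) group[best].picked = true;
    return best;
}

// After a pick: the picked agent follows the hand each frame, and every
// other agent in the group seeks the picked agent instead of flocking.
void seekStep(std::vector<Agent>& group, int picked, Vec3 handPos, float rate) {
    if (picked < 0) return;
    group[picked].pos = handPos;  // picked agent tracks the left hand
    for (int i = 0; i < (int)group.size(); ++i) {
        if (i == picked) continue;
        Vec3 toLeader = vsub(group[picked].pos, group[i].pos);
        group[i].pos.x += toLeader.x * rate;  // move a fraction of the
        group[i].pos.y += toLeader.y * rate;  // way toward the leader
        group[i].pos.z += toLeader.z * rate;
    }
}
```

The right-hand pinch described above would then simply translate `handPos` (and with it the whole group) toward the AlloSphere's center, scaling that group's amplitude as the distance shrinks.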

Try to catch an agent.

A cone is captured and dragged toward the center of the AlloSphere, amplifying the group's sound and crowding the whole cone group around the operator and the audience.

Unleash the whole group.

The cone group is captured again and dragged around inside the virtual world.

Because the stereoscopic effect in the AlloSphere is produced by 26 projectors, all connected to a single simulator and projecting onto the sphere-shaped "wall" inside it, viewers must wear a pair of custom-made glasses to see the three-dimensional scene. It is therefore nearly impossible to record a video of the project in the AlloSphere that reproduces the same effect when played back on a regular computer screen. To give some sense of how the project works, I recorded two videos of the piece running on my own computer.


Outside view.

Inside view.






