Visual music is a sensory experience, not a logical experience. Imagine listening to Beethoven and thinking how the French horns sound like your grandmother’s car. That type of thinking uses the wrong part of your brain and takes away your ability to enjoy the musical experience. Instead, we try to sit back and allow the show to become a sensory experience—a treat for our ears without the need to find symbolism in every note and sound.
Imagine a percussion piece. All drums, nothing else. Then the arranger adds in a piano. But, instead of letting the pianist explore the range of the piano, the arranger has the pianist track the drums note for note, beat for beat. Is that creation or a waste of the possibilities of a piano?
Audio music does not translate directly to visual music, any more than a drum composition can dictate a piano solo. Yet there are parallels between many elements of musical composition and visual music. What follows aren't rules but ideas for how to create visual music.
Most visual art is created and then viewed. The painter finishes the painting and then you view it. The filmmaker finishes production and then you view it. This creates a separation between the consciousness of perceiving the visual images and that of creating them, preventing direct communication between the artist and the audience.
We are constructing a language of visual music. This language relies on non-symbolic communication. Rather than relying on past experiences and taught words to help viewers understand our language, we are communicating through feelings and sensations. This language is created anew by the visual musician every time they craft a piece, yet, like every language, ours too has a grammar and a vocabulary.
When we create visual music we aren't creating in 3D—we are creating in 4D. In addition to creating in an up-and-down, left-and-right, back-and-forth world, we are also creating a world that changes over time. What you see and think of as a 3D model is only one slice of time, just as a CAT scan shows 2D slices of a 3D reality.
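The "one slice of time" idea can be made concrete: treat a piece as a 3D scene that is a function of time, where each sample is one slice of the 4D whole. A minimal sketch, using a hypothetical toy scene (a single point drifting along a spiral), not any particular engine's API:

```python
import math

def scene_at(t):
    """Return the toy 3D state (one point's x, y, z position) at time t.

    The full piece is the 4D object: this function over all values of t.
    Any single call is just one 3D 'slice', like one CAT-scan image.
    """
    return (math.cos(t), math.sin(t), 0.1 * t)  # the point spirals upward over time

# Sampling three slices of the 4D piece at successive moments:
slices = [scene_at(t / 10) for t in range(3)]
```

The design point is simply that time is a first-class axis: the artist composes `scene_at` itself, not any single frame it produces.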
Anyone who has put on a head-mounted display (HMD) like the Oculus has probably gotten motion sickness at one point or another. Smart design, however, can create movement without motion sickness in virtual reality.
Provided their HMD is working correctly and getting an adequate frame rate, most people aren’t going to get sick just sitting still in virtual reality.
The sensation of movement in visual music is what gives the viewer a sense of rhythm and the physical sensation of being in the environment. Yet convincing the viewer's brain that he is moving and that the virtual world is standing still is a challenge. For VR developers working with realistic imagery, this isn't a challenge: if the viewer is standing in a room and the walls start to move, his brain is going to assume that he is moving and the walls are standing still.
One of the most fun design considerations for VR is how to work with peripheral vision. Most two-dimensional designs focus on the central 70 degrees of your vision. That central region contains almost all of your cones, the cells that let you see color and give you your visual acuity. Your full field of view, though, is about 180 degrees, and the remaining 110 degrees is your peripheral vision.
Turns out the first TV screens were really small—and the original movie screens weren't much better if you were in the back row. VR environments are anything but small, and what designers and artists need to figure out is how that affects the ways that worlds and scenes need to be developed.
To figure that out we need to take a step back and think about our eyes. The fovea is this tiny part of your retina (less than 1%)