Visual Music Systems

About the OVC-3D

A new OVC for a new era

It’s time for musicians to take control over visual reality. Time to turn dreamworlds into shared realities. Time to liberate the dance from the dancer.

We believe new technologies should be used to open doors to new worlds, not to recreate today’s world. We have spent over a decade experimenting, refining, and creating programs that transform virtual reality platforms into spaceships to new realities.

The result is the OVC-3D. A program that maximizes the visual potential of 3D headsets like Oculus, Vive, and Pulse to create experiences of stunning beauty and originality. 

So slip on your headset, sit back, and experience new worlds. See immersive music compositions created in real time, browse through immersive symphonies created by our musicians, or learn to create your own realities. 

The OVC-3D is played in real time like any other musical instrument. The colors and patterns are triggered instantaneously by the actions of the artist and only by these actions – the audio signals are not used within the visual synthesizer. Just as multiple audio musicians can perform together to fill up the audio environment, multiple visual musicians can contribute simultaneously to the 3D scenes. The system can be played completely live by one or more musicians, and contributions can also be overdubbed using the multi-track recorder in the same way that compositions are developed in an audio recording studio.
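To make that performance model concrete, here is a minimal sketch in Python of how controller-driven events (never audio signals) might set visual parameters, and how a live take and an overdubbed take merge at playback. The names ControlEvent, Track, and events_at are hypothetical illustrations, not the actual OVC-3D API.

    from dataclasses import dataclass, field

    @dataclass
    class ControlEvent:
        time: float        # seconds from the start of the piece
        performer: int     # which visual musician made the gesture
        parameter: str     # e.g. "hue", "pattern", "depth"
        value: float       # normalized 0.0 to 1.0

    @dataclass
    class Track:
        # One overdubbed stream of control events, like a track in an audio DAW.
        events: list = field(default_factory=list)

        def record(self, event):
            self.events.append(event)

    def events_at(tracks, t, window=1 / 60):
        # Merge all tracks and return the events falling inside the current frame.
        merged = [e for tr in tracks for e in tr.events if t <= e.time < t + window]
        return sorted(merged, key=lambda e: e.time)

    # A live take and a previously overdubbed take combine at playback time:
    overdub = Track([ControlEvent(0.0, performer=1, parameter="hue", value=0.3)])
    live = Track()
    live.record(ControlEvent(0.005, performer=2, parameter="pattern", value=0.8))
    print(events_at([live, overdub], t=0.0))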

The OVC-3D is a complete redesign of the earlier 2D visual instrument developed in the 1970s. The new system primarily consists of software running on a PC equipped with an advanced graphics card. The input devices are proprietary hand controllers specially designed for this work, along with customized foot pedals. A simplified version can be operated with standard VR controllers such as the Oculus Touch, the HTC Vive controllers, or those bundled with the Sony headset. The system supports many types of outputs, but the focus has been on head-mounted displays (HMDs) and multi-projector dome systems. For VR audiences, we can generate stereoscopic panoramic MP4s, but we also provide a proprietary player. The player delivers better image quality and also tracks head position, which isn’t possible with MP4s.

The hardware and software have been developed by a team of engineers over a decade. The system design is radically different from that of gaming systems, simulators, or 3D modeling systems; in many ways, it is closer to the design of audio systems. It includes several sub-systems:

  • A graphical programming environment, similar to PureData, for building visual synthesizers
  • A networking protocol called VMIF that is similar to MIDI (see the sketch after this list)
  • A performance instrument able to accept multiple types of input controllers
  • A multi-track recording and editing system
  • Output drivers, including systems for distributing processing over multiple client computers
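Because the VMIF wire format itself isn’t documented here, the following is only a hypothetical sketch, built by analogy with a MIDI note-on message, of what a compact visual event message could look like. The field layout and the functions pack_event and unpack_event are assumptions for illustration, not the actual protocol.

    import struct

    def pack_event(channel, parameter, value):
        # Pack a 3-byte event: channel (like a MIDI channel), parameter id, value.
        return struct.pack("BBB", channel & 0x0F, parameter & 0x7F, value & 0x7F)

    def unpack_event(msg):
        # Recover the (channel, parameter, value) triple from a received message.
        return struct.unpack("BBB", msg)

    # A performer on channel 1 sets visual parameter 7 (say, spiral density) to 100:
    msg = pack_event(channel=1, parameter=7, value=100)
    print(unpack_event(msg))   # (1, 7, 100)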

Each of these modules required a massive development effort, and over one million lines of computer code have been written.

Our goals are to explore new, reality-altering worlds, to enable others to generate their own worlds, and to create a community of artists who use and improve our tools to create experiences we never anticipated.