Introducing Optophonia – A Real-Time Motion Graphics Environment for Performance Visuals, NFTs and Beyond

Optophonia is an audiovisual app for Mac and PC, a creator community and an NFT marketplace for the production, curation and dissemination of visual music. Optophonic art involves the temporal composition of moving images produced in tandem with music production. The centerpiece software is built on Unreal Engine's leading-edge real-time graphics capabilities. We are developing a toolset based on Unreal that allows for fast, intuitive experimentation with, and performance of, interactive and generative visuals for music contexts.

The software gives digital artists an easy drag-and-drop virtual environment for fast modeling of geometry, lights, animations, visual effects and movement paths. To this it adds a full set of interactive controllers and generative programming for additional complexity. The app also connects directly to NFT marketplaces: unique assets can be imported into the app, and new motion graphics works can be minted as NFT outputs directly from within it.

Our project is inspired by the conceptual approach of the historical Optophonic Piano, from which it takes its name, invented by renowned avant-garde artist and inventor, Wladimir Baranoff-Rossiné (1888–1944). Our platform, based on the blockchain innovations of NFTs and DAOs, empowers artists to collaborate in the creation of immersive audiovisual media for live events, the Metaverse, and as standalone media works.

Taken together, our online galleries and user community will encourage the development of Optophonia as a professional creative tool and art form in the Web3 era.

NFT Integration

Optophonia is the first app of its kind to integrate NFTs fully into its overall concept and experience. NFT-based art works at both the app's input and output sides: you can collect NFTs as source media (3D models, 2D art, video clips, motion capture files etc.) and use them as the 'raw material' for minting your own NFT motion graphics at the output.

The Optophonia DAO supports artists and the user community through key activities such as selecting artists to curate, helping us decide on new features to integrate into the app, and deciding which artists to support and acquire art from.

NFT content opens up new economies for artists by allowing them to continue to earn royalties as their media is traded. If you're a seeker of VJ loops, the media you obtain as NFTs is not strictly a cost, but rather becomes an asset because of its tokenization on the blockchain. A limited edition digital loop you acquire for your performance can be traded later, even at a profit. If you're a creator of 3D models, digital art or even music and videos, you have a new community-organized marketplace, dedicated to promoting this art form, in which to sell your works.

Optophonia organizes a marketplace for the sale of digital assets that can be used in motion graphics production. For example, a choreographer and dancer can earn royalties on the dance motion capture files they produce, a 3D artist can earn royalties on their digital avatar model, and a third artist can earn royalties for providing virtual clothing for the dancing avatar. Smart contracts allow ongoing royalty splits of this kind to be automated, disrupting current economic models for creative work.
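
As a concrete illustration, here is a minimal C++ sketch of the proportional payout arithmetic such a smart contract would automate. The contributors and percentages are hypothetical, and an actual contract would be written for the relevant blockchain rather than in C++:

```cpp
#include <cstdint>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// Hypothetical royalty share, expressed in basis points (1/100 of a percent),
// so the splits sum to exactly 10'000 and avoid floating-point drift.
struct RoyaltyShare {
    std::string contributor;
    uint32_t basisPoints;
};

// Distribute a sale amount (in the marketplace's smallest currency unit)
// across contributors according to their shares.
std::vector<std::pair<std::string, uint64_t>>
splitRoyalties(uint64_t saleAmount, const std::vector<RoyaltyShare>& shares) {
    std::vector<std::pair<std::string, uint64_t>> payouts;
    for (const auto& s : shares)
        payouts.emplace_back(s.contributor, saleAmount * s.basisPoints / 10'000);
    return payouts;
}

int main() {
    // Hypothetical split for the dancing-avatar example in the text.
    std::vector<RoyaltyShare> shares = {
        {"choreographer (mocap files)", 4000},  // 40%
        {"3D artist (avatar model)",    4000},  // 40%
        {"designer (virtual clothing)", 2000},  // 20%
    };
    for (const auto& [who, amount] : splitRoyalties(1'000'000, shares))
        std::cout << who << " receives " << amount << "\n";
}
```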

NFTs as digital assets in an Optophonic work have the potential to upend decades of internet economic practice that favors the dominance of giant tech platforms, and to create entirely new dynamics around content production and distribution that are more likely to favor individual artists and creativity.

UI Tour

Our flagship app, designed for working on desktops and laptops, is organized around five main tabs, which provide a complete feature set for live motion graphics — Gallery, Stage, Control, Program and Sequence — described in more detail below. We give artists who produce visual media for music contexts easy, intuitive access to Unreal's core capabilities without needing a whole development team or deep expertise in game engines.

Gallery Screen

The first tab organizes Projects, Performances, Builds and Mints. With Optophonia there is some terminology to get used to, so this is a good place to start defining terms. What you create in Optophonia are Compositions. Projects are roughly synonymous with Compositions, because a Composition is what you create in a Project. A new Project starts without a Composition, of course, so the distinction between the terms is still useful.

Assuming the idea of a Project file is familiar enough, we can move on to Performances. In Optophonia, the Composition is a motion-visual assembly of virtual objects and media that can be played in real time. You can create any number of Performances with a Composition. While in most apps a Performance might simply be embedded in the Project file, in Optophonia, because of the NFT integration, Performances can be minted as NFTs, which is why they have their own Gallery space.

Performances are of two kinds — Optophonia native, and video rendered. Native Performances can be opened within the app and played back live. A native Performance is essentially a recording of a collection of Composition data values over time. Performances can be recorded in the Stage, Control and Program screens, depending on what makes the most sense for any given Composition.
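
As a rough mental model (a sketch of the general idea, not Optophonia's actual file format), a native Performance can be pictured as a time-ordered series of property snapshots:

```cpp
#include <map>
#include <string>
#include <vector>

// A single captured frame of Composition state: a timestamp plus the
// values of every recorded object property at that moment.
struct PerformanceSample {
    double timeSeconds;
    std::map<std::string, float> propertyValues;  // e.g. "Cube.RotationZ" -> 42.0f
};

// A native Performance: samples ordered by time, replayable in real time
// by driving the live Composition with each sample's values.
using Performance = std::vector<PerformanceSample>;
```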

Any recorded Performance can be rendered as a video file, and these video files can in turn be minted as NFTs via the Mint button. Optophonia contains an integrated toolset for making seamless video loops, so a Performance can be made loop-ready when a seamless loop is the desired NFT media type.

The other kind of media that can be minted as NFTs are Builds. A Build is a self-contained executable — its own mini-app of the Composition. This allows users to mint NFTs of interactive or generative media. Builds are created in the Program screen, discussed more below.

The final section of the Gallery organizes media that has already been minted. These will be references to rendered video files and builds that have been previously output from the app as NFTs.

Stage Screen

The Stage screen is where the core elements of the Composition are assembled. At a high level, there are two main kinds of elements that are used to create Compositions: assets that have been imported, e.g. from Optophonia’s NFT galleries (NFTs can be free, and the app includes some NFT media assets for immediate experimentation), and resources that are internal to the Unreal environment such as geometry, lights, animation components and visual effects.

The Stage screen supports many types of media that can be used as imported assets: 2D art (to be used as textures), 3D models (which will usually be the .fbx file type), digital humans, rigged characters, motion files, video clips and audio files.

The hierarchy of all objects in the Composition is organized in the Composition Elements panel, and the properties of any selected object can be manipulated in the Properties panel. The central Staging Area — where the Composition is assembled — also has a number of tools for modeling (e.g. Boolean functions, alignment tools and perspective views) and for viewing Performances, such as showing them at different aspect ratios, and recording and playing them.

Performances of Optophonic Compositions can be recorded in any of the three central tabs — Stage, Control, and Program. These three screen areas allow for increasingly complex Compositions. Because the Stage screen includes a set of animation objects, a complex and refined motion graphics Composition can be created entirely within the Stage screen using drag and drop and built-in animations in conjunction with the Paths tool (the top left icon under the logo), which creates movement paths through the Composition for the main viewing camera and other elements.

The Stage screen (along with the Control and Program screens) also has a row of 16 Snapshot buttons. These can be used in two main modes. In the regular snapshot mode, each snapshot simply records an internal preset of all the Composition objects’ properties and the camera position, so that a user can quickly change between them.

The Snapshots can also be used as a visual sequencer that syncs with incoming MIDI data so that snapshots can be beat-aligned to music. This follows the most common paradigm in beat sequencing, where a whole note is subdivided into sixteen steps of sixteenth notes. The app also generates its own internal tempo information, so an external MIDI sync signal is not always required to experiment with the visual sequencing possibilities of the Snapshot feature.
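
For the curious, the timing math is simple: standard MIDI clock runs at 24 pulses per quarter note, so a sixteenth-note step elapses every 6 pulses. Here is a minimal sketch of how incoming clock pulses could advance a 16-slot snapshot sequencer (the class and method names are hypothetical, not Optophonia's API):

```cpp
#include <cstdint>

// MIDI clock sends 24 pulses per quarter note, so one sixteenth-note
// step = 24 / 4 = 6 pulses.
constexpr int kPulsesPerSixteenth = 6;
constexpr int kNumSnapshots = 16;

class SnapshotSequencer {
public:
    // Call once per incoming MIDI clock pulse (status byte 0xF8).
    // Returns the snapshot index to trigger, or -1 between step boundaries.
    int onClockPulse() {
        if (pulseCount_++ % kPulsesPerSixteenth != 0)
            return -1;
        int snapshot = step_;
        step_ = (step_ + 1) % kNumSnapshots;  // wrap after one whole note
        return snapshot;
    }

    // MIDI Start (status byte 0xFA) resets the sequence to step 0.
    void onStart() { pulseCount_ = 0; step_ = 0; }

private:
    uint64_t pulseCount_ = 0;
    int step_ = 0;
};
```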

Control Screen

The Control screen is where the Composition can be made interactive. A completely customized control interface can be set up, used as on-screen GUI elements or as emulations of linked outboard control equipment. A comprehensive set of virtual controller elements can be combined, through drag and drop, into unique control assemblages, many of which can interface with external hardware through the MIDI or OSC protocols. These include dials, sliders, buttons, pad triggers, an XY matrix, linkage to the pitch bend and modulation wheels of MIDI keyboards, and envelope drawing tools. QWERTY and MIDI keyboards can also be used as controllers, along with audio signals and MIDI files.
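
Under the hood, a controller assignment of this kind boils down to rescaling the controller's raw range into the target property's range. A minimal sketch, assuming a 7-bit MIDI CC source (the function and the example property are illustrative):

```cpp
// Map a 7-bit MIDI CC value (0-127) onto an arbitrary property range.
float ccToProperty(int ccValue, float rangeMin, float rangeMax) {
    float normalized = static_cast<float>(ccValue) / 127.0f;  // 0.0 .. 1.0
    return rangeMin + normalized * (rangeMax - rangeMin);
}

// Example: an incoming CC value driving a light's intensity from 0 to 5000.
// float intensity = ccToProperty(incomingValue, 0.0f, 5000.0f);
```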

A number of Composition viewing options are also available. Any given Performance can be viewed in an embedded video window, in a floating, resizable window on other screens, or in VR via a head-mounted display; it can also be live-streamed to a frame-sharing application such as Syphon (Mac) or Spout (PC) for recording as a video file inside another app.

Frame sharing also allows Optophonic content to be streamed live to projection mapping, VJ and other live streaming software. And of course, the motion graphics output can be streamed live IRL to a video projector or other live media display. There is also a virtual sticky notes feature and MIDI/OSC communication.

The Control screen gives more control (naturally!) over recording Performances, since any object's properties can be assigned to a controller object. External or internally generated tempo information can be synced to appropriate control elements. Since Optophonic recorded output will often consist of video media meant to loop seamlessly, a loop toolset is available to align the data parameters at the start and end points of a recorded Performance so that the first and last frames are essentially identical, producing seamless loop output.
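
One common way to implement such a loop tool (a sketch of the general technique, not necessarily Optophonia's exact approach) is to crossfade each recorded parameter track back toward its opening value over the final stretch of the Performance:

```cpp
#include <algorithm>
#include <vector>

// Blend the tail of a recorded parameter track toward its first value so
// the last frame matches the first and the loop point is seamless.
// `blendSeconds` is how far before the end the crossfade begins.
void makeLoopSeamless(std::vector<float>& values, double sampleRate,
                      double blendSeconds) {
    const size_t blendSamples =
        std::min(values.size(), static_cast<size_t>(blendSeconds * sampleRate));
    if (blendSamples == 0) return;
    const float target = values.front();
    const size_t start = values.size() - blendSamples;
    for (size_t i = start; i < values.size(); ++i) {
        // t ramps to exactly 1.0 on the final sample, so the last frame
        // lands precisely on the opening value.
        float t = static_cast<float>(i - start + 1) /
                  static_cast<float>(blendSamples);
        values[i] = values[i] * (1.0f - t) + target * t;
    }
}
```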

Program Screen

The Program screen adds a generative dimension through a visual programming paradigm based on Unreal’s underlying Blueprint system. Users familiar with visual programming will recognize the common node and virtual cable design paradigm. Whole objects in the Composition Elements panel or their individual properties can become nodes in the Program screen. Also, many of Unreal’s most relevant features for interactive and generative motion graphics are available as additional programming nodes.
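
For readers who haven't used node-based tools, a visual program is essentially a graph of small computation units whose outputs feed other units' inputs. A toy C++ illustration of the idea (not Blueprint's actual API):

```cpp
#include <functional>
#include <memory>
#include <vector>

// A toy node: pulls values from upstream nodes, then applies its operation.
struct Node {
    std::vector<std::shared_ptr<Node>> inputs;           // virtual "cables"
    std::function<float(const std::vector<float>&)> op;  // the node's work

    float evaluate() const {
        std::vector<float> inputValues;
        for (const auto& in : inputs)
            inputValues.push_back(in->evaluate());
        return op(inputValues);
    }
};

// Example wiring: an audio-level node feeding a scaling node that drives
// a rotation-speed property.
// auto audio = std::make_shared<Node>(Node{
//     {}, [](const std::vector<float>&) { return 0.8f; }});
// auto scale = std::make_shared<Node>(Node{
//     {audio}, [](const std::vector<float>& v) { return v[0] * 360.0f; }});
// float rotationSpeed = scale->evaluate();
```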

The Program screen also has a set of tools for testing Builds and exporting the final executables, for users who want to output an interactive or generative work. Since Unreal is a platform designed to ship large-scale apps, it has the capacity to export standalone executables. Users will be able to produce media that retains interactive or generative features, in addition to the rendered linear media files of recorded Performances.

Sequence Screen

The final screen builds on Unreal's cinematics system to let users create Sequences out of their Performances, where each Performance is treated as a clip in a timeline editor. A Sequence of multiple Performances can in turn either be made into a new Optophonia-native Performance (in essence, a composite Performance) or be rendered as a video clip.

The Sequence tool can be used to create longer clips, such as a music video or an extended performance piece. It contains standard editing tools such as splice, crossfade, wipe, fade in and fade out, but also adds a novel feature for creating seamless loops, called Interpolated Transitions. This lets a user easily line up the data values at clips' in and out points so that edits are smoothed through data interpolation of all objects' property values; the same treatment can be applied to the final assembled Sequence to prepare it for seamless looping.

Users familiar with video editing applications will also recognize the usual dual-viewer layout: one viewing window for the selected Performance clip and one for the Sequence as a whole. The sequencing timeline also allows audio to be added as a reference for sequencing the recorded Performance clips, and short audio stings to be added at the edit transition points.

The timeline can also accommodate keyframe-based editing for any assigned object properties, so that these can be automated in real time, similar to traditional motion graphics applications such as After Effects or DaVinci Resolve's Fusion.
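
Keyframe automation of this kind reduces to interpolating between the two keys that bracket the playhead. A minimal linear-interpolation sketch (production editors add easing curves on top of this):

```cpp
#include <algorithm>
#include <vector>

struct Keyframe {
    double time;  // seconds on the timeline
    float value;  // property value at that time
};

// Evaluate a keyframe track at time `t`, linearly interpolating between
// the keys that bracket it. Keys must be sorted by time.
float evaluateTrack(const std::vector<Keyframe>& keys, double t) {
    if (keys.empty()) return 0.0f;
    if (t <= keys.front().time) return keys.front().value;
    if (t >= keys.back().time) return keys.back().value;
    // Find the first key at or after t; its predecessor is strictly before t.
    auto next = std::lower_bound(keys.begin(), keys.end(), t,
        [](const Keyframe& k, double time) { return k.time < time; });
    auto prev = next - 1;
    double u = (t - prev->time) / (next->time - prev->time);
    return prev->value + static_cast<float>(u) * (next->value - prev->value);
}
```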

Feature Explorations

Below is a collection of software feature explorations which are posted on our YouTube channel. Each short video is a vignette highlighting different aspects of the Optophonia app’s toolset while also covering a different aesthetic approach.

Because it is built on Unreal, Optophonia offers unlimited creative freedom for performative and programmable music visuals. These short videos also show how many kinds of media assets can be used inside the app, including 2D art, 3D models, video, sound, rigged characters and motion files such as mocap. Additionally, the app includes a robust toolset for modeling customized geometry, including Boolean, smart grid and alignment features.

Interactive and Generative Motion Graphics

Music: What It Takes

Optophonia has full MIDI and OSC integration, allowing you to play motion graphics like a musical instrument. Every object property of a Composition can be assigned to an external controller (knobs, sliders, keys, buttons, pads etc.) or played inside the app as a GUI element.

Properties and objects can also be made into nodes in the Program screen for visual programming. If you prefer to program through scripting, we incorporate a script editor in the Tools palette for writing C++ code.

Abstract Approaches

Music: Zen Man

Abstract is a very direct emulation of, and updated take on, the mechanics of the original Optophonic Piano (circa 1916), a Futurist invention that created live motion-visual light projections as musical accompaniment. It used a collection of spinning painted discs and kinetic filters triggered through a piano-keyboard interaction paradigm, as shown in this video recreation of the instrument. The Optophonic Piano can be thought of as an interactive motion graphics instrument originally developed as hardware, and this short vignette recreates its spirit in a virtual environment.

VJ Loops and Virtual Architecture

Music: Down Temp Oh

This early iteration of the music video for my track Down Temp Oh takes as its starting material one of my 2D virtual photography images and moves a camera through a reimagined version of it. It incorporates a number of VJ-loop-style video textures as the camera moves through a semi-post-apocalyptic structure (all that thick brown smog is meant to remind the viewer of heavily polluted cities and ecological degradation in general). This is a longer clip, so make sure to watch to the end for a nice surprise.

Mixing Flatness and Depth Effects

Music: Ambient Piano №1

Besides showing how Optophonic visuals can work with acoustic material such as the soundtrack’s algorithmic piano, this clip explores mixing together the aesthetics of 3D and 2D motion graphics. We often end our clips with animals and avatars as a nod to the world of crypto art : )

Music Videos

Music: Syzygy

Optophonia can be used beyond live performance contexts and work as an environment for producing music videos. Effects ranging from photo-realism to hyper-reality to full abstraction are achievable, since the underlying graphics are based on a leading-edge engine. Full cinematics and rendering options allow for video renderings of shot sequences at up to 4K resolution.

Prismatic Tunnels

Music: Ballad of a Dead Pixel

This series of four prismatic tunnels — some of which are flooded, as a reference to rainy weather at outdoor music events — represents a common genre of visuals at live music shows: traversing infinite tunnel structures.

Retro Computer Graphics as Neon City Walkthrough

Music: Hiatus

Retro Neon Walkthrough Vibe pays homage to earlier generations of computer graphics (think Tron era), which take on an updated, contemporary look here with neon-emissive material textures. The scene walks through an industrial village and also integrates video clips with image synthesis and glitch effects. It also shows how Optophonic media might work as a metaverse environment.

Music Reactivity

Music: Metanoia

All aspects of a motion graphics composition can be made responsive to music. Audio input or audio files can be analyzed for spectrum and beat structure to produce visual behaviors, as can MIDI (live data or saved MIDI files), OSC, and control voltages from analog hardware via CV-to-MIDI converters. The example above shows a music-reactive particle system where music features control particle velocity and frequencies in the spectrum are linked to color properties.
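
A rough sketch of that mapping, assuming the spectrum magnitudes have already been computed by an FFT elsewhere (the struct and scaling choices are illustrative, not the app's actual analysis code):

```cpp
#include <cstddef>
#include <numeric>
#include <vector>

// Hypothetical particle controls driven by the music analysis.
struct ParticleParams {
    float velocity;  // driven by overall signal energy
    float hue;       // driven by where spectral energy is concentrated
};

// Map FFT magnitudes (low bins = low frequencies) to particle parameters.
ParticleParams mapSpectrum(const std::vector<float>& magnitudes) {
    ParticleParams p{0.0f, 0.0f};
    if (magnitudes.empty()) return p;

    // Overall energy -> particle velocity.
    float total = std::accumulate(magnitudes.begin(), magnitudes.end(), 0.0f);
    p.velocity = total / magnitudes.size();

    // Spectral centroid (energy-weighted mean bin), normalized to 0..1,
    // -> hue: bass-heavy music skews low, bright music skews high.
    float weighted = 0.0f;
    for (size_t i = 0; i < magnitudes.size(); ++i)
        weighted += static_cast<float>(i) * magnitudes[i];
    if (total > 0.0f)
        p.hue = weighted / (total * magnitudes.size());
    return p;
}
```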

Digital Humans

Music: notPussyFootin

Besides working with rigged characters (the usual digital stuff of humanoid and creaturely animated characters), digital humans with baked-in animation files can also be used in an Optophonic environment, shown here as dancers. Digital humans are based on photogrammetry techniques, which are fundamentally different in kind from hand-modeled 3D characters.

Complex Strobe Patterns

Architectural Light Strobes shows off some possibilities for kaleidoscopic effects that oscillate between 2D, 3D, still and motion variations. It is heavy with strobing light content, so viewers with sensitivities to flashing imagery are cautioned.

Music: Zenzizenzizenzic

Stay Informed About Optophonia

To keep current with the development of Optophonia, sign up to our newsletter here.