Mastering the Immersive Future
How OBAM Promises to Solve Dolby Atmos Production’s Biggest Workflow Limitation
For years, producers working with Dolby Atmos have faced a fundamental challenge that many in the industry have quietly accepted as simply the way things are: the absence of a proper mastering workflow capable of the final polish achievable in traditional stereo work. While we've grown accustomed to comprehensive control over stereo masters through dedicated mastering chains, the world of object-based audio has remained surprisingly limited in this regard. This gap has forced many engineers to develop workarounds or simply accept the limitation in their immersive productions.
This limitation stems from the complex nature of Dolby Atmos itself. Unlike stereo audio, where mastering processors can easily access and manipulate a simple two-channel signal, Dolby Atmos presents engineers with up to 128 channels of information, comprising both bed channels and object channels, each carrying not just audio data but also positional metadata that defines where sounds exist in three-dimensional space. Traditional audio plugins were never designed to understand or process this kind of complex, metadata-rich signal flow.
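To make that contrast concrete, here is a minimal sketch of what a single block of object-based audio conceptually carries. The struct and field names below are illustrative assumptions for this article, not Dolby's actual data structures:

```cpp
#include <vector>

// One element in an object-based stream: a block of audio samples
// plus the metadata a renderer needs to place that audio in space.
// Field names and ranges are illustrative, not Dolby's actual format.
struct AudioObjectBlock {
    std::vector<float> samples;   // one block of mono audio
    float x = 0.0f;               // left/right position
    float y = 0.0f;               // front/back position
    float z = 0.0f;               // floor/ceiling position
    bool  isBedChannel = false;   // beds stay at fixed positions;
                                  // objects move freely through the room
};

// A full Dolby Atmos mix can carry up to 128 such elements per block.
using AtmosBlock = std::vector<AudioObjectBlock>;
```

A stereo master, by contrast, is just two streams of samples with no metadata at all, which is why a conventional mastering chain can process it without any spatial awareness.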
The Channel-Based Limitation Problem
Traditional DAWs and plugins operate on channel-based architectures, exchanging audio through fixed channel layouts such as mono, stereo, 5.1, or 9.1.6. While this approach has served stereo and surround production well, it presents several critical limitations for immersive audio workflows. First, fixed formats cannot reflect the actual production model of object-based formats like Dolby Atmos, where sounds exist as discrete entities moving through three-dimensional space rather than being locked to specific speaker positions.
The channel-based approach also proves error-prone and cumbersome because of the proliferation of different immersive formats and the varying channel orders implemented across different DAWs and plugin formats. Even experienced professionals regularly encounter confusion when working across different systems, leading to routing errors and compatibility issues that can compromise the integrity of complex immersive productions.
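A familiar example of this fragility is 5.1 channel order: SMPTE ordering (L R C LFE Ls Rs) and Film ordering (L C R Ls Rs LFE) are both in everyday use across DAWs, so interleaved buffers have to be remapped whenever material crosses that boundary. Here is a minimal sketch of such a remap; the two orderings are standard, while the function itself is illustrative:

```cpp
#include <cstddef>
#include <vector>

// 5.1 in SMPTE order: L R C LFE Ls Rs
// 5.1 in Film order:  L C R Ls Rs LFE
// filmFromSmpte[i] = which SMPTE channel feeds Film slot i.
constexpr std::size_t kNumChannels = 6;
constexpr std::size_t filmFromSmpte[kNumChannels] = {0, 2, 1, 4, 5, 3};

// Remap one interleaved 5.1 buffer from SMPTE to Film ordering.
// Getting a table like this wrong swaps speakers silently -- exactly
// the class of routing error described above.
std::vector<float> smpteToFilm(const std::vector<float>& interleaved) {
    std::vector<float> out(interleaved.size());
    const std::size_t frames = interleaved.size() / kNumChannels;
    for (std::size_t f = 0; f < frames; ++f)
        for (std::size_t ch = 0; ch < kNumChannels; ++ch)
            out[f * kNumChannels + ch] =
                interleaved[f * kNumChannels + filmFromSmpte[ch]];
    return out;
}
```

Every DAW-to-plugin boundary is a chance for a mapping like this to go wrong, whereas an object that carries its own position never needs one.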
Enter OBAM: Object Based Audio Module
These limitations have not gone unnoticed by developers working at the forefront of immersive audio technology. Fiedler Audio, the company behind the widely adopted Dolby Atmos Composer, has recently presented a comprehensive solution to address these channel-based workflow constraints. Their OBAM standard represents a significant change in how we approach immersive audio processing.
OBAM, which stands for Object Based Audio Module, establishes a new plugin format specifically designed to overcome the fundamental limitations of traditional channel-based workflows. Unlike conventional audio plugins that receive only audio samples in fixed channel layouts, OBAM-compatible plugins receive both audio content and comprehensive metadata describing each object's position and semantic characteristics.
This metadata communication between plugin and host opens up processing possibilities that were previously out of reach. An OBAM-compatible compressor, for instance, could base its gain reduction not only on the audio content but also on where that audio exists in space. The standard also eliminates the confusion of channel order variations across different DAWs and plugin formats, as each object carries its defined spatial position as inherent metadata.
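To illustrate what that could look like, here is a toy sketch of a position-aware compressor that weights its ratio by each object's distance from the listening position. The types and the processing are invented for this example and do not reflect the actual OBAM SDK API:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Hypothetical object representation -- not the actual OBAM API.
struct SpatialObject {
    std::vector<float> samples;          // one block of audio
    float x = 0.0f, y = 0.0f, z = 0.0f;  // normalized room position
};

// Toy position-aware compression: objects nearer the listener
// (assumed at the origin) are compressed harder than distant ones.
// A channel-based plugin cannot express this, because it never
// sees positions at all.
void compressByProximity(std::vector<SpatialObject>& objects,
                         float threshold, float maxRatio) {
    for (auto& obj : objects) {
        const float distance =
            std::sqrt(obj.x * obj.x + obj.y * obj.y + obj.z * obj.z);
        const float proximity = std::clamp(1.0f - distance, 0.0f, 1.0f);
        const float ratio = 1.0f + (maxRatio - 1.0f) * proximity;
        for (float& s : obj.samples) {
            const float level = std::fabs(s);
            if (level > threshold)
                s = std::copysign(threshold + (level - threshold) / ratio, s);
        }
    }
}
```

A real compressor would of course use envelope detection with attack and release rather than this per-sample curve; the point is only that position can inform the processing decision.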
Perhaps most intriguingly, OBAM enables position itself to become both a source and destination for processing. Imagine delay processors that adjust their timing based on incoming object positions, or envelope followers that generate positional metadata from audio analysis to modulate other objects' spatial placement. The technical implementation allows for dynamic processing decisions that consider not just the sonic characteristics of the material, but its three-dimensional context within the mix.
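The reverse direction can be sketched just as simply: an envelope follower whose output writes position rather than gain, lifting an object toward the ceiling as it gets louder. Again, the types and the scaling are invented for illustration:

```cpp
#include <cmath>
#include <vector>

// Invented types for illustration -- not the OBAM SDK.
struct SpatialObject {
    std::vector<float> samples;
    float x = 0.0f, y = 0.0f, z = 0.0f;  // z: 0 = floor, 1 = ceiling
};

// One-pole envelope follower that writes its result into the object's
// height, making position a processing *destination* driven by audio
// analysis. The 4.0f scale is an arbitrary choice for the sketch.
void liftByEnvelope(SpatialObject& obj, float& envelope,
                    float smoothing = 0.99f) {
    for (float s : obj.samples)
        envelope = smoothing * envelope + (1.0f - smoothing) * std::fabs(s);
    obj.z = std::fmin(envelope * 4.0f, 1.0f);  // louder => higher
}
```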
Practical Implications for Production Workflows
The first implementation of OBAM appears in Fiedler Audio's own Dolby Atmos Composer, where OBAM plugins can be hosted in the Master Channel. This integration allows producers to process their entire Dolby Atmos mix or selected portions with object-aware capabilities, addressing one of the most significant workflow limitations that immersive audio producers have faced since the format's introduction.
From a practical standpoint, OBAM transforms the production process by introducing true object-based mastering capabilities. Previously, any processing applied to a complete Dolby Atmos mix required either rendering the project to individual channel files for external processing or accepting the limitations of channel-based plugins that couldn't understand the broader spatial context. With OBAM, dynamic processing can now be applied intelligently across the entire sound field, with processors understanding the distinction between bed channels providing foundational ambience and discrete objects requiring different treatment.
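Sketched in code, that bed/object distinction might look like the following master-stage pass, which trims the foundational beds gently while limiting only the discrete objects. The types and the specific treatments are illustrative assumptions, not Fiedler Audio's implementation:

```cpp
#include <vector>

// Illustrative types -- not the actual OBAM SDK.
struct MixElement {
    std::vector<float> samples;
    bool isBed = false;  // beds anchor the ambience; objects carry detail
};

// A master-stage pass that treats beds and objects differently:
// a gentle level trim on the beds, a hard ceiling on the objects.
void masterPass(std::vector<MixElement>& mix,
                float bedTrim, float objectCeiling) {
    for (auto& el : mix) {
        if (el.isBed) {
            for (float& s : el.samples) s *= bedTrim;
        } else {
            for (float& s : el.samples) {
                if (s >  objectCeiling) s =  objectCeiling;
                if (s < -objectCeiling) s = -objectCeiling;
            }
        }
    }
}
```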
The ability to apply mastering-grade processing within the production environment itself has the potential to fundamentally change how we approach both the creative and technical aspects of immersive audio. This contextual awareness allows for more sophisticated approaches to managing the relationship between direct sounds and their spatial environment, while maintaining the object-based nature of the production throughout the entire workflow.
The Broader Impact on Immersive Audio
The introduction of OBAM represents more than just a technical advancement; it signals a maturation of the immersive audio production ecosystem. Fiedler Audio has made the OBAM SDK available to third-party plugin developers free of charge, encouraging widespread adoption and innovation within the plugin development community. This open approach suggests that we may see a rapid expansion in the number of available OBAM-compatible processors as developers recognize the creative and technical possibilities offered by object-aware processing.
As object-based audio formats continue to gain adoption across music, film, and gaming applications, the availability of sophisticated processing tools becomes increasingly critical for the format's continued growth and artistic development. Perhaps most significantly, OBAM's object-based approach provides future-proofing capabilities that channel-based workflows lack. When producers work with object-based production methods, they maintain the ability to adapt their content to new reproduction formats as they emerge, without requiring fundamental changes to their production methodology. This flexibility proves essential for content distribution and technological evolution in a rapidly advancing field.
This advancement also has implications for how we train the next generation of audio engineers. With proper object-based mastering tools now available for immersive formats, educational programs can begin teaching comprehensive workflows that mirror the depth and sophistication students expect from traditional stereo production techniques. The gap between what's possible in conventional audio production and what's achievable in immersive audio continues to narrow, making object-based production more accessible to a broader range of practitioners.
A New Chapter for Object-Based Audio
Fiedler Audio's development of the OBAM standard addresses a fundamental limitation that has constrained immersive audio production since its inception. By enabling metadata communication between plugins and hosts, this new standard allows for the development of processing tools that truly understand and enhance the three-dimensional nature of object-based audio. For producers who have been working within the constraints of traditional channel-based architectures, OBAM represents a significant step forward in achieving the level of control and refinement we've long sought in immersive audio production.
The availability of the OBAM SDK at no charge to plugin developers suggests strong potential for ecosystem growth and innovation. As more OBAM-compatible processors become available, we can expect to see increasingly sophisticated approaches to immersive audio processing emerge. This development not only improves the technical quality achievable in current productions but also opens creative possibilities that will undoubtedly influence how we conceive and create spatial audio experiences in the future.
If you'd like to see OBAM in action and learn more about how this technology works in practice with specific plugins and production techniques, I've created a detailed demonstration on my YouTube channel that walks through setting up and using OBAM-compatible processors in a real production scenario.