
Built for the Big Screen


From Concept to Canvas at 160,000 sq ft

When Sphere hosted its first-ever electronic music residency, Afterlife Presents Anyma: The End of Genesys, the challenge was clear: deliver a visual experience as immersive as the venue itself, across eight sold-out nights featuring Tiësto, CamelPhat, and Amelie Lens.

To meet that vision, we developed a CG pipeline tailored specifically for Sphere’s scale. From custom rendering to real-time engines and high-performance software architecture, every solution was engineered to deliver distortion-free visuals at 16K resolution across 160,000 square feet of curved LED. Here’s how we did it:

The Sphere

The challenge

The visuals were never meant to stand alone; they were designed to move with the music: reactive, emotional, and precisely timed to elevate the performance. Delivering that level of integration at Sphere’s scale meant confronting creative and technical obstacles rarely seen in the industry.

We tackled them head-on, using a CG pipeline rather than the real-time engines, like Unreal, typically used for Sphere content. This choice was driven by the creative vision, which relied heavily on complex particle simulations that demanded a more robust and flexible rendering approach.

The Process, Super-Sized

Bending reality to real

We worked directly with the show’s Creative Director. Because of the complexity of the FX and simulations involved, our team was brought in at the conceptual stage as a creative consulting partner, shaping the early visual language while preparing for the technical demands of execution.

Sphere’s architecture introduced a unique spatial challenge: visuals had to feel “correct” from both the front row and the highest seats. We developed a custom workflow to exaggerate and subtly distort elements (shifting scale, warping space, and enhancing motion cues) to preserve depth and immersion across the entire dome. What was technically “accurate” wasn’t always what looked best; our team leaned on visual intuition as much as engineering precision.
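The production workflow itself is bespoke, but the underlying idea of mapping content onto a dome can be illustrated. The sketch below (an assumption, not the team’s actual code) shows a simple equidistant fisheye projection: a 3D view direction is mapped to a (u, v) position on a hemispherical canvas, the kind of mapping a dome-distortion workflow would then deliberately bend for artistic effect.

```python
import math

def direction_to_dome_uv(x, y, z):
    """Map a 3D view direction to (u, v) on a hemispherical dome canvas,
    using an equidistant (fisheye) azimuthal projection.
    +Z is the dome zenith; directions below the rim clamp to the edge."""
    r = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / r, y / r, z / r
    # Angle from the zenith: 0 at the top of the dome, pi/2 at the rim.
    theta = math.acos(max(-1.0, min(1.0, z)))
    theta = min(theta, math.pi / 2)      # clamp below-rim directions
    phi = math.atan2(y, x)               # azimuth around the dome
    # Equidistant fisheye: canvas radius grows linearly with theta.
    rad = theta / (math.pi / 2)          # 0 at the centre, 1 at the rim
    u = 0.5 + 0.5 * rad * math.cos(phi)
    v = 0.5 + 0.5 * rad * math.sin(phi)
    return u, v
```

Warping space or shifting scale, as described above, amounts to remapping `theta` or `rad` with a non-linear curve before computing (u, v).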

Amelie Lens’ live set, fan footage (TikTok)
Flat animation

Time vs. render reality

Rendering in 16K at 360fps isn’t just demanding; it’s a race against time. Every frame had to be sampled and validated in full spherical format. The result was a complex render pipeline, combining a CPU-based render farm with GPU rendering for beauty passes, that pushed technical resources to their limits while balancing speed and quality.
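A back-of-envelope count conveys the scale. Using the figures quoted above (16K frames at 360fps), and assuming a square 16,384-pixel canvas for illustration:

```python
# Illustrative frame counts for one show segment, using the figures
# quoted in the article; the canvas dimensions are an assumption.
width = height = 16384          # "16K" square canvas (assumed)
fps = 360                       # frame rate quoted above
segment_seconds = 40            # e.g. one seamless loop

frames = fps * segment_seconds
pixels_per_frame = width * height
total_pixels = frames * pixels_per_frame

print(f"{frames:,} frames")                               # 14,400 frames
print(f"{pixels_per_frame:,} px/frame")                   # 268,435,456 px/frame
print(f"{total_pixels / 1e12:.1f} trillion px total")     # 3.9 trillion px total
```

Every one of those pixels has to be simulated, rendered, and validated, which is why a hybrid CPU/GPU split matters.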

This hybrid approach proved essential in meeting the project’s tight time constraints. To further optimize render times, the team developed a lighting workflow: instead of rendering interactive lights frame by frame for the full duration, key lighting passes were rendered on a single frame across several stages. These were then animated in compositing to create the illusion of dynamic lighting, preserving realism while drastically reducing render time.
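The core trick can be sketched in a few lines. In this minimal example (names, weights, and the additive model are assumptions, not the production setup), each interactive light is rendered once as a separate static pass, and only a per-pass gain is animated over time in compositing:

```python
import numpy as np

def fake_dynamic_lighting(base, light_passes, t, rate=2.0):
    """Additively blend static light passes with time-varying gains.

    base:         HxWx3 float array, the render without the interactive lights
    light_passes: list of HxWx3 float arrays, one per light, rendered once
    t:            time in seconds
    """
    frame = base.copy()
    for i, light in enumerate(light_passes):
        # Each light pulses on its own phase; in a real show these gains
        # would be driven by the music or the show timeline.
        gain = 0.5 + 0.5 * np.sin(rate * t + i * np.pi / 2)
        frame += gain * light
    return np.clip(frame, 0.0, 1.0)
```

Rendering N static passes once, instead of N lights on every frame, is where the drastic time savings described above come from.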

Detail, everywhere

The lower rows demanded intense detail fidelity, while higher seats offered more visual forgiveness. Despite that, every scene was optimized end-to-end with careful asset distribution, shading, and 3D scene setup to ensure clarity and richness in every shot. Detail mattered at every level: in the model, in the render, and in the final composition, so that the experience held up whether viewed from five feet away or fifty.

Tiësto live set, fan footage (TikTok)
Tiësto venue animation view

Fighting the Moiré monster

One unexpected challenge was the moiré effect. When ultra-high-res visuals met the LED mesh at certain speeds and scales, interference patterns emerged, creating unwanted visual artifacts that couldn’t be detected in VR headsets or on preview monitors, only on Sphere’s LED itself.

We tackled this through painstaking trial and error, fine-tuning the content design to avoid triggering distortions. In some cases, even technically correct render passes had to be subtly offset or adjusted in post solely to minimize moiré, an intentional deviation from the ideal in service of the final, on-screen result.
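Moiré appears when content carries spatial frequencies close to the display’s own pixel grid. One standard mitigation, shown here as a sketch (the team’s actual tooling is not described in the article), is to smoothly attenuate frequencies near the Nyquist limit of the LED mesh before the content reaches the screen:

```python
import numpy as np

def soften_near_nyquist(image, cutoff=0.35):
    """Low-pass one image channel in the frequency domain.

    image:  2D float array
    cutoff: fraction of the Nyquist frequency kept untouched; higher
            frequencies are smoothly rolled off to zero.
    """
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))   # cycles/pixel, Nyquist = 0.5
    fx = np.fft.fftshift(np.fft.fftfreq(w))
    radius = np.hypot(*np.meshgrid(fx, fy))   # radial distance from DC
    # Linear roll-off between cutoff * Nyquist and Nyquist.
    rolloff = np.clip((0.5 - radius) / (0.5 - 0.5 * cutoff), 0.0, 1.0)
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * rolloff)))
```

In practice the cutoff would be tuned per shot against the real LED, which is exactly the trial-and-error loop described above.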

CamelPhat live set, fan footage (TikTok)
CamelPhat, flat animation

The result

Seamless loops, infinite motion

Some scenes, like an endless forward drift through a digital nebula, had to loop invisibly over 40+ seconds. To maintain flow, the end frame needed to match the beginning, not through typical keyframe morphing, but through a scene structure that was inherently repeatable from the start. This required a smart, non-standard approach: building the entire 3D simulation so that it naturally contained loopable motion.
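A common way to make motion inherently loopable is to drive it with a function that is periodic by construction, so frame T lands exactly on frame 0 with no blending. This small sketch (an illustration of the principle, not the production rig) builds a smooth, organic-looking signal from harmonics of the loop frequency:

```python
import math

def loopable_value(t, period, harmonics=(1.0, 0.5, 0.25)):
    """Smooth signal guaranteed to satisfy
    loopable_value(0, p) == loopable_value(p, p)."""
    angle = 2.0 * math.pi * t / period
    # Every term is a whole-number multiple of the loop frequency
    # (with a fixed phase offset), so each repeats exactly once per
    # period and the sum loops seamlessly.
    return sum(a * math.sin((i + 1) * angle + i * 1.7)
               for i, a in enumerate(harmonics))
```

Driving camera drift, turbulence, or emission rates from signals like this makes the whole simulation repeatable from the start, rather than forcing the last frames to morph back to the first.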

The result was a seamless visual cycle where motion, parallax, and spatial detail evolved continuously, without the viewer ever sensing repetition. It was a technical choreography requiring both surgical precision and structural foresight.

‘The End of Genesys’ intro, flat animation, looped

“The Sphere project called for engineering precision rather than just visual beauty. The goal was to set a new visual standard, ensuring that every seat now feels like the best seat in the house.”

Jakub Wrzalik, Head of CG, Juice.

Bring Sphere-level visuals to your next project.