Gaussian Splatting – A$AP Rocky "Helicopter" music video
Recorded: Jan. 19, 2026, 10:03 a.m.
| Original | Summarized |
A$AP Rocky Releases Helicopter Music Video featuring Gaussian Splatting
Michael Rubloff · Jan 13, 2026

Believe it or not, A$AP Rocky is a huge fan of radiance fields.

Yesterday, when A$AP Rocky released the music video for Helicopter, many viewers focused on the chaos, the motion, and the unmistakable early-MTV energy of the piece. What’s easier to miss, unless you know what you’re looking at, is that nearly every human performance in the video was captured volumetrically and rendered as dynamic splats.

I spoke with Evercoast, the team responsible for capturing the performances, as well as Chris Rutledge, the project’s CG Supervisor at Grin Machine, and Wilfred Driscoll of WildCapture and Fitsū.ai, to understand how Helicopter came together and why it represents one of the most ambitious real-world deployments of dynamic Gaussian splatting in a major music release to date.

The decision to shoot Helicopter volumetrically wasn’t driven by technology for technology’s sake. According to the team, director Dan Strait approached the project in July with a clear creative goal: to capture human performance in a way that would allow radical freedom in post-production, something that would have been either impractical or prohibitively expensive with conventional filming and VFX pipelines.

Chris told me he had been tracking volumetric performance capture for years, fascinated by emerging techniques that could enable visuals that simply weren’t possible before. Two years ago, he began pitching the idea to directors in his circle, including Dan, as a “someday” workflow.
When Dan came back this summer and said he wanted to use volumetric capture for the entire video, the proliferation of Gaussian splatting made it possible to take on.

The aesthetic leans heavily into kinetic motion: dancers colliding, bodies suspended in midair, chaotic fight scenes, and performers interacting with props that later dissolve into something else entirely. Every punch, slam, pull-up, and fall you see was physically performed and captured in 3D. Almost every human figure in the video, including Rocky himself, was recorded volumetrically using Evercoast’s system. It’s all real performance, preserved spatially.

This is not the first time A$AP Rocky has featured a radiance field in one of his music videos. The 2023 music video for Shittin’ Me featured several NeRFs and even the GUI for Instant-NGP, which you can spot throughout the piece.

The primary shoot for Helicopter took place in August in Los Angeles. Evercoast deployed a 56-camera RGB-D array, synchronized across two Dell workstations. Performers were suspended from wires, hanging upside down, doing pull-ups on ceiling-mounted bars, swinging props, and performing stunts, all inside the capture volume.

Scenes that appear surreal in the final video were, in reality, grounded in very physical setups: wooden planks standing in for helicopter blades, real wire rigs, and real props. The volumetric data allowed those elements to be removed, recomposed, or entirely recontextualized later without losing the authenticity of the human motion.

Over the course of the shoot, Evercoast recorded more than 10 terabytes of raw data, ultimately rendering roughly 30 minutes of final splatted footage, exported as PLY sequences totaling around one terabyte.

That data was then brought into Houdini, where the post-production team used CG Nomads’ GSOPs for manipulation and sequencing, and OTOY’s OctaneRender for final rendering.
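Those throughput numbers imply a rough per-frame budget. The following back-of-envelope sketch is my own, not a figure from the production: the 30 fps frame rate and the common 62-float-per-splat 3DGS PLY record are assumptions, so treat the results as orders of magnitude.

```python
# Back-of-envelope budget: ~30 minutes of splatted footage in ~1 TB of PLYs.
# Assumed (not confirmed by the production): 30 fps playback, and the common
# 3DGS PLY layout of 62 float32 attributes per splat (position, normal,
# SH color coefficients, opacity, scale, rotation) = 248 bytes per splat.
SECONDS = 30 * 60             # ~30 minutes of final footage
FPS = 30                      # assumption
BYTES_TOTAL = 1e12            # ~1 TB of exported PLY sequences
BYTES_PER_SPLAT = 62 * 4      # assumption: full SH degree-3 3DGS record

frames = SECONDS * FPS
bytes_per_frame = BYTES_TOTAL / frames
splats_per_frame = bytes_per_frame / BYTES_PER_SPLAT

print(f"{frames} frames, ~{bytes_per_frame / 1e6:.1f} MB/frame, "
      f"~{splats_per_frame / 1e3:.0f}k splats/frame")
```

Under those assumptions each frame lands around 18–19 MB and on the order of 75k splats; a different frame rate or truncated SH bands would shift both numbers considerably.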
Thanks to that combination of GSOPs and Octane, the production team was also able to relight the splats.

One of the more powerful aspects of the workflow was Evercoast’s ability to preview volumetric captures at multiple stages. The director could see live spatial feedback on set, generate quick mesh-based previews seconds after a take, and later review fully rendered splats through Evercoast’s web player before downloading the massive PLY sequences for Houdini. In practice, this meant creative decisions could be made rapidly and cheaply, without committing to heavy downstream processing until the team knew exactly what they wanted. It’s a workflow that more closely resembles simulation than traditional filming.

Chris also discovered that Octane’s Houdini integration had matured, and that Octane’s early splat support was far enough along to enable relighting. According to the team, the ability to relight splats, introduce shadowing, and achieve a more dimensional “3D video” look was a major reason the final aesthetic lands the way it does.

The team also used Blender heavily for layout and previs, converting splat sequences into lightweight proxy caches for scene planning. Wilfred described how WildCapture’s internal tooling was used selectively to introduce temporal consistency: in his words, the team derived primitive pose-estimation skeletons that could be used to transfer motion, support collision setups, and allow Houdini’s simulation toolset to handle rigid-body, soft-body, and more physically grounded interactions.

One recurring reaction to the video has been confusion: viewers assume the imagery is AI-generated. According to Evercoast, that couldn’t be further from the truth. Every stunt, every swing, every fall was physically performed and captured in real space. What makes it feel synthetic is the freedom volumetric capture affords. You aren’t limited by the camera’s composition.
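Neither Evercoast nor WildCapture has published its proxy pipeline, but the general idea of collapsing a dense splat cloud into a lightweight previs stand-in can be sketched with a simple voxel-grid downsample. The function name and voxel size below are purely illustrative, not anything from the production tooling.

```python
from collections import defaultdict

def proxy_downsample(points, voxel=0.05):
    """Collapse a dense splat point set to one centroid per voxel cell.

    points: iterable of (x, y, z) positions in scene units.
    voxel:  edge length of the grid cell; larger values -> lighter proxies.
    Returns a much smaller list of centroids, usable as a stand-in
    point cache for layout and previs work.
    """
    cells = defaultdict(list)
    for p in points:
        # Bucket each point by its integer voxel coordinate.
        key = tuple(int(c // voxel) for c in p)
        cells[key].append(p)
    # One representative point (the mean) per occupied cell.
    return [
        tuple(sum(c[i] for c in pts) / len(pts) for i in range(3))
        for pts in cells.values()
    ]

# Two points in the same 5 cm cell collapse; the distant one survives.
dense = [(0.00, 0.00, 0.00), (0.01, 0.01, 0.00), (1.00, 1.00, 1.00)]
proxy = proxy_downsample(dense, voxel=0.05)
```

A real pipeline would likely keep per-voxel color and density as well, but positions alone are enough for blocking cameras against a performance.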
You have free rein to explore, reposition cameras after the fact, break spatial continuity, and recombine performances in ways that 2D simply can’t.

In other words, radiance field technology isn’t replacing reality. It’s preserving everything.
Written by Michael Rubloff, Founder and Managing Editor of Radiancefields.com.

© 2025 Radiancefields.com. All rights reserved. |
A$AP Rocky’s music video for *Helicopter* represents a pioneering integration of radiance field technologies, particularly dynamic Gaussian splatting, into mainstream media production. The project, spearheaded by director Dan Strait and executed with the collaboration of teams like Evercoast, Grin Machine, WildCapture, and Fitsū.ai, marks one of the most ambitious real-world applications of volumetric performance capture in a major music release.

The video’s aesthetic, characterized by kinetic motion, surreal spatial transformations, and physically staged stunts, was made possible through 3D Gaussian splatting, a technique that captures and renders human movement in three-dimensional space with unprecedented flexibility. Unlike traditional filming, which relies on fixed camera perspectives and post-production compositing, the project leveraged volumetric data to preserve the authenticity of physical performances while enabling radical post-production manipulation. This approach allowed for scenes where dancers collide, bodies hang suspended in midair, and props dissolve into abstract forms, all rooted in real-world motion but recontextualized through digital workflows.

The decision to employ this technology was not driven by novelty alone but by a creative vision to achieve post-production freedom that conventional methods could not provide. Director Strait’s initial concept, developed in July 2025, aimed to capture human performance with a level of spatial fidelity that would allow dynamic repositioning of virtual cameras, breaking spatial continuity, and recomposing elements without losing the integrity of the original motion. This required a shift away from traditional filmmaking pipelines, which often struggle with complex visual effects (VFX) because of their reliance on 2D compositing and limited spatial data.
Chris Rutledge, the CG Supervisor at Grin Machine, highlighted that the team had been exploring volumetric capture for years, with Gaussian splatting emerging as a viable solution to enable this workflow. The technology’s maturation, particularly in tools like Houdini and OTOY’s OctaneRender, allowed for rapid previews of volumetric data and quick creative decisions on set. This process was further streamlined by Evercoast’s 56-camera RGB-D array, which captured performer movements in Los Angeles during the August 2025 shoot.

The raw data, amounting to over 10 terabytes, was later rendered into approximately 30 minutes of splatted footage, exported as PLY sequences totaling around one terabyte. This data served as the foundation for post-production, where teams used CG Nomads’ GSOPs for manipulation and sequencing, while OctaneRender facilitated relighting of the splats to achieve a more dimensional “3D video” look.

The workflow’s flexibility was further enhanced by Blender’s role in layout and previsualization, as well as WildCapture’s internal tools for ensuring temporal consistency. By converting splat sequences into lightweight proxy caches, the team could plan complex scenes efficiently, while primitive pose-estimation skeletons enabled collision setups and physically grounded interactions in Houdini’s simulation toolset.

The result is a video that blurs the line between reality and digital manipulation, with viewers often assuming the imagery was AI-generated. However, Evercoast emphasized that every stunt, swing, and fall was physically performed within a controlled capture volume. The surreal quality of the final product stems not from synthetic generation but from the freedom volumetric capture provides to recontextualize real-world motion.
For instance, scenes featuring wooden planks as helicopter blades or wire rigs for stunts were captured in their physical form but later removed, recomposed, or reimagined through the splatting workflow. This approach preserves the authenticity of human movement while allowing for creative liberties that 2D filming could not accommodate.

The project also builds on A$AP Rocky’s history with radiance field technologies, including his 2023 video for *Shittin’ Me*, which incorporated NeRFs (Neural Radiance Fields) and the GUI for Instant-NGP. However, *Helicopter* represents a significant step forward by leveraging dynamic Gaussian splatting for full-volume representation of human performances.

The technical execution required close collaboration between multiple stakeholders: Evercoast managed the physical capture setup, Grin Machine handled CG supervision and rendering, and WildCapture contributed tools for motion transfer and simulation. The integration of these elements into a cohesive workflow underscores the growing maturity of volumetric capture as a viable alternative to traditional VFX pipelines. For example, OctaneRender’s Houdini integration allowed the team to relight splats dynamically, introducing shadowing and depth that enhanced the video’s three-dimensionality. Additionally, Evercoast’s web player enabled previews of fully rendered splats, facilitating iterative adjustments without the need for heavy downstream processing. This workflow mirrors simulation-based production more closely than traditional filming, where decisions are often locked in during the initial shoot.

The project’s success also highlights the potential of Gaussian splatting to redefine how motion is captured and manipulated. Unlike standard NeRF pipelines, which are typically built around static scenes, dynamic Gaussian splatting excels at capturing time-varying data, making it particularly suited to performance-based content.
The ability to preserve spatial relationships while allowing post-hoc repositioning of virtual cameras opens new creative possibilities for filmmakers and artists. Furthermore, the project’s scale, capturing over 10 terabytes of data across multiple performances, demonstrates the feasibility of applying these techniques to large-scale productions. While challenges remain, such as the computational demands of processing and rendering vast datasets, *Helicopter* serves as a case study in how emerging technologies can be harnessed to achieve artistic visions previously deemed impractical.

The video’s reception has been marked by both admiration for its technical achievements and confusion about its production methods. Many viewers, unfamiliar with volumetric capture, assumed the footage was generated by artificial intelligence because of its seamless integration of physical and digital elements. This reaction underscores the transformative potential of Gaussian splatting: it does not replace reality but enhances its representation, offering a new lens through which to experience motion and space. By preserving the raw data of human performance while enabling endless recontextualization, the project challenges conventional notions of what constitutes a “real” visual experience.

In this way, *Helicopter* not only showcases the capabilities of modern radiance field technologies but also signals a shift in how media is created and consumed. As the industry continues to adopt tools like SplatTransform, SuperSplat, and NVIDIA’s AlpaSim, projects like this will likely become more common, pushing the boundaries of what is possible in visual storytelling. For now, *Helicopter* stands as a testament to the power of interdisciplinary collaboration and technological innovation in redefining creative expression. |