This is a short breakdown of the Emmy-nominated “Notorious B.I.G. – Sky’s The Limit: A VR Concert Experience”, which was produced during the fall of 2022 as a VR experience for Meta on their Horizon Worlds platform, and for 2D playback on Facebook. The main production company was Hyperreal, with Metamo Industries and Hi From The Future on the main vendor side.
My involvement was on the environments and output side, and not with the mocap and animation of the digital character assets, so this breakdown will not delve into those areas.
Due to a demanding schedule, our pre-production phase faced significant time constraints, leaving us with little time for storyboarding and in-depth R&D. Despite these challenges, we successfully completed the project within a remarkably tight timeline of just over eight weeks. This was thanks to the solid teams we had on this project, which spanned five continents and many time zones.

The concert was shown in regular 16:9 UHD video format on a Facebook page, and in VR180 format. The VR180 version was viewed in the Horizon Worlds app using an Oculus headset, with the projected concert giving viewers the impression of a large IMAX-style movie screen. Viewers were free to walk around the environment, which consisted of several floors of balconies overlooking the screen.

To accommodate two distinct client output formats, we employed two separate camera rigs for each shot, allowing us to effectively address the specific aspect ratio and framing requirements of each format. In the case of VR180, we encountered the challenge of minimizing camera movement during the downtown car chase to prevent any discomfort or motion sickness for viewers experiencing the content in-headset.
To tackle this issue, we decided to place the standard (non-VR180) renders on billboards throughout the city. This solution not only enabled us to integrate more dynamic and engaging camerawork, but also confined those renders to a fraction of the overall frame. By doing so, we struck a balance between maintaining visual appeal and ensuring a comfortable viewing experience for VR180 users.

The workflow to create the VR180 images entails splitting the main camera into 6 views (front, back, left, right, top, bottom) and then stitching the resulting images afterwards to create the final (hemisphere) image. This format presented further challenges due to differences in the world position of the procedural pedestrians and traffic when rendered through each of the 6 cameras. With a deadline looming, the only solution was to implement changes to the camera system so that each camera did not require a regeneration of the Unreal Level at the start of each render, ensuring consistency across all the cameras.
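As a rough illustration of that stitching step, here is a minimal, unoptimized Python sketch using NumPy. The face naming, axis conventions, and nearest-neighbor sampling are my assumptions for illustration, not the actual production pipeline: for each output pixel of the hemisphere, we find the view direction, pick the dominant cube face, and sample it.

```python
import numpy as np

# Illustrative sketch of a cube-face -> VR180 (equirectangular hemisphere)
# stitch. Face names and orientation conventions are assumptions.

FACES = ("front", "back", "left", "right", "top", "bottom")

def direction_to_face_uv(x, y, z):
    """Map a unit view direction to (face_name, u, v), with u, v in [0, 1]."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if az >= ax and az >= ay:     # forward/back dominant
        face, u, v = ("front", x / az, y / az) if z > 0 else ("back", -x / az, y / az)
    elif ax >= ay:                # left/right dominant
        face, u, v = ("right", -z / ax, y / ax) if x > 0 else ("left", z / ax, y / ax)
    else:                         # up/down dominant
        face, u, v = ("top", x / ay, -z / ay) if y > 0 else ("bottom", x / ay, z / ay)
    return face, (u + 1) / 2, (v + 1) / 2

def stitch_vr180(faces, out_w=1024, out_h=1024):
    """Resample six cube-face images into a 180x180-degree hemisphere.

    `faces` maps each name in FACES to an (H, W, 3) float image array.
    """
    out = np.zeros((out_h, out_w, 3), dtype=np.float32)
    for j in range(out_h):
        lat = (0.5 - (j + 0.5) / out_h) * np.pi      # +90 deg (top) .. -90 deg
        for i in range(out_w):
            lon = ((i + 0.5) / out_w - 0.5) * np.pi  # -90 deg (left) .. +90 deg
            # View direction: x right, y up, z forward into the hemisphere.
            x = np.cos(lat) * np.sin(lon)
            y = np.sin(lat)
            z = np.cos(lat) * np.cos(lon)
            face, u, v = direction_to_face_uv(x, y, z)
            img = faces[face]
            h, w = img.shape[:2]
            out[j, i] = img[min(int(v * h), h - 1), min(int(u * w), w - 1)]
    return out
```

A production stitch would be vectorized or done on the GPU with bilinear filtering; the per-pixel loop here is just to show the geometry. Note that with longitude limited to ±90°, the "back" face is effectively never sampled, which is exactly why VR180 renders are cheaper than full 360°.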

During pre-production, I used Google Earth to do a virtual location scout, both to get an idea of where the Brooklyn Bridge should sit in the overall framing behind the concert stage, and to see how the Manhattan skyline appears in real life from various vantage points on the Brooklyn side. These images were then used as visual reference for the digital matte painter, and helped us understand how much coverage was needed based on the proposed camera angles for the stage and the overall flow of the buildings. I took the resulting matte painting, sliced it into sections, and placed them onto cards in Unreal, so the sections could be rotated to follow the Manhattan coastline and give more dimension to the 2D paintings.
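To give a feel for the geometry of those card placements, here is a small hypothetical Python sketch. The polyline points and the even-spacing scheme are made up for illustration (the actual cards were placed in Unreal); it spaces N cards along a coastline polyline and yaws each one to follow its local segment direction:

```python
import math

def cards_along_polyline(points, n_cards):
    """Return (position, yaw_degrees) for n_cards evenly spaced by arc length
    along a 2D polyline of distinct (x, y) points."""
    # Cumulative arc length at each vertex.
    lengths = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        lengths.append(lengths[-1] + math.hypot(x1 - x0, y1 - y0))
    total = lengths[-1]

    cards = []
    for k in range(n_cards):
        s = total * (k + 0.5) / n_cards          # target arc-length position
        # Find the segment containing s, then interpolate within it.
        i = next(j for j in range(len(points) - 1) if lengths[j + 1] >= s)
        (x0, y0), (x1, y1) = points[i], points[i + 1]
        t = (s - lengths[i]) / (lengths[i + 1] - lengths[i])
        pos = (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        # Yaw the card to face along the segment direction.
        yaw = math.degrees(math.atan2(y1 - y0, x1 - x0))
        cards.append((pos, yaw))
    return cards
```

The same idea works whether the cards are placed by hand or by a small editor script: each flat slice inherits the rotation of its stretch of coastline, which is what breaks up the flatness of the 2D painting.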

For the concert portion, the Brooklyn and Manhattan sets were converted to UE 4.27 for compatibility with the Stagecraft LED wall, enabling real-time playback. Stagecraft allowed for in-camera integration of the virtual set with the live-action musicians and singers during their performances. The Biggie and Puffy digital characters were rendered out of Unreal Engine and composited afterwards in post-production.

For the ending sequence inside of Biggie’s apartment, the camera pans around him as he sits in a chair smoking a cigar. We wanted some nice, wispy smoke that conveyed the size and scale of the scene. Race Krehel from Metamo Industries ran some initial tests using a real-time smoke solution called Fluid Ninja, which worked really well, and we ended up using it for the final elements.


Throughout this project, we successfully managed a highly skilled and geographically diverse team, spanning five continents and even more time zones. Amazing work by everyone, and I’m honored that it has been nominated for an Emmy in “Outstanding Emerging Media Program”.

Posted: 09/2023