The traumatic economic impact of the pandemic has made remote production a necessity. 5th Kind has thrown itself into finding ways for our award-winning tools to support getting people back to work. Through an R&D partnership, we demonstrated how to create a virtually connected set that meets COVID compliance standards and keeps all the necessary crew and studio staff interactively engaged with the production remotely.
Collaborating on the Ripple Effect film through the Entertainment Technology Center at USC, we used our CORE platform’s RESTful APIs and event framework to integrate with several other solutions and become the orchestration platform serving each phase of production, most notably for live production operations.
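As a rough illustration of that integration pattern, a set-side tool might push a newly captured take and its metadata to an orchestration platform over REST. The endpoint, payload fields, and token below are hypothetical placeholders, not CORE’s published API:

```python
# Hypothetical sketch of a set-side tool registering a take with an
# orchestration platform over REST. Endpoint paths, field names, and the
# token are placeholders -- not CORE's actual API.
import requests

API_BASE = "https://core.example.com/api/v1"   # placeholder base URL
API_TOKEN = "REPLACE_WITH_TOKEN"               # placeholder credential

def register_take(scene: str, take: int, camera: str, file_path: str) -> str:
    """Register a take and its on-set metadata; return the new asset's ID."""
    payload = {
        "scene": scene,
        "take": take,
        "camera": camera,
        "source_path": file_path,
        "tags": ["on-set", "live-capture"],
    }
    resp = requests.post(
        f"{API_BASE}/assets",
        json=payload,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["asset_id"]

if __name__ == "__main__":
    asset_id = register_take("42A", 3, "A-cam", "/mnt/card01/A042C003.mov")
    print("Registered asset:", asset_id)
```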
On Set Ecosystem
On set, CORE became essential for communicating with off-set personnel, group decision making, and data sharing between products. The on-set ecosystem was a modern Virtual Production comprising a number of technical systems, from cameras, lights, and script notes to LED walls and witness cams.
On a typical set, these systems exist independently of one another. Dozens of people would be on set watching the takes and providing real-time feedback for their departments. A program like CORE would serve both upstream and downstream in the pipeline to bring in pre-production assets for reference and use, as well as to ingest all the files and metadata in real time:
- Script Notes
- Camera/Lens Metadata
- Circle takes and directors’ notes
- BTS (behind-the-scenes) files
- Etc.
On Set Pipeline
On Ripple Effect, we adapted parts of this pipeline to ingest files and data in real time as they were created on set. This allowed us to make files instantly reviewable and shareable, as well as to create editorial deliverables.
Real-Time Remote Production
Using C4 IDs to track the files throughout the workflow, CORE integrated the live streams from Teradek, on-set metadata from ScriptE, and dailies deliverables from Colorfront. For the live streams, Teradek’s feeds piped directly into our CORE Live feature, in which the playback camera(s) and witness cam were available as long as the cameras were live. This integrated product ecosystem allowed us to create a near-real-time pipeline that combined both the live feeds and the color-corrected dailies with a large array of on-set metadata.
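C4 IDs are content-derived identifiers, so the same bytes always resolve to the same ID no matter where a file lives or what it is named. The sketch below shows only the underlying idea, hashing a file’s contents with SHA-512; the full C4 spec additionally encodes the digest in a specific base58 form with a “c4” prefix, which is omitted here for brevity.

```python
# Minimal illustration of content-derived file identification in the spirit
# of C4 IDs: the ID depends only on the bytes, not the filename or location.
# The actual C4 spec base58-encodes the SHA-512 digest with a "c4" prefix;
# this sketch stops at the raw hex digest.
import hashlib

def content_id(path: str, chunk_size: int = 1 << 20) -> str:
    """Return a stable identifier derived solely from the file's contents."""
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# The same take copied to on-set storage, the dailies system, and the archive
# yields the same ID, which is what lets metadata follow the file around.
```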
CORE Live streams the video village experience to off-set crew and executives. The witness cam on the right shows what’s happening on set. Ten people were watching the feed for this take.
Three data elements were also created as a part of the feed:
- A chat section to capture team comments and conversation during the shoot.
- A “tag” section to pin timecoded notes and input directly onto shots as they happen (a sketch of such a tag record follows this list). These tags will be captured with the recorded files in a future version.
- An informational feed section that will eventually be integrated with ScriptE to give viewers the most detailed information on takes, such as shot and take numbers, who’s in it, etc.
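As a rough sketch of what a timecoded tag might carry (the field names are illustrative, not CORE’s schema):

```python
# Illustrative shape of a timecoded tag pinned to a live shot. Field names
# are hypothetical -- they show the kind of data captured, not CORE's schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ShotTag:
    stream_id: str    # which live feed the tag belongs to
    timecode: str     # e.g. "01:04:23:12" at the moment of tagging
    author: str       # who placed the tag (director, DP, department head, ...)
    note: str         # free-text comment
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

tag = ShotTag("a-cam-live", "01:04:23:12", "co-director", "Circle this take")
```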
CORE Live’s Feed Proved Invaluable
- The COVID safety officer was able to monitor set from her computer in another room.
- The Co-Directors could note their circle takes or other thoughts as they went along.
- Producers not required on set could monitor both the video village and set activity and give feedback to crew members as needed.
- Department heads or team members not required on set, or stationed in other areas safely distanced from set, could monitor their teams and give feedback in real time.
- Studio execs could tap into the stream at their convenience.
Capturing the Metadata
Everything streamed live was recorded in 5-minute intervals directly into CORE.
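We won’t speak to CORE’s internal recorder here, but chunking a live feed into five-minute files is a common pattern; a generic sketch using ffmpeg’s segment muxer (called from Python, with a placeholder stream URL) might look like this:

```python
# Generic sketch of chunking an incoming stream into 5-minute files using
# ffmpeg's segment muxer. The stream URL and output pattern are placeholders;
# this shows the general pattern, not how CORE's recorder is implemented.
import subprocess

STREAM_URL = "rtmp://example.com/live/a-cam"   # placeholder live feed
OUTPUT_PATTERN = "a-cam_%03d.mp4"              # a-cam_000.mp4, a-cam_001.mp4, ...

subprocess.run(
    [
        "ffmpeg",
        "-i", STREAM_URL,
        "-c", "copy",              # no re-encode, just remux the live feed
        "-f", "segment",
        "-segment_time", "300",    # start a new file every 300 seconds
        "-reset_timestamps", "1",  # each segment starts at timecode zero
        OUTPUT_PATTERN,
    ],
    check=True,
)
```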
In addition to the live stream, integrating CORE with other products in the ecosystem let us create a more holistic data environment for assets, providing the teams working downstream in the pipeline with all the data they’d need when working with those files.
Downstream Pipeline and Data Sharing
Following the more traditional pipeline, the camera takes coming off Teradek were transferred via SohoNet and processed by Colorfront into the high-resolution dailies files. The dailies were uploaded into CORE each night, converted into proxies, and distributed widely and securely via CORE to all the necessary parties. Each dailies file also contained color, camera, ALE, and take information sourced from the script notes.
By integrating CORE with Colorfront and ScriptE and enabling our Avid ALE importer, we ensured each asset contained all the essential information required for editing and searching the files. The data from CORE was synced to DNAfabric and eventually used by the post teams in Avid and Adobe, processed through Pulse, and sent to Technicolor for final color correction. Post file proxies were then uploaded back into CORE for review and final approvals.
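For context, Avid Log Exchange (ALE) files are tab-delimited text with “Heading”, “Column”, and “Data” sections, one data row per clip. A simplified reader like the one below shows the kind of per-clip metadata that can be attached to each asset; it is a generic sketch, not 5th Kind’s actual importer.

```python
# Simplified reader for Avid Log Exchange (ALE) files. ALE is tab-delimited
# with a "Heading" section, a "Column" row of field names, and "Data" rows,
# one per clip. This generic sketch is not 5th Kind's actual ALE importer.

def read_ale(path: str) -> list[dict]:
    """Return one dict of metadata per clip listed in the ALE file."""
    with open(path, newline="", encoding="utf-8") as f:
        lines = [line.rstrip("\n") for line in f]

    col_idx = lines.index("Column")     # line before the column-name row
    data_idx = lines.index("Data")      # line before the per-clip rows

    columns = lines[col_idx + 1].split("\t")
    clips = []
    for row in lines[data_idx + 1:]:
        if not row.strip():
            continue
        clips.append(dict(zip(columns, row.split("\t"))))
    return clips

# Each clip dict can then be merged with script notes and color metadata
# before the combined record is indexed for search.
```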
Metadata & Searchability
The value of the metadata captured in the production pipeline is immeasurable. For the directors and post personnel, files could be sourced by specific information such as a camera lens, take number, or shoot day. Any of the data imported from other systems became a searchable value.
This kind of information not only increases asset visibility and helps accelerate processes; it’s also essential for maintaining studio libraries and increasing the reuse value of assets to reduce overall costs.
Through Ripple Effect, our 5th Kind team has created a replicable system for studios and independent productions to have their own virtually connected sets. Virtually connected sets support COVID safety protocols and mitigate risk, improve overall production visibility, and open up a range of automation capabilities. As a result, the production pipeline is faster and more efficient than ever before, and interoperable, well-leveraged metadata has become a requirement for making it work. The pandemic has changed the way we work forever. Yet limitations breed innovation, and this year has pushed innovation to its limits.