What are broadcast cinematics?

Courtesy of KéexFrame
In this Real-Time Explainer, we’ll explore the brave new world of broadcast cinematics. We’ll see how traditional pre-rendered motion graphics and real-time broadcast graphics are evolving and converging into this new genre. We’ll look at the benefits of broadcast cinematics, and check out some examples of their use. And we’ll explain why Unreal Engine is a good choice for creating them, and walk you through the process of doing so. So let’s get started by finding out exactly what we mean by broadcast cinematics.
 

What are broadcast cinematics?

When we’re watching live TV broadcasts, the show’s graphic elements play a huge part in the overall experience. Typically, these come in two categories: pre-rendered animations and real-time graphics.

The show openers, bumpers, and promos usually consist of pre-rendered animations that might take days or even weeks to create, with each frame taking minutes or even hours to render, and each second of footage requiring anywhere from 25 to 60 frames, depending on the broadcast format. These high-quality cinematics often feature advanced rendering techniques like ray tracing, motion blur, and global illumination, just as would be used in films.

In many news and sports programs, the graphics for the actual show content are updated at the last minute or even in real time, based on incoming data such as sports scores, election results, or weather forecasts. Until very recently, these graphics have been of much lower quality than the pre-rendered sequences, due to the limitations of what could be achieved with real-time rendering.

Unreal Engine is a real-time 3D creation tool that has evolved to the point where it can now bridge this gap, offering both the real-time editing and rendering capabilities of traditional lower-quality broadcast graphics applications, and the high-end, advanced rendering features of offline renderers. The result is that studios are now able to deliver what is known as broadcast cinematics: film-quality motion graphics sequences that can be updated to reflect new data on the fly.
Courtesy of FOX Sports

What are the benefits of broadcast cinematics?

Broadcast cinematics have the potential to elevate the visual quality of a show’s interactive graphic elements, while at the same time enabling openers, bumpers, and promos to be updated for each episode, or even during episodes, keeping them fresh and interesting while still maintaining the brand identity.
Courtesy of Capacity Studios
This also means that a show can keep the same look and feel across its different graphic elements, and enables assets to be reused between them without extra rework. Moreover, the same team of artists can create all the elements, increasing productivity.
 

What can broadcast cinematics be used for?


Broadcast cinematics can be used in a variety of ways, depending on the level of interactivity and editorial context required. They’re a good fit for any production that needs to adapt to frequent editorial changes. Here are some examples:
  • A sports opener that changes based on the ranking of winning teams or information about the current game
  • An elections night bumper that shows the latest seats that have been called each time we come back from commercial
  • A weather report opener that reflects the current or forecast conditions in the region
Courtesy of FOX Sports
New York studio KéexFrame used Unreal Engine to create a real-time cinematic opener for the Qatar 2022 World Cup. You can get an in-depth review of the process they followed in this Spotlight.
 

Why is Unreal Engine a good choice for creating broadcast cinematics?

Unreal Engine is the world’s most open and advanced real-time 3D creation tool. It offers real-time ray tracing that enables you to render film-quality graphics in the same time it takes to display them, and features the Blueprint visual scripting system, which is great for authoring interactivity without the need to write a line of code.

The latest version of Unreal Engine, UE5, delivers groundbreaking feature sets like Lumen, a real-time global illumination and reflections system, and Nanite, a virtualized micropolygon geometry system that enables you to work with extremely high-resolution meshes. It’s also integrated with MetaHuman, a framework for creating fully rigged photorealistic digital humans in minutes. Together, these features bring an unprecedented level of quality to real-time broadcast graphics.
Courtesy of KéexFrame
Unreal Engine is free to download, and free to use for creating television content like motion graphics and broadcast cinematics. It comes fully loaded and production-ready out of the box, with every feature and full source code access included. There are also hundreds of hours of free tutorials and other learning resources to get you started. If you’re familiar with 3D DCC applications, you’ll find that a lot of your skills are transferable.
 

How do you make broadcast cinematics in Unreal Engine?

The process of creating a broadcast cinematic will be similar to the way you plan and execute your branding graphics (opener, bumpers, promos, etc.) today, but with extra steps for deciding on the level of interactivity or editability you want to have, and then actually implementing it. Currently, the timeframe for creating branding graphics with traditional tools typically ranges from two to five weeks, depending on complexity. Creating a broadcast cinematic takes a similar amount of time.

Here are the steps:
  • Concept and ideation – This is where the producers and creatives of the show discuss the core idea, and ensure that it will communicate the intended message to the audience.
  • Mood boarding – In this step, you’ll find or create visual references that convey the tone of the message and the look of the show. Some places to look for inspiration include ArtStation, Vimeo, and any of the many AI image generators that are available (use Google or another search engine to find one).
Courtesy of KéexFrame
  • Animatic – This is where the main camera angles and pace of editing are created. Using Sequencer, Unreal Engine’s built-in multi-track nonlinear editor, the main shots are defined to set the pace and tone for the motion and overall intent. This is often the stage with the closest participation from production, sometimes with live sessions where changes are made directly in the engine. At this stage, you’ll also define and discuss the editable capabilities of each shot, ready for the Blueprint team to start making tests; a minimal Sequencer scripting sketch follows this item.
Courtesy of KéexFrame
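If parts of this step are scripted, the animatic’s Level Sequence can be created programmatically rather than by hand. The snippet below is a minimal sketch using Unreal Engine’s Python editor scripting, assuming the Python Editor Script Plugin and Sequencer scripting plugins are enabled; the asset name, content path, frame rate, and duration are all hypothetical choices.

    import unreal

    # Create an empty Level Sequence asset for the animatic pass.
    # The asset name and /Game/Cinematics path are placeholders.
    asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
    sequence = asset_tools.create_asset(
        asset_name="SEQ_Opener_Animatic",
        package_path="/Game/Cinematics",
        asset_class=unreal.LevelSequence,
        factory=unreal.LevelSequenceFactoryNew(),
    )

    # Give it a 24 fps display rate and a 10-second playback range to block out.
    sequence.set_display_rate(unreal.FrameRate(numerator=24, denominator=1))
    sequence.set_playback_start(0)
    sequence.set_playback_end_seconds(10.0)

    # Open it in Sequencer so the team can lay out shots and camera cuts by hand.
    unreal.LevelSequenceEditorBlueprintLibrary.open_level_sequence(sequence)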
  • Detailing – In this step, the details of animation, lighting, and shading are all added; it’s at this stage that the largest number of team members are working on the project simultaneously. As well as offering robust toolsets for creating and editing animation, lighting, and shading, Unreal Engine supports both multi-user editing and collaborative version control through software such as Perforce.
  • Effects – In this final stage of visual development, effects like particles or volumetrics are added. Unreal Engine includes the Niagara VFX system, as well as Chaos, a physics system for simulating destruction, cloth, fluids, hair, and vehicle dynamics.
  • Interactivity and Blueprint development – This process happens in parallel with detailing and effects. Based on what was discussed for each shot during the animatic stage, this is where the Blueprint team creates the “hooks” that make the graphics editable, for example by replacing textures, live videos, or even 3D objects. In addition, interfaces for controlling the changes are created. Depending on the requirements for operation, these can either be widgets manipulated directly in Unreal Engine, or web-based remote control interfaces accessed from a browser on a tablet or laptop (see the first sketch after this list).
  • Rendering setup – Rendering details are then defined per shot, using Console Variables (CVars) to balance performance with quality where needed (see the second sketch after this list).
  • Testing, rehearsals, and optimization – The last step before going to air is to test the graphic in the context of the show and tweak it to have the most efficient rendering times.
  • It’s showtime! – Prepare to dazzle your audiences.
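To make the remote control idea above more concrete, here is a minimal sketch of updating one exposed value from outside the engine over the Remote Control HTTP API. It assumes the Remote Control API plugin is enabled on its default port (30010), and the object path, property name, and score value are hypothetical placeholders for whatever hooks your Blueprint team actually exposes.

    import requests

    UE_HOST = "http://localhost:30010"

    # Hypothetical actor and property exposed by the graphics project.
    payload = {
        "objectPath": "/Game/Broadcast/Opener.Opener:PersistentLevel.BP_Scoreboard_C_1",
        "propertyName": "HomeTeamScore",
        "propertyValue": {"HomeTeamScore": 3},
        "access": "WRITE_ACCESS",
    }

    # PUT /remote/object/property writes a property on a live object in the
    # running editor or packaged application.
    response = requests.put(f"{UE_HOST}/remote/object/property", json=payload)
    response.raise_for_status()
    print("Score updated:", response.status_code)

In production, this kind of endpoint is typically wrapped in a simple web page or control panel so an operator can push updates from a browser without touching the engine directly.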
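Similarly, for the rendering setup step, per-shot console variables can be applied from the editor’s Python environment. The CVars below are real console variables, but the specific values are illustrative only; each shot gets its own balance between image quality and frame budget.

    import unreal

    # Illustrative per-shot quality settings; tune these per shot and per rig.
    per_shot_cvars = [
        "r.ScreenPercentage 100",      # render at full resolution for this shot
        "r.MotionBlurQuality 4",       # highest motion blur quality
        "r.Shadow.MaxResolution 2048", # cap shadow map resolution to protect performance
    ]

    for command in per_shot_cvars:
        unreal.SystemLibrary.execute_console_command(None, command)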

So there you have it! We hope this Explainer has given you some understanding of the nature of broadcast cinematics, explained their benefits, shown some examples of their use, and left you with an idea of how to go about creating them.

For more inspiration, insights, and information on what Unreal Engine can bring to the industry, visit our Broadcast & Live Events hub, check out our recent roundup of related news and case studies, and download our free, comprehensive field guide.
