Image courtesy of Duality Robotics

Training robots: from animated films to virtual environments

Sébastien Lozé | March 4, 2021
Autonomous robots have served the industrial sector for many years, but largely as immobile, caged machines limited to highly specific tasks. Truly autonomous robots, drones, and vehicles—machines that can navigate a busy building, a desert landscape, or the open road, and do it as well as a human being can—have yet to be achieved. 

Many companies are working on attaining this standard, and are testing these machines in virtual environments where edge cases and anomalies can be rigorously tested alongside the norms, all without the machines causing danger to people, property, or themselves. But these tests require realistic virtual worlds running in real time, and a deftness with the duality of the physical and virtual.

This duality is at the core of Duality Robotics, informing its Falcon platform for robotics and AI development. Falcon, with Unreal Engine as its underpinning, features environments so realistic that machine learning networks can’t tell the difference between the synthesized and the real worlds. It shows so much promise, in fact, that the company recently received an Epic MegaGrant to further develop the Falcon platform.

Robotics meets animation

Duality Robotics was founded in 2018 by Mike Taylor, who led teams to build massive field robots at Caterpillar Inc. and is now Duality’s Chief Roboticist and Head of Solutions Engineering, and CEO Apurva Shah, a Visual Effects Supervisor for high-end animated films. Shah’s resume includes 12 years at Pixar Animation Studios, where he helped to solve the very data-heavy environment problems faced by the robotics industry today.

While Shah and Taylor might seem an unlikely duo, each brings a vital piece of the simulation puzzle: how to accurately simulate complex environmental scenarios, and how to make the scene work with real-time responses from sensors and machines. 
Image courtesy of Duality Robotics
The key, as it turns out, is the Universal Scene Description (USD) standard, a data format developed at Pixar to efficiently define extremely large, complex 3D models. “From the very beginning,” says Shah, “we knew we had to have a very strong foundation in terms of our data model.”

Shah uses the analogy of Pixar films like Toy Story, Cars, and Brave, released in 1995, 2006, and 2012 respectively, to illustrate the rationale behind adopting USD for robotics testing. A large part of Toy Story, he reminds us, took place in the relatively small world of a young boy’s bedroom.

“Fast forward to the Cars films with their landscapes and cities, and to Brave, where we had to simulate an entire forest,” says Shah. “Before USD, we just couldn't manage the complexity. It clogged up our pipelines.”

To explain how this challenge applies to robotics, Shah draws a parallel to the journey that robots are now taking. “For decades, we’ve had large robots operating inside cages. Their world is very simple,” he says. “Now, imagine you take that cage away, and you are instead moving around inside a busy warehouse, and it's complete chaos from the robot’s perspective. The complexity is not just two or four times more—it’s exponentially higher. So, how do you represent that efficiently? In order to solve a problem, you first have to be able to represent the problem.”

It was this parallel that led Shah and Taylor to adopt USD as the standard format for the enormous environmental data sets they use to create testing solutions for autonomous robots, vehicles, and drones. By using the USD format, Duality is able to create rich environments that run in real time, even with massive amounts of data running in and out of the simulation via sensors and machines. 
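
To make that concrete, here is a minimal sketch of how a large scene can be composed with Pixar's open-source USD Python bindings. The asset paths and prim names are illustrative, not taken from Falcon; the point is that USD references assets rather than duplicating their geometry, which is what keeps enormous environments tractable.

```python
# Minimal sketch: composing a warehouse scene with Pixar's USD Python API.
# Requires the open-source usd-core package (pip install usd-core).
# All prim paths and asset file names here are illustrative.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("warehouse.usda")

# A root transform for the environment.
world = UsdGeom.Xform.Define(stage, "/World")

# Reference in a (hypothetical) shelf asset rather than copying its
# geometry -- composition by reference is what keeps huge scenes tractable.
shelf = stage.DefinePrim("/World/Shelf_01")
shelf.GetReferences().AddReference("assets/shelf.usd")

# Instance the same asset many times; USD resolves each as a lightweight
# reference instead of duplicating the mesh data.
for i in range(2, 6):
    prim = stage.DefinePrim(f"/World/Shelf_{i:02d}")
    prim.GetReferences().AddReference("assets/shelf.usd")
    UsdGeom.XformCommonAPI(prim).SetTranslate((i * 3.0, 0.0, 0.0))

stage.GetRootLayer().Save()
```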
Image courtesy of Duality Robotics
This power is important to Duality’s high-tech customers like Honeywell Aerospace, which has a long history of leading next-gen aerial autonomous systems technologies. “A robust and modular 3D simulation environment, among other tools, is critical to efficiently develop next-generation solutions and enable algorithms and systems to be rapidly designed, tested, and validated,” says Jeff Radke, Engineering Director, Advanced Technology, Honeywell Aerospace.

Simulating a complex world

Duality’s clients, including Honeywell, usually have some experience with simulation before they contact the company. Taylor explains that many come to them for speed and versatility rather than growing an in-house solution, which tends to add program risk and schedule creep.

“Most engineers recognize the need to have an autonomy simulator, but spinning up a solid simulator can be a challenge. Teams often lack the staff to get started or to make the best use of what they have,” he says. “A common hurdle is that they don't have anyone that's familiar with the gaming and cinema side of things.”

Shah further explains that there are a number of off-the-shelf simulation products that each do one specific thing, and do it well—for example, a wind tunnel simulator that does that job, and only that—but when the client tries to combine these tools with their own custom simulations, the result is incomplete, inaccurate, or too slow for real-time playback.

The combination of several factors—Shah’s experience in producing realistic environments, Taylor’s deep knowledge of robotics, their use of the USD file format for scale, interoperability, and efficiency, and Unreal Engine’s graphics and simulation capabilities—has made it possible for Duality to produce simulation solutions that set a high bar in this competitive industry in terms of speed and realism.
Image courtesy of Duality Robotics
“Solutions in this space come with a lot of promised performance, but in practice, they often deliver only a fraction of that promise,” Taylor says. “Beyond the core simulator, field robotics also requires simulated environments that represent the variety and the chaos of the real world.”

Shah adds that part of Duality’s value-add is Falcon’s ability to add real-life camera effects to the sensor stream, creating visuals that closely match what the machine will actually see in the real world. “The lens and camera model, the depth of field, bokeh effects—all of that gets built into the picture before the vision system sees the synthetic output of an electro-optical sensor,” he says.
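
Falcon's camera model is proprietary, but the general idea of a post-render lens pass can be sketched in a few lines. The depth-of-field blend below is a deliberately simplified placeholder: blur each pixel in proportion to its distance from an assumed focal plane before the frame reaches the perception stack.

```python
# Rough sketch of a post-render depth-of-field pass. The blending rule
# and the constants below are illustrative placeholders, not Duality's
# actual camera model.
import cv2
import numpy as np

def apply_depth_of_field(rgb, depth, focus_m=5.0, max_blur_px=15):
    """rgb: HxWx3 uint8 frame; depth: HxW float32 depth in meters.

    max_blur_px must be odd (OpenCV Gaussian kernel requirement).
    """
    blurred = cv2.GaussianBlur(rgb, (max_blur_px, max_blur_px), 0)
    # Circle-of-confusion proxy: 0 at the focal plane, 1 far from it.
    coc = np.clip(np.abs(depth - focus_m) / focus_m, 0.0, 1.0)
    alpha = coc[..., None]  # broadcast the blend weight over color channels
    out = (1.0 - alpha) * rgb.astype(np.float32) + alpha * blurred.astype(np.float32)
    return out.astype(np.uint8)
```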
Image courtesy of Duality Robotics

Proving simulation with high-quality data

Simulation realism and speed are critical to companies building autonomous systems for the real world. A key part of the effort in developing Honeywell’s next-gen autonomy solutions, for example, is validating that the systems meet Honeywell’s own rigorous quality standards. For such advanced systems, it is important to augment the process of collecting real-world data across possible scenarios and environments with data from a high-fidelity 3D photorealistic simulator.

The rapid advancements in AI and machine learning methods, which are often at the core of these systems, only further amplify the need for high-quality, diverse synthetic data. Properly deploying the right simulation tools enables engineering teams to refine their algorithms iteratively and incrementally in the lab, improving development speed without compromising quality.
Image courtesy of Duality Robotics
“Duality’s simulator has provided us a framework to easily integrate our algorithms and quickly generate quality data for different scenarios, thereby reducing the cost, time, and risks in data collection and testing,” says Thandava Edara, Offering Director for Alternative Navigation and Autonomy Group, Honeywell Aerospace.

Another of Duality’s recent projects was a digital twin of a self-driving long hauler. The customer already had a truck kitted out with sensors and actuators for autonomous driving, so Duality started there—they gathered field data via physical sensors, riding along with the truck and learning the exact characteristics of that specific big rig while it performed its driving tasks. 

Then came the digital twin. “Outside of Unreal Engine, we built a simulation of their custom actuators, and within Unreal Engine, we built a multi-body dynamics model of the big rig, with a separate tractor and trailer constrained together,” explains Taylor. “Then we tuned the dynamics of this combined system to match the field data.”

The result was a solution where the truck’s autonomous software was able to command this combined simulation to drive the 3D model of the truck inside Unreal Engine. “Unreal’s architecture allowed us to build a unified simulation that was so accurate, we were able to achieve centimeter-level correspondence between the simulated truck and real-world data, all during dynamic maneuvers at highway speeds,” says Taylor. 
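
Duality hasn't published the details of that tuning step, but in outline it is a parameter-fitting problem: adjust the simulated dynamics until the model's trajectory reproduces the logged field data. The sketch below shows the shape of such a fit, with a toy two-parameter longitudinal model standing in for the full tractor-trailer simulation.

```python
# Sketch of the tuning step: fit dynamics parameters so the simulated
# trajectory matches logged field data. The two-parameter model here is
# a stand-in for the full tractor-trailer simulation.
import numpy as np
from scipy.optimize import least_squares

def simulate(params, throttle, dt=0.01):
    """Toy longitudinal model: gain maps throttle to accel, drag resists."""
    gain, drag = params
    v = np.zeros(len(throttle))
    for k in range(1, len(throttle)):
        accel = gain * throttle[k - 1] - drag * v[k - 1] ** 2
        v[k] = v[k - 1] + accel * dt
    return v

def residuals(params, throttle, v_measured):
    return simulate(params, throttle) - v_measured

# In practice, throttle_log and speed_log would come from the ride-along
# sensor recordings; here they are synthesized for illustration.
throttle_log = np.clip(np.linspace(0, 1, 2000), 0, 0.8)
speed_log = simulate([3.2, 0.015], throttle_log) + np.random.normal(0, 0.05, 2000)

fit = least_squares(residuals, x0=[1.0, 0.01], args=(throttle_log, speed_log))
print("fitted gain, drag:", fit.x)
```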
Image courtesy of Duality Robotics
In the same way that an accurate simulation within a realistic virtual environment can validate a physical system, it can also expose flaws in physical devices or data. Shah and Taylor recount a time when Duality was tasked with creating a virtual practice environment for an autonomous sidewalk robot. Within the simulation, the sidewalk robot consistently misinterpreted certain types of shadows as objects.

While this might at first have seemed like a flaw in the simulation, it turned out that the real robot was doing exactly the same thing when it encountered those types of shadows—the simulator had mimicked the robot’s sensor so accurately that its perception system made the same mistakes. “Not only the same type of mistakes, but the same nature of mistakes,” points out Taylor. “The behavior of virtual and real machines was exactly the same, which is crucial.”

Unreal Engine as a platform

Part of Duality’s challenge in adopting Unreal Engine as a base tool for its simulation solutions has been overcoming misconceptions that clients sometimes have about using a game engine for such exacting technology. “Some customers think that game engines can’t be deterministic, but we can quickly show that they can,” says Taylor. “We also hear that they’ve tried using a game engine before, and it didn’t work for them. They’re also concerned about accurate behavior of people, cars, and foliage. These are valid concerns and we are happy to help them understand exactly how capable Unreal Engine can be and how we can use it to solve their problems.”
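
The determinism point is worth unpacking. What makes a simulation replayable is a fixed timestep and seeded randomness, so state never depends on frame rate or wall-clock time. The toy loop below (plain Python, not Unreal Engine code) illustrates the property clients worry a game engine lacks.

```python
# Minimal illustration of determinism: with a fixed timestep and a
# seeded random source, two runs of the same simulation produce
# identical state, step for step.
import random

def run_sim(seed, steps=1000, dt=1.0 / 60.0):
    rng = random.Random(seed)           # seeded, not wall-clock, randomness
    x, v = 0.0, 0.0
    for _ in range(steps):
        force = rng.uniform(-1.0, 1.0)  # e.g. sensor noise or wind gusts
        v += force * dt                 # fixed dt: no frame-rate dependence
        x += v * dt
    return x, v

assert run_sim(42) == run_sim(42)  # identical replays every time
```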

Duality is able to put such concerns to rest through targeted, detailed demonstrations in the machine’s domain, where clients are able to see their specific problem addressed in real time. “The proof is in the pudding,” Taylor observes. “We don’t just tell them, we show them.”
Image courtesy of Honeywell
Duality chose Unreal Engine as the platform for Falcon for a number of reasons. “The fidelity of the world is very important,” says Shah. “We actually model anything that's more than half a centimeter wide. Depending on the angle and distance, a sensor could read a small anomaly differently when it’s represented by a texture instead of a model. Unreal Engine retains that fidelity in its graphics.”

Another advantage is the ease with which Duality can build simulations that engineers can manage on their own, and even build on. For example, a Duality system can include attributes exposed by the USD file format as tunable parameters, so an engineer can change lens information on the fly without having to know how to use Unreal Editor.
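
As an illustration of that workflow, here is roughly what adjusting an exposed lens parameter through USD might look like, using the same pxr bindings as above. The prim path and attribute name are hypothetical.

```python
# Sketch of tuning an exposed parameter through USD rather than the
# editor: open the stage, find the sensor prim, and change a lens
# attribute. The prim path and attribute name are hypothetical.
from pxr import Usd

stage = Usd.Stage.Open("warehouse.usda")
camera = stage.GetPrimAtPath("/World/Robot/FrontCamera")

focal = camera.GetAttribute("focalLength")
print("current focal length:", focal.Get())

focal.Set(35.0)                 # retune the lens without touching the editor
stage.GetRootLayer().Save()
```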

But Shah’s favorite thing about Unreal Engine is its architecture, which includes a powerful visual scripting system called Blueprints. “It’s like a 3D operating system,” he says. “It's a great foundation with C++, Blueprints, and a very robust editor in which you can build levels and address these scenarios. It's the whole ecosystem, and the architecture of that ecosystem, that matters to us.”

Duality’s API offers native ROS integration out of the box, and engineers can even code directly in Python, which Shah calls the “lingua franca” of machine learning. “If you already have an autonomous stack that you've built, you have to do minimal work to integrate it,” he says.
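
For a sense of what that integration surface could look like, here is a minimal ROS node in Python that consumes a simulated camera stream. The topic name is an assumption for illustration; in practice it would come from the simulator's configuration.

```python
# Minimal sketch of consuming a simulated sensor stream over ROS.
# The topic name is an assumption, not Falcon's actual interface.
import rospy
from sensor_msgs.msg import Image

def on_frame(msg):
    # Hand each synthetic camera frame to the existing perception stack.
    rospy.loginfo("frame %dx%d, encoding=%s", msg.width, msg.height, msg.encoding)

rospy.init_node("falcon_consumer")
rospy.Subscriber("/sim/camera/image_raw", Image, on_frame)
rospy.spin()
```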

The Duality team expects to be working on their autonomy simulator for some time to come, and to keep using Unreal Engine to further that goal.
“The virtual and the physical—how can we blur the line between these two things?” Shah asks. “That's the duality that we are working on. This isn’t a two-year problem—we see this as a 10- or 15-year problem. And Unreal Engine provides the kind of fidelity and scalability that we need to do it.”
