The Game Plan: Games Technology and Vehicle Development

Jon Lawson

How is computer games technology transforming the way that vehicles are developed? Richard Gotch goes on a quest to discover the single source of truth

When BMW reported in 2017 that it was combining virtual reality (VR) with additive manufacturing (3D printing) to develop an all-new way of assessing driver interface options, it was a stand-alone project. Today, the company has embraced real-time visualisation technology throughout its business, from design to production planning to sales. At the heart of each application is a graphics engine originally created for developers of computer games.

Known as real-time visualisation platforms, these powerful systems are now used by most major vehicle manufacturers across a growing range of 2D and 3D applications.

Doug Wolff, technical manager for Unreal Engine, the visualisation platform developed by Epic Games and chosen by BMW as a cornerstone of its visualisation strategy, believes this is just the beginning of the journey.

“Today, each application of our technology is generally standalone, part of a system developed for a specific purpose,” he explains. “The big step will be the introduction of a single, company-wide model that begins its life with the earliest styling concepts, develops through R&D and vehicle engineering before supporting manufacturing, vehicle personalisation and then marketing and sales.”

Wolff says a company’s first experience with VR is usually “a deep technical challenge that would munch through impossible amounts of resources if addressed using conventional approaches.” An example is the rapid acceleration in the adoption of advanced driver assistance systems.

Each new system needs a new user interface and, because it is safety critical, this must be thoroughly validated. Yet there is little prior knowledge to draw on, so there is no accepted starting point. On top of that, manufacturers are now working in a global market: is the right solution for Germany also the right solution for China? There may be 30 design options to test, with collaboration needed from specialists spread across the globe.

Mixed reality (MR) provides a fast, affordable way to find the answers. In BMW’s mixed reality laboratory, engineers sit in a physical vehicle buck that provides the tactile input while the variable components are viewed in VR using commercially available headsets. “The difference in cost compared with previous-generation visualisation systems is night and day,” adds Wolff. “All that is needed is a high-end PC, a gaming-spec graphics card and some off-the-shelf VR hardware. The cleverness is in the visualisation engine and the way the engineers use it.”

At Daimler, the ability of Unreal Engine to run on distributed, low-cost machines via the cloud is being used to let engineers share VR models anywhere in the world, using nothing more complex than a laptop and a VR headset. They can evaluate designs, annotate the models, adjust sizes and finishes, reposition elements and save files back to the central PDM (product data management) system, all in a multi-user immersive environment shared with their colleagues. The company says this allows design and engineering teams in different parts of the world to collaborate far more effectively than they could using traditional video conferencing or phone calls.
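To make that workflow concrete, the sketch below traces the round trip just described: check a model out of the PDM system, attach review annotations during the shared session, then check the result back in as a new revision. It is a minimal illustration only; the PdmClient and data types are hypothetical stand-ins, not Daimler's system or an Unreal Engine interface.

```cpp
// Minimal sketch of a check-out / annotate / check-in round trip.
// All names here are illustrative assumptions, not a real product API.
#include <iostream>
#include <string>
#include <vector>

struct Annotation {
    std::string author;
    std::string note;
    double x, y, z;          // position of the review marker on the model, in metres
};

struct VehicleModel {
    std::string partNumber;
    int revision;
    std::vector<Annotation> annotations;
};

// Hypothetical connection to the central product data management system.
class PdmClient {
public:
    VehicleModel checkOut(const std::string& partNumber) {
        std::cout << "Checked out " << partNumber << " from PDM\n";
        return VehicleModel{partNumber, 7, {}};
    }
    void checkIn(VehicleModel& model) {
        ++model.revision;    // every saved session becomes a new revision
        std::cout << "Checked in " << model.partNumber
                  << " as revision " << model.revision << "\n";
    }
};

int main() {
    PdmClient pdm;

    // 1. Pull the latest geometry and metadata from the central system.
    VehicleModel cockpit = pdm.checkOut("cockpit-assembly");

    // 2. Collaborators in the shared VR session attach comments directly to the model.
    cockpit.annotations.push_back({"engineer.stuttgart",
                                   "Increase clearance around display bezel", 0.42, 0.10, 0.95});
    cockpit.annotations.push_back({"engineer.beijing",
                                   "Check reflections on A-pillar trim", 0.55, 0.62, 1.10});

    // 3. Push the reviewed model back so it remains the single master copy.
    pdm.checkIn(cockpit);
}
```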

“Daimler has referred to this system as ‘a multiplayer online game for engineers,’ because the idea was to integrate single-click access to CAD data with the power of a multi-user games environment,” says Wolff, referring to Unreal’s heritage as the technology underpinning many of the world’s most impressive computer games and cinematic effects. “Their experience with the system has shown that engineers find it much easier to judge sizes and work together on problem solving if they can interact around a scalable, sectionable 3D model. They also found that the ability to hold ad-hoc collaborative sessions has increased productivity.”

Central Models

The key to many of the new-generation VR systems employed by transport engineers has been the ability to transfer data between the visualisation engine and CAD in real time, without pre-processing. As any of the CAD/CAE system developers will confirm, this is a difficult challenge that still requires significant work. Since 2015, when Epic Games launched its enterprise division to support users and developers outside the games world, this has been one of the priorities for the Unreal Engine team. The ultimate goal – one that Wolff says is approaching rapidly – is for the VR model to become what he calls ‘a single source of truth’: a digital twin that evolves continuously as the design process moves from styling to production.

“The direct link to CAD means the VR model can be employed at each stage of the vehicle’s product life without conflict, duplication or any of the risks that can plague parallel engineering,” he enthuses.
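The single-source-of-truth idea can be pictured as a simple change-propagation pattern: the CAD model owns the data and notifies every subscriber when a part changes, so the visualisation updates in place rather than working from an exported, pre-processed copy. The classes and names below are illustrative assumptions, not Unreal Engine's or any CAD vendor's actual API.

```cpp
// Minimal sketch of one authoritative model feeding a live visualisation.
#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

struct PartUpdate {
    std::string partId;
    std::string material;
    double thicknessMm;
};

// The authoritative model: every downstream consumer subscribes to it.
class CadModel {
public:
    void subscribe(std::function<void(const PartUpdate&)> listener) {
        listeners_.push_back(std::move(listener));
    }
    void updatePart(const PartUpdate& update) {
        parts_[update.partId] = update;        // change the master data once...
        for (auto& l : listeners_) l(update);  // ...and notify every consumer
    }
private:
    std::unordered_map<std::string, PartUpdate> parts_;
    std::vector<std::function<void(const PartUpdate&)>> listeners_;
};

// A stand-in for the real-time visualisation scene.
class VisualisationScene {
public:
    void attachTo(CadModel& model) {
        model.subscribe([](const PartUpdate& u) {
            std::cout << "Re-rendering " << u.partId << " in "
                      << u.material << " at " << u.thicknessMm << " mm\n";
        });
    }
};

int main() {
    CadModel master;
    VisualisationScene vrScene;
    vrScene.attachTo(master);

    // An engineer edits the part in CAD; the VR review sees it immediately,
    // with no exported duplicate to keep in step.
    master.updatePart({"door-inner-panel", "aluminium", 1.2});
    master.updatePart({"door-inner-panel", "steel", 0.8});
}
```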

The single model will begin its life as a tool for the styling team, then progress to the industrialisation stage in which designers work with engineers to translate the vision into an affordable, comfortable vehicle that will meet all the necessary regulations. Here, adds Wolff’s colleague Heiko Wenczel, director of industry management at Epic, clients are finding it much easier to resolve the often-conflicting requirements of each specialisation if they work together on a photo-realistic vehicle model.

Wenczel says this is a great example of how improved tools can create what he calls ‘a digital thread’; that is, an integrated view of the vehicle’s data throughout its lifecycle, across traditionally siloed functions.

“An aspect of my job that I particularly enjoy is hearing how our customers are becoming much more collaborative,” he states. “I’ve watched specialists in packaging, thermal, body-in-white and NVH (noise, vibration & harshness) using a VR model to work interactively together as they strive to find space for additional systems, eliminating many of the traditional delays that slow down decision making.”

At the other end of the vehicle’s journey from sketch to showroom, electrification is also one of the drivers for the introduction of VR into manufacturing planning. Using the VR vehicle model, production processes can be optimised in a virtual assembly hall before being set up in the real world. Operator training can then begin, even before the hardware is ready. Wenczel says there are a growing number of programmes in which Unreal Engine is being used to optimise plant operations, both for all-new facilities and for existing lines where space must be found to assemble electric and electrified vehicles alongside more traditional powertrain options.

Unlocking the HMI

Electrification is also the surprising driver behind the first introduction of a game engine into the car itself. When General Motors announced its new electric Hummer, the only supplier name-checked in the press release was Epic Games.

Epic’s Wenczel believes the strategy behind GM building the new Hummer’s HMI around games technology is a response to several longer-term industry trends. He accepts that it would be easy to look at the HMI and see a snappily designed, graphics-rich user interface that’s there to grab attention. States Wenczel: “That’s certainly part of why games technology was chosen. The designers love it because increasingly large screens provide vast, seamless surfaces for their creativity.” But he points out that the benefit GM chose to highlight is more strategic: an ability to quickly develop ways for new systems – in this case electrification – to communicate with the driver.

When you are deciding how to display information about state of charge and energy regeneration, or the location of hazards and the urgency of returning control to the driver, there are no decades of experience to call on. “Game engines provide new freedoms for designers searching for the most effective (and most engaging) ways to communicate this information and a fast, efficient way to build and test each option,” says Wenczel.

When an option is chosen, the traditional development path is for engineers to code the software, leading to an iterative process in which designers and ergonomists review the system and engineers revise the code. Games developers work without these silos, so the best platforms are structured to let designers evolve the end product in real time (design-driven development), quickly and efficiently, while engineers get on with their day jobs.
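That split can be illustrated with a small sketch: the engineering side exposes raw vehicle signals, while everything about how those signals are presented lives in designer-editable data that can be changed without touching the signal code. The names and the simple key/value 'theme' below are assumptions for illustration, not GM's HMI architecture or an Unreal Engine API.

```cpp
// Minimal sketch of design-driven development: signals from engineering,
// presentation from a designer-owned, reloadable asset.
#include <iostream>
#include <map>
#include <string>

// Engineering side: raw vehicle signals, unchanged between design iterations.
struct VehicleSignals {
    double stateOfCharge = 0.72;   // 0..1
    double regenPowerKw  = 35.0;
};

// Designer side: presentation parameters. In a real tool these would be loaded
// from an editable asset; they are hard-coded here to keep the sketch self-contained.
std::map<std::string, std::string> loadTheme() {
    return {
        {"soc.label",   "Charge"},
        {"soc.format",  "percent"},
        {"regen.label", "Regen"},
    };
}

void renderHmi(const VehicleSignals& signals,
               const std::map<std::string, std::string>& theme) {
    if (theme.at("soc.format") == "percent") {
        std::cout << theme.at("soc.label") << ": "
                  << static_cast<int>(signals.stateOfCharge * 100) << "%\n";
    }
    std::cout << theme.at("regen.label") << ": " << signals.regenPowerKw << " kW\n";
}

int main() {
    VehicleSignals signals;
    auto theme = loadTheme();   // designers iterate here; the signal code is untouched
    renderHmi(signals, theme);
}
```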

There are two further significant benefits of specifying a game engine for the HMI: it allows a very high degree of customisation, potentially down to the individual car and driver, and it helps to unlock the potential of over-the-air updates and cloud-based services. Wenczel says the technical roadmap for an HMI is surprisingly critical to the successful adoption not just of new vehicle technologies, but also of new revenue streams. “Like a mobile phone, it is the gateway to additional services and personalisation, but in a vehicle they can be so much more sophisticated,” he says. “How about an augmented reality experience that conveys interesting facts and stories about landmarks that are highlighted as you pass?”

What’s Next?

Returning to Doug Wolff, the technical manager is clear on what needs to be achieved. “To really transform the way that our industry works, we need to enable seamless implementation of the single source of truth concept,” he says. “Everything else we have talked about is pretty much here now. We can then begin to join up the different uses to fulfil the vision of enabling a digital twin of the vehicle that develops through the design and development process, supports flexible manufacturing, enables new levels of user experience, facilitates new revenue streams and even simplifies end-of-life recycling.”

The key to much of this is the ability to transfer data between the visualisation engine and specialist applications in real time, without pre-processing. Wolff says his company is not going to compete with the developers of specialist applications for specific activities. He sees the role of the visualisation specialist as complementing those systems by offering them seamless, two-way access to the central model and, should they choose to adopt it, real-time visualisation.

It’s why he is working closely with industry groups such as the Institute for Digital Engineering in the UK.

Wolff is keen to emphasise that the final piece of the jigsaw is the creativity of the vehicle manufacturers and their development partners. Areas such as cloud-based services are so new that no one is really sure how best to develop them commercially. “The smartphone model is a starting point, but it is far too restrictive,” he says. Then there is the growing need for AI training, for example generating the edge cases needed to calibrate advanced driver assistance systems. “All this capability is available now,” concludes Wolff; “the roadmap is the creativity and experience that will be brought to how it is applied.”