Features

Some of the coolest ways Unreal Engine is being used in and outside of gaming

Electric vehicles and performance-capture and Star Wars, oh my!

(Image: MetaHuman facial capture)

It’s no secret that Epic Games’ Unreal Engine is ubiquitous. Between its powerful technical features and ease of use, it’s become a very appealing tool for game developers. It’s why it’s used in everything from Fortnite and Final Fantasy VII Rebirth to the Canadian-made Gears of War and Mass Effect series.

But in recent years, we’ve seen more creators talk about incorporating Unreal in their work outside of games. One of the biggest examples of this is how the engine was used in the production of the popular The Mandalorian Disney+ series. That’s honestly why I found Unreal Fest Orlando 2025 so interesting. As a hardcore gamer, I was naturally most excited to see an Unreal Engine 5 (UE5) tech demo for The Witcher 4 and then talk to some of the developers about it, but as a general appreciator of tech, I really enjoyed seeing how the engine was being used in ways that weren’t strictly gaming-related.

Read on for highlights of some of the coolest ones I saw, as well as one that has me less enthused. While this is naturally a good showcase for Epic and its engine, I also think it’s a worthwhile topic because it further illustrates the impact that video games can have on other sectors. Games aren’t just a form of entertainment; they drive innovation across the board.

Cars, cars, and more cars

(Image: Rivian at Unreal Fest)

While the other MobileSyrup Brad is much more of a car guy, I do appreciate a sweet ride, and Unreal Fest certainly had a couple of them. First, there was the Rivian R1T Adventure, an electric vehicle (EV) that’s perfect for off-road use. Notably, this EV features a dynamic display that’s powered by Unreal. Given that it’s literally all about “adventure,” it seems fitting that the infotainment system would have a made-in-Unreal experience that transports you to different terrains, from mountains to deserts, as well as an updated camp mode featuring a Rivian tent and flickering campfire.

I also got to see how Porsche is using Unreal in different ways during a panel with the company’s MHP digital tech division. To start, creative product officer Robert Andersen and head of immersive experience Emma Schröder talked about creating renders of Porsche vehicles. They showed an example of Google’s AI-powered Veo 3 being used to generate a mock-up of a Porsche Macan Electric, but noted that it didn’t accurately represent the desired vehicle.

(Image: Porsche at Unreal Fest)

This led MHP to turn to Unreal Engine for its ability to render high-quality content in real-time. From there, it ended up having a variety of use cases:

  • Creating online renders, which can then be brought into stores as life-sized interactive digital models that don’t take up retail space
  • Videos to demonstrate the product
  • Personalized brochures and giveaways
  • Internal training tools

And because this is all digital, MHP can make any necessary updates without needing to dispose of or replace costly physical materials. All around, it seemed pretty efficient.

Surprise game developer/multimedia artist deadmau5

(Image: deadmau5 at Unreal Fest)

When I first saw Joel “deadmau5” Zimmerman on the Unreal Fest schedule, I really only thought I’d go to his panel because he’s Canadian. I haven’t really followed his career in the past few years, so I didn’t know he actually sold his catalogue earlier this year. And until attending the panel, I hadn’t realized just how multimedia the Niagara Falls native’s career has become.

First off, he recently released an Unreal-powered game called Meowingtons Simulator, which is basically a quirky ragdoll physics-based rhythm experience featuring cats that’s a tribute to his late feline friend. But he also talked about how he’s become more multidisciplinary through Unreal Engine — specifically with his Oberha5li Studios production company. Essentially, it’s all about creating a “metaverse” of virtual worlds, whether that’s in-game Fortnite content like concerts or events where players can interact with Zimmerman himself or even just colourful 4K visualizers on his YouTube channel. During the panel, he explained that there are challenges regarding syncing up his audio and visual creations, but Unreal has helped iron those out.

I had no idea he did this much beyond music itself, so it was pretty cool to see.

A galaxy far, far away

(Image: Smugglers Run. Image credit: Disney)

Unreal isn’t just used for Star Wars video games like Jedi: Fallen Order and Jedi: Survivor; it can even power related Disney theme park attractions.

During my time at Unreal Fest, I got to attend a special presentation from Disney Imagineer Asa Kalama about how his team used Unreal Engine to create the Smugglers Run experience at Disney’s Star Wars: Galaxy’s Edge theme park land. Kalama noted how the attraction, which has you and a crew controlling the Millennium Falcon, requires a level of interactivity that many “linear” rides don’t have. As a result, his team turned to Unreal. He also explained that the improved fidelity of Unreal Engine 5 is helping Disney introduce an even more “branching” Mandalorian-themed experience next year.

For more on all of this, check out our deep dive into how Unreal powers Smugglers Run with Kalama.

MetaHumans are almost frighteningly good

I really enjoy behind-the-scenes looks at the making of entertainment. It’s something I wish the gaming industry did more often, like we saw with Ubisoft Toronto showing its big performance-capture studio this year and last. In some ways, this is acting in its rawest form, since you’re not in an elaborately produced set — you’re wearing a goofy ball-covered unitard on a pretty sterile and empty stage.

I say all of that because it helps give context for why Unreal’s MetaHuman truly blew my mind. With the tool, you can create photorealistic 3D models in minutes. I got to try a demo for myself using only an Android phone (Snapdragon 8 Gen 1 or newer required) and a basic USB webcam and microphone. From there, I picked a Wolverine-esque avatar and got cracking. Amazingly, it was in sync with nearly every movement I made, from individual winks and cheek shaking to teeth gritting and specific cocking of my head. I say nearly because it’s still a beta and wasn’t quite as good at picking up my nose or tongue. Still, it was mind-boggling to do this with everyday hardware — no need for the marker-based facial-capture setups you’ve likely seen in those aforementioned looks at performance capture.

I also got to see a developer showcase an upper-body MetaHuman demonstration using a few mounted cameras (plus one attached to a headpiece he wore) and a little harness. Again, though, it was far less gear than the full-body suits you’d normally see, allowing him to prepare in about a minute. From there, we saw him walking around and making all kinds of exaggerated faces and hand gestures, with the avatar following suit about a second later. All the while, another developer could almost instantly swap between avatars of different races, genders and body sizes.

(Image: MetaHuman demo at Unreal Fest)

As the “performing” developer told us, this sort of tech is especially useful for pre-visualization. Instead of spending a while creating all these different mock-ups and trying to narrow down the exact look of a character, only to scrap it after seeing it in action, an artist can, in theory, get an earlier, closer glimpse at all of this. And beyond that, it’s just great to see this kind of push towards democratizing these development tools.

While this can obviously be used well in games, the applications aren’t limited to that medium, as the 3D characters can also be featured in film and television projects.

Still unsure about: AI and Persona Device

While there was much that impressed me out of Unreal Fest, I do have some apprehensions related to AI and gaming. For one, Epic CEO Tim Sweeney’s comments to IGN at the show about AI being used to create a game on the scale of Breath of the Wild were horrifying. (The whole reason that game felt so groundbreaking was because it was so purposefully and meticulously hand-crafted.) And taking that AI conversation further, Epic made waves recently for using the tech to power a Fortnite Darth Vader NPC who can talk to players.

Let me first stress that I’ve long admired what Epic has done with Fortnite. It strikes that fine balance of being an approachable, reasonably monetized free-to-play game that is also jam-packed with exciting content, especially from its mind-bogglingly vast array of pop-culture crossovers. And the recent big Star Wars update, which adds the likes of Darth Jar Jar and Palpatine, is undeniably neat. But I’ll be honest: that accompanying AI-powered Darth Vader NPC has really left a sour taste in my mouth.

(Image: AI Darth Vader in Fortnite)

While Epic and Disney have noted that the late, great James Earl Jones gave permission in 2022 for his voice to be replicated with AI, I question whether a man in his 90s (as he was at the time) would truly understand the implications of this. Did he even really know what AI is? Hell, I know a lot of far younger people who don’t fully understand AI. So that’s always nagged at me a bit, especially when we’ve seen plenty of aspiring human actors post their takes on Vader. An AI taking away the potential jobs of worthy successors to Jones kind of sucks.

But all of that aside, we’ve already seen the risks of using this Vader AI, like instances of the Sith Lord dropping the “F” word, leading Epic to issue a patch. It’s easy to imagine a situation where this kind of AI accidentally uses hate speech or other inappropriate language, especially if a sneaky bad actor manipulates it into doing so. We’ve seen how easy it is to create offensive imagery using generative AI tools, for instance.

I bring all of this up because Unreal Fest confirmed that this tool, dubbed Persona Device, is coming to the user-generated Unreal Editor for Fortnite (UEFN) platform. This means that players will be able to create their own AI-powered NPCs like Fortnite’s Vader. On stage, Epic showcased one example of this, a little experience featuring an AI called Mr. Buttons, and I got to try it out on the show floor. I’m impressed by the technology, though apprehensive as well given all of the context for Vader and AI in general.

(Image: The Mr. Buttons demo at Unreal Fest)

First, the demo is quite simple. There is a red button in the centre of the room, and you can use a mic to talk with Mr. Buttons about whether you should press it. Leveraging Google’s Gemini LLM, Mr. Buttons will have all kinds of responses depending on what you say. For instance, I was impressed that my random questions (like “does pineapple belong on pizza?”) got him increasingly frustrated about me avoiding pressing the button. Eventually, he said I should press the button because it might lead to a world full of pizza. All told, it felt like a riff on Portal with the English-voiced AI making increasingly amusing comments and trying to bring about chaos, and honestly, it was much more natural than other human-AI game interactions I’ve seen.

That said, I’m not thrilled about AI-generated voices. Human actors bring so much life to a role, and I’m always wary of anything that seeks to replace that. While I’m sure there could be some interesting emergent gameplay opportunities from an ever-reactive AI, I question whether that’s “worth” the cost of losing that human touch. I’d much rather see AI continue to improve to bolster NPC behaviour in, say, a combat encounter, than as a means of replacing human-crafted narratives and performances.


Overall, though, Unreal Fest was a great experience. It made me better appreciate the tech that powers this art form, even as someone who doesn’t understand a whole lot of its inner complexities. I am certainly skeptical of some of the ways AI continues to be pushed, but otherwise, I really liked what I saw.

MobileSyrup may earn a commission from purchases made via our links, which helps fund the journalism we provide free on our website. These links do not influence our editorial content. Support us here.
