
Surgical VR Development Software – Unity Vs. Unreal

Ghost Medical Animation + VR Surgery
April 10, 2022

Which game engine is better for making VR surgery simulations? We’re comparing Unity and Unreal to determine the ideal program for making the best-looking surgical VR training applications.

There are a lot of different types of software developers. Depending on the goals of the application, you’ll put together a team of specialists with the right combination of skills to get the job done. There are Backend Developers who build the parts of a program the user can’t see and Frontend Developers who focus on the user interface. Operating System Developers provide the Windows, macOS, iOS, and Linux software that runs all the applications built by App Developers, giving the operating system a reason to exist in the first place. There are even Language Developers whose job is making the programs that all the other developers use to create their programs.

When it comes to developing medical VR applications, the teams are largely made up of what are known as Game Developers. It might seem strange that something as serious as simulating a life-saving surgical procedure should be left in the hands of someone who could probably earn a lot more money making Candy Crush more addictive, but there are many reasons that those who make games are well suited to VR surgery simulation.

First, let’s start with virtual reality applications and the criteria needed to create highly functional VR surgery simulations. VR surgery is an extremely visual medium, and it requires lightning-fast interactive responsiveness. The best tools for developing visually immersive experiences with a high degree of interactivity are the very same game engines that drive extremely popular first-person shooters. It simply makes more sense to tap into the rich graphics capabilities of a game engine than to try to engineer your own. So yes, all that lushly rendered imagery seen in murder simulators like Grand Theft Auto, Halo, or Fortnite can be used to make equally compelling physician-training VR surgery simulations.

There are several powerful game engines to choose from. The two most widely used, and arguably the most capable for creating VR content, are Unreal and Unity. Both give developers all the tools they need to make highly functional VR surgery simulations that run remarkably well in virtual reality headsets. Yet Unreal and Unity function very differently. Each engine has such a wide range of exceptional strengths and glaring weaknesses that the longstanding debate over which is superior is unlikely to end soon.

These two programs are designed to make the same type of interactive application but go about it so differently that, to a trained eye, it is often easy to spot which engine was used after just a few minutes of playing a game.

Can you tell the difference between Unity and Unreal? Tap on the screenshots below to find out.

Cities: Skylines

Robo Recall

Rust

Escape from Tarkov

Fortnite

Praey for the Gods

Gears of War 5

Sea of Thieves

Subnautica

How do you spot an app made with Unreal?

When it comes to render quality, Unreal is the clear winner. Everything Unreal makes is beautiful without requiring developers to perform technological backflips to get it looking that way. The way it handles lighting and shading is often so striking that, well, it’s unreal. Unreal is the engine behind some of the most beautiful games you’ve ever played. Currently, though, that beauty comes at a price.

If you can’t tell simply by looking at a game’s image quality, you probably won’t be so uncertain after staring at a load screen for 10 minutes when you launch a new level in XCOM on your Nintendo Switch or Final Fantasy on even the most souped-up, liquid-cooled gaming PC. Beyond long load times, Unreal’s end-user hardware requirements demand GPU supercomputing power to hit the minimum 72 frames per second and keep latency under the 50 ms ceiling required for VR applications. Failure to sustain consistently high frame rates and low latency can cause cybersickness.
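
To make that budget concrete, here is a minimal, hypothetical Unity C# sketch (the FrameBudgetMonitor class and its log message are our own illustration, not part of any headset SDK) that flags any frame exceeding the roughly 13.9 ms budget implied by 72 fps:

```csharp
using UnityEngine;

// Hypothetical helper: warns whenever rendering falls behind the
// 72 fps frame budget that standalone VR headsets expect.
public class FrameBudgetMonitor : MonoBehaviour
{
    const float TargetFps = 72f;              // typical standalone-headset refresh rate
    const float BudgetMs = 1000f / TargetFps; // roughly 13.9 ms per frame

    void Update()
    {
        // unscaledDeltaTime reports the real time the previous frame took.
        float frameMs = Time.unscaledDeltaTime * 1000f;
        if (frameMs > BudgetMs)
        {
            Debug.LogWarning($"Frame took {frameMs:F1} ms (budget {BudgetMs:F1} ms); risk of dropped frames and cybersickness.");
        }
    }
}
```

Dropping a component like this onto any object in a test scene gives a quick read on whether a surgical scene is staying inside the frame budget before it ever ships to a headset.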

The most affordable and easiest-to-use VR headset at the time of this writing is the Quest 2 from the blue pretzel company formerly known as Oculus. The Quest is a self-contained unit that can be found at any Best Buy for $300. Its light weight makes it easy to ship overnight anywhere in the world, and its ease of use and portability make live VR surgery training sessions a snap even from thousands of miles away. The benefits of using a headset like the Quest make up for the fact that the graphics processor tucked inside it is really quite weak.

The Snapdragon processor powering the Quest 2 is responsible for delivering real-time graphics to two high-resolution displays as well as processing the user’s hand and head tracking. Unreal, as beautiful as the engine is, is simply too much for the meager capabilities of the Quest 2’s graphics processor. This makes choosing Unreal for a surgical simulation a very risky proposition. When a scene requires 20 medical instruments on top of multiple layers of organic tissue, the processor will likely reach the limits of its capability. The result is a reduced frame rate and users becoming violently ill when it drops to 12 fps just as the tip of the scalpel contacts the patient’s beautifully rendered, photorealistic skin.

Unity, on the other hand, offers developers the ability to create interactive applications that use less storage space and perform very well on a much wider range of user hardware. This is an important consideration given the current unprecedented tech drought, which makes it all but impossible to buy a high-end graphics card without paying a king’s ransom. In the case of virtual reality in particular, the mobile processors that make fully immersive surgical simulation possible on affordable standalone VR headsets force developers to choose their engine very carefully. This is likely the largest reason that the vast majority of VR titles are made with Unity.

While a VR application made with Unity may run very well on even the least powerful hardware, there’s more to an app than performance. In medicine in particular, the instruments and anatomy need to look as real in the headset as possible. To deliver a simulation physicians can actually use to help them learn to diagnose and treat patients, it must match what they will see in the real world. Unfortunately, this is where Unity and Unreal couldn’t be more different.

How do you spot an app made with Unity?

Everything Unity makes is ugly by default. Every 3D object imported into the development software must be painstakingly noodled and tweaked, and 3D modelers must find the balance that lets it render in real time without looking like an ’80s music video. Materials, especially the metallic or shiny ones needed for most surgical instruments in the operating room, suffer from jagged edges caused by a rendering issue known as aliasing. While fans of Unity argue that the engine has various tools developers can activate to mitigate issues like aliasing, they rarely admit that the built-in antialiasing features cease to work on the graphics processors used in mobile VR devices.
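
For illustration, here is a hypothetical Unity C# sketch of the kind of mitigation those fans point to (the AliasingMitigation class name is ours; the MSAA and FXAA settings are standard URP options), with the caveat from the paragraph above that on the mobile GPUs inside standalone headsets the improvement is limited:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Hypothetical mitigation: raise hardware MSAA on the active URP asset and
// enable cheap post-process FXAA on the main camera to soften jagged edges
// on shiny instrument models.
public class AliasingMitigation : MonoBehaviour
{
    void Start()
    {
        // Bump multisample antialiasing to 4x on the active Universal Render Pipeline asset.
        if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urpAsset)
        {
            urpAsset.msaaSampleCount = 4;
        }

        // Add fast approximate antialiasing on the main camera as a second line of defense.
        var cameraData = Camera.main != null
            ? Camera.main.GetComponent<UniversalAdditionalCameraData>()
            : null;
        if (cameraData != null)
        {
            cameraData.antialiasing = AntialiasingMode.FastApproximateAntialiasing;
        }
    }
}
```

Even with both options enabled, the reflective edges of a retractor or scalpel rendered on a Quest-class GPU still fall well short of what Unreal produces by default.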

Certainly, not all games made with Unity are ugly. In fact, there are a lot of very decent, even beautiful simulations made with Unity that anyone can experience on a Quest VR headset. Unfortunately, talking Unity out of its desire to make everything ugly is exhausting for even the best shading artists. Sure, anyone can easily make a VR application that runs well on any headset if the graphics needed don’t exceed those of Minecraft. But no matter how much time, effort, money, and talent you pour into making Unity applications look good, they’ll never be the masterpieces Unreal can produce without breaking a sweat.

Hopefully, the issue of unattainable beauty in surgical VR simulations won’t remain an issue for long. Rumors about Apple’s upcoming VR and AR hardware specs are supported by equally impressive patent filings. The reported plan to use not one but two Apple M1 silicon processors to power twin displays at 3,000 ppi is significant. On spec alone, the Apple headset would provide image clarity 7.5x sharper than the Quest 2. That, paired with the graphics processing power of Apple silicon, means developers could finally escape the limitations of the measly processors in today’s VR headsets. One only needs to remember cell phones prior to the iPhone to appreciate Apple’s ability to release a product that changes everything.

Unity’s render pipeline will improve, and Unreal will become more efficient. Both engines are serious about providing developers with powerful tools in the VR/AR/XR landscape. Evidence of Unity’s desire to shed its reputation for inferior rendering quality can be seen in its recent tech demo, Enemies, which showed the development community just how far the engine’s rendering has come.

Certainly, the demo revealed that Unity is aware of its inferior render quality and pulled out all the stops in an attempt to change the narrative. Unfortunately, the real test of an engine’s capabilities is revealed not in tech demos but in the results produced by developers facing real-world technical and budget limitations. If you want to know what to expect from an engine’s baseline functions, compare portfolios from student developers, not AAA development teams capable of heroic feats that are unrealistic to achieve without extensive modification efforts.

Pioneers in these early days of medical VR development are forced to deal with current hardware and software limitations. To mitigate Unity’s render shortcomings, it is possible for virtual surgery developers to take on the daunting task of scripting their own render pipeline and shading algorithms. It’s not an easy undertaking to alter a game engine’s base functions in a way that delivers low-latency performance and increased visual quality. Expert shading artists working in the Wraith-VR department at Ghost Medical spent three years experimenting with the Unity engine, developing better methods to manipulate virtual rays of light, until they had created a significantly better real-time VR rendering solution.

The Wraith-VR team at Ghost Medical created WraithLux, a custom-scripted shader that greatly improves upon Unity’s built-in render pipeline. Notice the reduction of jagged artifacts on the edges of the metallic material.
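
WraithLux itself is proprietary, but for readers curious what “scripting your own render pipeline” involves, the bare-bones Unity C# skeleton below shows the standard entry points any custom pipeline has to override. The class names and the “CustomLit” shader pass are placeholders of ours, not Ghost Medical’s actual implementation:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical asset that tells Unity to use the custom pipeline below.
[CreateAssetMenu(menuName = "Rendering/Minimal Pipeline Asset")]
public class MinimalPipelineAsset : RenderPipelineAsset
{
    protected override RenderPipeline CreatePipeline() => new MinimalPipeline();
}

// Bare-bones pipeline: everything that makes a solution like WraithLux special
// would live in the shading passes a real implementation schedules here.
public class MinimalPipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (var camera in cameras)
        {
            // Push view and projection matrices plus per-camera shader globals.
            context.SetupCameraProperties(camera);

            // Cull everything outside the camera frustum.
            if (!camera.TryGetCullingParameters(out var cullingParams))
                continue;
            var cullResults = context.Cull(ref cullingParams);

            // Clear the render target before drawing.
            var cmd = new CommandBuffer { name = "Clear" };
            cmd.ClearRenderTarget(true, true, Color.black);
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();

            // Draw opaque geometry with a custom lighting pass ("CustomLit" is a placeholder tag).
            var sortingSettings = new SortingSettings(camera) { criteria = SortingCriteria.CommonOpaque };
            var drawingSettings = new DrawingSettings(new ShaderTagId("CustomLit"), sortingSettings);
            var filteringSettings = new FilteringSettings(RenderQueueRange.opaque);
            context.DrawRenderers(cullResults, ref drawingSettings, ref filteringSettings);

            // Draw the skybox and submit the frame to the GPU.
            context.DrawSkybox(camera);
            context.Submit();
        }
    }
}
```

Everything interesting (the shading model, the antialiasing strategy, the latency optimizations) lives in the shaders and passes a real pipeline schedules inside that loop, which is exactly where years of experimentation can go.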

Come back to the Ghost Medical blog often to see updates as we continue investigating tools for making better surgical VR simulations. Also check out Bryan Wirtz’s in-depth comparison of Unity and Unreal in this comprehensive review he wrote for gamedesigning.org – https://www.gamedesigning.org/engines/unity-vs-unreal/


