If you're not up to speed yet, here's a clip from the keynote, and the video above provides an in-depth look at how it works.
NVIDIA RTX Remix is a free modding platform built on NVIDIA Omniverse that vastly expands the modding capabilities of compatible DX8 and DX9 games. With Remix, modders can use a combination of AI texture tools and their own artwork to replace textures, objects and materials with high-quality versions. It also allows modders to inject modern technologies like ray tracing, DLSS 3 and NVIDIA Reflex into the game.
Perhaps the biggest and most exciting thing about RTX Remix is that it effectively enables modding for titles that were either harshly limited or almost impossible to mod beforehand!
So how does it work? Modders start the game with RTX Remix running alongside it and, at the press of a button, all of the render data (textures, geometry and lighting) is captured and converted to the Universal Scene Description (USD) format, which is then loaded into the application. From there, modders can replace assets, update textures and tweak the lighting to their heart's content. It also features AI-assisted upscaling tools to supplement the modder's workflow. Once they're ready to share what they have created, all the changes can be bundled into an RTX Mod.
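To make the USD side of that workflow a little more concrete, here is a rough sketch using Pixar's USD Python API (the pxr module). The file names and prim paths are invented for illustration, and the actual layout of a Remix capture and mod may well differ:

```python
# A rough sketch of USD-based asset replacement, using Pixar's USD Python API
# (pip install usd-core). File names and prim paths are hypothetical; the real
# layout of an RTX Remix capture may differ.
from pxr import Usd, Sdf

# Open the capture written out by the runtime, just to inspect what it recorded.
capture = Usd.Stage.Open("capture_scene.usda")
for prim in capture.Traverse():
    print(prim.GetPath(), prim.GetTypeName())

# Author the replacement in a separate layer so the capture itself stays untouched.
mod_layer = Sdf.Layer.CreateNew("my_rtx_mod.usda")
mod_stage = Usd.Stage.Open(mod_layer)

# Override a captured prop and point it at a remastered, higher-quality asset.
crate = mod_stage.OverridePrim("/World/props/crate_01")
crate.GetReferences().AddReference("assets/crate_01_remastered.usd")

mod_layer.Save()
```

The point of the sketch is simply that replacements live in their own layer on top of the capture, which is what allows the original captured scene to stay intact while assets are swapped out.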
For players, you simply install the RTX Mod alongside your game executable and, when you boot up the game, the changes made in Remix are applied automatically. That's not all though: RTX Mods will also include an in-game configuration menu that allows the player to tweak lighting, reflectivity and other values on the fly to achieve their desired effect.
RTX Remix requires modders to use a ray-tracing-capable NVIDIA GPU (RTX 20 series or above) to create content, but players can install and play RTX Mods on any GPU that supports Vulkan Ray Tracing.
You can see a great example of what's possible with RTX Remix by looking at the upcoming Portal with RTX trailer or the Morrowind RTX Mod showcased during the keynote. There's no confirmed release date yet, but we're optimistic that you'll be able to get your hands on RTX Remix sometime in early 2023.
NVIDIA is excited to see how modders will take advantage of RTX Remix and has joined us in this post to answer your questions from the comments. Before we get to that though, here's a word from Nyle.
To ask a question, simply leave a comment on this article - please take a minute to scan the existing comments so that we avoid too many repeated questions. Nyle will answer as many as he can, but please keep in mind the topics he can and can't discuss!
Don't worry about digging through the comments for answers though, I'll be adding a copy of each question and answer to the spoiler in the pinned comment below.
Until next time,
Nyle Usmani
NVIDIA Product Manager
1. This is more of an expansion of the previous question posed by dariort64, specifically the part about LODs. Would it be possible to completely subvert/ignore the base game's LOD system and simply set up our own LOD hierarchy through RTX Remix? This way the artist could set their own performance target, with mesh-specific LOD distances tailored to their custom assets.
2. This question ties into the first question a little. Would it be viable to give artists the ability to create “proxy” meshes for the path tracer to use in reflections, or to manually target a specific LOD to be used in reflections (or maybe just use the original game's geometry for reflections)?
3. Is it possible to override a mesh's culling to prevent it from ever being culled, or to set custom parameters that would dictate when an object will be culled, in order to facilitate more efficient and/or better reflections in the path tracer? This would allow artists to decide what geometry stays rendered in the scene, regardless of what the base game intends.
4. I know that you guys will be validating games on your end, but will end-users be able to attempt to use RTX Remix with any DX8/9 game they choose? I know similar questions have already been asked, but I couldn't find any definitive answers on this.
5. This question is more of a stretch, but have you guys ever looked into a “virtualized geometry system” similar to UE5's Nanite solution? It would completely eliminate the need for LODs (and allow absurd triangle densities), but I am well aware that that is in no way a trivial task.
What Epic managed to achieve with Nanite is borderline voodoo magic.
So, in short, my question is: would it also work on games that use a lower version of DirectX, say DirectX 6.0 and above?
We look forward to seeing your future experiments with RTX Remix! On to your questions:
Without commenting on Unreal Tournament 2004 specifically, here’s how it should work.
As long as players have the RTX Mod with remastered assets, they will see the maps with updated visuals. The server is uninvolved in the rendering process and is not relevant to whether users will see updated visuals–it’s all about the RTX Mod users have locally stored on their PC. Because the RTX Remix runtime is only influencing the visuals, players should in theory be able to play against each other even if some players are seeing the vanilla game while others are using RTX Mods.
To correct a misconception, you may not have to remaster every custom map. In old games, many custom maps reuse the same textures and assets. RTX Remix should replace all of those textures and assets if the mod has replacements available, even if the level geometry is being seen for the first time.
As for dynamically downloading RTX Mods in game for players who do not have those assets already–that type of functionality would have to be modded into the original game by the modder so that remastered assets are downloaded and put into the RTX Remix Runtime's asset folder. RTX Remix wouldn’t be involved in architecting such a system.
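If a modder did want to build that kind of delivery system themselves, it would boil down to something like the following. This is purely illustrative: the folder layout and URL are hypothetical, and nothing here is part of RTX Remix itself.

```python
# Purely illustrative: the sort of downloader a modder would have to build into
# the game themselves. The folder layout and URL are hypothetical and nothing
# here is an RTX Remix API.
import urllib.request
from pathlib import Path

def fetch_remastered_asset(url: str, mods_dir: str = "rtx-remix/mods/serverpack") -> Path:
    """Download a remastered asset advertised by a server into the runtime's mod folder."""
    target = Path(mods_dir) / Path(url).name
    target.parent.mkdir(parents=True, exist_ok=True)
    urllib.request.urlretrieve(url, target)
    return target

# e.g. fetch_remastered_asset("https://example.com/assets/custom_map_textures.usd")
```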
Another thing to note: anti-cheat software may prevent RTX Remix from working in certain competitive multiplayer titles.
The renderer would fall back to showing classic assets wherever it has no replacement assets to point to.
Will it be possible to make custom implementations of the shading of surfaces (e.g. shader programming) to be able to recreate specific materials more easily?
How will the USD replacement functionality work with skinned assets? Or will it only support static mesh replacement? Will assets that use vertex blending be exported with their matching bones and weights correctly? And what about games that use morphers for facial animations?
What kind of scripting functionality will be available to control what lighting or other scene settings get used in a game? Would this provide a way to modify the scene in real time to match certain systems of the game (like a game with a dynamic day/night system)?
Will there be any systems in place to deal with replacing assets in games that are capable of streaming and swapping different level of detail meshes?
Very detailed questions!
This topic has come up a lot and we are deep in research, but there’s nothing to announce today.
It depends on the implementation used by the game. For CPU-based skinning, we get the post-skinned data in Remix, and while it renders correctly, it's a bit more challenging to replace these assets. It's something we would like to look at closer in the future.
For GPU-based (fixed-function) skinning, we actually have access to the pre-skinned mesh and the skeleton (including the collision volumes for ray tracing), which gives us a great deal of flexibility to modify and reskin characters in RTX Remix.
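For anyone unfamiliar with the terms, the standard technique in play here is linear blend skinning: each vertex is deformed by a weighted blend of bone transforms. The toy numpy sketch below is not Remix code; it just illustrates why getting the rest-pose mesh, skeleton and weights (the GPU, fixed-function case) is far more useful for replacements than only seeing the already-deformed output (the CPU case):

```python
# Toy linear blend skinning in numpy; not RTX Remix code, just an illustration
# of the difference between pre-skinned and post-skinned data.
import numpy as np

def skin_vertices(rest_verts, bone_matrices, weights):
    """Deform rest-pose vertices by a weighted blend of bone transforms.

    rest_verts:    (N, 3) vertex positions in the rest pose (the pre-skinned mesh)
    bone_matrices: (B, 4, 4) per-bone transforms for the current animation frame
    weights:       (N, B) per-vertex bone weights, each row summing to 1
    """
    # Homogeneous coordinates so the 4x4 matrices can translate as well as rotate.
    homo = np.hstack([rest_verts, np.ones((rest_verts.shape[0], 1))])   # (N, 4)

    # Transform every vertex by every bone, then blend the results by the weights.
    per_bone = np.einsum("bij,nj->nbi", bone_matrices, homo)            # (N, B, 4)
    skinned = np.einsum("nb,nbi->ni", weights, per_bone)                # (N, 4)
    return skinned[:, :3]

# With CPU skinning, the renderer only ever sees the output of this function
# (already-deformed vertices), so swapping in a new mesh is harder. With
# fixed-function GPU skinning, the rest-pose mesh, bone matrices and weights are
# all visible separately, so a replacement mesh can simply reuse the same skeleton.
```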
We saw some (incorrect) speculation that we did not include NPCs in our Morrowind RTX showcase because we weren't able to properly mod them. But Morrowind actually passes us pre-skinned meshes and the skeleton, so modifying the NPCs was well within our capabilities. In order to give the artists time to do an incredible job, we narrowed the scope of the project to an indoor environment without NPCs.
Variable level of detail during streaming is an interesting case. Here's how it should work: you would need to take multiple captures at each level of detail, and RTX Remix would regard each asset at each level of detail as a unique instance. For example, imagine an asset has three levels of detail depending on the distance of the player–at 5 meters it’s high quality, at 15 meters it’s middle quality and at 25 meters it’s low quality. That object would actually be regarded as three separate objects depending on the distance of the player.
The modder then has two options: recreate all three assets at different levels of detail to preserve a feel similar to the original game, or map a single high-quality asset to all three instances, which would help make the asset feel more at home in a modern graphical context.
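Expressed as plain data, those two options might look something like this. The capture identifiers and file names are invented for illustration; this is not Remix's actual project format.

```python
# Conceptual sketch of the two LOD strategies described above. The capture
# identifiers and file names are invented; this is not the actual Remix format.

# Option A: replace each captured LOD instance with a matching-detail remaster,
# preserving the original game's performance scaling.
replacements_per_lod = {
    "barrel_lod0_capture": "assets/barrel_high.usd",    # seen at ~5 m
    "barrel_lod1_capture": "assets/barrel_medium.usd",  # seen at ~15 m
    "barrel_lod2_capture": "assets/barrel_low.usd",     # seen at ~25 m
}

# Option B: map one high-quality asset onto all three captured instances,
# trading some performance for consistently modern-looking geometry.
replacements_single = {
    lod: "assets/barrel_high.usd"
    for lod in ("barrel_lod0_capture", "barrel_lod1_capture", "barrel_lod2_capture")
}
```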
Hope that helps Dario. Can't wait to see what you can do with RTX Remix!
Notably for Nexus, New Vegas and Skyrim.
What I'm really, really hoping is that it will only take some extra work to make games like that fully RTX-ready, perhaps missing out on some of the automated steps that Remix would otherwise handle, but I'm afraid that it will simply not work.
Additionally, I saw in the previews that they were always shown off indoors, and with no NPCs or other dynamic objects. It can obviously handle at least some dynamic stuff given Portal with RTX's cubes, pellets, turrets, and things, but with Morrowind, there were no NPCs, and the outdoors was never shown outside of a vanilla shot for comparison.
How does Remix handle games which have a dynamic day/night cycle, and which load/unload objects dynamically in a single area (Such as exploring the worldspace in Morrowind and travelling to different cells)? How is it able to differentiate between indoors/outdoors? Can you specify an object or vector as relating to the sun/moon, and have it adjust RTX parameters dynamically to match?
Another thing is how it would deal with games that use culling. Will objects behind you stop casting their shadows because the game engine is no longer rendering them? Can it catch the whole scene in one go with that regardless, or would it require multiple 'snapshots'?
I love technical questions like this.
I want to preface some of these specific questions by saying lots of aspects of RTX Remix are still in motion, and many games behave in unique ways depending on a host of variables including their lighting model. Not every title has been tested.
For dynamically loaded objects, as long as the capture properly includes each object that is dynamically loaded in and out of an area, the replacements should also dynamically load.
As far as day/night cycles go, that’s a trickier question. Assuming the celestial bodies are coded as directional lights, the ray traced conversion should retain that feeling of dynamic day and night cycles with more realistic shadowing and skies. From our outdoor tests in Mount and Blade: Warband, we’ve seen some stunning sunsets and sky lighting with realistically rendered light rays peeking through clouds–it’s entrancing to see that kind of thing in games from the 2000s!
If it's a lit skybox, it should also work. But developers in those days took a lot of shortcuts to simulate a lit sky, and there might be certain approaches to classic lighting that we haven't seen in our testing, and therefore haven't yet addressed.
There shouldn’t be a need to differentiate between outdoors/indoors in a ray traced lighting model since lights begin to behave realistically. An indoor environment would simply possess more meshes and geometry that lights bounce off of, or that occlude lights to cast more shadows. Both outdoor and indoor environments would have to be relit with the context of realistic lighting, which is something we show in our RTX Remix overview video: https://youtu.be/Vg52-HZhrFc?t=163.
On adjusting RTX parameters dynamically, it might help if you can provide us with an example of a game that we can research that would require something like that, to help illustrate the use case (I've sent you a DM). As for RTX Remix’s options with respect to celestial bodies, classic games can specify a directional light (vector + light parameters) and RTX Remix will ray trace it accurately. Or the original game could have a big mesh up in the sky and we can put an emissive texture on it, or even attach a big sphere light to it. There are a lot of options as it pertains to lighting outdoor environments.
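As a concrete illustration of those two approaches to a sun, here is a minimal USD sketch using Pixar's Python API. The prim paths and light values are invented, and this is not taken from an actual RTX Remix capture.

```python
# A minimal sketch of the two "sun" options described above, written with
# Pixar's USD Python API. Prim paths and values are illustrative only and are
# not taken from an actual RTX Remix capture.
from pxr import Usd, UsdLux, UsdGeom, Gf

stage = Usd.Stage.CreateNew("sun_example.usda")

# Option 1: a directional ("distant") light, matching the classic game's
# sun vector plus its light parameters.
sun = UsdLux.DistantLight.Define(stage, "/World/Sun")
sun.CreateIntensityAttr(5000.0)
sun.CreateAngleAttr(0.53)  # apparent angular size of the sun, in degrees
UsdGeom.XformCommonAPI(sun.GetPrim()).SetRotate(Gf.Vec3f(-45.0, 30.0, 0.0))

# Option 2: a big mesh up in the sky that the path tracer treats as a light,
# either by binding an emissive material to it or by attaching a sphere light.
sky = UsdGeom.Sphere.Define(stage, "/World/SkySphere")
sky.CreateRadiusAttr(10000.0)

stage.GetRootLayer().Save()
```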
Culling is another case where there might be different interactions with RTX Remix depending on the game.
As you've noted, if culling causes objects to cease to exist in the original game, then the RTX Renderer would also cease rendering them. Portal has built-in settings that let us disable a lot of the more aggressive culling, which lets us avoid most of those issues. It still culls lights very aggressively, so we had to implement persistent static lights that remain even after the game stops drawing them. There are other workarounds of course–for example, we can simply attach new static lights to level geometry and disable the original lights to retain the original game's feel without breaking the immersion of full ray tracing.
If other games don't allow customizing the culling settings, and the issue can't easily be addressed by traditional modding, then we may need to look into implementing a different system. For example, some kind of caching to improve object permanence. There are many paths to a solution and we will be paying attention to see if this becomes an issue for classic games.
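To give a sense of the kind of caching being described, here is a sketch of one possible "object permanence" scheme. It is an illustration of the idea only, not how the RTX Remix runtime is actually implemented.

```python
# A sketch of "object permanence" caching for culled geometry. This illustrates
# the idea only; it is not how the RTX Remix runtime is implemented.

class GeometryCache:
    def __init__(self, grace_frames=60):
        self.grace_frames = grace_frames
        self.last_seen = {}   # object_id -> frame index when last submitted
        self.geometry = {}    # object_id -> captured geometry payload

    def on_draw_call(self, object_id, geometry, frame):
        # The game is still drawing this object; refresh it in the cache.
        self.geometry[object_id] = geometry
        self.last_seen[object_id] = frame

    def scene_for_frame(self, frame):
        # Keep recently-culled objects so shadows and reflections behind the
        # camera don't pop, then evict anything that has gone stale.
        stale = [oid for oid, seen in self.last_seen.items()
                 if frame - seen > self.grace_frames]
        for oid in stale:
            del self.last_seen[oid]
            del self.geometry[oid]
        return list(self.geometry.values())
```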
As it is currently implemented, a snapshot will capture a single frame of content, which includes everything rendered, whether it is in view or out of view of the player, and every object will be cataloged (alongside the geometry, textures, lighting and cameras). Some games that render the entire level might only require a single capture, while other games that have aggressive culling might require more. RTX Remix will recognize duplicate instances of the same asset throughout a game, so a modder may already have certain assets captured from a prior room when they begin a capture in a new room/area. Across a full game you will likely need to take multiple captures, although a single area might only need one.
We did not get to every one of your questions, but hopefully this demystifies things quite a bit. I should point out that another poster asked about skinning and NPCs, so definitely take a glance at the summary that Nexus Mods will post to see how RTX Remix handles that.
Btw, Killer7 was one of my favorite games on the Gamecube (always hoped the Smiths would make it into Super Smash Bros).
RTX Remix really started as a way to make full ray tracing (i.e. path tracing) accessible for modders, so that faking reflections or light could be a thing of the past. Big publishers already design their games with ray tracing in mind–we think the modding community should have that same power at their disposal.
As such, in place of non-RT global illumination, volumetric lighting and screen space reflections, RTX Remix offers highly customizable, light-accurate global illumination, volumetrics, and on/off-screen reflections.
There are some non-RT post effects that let modders tweak the experience, like bloom, tone mapping and motion blur, but the focus is to empower modders with higher-fidelity lighting.
How are skinned/animated objects tracked? Are they replaceable?
Currently there is no such API but if there is interest for something like this, we can look to see if there’s anything we can do.
We've answered a similar question about skinning in this comment section. Please take a look at that answer for more information :).