The Dark Mod Forums

Convert normal maps to RGTC?


duzenko

Recommended Posts

15 hours ago, nbohr1more said:

Can you try the latest SVN? Orbweaver committed a fixed BC5 loader

I can now confirm that on my Intel UHD machine, the BC5 normal map shows up in-game without artefacts. And the "bindless handle" console error no longer shows up. 👍

My cvars are r_useBindlessTextures 1 and image_useNormalCompression 1.


I wrote a simple script to convert all *_local.tga textures to DDS/BC5 using compressonator.
From very brief testing, the game loads them and they look normal.

So I wonder about this point:

Quote
On 11/17/2019 at 4:35 AM, nbohr1more said:

2) Pre-compressed normal maps cannot use image program functions like addnormals and heightmap, so they were seen as potential stumbling blocks for material re-use workflows, which were the main source of asset variation early in the mod

The easiest solution is to run image programs using the DDS image as source. All these compressed texture formats are trivial to decompress.
The main question is about quality, I guess.
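To illustrate how trivial the decompression is, here is a sketch of decoding one 8-byte BC4 (RGTC single-channel) block; a BC5 normal-map block is just two of these, one for X and one for Y. The function name is mine, not engine code, and this follows the standard RGTC ramp rules rather than any particular loader.

```cpp
#include <cstdint>

// Decode one 8-byte BC4 (RGTC single-channel) block into 16 texel values.
// A BC5 normal-map block is simply two of these: one for X, one for Y.
void decode_bc4_block(const uint8_t block[8], uint8_t out[16]) {
    const int r0 = block[0], r1 = block[1];
    int ramp[8] = { r0, r1 };
    if (r0 > r1) {
        // 6 interpolated ramp points between the endpoints
        for (int i = 1; i <= 6; i++)
            ramp[i + 1] = ((7 - i) * r0 + i * r1) / 7;
    } else {
        // 4 interpolated points plus hardcoded 0 and 255
        for (int i = 1; i <= 4; i++)
            ramp[i + 1] = ((5 - i) * r0 + i * r1) / 5;
        ramp[6] = 0;
        ramp[7] = 255;
    }
    // Bytes 2..7 hold 16 3-bit texel indices, packed little-endian
    uint64_t bits = 0;
    for (int i = 7; i >= 2; i--)
        bits = (bits << 8) | block[i];
    for (int t = 0; t < 16; t++)
        out[t] = (uint8_t)ramp[(bits >> (3 * t)) & 7];
}
```

A decoder like this is all an image program would need to get the raw texel data back before running addnormals or heightmap on it.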


Yes, there are 3 options to resolve this:

1) Retain an uncompressed copy for the subset of materials that need image programs against DDS (the default Doom 3 workflow)

2) Decompress on image load if image programs are found in the material

3) Replace the image programs with GLSL shaders that accept DDS

I have no objections to option 2 as long as you can also force loading of the uncompressed image if desired.

I guess if someone implements a solution, they get the privilege of choosing. :)

Please visit TDM's IndieDB site and help promote the mod:

 

http://www.indiedb.com/mods/the-dark-mod

 

(Yeah, shameless promotion... but traffic is traffic folks...)


11 minutes ago, nbohr1more said:

I have no objections to option 2 as long as you can also force loading of the uncompressed image if desired.

I guess you can do this now "in theory", i.e. disable image_usePrecompressedTextures.


I'll have to test, but I believe the entire Doom 3 workflow is currently available, including the forceHighQuality material keyword, so option 1 may already be in place as long as we keep at least some of the uncompressed images around.

 

Glass and water normal maps would probably be the most important, but I guess there might be a few AI with heightmap materials that might need to be kept this way?


Decompressing and recompressing would be undesirable: it's a needless quality loss which will be permanently embedded in the mod resources once it's done and the uncompressed originals are thrown away.

I believe the proper way to do this would be to dump the compressed image after the engine has loaded the shader and done the necessary processing, i.e. using the equivalent of image_writePrecompressedTextures (which may or may not need updating to support RGTC normals). This way you're only getting the compression artifacts once, after the final image has been generated, rather than generational loss from two sequential compression steps.

Of course this only helps with core resources — if mappers are making their own shader stages that perform customised image operations on mod-distributed normal maps, there is no way to handle that except either continue to provide uncompressed source images or rely on decompression/recompression.


8 hours ago, duzenko said:

I'm not afraid of decompression quality. It's normal maps, so not going to be obvious

The common opinion is exactly the opposite: compression artefacts are way more visible in normal maps than in diffuse maps, mostly due to specular light.

8 hours ago, OrbWeaver said:

I believe the proper way to do this would be to dump the compressed image after the engine has loaded the shader and done the necessary processing, i.e. using the equivalent of image_writePrecompressedTextures (which may or may not need updating to support RGTC normals). This way you're only getting the compression artifacts once, after the final image has been generated, rather than generational loss from two sequential compression steps.

It won't be easy... but I guess it is possible to check for an image program and dump a TGA of the end result, then compress the TGA to DDS and replace the image program with the DDS image path.


By the way, will we simply delete the TGA normal maps after everything is done?


@stgatilov When you converted normals to BC5 did you generate mipmaps?

If so, may I kindly request that you commit these converted textures to SVN in batches?

:)

Did you get a chance to measure mission load times with BC5?


2 hours ago, duzenko said:

Option X: only bundle the tga's and generate .dds's on first load (which we already have a cvar and code for)

That seems to be a reasonable option.

It still doesn't fully answer how we should handle image program behavior.

This would ideally incorporate Orbweaver's idea of pre-processing the raw data with image programs prior to compression, and I guess we would need to generate new images for all permutations required by alternate material defs included in mission packages (which could be pretty bloated storage-wise).


2 hours ago, duzenko said:

Option X: only bundle the tga's and generate .dds's on first load (which we already have a cvar and code for)

That's an interesting option, although a bit complicated.
And if we do it for normal maps, better do it for ordinary textures too.

Basically, we can distribute TGA/PNG textures and write DDS textures straight to the player's machine on demand.

Of course, the TDM installation will increase in size... With all the normal maps, I have a dds directory of 2.5 GB. Now consider how the size will increase if we replace DDS with TGA for ordinary textures (provided we find most of them in SVN history). I'd say it will become a 5-6 GB download and 9 GB after install...


It's also complicated in the sense that, out of the box, it screws up at least some of the textures (transparency/format?) 🤪 e.g. the console font

Do you guys have any estimates of which formats compress best? I.e. DDS or TGA inside a pk4, or JPG?


How many missions are actually using custom shader-based processing of core mod normal maps?

Could those mission packages be updated in the same way as the core mod: generate output DDS files for all the included image programs, then include these in the mission pack?


I have an alternative proposal.

It seems that RGTC compression algorithms are trivial. I'll check the algorithm in compressonator. If it is really that simple, then I'll implement a SIMD-accelerated RGBA8 -> RGTC compressor on the CPU.

After that we can compress textures in memory during load like we do now, but 1) it will be done the same way regardless of driver, 2) it will be fast regardless of driver, 3) it can be moved from the main thread to the worker threads we already have for loading TGA, making it parallel to uploading data to OpenGL.

Such an approach seems to avoid the whole "compress everything" story and leaves assets as they are. It also allows using uncompressed normal maps (although large missions won't boot in that case). Also, it avoids locking into the RGTC compression format. Precompressed RGTC textures should also work fine if someone decides to use them.
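As a rough illustration of how simple a scalar RGTC encoder can be (before any SIMD work), here is a sketch that uses the block min/max as ramp endpoints and picks the nearest ramp point per texel. Names are mine; a production encoder like compressonator refines the endpoints further, as discussed below.

```cpp
#include <cstdint>
#include <cstdlib>

// Encode 16 texel values into one 8-byte BC4 (RGTC single-channel) block.
// Endpoints are simply min/max of the block; a real encoder refines them.
void encode_bc4_block(const uint8_t in[16], uint8_t block[8]) {
    uint8_t lo = 255, hi = 0;
    for (int t = 0; t < 16; t++) {
        if (in[t] < lo) lo = in[t];
        if (in[t] > hi) hi = in[t];
    }
    block[0] = hi;  // r0
    block[1] = lo;  // r1; r0 > r1 selects the 8-point interpolated ramp
    uint64_t bits = 0;
    if (hi > lo) {
        // Build the same ramp the hardware decoder will reconstruct
        int ramp[8] = { hi, lo };
        for (int i = 1; i <= 6; i++)
            ramp[i + 1] = ((7 - i) * hi + i * lo) / 7;
        // Nearest ramp point per texel, packed as 16 3-bit indices
        for (int t = 0; t < 16; t++) {
            int best = 0, bestErr = 1 << 30;
            for (int p = 0; p < 8; p++) {
                int err = abs(ramp[p] - in[t]);
                if (err < bestErr) { bestErr = err; best = p; }
            }
            bits |= (uint64_t)best << (3 * t);
        }
    }
    // else: flat block, all indices 0 -> every texel decodes to r0
    for (int i = 0; i < 6; i++)
        block[2 + i] = (uint8_t)(bits >> (8 * i));
}
```

The inner loops here are exactly the kind of independent per-texel work that maps well onto SSE/AVX lanes, which is presumably why a SIMD version looks feasible.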


That sounds great!

I guess the question is whether this compressor will also quickly create mipmaps, which were previously identified as the major pain point for texture loading. E.g., can mipmap generation also be SIMD accelerated?


Hmm... I guess the resize portion of mipmap generation should be SIMD friendly. Here is an example resize library that uses SIMD acceleration:

https://github.com/avaneev/avir


Resizing should be parallelisable but note that there are speed/quality tradeoffs to be made by choosing an appropriate filtering algorithm.

I suspect most GL drivers just do a simple box blur, which is OK but not necessarily the highest quality way of downscaling mipmaps. GIMP offers a whole load of filtering options: Lanczos, Mitchell, Bicubic etc.
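For reference, the simple 2x2 averaging that drivers typically use amounts to something like the following sketch (names mine; it assumes even dimensions and tightly packed RGBA8, and rounds to nearest rather than truncating):

```cpp
#include <cstdint>

// Halve an RGBA8 image by averaging each 2x2 texel block (box filter).
// Assumes even width/height and tightly packed rows.
void downsample_box_rgba8(const uint8_t* src, int w, int h, uint8_t* dst) {
    const int dw = w / 2, dh = h / 2;
    for (int y = 0; y < dh; y++) {
        for (int x = 0; x < dw; x++) {
            const uint8_t* p0 = src + ((2 * y) * w + 2 * x) * 4;  // top-left
            const uint8_t* p1 = p0 + 4;                           // top-right
            const uint8_t* p2 = p0 + w * 4;                       // bottom-left
            const uint8_t* p3 = p2 + 4;                           // bottom-right
            uint8_t* d = dst + (y * dw + x) * 4;
            for (int c = 0; c < 4; c++)
                d[c] = (uint8_t)((p0[c] + p1[c] + p2[c] + p3[c] + 2) / 4);  // +2 rounds to nearest
        }
    }
}
```

Wider kernels like Lanczos or Mitchell replace the plain 2x2 average with a weighted sum over more source texels, which is where the speed/quality tradeoff comes in.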


19 hours ago, OrbWeaver said:

I suspect most GL drivers just do a simple box blur, which is OK but not necessarily the highest quality way of downscaling mipmaps. GIMP offers a whole load of filtering options: Lanczos, Mitchell, Bicubic etc.

I looked in compressonator; it uses a box filter (CMP_GenerateMipLevelF in cmp_boxfilter.cpp):

                    for (int i = 0; i < 4; i++)
                        *pDst++ = (c1[i] + c2[i] + c3[i] + c4[i]) / T(4.f);

with rounding down.
By the way, given that c1[i], c2[i], c3[i], c4[i] and T are all unsigned char, why does this code not overflow?

Note that this is not a "blur", this is averaging of 2x2 blocks. It is not even an arbitrary resize, where old texels and new texels interact in an arbitrary way.


Some notes about the main algorithm.

Pixel values [0..255] are treated as float values [0..1] (a standard thing in image processing).
 

The 8 values which can be selected from are called "ramp points".
When the min/max ramp points are determined, all the other ramp points are computed by formula and rounded to the nearest integer (see GetRmp1; _bFixedRamp is true in our case).
Then every input pixel value is encoded as the closest ramp value (see Clstr1). If several candidates are equally close, it's hard to say which one is chosen, due to numeric errors from dividing by 255.

The format supports two orders of the min/max ramp values (see here). The auxiliary one always has 0 and 255 values. That's good for alpha transparency, but seems pointless for normal maps. Still, compressonator's algorithm tries both ways and chooses the one which is better (see CCodec_DXTC::CompressAlphaBlock).

Now, the hardest part is selecting the min/max ramp points, and it is far from simple (see CompBlock1). It includes some search when max-min is greater than 48, plus some "refinement" that tries to minimize error while ignoring rounding. I won't be able to reproduce this, that's for sure.


However, there is also a setting for compression speed, and if you set it to fast or super fast, then different code is used instead of all the stuff described above (see DXTCV11CompressAlphaBlock). For some reason, it is disabled under Windows. There is also a good explanation at the beginning of the dxtc_v11_compress.c file.

This is much less arcane, although there are still several hardcoded candidates for min/max ramp points (it tries to add plus/minus one to them).


An initial stab at using "forceHighQuality" didn't seem to work, but I feel my whole experiment was flawed, since disabling normal compression also failed to change the image (different images with the same base name). This makes me fear that folder precedence over pk4 might be broken for this somehow? I will try again tomorrow, repacking the images instead.


Note that as long as image_usePrecompressedTextures is on, this takes precedence over all other image_ options, which means you will see no difference from disabling image_useNormalCompression. If the engine is looking for a precompressed image and it finds one, it assumes it is correct, and does not bother loading and compressing the uncompressed original.

I don't know how this interacts with forceHighQuality though. Does forceHighQuality completely bypass any DDS loading (including of precompressed images), or does it just selectively disable image_use[Normal]Compression on a particular source image but still allow the loading of a precompressed version if present? This would need testing or examination of the code.

