Specific file results in very bad compression results when mipmaps are enabled #759
Please list the command and options that are being used. Images are compressed to ETC1S individually. BasisLZ is then applied to all levels, starting from the largest, using a global codebook from which previously seen runs of data may be reused. It is therefore most unlikely that adding mipmaps could affect the quality of the base image. @hybridherbst, are you sure the two images you posted here are displaying the same mip level?
@MarkCallow I'm not sure whether this displays the same mip level, but I am sure that this is the mip level viewers like three.js and others choose when displaying the file. In the original zip I added .glb files where the KTX2 textures do and do not have mipmaps, as per the commands below. The command is pretty much "default options for a linear file" as far as I can tell:
vs.
Here are the extracted textures again:
I disagree. Almost-identical textures don't show the same level of quality loss; this is the first time we're seeing this degree of degradation across thousands of processed textures. And yes, in some cases where file size matters more we also use ETC1S for data textures, and so far that has never produced such bad compression. Please don't get me wrong: I just want to figure out where this bug comes from and whether there's a way to avoid it. We're very happy with libktx in general, and from my perspective this is an outlier.
Each level is calculated from the original texture. The screenshot I posted of the uncompressed texture shows the images that will be input to the BasisU encoders.
@hybridherbst, have you made similar metallic-roughness textures using only the default ETC1S/BasisLZ parameters and not had this problem? If so, please try the interactive glTF-Compressor tool and see if you can get better results for this texture by fine-tuning the compression parameters. If something is going on, it is happening within the BasisU ETC1S encoder, which is outside my expertise, so I must consult @richgel999. Sorry to bother you, Rich, but please give us your opinion on this. Follow this link, metallicRoughness.zip, for the original JPG; the link provided in the initial comment only has .glb files.
@hybridherbst, have you found a similar metallic-roughness texture that does not display this issue? Have you tried the interactive tool?
The source file
Did you mean "to filter in linear space" here? I don't quite understand this statement...

Data textures (like metallic/roughness) are always linear in glTF. They cannot be sRGB, and the glTF spec is strict about saying that color-space metadata in the file MUST be ignored, because the metadata is so often wrong. Tools processing glTF files should always treat these textures as linear.

If the source file really contains sRGB-encoded metallic-roughness data, though, the data loss is baked into the source file.
We're using ETC1S for data textures just fine, except for this one singular texture, which produces so much worse results. I'm not sure I understand both of your comments above; it seems like Mark is saying "never use
I don't think there's any other choice glTF Transform could make here. It's certainly unfortunate that image metadata is such a mess, though. :/
Yes, I meant filter in linear space. The data, according to the file metadata, is sRGB.

@donmccurdy, glTF-Transform should issue a warning if someone tries to use an image whose metadata says sRGB for data use. Image tools these days generally set the metadata correctly, so such a mismatch suggests a cockpit error when producing the image.
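Filtering in linear space matters because the sRGB transfer function is nonlinear: averaging encoded values is not the same as averaging the quantities they represent. A small illustration (not from the thread; the transfer functions are the standard IEC 61966-2-1 formulas):

```python
# Illustration: averaging two texels for a mip level, once naively on the
# sRGB-encoded values and once correctly in linear space.

def srgb_to_linear(v: float) -> float:
    """IEC 61966-2-1 sRGB decode (EOTF), per channel in [0, 1]."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v: float) -> float:
    """IEC 61966-2-1 sRGB encode, per channel in [0, 1]."""
    return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

# Two neighbouring texels, stored as sRGB-encoded values.
a, b = 0.0, 1.0

# Wrong: filtering the encoded values directly.
naive = (a + b) / 2                                            # 0.5

# Right: decode, filter in linear space, re-encode.
correct = linear_to_srgb((srgb_to_linear(a) + srgb_to_linear(b)) / 2)

print(naive, correct)  # the correct result is noticeably brighter (~0.735)
```

The same arithmetic shows why treating linear data as sRGB (or vice versa) during mip generation skews every downsampled level.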
Make sure the source image is linear.
Note that if you specify a UNORM format, the data is treated as linear. As I wrote earlier, you should be really sure the metadata is wrong before forcing linear. @donmccurdy, do you have any evidence that the metadata is wrong for reasons other than cockpit error?
@MarkCallow I have no evidence either way for the specific texture in this thread. It could certainly be user error! But in the broader 3D modeling ecosystem, I would not describe the problem as one of user error. These decisions are being made at lower levels of software abstraction than most users understand or control — they export a model from Substance Painter, Maya, 3DS Max, or Blender, with 5-10 textures embedded. They do not know which texture uses which color space, and they probably don't have control over that choice even if they do understand it. For a quick test, I opened two examples from the glTF Sample Models repository. These are simply the first two samples I opened — I did not have to go digging. Mosquito In Amber:
Sheen Chair:
I agree that "image tools these days generally set the metadata correctly", but only in the context of tools for opening and editing one texture at a time. Within the context of a larger 3D art workflow with many models and textures, there are "weak links" more often than not. For three.js, the web platform does not even give us the ability to read the color space of a JPEG or PNG texture (KTX2 is an exception) without duplicating the texture in memory.

I'd be glad to see the glTF Validator, glTF Transform, and other tools flag the issue more clearly. Perhaps we can improve the state of the 3D ecosystem. But it's a problem that long predates glTF, and I consider the image metadata to be mostly useless today, outside of very carefully controlled workflows.
@donmccurdy there is a difference between the metadata being wrong or non-existent and the colorspace of the source being inappropriate for the task.
@MarkCallow In my experience the second case — data using an inappropriate colorspace with correct metadata — is rare. It's overwhelmingly more common that the metadata is wrong. Easily a 100:1 ratio. Perhaps this ticket represents an exception, but I can't change the default behavior based on that. It doesn't seem that there's a general solution here, at the level of either KTX-Software or glTF Transform. As you said above, the texture data must be linear.
@donmccurdy how do you know this? My experience with image-editing tools is that encoding the data and setting the metadata tend to go together when writing a file. In my previous comment I should have noted that a better choice for the second case is --convert-oetf.
Pinging again. For example in the following case
how do you know the metadata is wrong vs. the file having been inadvertently encoded into sRGB? I think it is basically a guess, in which case glTF-Compressor should warn the user that either the metadata is wrong or the wrong encoding has been used. It should have an option or options for the user to indicate which is the case: --assign-oetf for the first and --convert-oetf for the second.
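For what it's worth, the difference between the two options can be sketched on a single channel value: assigning a transfer function only relabels the file's metadata, while converting decodes and re-encodes the texel values themselves. A toy illustration (assumed semantics of the options named above, not code from KTX-Software):

```python
# Toy model of "assign" vs. "convert" on one channel value.

def srgb_to_linear(v: float) -> float:
    """Standard IEC 61966-2-1 sRGB decode, per channel in [0, 1]."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

# A texel value as stored in the file, plus the file's transfer-function label.
stored, label = 0.5, "sRGB"

# assign linear: only the label changes; the stored bytes are untouched.
# (Correct when the data was linear all along and the metadata was wrong.)
assign = (stored, "linear")

# convert to linear: the values are decoded from sRGB, and the label then
# matches the new encoding. (Correct when the data really was sRGB-encoded.)
convert = (srgb_to_linear(stored), "linear")

print(assign)   # (0.5, 'linear')
print(convert)  # (~0.214, 'linear')
```

Picking the wrong one of the two produces exactly the kind of silent value skew discussed in this thread, which is why prompting the user seems preferable to guessing.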
This is simply my experience with PBR materials and their associated textures. It's also the reason why glTF specifies ICC profiles must be ignored. I agree that modern "image editing tools" are pretty good about color spaces. But 3D art pipelines are more complicated and less consistent. In particular — I think the distinction between "sRGB color" and "non-color data" is not consistently represented in metadata. Metadata for wide gamut colorspaces like "Display P3" or "Rec. 2020" is less likely to be a side effect of broken workflows, and is therefore more reliable when present. |
You wrote, "I think the distinction between 'sRGB color' and 'non-color data' is not consistently represented in metadata."

Again I ask: how can you tell whether the metadata contradicts what's actually in the file, or whether the non-color data has been incorrectly (as in "it has but should not have been") encoded into sRGB? Is there any image metadata that describes a distinction between color and non-color data? Metadata offering a choice between linear and sRGB does not; the latter simply says the input data has been encoded into a non-linear space. I still think glTF-Compressor should inform the user and give them the opportunity to guide it to a resolution, instead of deciding the metadata always contradicts the data that is actually in the file. That can lead to much worse results than decoding the sRGB data, if that is what it is, and using that.
I cannot look at a JPG containing metal/rough data, annotated as "sRGB", and tell you with confidence whether the metadata is wrong or the actual choice of encoding is wrong, other than a probabilistic expectation that it's usually the metadata. Is that what you're asking?
None that I'm aware of. Omission of an ICC profile perhaps, but I suspect many tools will simply interpret that as "sRGB".
Ok — I have no particular opinion on what the glTF-Compressor should do. I'd certainly be supportive of getting the glTF Validator to report inconsistent image color-space metadata.
Related to the etc1s command: donmccurdy/glTF-Transform#1062

A specific file ends up with very bad quality after compression, but only if mipmaps are enabled.
This texture here: https://github.com/donmccurdy/glTF-Transform/files/12327580/watch.zip
Compression without mipmaps:
Compression with mipmaps:
Compressing without mipmaps results in the expected quality. This is the first time we're hitting a texture that gives such results; however, an outlier like this is quite problematic for using KTX in production pipelines.