Hi, I'm new to color management. I noticed my tests started failing after upgrading to r152, and it's because of the color management changes.
I have a custom color object that eventually creates a three.js Color object.
I am picking colors in a straightforward RGB fashion, with integer values between 0 and 255 for each channel.
In my test I do the following: I assign 10 to my custom color's r value; g and b are both 0.
Some time later, the custom color object produces a hex value from its channels (0x0a0000), creates a THREE.Color object and calls setHex(0x0a0000).
Before r152, this would result in the three.js color.r having the value 0.0392156862745098 (i.e. 10 / 255).
With r152, the result (color.r) is now 0.0030352698352941175.
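Here's a minimal repro of what I'm seeing. The channel-packing line is my stand-in for what my custom color object does internally, not its real code:

```js
import * as THREE from 'three';

// Stand-in for my custom color object: integer channels in 0..255.
const custom = { r: 10, g: 0, b: 0 };

// Pack the channels into a single hex number (0x0a0000 here).
const hex = (custom.r << 16) | (custom.g << 8) | custom.b;

const color = new THREE.Color().setHex(hex);

console.log(color.r);
// before r152: 0.0392156862745098    (just 10 / 255)
// with r152:   0.0030352698352941175 (10 / 255 pushed through the sRGB -> linear transfer function)
```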
Of course, after reading this article, I fixed it by saying:
color.setHex(0x0a0000, THREE.LinearSRGBColorSpace)
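With that extra argument the old value comes back:

```js
import * as THREE from 'three';

const color = new THREE.Color();

// Declare the incoming value as already Linear-sRGB, so no conversion happens.
color.setHex(0x0a0000, THREE.LinearSRGBColorSpace);
console.log(color.r); // 0.0392156862745098 - same as before r152
```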
Here is my question:
The default of Color.setHex is setHex(hex, colorSpace = SRGBColorSpace). It does not feel very intuitive to say color.setHex(0x0a0000, THREE.LinearSRGBColorSpace), because the value I am passing is a hex value, and in my mind an ordinary sRGB hex value.
What am I missing?
Does color.setHex(hex, THREE.LinearSRGBColorSpace) read:
a) 'set from hex, and convert the number to THREE.LinearSRGBColorSpace'
or
b) 'set from hex, and the number I am giving you is already in THREE.LinearSRGBColorSpace'?
And how should getHex() be read (by a human being)?
OK, no need for an answer - writing it out made it click: the answer is b). The colorSpace argument tells setHex which color space the incoming value is in; setHex then converts from that space into the working color space (Linear-sRGB). Passing THREE.LinearSRGBColorSpace says 'this value is already linear, leave it alone', which is why it restores the pre-r152 numbers. And getHex(colorSpace) reads the other way around: 'give me this color expressed in colorSpace', with sRGB as the default.
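For anyone else landing here, this is a paraphrased sketch of what I believe r152's THREE.Color.setHex does (based on my reading of src/math/Color.js, not the literal source):

```js
import * as THREE from 'three';

// Paraphrased sketch of THREE.Color.setHex in r152 (not the literal source):
function setHexSketch(color, hex, colorSpace = THREE.SRGBColorSpace) {
  // 1. Unpack the 0..255 channels and normalize them to 0..1.
  color.r = (hex >> 16 & 255) / 255;
  color.g = (hex >> 8 & 255) / 255;
  color.b = (hex & 255) / 255;

  // 2. Interpret those values as being in `colorSpace` and convert them into
  //    the working color space (Linear-sRGB). With the default SRGBColorSpace
  //    this applies the sRGB -> linear transfer function; with
  //    LinearSRGBColorSpace it is a no-op.
  THREE.ColorManagement.toWorkingColorSpace(color, colorSpace);
  return color;
}
```

And a round trip through getHex shows the symmetry:

```js
const c = new THREE.Color().setHex(0x0a0000); // input interpreted as sRGB
c.getHex();                                   // 0x0a0000 - converted back to sRGB
c.getHex(THREE.LinearSRGBColorSpace);         // the working-space channels packed as-is
```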
I think I get it now - I had to ask the question to know what to look for in an answer. At the top I was confused because I was reading the 'I'm still using renderer.outputEncoding = LinearEncoding' section of the guide, since in my mind the cure would be there, rather than the 'I'm already using renderer.outputEncoding = sRGBEncoding' section.