Thomas Mansencal
@KelSolaar

Hi @tenberg,

I would need to check again but CIE1994 and CIE2000 are labelled quasimetrics, and should not be symmetrical.

However, testing it, CIE2000 is indeed symmetrical!
>>> Lab_1 = np.array([100.00000000, 21.57210357, 272.22819350])
>>> Lab_2 = np.array([100.00000000, 21.57210357, 269.22819350])
>>> delta_E_CIE1994(Lab_1, Lab_2)
0.22984929951933436
>>> delta_E_CIE1994(Lab_2, Lab_1)
0.23218794545964785
>>> delta_E_CIE2000(Lab_1, Lab_2)
0.23669273745958291
>>> delta_E_CIE2000(Lab_2, Lab_1)
0.23669273745958291
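The asymmetry above can be reproduced with a minimal pure-NumPy sketch of the CIE 1994 formula (graphic-arts weights, k_L = k_C = k_H = 1); this is an illustrative re-implementation, not the colour library's code. The weighting functions S_C and S_H depend only on the chroma of the first sample, which is exactly why swapping the arguments changes the result:

```python
import numpy as np

def delta_E_CIE1994(Lab_1, Lab_2, K_1=0.045, K_2=0.015):
    """CIE 1994 colour difference, graphic-arts weights (k_L = k_C = k_H = 1).

    Quasimetric: S_C and S_H are computed from the chroma of the
    *first* (reference) sample, so the function is not symmetric.
    """
    L_1, a_1, b_1 = Lab_1
    L_2, a_2, b_2 = Lab_2
    C_1, C_2 = np.hypot(a_1, b_1), np.hypot(a_2, b_2)
    dL, dC = L_1 - L_2, C_1 - C_2
    # Hue difference squared, clamped against tiny negative rounding error.
    dH_2 = max((a_1 - a_2) ** 2 + (b_1 - b_2) ** 2 - dC ** 2, 0)
    S_C, S_H = 1 + K_1 * C_1, 1 + K_2 * C_1
    return float(np.sqrt(dL ** 2 + (dC / S_C) ** 2 + dH_2 / S_H ** 2))

Lab_1 = np.array([100.0, 21.57210357, 272.22819350])
Lab_2 = np.array([100.0, 21.57210357, 269.22819350])
print(delta_E_CIE1994(Lab_1, Lab_2))  # ~0.2298
print(delta_E_CIE1994(Lab_2, Lab_1))  # ~0.2322
```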
Thomas Mansencal
@KelSolaar
As for the weightings, would you mind opening an issue please?
Thanks!
kwrobert
@kwrobert
Hello everyone. I'm a complete color science newb and have been thrust into this surprisingly vast and bewildering world by requirements for a current work project. Is this a place where I can ask questions about color science in general, as well as specific questions about the colour library? Or is this place strictly for questions about the colour library?
tenberg
@tenberg
Ask away. This group seems very well versed in the art of color science.
Thomas Mansencal
@KelSolaar
Sure, feel free to ask any questions! Given the transient nature of Gitter, you might find it more appropriate to use our Discourse group: https://colour-science.discourse.group/
kwrobert
@kwrobert
Oh sweet heavens you are all my saviors
Thomas Mansencal
@KelSolaar
It is a bit low traffic but replaces our Google Group where some interesting generic discussions occurred in the past.
kwrobert
@kwrobert
I've followed the breadcrumbs left about in the documentation of the library and Bruce Lindbloom's website and managed to make substantial progress, so I don't have anything pressing at the moment, but I'm sure I will in the coming days
For what it's worth, I have now learned the critical importance of the illuminant when converting between color spaces
Got some weird results that had me scratching my head until I figured that out
Thomas Mansencal
@KelSolaar
Great! This is a critical component that is often glossed over.
tenberg
@tenberg
Haha! I read "no" as "not" and @KelSolaar read it as "now". Yes, illuminant is very important.
kwrobert
@kwrobert
I actually do have a question about my process though. I have a picture of an X Rite Passport color checker taken inside a light box I built, using a cheap camera and some LEDs for illumination. I'm trying to understand how much the camera+lighting combination is distorting my color by taking RGB values of each swatch from the image, converting to L*a*b* space, then computing DeltaE between the L*a*b* values from the image and the values published by X Rite at this webpage: https://xritephoto.com/ph_product_overview.aspx?ID=820&Action=support&SupportID=5159
So, two questions: 1) How do I characterize/measure the chromaticity of my LED illuminant? Do I need a spectrometer, or could I use a color sensor like this one? https://ams.com/tcs34725
2) Is there any way I can measure the color difference due to both the camera and the illuminant? If so, when converting to L*a*b* space from the RGB space of the camera, what illuminant chromaticity coordinates should I be using? Should I use the same chromaticity coordinates as the illuminant used by X Rite when publishing their values (D50 standard 2 degree)?
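To see why the choice of white point matters here, below is a minimal NumPy sketch of the XYZ to L*a*b* transform with an explicit reference white: the same XYZ triplet yields different L*a*b* values under D50 and D65. Note this is illustration only; rigorously comparing camera data against X-Rite's D50 values would also require chromatically adapting the XYZ values to D50 (e.g. a Bradford transform), not just swapping the white.

```python
import numpy as np

# Reference whites (2-degree observer), Y normalised to 1.
D50 = np.array([0.96422, 1.00000, 0.82521])
D65 = np.array([0.95047, 1.00000, 1.08883])

def XYZ_to_Lab(XYZ, white):
    """CIE XYZ -> L*a*b* relative to the given reference white."""
    t = np.asarray(XYZ, dtype=float) / white
    e, k = 216 / 24389, 24389 / 27
    f = np.where(t > e, np.cbrt(t), (k * t + 16) / 116)
    return np.array([116 * f[1] - 16,
                     500 * (f[0] - f[1]),
                     200 * (f[1] - f[2])])

# The D50 white itself is neutral relative to D50: L*a*b* = (100, 0, 0)...
print(XYZ_to_Lab(D50, white=D50))
# ...but distinctly non-neutral (yellowish b*) if D65 is wrongly assumed:
print(XYZ_to_Lab(D50, white=D65))
```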
kwrobert
@kwrobert
Side note: I was able to accomplish all of this by using the colour library and the colorchecker detection module. Without these tools, I would have been completely and utterly lost. So, thank you all for releasing your phenomenal work as open source software!
Thomas Mansencal
@KelSolaar

The first consideration here is whether your camera data is linearised properly, i.e. it is devoid of artistic tweaks, e.g. tone curve or LUTs, and has been decoded to linear RGB values. Only then can you do an RGB to Lab conversion that is meaningful.
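For a camera that writes sRGB JPEGs, "linearised properly" means at minimum undoing the sRGB transfer function (IEC 61966-2-1); a sketch, assuming the JPEG really is plain sRGB with no additional tone curve on top:

```python
import numpy as np

def srgb_to_linear(V):
    """Decode non-linear sRGB values in [0, 1] to linear light
    (piecewise IEC 61966-2-1 function: linear toe + 2.4 power)."""
    V = np.asarray(V, dtype=float)
    return np.where(V <= 0.04045, V / 12.92, ((V + 0.055) / 1.055) ** 2.4)

print(srgb_to_linear(0.5))  # ~0.214: sRGB mid-grey is about 21% linear light
```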

Question 1) Spectrometer is the way to go indeed
Question 2) In order to discard the illuminant from the problem/"equation", you would need to measure spectrally both your chart and the illumination source, compute the measured chart values under the measured illuminant, and then, you can compare with the camera captured values of the chart under the same illuminant

The values published by X-Rite are under a very specific illuminant, i.e. D50 (the ICC values), so if you were to compare those to random or even D50-like LED lighting, you would be subject to significant metameric errors from the illumination source itself.
The idea is to remove uncertainty coming from as many variables as possible.
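The "compute the measured chart values under the measured illuminant" step is a weighted integration of reflectance x illuminant x colour-matching functions. A toy NumPy sketch follows; the spectra below are synthetic placeholders, NOT real data (a real computation would use tabulated CIE 1931 2-degree CMFs and the measured spectral distributions):

```python
import numpy as np

def sd_to_XYZ(R, S, cmfs):
    """Tristimulus values of reflectance R under illuminant S, given
    colour-matching functions cmfs (3 x N), all sampled on the same
    wavelength grid. Normalised so a perfect reflector has Y = 100."""
    k = 100 / np.sum(S * cmfs[1])
    return k * (cmfs * R * S).sum(axis=1)

# Synthetic placeholder spectra on a 380-730 nm grid (illustrative only).
wl = np.arange(380, 731, 10)
cmfs = np.stack([np.exp(-0.5 * ((wl - mu) / 50.0) ** 2)
                 for mu in (600, 550, 450)])
S = np.ones(wl.shape)        # stand-in illuminant (equal energy)
R_white = np.ones(wl.shape)  # perfect reflecting diffuser
print(sd_to_XYZ(R_white, S, cmfs))  # Y component is exactly 100
```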
kwrobert
@kwrobert

Ok. So everything you are saying would allow me to isolate color differences caused purely by the camera, eliminating differences due to the illuminant from the problem. That all makes sense to me. I'm still digesting some of what you are saying, so bear with me here. A few things:

  1. I don't have a spectrometer, which I recognize is a problem. In my naivety I bought a color sensor thinking I could use it to measure the color of my light source in RGB, then convert to color temperature/chromaticity.
  2. I also would like to know how much color difference is due to the fact that my illuminant is not D50.
  3. The camera I'm using has terrible documentation and I don't think its designers intended it to receive the level of scrutiny I am applying to it. Here is the camera. I am assuming the RGB data in the JPEG file it produces is in sRGB space, but beyond that I have no idea what manipulations the camera might be performing on the raw sensor outputs.

So, from what I understand of your response, is it completely incorrect and not meaningful to convert the RGB data I get from the camera to L*a*b* space without knowing the chromaticity of my illuminant? Or could I just use D50 as my illuminant, and assume that the DeltaE I get between the X Rite published values and my measured values is due to a combination of the camera and the illuminant? Does that approach make any sense at all?

Thanks for all your help thus far by the way!
kwrobert
@kwrobert

So I recognize there are a few sources of error here

  1. My color checker is different in actual, physical color (regardless of measurement procedure) from the one X Rite used when publishing their data
  2. My illuminant is certainly not a D50, which means even if everything else was perfect, I wouldn't be able to reproduce the X Rite data because my illuminant is distorting the colors
  3. My camera is an inaccurate color measurement device for various reasons (sensor responsivity, bayer filter interpolation, lens, etc.)

I'm willing to neglect 1 as small compared to 2 and 3 and ignore it outright. 2 and 3 are the issues I'm interested in, and I'm wondering if I can measure the DeltaE due to both of those effects simultaneously. So, if I assume my illuminant is D50 (which it's not), and convert to L*a*b* I will get some numbers. I guess my fundamental question is this: Does it make any sense to compare the resulting values of my conversion to the published values at all? Would that give me a quantitative measure of my color difference due to both the camera and my illuminant?
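Treating the distance between the published and measured points as one cumulative error figure can be sketched directly. This uses the plain Euclidean Delta E*ab (CIE 1976) over a hypothetical set of swatch values, purely for illustration (the numbers below are made up):

```python
import numpy as np

def delta_E_1976(lab_ref, lab_meas):
    """Euclidean distance in L*a*b*, per swatch (rows are swatches)."""
    return np.linalg.norm(np.asarray(lab_ref) - np.asarray(lab_meas), axis=-1)

# Hypothetical published reference vs measured values for 3 swatches.
ref = np.array([[38.0, 13.0, 14.0],
                [65.0, 18.0, 18.0],
                [50.0, -5.0, -22.0]])
meas = ref + np.array([3.0, 4.0, 0.0])  # a uniform (3, 4, 0) offset -> dE = 5
dE = delta_E_1976(ref, meas)
print(dE, dE.mean(), dE.max())  # per-swatch, mean and worst-case error
```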

Thomas Mansencal
@KelSolaar

Would that give me a quantitative measure of my color difference due to both the camera and my illuminant?

Yes but this would be hard to use meaningfully.

There are quite a lot of publications to estimate the illuminant given a set of photographs, but they assume some form of knowledge of the spectral distributions of the camera OR they will make some assumptions that are not great
Finding out what the camera is doing, provided you can manually change its exposure settings, is certainly achievable
We were doing things like that a few decades ago, e.g. Debevec (1997) which gives a way to recover camera response functions.
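As a drastically simplified stand-in for Debevec & Malik (1997), which recovers an arbitrary smooth response curve from many exposures, one can at least estimate a single power-law response from two exposures of the same static scene. Everything below is a hypothetical sketch under that power-law assumption, not the published algorithm:

```python
import numpy as np

def estimate_gamma(z_1, z_2, t_1, t_2):
    """Assume a power-law response Z = (E * t) ** (1 / gamma); then per pixel
    log z_1 - log z_2 = (1 / gamma) * log(t_1 / t_2), so gamma follows
    directly from the exposure-time ratio."""
    # Keep only well-exposed pixels, away from the clipped extremes.
    m = (z_1 > 0.05) & (z_1 < 0.95) & (z_2 > 0.05) & (z_2 < 0.95)
    return np.log(t_1 / t_2) / np.mean(np.log(z_1[m]) - np.log(z_2[m]))

# Synthetic check: a scene rendered through a gamma-2.2-like response.
E = np.linspace(0.1, 0.8, 50)          # scene irradiances
z_1 = (E * 1.0) ** (1 / 2.2)           # exposure t = 1.0 s
z_2 = (E * 0.5) ** (1 / 2.2)           # exposure t = 0.5 s
print(estimate_gamma(z_1, z_2, 1.0, 0.5))  # ~2.2
```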
Thomas Mansencal
@KelSolaar
It however takes quite a bit of experience to use successfully, i.e. it is not trivial and you could easily compute a response function that is making things even worse!
Are you trying to achieve a particular goal?
kwrobert
@kwrobert

Are you trying to achieve a particular goal?

Yes. Basically, I built two versions of a light box with a different lighting configuration and LED model in each. The same camera is used in both light boxes. Pictures taken in the first version have very inaccurate color rendering. The second one, to my eye, looks much better. I'm trying to prove to others that the second version with new LEDs drastically improves color rendering in a quantitative, non-subjective way rather than to just show them pictures from the second one and let them decide whether it's an improvement or not. Additionally, I also want to have some quantitative way of measuring the improvement in color rendering relative to the first version as we make adjustments in the second version.

Yes but this would be hard to use meaningfully.

What do you mean by "use meaningfully"? I'm mainly using this computation as a measure of relative improvement in color accuracy between two photographic environments, holding all other things fixed except the shape of the light box and the illuminant used. Is my computation meaningful when used to measure a relative improvement in color accuracy between two versions of my light box? I guess I'm still not grasping how the calculation I'm proposing can't be used in a general sense. The way I'm looking at this problem, there are 2 points in L*a*b* space I'm dealing with:

  1. The point published by X-Rite for the color checker swatch in question. My goal is for measurements of the RGB values in the image produced by my camera to be at this point.
  2. The point I'm actually at after taking measurements from a real image and converting to L*a*b* space.

Ignoring actual physical differences in the color checker, in my mind the DeltaE distance between these two points is a cumulative measure of all error sources in my system, is it not? Does that not have meaning in some sense? Am I missing some critical piece of understanding about the science and math of this problem space? I'm fully willing to accept my own naivety, as everything I know about color science I learned while working on this project by piecemeal collecting and reading sources online as I needed them.
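For the relative comparison between the two box versions, this framing does work: hold everything else fixed, compute the (illuminant-confounded) per-swatch Delta E for each version against the same published values, and compare the distributions. A sketch with entirely made-up numbers:

```python
import numpy as np

# Hypothetical per-swatch Delta E values for two light-box versions,
# each computed against the same reference under identical assumptions.
dE_v1 = np.array([12.4, 9.8, 15.1, 11.0, 8.7, 13.3])
dE_v2 = np.array([5.2, 4.1, 6.9, 5.5, 3.8, 6.0])

improved = dE_v2 < dE_v1
print(f"mean dE: v1={dE_v1.mean():.1f}, v2={dE_v2.mean():.1f}")
print(f"swatches improved: {improved.sum()}/{improved.size}")
```

Because both versions share the same camera, chart, and conversion assumptions, the systematic error largely cancels in the comparison, which is what makes the relative claim defensible even when the absolute Delta E values are not.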

Thomas Mansencal
@KelSolaar

Additionally, I also want to have some quantitative way of measuring the improvement in color rendering relative to the first version as we make adjustments in the second version.

Irrespective of the camera, you can measure the quality of the light itself: assuming you have the spectral distribution, you could use metrics such as CQS or CRI to do so. We have TM-30-18 and CFI17 in a feature branch, but they are not merged yet

should be in the coming days though
With those metrics you can show that indeed your new lights are improving the rendering quality
kwrobert
@kwrobert
Awesome, thanks for your advice Thomas, this was all extremely helpful to me. One last question: Do you have any recommendations for an affordable spectrometer?
Thomas Mansencal
@KelSolaar
Nothing cheap unfortunately :( This one is kind of cheap: https://www.sekonic.com/industrial/c-7000
The good thing is that you can rent them as they are used onset quite often, e.g. https://www.lensrentals.com/rent/sekonic-c-700r-u-spectromaster-color-meter
kwrobert
@kwrobert
Ooof, no chance I'll be able to own one in the foreseeable future, but the rental option will totally work for me. Thanks!
kwrobert
@kwrobert
Another question: Because I am doing some questionable things with my illuminant when converting my measured RGB values to L*a*b* space, does this mean I cannot interpret my results on an absolute scale? I have been reading a little bit about the "threshold of imperceptibility", which I've come to understand is at a DeltaE of around 5 for most people (please correct me if I'm wrong). Because of the shady nature of my conversion to L*a*b* space, my intuition is telling me that even though I can use my calculations as a measure of relative improvement, trying to interpret my DeltaE values on an absolute scale compared to the threshold of imperceptibility would be incorrect because (as you were saying) my DeltaE values lack meaning as an absolute measure of color accuracy.
tenberg
@tenberg
For threshold of imperceptibility, which is referred to as a JND (just noticeable difference), I would use a value of 1 if you are using CIEDE2000. To interpret your results on an absolute scale, you will really have to characterize your entire system (camera, light source, and color checker).
kwrobert
@kwrobert
Ok, that makes sense. As I learned about this whole field of science, it became clearer to me that getting a spectrometer and taking precise measurements end to end would probably be necessary, but this is a project of tight deadlines and cutting corners (unfortunately). Once I've characterized each component of the system, will it be possible to attribute how much of my total DeltaE value for a given color swatch is due to error from each component (DeltaE due to camera, DeltaE due to color checker differences, etc.)? Also, how does one get into the field of color science anyway?
Thomas Mansencal
@KelSolaar
I guess either it is a path you have willingly chosen to follow, e.g. researcher in colour vision or imaging, or you had to understand it for things you were doing and bite the bullet :)
I'm the latter :)
Thomas Mansencal
@KelSolaar
@kwrobert : We have CIE 2017 Colour Fidelity Index now in develop
should you need to rate the spectrum of your lights
kwrobert
@kwrobert
Thanks Thomas! Appreciate all the hard work as always. So what things do you do that require understanding color science? Do you work in manufacturing? Film industry? Genuinely curious ...
Thomas Mansencal
@KelSolaar
Sorry, I have been slammed at work lately!
Film industry, I'm working for Weta Digital in New Zealand.