Computation of colours

Started by WAS, February 24, 2021, 08:47:59 PM


WAS

Does anyone know if the particular colors a CPU/GPU is computing have any impact on speed, regardless of how small? I was just thinking after talking about astronomy and light diffusion to Earth -- different wavelengths of light moving at different speeds -- and was curious whether anything like that exists in the digital computation field.

Naturally something like this wouldn't be noticed nowadays, but I do vaguely remember old image rendering programs for DOS on my old PC, and when they were scanning an image in from disk, certain colors (pixels) seemed to take longer than others. That may just have been the clocking of old hardware, unrelated to color processing, but I kept thinking about it and couldn't find any answers. It's kind of meaningless trivia, it seems.

Maybe Martin or Matt knows?

PabloMack

I don't think the computation is any different for one color versus another, whether on a CPU or a GPU. The three channels are considered to be in linear space. Where colors are a challenge is in actually turning them into light. That's why the same "colors" look quite different on different monitors.

Linear space is not an efficient way of storing colors when there are only a few bits to represent each pixel, because our perception is actually logarithmic. Spacing the brightness increments closer together at the low end and farther apart at the high end gives a more even perceptual distribution. But the computations would be slower, so I would think most renderers just work in linear because it is faster.

The way this could be done is that brightness would be stored as a logarithmic number. It would be converted to linear for processing and then converted back to logarithmic for storage. I'm sure you know that RGB is sometimes stored as floating point numbers. I don't know how cameras store HDRI images, but I'm sure they do something along these lines to keep from losing resolution at the low end and from clipping at the high end.
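As an illustration, here is a rough sketch of that round trip in Python. The 2.2 power curve is just an assumed stand-in for whatever curve a real format actually uses:

GAMMA = 2.2

def decode_to_linear(stored):
    # 8-bit stored value (0..255) -> linear light (0.0..1.0)
    return (stored / 255.0) ** GAMMA

def encode_for_storage(linear):
    # linear light (0.0..1.0) -> 8-bit stored value (0..255)
    return round((linear ** (1.0 / GAMMA)) * 255)

# Example: average two stored pixel values in linear space,
# then re-encode the result for storage.
a, b = 10, 200
avg_linear = (decode_to_linear(a) + decode_to_linear(b)) / 2.0
print(encode_for_storage(avg_linear))  # differs from the naive (10 + 200) // 2

Averaging in linear and re-encoding gives a noticeably different answer than averaging the stored numbers directly, which is why the conversion step matters.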

WAS

Hmm. I think I understand. Though I suppose it still seems strange that every color would be computed at exactly the same speed, without any difference in overhead to create it.

I wonder if I could benchmark it with a language creating solid colours per-pixel in loops, say 200x200 raster images, and see if any colours take longer than others. Maybe just benching the actual code to generate them, not saving or anything.
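Something along these lines, maybe. A rough Python sketch; the 200x200 size, repeat count, and the particular colours are just placeholders:

import time

# Rough benchmark sketch: fill a 200x200 "image" with one solid colour
# and time it for a few different colours.
WIDTH, HEIGHT, REPEATS = 200, 200, 100
colours = {"black": (0, 0, 0), "white": (255, 255, 255),
           "red":   (255, 0, 0), "blue": (0, 0, 255)}

for name, (r, g, b) in colours.items():
    start = time.perf_counter()
    for _ in range(REPEATS):
        image = [[(r, g, b) for _ in range(WIDTH)] for _ in range(HEIGHT)]
    elapsed = time.perf_counter() - start
    print(f"{name}: {elapsed:.4f} s")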

PabloMack

Assuming I understand what you are saying...

To any processor (CPU or GPU alike) they are just numbers. The speed of normal processing of them shouldn't be any different because the processor doesn't care about what the number is going to be used for. Now doing some sort of color transformation for a special purpose might be different. Depending on the algorithm, the different colors might require different amounts of computation. But in general image processing like adding blues is no different from adding reds, for example. The only difference is where the numbers are stored in the data structure. Viewing them, on the other hand, is just a matter of scanning the numbers and converting them to displayable signals and this is done at the frame rate, whatever that is, for all colors alike.
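For example, brightening the blue channel of a pixel is the same arithmetic as brightening the red channel; only the position within the pixel differs. A tiny sketch, assuming pixels stored as 8-bit RGB tuples:

# The arithmetic on a channel is the same no matter which colour it is;
# only the index into the pixel tuple changes.
def brighten(pixel, channel, amount):  # channel: 0 = red, 1 = green, 2 = blue
    value = min(255, pixel[channel] + amount)
    return pixel[:channel] + (value,) + pixel[channel + 1:]

print(brighten((100, 50, 20), 0, 30))  # brighten red  -> (130, 50, 20)
print(brighten((100, 50, 20), 2, 30))  # brighten blue -> (100, 50, 50)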

I have seen instances where there may be a different number of bits to store one color versus another. Our perception is not the same for different colors. I saw a video on YouTube recently that talked about how some languages do not even have a word for "blue" but have many different words that mean different shades of green. If people can truly differentiate between intensities of green better than they can blue, then it might make sense to store blue values with fewer bits than either green or red. I think I saw this implemented in a computer's display system in the 1980s, when one byte stored the color of a pixel: green and red had 3 bits each while blue only had 2.
┌─┬─┬─┬─┬─┬─┬─┬─┐
│ Red │ Grn │Blu│
└─┴─┴─┴─┴─┴─┴─┴─┘
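Packing and unpacking that layout would look something like this rough sketch; the exact bit order (red in the high bits, blue in the low bits) is an assumption:

def pack_rgb332(r, g, b):
    # Keep the top 3 bits of red and green and the top 2 bits of blue.
    return (r >> 5) << 5 | (g >> 5) << 2 | (b >> 6)

def unpack_rgb332(byte):
    # Expand back to approximate 8-bit values.
    r = (byte >> 5) & 0b111
    g = (byte >> 2) & 0b111
    b = byte & 0b11
    return (r * 255 // 7, g * 255 // 7, b * 255 // 3)

packed = pack_rgb332(200, 100, 50)
print(packed, unpack_rgb332(packed))  # e.g. 204 (218, 109, 0)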