Assuming I understand what you are saying...
To any processor (CPU or GPU alike), pixel values are just numbers. Normal processing speed shouldn't differ by color, because the processor doesn't care what a number will be used for. A specialized color transformation might be another story: depending on the algorithm, different colors could require different amounts of computation. But in general image processing, adding to the blue channel is no different from adding to the red channel, for example; the only difference is where the numbers are stored in the data structure. Viewing them, on the other hand, is just a matter of scanning the numbers and converting them to displayable signals, and that happens at the frame rate, whatever it is, for all colors alike.
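
To make that concrete, here is a minimal C sketch (the interleaved R,G,B buffer layout and the function name are just illustrative assumptions, not any particular library's API). Brightening blue is the exact same add instruction as brightening red; only the byte offset changes:

#include <stdint.h>
#include <stddef.h>

/* Hypothetical interleaved buffer: pixels stored as R,G,B, R,G,B, ...
   "channel" is just an offset (0 = red, 1 = green, 2 = blue). */
static void brighten_channel(uint8_t *pixels, size_t pixel_count,
                             size_t channel, uint8_t amount)
{
    for (size_t i = 0; i < pixel_count; i++) {
        size_t idx = i * 3 + channel;               /* where the number lives */
        unsigned v = pixels[idx] + amount;          /* same add for any color */
        pixels[idx] = (uint8_t)(v > 255 ? 255 : v); /* clamp to one byte */
    }
}

Swap channel from 0 to 2 and nothing about the work changes except which bytes get touched.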
I have seen cases where a different number of bits is used to store one color versus another. Our perception is not equally sensitive to all colors. I saw a video on YouTube recently about how some languages don't even have a word for "blue" but have many different words for different shades of green. If people can truly differentiate intensities of green better than intensities of blue, then it makes sense to store blue values with fewer bits than either green or red. I think I saw this implemented in a computer's display system in the 1980s, when one byte stored the color of a pixel: red and green got 3 bits each, while blue got only 2.
┌─┬─┬─┬─┬─┬─┬─┬─┐
│ Red │ Grn │Blu│
└─┴─┴─┴─┴─┴─┴─┴─┘
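
That 3-3-2 split is commonly called RGB332. Here's a minimal sketch in C of packing and unpacking such a byte (function names are just illustrative): each 8-bit component keeps only its high bits, which is exactly where the precision loss lands hardest on blue.

#include <stdint.h>

/* Pack 3-3-2: red and green keep their top 3 bits, blue its top 2. */
static uint8_t pack_rgb332(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint8_t)((r & 0xE0) | ((g & 0xE0) >> 3) | (b >> 6));
}

/* Unpack: restore the surviving high bits; the discarded low bits
   come back as zero. */
static void unpack_rgb332(uint8_t p, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (uint8_t)( p & 0xE0);        /* bits 7-5: RRR00000 */
    *g = (uint8_t)((p & 0x1C) << 3);  /* bits 4-2 -> GGG00000 */
    *b = (uint8_t)((p & 0x03) << 6);  /* bits 1-0 -> BB000000 */
}

With only 2 bits, blue can take just 4 distinct levels versus 8 for red and green, which is the trade-off the perception argument above is banking on.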