Cairo convert to monochrome? - cairo

Is there a straightforward way of converting a Cairo RGB surface to 1 bit monochrome (FORMAT_A1) using operators, or does it require iterating through all the pixels?
There are numerous examples of converting to greyscale, where the hue and saturation are discarded and only the luminosity of the RGB channels is kept. The problem is that in A1 surfaces the 1 bit data comes from the alpha channel, and I can't find any method for translating luminosity (in the RGB channels) to alpha.
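If no operator-based route exists, the pixel-iteration fallback the question alludes to can be sketched as follows. This is plain Python over raw pixel tuples, not the Cairo API; the Rec. 601 luma weights and the 50% threshold are illustrative assumptions, and a real implementation would read the RGB surface data and write packed bits into the A1 surface buffer:

```python
def rgb_to_1bit(pixels, threshold=0.5):
    """pixels: iterable of (r, g, b) tuples with 0-255 channels.
    Returns a list of 0/1 values (1 = opaque in an A1-style alpha surface)."""
    out = []
    for r, g, b in pixels:
        # Rec. 601 luma weighting of the RGB channels, normalised to [0, 1].
        luma = (0.299 * r + 0.587 * g + 0.114 * b) / 255.0
        # Threshold luminosity to produce the 1-bit "alpha" value.
        out.append(1 if luma >= threshold else 0)
    return out
```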

Related

Image processing-YUV to Rgb

Why does one convert from YUV to RGB? What is the advantage, in image processing with Matlab, of doing such a conversion? I know the answer partially: Y is the light component, which gets eliminated in the RGB format. What is the basis of such conversions?
I'll tell you what you could have easily found on the internet:
YUV was introduced when colour TVs came up. There had to be minimal interference with existing monochrome TVs, so the colour information (U, V) was added to the luminance signal (Y).
Due to the way digital colour images are captured (using red-, green- and blue-pass-filtered pixels), the native colour space for digital images is RGB.
Modern displays also use red, green and blue pixels.
For printing you will find the CMYK colour space instead.
Nowadays RGB is the default colour space in digital image processing, as we usually process the raw image information. You won't find many algorithms that can handle YUV images directly.
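The separation of luminance from colour can be made concrete with the analog BT.601 coefficients (a common definition of YUV; other standards use different constants). A minimal sketch:

```python
def rgb_to_yuv(r, g, b):
    """Analog BT.601 RGB -> YUV, channels in [0, 1].
    Y carries the luminance (what a monochrome TV displays);
    U and V carry only colour-difference information."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)  # scaled B - Y difference
    v = 0.877 * (r - y)  # scaled R - Y difference
    return y, u, v
```

Note that for any grey input (r == g == b) the chroma channels come out zero, which is exactly why the scheme was backwards compatible with monochrome receivers.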

Create a satellite true color image using Matlab

I'm trying to create a true color RGB image from satellite data using Matlab, but I don't know how to do it.
The false color RGB image is simple: just assign the right channels to red, green and blue and you can make it
RGB(:,:,1)=(ref16)'; %red - reflectance 1.6mic
RGB(:,:,2)=(ref06)'; %green - reflectance 600nm
RGB(:,:,3)=(ref05)'; %blue - reflectance 500nm
image(RGB)
In this case I'm using reflectances from the satellite channels, which range from 0 to 1, so I don't need to modify the original data.
But I'm having a lot of trouble when I try to plot true color images.
According to literature, the following profile should yield good RGB images from MERIS Level-1b data products (the data I'm using). The linear-combinations for the red, green and blue components are based on the colour matching functions of the CIE 1931 color space.
RGB(:,:,1)=log(1.0+0.35*radiance_2+0.60*radiance_5+radiance_6+0.13*radiance_7)'
RGB(:,:,2)=log(1.0+0.21*radiance_3+0.50*radiance_4+radiance_5+0.38*radiance_6)'
RGB(:,:,3)=log(1.0+0.21*radiance_1+1.75*radiance_2+0.47*radiance_3+0.16*radiance_4)'
The radiances are real values going from 0 to 400 (with the scale factor applied), so I guess that I have to normalize the RGB array (to 0-1 or 0-255) to create the image.
But doing the normalization myself or just using im2uint8 doesn't produce the right image.
It's likely that I'm doing everything wrong because I'm not familiar with colour profiles. Is there a way in matlab to create the image using directly the CIE rgb combination (the one I think I'm getting from the above formulas)?
Is anyone out there familiar with images using matlab and satellite data?
Thanks!
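A hedged sketch of the normalisation step in plain Python rather than MATLAB: the band weights are copied from the formulas in the question, and the joint (not per-channel) min/max stretch is an assumption about what the normalisation should be, chosen so the channel ratios are preserved:

```python
import math

def true_color_pixel(rad):
    """rad: dict mapping MERIS band index -> radiance value (0-400).
    Returns the log-compressed CIE-weighted (R, G, B) combination
    from the question, before normalisation."""
    r = math.log(1.0 + 0.35 * rad[2] + 0.60 * rad[5] + rad[6] + 0.13 * rad[7])
    g = math.log(1.0 + 0.21 * rad[3] + 0.50 * rad[4] + rad[5] + 0.38 * rad[6])
    b = math.log(1.0 + 0.21 * rad[1] + 1.75 * rad[2] + 0.47 * rad[3] + 0.16 * rad[4])
    return r, g, b

def normalize(values):
    """Linear stretch of a flat list to [0, 1]. Applied jointly over all
    three channels so their relative magnitudes (the colour) survive."""
    lo, hi = min(values), max(values)
    if hi <= lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]
```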

14 Bit RGB to YCrCb

I have a 14-bit image that I'd like to convert to the YCrCb color space. As far as I know, the conversions are written for 8-bit images. For instance, when I use the Matlab function rgb2ycrcb and convert the result back to RGB, the image comes out all white. It is very important for me not to lose any information. What I want to do is separate luminance from chroma, do some processing, and convert back to RGB.
The YCbCr standard for converting quantities from the RGB colour space was specifically designed for 8-bit colour channel images. The scaling factors and constants are tailored so that the input is a 24-bit RGB image (8 bits per channel). By the way, your notation is confusing: usually xx-bit RGB denotes how many bits in total are required to represent the image.
One suggestion I could make is to rescale your channels independently so that they go from [0-1] for all channels. rgb2ycbcr can accept floating point inputs so long as they're in the range of [0-1]. Judging from your context, you have 14 bits representing each colour channel. Therefore, you can simply do this, given that your image is stored in A and the output will be stored in B:
B = rgb2ycbcr(double(A) / (2^14 - 1));
You can then process your chroma and luminance components using the output of rgb2ycbcr. Bear in mind that the components will also be normalized between [0-1]. Do your processing, then convert back using ycbcr2rgb, then rescale your outputs by 2^14 - 1 to bring your image back into 14-bit RGB per channel. Assuming Bout is your output image after your processing in the YCbCr colour space, do:
C = round((2^14 - 1)*ycbcr2rgb(Bout));
We round because ycbcr2rgb will most likely return floating point values, and your image colour planes need to be unsigned integers.
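The same normalise / convert / de-normalise round trip can be sketched in plain Python. Full-range BT.601 (JPEG-style) YCbCr is used here as a stand-in for MATLAB's rgb2ycbcr/ycbcr2rgb pair, which actually maps to video range, so the constants differ from MATLAB's; the key point is dividing by 2^14 - 1 on the way in and multiplying (with rounding) on the way out:

```python
SCALE = 2**14 - 1  # maximum value of a 14-bit channel

def forward(r14, g14, b14):
    """14-bit RGB -> full-range YCbCr, all components in [0, 1] / [-0.5, 0.5]."""
    r, g, b = r14 / SCALE, g14 / SCALE, b14 / SCALE
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def inverse(y, cb, cr):
    """Full-range YCbCr -> 14-bit RGB, rounded back to integers."""
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return tuple(round(c * SCALE) for c in (r, g, b))
```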

Block artifact in converting RGB 2 HSV

I would like to convert an image from RGB space to HSV in MATLAB and use the Hue.
However, when I use 'rgb2hsv' or some other code that I found on the internet, the Hue component has block artifacts. An example of the original image and the block-artifact version are shown below.
Original
Hue
I was able to reproduce your error. For those of you who are reading and want to reproduce this image on your own end, you can do this:
im = imread('http://i.stack.imgur.com/Lw8rj.jpg');
im2 = rgb2hsv(im);
imshow(im2(:,:,1));
This code will produce the output image that the OP has shown us.
You are directly using the Hue and showing the result. You should note that Hue does not have the same interpretation as grayscale intensity as per the RGB colour space.
You should probably refer to the definition of the Hue. The Hue basically refers to how humans perceive the colour to be, or the dominant colour that is interpreted by the human visual system. This is the angle that is made along the circular opening in the HSV cone. The RGB colour space can be represented as all of its colours being confined into a cube. It is a 3D space where each axis denotes the amount of each primary colour (red, green, blue) that contributes to the colour pixel in question. Converting a pixel into HSV, also known as Hue-Saturation-Value, converts the RGB colour space into a cone. The cone can be parameterized by the distance from the origin of the cone and moving upwards (value), the distance from the centre of the cone moving outwards (saturation), and the angle around the circular opening of the cone (hue).
This is what the HSV cone looks like:
Source: Wikipedia
The mapping between the angle of the Hue to the dominant / perceived colour is shown below:
Source: Wikipedia
As you can see, each angle denotes what the dominant colour would be. In MATLAB, this is scaled between [0,1]. As such, you are not visualizing the Hue properly. You are using the Hue channel to directly display this result as a grayscale image.
However, if you do a scan of the values within this image, and multiply each result by 360, then refer to the Hue colour table that I have shown above, this will give you a representation of what the dominant colours at these pixel locations would be.
The moral of this story is that you can't simply use the Hue and visualize that result. Converting to HSV can certainly be used as a pre-processing step, but you should do some more processing in this domain before anything fruitful is to happen. Looking at it directly as an image is pretty useless, as you have seen in your output image. What you can do is use a colour map that derives a relationship between hue and colour like in the Hue lookup map that I showed you, and you can then colourize your image but that's really only used as an observational tool.
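The scan-and-multiply step above can be sketched with Python's standard colorsys module, whose hue convention matches MATLAB's [0, 1] scaling of the first HSV channel:

```python
import colorsys

def hue_degrees(r, g, b):
    """Hue angle in degrees for 0-255 RGB values.
    colorsys returns hue in [0, 1] (same convention as MATLAB's
    rgb2hsv channel 1); multiplying by 360 gives the angle on the
    HSV colour wheel shown in the lookup map."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0
```

Pure red maps to 0 degrees, pure green to 120 and pure blue to 240, which is exactly the dominant-colour reading described above.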
Edit: July 23, 2014
As a bonus, what we can do is display the Hue as an initial grayscale image, then apply an appropriate colour map to the image so we can actually visualize what each dominant colour at each location looks like. Fortunately, there is a built-in HSV colour map that is pretty much the same as the colour lookup map that I showed above. All you would have to do is do colormap hsv right after you show the Hue channel. We can show the original image and this colourized image side-by-side by doing:
im = imread('http://i.stack.imgur.com/Lw8rj.jpg');
im2 = rgb2hsv(im);
subplot(1,2,1);
imshow(im); title('Original Image');
subplot(1,2,2);
imshow(im2(:,:,1)); title('Hue channel - Colour coded');
colormap hsv;
This is what the figure looks like:
The figure may be a bit confusing. It labels the sky with blue as the dominant colour. Although this is confusing, it makes actual sense. On a clear day the sky is blue, and the reason the sky appears gray in this photo is probably due to the contributions of saturation and value. Saturation refers to how "pure" the colour is; as an example, true red (RGB = [255,0,0]) has 100% saturation. Value refers to the intensity of the colour, i.e. how dark or how light the colour is. As such, saturation and value most likely play a part here, which makes the colour appear gray. The few bits of colour that we do see in the image match how we expect to perceive those colours. For example, the red along the side of the jet is perceived as red, and the green helmet is perceived as green. The lower body of the jet is (apparently) perceived as red as well. This I can't really explain, but saturation and value are contributing to the mix so that the overall output colour is about a gray or so.
The blockiness that you see in the image is most likely due to JPEG quantization. JPEG works well because we don't perceive any discontinuities in smooth regions of the image, but it encodes and reconstructs the image in a way that greatly reduces the file size while keeping it nearly as visually appealing as the raw image.
The moral of the story here is that you can certainly use Hue as part of your processing chain, but it is not the entire picture. You will probably need to use saturation or value (or even both) to help you discern between the colours.

Direct conversion from YCbCr to CIE L* a* b*

I would like to convert a pixel value in YUV (YCbCr) to CIE L* a* b* color space. Do I have to go through RGB and CIEXYZ color space or do anyone know a formula for direct conversion?
You need to go through each step. YCbCr is often encoded over video range (16-235 for luma, 16-240 for chroma in 8 bit), and that needs to be converted to XYZ using a particular video RGB space definition (e.g. Rec. 709 for high definition), which involves undoing the per-channel non-linearity of the RGB and then multiplying by the RGB-to-XYZ primary matrix. You then need to supply a white point (typically D65, the one in the RGB space definition), apply a different non-linearity, and then another matrix to produce L*a*b*. I doubt there is much efficiency to be gained by combining all these steps into one transform.
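A hedged sketch of the full chain for a single 8-bit video-range BT.709 pixel: YCbCr -> R'G'B' -> linear RGB -> XYZ (D65) -> L*a*b*. The matrices and constants are the standard Rec. 709 / CIE ones, and the sRGB transfer curve is used here as a common stand-in for the Rec. 709 decoding non-linearity:

```python
def ycbcr_to_lab(y8, cb8, cr8):
    """8-bit video-range BT.709 YCbCr -> CIE L*a*b* (D65 white)."""
    # 1. Undo video range (16-235 luma, 16-240 chroma) and the BT.709 matrix.
    y = (y8 - 16) / 219.0
    cb = (cb8 - 128) / 224.0
    cr = (cr8 - 128) / 224.0
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb

    # 2. Undo the per-channel non-linearity (sRGB curve as an approximation).
    def to_linear(c):
        c = min(max(c, 0.0), 1.0)
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (to_linear(c) for c in (r, g, b))

    # 3. Linear RGB -> XYZ using the Rec. 709 / sRGB primary matrix.
    x  = 0.4124 * r + 0.3576 * g + 0.1805 * b
    yy = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z  = 0.0193 * r + 0.1192 * g + 0.9505 * b

    # 4. XYZ -> L*a*b*, normalised to the D65 white point.
    xn, yn, zn = 0.95047, 1.0, 1.08883
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / xn), f(yy / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

Video-range white (235, 128, 128) should land near L* = 100 with near-zero a* and b*, and video black (16, 128, 128) at L* = 0, which is a quick sanity check on the chain.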