Possible duplicate: Good way to calculate ‘brightness’ of UIImage?
For a UIImage how can you determine the percentage whiteness of the whole image?
Depending on your definition of 'whiteness', you may be able to simply draw the image to a 1x1 CGBitmapContextRef, then check the whiteness of that single pixel.
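Drawing the whole image into a 1x1 bitmap context effectively averages every pixel into a single value that you can read back. A minimal Python sketch of the same averaging done directly over raw RGB tuples (the helper name and the 0-255 tuple format are assumptions for illustration, not UIKit API):

```python
def average_whiteness(pixels):
    """pixels: iterable of (r, g, b) tuples, each channel 0-255.
    Returns whiteness as a fraction 0.0-1.0, where 1.0 is pure white.
    Equivalent to reading the single pixel of a 1x1 downsample."""
    total = count = 0
    for r, g, b in pixels:
        total += (r + g + b) / 3   # per-pixel gray level
        count += 1
    return total / (255 * count)
```

Note this treats the three channels equally; a perceptual brightness measure would weight green more heavily, which is one reason the 'definition of whiteness' matters.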
Possible duplicate: Best method to find edge of noise image
I have used adapthisteq to improve the visibility of the foreground objects. However, this seems to have introduced grainy noise. How can I remove these grainy details from the image? I have tried Gaussian blurring with imgaussfilt, and while it does remove some of the grain, the shape of the cells in the image becomes less defined. The second image shows the binary image of the first image.
You can use a filter that takes edge information into account, such as the bilateral filter: https://en.wikipedia.org/wiki/Bilateral_filter
The bilateral filter weighs each neighbouring pixel not only by its spatial distance (as regular Gaussian blurring does) but also by the difference in intensity between the pixels, so edges stay sharp while flat regions are smoothed.
(Illustration taken from: http://www.slideshare.net/yuhuang/fast-edge-preservingaware-high-dimensional-filters-for-image-video-processing)
A MATLAB implementation can be found here:
https://www.mathworks.com/matlabcentral/fileexchange/12191-bilateral-filtering
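To make the two-weight idea concrete, here is a naive bilateral filter sketch in pure Python for a small grayscale image. The function name and default parameters are illustrative assumptions; in MATLAB you would use the File Exchange submission above instead of writing the loops yourself:

```python
import math

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=30.0):
    """Naive bilateral filter for a 2-D grayscale image given as a
    list of lists of floats. Each output pixel is a weighted average
    of its neighbours, where the weight is the product of a spatial
    Gaussian (distance in pixels) and a range Gaussian (difference
    in intensity), so strong edges are preserved."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        # spatial weight: distance in pixels
                        g_s = math.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2))
                        # range weight: distance in intensity
                        diff = img[ny][nx] - img[y][x]
                        g_r = math.exp(-(diff * diff) / (2 * sigma_r ** 2))
                        num += g_s * g_r * img[ny][nx]
                        den += g_s * g_r
            out[y][x] = num / den
    return out
```

With a small sigma_r, pixels across a strong edge get almost zero weight, which is exactly why the cell outlines survive where a plain Gaussian blur would smear them.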
I have to store a 1 MB Word file inside a 512x512-pixel image using MATLAB and extract it again. The only thing I know is that we have to overwrite the least valuable bits of the image (the ones that are essentially noise) and store our file there.
Unfortunately I know nothing about either MATLAB or image processing.
Given the numbers provided, you can't: 512x512 pixels at 24 bits per pixel give about 6.3 Mbit (roughly 786 KB), while a 1 MB file needs 8 Mbit. So your document is larger than the image you are hiding it in.
If we ignore the above, then this is what you have to do:
1. Load the image and convert it to uints.
2. Mask out a number of LSB bits in each pixel.
3. Load the document as binary and fill those bits in where you masked the others out.
Now, going from the above to actual code is a bit of work. If you have no experience with MATLAB it won't be easy. Try reading up on imread() and bit operations in MATLAB. When you have some code up and running, post it here for help.
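The three steps above can be sketched in a few lines. This is a pure-Python illustration of the LSB masking idea (function names, the 2-bit depth, and the flat pixel list are assumptions for illustration; real code must also store the payload length and use a lossless image format such as PNG, never JPEG):

```python
def embed_lsb(pixels, payload, n_bits=2):
    """Hide `payload` (bytes) in the n_bits least-significant bits
    of each value in `pixels` (a list of 0-255 ints)."""
    bits = []
    for byte in payload:                      # step 3a: doc -> bit stream
        for i in range(7, -1, -1):
            bits.append((byte >> i) & 1)
    while len(bits) % n_bits:                 # pad to a whole chunk
        bits.append(0)
    if len(bits) > len(pixels) * n_bits:
        raise ValueError("payload too large for this image")
    out = list(pixels)
    mask = 0xFF ^ ((1 << n_bits) - 1)         # step 2: clears the low bits
    for i in range(0, len(bits), n_bits):
        chunk = 0
        for b in bits[i:i + n_bits]:
            chunk = (chunk << 1) | b
        idx = i // n_bits
        out[idx] = (out[idx] & mask) | chunk  # step 3b: fill bits back in
    return out

def extract_lsb(pixels, n_bytes, n_bits=2):
    """Recover n_bytes of payload hidden by embed_lsb."""
    bits = []
    for p in pixels:
        for i in range(n_bits - 1, -1, -1):
            bits.append((p >> i) & 1)
    data = bytearray()
    for i in range(0, n_bytes * 8, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        data.append(byte)
    return bytes(data)
```

With 2 bits per channel each pixel value changes by at most 3, which is invisible in most photos. In MATLAB, bitand(), bitor() and bitshift() play the role of the operators above.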
In MATLAB you can read images with imread()
(details: http://de.mathworks.com/help/matlab/ref/imread.html?s_tid=gn_loc_drop)
Image = imread('Filename.jpg');
figure()
imshow(Image)
This code shows the image in a window.
I think what you're looking for is steganography instead of watermarking.
Steganography:
https://en.wikipedia.org/wiki/Steganography
Here is an example of an image with a file inside it:
http://marvinproject.sourceforge.net/en/plugins/steganography.html
Related topic:
Image Steganography
Possible duplicate: How does one compare one image to another to see if they are similar by a certain percentage, on the iPhone?
I've found this code and am trying to understand it better:
UIImage *img1 = ...; // some photo
UIImage *img2 = ...; // some photo
NSData *imgdata1 = UIImagePNGRepresentation(img1);
NSData *imgdata2 = UIImagePNGRepresentation(img2);
if ([imgdata1 isEqualToData:imgdata2]) {
    NSLog(@"Same Image");
}
Will this confirm that image 1 is exactly the same as image 2? Is this method best practice, or is there a better approach to this?
Your code compares the two PNG encodings byte by byte, so yes, it is an exact comparison.
If you need something faster you can generate a hash from each image's data and compare the two hashes, as explained here.
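For illustration, here is the hash approach in Python using the standard library (a stand-in for hashing the NSData on iOS; the function name is an assumption). A hash is cheap to store and compare, which pays off when one image is checked against many others:

```python
import hashlib

def same_data(data1, data2):
    """Compare two encoded images (raw bytes) by their SHA-256
    digests instead of byte by byte. The digests are small (32
    bytes), so they can be cached and compared repeatedly."""
    return hashlib.sha256(data1).digest() == hashlib.sha256(data2).digest()
```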
Take a look at this link, it talks all about sampling to images to see the percentage similarity: How does one compare one image to another to see if they are similar by a certain percentage, on the iPhone?
Possible duplicates: How to get pixel data from a UIImage (Cocoa Touch) or CGImage (Core Graphics)? / Get Pixel color of UIImage
I have a scenario in which the user can select a color from an image, for example the one below:
Depending on where the user taps on the image, I need to extract the RGB and alpha values at that exact point, i.e. that pixel.
How do I accomplish this?
You need to create a bitmap context (CGContextRef) from the image and convert the CGPoint that was tapped to an array offset location to retrieve the color information from the pixel data.
See What Color is My Pixel? for a tutorial and this similar Stack Overflow question.
Methods:
- (CGContextRef)createARGBBitmapContextFromImage:(CGImageRef)inImage
Returns a CGContextRef representing the image passed as an argument, using the correct color space. This method is called by:
- (UIColor *)getPixelColorAtLocation:(CGPoint)point
This is the method you call to get the UIColor at the passed CGPoint.
Note that these methods are in a subclass of UIImageView to make the process more straightforward.
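The core of getPixelColorAtLocation: is the point-to-offset arithmetic over the context's pixel buffer. A small Python sketch of that arithmetic on a flat ARGB byte buffer (the helper is hypothetical; the Objective-C code does the same math on the bytes behind the CGContextRef):

```python
def pixel_rgba(data, width, x, y):
    """Read pixel (x, y) from a flat byte buffer laid out row by
    row, 4 bytes per pixel in ARGB order, as an ARGB bitmap context
    produces. Returns (r, g, b, a)."""
    offset = 4 * (y * width + x)       # rows of `width` pixels, 4 bytes each
    a, r, g, b = data[offset:offset + 4]
    return r, g, b, a
```

The only subtlety in the real code is converting the tapped view coordinates into image coordinates first, since the image may be scaled inside the UIImageView.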
Possible duplicate: how to crop image in to pieces programmatically
How can I slice an image into multiple pieces? My image is 300x300 and I want to make 9 pieces of it.
The CWUIKit project, available at https://github.com/jayway/CWUIKit, has a category on UIImage that adds a method like this:
UIImage *subimage = [originalImage subimageWithRect:CGRectMake(0, 0, 100, 100)];
It should be useful for you.
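If you want to compute the nine rectangles to pass to subimageWithRect:, the arithmetic is just an even grid split. A Python sketch of that computation (the function name is an assumption for illustration):

```python
def tile_rects(width, height, rows, cols):
    """Return (x, y, w, h) rectangles that split a width x height
    image into rows x cols equal tiles, in row-major order."""
    tw, th = width // cols, height // rows   # size of each tile
    return [(c * tw, r * th, tw, th)
            for r in range(rows) for c in range(cols)]
```

For a 300x300 image and a 3x3 grid this yields nine 100x100 rectangles; loop over them and call subimageWithRect: once per rectangle.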