Can you give me a starting point for creating an iPhone app that recognises colour? I have seen this app http://www.windowsphone.com/en-US/apps/2f83a363-2fce-4107-8394-27feb3645fff and it would be excellent to find out the technique used for the colour recognition part. Thanks.
Probably the easiest way would be to display a UIImagePickerController with its source type set to UIImagePickerControllerSourceTypeCamera. Once the user snaps a photo, you can then get a CGImage of the photo taken and look for the colour info that way. This question might help you with figuring out the colour info you want.
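To illustrate, here is a minimal Objective-C sketch of one common way to read a pixel's colour from a UIImage by drawing it into a 1x1 bitmap context. The function name is made up, it assumes you already have the image from the picker's delegate callback, and it ignores image orientation and scale for brevity.

#import <UIKit/UIKit.h>

// Read the RGB values of a single pixel of a UIImage by drawing that pixel
// into a 1x1 RGBA bitmap context.
static void logPixelColor(UIImage *image, CGPoint point) {
    CGImageRef cgImage = image.CGImage;
    unsigned char pixel[4] = {0};

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
        kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    // Shift the drawing so the requested point lands on the 1x1 context.
    CGContextTranslateCTM(context, -point.x, -point.y);
    CGContextDrawImage(context,
        CGRectMake(0, 0, CGImageGetWidth(cgImage), CGImageGetHeight(cgImage)),
        cgImage);
    CGContextRelease(context);

    NSLog(@"R:%u G:%u B:%u",
          (unsigned)pixel[0], (unsigned)pixel[1], (unsigned)pixel[2]);
}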
I have hired a programmer to create an iPhone app for me. The purpose of the app is to take a photo and upload it to a server. We want to make a special-purpose screen to review the photo before uploading it; crucially, this screen will have zooming functionality.
He claims that after taking a photo, it is impossible to prevent the "Use"/"Retake" screen from showing up, so now we have two screens to review the photo: first the standard one from Apple, then our own with zoom. Is he right about that? It just sounds so unreasonable that Apple would impose such a restriction.
Edit: I mean taking a photo using the camera.
As per Apple's documentation:
To perform fully-customized image or movie capture, instead use the AV Foundation framework as described in "Media Capture and Access to Camera" in AV Foundation Programming Guide. To create a fully-customized image picker for browsing the photo library, use classes from the Assets Library framework. For example, you could create a custom image picker that displays larger thumbnail images, that makes use of EXIF metadata including timestamp and location information, or that integrates with other frameworks such as Map Kit. For more information, see Assets Library Framework Reference. Media browsing using the Assets Library framework is available starting in iOS 4.0.
In short, yes, it is possible; check out this sample.
[Update]
Use the allowsEditing property on your UIImagePickerController
imagePickerController.allowsEditing = NO;
The previous answer was a bit of a hack that took advantage of a code path which didn't show the buttons, but it wasn't ideal.
[Previous answer]
You can actually avoid it without going through the hassle of setting up your own image capture from AV Foundation.
Including the following will remove the need to show the "review" screen. All you have to do is put in a few of your own buttons and wire them up to the appropriate functionality.
[self.imagePickerController setShowsCameraControls:NO];
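For completeness, here is a rough sketch of what that can look like: hide the default controls, hand the picker a cameraOverlayView containing your own shutter button, and let the button trigger takePicture. The button geometry, title and the use of self as delegate are only illustrative.

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.showsCameraControls = NO;   // hides the default shutter and Use/Retake UI
picker.delegate = self;

// A bare overlay with one custom shutter button; the frame values are placeholders.
UIView *overlay = [[UIView alloc] initWithFrame:picker.view.bounds];
UIButton *shutter = [UIButton buttonWithType:UIButtonTypeRoundedRect];
shutter.frame = CGRectMake(110.0, 420.0, 100.0, 44.0);
[shutter setTitle:@"Snap" forState:UIControlStateNormal];

// -takePicture captures the photo; the result still arrives through the normal
// imagePickerController:didFinishPickingMediaWithInfo: delegate method.
[shutter addTarget:picker action:@selector(takePicture)
  forControlEvents:UIControlEventTouchUpInside];
[overlay addSubview:shutter];

picker.cameraOverlayView = overlay;
[self presentViewController:picker animated:YES completion:nil];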
It is a little bit too late I know, but for future reference:
This is far simpler than the answers already provided: what you are looking for is the allowsEditing option.
imagePickerController.allowsEditing = NO;
That should be enough to avoid showing the "Retake"/"Use" screen after the user takes a picture.
I have an image in a UIImageView. The images are generally of clothes or accessories captured with the camera against plain backgrounds. Now, I have to give users the ability to remove the background from the image being shown, something like what is shown in the picture here. As the slider moves, more and more of the background gets removed, similar to the 'instant alpha' brush in the Preview application on Mac OS X. I want to do this in a native iPhone app.
I know I'll require some image processing algorithm to do this. Does anyone have anything helpful that I can refer to or use in order to get this done? Thank you so much in advance.
You can render your image into a bitmap context, then change all the pixels you need to the colour you want, get the image back from the context, and display it again.
This link should help you get the colour of a pixel in a context.
Note that this method is quite slow, so I think you should remember the positions of the pixels you need to change, to make your app a bit faster.
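As a rough illustration of that render-and-modify approach, the sketch below draws the image into a bitmap context, clears every pixel within a tolerance of a target colour, and reads the result back. The function name, the simple per-channel tolerance test and the missing error handling are all assumptions for brevity; a real instant-alpha brush would be considerably smarter.

// Render the image into a bitmap context, clear every pixel within `tolerance`
// of the target colour (tr, tg, tb), and return the modified image.
UIImage *removeBackgroundColor(UIImage *source, int tr, int tg, int tb, int tolerance) {
    CGImageRef cgImage = source.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    unsigned char *data = calloc(width * height * 4, 1);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(data, width, height, 8, width * 4,
        colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);

    // Walk every pixel and make the matching ones fully transparent.
    for (size_t i = 0; i < width * height; i++) {
        unsigned char *p = data + i * 4;
        if (abs(p[0] - tr) < tolerance &&
            abs(p[1] - tg) < tolerance &&
            abs(p[2] - tb) < tolerance) {
            p[0] = p[1] = p[2] = p[3] = 0;
        }
    }

    CGImageRef resultRef = CGBitmapContextCreateImage(context);
    UIImage *result = [UIImage imageWithCGImage:resultRef];
    CGImageRelease(resultRef);
    CGContextRelease(context);
    free(data);
    return result;
}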
I would like to store several photos, like the Photos application does, but I don't know exactly which objects I need. If anyone knows how to implement this, I'm interested.
A link to give you an idea: http://blog.photobox.fr/wp-content/uploads/2010/07/Album-iPhone5.jpg
Thank you in advance.
You can try the PhotoViewer provided in the three20 framework. It is an exact replica of the iPhone Photos app, if that's what you're looking for. You can find a tutorial here.
Go to this link; it will help you:
https://github.com/kirbyt/KTPhotoBrowser
Having spoken to an Apple engineer on some of the optimizations that they went through on the Photos app, I can give you a couple of tips:
They never display back the original photo. Because of a photo's size, they only take the original photo and save off a number of optimized thumbnail images.
The example image you show does not contain a series of thumbnail images. Each row is actually a single image. For selection, an overlay is placed with the exact size and position of the thumbnail to give the impression that you are selecting a particular image. This could be accomplished with a table view, but it is more likely just a scroll view.
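If you want to follow the first tip, a quick sketch of generating a small thumbnail once and reusing it, rather than redrawing the full-resolution photo in the grid, could look like this. The function name and the 75x75 target size are just example values.

// Scale the original photo down once and keep the thumbnail around, instead of
// redrawing the full-resolution image every time the grid is shown.
UIImage *makeThumbnail(UIImage *original) {
    CGSize thumbSize = CGSizeMake(75.0, 75.0);
    UIGraphicsBeginImageContextWithOptions(thumbSize, YES, 0.0);
    [original drawInRect:CGRectMake(0.0, 0.0, thumbSize.width, thumbSize.height)];
    UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return thumbnail;
}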
Does anyone know how to do this?
Is anyone able to provide an example? I believe this is out of NDA now, as it was available in version 4.0?
Take a look at the AVFoundation framework, especially AVCaptureSession, AVCaptureDeviceInput, and so on. You can find some listings in the iPhone dev center forums: search for "avcapture".
AVFoundation is the framework you want to use to record, modify raw frames, show them and, of course, add an overlay.
If you only want to do an overlay, then UIImagePickerController should be enough.
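As a starting point, a minimal AVFoundation capture setup might look like the sketch below: a session with the default camera as input, plus a preview layer that your own overlay views can sit on top of. It assumes it runs inside a view controller method and omits error handling.

#import <AVFoundation/AVFoundation.h>

// A capture session using the default camera as input.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) {
    [session addInput:input];
}

// A live preview layer; any UIView added to self.view after this line
// will sit on top of the preview and act as your overlay.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:previewLayer];

[session startRunning];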
If I understand what you are trying to do correctly, you have set up video capture something like Apple suggests in this Q&A: http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html
From there it shouldn't be much of a problem to use the method described in this answer: blend two UIImages based on the alpha/transparency of the top image, to blend the preview with your overlay, provided you draw the overlay into a UIImage first. Feed the resulting images to a buffer and save them accordingly.
This method almost certainly won't give you a lot of frames per second, though.
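For reference, the blend step itself can be as simple as the sketch below: draw the captured frame, draw the overlay on top of it respecting the overlay's own alpha, and read the combined image back. The helper function name is just illustrative.

// Draw the captured frame, then the overlay image on top of it (respecting the
// overlay's alpha channel), and return the combined image.
UIImage *blendImages(UIImage *bottom, UIImage *top) {
    CGRect frame = CGRectMake(0.0, 0.0, bottom.size.width, bottom.size.height);
    UIGraphicsBeginImageContextWithOptions(bottom.size, NO, 0.0);
    [bottom drawInRect:frame];
    [top drawInRect:frame blendMode:kCGBlendModeNormal alpha:1.0];
    UIImage *blended = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return blended;
}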
As you probably know, the UIImagePickerController in the UIKit framework allows the user to take a photo, which is then handed over to my app for further processing.
If the property allowsEditing is set to YES, the user is allowed to move and scale the photo after taking it.
What I'd like to accomplish is to modify the photo in my code BEFORE the user is presented with the move and scale screen. This is because I'd like to add some visual effects to the photo and the user should be able to move and scale with these effects already applied to the photo.
I know that there's the cameraOverlayView property, but this is not useful in my case as far as I'm concerned.
Any ideas?
Thanks, Thomas
A not-so-easy way is to implement the move and scale functionality on your own.
Make showsCameraControls NO. You can then design your own preview screen (with the modified image).
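A minimal sketch of such a home-grown preview screen, assuming a UIScrollView that zooms and pans a UIImageView holding the already-modified photo (the class and property names are only illustrative):

@interface PreviewViewController : UIViewController <UIScrollViewDelegate>
@property (nonatomic, strong) UIImage *photo;        // the already-modified photo
@property (nonatomic, strong) UIImageView *imageView;
@end

@implementation PreviewViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    // A scroll view provides the pinch-to-zoom and panning for free.
    UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds];
    scrollView.delegate = self;
    scrollView.minimumZoomScale = 1.0;
    scrollView.maximumZoomScale = 4.0;

    self.imageView = [[UIImageView alloc] initWithImage:self.photo];
    self.imageView.frame = scrollView.bounds;
    self.imageView.contentMode = UIViewContentModeScaleAspectFit;

    [scrollView addSubview:self.imageView];
    [self.view addSubview:scrollView];
}

// Tell the scroll view which subview it should zoom.
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return self.imageView;
}

@end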
It looks like there are issues with touches in the iOS 5 cameraOverlayView. Make sure the above solution works for iOS 5 as well :).