Build an iPhone app that can recognise colour from the streaming camera

I am building an iPhone app to recognise a specific colour through the iPhone camera when the phone is placed onto a colour board.
Note that I want it to work on the streaming camera output, not just a still image or photo.
My initial thought was to sample a series of pixels (say four, one at each corner of the camera feed) and, if the colours registered at each pixel match, display the colour (as text) to the user.
Can someone please point me in the right direction as far as example code or an API, or suggest a better design for solving the problem?

Related

Pixel art across iOS resolutions

In my iOS game, I have custom buttons in a pixel art style. There are a few buttons in a row at the bottom of the screen, and I want them to fill the screen width, regardless of iPhone resolution.
I have been trying to think of a way to achieve this. Is the best way (both for this and for targeting devices with all other assets) to check the resolution, e.g. with self.size.width, and then set the image and position for that particular device, keeping multiple images for the buttons so the pixel art stays correct and undistorted?
Is there an easier or built-in way to do this?
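There doesn't appear to be a fully built-in answer for pixel art, but here is a minimal sketch of the approach described above, assuming SpriteKit (since self.size.width suggests an SKScene). The helper and its parameter names are made up for this example; the key ideas are nearest-neighbour filtering and whole-number scale factors so the pixel grid is never distorted.

```swift
import SpriteKit

// Hypothetical helper (not built in): lay out a row of pixel-art buttons across
// the bottom of an SKScene, scaling each texture by an integer factor so the
// pixels stay crisp. Assumes the scene's default anchorPoint of (0, 0).
func layoutPixelArtButtons(in scene: SKScene,
                           textures: [SKTexture],
                           bottomMargin: CGFloat = 20) {
    guard !textures.isEmpty else { return }

    let slotWidth = scene.size.width / CGFloat(textures.count)
    for (index, texture) in textures.enumerated() {
        texture.filteringMode = .nearest          // keep hard pixel edges when scaled
        let scale = max(1, floor(slotWidth / texture.size().width))
        let button = SKSpriteNode(texture: texture)
        button.setScale(scale)                    // whole-number scale: no distortion
        button.position = CGPoint(x: slotWidth * (CGFloat(index) + 0.5),
                                  y: bottomMargin + button.size.height / 2)
        scene.addChild(button)
    }
}
```

With an integer scale the art stays sharp on every resolution; the leftover horizontal space becomes even padding between the button slots instead of a stretched texture.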

Why aren't my Unity game's UI elements working in the Google Play Store build?

I added some UI elements to my game and they work properly in Unity and on my device when connected to Unity Remote.
But when I uploaded the game to Google Play and downloaded it from there, some text doesn't appear, buttons aren't pressable, and the background is red for some reason (it's supposed to be blue). Anyone know what's going on?
It may be an aspect-ratio problem. To fix it, anchor your UI elements using the anchor presets or the anchor min and max values in the Inspector. To make the UI scale with the screen, set the Canvas Scaler's UI Scale Mode to Scale With Screen Size (https://docs.unity3d.com/Packages/com.unity.ugui@1.0/manual/HOWTO-UIMultiResolution.html). Let me know if that helps.

Polling the iPhone Camera to Process Images

The scenario is that I want my app to process (in the background if possible) images being seen by the iPhone camera.
E.g. the app is running, the user places the phone down on a piece of red cardboard, and then I want to display an alert view saying "Phone placed on red surface" (this is a simplified version of what I want to do, just to keep the question direct).
Hope this makes sense. I know there are two separate concerns here:
1) How to process images from the camera in the background of the app (if we can't do this, we can initiate the process with, say, a button click if needed).
2) How to process the image to say what solid colour the phone is sitting on.
Any help/guidance would be greatly appreciated.
Thanks
Generic answers to your two questions:
Background processing of the image can be triggered as a timer event. For example, every 30 seconds, capture the current camera image and do the processing behind the scenes. If the processing is not compute- or time-intensive, this should work.
It is technically possible to read the colour of, say, a single pixel programmatically. If you are sure the entire image is just one colour, you can try that approach: sample a few random points and read the pixel colour at each (see the sketch after this answer). But if the image (in your example, the red board) contains a picture or multiple colours, that will require more detailed image-processing techniques.
Hope this helps
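To illustrate the single-pixel idea, here is a minimal sketch (not from the answer above) that redraws a UIImage into an RGBA bitmap whose byte layout we control and then reads one pixel from it. The function name pixelColor(in:at:) is made up for this example.

```swift
import UIKit

// Hypothetical helper: read one pixel's colour from a UIImage by redrawing the
// image into a known RGBA bitmap and indexing into that buffer.
func pixelColor(in image: UIImage, at point: CGPoint) -> UIColor? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width, height = cgImage.height
    let x = Int(point.x), y = Int(point.y)
    guard x >= 0, x < width, y >= 0, y < height else { return nil }

    let bytesPerPixel = 4
    let bytesPerRow = bytesPerPixel * width
    var buffer = [UInt8](repeating: 0, count: bytesPerRow * height)

    let drawn = buffer.withUnsafeMutableBytes { raw -> Bool in
        guard let context = CGContext(data: raw.baseAddress,
                                      width: width, height: height,
                                      bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return false }
        // Drawing the image at full size copies it into our RGBA buffer,
        // with row 0 of the buffer holding the top row of the image.
        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
        return true
    }
    guard drawn else { return nil }

    let offset = y * bytesPerRow + x * bytesPerPixel
    return UIColor(red: CGFloat(buffer[offset]) / 255,
                   green: CGFloat(buffer[offset + 1]) / 255,
                   blue: CGFloat(buffer[offset + 2]) / 255,
                   alpha: CGFloat(buffer[offset + 3]) / 255)
}
```

Redrawing into a known format avoids guessing the source image's byte order; for a handful of sample points the extra copy is cheap.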
1) Image Capture
There are two kinds of apps that continually take imagery from the camera: media-capture apps (e.g. Camera, iMovie) and augmented reality apps.
Here's the iPhone SDK tutorial for media capture:
https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW3
Access the camera with iPhone SDK
Augmented Reality apps take continual pictures from the camera for processing/overlay. I suggest you look into some of the available AR kits and see how they get a continual stream from the camera and also analyze the pixels.
Starting an augmented reality (AR) app like the Panasonic VIERA AR Setup Simulator
http://blog.bordertownlabs.com/post/157320598/customizing-the-iphone-camera-view-with
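For the media-capture route, a minimal sketch (assumed, not taken from the linked tutorial) of pulling a continual stream of frames with AVFoundation's AVCaptureVideoDataOutput; the class name FrameGrabber and the queue label are placeholders.

```swift
import AVFoundation

// Continual frame capture: every frame the camera delivers arrives as a
// CVPixelBuffer in the delegate callback, ready for colour analysis.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        // Ask for BGRA pixel buffers so the bytes are easy to inspect later.
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                    kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Called for every frame the camera delivers.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Hand the CVPixelBuffer to whatever colour-analysis routine you write.
        _ = pixelBuffer
    }
}
```

Because the frames arrive continuously, there is no need to save still photos or poll on a timer unless you want to throttle the analysis.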
2) Image Processing
Image processing is a really big topic that's been addressed in multiple other places:
https://photo.stackexchange.com/questions/tagged/image-processing
https://dsp.stackexchange.com/questions/tagged/image-processing
https://mathematica.stackexchange.com/questions/tagged/image-processing
...but for starters, you'll need some heuristic analysis to determine what you're looking for. Sampling the captured pixels in a handful of places (e.g. the corners plus the middle) may help, as would generating a histogram of colour intensities: if there's lots of red but little or no blue and green, it's a red card.
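A rough sketch of that heuristic, reusing the pixelColor(in:at:) helper sketched earlier in this thread; the thresholds are invented starting points to tune, not measured values.

```swift
import UIKit

// Sample the corners plus the centre, average the channels, and call the card
// "Red" when red clearly dominates (similarly for green and blue).
func dominantColourName(of image: UIImage) -> String? {
    guard let cgImage = image.cgImage else { return nil }
    let w = CGFloat(cgImage.width - 1), h = CGFloat(cgImage.height - 1)
    let samplePoints = [CGPoint(x: 0, y: 0), CGPoint(x: w, y: 0),
                        CGPoint(x: 0, y: h), CGPoint(x: w, y: h),
                        CGPoint(x: w / 2, y: h / 2)]

    var red: CGFloat = 0, green: CGFloat = 0, blue: CGFloat = 0
    for point in samplePoints {
        guard let colour = pixelColor(in: image, at: point) else { continue }
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard colour.getRed(&r, green: &g, blue: &b, alpha: &a) else { continue }
        red += r; green += g; blue += b
    }
    let n = CGFloat(samplePoints.count)
    red /= n; green /= n; blue /= n

    // Made-up thresholds: "lots of red, little blue and green" and so on.
    if red > 0.5 && green < 0.3 && blue < 0.3 { return "Red" }
    if green > 0.5 && red < 0.3 && blue < 0.3 { return "Green" }
    if blue > 0.5 && red < 0.3 && green < 0.3 { return "Blue" }
    return nil
}
```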

How to make iPhone Camera less sensitive to movement

I have made a "two screen" app in which the camera view is divided into two sides, left and right, each of which can be captured independently and merged later.
The problem I am facing is that whenever the user touches the capture button the camera moves a bit and the captured image shakes, so the user is unable to match the two halves.
Is there any way to make the camera less sensitive to minor movements?
I am using the image picker (UIImagePickerController).
Thanks
Roll your own image capture. Capture a larger image than necessary and stabilize the displayed image using the gyro.
There is a great example of stabilizing the compass in much the same way here:
http://www.sundh.com/blog/2011/09/stabalize-compass-of-iphone-with-gyroscope/
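A rough sketch of that gyro idea, assuming Core Motion and a deliberately oversized preview view; the class name, the scaling factor and the simple opposite-direction translation are placeholders to tune by hand, not a production stabilizer.

```swift
import CoreMotion
import UIKit

// Watch the device's rotation rate and nudge the oversized preview view in the
// opposite direction to damp small shakes around the moment of capture.
final class PreviewStabilizer {
    private let motion = CMMotionManager()

    func start(stabilizing previewView: UIView,
               pointsPerRadianPerSecond: CGFloat = 40) {   // invented tuning factor
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let rate = data?.rotationRate else { return }
            // Counter small rotations by shifting the preview slightly.
            previewView.transform = CGAffineTransform(
                translationX: CGFloat(-rate.y) * pointsPerRadianPerSecond,
                y: CGFloat(-rate.x) * pointsPerRadianPerSecond)
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```

Because the captured image is larger than what is shown, the small shift stays inside the captured area and the two halves are easier to line up afterwards.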

iPhone UIImagePickerController: show camera preview on top of OpenGL viewport

I've a question about the UIImagePickerController class reference, in photo-camera mode.
I'm developing an OpenGL game. I need to add a photo-camera feature to the game: the game should open a window (say 200x200 pixels, in the middle of the screen) that displays a real-time camera preview in front of the GL viewport.
I have some questions:
- the main problem is that I'm having difficulty opening the UIImagePickerController window in front of the GL viewport (normally the UIImagePickerController window covers the whole iPhone screen);
- what is the best way to capture an image buffer periodically in order to perform some operations on it, like face detection (we have a library to do this on a bitmap image)?
- could Apple reject such an approach? Is it even possible with the camera (a camera preview window that partially overlaps an OpenGL viewport)?
- finally, is it possible to avoid the camera shutter? I'd like to initialize the camera without the opening sound and the shutter animation.
This is a screenshot:
http://www.powerwolf.it/temp/UIImagePickerController.jpg
If you want to do something more custom than what Apple intended for UIImagePickerController, you'll need to use the AV Foundation framework instead. The camera input can be routed to a layer or view. Here is an example that will get you halfway there (it is intended for frame capture); you could modify it for face detection by taking sample images on a timer. As long as you use these public APIs, it'll be accepted in the App Store. There are a bunch of augmented reality applications that use similar techniques.
http://developer.apple.com/library/ios/#qa/qa2010/qa1702.html
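For the 200x200 preview specifically, a minimal sketch (not from the linked Q&A) of attaching an AVCaptureVideoPreviewLayer to a small view and inserting it above the GL view; the function and view names are placeholders. Since nothing takes a still photo here, there is no shutter sound or shutter animation.

```swift
import AVFoundation
import UIKit

// Put a live camera preview inside a small 200x200 view layered above the GL
// view. The caller must keep the returned session alive while the preview runs.
func addCameraPreview(above glView: UIView, in parent: UIView) throws -> AVCaptureSession {
    let session = AVCaptureSession()
    guard let camera = AVCaptureDevice.default(for: .video) else { return session }
    session.addInput(try AVCaptureDeviceInput(device: camera))

    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.videoGravity = .resizeAspectFill

    // 200x200 window centred over the GL viewport.
    let previewView = UIView(frame: CGRect(x: 0, y: 0, width: 200, height: 200))
    previewView.center = parent.center
    previewLayer.frame = previewView.bounds
    previewView.layer.addSublayer(previewLayer)
    parent.insertSubview(previewView, aboveSubview: glView)

    session.startRunning()
    return session
}
```

To feed the face-detection library, you would add an AVCaptureVideoDataOutput to the same session (as in the linked qa1702 example) and sample its frames on whatever interval you need.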