How to make iPhone Camera less sensitive to movement - iphone

I have made a "two screen app" in which the camera view is divided into two halves, left and right, each of which can be captured independently and merged later.
The problem I am facing is that whenever the user touches the capture button, the camera moves a bit and the captured image shakes, so the user is unable to match the two halves.
Is there any way to make camera less sensitive to minor movements?
I am using the image picker (UIImagePickerController).
Thanks

Roll your own image capture. Capture a larger image than necessary and stabilize the shown image using the gyro.
There is a great example of stabilizing the compass in much the same way here:
http://www.sundh.com/blog/2011/09/stabalize-compass-of-iphone-with-gyroscope/
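If you go the roll-your-own route, a minimal sketch of the gyro idea might look like the following (Swift, using CMMotionManager; the oversized preview view and the points-per-radian factor are assumptions you would tune for your own layout):

    import UIKit
    import CoreMotion

    final class StabilizedPreviewController: UIViewController {
        private let motionManager = CMMotionManager()
        private var referenceAttitude: CMAttitude?
        // Assumed to hold the live camera preview, laid out slightly larger
        // than the visible area so there is room to shift it around.
        @IBOutlet private weak var previewView: UIView!

        // Hand-tuned factor converting radians of wobble into points of offset.
        private let pointsPerRadian: CGFloat = 300

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            guard motionManager.isDeviceMotionAvailable else { return }
            motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
            motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
                guard let self = self, let motion = motion else { return }
                if self.referenceAttitude == nil {
                    self.referenceAttitude = motion.attitude.copy() as? CMAttitude
                }
                let attitude = motion.attitude
                if let reference = self.referenceAttitude {
                    attitude.multiply(byInverseOf: reference) // wobble relative to the starting pose
                }
                let dx = CGFloat(attitude.roll)  * self.pointsPerRadian
                let dy = CGFloat(attitude.pitch) * self.pointsPerRadian
                // Counter-shift the oversized preview so small shakes cancel out visually.
                self.previewView.transform = CGAffineTransform(translationX: -dx, y: -dy)
            }
        }

        override func viewWillDisappear(_ animated: Bool) {
            super.viewWillDisappear(animated)
            motionManager.stopDeviceMotionUpdates()
        }
    }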

Related

How to disable image detection if there is already a detected image

I have what is, in my opinion, a simple problem: disabling image detection with the AR camera. My app detects an image from the image library and spawns an object etc., everything according to plan.
But the problem is that if I move the camera over another detectable image, it recognizes that one too. This is bad not because it spawns something additionally, but because you can "collect" the images in my app, so the other detected image gets unlocked even though it shouldn't.
So how can I disable image detection without turning off the AR camera?
So far I have tried to simply disable the "ARManager" and the "ARTrackedImageManager" scripts (.enabled = false), but that didn't solve my problem, because the app still detects other images.
I hope I have explained my question and problem properly. Any help is appreciated!
It really depends on what library you're using to detect the image. Generally, most marker tracking libraries will create a marker object in your Unity scene. You can disable these marker objects after you find one, and only leave the marker you're interested in. Make sure you also set the number of tracked images to 1 so you won't accidentally find two markers in one frame.
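The question itself is about Unity's AR Foundation (C#), but the same two ideas, capping tracked images at one and ignoring further detections once you have locked onto a marker, can be sketched in native ARKit/Swift roughly like this (the "collected"/"locked" bookkeeping is hypothetical):

    import ARKit

    final class SingleImageTracker: NSObject, ARSessionDelegate {
        private let session = ARSession()
        private var collectedImageNames = Set<String>()
        private var currentlyLockedImage: String?

        func start(referenceImages: Set<ARReferenceImage>) {
            let config = ARImageTrackingConfiguration()
            config.trackingImages = referenceImages
            config.maximumNumberOfTrackedImages = 1   // never report two markers in one frame
            session.delegate = self
            session.run(config)
        }

        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            for case let imageAnchor as ARImageAnchor in anchors {
                let name = imageAnchor.referenceImage.name ?? "unnamed"
                // Once one image has been "collected", ignore any new detections
                // instead of unlocking them as well.
                guard currentlyLockedImage == nil else {
                    session.remove(anchor: imageAnchor)
                    continue
                }
                currentlyLockedImage = name
                collectedImageNames.insert(name)
                // ...spawn your object for this image here...
            }
        }
    }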

Build iPhone app that can recognise colour from streaming camera

I am building an iPhone app to recognise a specific colour through the iPhone camera when it is placed onto a colour board.
Note that I want it to work on the streaming camera output, not just a still image or photo.
My initial thought was to scan a series of pixels (say 4 in each corner of the camera feed) and, if the colours registered at each pixel match, display the colour (as text) to the user.
Can someone please point me in the right direction as far as example code or APIs go, or suggest a better design for the problem.
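As a starting point, here is a rough Swift sketch of the pixel-sampling idea described above, assuming frames arrive as 32BGRA CVPixelBuffers from an AVCaptureVideoDataOutput; the colour names and thresholds are made up for illustration:

    import AVFoundation
    import CoreVideo

    // Sample a few fixed pixels from a frame and report a colour only when they all agree.
    func dominantColourName(in pixelBuffer: CVPixelBuffer) -> String? {
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
        let width  = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let bytes = base.assumingMemoryBound(to: UInt8.self)

        // Four near-corner points plus the centre.
        let inset = 10
        let points = [(inset, inset), (width - inset, inset),
                      (inset, height - inset), (width - inset, height - inset),
                      (width / 2, height / 2)]

        func classify(x: Int, y: Int) -> String {
            let p = y * bytesPerRow + x * 4          // BGRA layout
            let b = Int(bytes[p]), g = Int(bytes[p + 1]), r = Int(bytes[p + 2])
            if r > 150 && g < 100 && b < 100 { return "red" }
            if g > 150 && r < 100 && b < 100 { return "green" }
            if b > 150 && r < 100 && g < 100 { return "blue" }
            return "unknown"
        }

        let names = points.map { classify(x: $0.0, y: $0.1) }
        // Only report a colour when all sampled pixels agree.
        return Set(names).count == 1 ? names[0] : nil
    }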

Polling iPhone Camera to Process Image

The scenario is that I want my app to process (in the background if possible) images being seen by the iPhone camera.
e.g. The app is running, the user places the phone down on a piece of red cardboard, and then I want to display an alert view saying "Phone placed on red surface" (this is a simplified version of what I want to do, but it keeps the question direct).
Hope this makes sense. I know there are two separate concerns here.
How to process images from the camera in the background of the app (if we can't do this, we can initiate the process with, say, a button tap if needed).
Processing the image to say what solid colour it is sitting on.
Any help/guidance would be greatly appreciated.
Thanks
Generic answers to your two questions:
Background processing of the image can be triggered by a timer event. For example, every 30 seconds, capture the current camera image and do the processing behind the scenes. If the processing is not compute or time intensive, this should work.
It is technically possible to get the colour of, say, one pixel programmatically. If you are sure that the entire image is just one colour, you can try that approach: pick a few random points and get the colour of the pixel at each one. But if the image (in your example, the red board) contains a picture or multiple colours, then this will require more detailed image-processing techniques.
Hope this helps
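As a concrete illustration of the second point, here is a commonly used Swift sketch for reading the colour of a single pixel out of a UIImage; how you obtain the image (timer-driven capture, button tap, etc.) is up to you:

    import UIKit

    extension UIImage {
        // Returns the colour of the pixel at `point` (UIKit top-left coordinates).
        func pixelColor(at point: CGPoint) -> UIColor? {
            var pixel = [UInt8](repeating: 0, count: 4)
            let colorSpace = CGColorSpaceCreateDeviceRGB()
            let result: UIColor? = pixel.withUnsafeMutableBytes { buffer in
                guard let context = CGContext(data: buffer.baseAddress,
                                              width: 1, height: 1,
                                              bitsPerComponent: 8, bytesPerRow: 4,
                                              space: colorSpace,
                                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
                else { return nil }
                // Shift so the pixel of interest lands in the 1x1 context; drawing via
                // UIGraphicsPushContext keeps UIKit's top-left origin.
                context.translateBy(x: -point.x, y: -point.y)
                UIGraphicsPushContext(context)
                draw(at: .zero)
                UIGraphicsPopContext()
                return UIColor(red: CGFloat(buffer[0]) / 255,
                               green: CGFloat(buffer[1]) / 255,
                               blue: CGFloat(buffer[2]) / 255,
                               alpha: CGFloat(buffer[3]) / 255)
            }
            return result
        }
    }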
1) Image Capture
There are two kinds of apps that continually take imagery from the camera: media capture (e.g. Camera, iMovie) and augmented reality apps.
Here's the iPhone SDK tutorial for media capture:
https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW3
Access the camera with iPhone SDK
Augmented Reality apps take continual pictures from the camera for processing/overlay. I suggest you look into some of the available AR kits and see how they get a continual stream from the camera and also analyze the pixels.
Starting an augmented reality (AR) app like Panasonic VIERA AR Setup Simulator
http://blog.bordertownlabs.com/post/157320598/customizing-the-iphone-camera-view-with
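A minimal Swift sketch of the media-capture route from the AVFoundation guide linked above: a capture session that delivers every camera frame to a delegate callback, which is where you would run your pixel analysis (the queue label and class name are arbitrary):

    import AVFoundation

    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        private let session = AVCaptureSession()
        private let queue = DispatchQueue(label: "camera.frames")

        func start() throws {
            guard let device = AVCaptureDevice.default(for: .video) else { return }
            let input = try AVCaptureDeviceInput(device: device)
            let output = AVCaptureVideoDataOutput()
            output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                    kCVPixelFormatType_32BGRA]
            output.setSampleBufferDelegate(self, queue: queue)

            session.beginConfiguration()
            if session.canAddInput(input) { session.addInput(input) }
            if session.canAddOutput(output) { session.addOutput(output) }
            session.commitConfiguration()
            session.startRunning()
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            // Analyze pixelBuffer here (e.g. sample or average its colours).
        }
    }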
2) Image Processing
Image processing is a really big topic that's been addressed in multiple other places:
https://photo.stackexchange.com/questions/tagged/image-processing
https://dsp.stackexchange.com/questions/tagged/image-processing
https://mathematica.stackexchange.com/questions/tagged/image-processing
...but for starters, you'll need to use some heuristic analysis to determine what you're looking for. Sampling the captured pixels in a bunch of places (e.g. corners + middle) may help, as would generating a histogram of colour intensities: if there's lots of red but little or no blue and green, it's a red card.
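As a sketch of that heuristic, assuming 32BGRA frames like the ones delivered by the capture code above, you could average the channels over a sparse grid and call the frame "red" when red clearly dominates (the thresholds are arbitrary and would need tuning):

    import CoreVideo

    func looksRed(_ pixelBuffer: CVPixelBuffer) -> Bool {
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return false }

        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let bytes = base.assumingMemoryBound(to: UInt8.self)

        var rSum = 0, gSum = 0, bSum = 0, count = 0
        // Sample every 16th pixel in each direction; full resolution isn't needed.
        for y in stride(from: 0, to: height, by: 16) {
            for x in stride(from: 0, to: width, by: 16) {
                let p = y * bytesPerRow + x * 4          // BGRA layout
                bSum += Int(bytes[p]); gSum += Int(bytes[p + 1]); rSum += Int(bytes[p + 2])
                count += 1
            }
        }
        guard count > 0 else { return false }
        let (r, g, b) = (rSum / count, gSum / count, bSum / count)
        // "Lots of red but little green/blue" -- tune these thresholds for your board.
        return r > 120 && r > g * 2 && r > b * 2
    }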

Can I do a Swipe Transition with CIImageAccumulator in iOS?

I have two images, one on top of the other. The image on top should disappear or appear as I roll the iPad up/down. The effect should be like a swipe transition.
To do this, if I redraw the top image at every angle the device is rolled to, the application slows down and the transition effect does not play smoothly.
In the Core Image Programming Guide, I saw a topic for Imaging Dynamical Systems.
Is it useful for my situation?
Unfortunately CIImageAccumulator is not available in the iOS 5 SDK (at least not in 5.1).
You can create your own CIFilter or use OpenGL shaders to make it faster.
Also, I don't think you need to redraw the image at every angle; people will barely notice. You could redraw every 5 or even 10 degrees, according to your needs.
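One way to put that together, sketched in Swift: Core Image ships a CISwipeTransition filter, and you can drive its inputTime from the device roll while skipping redraws until the angle has moved by a few degrees. The image view and the two CIImages are placeholders for your own content:

    import CoreImage
    import CoreMotion
    import UIKit

    final class RollSwipeController {
        private let motionManager = CMMotionManager()
        private let context = CIContext()
        private let filter = CIFilter(name: "CISwipeTransition")!
        private var lastRenderedRoll: Double = .nan

        let imageView: UIImageView
        init(imageView: UIImageView, bottom: CIImage, top: CIImage) {
            self.imageView = imageView
            filter.setValue(bottom, forKey: kCIInputImageKey)
            filter.setValue(top, forKey: kCIInputTargetImageKey)
            filter.setValue(CIVector(cgRect: bottom.extent), forKey: kCIInputExtentKey)
            filter.setValue(bottom.extent.width, forKey: kCIInputWidthKey)
        }

        func start() {
            motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
            motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
                guard let self = self, let roll = motion?.attitude.roll else { return }
                let degrees = roll * 180 / .pi
                // Skip the redraw unless the roll has moved by at least 5 degrees.
                guard self.lastRenderedRoll.isNaN ||
                      abs(degrees - self.lastRenderedRoll) >= 5 else { return }
                self.lastRenderedRoll = degrees

                // Map, say, 0...45 degrees of roll onto the 0...1 transition time.
                let t = min(max(degrees / 45, 0), 1)
                self.filter.setValue(t, forKey: kCIInputTimeKey)
                if let output = self.filter.outputImage,
                   let cgImage = self.context.createCGImage(output, from: output.extent) {
                    self.imageView.image = UIImage(cgImage: cgImage)
                }
            }
        }
    }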

Measuring distance with iPhone camera

How can I implement a way to measure distances in real time (using the video camera?) on the iPhone, like this app that uses a card, comparing the apparent size of the card against its known size to get the actual distance?
Are there any other ways to measure distances? Or how to go about doing this using the card method? What framework should I use?
Well, you do need something for reference, hence the use of the card. That said, after watching the video for the app, I can't say it seems very user friendly.
So you either need a reference object of some known size, or you need to deduce the size from the image itself. One idea I just had that might help is using the iPhone 4's flash (I'm sure it's very complicated, but it might just work for some cases).
Here's what I think.
When the user wants to measure something, he takes a picture of it, but you actually take two separate images, one with the flash on and one with the flash off. Then you can analyze the lighting differences between the images and the flash reflection to determine the scale of the image. This will only work for close and not too shiny objects, I guess.
But that's about the only other way I've thought of for deducing scale from an image without any fixed reference objects.
I like Ron Srebro's idea and have thought about something similar -- please share if you get it to work!
An alternative approach would be to use the auto-focus feature of the camera. Point-and-shoot cameras often have a range finder that they use to auto-focus. The iPhone doesn't have this, and the f-stop is fixed. However, users can change the focus by tapping the camera screen, and the phone can also switch between regular and macro focus.
If the API exposes the current focus settings, maybe there's a way to use this to determine range?
Another solution may be to use two laser pointers.
Basically, you would shine two laser pointers at, say, a wall, keeping them parallel. The further back you go, the closer together the two dots will appear in the video, even though they remain the same physical distance apart. You can then come up with a simple formula to estimate the distance based on how far apart the dots appear in the image.
See this thread for more details: Possible to measure distance with an iPhone and laser pointer?.
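For what it's worth, that formula falls out of the pinhole camera model: if the beams stay D metres apart in the world and their dots appear d pixels apart in an image taken with a focal length of f pixels, the distance to the wall is roughly f * D / d. A tiny Swift sketch (the focal length in pixels is an assumption you would have to calibrate for the particular camera, e.g. by photographing a ruler at a known distance):

    // Pinhole-model distance estimate from two parallel laser dots.
    func estimatedDistance(beamSeparationMetres: Double,
                           pixelSeparation: Double,
                           focalLengthPixels: Double) -> Double? {
        guard pixelSeparation > 0 else { return nil }
        return focalLengthPixels * beamSeparationMetres / pixelSeparation
    }

    // Example: beams 10 cm apart, dots 80 px apart, focal length ~2800 px
    // gives roughly 2800 * 0.1 / 80 = 3.5 metres.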