I need to implement a meter, just like a sound meter, for my iPhone app, but I am a little bit confused about its implementation.
Please help: how can I implement it?
The source code for the Android sound recorder's VU Meter is available online and should get you started.
You can use the simple and easily extensible OpenGL-based data visualization framework gl-data-visualization-view to display a VU meter. You just add a GLDataVisualizationView, set the visualization type to VU meter, and set the value to visualize. There is also an iPhone sample project that shows how to use the framework.
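The framework's own API isn't spelled out here, but if you need a level value to feed into whatever VU-meter view you end up using, one common approach is AVAudioRecorder's built-in metering. Below is a rough sketch under that assumption (ARC assumed, class name hypothetical): the recorder writes to /dev/null because only the levels matter, and the decibels-to-0..1 mapping is a simple choice you may want to tune.

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
#include <math.h>

@interface LevelMeter : NSObject
@property (nonatomic, strong) AVAudioRecorder *recorder; // kept alive while metering
- (void)startMetering;
@end

@implementation LevelMeter

- (void)startMetering
{
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];

    // Record to /dev/null: we only want the level metering, not the file.
    NSDictionary *settings = @{ AVFormatIDKey         : @(kAudioFormatAppleLossless),
                                AVSampleRateKey       : @44100.0,
                                AVNumberOfChannelsKey : @1 };
    self.recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:@"/dev/null"]
                                                settings:settings
                                                   error:NULL];
    self.recorder.meteringEnabled = YES;
    [self.recorder record];

    [NSTimer scheduledTimerWithTimeInterval:0.05
                                     target:self
                                   selector:@selector(levelTimerFired:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)levelTimerFired:(NSTimer *)timer
{
    [self.recorder updateMeters];
    float dB = [self.recorder averagePowerForChannel:0]; // roughly -160 dB (silence) up to 0 dB
    float level = powf(10.0f, dB / 20.0f);               // crude 0..1 mapping for the meter
    // Hand `level` to the meter view here.
}

@end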
Actually, I need a region over the camera overlay, like most QR code scanner apps have.
When a square box comes within that region, the app should focus on it and take a picture of it automatically. Any idea how to implement this? I was using the UIImagePickerController class, but after doing some googling I found that I need to use the AVFoundation framework, which unfortunately I am not familiar with.
Any code or tutorial would be helpful. Please let me know how I can implement this.
One more thing: if I take a picture, can I make the picture only the size of the region?
Yes, you are correct. You will need to use AV Foundation to implement this. Have a look at the 'Using the Camera with AV Foundation' video from the WWDC 2010 session videos to get an overview of the framework.
AV Foundation has no dependencies on UIKit, so you will get some nice performance increases over UIImagePickerController. It will also give you full access to the camera.
When using AV Foundation you are in control of the device capture settings, i.e. flash, focus mode and exposure, including their points of interest. Have a look at the programming guide to see how to use these, otherwise the device behaviour may differ from what you expect.
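As a rough sketch of that flow (not taken from the guide itself, just the standard lock/configure/unlock pattern), assuming the default back camera and a point of interest at the centre of the frame:

#import <AVFoundation/AVFoundation.h>
#import <CoreGraphics/CoreGraphics.h>

// Lock the default camera and set flash, focus and exposure, roughly as described above.
static void ConfigureCaptureDevice(void)
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    if (![device lockForConfiguration:&error]) {
        NSLog(@"Could not lock device: %@", error);
        return;
    }

    // Points of interest use normalized coordinates: (0,0) is top-left and
    // (1,1) bottom-right in the sensor's (landscape) orientation.
    CGPoint centre = CGPointMake(0.5f, 0.5f);

    if (device.isFocusPointOfInterestSupported &&
        [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        device.focusPointOfInterest = centre;
        device.focusMode = AVCaptureFocusModeAutoFocus;
    }
    if (device.isExposurePointOfInterestSupported &&
        [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        device.exposurePointOfInterest = centre;
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    }
    if (device.hasFlash) {
        device.flashMode = AVCaptureFlashModeAuto;
    }

    [device unlockForConfiguration];
}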
You can also download an example of an application that uses AV Foundation to implement the camera here.
Once you're up and running with that, have a look at this tutorial to get started with the overlay on the camera.
One more thing: if I take a picture, can I make the picture only the size of the region?
Yes, you will be able to implement this. You can also configure the AV Foundation session itself to output the lowest practical resolution.
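For the region-sized picture, here is a hedged sketch of both ideas: lower the session preset when you configure the session, and crop the returned photo to your overlay. `overlayRect` below is a hypothetical CGRect that you have already mapped from your overlay view's coordinates into the captured image's pixel coordinates, which you have to work out yourself.

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Ask the session for a lower resolution when you build it.
static void ConfigureSessionResolution(AVCaptureSession *session)
{
    if ([session canSetSessionPreset:AVCaptureSessionPresetMedium]) {
        session.sessionPreset = AVCaptureSessionPresetMedium;
    }
}

// Crop the captured photo down to the overlay region.
static UIImage *CroppedImage(UIImage *fullImage, CGRect overlayRect)
{
    CGImageRef croppedRef = CGImageCreateWithImageInRect(fullImage.CGImage, overlayRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:fullImage.scale
                                     orientation:fullImage.imageOrientation];
    CGImageRelease(croppedRef);
    return cropped;
}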
I have an equalizer view with 10 bars in OpenGL ES which can light up and down. Now I'd like to drive this equalizer view from the background music that is playing in iOS.
Someone suggested that what I need is a Fast Fourier Transform to transform the audio data into something else. But since there are so many audio visualizations floating around, my hope is that there is an open-source library or something else that I could start with.
Maybe there are open source iOS projects which do audio visualization?
Yes.
You can try this Objective-C library, which I wrote for exactly this purpose. What it does is give you an interface for playing files from URLs and then getting real-time FFT and waveform data, so that you can feed your OpenGL bars (or whatever graphics you're using to visualise the sound). It also tries to deliver very accurate results.
If you want to do it in Swift, you can take a look at this example, which is cleaner and also shows how to actually draw the equalizer view.
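If you would rather roll the FFT part yourself instead of taking a library, the core of it on iOS is usually the Accelerate framework. Here is a hedged sketch (not taken from either project above) that turns a buffer of mono float samples into averaged magnitudes for a fixed number of bars; getting the raw samples out of the playing audio (for example with an audio tap or Audio Unit) is a separate problem.

#import <Accelerate/Accelerate.h>
#include <stdlib.h>

// Compute the magnitude spectrum of `samples` (mono floats, length n = 2^log2n)
// with vDSP's real FFT, then average the bins into `bandCount` bands for the bars.
void ComputeBands(const float *samples, int log2n, float *bands, int bandCount)
{
    int n = 1 << log2n;
    int half = n / 2;

    FFTSetup setup = vDSP_create_fftsetup(log2n, kFFTRadix2);

    float *realp = calloc(half, sizeof(float));
    float *imagp = calloc(half, sizeof(float));
    DSPSplitComplex split = { realp, imagp };

    // Pack the real input into split-complex form and run an in-place real FFT.
    vDSP_ctoz((const DSPComplex *)samples, 2, &split, 1, half);
    vDSP_fft_zrip(setup, &split, 1, log2n, FFT_FORWARD);

    // Squared magnitude of each frequency bin.
    float *mags = calloc(half, sizeof(float));
    vDSP_zvmags(&split, 1, mags, 1, half);

    // Average the bins into equal-width bands (crude, but fine for 10 bars).
    int binsPerBand = half / bandCount;
    for (int b = 0; b < bandCount; b++) {
        float sum = 0.0f;
        vDSP_sve(mags + b * binsPerBand, 1, &sum, binsPerBand);
        bands[b] = sum / binsPerBand;
    }

    free(mags);
    free(realp);
    free(imagp);
    vDSP_destroy_fftsetup(setup);
}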
On the iPhone, I want to calculate the distance from the camera to the subject.
I was wondering whether the iPhone camera's "active autofocus" could provide me with the distance to the subject (point of interest)?
Many thanks in advance.
P.S. If you think it is not possible, please let me know ;)
As far as I'm aware, it's not possible - see here for more info on what you can and can't do. You can tell when the camera is autofocusing, but nothing more.
I'm sure there is probably some more info on focus/distance you could get using a private API call - but unless you don't care about getting into the App Store, that's not really an option.
That doesn't mean there isn't another way, though. There looks to be a good discussion on the topic here: Distance using OpenCV (you can compile OpenCV for iOS!)
I've been investigating the ability of the camera to measure short distances and, not finding anything yet, I came up with this shot in the dark. I haven't tried it yet, but the docs make me think it might work:
Create and configure an AVCaptureDevice representing the camera
Add an AVCaptureStillImageOutput to an AVCaptureSession and capture a still image through it
Check the Exif properties of the captured image for kCGImagePropertyExifSubjectDistance
I'd love to hear if anyone has been able to use a methodology like this to make accurate (less than 1 foot) distance measurements.
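For what it's worth, here is a hedged sketch of those steps in code. `stillOutput` is assumed to be an AVCaptureStillImageOutput already attached to a running AVCaptureSession; kCGImagePropertyExifSubjectDistance is a real ImageIO key, but whether the device actually reports a value there is exactly the open question.

#import <AVFoundation/AVFoundation.h>
#import <ImageIO/ImageIO.h>

// Capture a still image and look for the Exif subject distance attachment.
static void LogSubjectDistance(AVCaptureStillImageOutput *stillOutput)
{
    AVCaptureConnection *connection =
        [stillOutput connectionWithMediaType:AVMediaTypeVideo];

    [stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                              completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (imageSampleBuffer == NULL) {
            NSLog(@"Capture failed: %@", error);
            return;
        }
        // The Exif metadata travels with the sample buffer as an attachment.
        CFDictionaryRef exif = (CFDictionaryRef)CMGetAttachment(imageSampleBuffer,
                                                                kCGImagePropertyExifDictionary,
                                                                NULL);
        NSNumber *distance = [(__bridge NSDictionary *)exif
                              objectForKey:(__bridge NSString *)kCGImagePropertyExifSubjectDistance];
        NSLog(@"Exif subject distance: %@", distance); // nil if the camera didn't report one
    }];
}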
How can I do real-time face detection when I use the iPhone camera to take pictures?
Just like this example: http://www.morethantechnical.com/2009/08/09/near-realtime-face-detection-on-the-iphone-w-opencv-port-wcodevideo/ (this example doesn't provide the .xcodeproj, so I can't compile the .cpp file).
Another example: http://blog.beetlebugsoftware.com/post/104154581/face-detection-iphone-source
(can't be compiled)
Do you have any solution? Please give me a hand!
Wait for iOS 5:
Create amazing effects in your camera and image editing apps with Core Image. Core Image is a hardware-accelerated framework that provides an easy way to enhance photos and videos. Core Image provides several built-in filters, such as color effects, distortions and transitions. It also includes advanced features such as auto enhance, red-eye reduction and facial recognition.
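For reference, the face detection part (once iOS 5 is available) looks roughly like this with Core Image's CIDetector; `cameraImage` here is a hypothetical UIImage you got from the camera. For near-realtime use you would instead build the CIImage from each video frame's pixel buffer.

#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

// Detect faces in a single image with Core Image (iOS 5 and later).
static void LogFaces(UIImage *cameraImage)
{
    CIImage *image = [CIImage imageWithCGImage:cameraImage.CGImage];
    NSDictionary *options = [NSDictionary dictionaryWithObject:CIDetectorAccuracyLow
                                                         forKey:CIDetectorAccuracy];
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:options];
    for (CIFaceFeature *face in [detector featuresInImage:image]) {
        NSLog(@"Found face at %@", NSStringFromCGRect(face.bounds));
    }
}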
I want to adjust the exposure of the iPhone/iPod touch camera with intimate detail. I would prefer to take a series of photos with decreasing exposure times to obtain a sequence of images (for HDR reconstruction). Is this possible?
If not, what's the next best thing? It seems you can set a point of interest in the image for the autoexposure. Perhaps I could search for a dark/light region of the image and then use this exposurePointOfInterest to adjust the exposure, but this seems like a very indirect solution that is also error-prone. If anybody has tried an alternative, such an answer is also desirable.
iOS gives you control over frame durations via videoMinFrameDuration and videoMaxFrameDuration, and exposure times vary with the frame rate and frame duration. By setting the min and max frame duration to a particular value you lock the frame rate, and that will affect your exposure times. This is also a very indirect way of controlling exposure, but maybe it helps in your case. An example would look like this:
// `conn` is the AVCaptureConnection for your video output;
// CAPTURE_FRAMES_PER_SECOND is a constant you define, e.g. 30.
if (conn.isVideoMinFrameDurationSupported)
    conn.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (conn.isVideoMaxFrameDurationSupported)
    conn.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
Since you would have to decrease the shutter speed of the camera, this unfortunately does not appear to be possible, and more importantly, against the HIG:
Changing the behavior of iPhone external hardware is a violation of the iPhone Developer Program License Agreement. Applications must adhere to the iPhone Human Interface Guidelines as outlined in the iPhone Developer Program License Agreement section 3.3.7
Related article: Apple Removes Camera+ iPhone App From The App Store After Developer Reveals Hack To Enable Hidden Feature.
If it can be done programmatically, instead of with the hardware, you might have a chance, but then it's just an effect applied to an image, not a true long-exposure picture.
There are some simulated slow shutter apps that do get approved like Slow Shutter or Magic Shutter.
Related article: New iPhone Camera App “Magic Shutter” Hits The App Store.
This has been supported since iOS 8:
http://developer.xamarin.com/guides/ios/platform_features/intro_to_manual_camera_controls/
Have a look at AVCaptureExposureModeCustom and CaptureDevice.LockExposure...
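That Xamarin guide uses the C# bindings; the native Objective-C equivalent (assuming the default back camera) looks roughly like this. The duration below is just an illustrative value and should be clamped to the device's activeFormat exposure limits.

#import <AVFoundation/AVFoundation.h>

// iOS 8 and later: lock a custom exposure duration while keeping the current ISO.
static void LockCustomExposure(void)
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    if (![device lockForConfiguration:&error]) {
        NSLog(@"Could not lock device: %@", error);
        return;
    }
    if ([device isExposureModeSupported:AVCaptureExposureModeCustom]) {
        // 1/250 s; clamp within activeFormat.minExposureDuration and maxExposureDuration.
        CMTime duration = CMTimeMake(1, 250);
        [device setExposureModeCustomWithDuration:duration
                                              ISO:AVCaptureISOCurrent
                                completionHandler:^(CMTime syncTime) {
            // The new exposure takes effect on the frame with this timestamp;
            // capture the bracketed shot from here.
        }];
    }
    [device unlockForConfiguration];
}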
I tried to do this for my motion-activated camera app (Pocket Sentry) and I found that it is not possible to do this AND get approved in the App Store.
I have been trying to do this myself. I think it's possible only by using the exposure point of interest property. I am detecting the dark and bright spots and then adjusting the point accordingly.
Please refer to: Detecting bright/dark points on iPhone screen
Does anyone know a better way to do this?
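A minimal sketch of the point-of-interest adjustment described above, where `darkPoint` is a hypothetical normalized point (0,0 top-left to 1,1 bottom-right, in the sensor's landscape orientation) produced by the bright/dark detection:

#import <AVFoundation/AVFoundation.h>

// Bias the auto-exposure towards a point produced by the bright/dark analysis.
static void ExposeAtPoint(CGPoint darkPoint)
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        if (device.isExposurePointOfInterestSupported &&
            [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
            device.exposurePointOfInterest = darkPoint;
            device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        }
        [device unlockForConfiguration];
    }
}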
I am not too sure, but you should try using the AVFoundation classes to build the camera app, following Apple's sample code:
AVCam Sample Code
And then try to leverage the exposureMode property of the class:
exposureMode Class Reference