Realtime face-tracking on iPhone [closed]

Does anybody know which, currently, is the best library for building a real-time face-tracking solution for the iPhone? I've done some research, but I've only found quite old articles about OpenCV ports. I would like to know if there is any specific, reliable, fast (and possibly free) AR solution for overlaying an image in real time onto the face in the iPhone camera video stream (not simply onto a static image).
Any help (link, tutorial) would be great.
Thanks everybody!!
Elos

iOS 5 brings face detection as a native feature.
Basically you just have to configure an object to act as the video output stream's delegate (it could be your controller, for example) and use a CIDetector object, a class available only from iOS 5 onwards, to process the stream.
This CIDetector object will look for faces in each frame of your video and return a CIFaceFeature object with information about each face found, such as the eye and mouth positions and also the bounds (the rectangle the face was found inside).
You can check this blog for more implementation details:
https://web.archive.org/web/20130908115815/http://i.ndigo.com.br/2012/01/ios-facial-recognition/
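That post is from the Objective-C era; as a rough modern illustration of the same idea, here is a minimal Swift sketch of a video-output delegate feeding frames to a CIDetector. The class name, accuracy option, and logging are my own choices, and the AVCaptureSession setup that delivers frames to this delegate is omitted:

```swift
import AVFoundation
import CoreImage
import UIKit

// Minimal sketch; the class name is illustrative, not from the linked post.
final class FaceTracker: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    // CIDetector is the iOS 5+ class the answer refers to.
    private let detector = CIDetector(ofType: CIDetectorTypeFace,
                                      context: nil,
                                      options: [CIDetectorAccuracy: CIDetectorAccuracyLow])

    // Called for every frame delivered by an AVCaptureVideoDataOutput.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvPixelBuffer: pixelBuffer)

        // Each CIFaceFeature carries the face bounds plus eye/mouth positions.
        for case let face as CIFaceFeature in detector?.features(in: image) ?? [] {
            print("Face at \(face.bounds)")
            if face.hasMouthPosition {
                print("Mouth at \(face.mouthPosition)") // possible anchor point for an overlay
            }
        }
    }
}
```

The face bounds and feature positions are what you would use to position an overlay image on top of the live preview.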

OpenCV is the best, I think.
Check out this tutorial:
http://www.morethantechnical.com/2009/08/09/near-realtime-face-detection-on-the-iphone-w-opencv-port-wcodevideo/

https://github.com/beetlebugorg/PictureMe
A starting point; the author is using OpenCV.

Related

Read text from image iPhone SDK [closed]

I have an image on my device which I captured with the camera. The image contains readable text. I want to convert that image into text, i.e. extract the text from the image and display it.
I went through the Tesseract demo, but not all of the image's text was converted. I know there are a few paid SDKs, like the ABBYY SDK, available for this, but I was looking for a free option.
Are there other SDKs available for the same?
See this for how to do something quick on iOS with Tesseract. I doubt you will get the accuracy you want, though. So far I haven't found a good open-source solution, because the iPhone camera is not well suited to this problem. There are a few online API options that do better.
Oh, and one word of advice: don't pay for anything without trying it in your situation :)
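For context, "something quick with Tesseract" usually means calling it through a wrapper. Here is a minimal Swift sketch assuming the TesseractOCRiOS (G8Tesseract) wrapper; the wrapper itself is not named in the answer, so treat the names as an assumption rather than a drop-in recipe:

```swift
import UIKit
import TesseractOCR // assumption: the TesseractOCRiOS pod, not named in the answer

// Rough sketch: run Tesseract over a camera photo and return whatever text it finds.
func recognizeText(in photo: UIImage) -> String? {
    guard let tesseract = G8Tesseract(language: "eng") else { return nil }
    tesseract.image = photo   // accuracy improves a lot if you crop, deskew, and binarize first
    tesseract.recognize()     // blocking OCR pass; run it off the main queue in a real app
    return tesseract.recognizedText
}
```

As the answer says, expect mediocre accuracy on raw camera shots; most of the real work is in pre-processing the image before it ever reaches Tesseract.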

Music pitch affecting a game [closed]

In Windows Media Player, do you know that music visualization graph that changes based on frequency and pitch? What I want is to implement this in an iPhone game.
I'll try to explain this as well as I can. I will be playing classical music in the game, and I want to use the music's volume/pitch/whatever it is called to affect gameplay. For example, if the volume suddenly rises in the music (not the volume of the iPhone, but the dynamics of the recording itself), it would increase the chance of a spawn or something.
I'm not asking for a guide on how to implement this; I want to know if there is something that can give me numbers based on the pitch/volume/high and low notes of the song playing in the game.
Oh, and if anyone can tell me the name of the music graph I am looking for, it would be greatly appreciated.
This sample shows how to do what you want. The visualizer in WMP uses the amplitude (volume) of the signal, as well as frequency information (most likely obtained with a Fast Fourier Transform), to construct the visualization effect.
You can also use the simpler AVAudioPlayer API if you're interested only in responding to the music's current volume level (and want to skip the frequency-analysis part): enable its level metering and poll the player periodically for the current average and peak power.
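A minimal Swift sketch of that metering approach; the class name, polling interval, and the -10 dB "loud" threshold are arbitrary choices for illustration:

```swift
import AVFoundation

// Minimal sketch of level metering with AVAudioPlayer; names and constants are made up.
final class MusicLevelMonitor {
    private let player: AVAudioPlayer
    private var timer: Timer?

    init(fileURL: URL) throws {
        player = try AVAudioPlayer(contentsOf: fileURL)
        player.isMeteringEnabled = true   // metering must be enabled before it reports anything
        player.play()
    }

    func start(onLoudMoment: @escaping () -> Void) {
        // Poll ten times a second; AVAudioPlayer has no push-style volume callback.
        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            self.player.updateMeters()
            let power = self.player.averagePower(forChannel: 0) // in dBFS, 0 is loudest
            if power > -10 {        // arbitrary "the music got loud" threshold
                onLoudMoment()      // e.g. bump the spawn probability here
            }
        }
    }
}
```

In a game you would call start once and adjust your spawn probability inside the closure whenever the music swells.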

Are there any image effects libraries (e.g. lomo) for Android or iPhone? [closed]

Are there any image effect libraries (e.g. lomo, watercolor, sketch, etc.) that can be used on, and are suitable for, mobile devices?
Libraries like JJIL, simple-iphone-image-processing, JH Labs, the ImageMagick port to iPhone, and OpenCV have been mentioned in various previous posts.
I want libraries where I don't have to care about the effect algorithms, but many of the libraries mentioned above still leave that work to me.
What I need are libraries that already provide finished effect functions, so that I just call a function to apply an effect to a photo (it's fine if I have to set parameters and attributes myself).
Are there any more suggestions?
Free is good, commercial would be fine.
Thank you.
You could try the effects API from Aviary. Here: http://developers.aviary.com/
I never used it, though, so I can't say whether it's exactly what you are looking for.
Potentially worth your while:
OpenCV
Simple iPhone Image Processing

Fire effect with OpenGL on iPhone [closed]

Hello, does anyone know a good tutorial or a pointer for fire/flame effects on the iPhone? I know I will have to use OpenGL but have no clue where to start.
Cheers.
You would probably need a particle engine for this. If you wish to learn to create such effects yourself, it might be tough. You could instead use another framework like cocos2d for iPhone. But if you want to do it in pure OpenGL, search along the lines of 'particle engine / particle generation on iPhone' on Stack Overflow or even Google. You'll get many good pointers. Good luck.
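To give a concrete idea of what a "particle engine" boils down to if you roll your own, here is a minimal CPU-side Swift sketch. All names and constants are invented for illustration, and the actual rendering (with OpenGL or anything else) is left out:

```swift
import simd

// One flame particle: spawned at the base of the fire, drifts upward, fades out.
struct Particle {
    var position: SIMD2<Float>
    var velocity: SIMD2<Float>
    var life: Float            // seconds remaining; used to fade color/alpha when drawing
}

// Tiny CPU-side particle engine; a renderer would draw `particles` every frame.
final class FireEmitter {
    private(set) var particles: [Particle] = []
    private let origin: SIMD2<Float>

    init(origin: SIMD2<Float>) { self.origin = origin }

    func update(deltaTime dt: Float) {
        // Spawn a few new particles at the emitter with random upward velocities.
        for _ in 0..<5 {
            let vx = Float.random(in: -10...10)
            let vy = Float.random(in: 40...80)
            particles.append(Particle(position: origin,
                                      velocity: SIMD2(vx, vy),
                                      life: Float.random(in: 0.5...1.2)))
        }
        // Integrate motion and age; drop dead particles.
        for i in particles.indices {
            particles[i].position += particles[i].velocity * dt
            particles[i].life -= dt
        }
        particles.removeAll { $0.life <= 0 }
    }
}
```

A framework emitter (cocos2d's, for example) does essentially this plus texturing and blending, which is where the "fire" look comes from.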
Cocos2D has a great particle emitter you can use. You can also try Corona for iPhone, which is great.
Here's a link to a page that has a particle effect tutorial. This site used to have a fire effect article, but the link was removed. This answer used to have the old link.

Learning to use layers and Core Animation on iPhone [closed]

Except for the official docs, what is a good resource for learning how to use CALayers and Core Animation efficiently/correctly?
Beyond Apple's documentation, I highly recommend Marcus Zarra and Matt Long's book "Core Animation: Simplified Animation Techniques for Mac and iPhone Development", as well as Bill Dudney's "Core Animation for Mac OS X and the iPhone". If you had to pick just one right now, I'd go with the former, which is newer, contains more information on iPhone-specific issues, and is beautifully illustrated.
There are always the core-animation tagged questions here on Stack Overflow, which include many interesting answers on the use of layers and animations.
I've compiled what I know about Core Animation in the detailed class notes for the course I teach on iPhone development. Those notes can be downloaded from here (in VoodooPad format). EDIT (6/29/2010): The video for the corresponding Core Animation session is available with the rest of the class on iTunes U. I provide many examples of the use of layers and complex animations, including sample code.
Even though they target Mac OS X (CALayer on the iPhone and on Mac OS X are close), you can learn a lot from the following links:
http://theocacao.com/document.page/527
http://theocacao.com/document.page/555
http://www.claireware.com/blog_files/iphone_animation_view_with_sound.html
For the rest, Google is your friend.
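For a flavor of what those resources cover, here is a small Swift sketch of a layer plus an explicit animation; the geometry, colors, and timing are arbitrary choices:

```swift
import UIKit
import QuartzCore

// Tiny illustration: add a CALayer to a view and pulse its opacity with Core Animation.
func addPulsingLayer(to view: UIView) {
    let layer = CALayer()
    layer.frame = CGRect(x: 20, y: 20, width: 100, height: 100)
    layer.backgroundColor = UIColor.red.cgColor
    layer.cornerRadius = 8
    view.layer.addSublayer(layer)

    // Explicit animation: the render server animates opacity, no timer-driven redraws needed.
    let fade = CABasicAnimation(keyPath: "opacity")
    fade.fromValue = 1.0
    fade.toValue = 0.2
    fade.duration = 0.8
    fade.autoreverses = true
    fade.repeatCount = .infinity
    layer.add(fade, forKey: "pulse")
}
```

The point the books and class notes above drive home is exactly this: let the layer tree and the animation server do the work instead of redrawing views yourself.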