Capturing camera framebuffer on iPhone 3G (not S)?

I've downloaded a free application from the App Store (a very nice application, by the way) called ReadTheQRCode, and it doesn't ask you to take a picture to decode the QR code. My point is: is the application using the camera framebuffer on the iPhone 3G, or is it taking several pictures in quick succession, omitting the iris animation and the edit step, and decoding those? Can anyone with more experience take a look at this app and give an opinion?
Thanks in advance!

There are -[UIImagePickerController takePicture] (as of 3.1) and UIGetScreenImage(). Although the latter is undocumented, Apple allows apps that use it into the store (see this thread on the Apple Dev Forums).
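For reference, a minimal sketch of the undocumented approach. UIGetScreenImage() has no public header, so the extern declaration below is the commonly reported signature, not an official one:

    // UIGetScreenImage() is private; this declaration is the commonly
    // reported signature -- there is no official header for it.
    CGImageRef UIGetScreenImage(void);

    // Grab the current screen contents (including the live camera preview)
    // as a UIImage. The caller owns the returned CGImage.
    - (UIImage *)currentScreenImage {
        CGImageRef screen = UIGetScreenImage();
        UIImage *image = [UIImage imageWithCGImage:screen];
        CGImageRelease(screen);
        return image;
    }

An app could call this repeatedly on a timer while the camera preview is on screen and feed each frame to a QR decoder, which would explain the no-shutter behavior.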

Related

Best way to build a camera app on iPhone

I am thinking of building a camera application with the ability to do image processing (adjust contrast, apply different image filters) while you are taking a picture or after the picture has been taken.
The app will also support drag-and-drop icons.
At the end you are able to export the edited images either to the camera roll or to app memory.
There are already many apps like this out there (Line Camera, etc.).
Just wondering what is the best way to build such an app.
Can I build the app purely with Objective-C and the iOS SDK, or do I need to build it with C++/cocos2d, etc.?
Thanks for your help!
Your question is very broad, so here is a broad answer...
Accessing the camera/photo library
First you'll need to access the camera using UIImagePickerController to either take a new photo or grab one from your photo library. You can read up on how to accomplish this here: Camera Programming Topics for iOS
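A minimal sketch of presenting the picker (this assumes a view controller that adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate; manual retain/release era assumed):

    - (void)showCamera {
        if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
            return; // no camera (e.g. the simulator); fall back to the photo library
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.delegate = self;
        [self presentViewController:picker animated:YES completion:nil];
        [picker release]; // presentation retains it
    }

    // Delegate callback with the captured or chosen image.
    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info {
        UIImage *photo = [info objectForKey:UIImagePickerControllerOriginalImage];
        // ... hand `photo` to your editing pipeline ...
        [picker dismissViewControllerAnimated:YES completion:nil];
    }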
Image Manipulation
AviarySDK has much of this already built for you. Very easy to set up and use in your apps. You can download their sample app for free in the app store if you want to see what it can do. Check it out here: http://aviary.com/
Alternatively, read up on Core Image if you'd like to avoid third-party libraries. See Core Image Programming Guide for more information.
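For example, a simple contrast adjustment with the built-in CIColorControls filter (a sketch; error handling omitted):

    #import <CoreImage/CoreImage.h>

    // Boost contrast with Core Image (iOS 5+). CIColorControls and its
    // inputContrast key are documented built-in filter parameters.
    - (UIImage *)imageByBoostingContrast:(UIImage *)source {
        CIImage *input = [CIImage imageWithCGImage:source.CGImage];
        CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
        [filter setValue:input forKey:kCIInputImageKey];
        [filter setValue:[NSNumber numberWithFloat:1.2f]
                  forKey:@"inputContrast"]; // values > 1.0 raise contrast
        CIImage *output = [filter outputImage];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef rendered = [context createCGImage:output
                                            fromRect:[output extent]];
        UIImage *result = [UIImage imageWithCGImage:rendered];
        CGImageRelease(rendered);
        return result;
    }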
There is absolutely no need for cocos2d, which is a game engine.
You can accomplish everything you mentioned using only Objective-C.
If you want real-time effects, you will need to dive into OpenGL. You can use GLKit if you target iOS 5 and above.
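A bare-bones GLKit setup looks something like this (iOS 5+; the actual filter would live in a fragment shader, and camera texture upload is omitted):

    #import <GLKit/GLKit.h>

    // Skeleton of a GLKViewController-based renderer for real-time effects.
    // GLKViewController acts as its GLKView's delegate, so overriding
    // glkView:drawInRect: is enough to get a per-frame draw callback.
    @interface EffectsViewController : GLKitViewControllerPlaceholder
    @end

is not quite right as a placeholder; concretely:

    @interface EffectsViewController : GLKViewController
    @end

    @implementation EffectsViewController
    - (void)viewDidLoad {
        [super viewDidLoad];
        EAGLContext *context =
            [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        GLKView *view = (GLKView *)self.view;
        view.context = context;
        [EAGLContext setCurrentContext:context];
    }

    // Called each frame; draw a full-screen quad with your filter shader here.
    - (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        // ... bind the camera frame as a texture and draw with your shader ...
    }
    @end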

Is it possible to read iPhone or Android display data through the audio jack?

I'm not very well versed in the iPhone and Android APIs, so please bear with me if this is a stupid question.
As I understand it, Square's card reader works by converting the magnetic information on the card stripe into an audio tone that its software can then process. [1]
In a similar way, is there a way to somehow read what exactly is being displayed on the device screen simply through a small device inserted into the audio jack on that device?
[1] http://www.quora.com/How-does-Squares-hardware-work
It's not quite clear what you wish to achieve. You can indeed make an app that would output a representation (perhaps audio frequency-shift keying?) of the screen's contents to the iPhone's audio jack.
The iPhone (and other iOS-based devices) use TRRS connectors for bi-directional audio (and hence arbitrary modulated data) communication and there are well-supported publicly-documented APIs for using these interfaces.
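To make the idea concrete, here is a toy FSK modulator in plain C. The frequencies, bit rate, and sample rate are arbitrary illustrative choices, not any real protocol; the resulting samples would then be played out through the standard audio APIs (Audio Queue or Audio Units):

    #include <math.h>
    #include <stdint.h>

    #define kSampleRate    44100.0
    #define kSamplesPerBit 441      /* 100 bits per second */
    #define kFreqZero      1200.0   /* tone for a 0 bit */
    #define kFreqOne       2200.0   /* tone for a 1 bit */

    /* Render one byte (LSB first) as FSK samples into `out`,
       which must hold 8 * kSamplesPerBit floats. */
    void fsk_encode_byte(uint8_t byte, float *out) {
        for (int bit = 0; bit < 8; bit++) {
            double freq = ((byte >> bit) & 1) ? kFreqOne : kFreqZero;
            for (int i = 0; i < kSamplesPerBit; i++) {
                double t = (bit * kSamplesPerBit + i) / kSampleRate;
                out[bit * kSamplesPerBit + i] =
                    (float)sin(2.0 * M_PI * freq * t);
            }
        }
    }

A real modem would also smooth the phase at bit boundaries and add framing/checksums, but this shows the basic principle behind Square-style audio-jack data transfer.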
That said, if you're writing your own app: why would you want to output the contents of the screen? If you are developing the app in question, why not transmit the salient data in a more effective manner? Which leads me to my next assumption:
You want to read what's being displayed on the device's screen at any time, not just when an app of your creation is open. In this case, the answer is that it is not possible, except perhaps with a jailbroken solution. That said, I can't imagine a jailbroken solution being useful much longer, since iOS 5 introduced "display mirroring" by means of AirPlay.
On Android, I have no idea. :-)
No. The screen is not connected to the audio jack.
I think you could make an app that takes a screenshot and then encodes that image as audio to play it.
It won't sound good though :)
For this kind of task, there is the built-in camera.

Record iPhone app video (without simulator)

How can I record a video of an iPhone app? I can't use the Simulator because the application is very OpenGL-heavy and uses an accelerometer/gyroscope.
You can have the iPhone output video and capture it on another device: How can I use MPTVOutWindow iPhone undocumented class?
One of the links in that answer says it doesn't work on iOS 4+, but on a project I worked on less than a month ago we used that feature from an iPhone 4 to present, so I would challenge that (unless the developer who handled that portion used another approach).
I'm not sure there is a "native" solution here, short of building video capture into your actual app.
The cleanest way of handling this, assuming your game/app has a cleanly designed input pipeline, is probably to mock the input:
Put in debug-only code that lets you "record" all the raw input events (see the sketch after this list).
Using the device, play out the demo session to create a "recording".
Run the app in the simulator, and feed it the input "recording" you made on the device.
The simulator will run GL stuff just fine, and probably at a higher framerate than your device will.
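A rough sketch of the recording side. The InputRecorder class and its plist layout are made-up names for illustration (manual retain/release era assumed):

    // Debug-only recorder: stores touch events with timestamps so a
    // session can later be replayed in the simulator.
    @interface InputRecorder : NSObject {
        NSMutableArray *events;
    }
    - (void)recordTouch:(UITouch *)touch inView:(UIView *)view;
    - (BOOL)saveToFile:(NSString *)path;
    @end

    @implementation InputRecorder
    - (id)init {
        if ((self = [super init]))
            events = [[NSMutableArray alloc] init];
        return self;
    }
    - (void)recordTouch:(UITouch *)touch inView:(UIView *)view {
        CGPoint p = [touch locationInView:view];
        [events addObject:[NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithDouble:touch.timestamp], @"t",
            [NSNumber numberWithFloat:p.x],              @"x",
            [NSNumber numberWithFloat:p.y],              @"y",
            [NSNumber numberWithInteger:touch.phase],    @"phase",
            nil]];
    }
    - (BOOL)saveToFile:(NSString *)path {
        return [events writeToFile:path atomically:YES]; // plain plist
    }
    - (void)dealloc {
        [events release];
        [super dealloc];
    }
    @end

The replay side then reads the plist in the simulator build and feeds the events into the same input pipeline the real touch handlers use.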

How does the first-generation iPhone record video?

There are some apps that let the first-generation iPhone record video with reasonable quality. My question is, which API do those apps use? Do they use custom code for compression to MPEG? And how do they gather so many images per second from the camera, which only allows taking still pictures? The takePicture method of UIImagePickerController would be too slow for that.
"This app works by using the long-blacklisted UIGetScreenImage() function that I've written about in the past. (I discovered this use by scanning the application using my APIkit scanner.) Apple must have willingly given the go-ahead for its use, as their automated scanning must have picked the same function call. Good news on the "more flexible review" front. Since Apple recently gave the green light to the UStream video app, with Qik hot on its heels, it's likely we'll see more of these applications that provide iPhone video functionality for livecasting or recording from your device." - http://www.tuaw.com/2009/12/14/app-store-approved-app-brings-video-recording-to-iphone-3g-and-1/
My summary: I think that the app opens a "Take a picture" type of view, and then "records the screen and saves to video" by using the UIGetScreenImage() API.
Unless these apps run on iOS versions prior to 4.0, I very much suspect they use the standard Apple API, as the UIImagePickerController has specific support for recording video on supported devices.
See the startVideoCapture instance method within the UIImagePickerController class reference.
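A minimal sketch of that documented route (iOS 4+, video-capable hardware; manual retain/release assumed, and in practice the startVideoCapture call may need to be deferred until the camera is ready):

    #import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeMovie

    - (void)beginRecording {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
        picker.delegate = self;
        picker.showsCameraControls = NO; // we drive recording ourselves
        [self presentModalViewController:picker animated:YES];
        [picker startVideoCapture];      // end it later with -stopVideoCapture
        [picker release];                // presentation retains it
    }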

Use images from the camera without taking a photo

Is it possible to analyze images without taking a photo on the iPhone?
I want to analyze some matrix codes without taking any photo. I saw this on some Nokia models and it's quite impressive: really fast!
On the iPhone, I've only seen codes analyzed after taking a snapshot (or photo), and it's slower.
Also, the iPhone camera is not as good as some other mobile cameras.
I'm referring to the iPhone 3G/3GS.
thanks,
r.
Yes.
Under iOS 4.x, you can use the new AVFoundation framework to access camera pixel buffers on the 3G and 3GS, as well as the newer iPhone 4, without taking a photo/snapshot. The newer devices have higher-resolution cameras.
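A condensed sketch of such a capture pipeline (error handling trimmed; in real code the session should be stored in an instance variable so it isn't deallocated):

    #import <AVFoundation/AVFoundation.h>

    // Stream raw camera frames to a delegate callback; no shutter UI involved.
    - (void)startFrameCapture {
        AVCaptureSession *session = [[AVCaptureSession alloc] init]; // keep in an ivar in real code
        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (!input) return;
        [session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.videoSettings = [NSDictionary dictionaryWithObject:
            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
            forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
        dispatch_queue_t queue = dispatch_queue_create("frameQueue", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        [session addOutput:output];
        [session startRunning];
    }

    // Called for every frame; scan the pixel buffer for your matrix code here.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        CVImageBufferRef pixels = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixels, 0);
        // ... read CVPixelBufferGetBaseAddress(pixels) / width / height ...
        CVPixelBufferUnlockBaseAddress(pixels, 0);
    }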
I don't know if it's possible, but I think you should take a look at the AVCamDemo sample code from Apple's WWDC 2010. I think it could help you, even though I haven't read much of the code (I just compiled the Xcode project and tried it).
(Sorry, but I can't find the link again.)
You should be able to find the code in your emails if you are registered as an iPhone developer, or maybe on the developer.apple.com/iphone/ website.
By the way, don't count on doing it on the iPhone 3G (impossible, I think). The iPhone 3GS should be fast enough for this kind of app.
Good luck!