How does the first-generation iPhone record camera video? - iphone

There are some apps that let the first-generation iPhone record video at reasonable quality. My question is: which API do those apps use? Do they use custom code for MPEG compression? And how do they gather so many images per second from a camera that only allows taking still pictures? The takePicture method of UIImagePickerController would be too slow for that.

"This app works by using the long-blacklisted UIGetScreenImage() function that I've written about in the past. (I discovered this use by scanning the application using my APIkit scanner.) Apple must have willingly given the go-ahead for its use, as their automated scanning must have picked the same function call. Good news on the "more flexible review" front. Since Apple recently gave the green light to the UStream video app, with Qik hot on its heels, it's likely we'll see more of these applications that provide iPhone video functionality for livecasting or recording from your device." - http://www.tuaw.com/2009/12/14/app-store-approved-app-brings-video-recording-to-iphone-3g-and-1/
My summary: I think the app opens a "take a picture" type of view and then "records the screen and saves it to video" using the UIGetScreenImage() API.
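If that guess is right, the capture loop would have looked something like the following sketch (Objective-C, iPhone OS 3.x era). UIGetScreenImage() is a private function with no public header, so apps declared the prototype themselves; the frameTimer property and frameEncoder object here are hypothetical stand-ins for the app's own timer and MPEG encoder:

    // Private API: not declared in any SDK header, so declare it yourself.
    CGImageRef UIGetScreenImage(void);

    // Grab roughly 10 frames per second while the camera preview is on screen.
    - (void)startRecording {
        self.frameTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                           target:self
                                                         selector:@selector(grabFrame)
                                                         userInfo:nil
                                                          repeats:YES];
    }

    - (void)grabFrame {
        CGImageRef screen = UIGetScreenImage();  // retained snapshot of the screen
        [self.frameEncoder appendFrame:screen];  // hypothetical custom MPEG encoder
        CGImageRelease(screen);
    }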

Unless these apps run on iOS versions prior to 4.0, I very much suspect they use the standard Apple API, as UIImagePickerController has specific support for recording video on supported devices.
See the startVideoCapture instance method in the UIImagePickerController class reference.
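For reference, programmatic video capture with that public API looks roughly like this sketch (iOS 4.0+ on a camera-equipped device; assumes the presenting controller adopts the picker's delegate protocols and that MobileCoreServices is linked for kUTTypeMovie):

    #import <MobileCoreServices/MobileCoreServices.h>

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
    picker.showsCameraControls = NO;  // required for programmatic control
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
    [picker startVideoCapture];       // later, call -stopVideoCapture; the movie URL
                                      // arrives via -imagePickerController:
                                      //   didFinishPickingMediaWithInfo: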

Related

iPhone User Interface steps online demo

I've designed the user interface of an iPhone app, and I wish to show an online demo of it consisting, for the moment, of a series of static images representing the main steps of the app.
What do you think is the best way to do this simulation?
You know, something like a series of single web pages, optimized for mobile, each containing a single image linking to the next step, but I was wondering whether a more elegant and sophisticated solution exists, with a transition effect, for example, or other features.
I hope I was clear enough :)
Any help will be sincerely appreciated.
Thanks in advance for your attention.
This sounds like a good use for Briefs (see the Briefs App Website). It pretty much allows you to create an interface and step through it as if it were an application. I believe you'll need a developer account to run the app that reads the brief on your phone (since it couldn't be released in the App Store).
An alternative to static images would be to make a video. I use the iShowU video screen capture tool and set it to record the iPhone/iPad simulator window. I then run through the screens, type inputs, etc. In addition to recording the video, the program records my voice as I narrate the app's features.
As to transition effects, the video will capture whatever transition animations are in your program.
In the end you have a video that you could give your user, put on YouTube, or whatever.
You can do this easily and for free on AppDemoStore. You just have to upload the app screenshots and then add hotspots, which are used for navigating through the demo.
AppDemoStore also offers the sophisticated features you are asking for:
iPhone-specific transition effects such as slide up/down/left/right, fade, and flip
gesture icons for the hotspots
text boxes and callouts
multiple hotspots on a screen in order to create a simulation of the app (and not just a linear demo)
Here's a sample demo: http://www.appdemostore.com/demo?id=1699008
Moreover, demos created on AppDemoStore run in any browser and on any mobile device, and they can be embedded in your webpage or blog (as you would a YouTube video). With the free account, you can create up to 10 demos with unlimited screenshots and all the features specified above.
Regards,
Daniel

How to record game in cocos2d iPhone

I am developing a cocos2d app.
It's almost complete, but now I want to record the activity of my app as a video file, including the sound the app produces.
How can I implement this?
Can anybody help me?
Please suggest a way to implement this.
Thanks in advance.
The question isn't new, but since it isn't answered I thought I'd pitch in:
We provide an SDK called "Everyplay" that allows you to do exactly what you're looking for. It's free to use and lightweight.
We provide out-of-the-box integrations for Unity3D, cocos2d (1.x, 2.x), cocos2d-x, and you can of course integrate to a custom OpenGL-based game engine.
The documentation is available at https://developers.everyplay.com/doc
The documentation contains an example app key to use when developing, but you can of course sign up for your own client key at https://developers.everyplay.com/
There are many options, and the fact that your app is built with cocos2d doesn't matter much.
iSimulate works well. You can actually play the app on your device and record the gameplay as well as the touch events. This is important if you want to show user interaction in your app. You run the app in the simulator, but you control it from your device.
If you just want to record the app interaction without caring about showing users the touch events, you can use Screenflow or Jing or some other recording software. I used to use Jing (free), but Screenflow works better for me, and it also lets you create more advanced videos, like a trailer with effects. Edit: You should be able to capture touch events through the simulator with Screenflow too; you can choose whether to show them, and you can use different indicators for those events.
Search Google for Mac or iPhone recording software. There are many options. I had the best experience with Screenflow because I wanted to make a trailer and a gameplay video.
I'm developing a similar application, which allows the user to record activity within a cocos2d-x app.
I'm using a screen-capture method and then combining the frames using FFmpeg. The performance wasn't great, but it's the easiest way to achieve this.
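For the curious, the screen-capture half of that approach boils down to a glReadPixels() call per frame. A minimal sketch, with the FFmpeg hand-off left hypothetical:

    // Read the current OpenGL ES framebuffer into a CPU-side buffer.
    // Note: glReadPixels stalls the GPU, which is why this approach is slow.
    - (void)captureFrameWithWidth:(GLint)w height:(GLint)h {
        GLubyte *pixels = (GLubyte *)malloc((size_t)w * (size_t)h * 4);
        glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        // Hand the raw RGBA frame to FFmpeg (e.g. through a pipe to an ffmpeg
        // process, or via libavcodec); writeFrameToEncoder is hypothetical.
        writeFrameToEncoder(pixels, w, h);
        free(pixels);
    }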

Is it possible to read iPhone or Android display data through the audio jack?

I'm not very well versed in the iPhone and Android APIs, so please bear with me if this is a stupid question.
As I understand it, Square's card reader works by converting the magnetic information on the card stripe into an audio tone that its software can then process. [1]
In a similar way, is there a way to somehow read exactly what is being displayed on the device's screen through a small device inserted into its audio jack?
[1] http://www.quora.com/How-does-Squares-hardware-work
It's not quite clear what you wish to achieve. You can indeed make an app that outputs a representation of the screen's contents (perhaps via audio frequency-shift keying?) to the iPhone's audio jack.
The iPhone (and other iOS-based devices) use TRRS connectors for bi-directional audio (and hence arbitrary modulated data) communication and there are well-supported publicly-documented APIs for using these interfaces.
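To make the frequency-shift-keying idea concrete, here is a minimal sketch of the modulation step; the tone frequencies, baud rate, and function name are all illustrative, not any real Apple API. The resulting samples could then be played out of the headphone jack with, say, Audio Queue Services:

    #include <math.h>
    #include <stdint.h>
    #include <stddef.h>

    // Encode bytes as 16-bit PCM FSK: one tone for 0 bits, another for 1 bits.
    // `samples` must hold len * 8 * (sampleRate / baud) entries.
    void fskEncode(const uint8_t *data, size_t len, int16_t *samples,
                   double sampleRate, double baud, double f0, double f1) {
        size_t samplesPerBit = (size_t)(sampleRate / baud);
        size_t s = 0;
        for (size_t i = 0; i < len; i++) {
            for (int bit = 7; bit >= 0; bit--) {
                double freq = ((data[i] >> bit) & 1) ? f1 : f0;
                for (size_t j = 0; j < samplesPerBit; j++, s++) {
                    // Global sample index s keeps the phase continuous across bits.
                    samples[s] = (int16_t)(32767.0 * sin(2.0 * M_PI * freq * s / sampleRate));
                }
            }
        }
    }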
That said, if you're writing your own app: why would you want to output the contents of the screen? If you are developing the app in question, why not transmit the salient data in a more effective manner? Which leads me to my next assumption:
You want to read what's being displayed on the device's screen at any time, not just when an app of your creation is open. In this case, the answer is that it is not possible, with the possible exception of a jailbroken solution. That said, I can't imagine a jailbroken solution staying useful much longer, given that iOS 5 introduced "display mirroring" by means of AirPlay.
On Android, I have no idea. :-)
No. The screen is not connected to the audio jack.
I think you could make an app that takes a screenshot and then encodes that image as audio and plays it.
It won't sound good, though :)
For this kind of task, there is the built-in camera.

Record iPhone app video (without simulator)

How can I record a video of an iPhone app? I can't use the Simulator because the application is very OpenGL-heavy and uses the accelerometer/gyroscope.
You can have the iPhone output video and capture it on another device: How can I use MPTVOutWindow iPhone undocumented class?
One of the links in that answer says it doesn't work on iOS 4+, but on a project I worked on less than one month ago we used that feature from an iPhone 4 to present, so I would challenge that (unless the developer who handled that portion used another approach).
I'm not sure there is a "native" solution here, short of building video capture into your actual app.
The cleanest way of handling this, assuming your game/app has a cleanly designed input pipeline, is probably to mock the input:
Put in debug-only code that lets you "record" all the raw input events (see the sketch after this list).
Using the device, play out the demo session to create a "recording".
Run the app in the simulator, and feed it the input "recording" you made on the device.
The simulator will run GL stuff just fine, and probably at a higher framerate than your device will.
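A sketch of step 1, the debug-only recording hook; InputRecorder and its methods are hypothetical, standing in for whatever serialization you choose:

    #if DEBUG
    // Log every touch so the session can be replayed in the simulator later.
    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        for (UITouch *touch in touches) {
            CGPoint p = [touch locationInView:self.view];
            // Store (timestamp, phase, position); playback re-synthesizes these
            // events and feeds them into the same input pipeline.
            [[InputRecorder sharedRecorder] recordPhase:UITouchPhaseBegan
                                                  point:p
                                              timestamp:event.timestamp];
        }
        [super touchesBegan:touches withEvent:event];
    }
    #endif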

Capturing camera framebuffer on iPhone 3G (not S)?

I've downloaded a free application from the App Store (a very nice application, incidentally) called ReadTheQRCode, and it doesn't ask you to take a picture to decode the QR code. My point is: is the application using the camera's frame buffer on the iPhone 3G, or is it taking several pictures at a time, omitting the iris animation and the edit step, and decoding those? Can anyone with more experience take a look at this app and give an opinion?
Thanks in advance!
There is -[UIImagePickerController takePicture] (as of 3.1) and UIGetScreenImage(). Although the latter is undocumented, Apple allows apps using it into the store (see this thread on the Apple Dev Forums).
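A quick sketch of the takePicture route, which sidesteps the iris animation and edit step by hiding the stock controls (the timer-driven repetition is my assumption about how such apps approximate a frame stream):

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.showsCameraControls = NO;  // no shutter UI, no edit step
    picker.allowsEditing = NO;
    picker.delegate = self;
    [self presentModalViewController:picker animated:NO];
    // Fire this repeatedly (e.g. from a timer) to capture still after still:
    [picker takePicture];  // each shot arrives via -imagePickerController:
                           //   didFinishPickingMediaWithInfo: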