How can I record a video of an iPhone app? I can't use the Simulator because the application is very OpenGL-heavy and uses an accelerometer/gyroscope.
You can have the iPhone output video and capture it on another device: How can I use MPTVOutWindow iPhone undocumented class?
One of the links in that answer says it doesn't work on iOS 4+, but on a project I worked on less than a month ago we used that feature from an iPhone 4 to present, so I would challenge that claim (unless the developer who handled that portion used another approach).
I'm not sure there is a "native" solution here, short of building video capture into your actual app.
The cleanest way of handling this, assuming your game/app has a cleanly designed input pipeline, is probably to mock the input:
Add debug-only code that lets you "record" all the raw input events.
Using the device, play out the demo session to create a "recording".
Run the app in the simulator, and feed it the input "recording" you made on the device.
The simulator will run GL stuff just fine, and probably at a higher framerate than your device will.
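To make the idea concrete, here is a rough sketch of the "record" side, under the assumption that your pipeline funnels raw events through one place. InputRecorder, its methods, and the plist-based file format are hypothetical names, not an existing API:

    // Sketch only: adapt the event dictionary's contents to whatever
    // raw events (touches, accelerometer samples, ...) your app uses.
    @interface InputRecorder : NSObject {
        NSMutableArray *events;     // timestamped raw input events
        NSTimeInterval  startTime;
    }
    - (void)recordEvent:(NSDictionary *)event;
    - (void)saveToFile:(NSString *)path;
    @end

    @implementation InputRecorder
    - (id)init {
        if ((self = [super init])) {
            events = [[NSMutableArray alloc] init];
            startTime = [NSDate timeIntervalSinceReferenceDate];
        }
        return self;
    }
    - (void)recordEvent:(NSDictionary *)event {
        // Stamp each raw event with a time offset so it can be
        // replayed on the same schedule later.
        NSMutableDictionary *stamped =
            [NSMutableDictionary dictionaryWithDictionary:event];
        [stamped setObject:[NSNumber numberWithDouble:
                    [NSDate timeIntervalSinceReferenceDate] - startTime]
                    forKey:@"t"];
        [events addObject:stamped];
    }
    - (void)saveToFile:(NSString *)path {
        // Plist serialization works as long as the event payloads are
        // plist types (strings, numbers, ...).
        [events writeToFile:path atomically:YES];
    }
    - (void)dealloc {
        [events release];
        [super dealloc];
    }
    @end

Replaying is the mirror image: load the array in the simulator build and feed each event back into the same pipeline entry point at its recorded offset.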
My application for the iPhone 4 reads data from the accelerometer and sends it to another application via TCP sockets. I need my app to work in the background, so what I did was:
I put an mp3 file in the application's Documents folder
I used the AVAudioPlayer class to play the file in a loop. It works.
I edited Info.plist and added the "Required background modes" key with "audio" enabled.
Still, the scheduler suspends the application whenever I press the iPhone's home button. Is there anything I missed?
I read Apple's documentation, but I didn't find a solution. A few thoughts on this:
Do I have to edit appDelegate.m?
Is it because I use AVAudioPlayer instead of the iPod?
Is it because I play an audio file from the application's Documents folder?
I read about one person changing the iOS Deployment Target from 4.0 to 3.2.1, but that didn't work for me.
And finally, say I get this to work: would the application still get data from the accelerometer?
On a side note, I don't want to submit the application to the App Store.
No, you will not receive accelerometer updates in background mode. As far as I know, it is not possible. Check "Executing Code in the Background" in Apple's documentation.
If you read the docs carefully, you will see that the whole background execution model is based on responding to specific events (the location and VoIP modes).
As for the audio mode, here is an extract from Apple's documentation:
Your application should limit itself to doing only the work necessary to provide data for playback while in the background. For example, a streaming audio application would download any new data from its server and push the current audio samples out for playback. You should not perform any extraneous tasks that are unrelated to playing the content.
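For reference, a minimal background-audio setup consistent with the steps the question describes looks roughly like this. It's a sketch, not tested against your exact project: the file name and the player property are placeholders, and the audio session category is the detail that is easiest to miss (without AVAudioSessionCategoryPlayback, playback, and the app, stops as soon as the app is backgrounded):

    #import <AVFoundation/AVFoundation.h>

    // Sketch only: "player" is assumed to be a retained AVAudioPlayer
    // property, and error handling is omitted for brevity.
    - (void)startBackgroundAudio {
        // Info.plist must contain the UIBackgroundModes key ("Required
        // background modes") with the "audio" value.
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                             NSUserDomainMask, YES);
        NSString *path = [[paths objectAtIndex:0]
                             stringByAppendingPathComponent:@"loop.mp3"];

        // Without the Playback category, audio stops when the device
        // is locked or the app is backgrounded.
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                               error:NULL];
        [[AVAudioSession sharedInstance] setActive:YES error:NULL];

        self.player = [[[AVAudioPlayer alloc]
                           initWithContentsOfURL:[NSURL fileURLWithPath:path]
                                           error:NULL] autorelease];
        self.player.numberOfLoops = -1;  // loop indefinitely
        [self.player play];
    }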
Not sure whether you have solved your issue, since this question was posted more than a year ago. Also, not sure whether playing audio is a must in your app. If both answers are no, my recent investigation may help a bit.
Here is how I got my app to collect accelerometer data in the background:
1. Follow this tutorial http://mobile.tutsplus.com/tutorials/iphone/ios-multitasking-background-location/ to get background location updates working.
2. Follow this tutorial http://jonathanhui.com/ios-motion to get the accelerometer working.
Then you have an app collecting accelerometer data in the background. Hope this helps.
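For what it's worth, a rough sketch combining the two tutorials' ideas might look like the following. The property names are illustrative, and it assumes Info.plist declares "location" in UIBackgroundModes:

    #import <CoreLocation/CoreLocation.h>
    #import <CoreMotion/CoreMotion.h>

    // Sketch only: locationManager and motionManager are assumed to be
    // retained properties on this class.
    - (void)startBackgroundAccelerometer {
        // The location updates are what keep the process alive in the
        // background; the accelerometer alone will not.
        self.locationManager = [[[CLLocationManager alloc] init] autorelease];
        self.locationManager.delegate = self;
        [self.locationManager startUpdatingLocation];

        // Core Motion keeps delivering samples for as long as the app
        // is not suspended.
        self.motionManager = [[[CMMotionManager alloc] init] autorelease];
        self.motionManager.accelerometerUpdateInterval = 0.1;  // 10 Hz
        [self.motionManager
            startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                 withHandler:^(CMAccelerometerData *data,
                                               NSError *error) {
            NSLog(@"x=%+.2f y=%+.2f z=%+.2f",
                  data.acceleration.x, data.acceleration.y, data.acceleration.z);
        }];
    }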
I'm not very well versed in the iPhone and Android APIs, so please bear with me if this is a stupid question.
As I understand it, Square's card reader works by converting the magnetic information on the card stripe into an audio tone that its software can then process. [1]
In a similar way, is there a way to somehow read what exactly is being displayed on the device screen simply through a small device inserted into the audio jack on that device?
[1] http://www.quora.com/How-does-Squares-hardware-work
It's not quite clear what you wish to achieve. You can indeed make an app that would output a representation (perhaps audio frequency-shift keying?) of the screen's contents to the iPhone's audio jack.
The iPhone (and other iOS devices) uses a TRRS connector for bi-directional audio (and hence arbitrary modulated data) communication, and there are well-supported, publicly documented APIs for using this interface.
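As an illustration of the frequency-shift keying idea (not Square's actual scheme), a toy encoder might fill a PCM buffer like this. The frequencies, bit duration, and framing are arbitrary choices, and the caller must allocate length * 8 * samplesPerBit samples:

    #include <math.h>
    #include <stddef.h>
    #include <stdint.h>

    // Toy encoder: each bit of the input becomes a 10 ms burst of one
    // of two tones. Phase is not continuous across bit boundaries,
    // which is fine for a toy but not for a real modem.
    static void fillFSKBuffer(const uint8_t *bytes, size_t length,
                              int16_t *samples, double sampleRate) {
        const double f0 = 1200.0;                     // tone for a 0 bit
        const double f1 = 2200.0;                     // tone for a 1 bit
        const size_t samplesPerBit = (size_t)(sampleRate * 0.01);

        size_t s = 0;
        for (size_t i = 0; i < length; i++) {
            for (int bit = 7; bit >= 0; bit--) {
                double freq = ((bytes[i] >> bit) & 1) ? f1 : f0;
                for (size_t j = 0; j < samplesPerBit; j++, s++) {
                    samples[s] = (int16_t)(32767.0 *
                        sin(2.0 * M_PI * freq * (double)j / sampleRate));
                }
            }
        }
    }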
That said, if you're writing your own app: why would you want to output the contents of the screen? If you are developing the app in question, why not transmit the salient data in a more effective manner? Which leads me to my next assumption:
You want to read what's being displayed on the device's screen at any time, not just when an app of your creation is open. In this case, the answer is that it is not possible, with the possible exception of a jailbroken solution. That said, I can't imagine a jailbroken solution being useful much longer, since iOS 5 introduced "display mirroring" by means of AirPlay.
On Android, I have no idea. :-)
No. The screen is not connected to the audio jack.
I think you could make an app that takes a screenshot and then encodes that image as audio and plays it.
It won't sound good though :)
For this kind of task, there is the built-in camera.
I'd like to stream video from the camera on an iOS device to a receiver via Wi-Fi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it out via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back.
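For reference, a minimal sketch of that documented capture pipeline: an AVCaptureSession feeding an AVCaptureVideoDataOutput whose delegate receives each frame as a CMSampleBufferRef. The captureSession property is an assumption (it must be retained somewhere), and capability/error checks are omitted:

    #import <AVFoundation/AVFoundation.h>

    - (void)startCapture {
        self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
        self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
        [self.captureSession addInput:input];

        AVCaptureVideoDataOutput *output =
            [[[AVCaptureVideoDataOutput alloc] init] autorelease];
        [output setSampleBufferDelegate:self
                                  queue:dispatch_queue_create("camera", NULL)];
        [self.captureSession addOutput:output];

        [self.captureSession startRunning];
    }

    // Each frame arrives here as a CMSampleBufferRef -- this is where
    // steps 2 and 3 (encode and stream) would hook in.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
      didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
             fromConnection:(AVCaptureConnection *)connection {
        // Encode/send the frame here.
    }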
For the second and third parts, you should check out the CFNetwork framework, especially CFFTPStream for streaming over FTP.
If you are only building this for yourself, you can always turn off the Auto-Lock feature in the settings. If, on the other hand, you would like to distribute this to other users, you could use a trick: play a muted sound every 10 seconds. This is more or less how all the alarm clocks in the App Store work. Here's a tutorial. =)
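For completeness, the documented programmatic equivalent of disabling Auto-Lock, which the paragraph above doesn't mention, is the application's idle timer:

    // Keeps the screen from locking while the app is frontmost;
    // re-enable the idle timer when streaming stops.
    [UIApplication sharedApplication].idleTimerDisabled = YES;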
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming (an example playlist is sketched after this list).
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
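For anyone unfamiliar with HTTP Live Streaming: the playlist the web server hands out is a plain .m3u8 text file. A minimal static example looks like this (the segment names are placeholders; a live stream would omit EXT-X-ENDLIST and keep appending segments while advancing EXT-X-MEDIA-SEQUENCE):

    #EXTM3U
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10,
    segment0.ts
    #EXTINF:10,
    segment1.ts
    #EXTINF:10,
    segment2.ts
    #EXT-X-ENDLIST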
Last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend if you'd like to take a look.
There are some apps that let the first-generation iPhone record video with reasonable quality. My question is: which API do those apps use? Do they use custom code for compression to MPEG? And how do they gather so many images per second from the camera, which only allows taking still pictures? The takePicture method of UIImagePickerController would be too slow for that.
"This app works by using the long-blacklisted UIGetScreenImage() function that I've written about in the past. (I discovered this use by scanning the application using my APIkit scanner.) Apple must have willingly given the go-ahead for its use, as their automated scanning must have picked the same function call. Good news on the "more flexible review" front. Since Apple recently gave the green light to the UStream video app, with Qik hot on its heels, it's likely we'll see more of these applications that provide iPhone video functionality for livecasting or recording from your device." - http://www.tuaw.com/2009/12/14/app-store-approved-app-brings-video-recording-to-iphone-3g-and-1/
My summary: I think that the app opens a "Take a picture" type of view, and then "records the screen and saves to video" by using the UIGetScreenImage() API.
Unless these apps run on iOS versions prior to 4.0, I very much suspect they use the standard Apple API, as UIImagePickerController has specific support for recording video on supported devices.
See the startVideoCapture instance method within the UIImagePickerController class reference.
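A minimal sketch of that documented route: present the camera picker in video mode, then trigger recording programmatically. In practice you should wait until the picker has fully appeared before calling startVideoCapture; delegate handling and availability checks (+isSourceTypeAvailable:) are omitted here:

    #import <MobileCoreServices/MobileCoreServices.h>  // for kUTTypeMovie

    - (void)startRecording {
        UIImagePickerController *picker =
            [[[UIImagePickerController alloc] init] autorelease];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
        picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
        picker.showsCameraControls = NO;  // hide the UI if you drive it yourself
        picker.delegate = self;
        [self presentModalViewController:picker animated:YES];

        // Recording ends with -stopVideoCapture; the movie is then
        // delivered through the picker's delegate.
        [picker startVideoCapture];
    }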
I know the iPhone can play video on an external screen if you have the Apple component output cable. I also know you can write an app that plays video. Is there a way to put those two things together and write an app that will play video specifically on an external screen?
This is currently not possible with the iPhone API. I have heard of apps that have done it on jailbroken phones, but there is no Apple-approved way of doing it at this time.
This can be done using private APIs, but it won't get into the store. This guy wrote a class to do it here: http://dragonforged.com/DFVideoOut.shtml I haven't used it myself, but it looks very simple.