We have an HTTP stream running on an iPad with iOS 4.3.3.
We are using MPMoviePlayerController. I am trying to change the playback rate in order to implement a custom fast-forward experience by using:
[player setCurrentPlaybackRate:2.0];
But it isn't working: if I log the current playback rate immediately after the line above, it still shows 1.0. Does this simply not work for a stream? The documentation doesn't say anything about it.
The currentPlaybackRate property indeed does not work with HTTP streaming. I would suggest filing a bug report with Apple on this issue. I assume it still won't work in the future, but perhaps another bug report will motivate the Apple folks to update their documentation accordingly.
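Not something confirmed by the documentation, but one possible workaround for a custom fast-forward is to nudge currentPlaybackTime on a timer instead of changing the rate; this assumes the stream is seekable at all. A rough Swift sketch, with all names illustrative:

import MediaPlayer

final class FastForwarder: NSObject {
    private let player: MPMoviePlayerController
    private var timer: Timer?

    init(player: MPMoviePlayerController) {
        self.player = player
        super.init()
    }

    func start() {
        // Jump 0.5 s ahead every 0.5 s; combined with normal playback
        // this approximates a 2x fast-forward.
        timer = Timer.scheduledTimer(timeInterval: 0.5, target: self,
                                     selector: #selector(step),
                                     userInfo: nil, repeats: true)
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }

    @objc private func step() {
        player.currentPlaybackTime += 0.5
    }
}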
We are creating an app in Swift on iOS that needs custom audio controls: a play button, a stop button, a current time/position indicator, the duration, and perhaps even a seek bar, for playing one audio file that is built into the application. My friend has worked hard on this project but so far has been unable to find a solution. Is this possible using MediaPlayer, or is another API easier, perhaps something similar to ExoPlayer for Android apps?
Thanks.
In my opinion, AVFoundation would probably be easiest, since it is built in: in Swift you just have to import AVFoundation, and you would probably want to create an AVAudioPlayer to play the audio. The API is pretty simple. If you need any help getting started, I found this article on Medium that shows how to play audio and how to control it with custom play and pause buttons: https://medium.com/yay-its-erica/creating-a-music-player-app-in-swift-3-53809471f663. I also found Apple's documentation on how to get the current time of the playing audio: https://developer.apple.com/documentation/avfoundation/avaudioplayer/1387297-currenttime. I hope I was able to help, and if you have any more questions, don't hesitate to ask!
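To illustrate, here is a minimal sketch of such a controller, assuming the bundled file is named song.mp3 (the file name and type name are placeholders):

import AVFoundation

final class AudioController {
    private let player: AVAudioPlayer

    init?() {
        // Load the audio file that ships inside the app bundle.
        guard let url = Bundle.main.url(forResource: "song", withExtension: "mp3"),
              let player = try? AVAudioPlayer(contentsOf: url) else { return nil }
        self.player = player
        player.prepareToPlay()
    }

    var duration: TimeInterval { player.duration }        // total length, for the UI
    var currentTime: TimeInterval { player.currentTime }  // current position, for the UI

    func play() { player.play() }

    func stop() {
        player.stop()
        player.currentTime = 0   // reset so the next play starts at the beginning
    }

    // Back a seek bar with this; fraction runs from 0.0 to 1.0.
    func seek(to fraction: Double) {
        player.currentTime = fraction * player.duration
    }
}

Wire the play and stop buttons to play() and stop(), and drive the time label and seek bar from currentTime and duration with a timer or display link.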
My application for iPhone 4 reads data from the accelerometer and sends it to another application via TCP sockets. I need my app to work in background mode, so what I did was:
I put an mp3 file in the application's Documents folder.
I used the AVAudioPlayer class to play the file in a loop. It works.
I edited Info.plist and added the "required background modes" option with "audio" enabled.
Still, the scheduler suspends the application whenever I press the iPhone's home button. Is there anything I missed?
I read Apple's documentation, but I didn't find a solution. A few thoughts on this:
Do I have to edit AppDelegate.m?
Is it because I use AVAudioPlayer instead of the iPod player?
Is it because I play an audio file from the application's Documents folder?
I read about one person changing the iOS Deployment Target from 4.0 to 3.2.1, but that didn't work for me.
And finally, say I get this to work: would the application still be getting data from the accelerometer?
On a side note, I don't want to submit the application to the App Store.
No, you will not receive accelerometer notifications in background mode. As far as I know, it is not possible. Check "Executing Code in the Background" in the documentation.
If you read the docs carefully, you will see that the whole background-execution model is based on responding to specific events (the location and VoIP modes).
As for the audio mode, here is an extract from Apple's documentation:
Your application should limit itself to doing only the work necessary to provide data for playback while in the background. For example, a streaming audio application would download any new data from its server and push the current audio samples out for playback. You should not perform any extraneous tasks that are unrelated to playing the content.
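For reference, here is a minimal Swift sketch of the background-audio setup the question describes, using current API names (the file name is a placeholder). Note that the audio session category must be set to playback for audio to keep running in the background, which may well be the step the question is missing:

import AVFoundation

// Assumes Info.plist declares UIBackgroundModes = ["audio"].
func startLoopingAudio() throws -> AVAudioPlayer {
    // Background audio keeps running only with the playback category.
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
    try AVAudioSession.sharedInstance().setActive(true)

    // "loop.mp3" stands in for the file the question copies into Documents.
    let url = Bundle.main.url(forResource: "loop", withExtension: "mp3")!
    let player = try AVAudioPlayer(contentsOf: url)
    player.numberOfLoops = -1   // -1 means loop indefinitely
    player.play()
    return player
}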
I am not sure whether you have solved your issue, since this question was posted more than a year ago. I am also not sure whether playing audio is a must in your app. If both answers are no, my recent investigation may help a bit.
Here is how I got my app to collect accelerometer data in the background:
1. Follow this tutorial to get background location working: http://mobile.tutsplus.com/tutorials/iphone/ios-multitasking-background-location/
2. Follow this tutorial to get the accelerometer working: http://jonathanhui.com/ios-motion
Then you have an app collecting accelerometer data in the background; a sketch combining the two steps follows below. Hope this helps.
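A minimal sketch of the approach those tutorials describe, using current Swift API names and assuming Info.plist declares the location background mode and the location-usage strings (all identifiers here are illustrative):

import CoreLocation
import CoreMotion

final class BackgroundAccelerometer: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()

    func start() {
        // Continuous location updates keep the app running in the background.
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()
        locationManager.allowsBackgroundLocationUpdates = true
        locationManager.startUpdatingLocation()

        // While the app stays alive, accelerometer updates keep arriving.
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 0.1   // 10 Hz
        motionManager.startAccelerometerUpdates(to: .main) { data, _ in
            guard let data = data else { return }
            // Forward data.acceleration.x/y/z over the TCP socket here.
        }
    }
}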
I'd like to stream video from the camera on an iOS device to a receiver via Wi-Fi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it out via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back.
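A minimal Swift sketch of that first step, assuming camera permission has been granted (the queue label is arbitrary):

import AVFoundation

final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Each frame arrives here as a CMSampleBuffer, ready to encode and stream.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hand the buffer to your encoder/streamer here.
    }
}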
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming over FTP.
If you are only building this for yourself, you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute it to other users, you could use the trick of playing a muted sound every 10 seconds. This is more or less how all the alarm clocks in the App Store work. Here's a tutorial. =)
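As an alternative to the muted-sound trick, there is also a supported API that disables the idle timer directly while your app is frontmost:

import UIKit

// Prevents the device from auto-locking while the app is in the foreground.
UIApplication.shared.isIdleTimerDisabled = true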
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from the video input.
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
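For reference, the playlist file such a server hands out looks roughly like this (segment names and durations are illustrative):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts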
The last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend if you'd like to take a look.
I've been developing an iPhone application that streams audio using Matt Gallagher's audio streamer, found here: GitHub: AudioStreamer
However, I'm having problems when the iPhone loses its internet connection, because the stream cuts out and doesn't reconnect until the user presses the play button again. I've even tried using Apple's Reachability classes to automatically stop and reconnect the stream, but this isn't working 100%.
I've been reading around on the internet and have found something called HTTP Live Streaming that can supposedly be used to stream audio on the iPhone. However, I can't seem to find any examples of how to use it, so could anyone help me with a brief description and any sources that might help get this working, please?
Thanks in advance,
Luke
There's not enough detail for me to answer this entirely, but I use a set of calls to be notified of reachability changes. When I get a failure, I change the play image to stop. I then wait for a notification that the network is back and then programmatically press play for the user; a sketch of that pattern follows below.
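A minimal Swift sketch of that pattern, using the Network framework's NWPathMonitor as a stand-in for the older Reachability sample class (the player calls are placeholders):

import Network

let monitor = NWPathMonitor()
var wasOffline = false

monitor.pathUpdateHandler = { path in
    if path.status == .satisfied {
        if wasOffline {
            wasOffline = false
            // Network is back: programmatically "press play" for the user.
            // player.play()
        }
    } else {
        wasOffline = true
        // Network lost: stop the stream and swap the play image to stop.
        // player.stop()
    }
}
monitor.start(queue: .main)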
I am trying to get the duration of a video taken with the camera using UIImagePickerController on the iPhone. Has anyone found a solution to this?
Thanks
Daniel
You can now do this using AVFoundation: turn your movie into an AVAsset and then check its duration property.
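A quick Swift sketch of that approach, pulling the recorded file's URL out of the picker's info dictionary:

import AVFoundation

// Ask AVAsset for the duration of a recorded video file.
func duration(ofVideoAt url: URL) -> TimeInterval {
    let asset = AVAsset(url: url)
    return CMTimeGetSeconds(asset.duration)   // CMTime -> seconds
}

// In the UIImagePickerControllerDelegate callback:
// if let url = info[.mediaURL] as? URL {
//     print("Duration: \(duration(ofVideoAt: url)) s")
// }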
Oh, why, hello, hopelessly obsolete answer. I'm afraid you're only left here as historical evidence that yes, before iOS 4, if you wanted to do anything remotely interesting with a recorded video (besides playing it), you had to implement the processing yourself.
I don't know of any framework function to do this, so I'm afraid you'll have to parse the video container yourself (which, by the way, is QuickTime/.mov) to extract this information. It's not as if the format is undocumented. Luckily, since the provider is known, you can trust all the info to be truthful, which you can't assume of random videos found on the web.