Because audio streaming with queues and so on (using Apple's SpeakHere project as an example) is far too complicated for my brain to comprehend, I was thinking about playing streaming music a different way. I have a device that acts as a "radio station": other devices connect to it via Wi-Fi and "listen in". So I've implemented CocoaHTTPServer, and what I do is take an MPMediaItem from the iTunes library and put it into the public web folder. I then pass the song URL to the other nearby devices using GameKit. This works great: all of my devices can successfully play the song at the URL of the broadcasting device. My problem is that the stream always starts at the beginning. So if the "radio host device" is 30 seconds into a song and then someone connects, the new connection starts the song at the beginning.
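The copying step looks roughly like this (a sketch of what I'm doing; the helper name and output file name are my own, and DRM-protected items return a nil assetURL, so they can't be exported this way):

```swift
import MediaPlayer
import AVFoundation

// Sketch: export a library item into the folder CocoaHTTPServer serves.
// `webFolder` is assumed to be the server's document root.
func exportToWebFolder(item: MPMediaItem, webFolder: URL,
                       completion: @escaping (URL?) -> Void) {
    guard let assetURL = item.assetURL else {
        completion(nil)   // DRM-protected or cloud-only items have no assetURL
        return
    }
    let asset = AVURLAsset(url: assetURL)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetAppleM4A) else {
        completion(nil)
        return
    }
    let destination = webFolder.appendingPathComponent("current.m4a")
    try? FileManager.default.removeItem(at: destination)
    export.outputFileType = .m4a
    export.outputURL = destination
    export.exportAsynchronously {
        completion(export.status == .completed ? destination : nil)
    }
}
```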
So my question is: if I continually pass the current playback time from the host device to the other devices, then when a new device connects to the "stream", is there a way, on an audio stream played with AVPlayer, to set the current playback time?
Say the song is 20 seconds in and a new device connects; we tell the new device to start the stream 20 seconds in. Is that possible?
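Something like this is what I'm hoping will work (a sketch; `hostPosition` is the playback time received over GameKit, and seeking an HTTP resource generally requires the server to support byte-range requests, which CocoaHTTPServer does):

```swift
import AVFoundation

// Sketch: join the "stream" partway through.
// songURL      - the track's URL on the broadcasting device
// hostPosition - the host's current playback time, in seconds
func joinStream(songURL: URL, hostPosition: Double) -> AVPlayer {
    let player = AVPlayer(url: songURL)
    let target = CMTime(seconds: hostPosition, preferredTimescale: 600)
    player.seek(to: target) { finished in
        if finished { player.play() }   // start at (roughly) the host's position
    }
    return player
}
```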
So I've got this solution working except for my little problem, and you probably think I'm crazy for doing it this way, but it's easier than figuring out the low-level APIs that Apple has for streaming.
I used Matt Gallagher's AudioStreamer to play an MP3 stream. Check this link:
http://cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html
The article begins with:
This week, I present a sample application that streams and plays an
audio file from a URL on the iPhone or Mac
so it may be helpful.
This post deals with starting playback at any point in the file (startWithOffsetInSecs):
http://www.saygoodnight.com/2009/08/streaming-audio-to-the-iphone-starting-at-an-offset/
I'd like to record what the iPhone is currently outputting. So I'm thinking about recording audio from Apps like Music (iPod), Skype, any Radio Streaming App, Phone, Instacast... I don't want to record my own audio or the mic input.
Is there an official way to do this? How do I do it? It seems like AVAudioRecorder does not allow this, can somebody confirm?
Officially you can't. The audio stream belongs to the app playing it, and to iOS.
The sandbox paradigm means that a resource owned by your app can't be used by another app. "Resource" here means an audio/video stream or file. Exceptions exist when a mediator like the document interaction controller is used.
If you want to do this, you'd have to start by digging into AVFoundation's private methods and finding out if there's a way in there. Needless to say, this wouldn't be salable on the App Store and will probably only be possible on a jailbroken device.
Good Luck.
TL;DR: This is only feasible from time to time, as it's a time-consuming process.
You can record the screen while listening to your songs on Spotify, Music, or whatever music application.
This will generate a video in your Photos app. That video can be converted to MP3 on your computer.
Actually, this is not true. The screen recording will not have the audio from Apple Music at all, as it blocks it. Discord uses this mechanism as well, so you cannot record Discord audio this way either.
I am trying to create streaming-video DVR-like functionality in an app I am developing. I have an HTTP Live Stream that I have successfully gotten to play on the iPad. I want the user to be able to push the "Record" button and begin recording the currently playing video from that point. This video file will be accessible from the app or from the camera roll. Currently, I am using the MPMoviePlayerController object to play the video stream. I do not see any way of accessing the data from the object in Apple's documentation. Here are some thoughts I had on ways of going about this.
1) Somehow access the video data from MPMoviePlayerController, and write this to a file. Or use another type of player object that will allow me to play the video and access the currently playing data.
2) Implement some sort of screen capture recording that gets a video capture of the iPad's screen. This would allow me to record the video in a "screenshot" sort of way.
3) Locate the HTTP Live Streaming video segments where they are stored by MPMoviePlayerController. Presumably they need to be stored somewhere on the iPad for playback. Is there a way of accessing these files?
4) Manually download the stream's video segments over HTTP while streaming the file. This seems like it's not ideal, since the stream would have to be downloaded twice.
5) This could work. Periodically download the video segments to the iPhone. Set up a local HTTP server on the iPhone and serve the videos to the MPMoviePlayerController. This way the video segments could be marked for recording and assembled into a video.
6) I do have control of the streaming server. I could write some server-side code to record the video on the server end, then send the video to the iPad after the fact. I would rather not do this.
Has anyone done any of these things? Ideally the iPhone would just be able to access the video data somehow and easily record it. I would rather not get into options 4, 5, or 6 (above) if I don't have to.
Thanks in advance.
DVR on the device is discouraged, due to the limited space available and other factors like battery life, processing power, cleanup procedures after the user stops the DVR, etc.
If you want to achieve DVR playback on iOS devices (or other devices using HLS), I suggest you keep the video server-side. The live stream is already captured and segmented server-side; all you would have to do is keep the segments a bit longer instead of deleting them. By using the EXT-X-PLAYLIST-TYPE and EXT-X-MEDIA-SEQUENCE tags, you can signal to the player that it's opening a live stream which has DVR (earlier) video available.
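For illustration, a hypothetical EVENT-type playlist (segment names are made up): the server only ever appends segments and omits EXT-X-ENDLIST while the event is live, so the player can seek back through everything from media sequence 0:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:EVENT
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
```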
Alternatively, you can use a server that does this out of the box, for example Wowza. Here's an article on how to achieve this with Wowza.
My iPhone application plays a WMA audio stream over the mms:// protocol.
When the Wi-Fi connection drops, it won't switch to 3G to continue streaming. I have enough buffer to play for another 15 seconds (I tried to increase the buffer size, but it stops anyway).
What sort of mechanism can I implement so that playback won't stop and the iPhone switches from Wi-Fi to 3G?
In other posts I saw that it should do this automatically, but in my case it doesn't, because it's WMA over the MMS protocol.
Thanks!
I may be talking out of my butt here, but you could try detecting the error from the connection when it fails and restarting the stream from your buffered location so you can resume the download. If the Wi-Fi is down, the iPhone should reconnect over 3G to accommodate your new request for the MMS feed.
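In rough Swift terms, the idea would be something like this (a sketch only: `resumeStream` and `bufferedPosition` are placeholders for whatever your MMS player exposes, and NWPathMonitor is just one way to spot the path change):

```swift
import Foundation
import Network

var bufferedPosition: TimeInterval = 0   // updated elsewhere as playback consumes the buffer

// Placeholder for whatever your MMS/WMA player exposes: reopen the
// connection and skip ahead to `position` before resuming playback.
func resumeStream(at position: TimeInterval) {
    // reconnect + seek here
}

// Watch for the network path changing (Wi-Fi dropping, cellular coming up)
// and restart the stream from the buffered position once a path is usable.
let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    if path.status == .satisfied {
        resumeStream(at: bufferedPosition)
    }
}
monitor.start(queue: .main)
```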
I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS app and sends it via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back.
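A rough Swift sketch of that capture setup (the class name and queue label are mine; error handling is elided):

```swift
import AVFoundation

// Sketch of problem 1: getting per-frame sample buffers from the camera.
final class FrameHandler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per frame with the sample buffer mentioned above.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Encode and hand off to the streaming layer here (problems 2 and 3).
    }
}
```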
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming using FTP.
If you are only building this for yourself, you can always turn off the Auto-Lock feature in the settings. If, on the other hand, you would like to distribute this to other users, you could use a trick: play a muted sound every 10 seconds. This is more or less how all the alarm clocks in the App Store work. Here's a tutorial. =)
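For the lock problem specifically, there is also a supported one-liner that keeps the screen awake while your app is frontmost, which avoids the muted-sound trick:

```swift
import UIKit

// Stop iOS from auto-locking while the streaming app is in the foreground.
UIApplication.shared.isIdleTimerDisabled = true
```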
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
Last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.
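In the meantime, here's a sketch of the glue between steps 2 and 3: regenerating the playlist as new segments land in the web server's document root (file names and the 4-second segment length are just my choices, not anything the tooling requires):

```swift
import Foundation

// Sketch: rewrite the sliding-window playlist so connecting players always
// see the most recent segments. Assumes the capture side drops finished
// files (seg0.ts, seg1.ts, ...) into `docRoot`, which the local web server
// (e.g. CocoaHTTPServer) serves.
func writePlaylist(docRoot: URL, firstSequence: Int, segmentCount: Int,
                   targetDuration: Int = 4) throws {
    var lines = ["#EXTM3U",
                 "#EXT-X-VERSION:3",
                 "#EXT-X-TARGETDURATION:\(targetDuration)",
                 "#EXT-X-MEDIA-SEQUENCE:\(firstSequence)"]
    for i in firstSequence..<(firstSequence + segmentCount) {
        lines.append("#EXTINF:\(targetDuration).0,")
        lines.append("seg\(i).ts")
    }
    try lines.joined(separator: "\n")
        .write(to: docRoot.appendingPathComponent("live.m3u8"),
               atomically: true, encoding: .utf8)
}
```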
I want to play a song from one iPhone on another. A user on device A selects a song from his iPod library and plays it, while users on device B can also listen to that song on their own iPhones, which are connected to device A using either GameKit or Wi-Fi.
So is it possible to stream a song from the iPod library to other devices, or not?
This is not possible technically, because you can't get to the audio data -- either as a decoded PCM stream or as the original encoded AAC or MP3 files -- via the MPMusicPlayerController or any other public API. Therefore device A is not in a position to send audio data over the network to device B.
Device B could receive and play streamed audio from the network, of course, since that's what apps like Pandora, Last.fm, and AOL Radio do. But for device A to send the data, it would have to use some source other than its own iPod library.
From the point of view of copyright protection, I don't think this is possible - legally, that is.
It might be possible technically, in that you may be able to send the raw bytes of the song file over wifi or bluetooth and then decode and play them on the second device, but that surely won't get past Apple's reviewers.