iPhone audio streaming - saving state?

How can I save the current state of an audio file that is being streamed by my app? I want to restart the song at the place where the user left it (something like what Pandora does when you return to a song).
Right now I am getting the number of the packet being played, deriving the byte offset from the packet number, and sending it in the Range header field. But this does not work and returns the whole song from the beginning.
I am using AudioFileStream and CFHTTPStream to stream the audio.
Thanks.

That's how we do it (although we use NSURLConnection). It sounds like your server isn't respecting the Range header. Start debugging there -- the problem is either that your server doesn't support the Range header or your client isn't sending it properly.
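For a constant-bitrate stream, the byte offset for the Range header can be derived from the packet index that AudioFileStream reports. The function names and the CBR assumption below are mine; this is a quick sketch of the arithmetic in Python rather than Objective-C:

```python
def packet_to_byte_offset(packet_index, bytes_per_packet, data_offset):
    """Map an audio packet index to an absolute byte offset in the file.

    Works for constant-bitrate streams; VBR streams need the packet
    table that AudioFileStream parses for you.
    """
    return data_offset + packet_index * bytes_per_packet

def range_header(byte_offset):
    """HTTP Range header asking for everything from byte_offset onward."""
    return {"Range": "bytes=%d-" % byte_offset}
```

If the server honors the request it answers 206 Partial Content; a 200 response containing the full file means the header was ignored, which matches the symptom described above.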

This really depends on the capabilities of the streaming server. Most of the common streaming servers, like Shoutcast or Icecast, don't support range requests.

Related

Can I get the bit rate when connecting over RTSP?

Can I get the bit rate when connecting over RTSP? (in Flutter)
My thinking is: if I can connect to the RTSP URL, I will get the streaming data coming in over RTP, and from that I can check the bit rate of the stream. (Is that right?)
The URL I have looks like this:
rtsp://ip.ip.ip.ip:port/video1?key = token
(The RTSP endpoint is always open while the camera is turned on.)
I want to know whether this URL is available and how fast it can stream the video, without actually watching the video.
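Not Flutter-specific, but at the protocol level you can check that the URL is alive without pulling any video by sending an RTSP OPTIONS request over TCP. Measuring the actual bit rate would then mean reading the SDP from a DESCRIBE response or counting received RTP bytes per second. A sketch under those assumptions (host, port, and URL values are placeholders):

```python
import socket

def build_options_request(url, cseq=1):
    # Minimal RTSP OPTIONS request; RTSP is line-oriented like HTTP.
    return ("OPTIONS %s RTSP/1.0\r\n"
            "CSeq: %d\r\n"
            "\r\n" % (url, cseq))

def probe(host, port, url, timeout=3.0):
    # Returns the first response line, e.g. "RTSP/1.0 200 OK", which
    # tells you the endpoint is reachable and speaking RTSP.
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(build_options_request(url).encode("ascii"))
        return sock.recv(4096).decode("ascii", "replace").splitlines()[0]
```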

What protocol to use when sending video to multiple devices simultaneously

I don't know if this is the right place to ask, but I have a question.
I am working on a student project where we want to stream a video from one server to multiple devices, but the video should only play on one display at a time. The other displays should be black, and if you switch to another display, the video should continue there seamlessly. (Please ask if you need clarification.)
The video outputs can be plain displays or displays with additional servers, so a wide range of protocols can be implemented. A Wi-Fi connection is available, and web solutions that run the video in a browser are also possible.
I have thought of a lot of alternatives, like DLNA, RTP, or Chromecast, but I don't know where to start or which is the right solution. It is important that the video streamed from the server can be continued on any display seamlessly.
It would mean a lot to me if you could give me a hint.
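One way to get the "continue seamlessly" behaviour, whatever transport you pick, is to treat a shared clock as the single source of truth: every display can receive the stream, and whichever display becomes active seeks to the position implied by a common start time. A sketch of that arithmetic (the function name and the assumption that the video loops are mine; the machines would need synchronized clocks, e.g. via NTP):

```python
def playback_position(now, stream_start, duration):
    """Offset in seconds a newly activated display should seek to,
    given the shared wall-clock time and the stream's start time."""
    if now < stream_start:
        return 0.0  # stream hasn't started yet
    return (now - stream_start) % duration
```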

Start an audio download halfway through the file

I was wondering if there is a way to start the download of an audio file from a given start point, based on bytes or time. Right now on my website, if I want to start an audio track at the halfway point, I have to wait for it to download past halfway and then skip to that part. Thanks!
Yes, the server has to support byte range requests, and the client has to make the appropriate request to get the bytes it needs. Most browsers already support this natively; Flash does not.
If you're using the native <audio> element and are unable to set the playback position before the entire song has downloaded then that means the server does not accept byte range requests.
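For a constant-bitrate file, the arithmetic for "start at a given time" is simple enough to do by hand when building the request. A sketch (the header-size parameter is my own simplification; VBR files would need a seek table):

```python
def time_to_range_header(seconds, bitrate_kbps, header_bytes=0):
    """Range header value for an approximate seek target in a CBR file.

    128 kbps audio is 16000 bytes per second, so 5 minutes in is
    roughly byte 4,800,000.
    """
    offset = header_bytes + int(seconds * bitrate_kbps * 1000 / 8)
    return "bytes=%d-" % offset
```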
Using SoundManager2, you can pass a from value when creating the sound, which starts playback from that offset (when using Flash, the entire file will still be downloaded):
soundManager.createSound({
url: 'http://www.podtrac.com/pts/redirect.mp3/traffic.libsyn.com/theadamcarollashow/2013.08.07ACS.mp3',
from: 10*60*1000 // position to start playback within a sound (msec): 10 minutes
});

How to continuously stream audio while IceCast server changes streams

Problem:
We are streaming live audio via an Icecast mountpoint. On the server side, when the live show stops, the server reverts to playing a music playlist (the actual mountpoint stays /live). However, when the live stream stops, the audio player stops too; the dev tools show the request as cancelled. The player must be HTML5, so no Flash.
Mountpoint: http://198.154.112.233:8716/
Stream: http://198.154.112.233:8716/live
I've tried:
Listening for the stream to end and telling the player to reconnect. However, none of the events in the jPlayer and MediaElement.js APIs fire when the stream is interrupted.
I am also contacting the server host to ask for advice on dealing with their behind-the-scenes playlist switcher.
I'd like to find a client-side solution to this. Could WebSockets / WebRTC solve the problem by keeping a connection open?
Your problem isn't client-side; it's how you are handling your encoding. No client-side change can properly fix it.
In your stream configuration, the encoder uses files on disk as a backup stream. Unfortunately, it sounds like instead of re-encoding and splicing them (and matching sample rate and channel count if needed), it is just sending the raw file data.
This works some of the time, as MPEG decoders are often tolerant of corrupt streams, and will re-sync. However, sometimes the stream is too broken, and the decoder gives up. The decoder will also often stop if there is a change in sample rate or channel count. (Bitrate changes are generally not a large problem.)
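You can see this on the wire: every MPEG audio frame starts with a header that encodes the sample rate, so a fallback file encoded at a different rate flips that field mid-stream. A small parser for the MPEG-1 case (a sketch; a real stream scanner also needs resync logic and the other MPEG versions):

```python
# Sample-rate table for MPEG version 1 (index 3 is reserved).
MPEG1_RATES = {0: 44100, 1: 48000, 2: 32000}

def mp3_sample_rate(header):
    """Sample rate encoded in a 4-byte MPEG-1 audio frame header, or None.

    A fallback file flipping this value mid-stream is exactly the kind
    of change that makes many decoders give up.
    """
    if len(header) < 4:
        return None
    if header[0] != 0xFF or (header[1] & 0xE0) != 0xE0:
        return None                       # no frame sync (11 set bits)
    if (header[1] >> 3) & 0x03 != 0x03:   # version bits: 3 = MPEG-1
        return None
    return MPEG1_RATES.get((header[2] >> 2) & 0x03)
```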
To fix your problem, you must contact your host.
Yes, this is unfortunately a problem if the playlist and the live stream don't use the same codec and parameters. Additional tools such as Liquidsoap have solved the problem for me, as well as providing many more features:
savonet.sourceforge.net

Realtime Audio/Video Streaming FROM iPhone to another device (Browser, or iPhone)

I'd like to get real-time video from the iPhone to another device (either desktop browser or another iPhone, e.g. point-to-point).
NOTE: It's not one-to-many, just one-to-one at the moment. Audio can be part of the stream or go via a telephone call on the iPhone.
There are four ways I can think of...
1. Capture frames on the iPhone, send the frames to a mediaserver, and have the mediaserver publish realtime video using the host webserver.
2. Capture frames on the iPhone, convert them to images, send the images to an httpserver, and have javascript/AJAX in the browser reload the images from the server as fast as possible.
3. Run an httpServer on the iPhone, capture 1-second-duration movies on the iPhone, create M3U8 files on the iPhone, and have the other user connect directly to the httpServer on the iPhone for liveStreaming.
4. Capture 1-second-duration movies on the iPhone, create M3U8 files on the iPhone, send them to an httpServer, and have the other user connect to the httpServer for liveStreaming. This is a good answer -- has anyone gotten it to work?
Is there a better, more efficient option?
What's the fastest way to get data off the iPhone? Is it ASIHTTPRequest?
Thanks, everyone.
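For options 3 and 4 above, the M3U8 side is just a text playlist that the player re-polls; omitting the #EXT-X-ENDLIST tag is what marks it as live. A minimal generator (segment names and the fixed 1-second duration are placeholders):

```python
def live_playlist(segment_names, target_duration=1, media_sequence=0):
    """Minimal HLS live playlist; no #EXT-X-ENDLIST, so players keep polling."""
    lines = ["#EXTM3U",
             "#EXT-X-VERSION:3",
             "#EXT-X-TARGETDURATION:%d" % target_duration,
             "#EXT-X-MEDIA-SEQUENCE:%d" % media_sequence]
    for name in segment_names:
        lines.append("#EXTINF:%.1f," % float(target_duration))
        lines.append(name)
    return "\n".join(lines) + "\n"
```

As segments roll, you drop the oldest entry and bump the media sequence number so the player knows where the sliding window starts.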
Sending raw frames or individual images will never work well enough for you (because of the amount of data and number of frames). Nor can you reasonably serve anything from the phone (WWAN networks have all sorts of firewalls). You'll need to encode the video, and stream it to a server, most likely over a standard streaming format (RTSP, RTMP). There is an H.264 encoder chip on the iPhone >= 3GS. The problem is that it is not stream oriented. That is, it outputs the metadata required to parse the video last. This leaves you with a few options.
Get the raw data and use FFmpeg to encode on the phone (will use a ton of CPU and battery).
Write your own parser for the H.264/AAC output (very hard)
Record and process in chunks (will add latency equal to the length of the chunks, and drop around 1/4 second of video between each chunk as you start and stop the sessions).
"Record and process in chunks (will add latency equal to the length of the chunks, and drop around 1/4 second of video between each chunk as you start and stop the sessions)."
I have just written such code, and it is quite possible to eliminate that gap by overlapping two AVAssetWriters. Since this approach uses the hardware encoder, I strongly recommend it.
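The overlap trick can be illustrated with the capture windows themselves: each writer misses a fraction of a second while its session starts, and starting the next writer before the previous one stops puts that dead time inside a window that is still recording. A toy model (the ~0.25 s startup latency figure comes from the answer above; the 0.5 s overlap is my assumption):

```python
def coverage_gaps(windows):
    """Seconds of video lost between consecutive (start, stop) capture windows."""
    return [max(0.0, b[0] - a[1]) for a, b in zip(windows, windows[1:])]

# Sequential writers: each new session misses ~0.25 s while starting up.
sequential = [(0.25, 1.0), (1.25, 2.0), (2.25, 3.0)]

# Overlapped writers: the next AVAssetWriter is started 0.5 s before the
# current one stops, so its startup latency is covered by the old window.
overlapped = [(0.25, 1.5), (1.25, 2.5), (2.25, 3.5)]
```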
We have similar needs; to be more specific, we want to implement streaming video & audio between an iOS device and a web UI. The goal is to enable high-quality video discussions between participants using these platforms. We did some research on how to implement this:
We decided to use OpenTok and managed to pretty quickly implement a proof-of-concept style video chat between an iPad and a website using the OpenTok getting started guide. There's also a PhoneGap plugin for OpenTok, which is handy for us as we are not doing native iOS.
Liblinphone also seemed to be a potential solution, but we didn't investigate further.
iDoubs also came up, but again, we felt OpenTok was the most promising one for our needs and thus didn't look at iDoubs in more detail.