I want to upload audio data using chunked transfer encoding. My application reads an audio buffer every 0.5 seconds. As soon as I get data, I want to upload it to the server using chunked encoding. I also want to keep the connection open, so that 0.5 seconds later, when the next buffer is available, I can upload that data on the same connection.
I have tried NSMutableURLRequest with NSURLConnection. But as soon as I start the communication asynchronously, control returns to iOS and my delegates are notified; I want to upload more data using the same NSURLConnection.
I have also tried ASIHTTPRequest, but I can't find a way to supply data periodically and upload it via HTTP chunking.
Please let me know if there is any standard way to upload data using chunked transfer encoding with the iOS APIs.
Thanks in advance.
I would heartily recommend the AFNetworking library. It seems to be the cool kid on the block for async networking on the iPhone. It was written by the developers at Gowalla, and I can vouch for its ease of use, reliability, and speed.
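For what it's worth, the underlying mechanism doesn't require any particular library: Foundation sends the body with `Transfer-Encoding: chunked` whenever a request carries an HTTPBodyStream and no Content-Length header. A minimal sketch using a bound stream pair, assuming ARC and a placeholder URL:

```objc
// A minimal sketch: a bound stream pair keeps the connection open so
// you can write each 0.5 s audio buffer as it arrives. The URL,
// content type, and transfer buffer size are assumptions.
#import <Foundation/Foundation.h>

CFReadStreamRef readStream = NULL;
CFWriteStreamRef writeStream = NULL;
CFStreamCreateBoundPair(kCFAllocatorDefault, &readStream, &writeStream, 64 * 1024);

NSInputStream *bodyStream = (__bridge_transfer NSInputStream *)readStream;   // ARC assumed
NSOutputStream *audioSink = (__bridge_transfer NSOutputStream *)writeStream;
[audioSink open];

NSMutableURLRequest *request =
    [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://example.com/upload"]];
[request setHTTPMethod:@"POST"];
[request setValue:@"audio/x-caf" forHTTPHeaderField:@"Content-Type"];
[request setHTTPBodyStream:bodyStream];   // no Content-Length set, so the body goes out chunked

NSURLConnection *connection =
    [[NSURLConnection alloc] initWithRequest:request delegate:self];

// Then, each time a new 0.5 s buffer is ready:
//   [audioSink write:(const uint8_t *)buffer.bytes maxLength:buffer.length];
// When recording ends, close the sink to terminate the chunked body:
//   [audioSink close];
```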
I have a working POST request that will send a CAF audio file (recorded on the iPhone with AVAudioRecorder) to a web server -- the request itself is structured similarly to "Send an audio FILE and JSON string in iOS".
The server correctly receives the file, but I'm having trouble converting it into a format that will play directly in a browser. In addition to every audio/iPhone/server post I could find on Stack Overflow, I've looked into ffmpeg and HTML5 audio as well, but couldn't find clear instructions on how to convert the received audio binary to a browser-playable format. Essentially, I need to understand how to accomplish Arun's server-side suggestion here.
I know that Audacity, Soundbooth, etc. will let you import the raw data and save it in another format, but I need this to be done programmatically on the server. Any suggestions would be very much appreciated! Thank you.
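As a starting point, here is a hedged sketch of the conversion with ffmpeg, assuming the server's ffmpeg build can read the CAF container (it handles the linear PCM and IMA4 payloads AVAudioRecorder typically writes); the filenames are placeholders:

```sh
# Convert the uploaded CAF to browser-friendly formats (paths assumed).
ffmpeg -i upload.caf -acodec libmp3lame -ab 128k upload.mp3   # MP3: widest <audio> support
ffmpeg -i upload.caf -acodec libvorbis upload.ogg             # Ogg Vorbis: Firefox/Opera
```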
I find that most streaming-audio discussions are about streaming media from an HTTP server, e.g. AudioStreamer from Cocoa with Love or MPMoviePlayerController; both are initialized with an NSURL. My case is different: I use SMB to access media files on a Windows shared server. The media content arrives via SMB messages (through a socket) and is accumulated in memory (NSMutableData).
So is there a way to play that NSMutableData before the download has finished?
Update: for streaming audio, I understand I need Audio Queue Services.
What about streaming video over something other than HTTP? I think it is doable, because there is a free app called TIOD which streams not only audio but also video from an SMB server.
BTW, I never expect others to do the work for me. I checked all the documentation I could find and couldn't see a way to do it (for video). I had thought that might mean it can't be done, but then I found that TIOD can do it. That's why I raised the question in the first place, to see if others have experience with it.
Yea, you can stream that as well; it's the same thing as getting the data from an NSURL. If you look at the audio streaming example by Matt Gallagher here, you'll see that he is getting data from some URL, but ultimately, when he calls the parse function, he is giving it bytes of data. The same thing should apply to your situation: with the data you get, you should be able to call the parse function and have the audio player stream your audio file.
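Concretely, the parser under the hood of that example is Audio File Stream Services, and it doesn't care where the bytes come from. A minimal sketch, where the callback names are placeholders you would implement yourself:

```objc
// Feed SMB-delivered bytes straight into Audio File Stream Services;
// the packets callback then hands parsed packets to an Audio Queue.
#include <AudioToolbox/AudioToolbox.h>

AudioFileStreamID parser;
AudioFileStreamOpen(NULL,                 // client data passed to the callbacks
                    MyPropertyListener,   // hypothetical: watches for kAudioFileStreamProperty_ReadyToProducePackets
                    MyPacketsProc,        // hypothetical: enqueues parsed packets on an AudioQueue
                    0,                    // file type hint; 0 if the format is unknown
                    &parser);

// Each time another SMB chunk lands in your NSMutableData:
AudioFileStreamParseBytes(parser, (UInt32)[chunk length], [chunk bytes], 0);
```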
I have an iOS app where the user has the capability to upload video. I'd like to support resuming an upload when it is interrupted by the network, the user, or any other circumstance. I realize this will require changes on both the client and the server side. Could anyone point me toward sample code and/or documentation I can read for clues on how to support this functionality? Something covering proper chunking, figuring out which chunk was last sent after an interrupted connection, etc.
See ASIHTTPRequest for this. It is a great replacement library for anything network-related.
See the ASIHTTPRequest documentation, or you can download sample code here.
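Note that neither HTTP nor ASIHTTPRequest resumes uploads for you, so the usual pattern is to chunk on the client and let the server reassemble. A rough sketch, in which the chunk size, the resume query, and the upload helper are all assumptions:

```objc
// Split the video into fixed-size pieces; after an interruption, ask
// the server for the last index it stored and continue from there.
#import <Foundation/Foundation.h>

static const NSUInteger kChunkSize = 512 * 1024;   // assumed chunk size

NSFileHandle *file = [NSFileHandle fileHandleForReadingAtPath:videoPath];
NSUInteger index = nextIndexFromServer;            // hypothetical resume query
[file seekToFileOffset:(unsigned long long)index * kChunkSize];

NSData *chunk;
while ((chunk = [file readDataOfLength:kChunkSize]).length > 0) {
    // POST `chunk` plus `index` (e.g. as multipart fields); advance
    // only after the server acknowledges this index.
    uploadChunk(chunk, index);                     // hypothetical helper
    index++;
}
[file closeFile];
```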
How can I upload a photo to a server with the iPhone?
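One common approach is a hand-built multipart/form-data POST. A minimal sketch, where the URL, form field name, and image variable are placeholders:

```objc
// Build a multipart body around the JPEG data and POST it.
#import <UIKit/UIKit.h>

NSString *boundary = @"----PhotoUploadBoundary";
NSMutableURLRequest *request =
    [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://example.com/photos"]];
[request setHTTPMethod:@"POST"];
[request setValue:[NSString stringWithFormat:@"multipart/form-data; boundary=%@", boundary]
  forHTTPHeaderField:@"Content-Type"];

NSMutableData *body = [NSMutableData data];
[body appendData:[[NSString stringWithFormat:@"--%@\r\n", boundary]
    dataUsingEncoding:NSUTF8StringEncoding]];
[body appendData:[@"Content-Disposition: form-data; name=\"photo\"; filename=\"photo.jpg\"\r\n"
    dataUsingEncoding:NSUTF8StringEncoding]];
[body appendData:[@"Content-Type: image/jpeg\r\n\r\n" dataUsingEncoding:NSUTF8StringEncoding]];
[body appendData:UIImageJPEGRepresentation(image, 0.8)];   // `image` is your UIImage
[body appendData:[[NSString stringWithFormat:@"\r\n--%@--\r\n", boundary]
    dataUsingEncoding:NSUTF8StringEncoding]];
[request setHTTPBody:body];

[NSURLConnection connectionWithRequest:request delegate:self];
```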
I want to be able to live-stream the frames/video FROM the iPhone camera to the internet. I've seen in a thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration, but I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, and that is not actually what I am searching for.
Could you guys give me any ideas which could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save it to a video file; see the AV Foundation Guide on how to do that. Once saved, you can then use the HTTP Live Streaming segmenter to generate the proper segments. Apple has applications for Mac OS X, but there's an open source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments; there are lots of HTTP servers out there you could adapt.
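For that non-live path, a minimal capture-to-file sketch (the filename is a placeholder, error handling is omitted, and the recording delegate must implement captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error:):

```objc
// Capture camera output to a QuickTime file that a segmenter can
// later cut into HTTP Live Streaming segments.
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
[session addInput:input];

AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];
[session addOutput:output];
[session startRunning];

NSURL *fileURL = [NSURL fileURLWithPath:[NSTemporaryDirectory()
    stringByAppendingPathComponent:@"capture.mov"]];
[output startRecordingToOutputFileURL:fileURL recordingDelegate:self];
// Later: [output stopRecording]; then run the file through the segmenter.
```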
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264; for that you want ffmpeg. Basically you shove the images into ffmpeg's AVPicture, making a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live H.264 source. I'm not sure how to do that, and it sounds like some serious work. Once you've done that, you need to have an HTTP server serving that stream.
What might actually be easier would be to use an RTP/RTSP-based stream instead. That approach is covered by open source RTP implementations, and ffmpeg supports it fully. It's not HTTP Live Streaming, but it will work well enough.
I want to develop an iPhone app that downloads data (say, audio clips) from a specified server and stores it locally on the device.
Then the app should use the data stored on the device rather than stream it from the server.
Could anybody give me guidelines on how this can be done? Tutorials and samples are also appreciated. Thanks :)
The easiest way to play files from the internet is to use -[AVAudioPlayer initWithContentsOfURL:error:]. If you want to make sure that the whole file is downloaded, I think your best bet would be to download the file using NSURLConnection (see the URL Loading Guide) and then using -[AVAudioPlayer initWithData:error:].
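A hedged sketch of that flow, with a placeholder URL and a synchronous fetch for brevity (use NSURLConnection's asynchronous delegate callbacks in a real app):

```objc
// Download the clip once, keep it in Documents, then play the local copy.
#import <AVFoundation/AVFoundation.h>

NSURL *remote = [NSURL URLWithString:@"http://example.com/clip.m4a"];
NSData *audio = [NSData dataWithContentsOfURL:remote];   // synchronous; for brevity only

NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                      NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [docs stringByAppendingPathComponent:@"clip.m4a"];
[audio writeToFile:path atomically:YES];

NSError *error = nil;
AVAudioPlayer *player =
    [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:&error];
[player prepareToPlay];
[player play];
```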
Look into ASIHTTPRequest; you will find it much easier to fetch large chunks of binary data over the web asynchronously than if you try to code everything yourself.
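If you go the ASIHTTPRequest route, an asynchronous fetch that streams straight to disk is only a few lines; the URL and destination path below are placeholders:

```objc
// ASIHTTPRequest asynchronous download; the delegate receives
// requestFinished:/requestFailed: callbacks when it completes.
#import "ASIHTTPRequest.h"

NSURL *url = [NSURL URLWithString:@"http://example.com/clips/clip1.m4a"];
ASIHTTPRequest *request = [ASIHTTPRequest requestWithURL:url];
[request setDelegate:self];
[request setDownloadDestinationPath:localPath];   // write to disk as it arrives; localPath assumed
[request startAsynchronous];
```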