Is it possible to send audio from an iPhone to a server in real time? I am using AQRecorder, which records audio and saves it to a file. I was wondering if there is a way to send the recorded audio to a server in real time, i.e. while it is still being recorded. Is that possible? Can we redirect the recorded stream to the server instead of a file?
NOTE: I am using the SpeakHere sample code from Apple: https://developer.apple.com/library/ios/samplecode/speakhere/Listings/Classes_AQRecorder_mm.html
Nate is right. A lot of legitimate apps do that. So, you don't even need to have a jailbroken device.
Take a look at this question: Live streaming audio from iPhone
You may be interested to look into open source SIP clients for iOS:
https://code.google.com/p/siphon/
http://www.pjsip.org
http://www.linphone.org/eng/linphone/news/linphone-for-iphone.html
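If you'd rather keep the SpeakHere code and simply redirect its output, the key change is in the Audio Queue input callback: where AQRecorder::MyInputBufferHandler hands each filled buffer to AudioFileWritePackets, write the bytes to a socket instead. A minimal Swift sketch of that idea, assuming a hypothetical host/port and sending the raw, uncompressed buffer contents (a real app would add a codec and reconnect logic):

    import AudioToolbox
    import Foundation

    // Owns a TCP output stream to the (hypothetical) ingest server.
    final class AudioUploader {
        private let output: OutputStream

        init?(host: String, port: Int) {
            var input: InputStream?
            var out: OutputStream?
            Stream.getStreamsToHost(withName: host, port: port,
                                    inputStream: &input, outputStream: &out)
            guard let stream = out else { return nil }
            stream.open()
            output = stream
        }

        // Send one filled AudioQueue buffer's bytes down the socket.
        func send(_ buffer: AudioQueueBufferRef) {
            let bytes = buffer.pointee.mAudioData.assumingMemoryBound(to: UInt8.self)
            _ = output.write(bytes, maxLength: Int(buffer.pointee.mAudioDataByteSize))
        }
    }

    // Input callback: forward the bytes to the network instead of a file,
    // then re-enqueue the buffer so recording continues uninterrupted.
    let networkInputCallback: AudioQueueInputCallback = { userData, queue, buffer, _, _, _ in
        let uploader = Unmanaged<AudioUploader>.fromOpaque(userData!).takeUnretainedValue()
        uploader.send(buffer)
        AudioQueueEnqueueBuffer(queue, buffer, 0, nil)
    }

The server then just reads the raw stream; its format is whatever AudioStreamBasicDescription you configured the queue with.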
I'd like to get real-time video from the iPhone to another device (either desktop browser or another iPhone, e.g. point-to-point).
NOTE: It's not one-to-many, just one-to-one at the moment. Audio can be part of the stream or carried via a telephone call on the iPhone.
There are four ways I can think of...
1. Capture frames on the iPhone, send the frames to a media server, and have the media server publish real-time video via the host web server.
2. Capture frames on the iPhone, convert them to images, send the images to an HTTP server, and have JavaScript/AJAX in the browser reload the images from the server as fast as possible.
3. Run an HTTP server on the iPhone, capture 1-second-duration movies on the iPhone, create M3U8 files on the iPhone, and have the other user connect directly to the HTTP server on the iPhone for live streaming.
4. Capture 1-second-duration movies on the iPhone, create M3U8 files on the iPhone, send them to an HTTP server, and have the other user connect to that HTTP server for live streaming. (This one seems promising; has anyone gotten it to work?)
Is there a better, more efficient option?
What's the fastest way to get data off the iPhone? Is it ASIHTTPRequest?
Thanks, everyone.
Sending raw frames or individual images will never work well enough for you (because of the amount of data and number of frames). Nor can you reasonably serve anything from the phone (WWAN networks have all sorts of firewalls). You'll need to encode the video, and stream it to a server, most likely over a standard streaming format (RTSP, RTMP). There is an H.264 encoder chip on the iPhone >= 3GS. The problem is that it is not stream oriented. That is, it outputs the metadata required to parse the video last. This leaves you with a few options.
Get the raw data and use FFmpeg to encode on the phone (will use a ton of CPU and battery).
Write your own parser for the H.264/AAC output (very hard).
Record and process in chunks (will add latency equal to the length of the chunks, and drop around 1/4 second of video between each chunk as you start and stop the sessions).
"Record and process in chunks (will add latency equal to the length of the chunks, and drop around 1/4 second of video between each chunk as you start and stop the sessions)."
I have just written such code, and it is quite possible to eliminate that gap by overlapping two AVAssetWriters. Since this approach uses the hardware encoder, I strongly recommend it.
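To make the overlap idea concrete, here is a rough Swift sketch of rotating between two AVAssetWriters; the class and the chunk-upload hook are hypothetical, not the answerer's actual code:

    import AVFoundation

    // Chunked recording with overlapping writers: the next chunk's writer starts
    // accepting frames before the previous chunk is finalized, so no frames drop.
    final class ChunkedRecorder {
        private var activeWriter: AVAssetWriter?
        private var activeInput: AVAssetWriterInput?
        private var chunkIndex = 0

        private func makeWriter() throws -> (AVAssetWriter, AVAssetWriterInput) {
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("chunk\(chunkIndex).mp4")
            try? FileManager.default.removeItem(at: url)
            let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
            let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
                AVVideoCodecKey: AVVideoCodecType.h264,   // hardware encoder
                AVVideoWidthKey: 640,
                AVVideoHeightKey: 480,
            ])
            input.expectsMediaDataInRealTime = true
            writer.add(input)
            return (writer, input)
        }

        // Feed every frame from the capture callback through here.
        func append(_ sampleBuffer: CMSampleBuffer) throws {
            if activeWriter == nil {
                let (writer, input) = try makeWriter()
                guard writer.startWriting() else {
                    throw writer.error ?? CocoaError(.fileWriteUnknown)
                }
                writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
                activeWriter = writer
                activeInput = input
            }
            if activeInput?.isReadyForMoreMediaData == true {
                activeInput?.append(sampleBuffer)
            }
        }

        // Rotate chunks: start the new writer *before* finishing the old one.
        func rotate(firstBufferOfNextChunk: CMSampleBuffer) throws {
            let finishingWriter = activeWriter
            let finishingInput = activeInput
            activeWriter = nil
            activeInput = nil
            chunkIndex += 1
            try append(firstBufferOfNextChunk)   // new chunk is live immediately
            finishingInput?.markAsFinished()
            finishingWriter?.finishWriting {
                // previous chunk is now a complete .mp4; upload it here
            }
        }
    }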
We have similar needs; to be more specific, we want to implement streaming video & audio between an iOS device and a web UI. The goal is to enable high-quality video discussions between participants using these platforms. We did some research on how to implement this:
We decided to use OpenTok and managed to pretty quickly implement a proof-of-concept style video chat between an iPad and a website using the OpenTok getting started guide. There's also a PhoneGap plugin for OpenTok, which is handy for us as we are not doing native iOS.
Liblinphone also seemed to be a potential solution, but we didn't investigate further.
iDoubs also came up, but again, we felt OpenTok was the most promising one for our needs and thus didn't look at iDoubs in more detail.
I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS app and sends it via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out the example at the end, as you will get back a CMSampleBufferRef data object.
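For reference, step 1 comes down to very little code; a bare Swift sketch of the capture side (session configuration and error handling omitted):

    import AVFoundation

    // Minimal capture setup: an AVCaptureSession with a video data output whose
    // delegate receives each frame as a CMSampleBuffer.
    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private let queue = DispatchQueue(label: "camera.frames")

        func start() throws {
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: camera))
            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: queue)
            session.addOutput(output)
            session.startRunning()
        }

        // Called once per captured frame; this is where steps 2 and 3 hook in.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // hand sampleBuffer to the encoder, then to the streamer
        }
    }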
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming over FTP.
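CFFTPStream is a plain C API (it has since been deprecated, so treat this as historical); the open-and-write flow looks roughly like this, with a hypothetical URL and credentials:

    import CFNetwork
    import Foundation

    // Hypothetical upload target; CFWriteStreamCreateWithFTPURL handles the FTP handshake.
    let url = URL(string: "ftp://user:pass@example.com/live.raw")! as CFURL
    let ftpStream = CFWriteStreamCreateWithFTPURL(nil, url).takeRetainedValue()
    CFWriteStreamOpen(ftpStream)

    // For each encoded chunk: write the raw bytes to the stream.
    func upload(_ chunk: Data) {
        chunk.withUnsafeBytes { raw in
            let bytes = raw.bindMemory(to: UInt8.self).baseAddress!
            _ = CFWriteStreamWrite(ftpStream, bytes, chunk.count)
        }
    }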
If you are only building this for yourself, you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute it to other users, you could use a trick: play a muted sound every 10 seconds. This is more or less how all the alarm-clock apps in the App Store work. Here's a tutorial. =)
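For what it's worth, there is also a documented switch for this, which may spare you the muted-sound trick:

    import UIKit

    // Keep the device awake while the capture UI is in the foreground.
    UIApplication.shared.isIdleTimerDisabled = true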
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
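For step 2, the playlist the segments get listed in is plain text. A typical M3U8 for HTTP Live Streaming looks like this (segment names hypothetical):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment0.ts
    #EXTINF:10.0,
    segment1.ts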
Last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.
So, I recently submitted my first iPhone app to Apple.
I did not stream my videos and they are over 10 minutes long, so my app was denied because I did not use HTTP Live Streaming.
So, we stream live videos every week. Those files are stored somewhere, but I am a little unsure of where. I want the video files that I made a feed for to be converted into streamed videos, but I don't want to use Apple's HTTP Live Streaming software, and I don't know how to code streaming video myself.
Is there any way to figure out where my streamed files are being stored, or is there software that will convert videos into streamed video? I'll take any suggestions.
Thanks
The main problem is that you must use HTTP Live Streaming if you want your app to be approved, and you must also be aware of Apple's restrictions (you must provide multiple bitrates, including one of 64 kbps or lower).
If you don't want to use Apple's tools, you can use ffmpeg. Take a look at ioncannon.net: http://www.ioncannon.net/programming/452/iphone-http-streaming-with-ffmpeg-and-an-open-source-segmenter/
With Apple's tools it's easier: you just need mediafilesegmenter/mediastreamsegmenter.
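The usual invocations look roughly like this; the flags are from memory, so double-check them against each tool's usage output:

    # ffmpeg's HLS muxer: segment an already H.264/AAC-encoded file in one step
    ffmpeg -i input.mp4 -codec copy -f hls -hls_time 10 stream.m3u8

    # Apple's segmenter: produce .ts chunks plus an .m3u8 playlist
    mediafilesegmenter -t 10 -f /var/www/stream input.mp4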
There are also professional services out there, though not free, that will take care of the whole process.
If you don't know where your files are, maybe you can use a sniffer and check what your computer is "listening to".
The easiest solution is to simply require that your users be on WiFi in order to watch the videos. The 10 min. / 5MB restriction only applies to video that is sent over Cellular networks, not WiFi. See Apple's "Reachability" code for an example of how to test the user's network connection at run-time.
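A compact version of that check using SystemConfiguration directly (the hostname is a placeholder):

    import SystemConfiguration

    // Reachable, and not via the cellular (WWAN) interface => treat as WiFi.
    func isOnWiFi(host: String = "www.example.com") -> Bool {
        guard let target = SCNetworkReachabilityCreateWithName(nil, host) else { return false }
        var flags = SCNetworkReachabilityFlags()
        guard SCNetworkReachabilityGetFlags(target, &flags) else { return false }
        return flags.contains(.reachable) && !flags.contains(.isWWAN)
    }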
I'm looking to build a little toy app that is very similar to a VoIP application: one person would hold one iPhone and talk to the other. I don't want to use GameKit because it forces a P2P connection and does not work over 3G. I'll worry about the server side of this later; I just want to get started with the iPhone side of it. Which API should I use to record audio in real time, and which to play it back?
Look to Audio Queue Services for capture and recording. You'd need to come up with a wire protocol to transmit the audio, but the tools to capture, play back, or save on either side of the connection can be built using the queue services.
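The wire protocol can be as simple as a length-prefixed frame that both sides share. A made-up framing, just to illustrate:

    import Foundation

    // Made-up framing: 4-byte big-endian length prefix, then the raw audio bytes.
    // The recorder wraps each filled AudioQueue buffer this way; the player reads
    // one frame at a time and enqueues the payload into an output AudioQueue.
    func frame(_ audio: Data) -> Data {
        var length = UInt32(audio.count).bigEndian
        var packet = Data(bytes: &length, count: 4)
        packet.append(audio)
        return packet
    }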
I have seen plenty of articles and SO questions about streaming TO an iPhone app, but my question is the reverse, that is, streaming FROM an iPhone app.
I have audio content in an iPhone app that I want to stream to a browser. The idea is that the browser connects to a server running on the iPhone, the server on the iPhone serves up the audio, and the browser plays the endless stream.
I already have seamless looping content on the phone with AudioQueue. I already know how to setup a server running on the phone with CocoaHTTPServer. Is there a third piece that can make the AudioQueue (or a FileStream) stream to a browser connected to the internal iPhone server?
Anybody have any thoughts on how to implement this?
Well, there are a few good open source projects to dissect, port, or imitate for this. What I would suggest is looking at how Icecast and streamTranscoderv3 operate together. The latter will take an audio source and send it to an Icecast server as a source. Port parts of both and run them locally on the iPhone and you'd have a solution. I imagine that Bonjour could be used so that other systems on the LAN could find and listen to the iPhone.
Or send the streamTranscoder output to an Icecast server elsewhere and make it available for the world.
Unfortunately, neither project is over-engineered: the code isn't super modular, but it is comprehensible and modestly cross-platform.
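If you do go the Icecast route, the source-client handshake that streamTranscoder performs is a short text exchange over TCP; roughly like this, with a hypothetical mount point and password (details vary by Icecast version):

    // Icecast 2 source handshake; raw encoded audio follows on the same socket.
    let credentials = Data("source:hackme".utf8).base64EncodedString()
    let handshake =
        "SOURCE /iphone.mp3 HTTP/1.0\r\n" +
        "Authorization: Basic \(credentials)\r\n" +
        "Content-Type: audio/mpeg\r\n" +
        "\r\n"
    // write `handshake` to the server's source port (typically 8000),
    // then stream the encoded MP3/AAC bytes continuously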