I'm trying to use VLC as an RTSP client. The RTSP server is based on libstreaming's example 1. I'm using a Samsung Galaxy S6.
Is it possible to stream at 1080p resolution? If so, what is the RTSP URL format?
Yes, it is possible: use MediaRecorder instead of MediaCodec.
The maximum resolution with the MediaCodec path is 640x480.
libstreaming exposes this mode as:
public static final byte MODE_MEDIARECORDER_API = 0x01;
1080p works for me on a Meizu MX4, but with huge latency and occasional MPEG artifacts.
I tested this on many devices, and the result depends on the device model.
The URL format does not depend on the video resolution.
To set the resolution, use the .setVideoQuality() method as described in the README here: https://github.com/fyhertz/libstreaming
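A minimal sketch of a 1080p session, following the SessionBuilder pattern from the libstreaming README; mSurfaceView, the 8 Mbps bitrate, and the explicit MediaRecorder-mode call are illustrative assumptions, not values from this thread:

// Inside an Activity; assumes libstreaming is on the classpath.
Session session = SessionBuilder.getInstance()
        .setContext(getApplicationContext())
        .setSurfaceView(mSurfaceView)
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        .setVideoEncoder(SessionBuilder.VIDEO_H264)
        // 1920x1080 at 30 fps, ~8 Mbps -- adjust for your device
        .setVideoQuality(new VideoQuality(1920, 1080, 30, 8000000))
        .build();
// Force the MediaRecorder backend instead of MediaCodec.
session.getVideoTrack().setStreamingMethod(MediaStream.MODE_MEDIARECORDER_API);

VideoQuality takes width, height, frame rate, and bitrate in bits per second; 8 Mbps is a reasonable starting point for 1080p at 30 fps.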
I have an HDMI source connected to a Chinese HD HDMI encoder box. Playback in VLC on my PC works (Open Network Stream, http://192.168.0.150:80/hdmi).
The stream is NOT leaving my local network (on purpose).
I cannot get a signal to display on my Google Nexus Player or my NVIDIA Shield via the Cumulus TV app (the point being to integrate the feed into the Google Live Channels app). I have tried adjusting several of the settings to no avail. Should I be trying a specific format? I tried Fiddler, but didn't see anything descriptive in that tool, so I still have no definitive answers. I am pretty sure this device only produces an H.264 bitstream, which works in the PC version of VLC, but I have no luck on my Android TV devices (including VLC there as well). I can, however, get playback on my Android phone in VLC...
Seeking help/troubleshooting advice...
The main stream settings are:
H.264 level: High profile
Encoding frame rate: 30 [5-30]
Bitrate control: VBR
Key interval: 30 [5-200]
Encoded size: auto
MinQp: 3 [1-51]
MaxQp: 32 [MinQp-51]
Max bitrate: 8000 [16-12000]
Audio bitrate: 192000
Audio channel: L+R
Audio codec: AAC
Resample: Disable
Package: B
HTTP: Enable, /hdmi (begins with "/")
HTTP port: 80 [1-65535]
Change TS ID: Disable
transport_stream_id: 300 [256-3800]
pmt_start_pid: 480 [256-3800]
stream_start_pid: 481 [256-3800]
RTSP: Disable
Multicast IP: Disable
RTMP server IP: Disable
ONVIF: Disable
It looks like your encoder can stream three different formats:
HTTP (probably HLS)
RTMP
RTP/RTSP
Now the question is which of these formats your clients support.
You could install Fiddler (a web debugging proxy) on your PC to verify that your streaming box actually serves HLS; a quick probe like the sketch below works too.
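As an alternative to a proxy, here is a minimal Java sketch that fetches the first bytes of the stream URL from the question: an HLS playlist begins with the text #EXTM3U, while raw MPEG-TS over HTTP begins with the sync byte 0x47.

import java.io.InputStream;
import java.net.URL;

public class StreamProbe {
    public static void main(String[] args) throws Exception {
        // Read one MPEG-TS packet's worth of data (188 bytes).
        try (InputStream in = new URL("http://192.168.0.150:80/hdmi").openStream()) {
            byte[] head = new byte[188];
            int n = in.read(head);
            String text = new String(head, 0, Math.max(n, 0), "US-ASCII");
            if (text.startsWith("#EXTM3U")) {
                System.out.println("Looks like an HLS playlist (M3U8).");
            } else if (n > 0 && (head[0] & 0xFF) == 0x47) {
                System.out.println("Looks like raw MPEG-TS over HTTP.");
            } else {
                System.out.println("Unrecognized stream header.");
            }
        }
    }
}

If it turns out to be raw MPEG-TS rather than HLS, that would explain why players expecting a playlist fail while VLC, which demuxes TS directly, succeeds.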
Since you know that VLC plays your stream, you could try installing VLC on your Google Nexus Player: https://play.google.com/store/apps/details?id=org.videolan.vlc
I am seeking the following three items, which I cannot find on Stack Overflow or anywhere else:
1. Sample code for AVFoundation capture to file chunks (~10 seconds) that are ready for compression.
2. Sample code for compressing the video and audio for transmission across the Internet. ffmpeg?
3. Sample code for HTTP Live Streaming, sending files from the iPhone to an Internet server.
My goal is to use the iPhone as a high-quality AV camcorder that streams to a remote server.
If the intervening data rate bogs down, files should buffer on the iPhone.
Thanks.
You can use AVAssetWriter to encode an MP4 file of your desired length; the AV media will be encoded into the container as H.264/AAC. You could then simply upload this to a remote server.
If you wanted, you could segment the video for HLS streaming, but keep in mind that HLS is designed as a server-to-client streaming protocol; there is no notion of push as far as I know. You would have to create a custom server to accept pushing of segmented video streams, which does not make a lot of sense given the way HLS is designed (see the RFC draft).
A better approach might be to simply upload the MP4(s) via a TCP socket and have your server segment the video for streaming to client viewers. This could be done easily with FFmpeg, either on the command line or via a custom program; see the sketch below.
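For the server side, here is a minimal sketch of driving FFmpeg from Java to segment an uploaded MP4 into an HLS playlist; the file names and segment length are assumptions, and ffmpeg must be on the PATH:

import java.io.IOException;

public class HlsSegmenter {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Copy the existing H.264/AAC tracks into ~10 s MPEG-TS segments
        // plus an index.m3u8 playlist; no re-encoding is performed.
        Process p = new ProcessBuilder(
                "ffmpeg", "-i", "upload.mp4",
                "-c", "copy",
                "-f", "hls",
                "-hls_time", "10",
                "-hls_list_size", "0",
                "index.m3u8")
                .inheritIO()
                .start();
        System.exit(p.waitFor());
    }
}

Clients can then pull index.m3u8 over plain HTTP, which is exactly the server-to-client direction HLS was designed for.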
I also wanted to add that if you try to stream 720p video over a cellular connection, your app will more than likely get rejected for excessive data use.
Capture video and audio using AVFoundation. You can specify the audio and video codecs (kCMVideoCodecType_H264 and kAudioFormatMPEG4AAC), frame sizes, and frame rates in the capture format description. That will give you compressed H.264 video and AAC audio.
Encapsulate this and transmit it to the server using an RTP stack such as Live555.
I'm interested in setting up a streaming video server (perhaps on a CloudFront server) with Video.js. I understand that Flash video can be streamed; however, is it possible to stream video using Video.js and a different codec (like H.264)? I looked through the Video.js documentation and forums but did not find any additional info.
Video.js has limited support for RTMP streaming in Flash, but hopefully more is coming in the next few months.
HTTP Live Streaming (HLS) is the most widely supported streaming format for HTML5 (iOS, Safari, the latest Android). Video.js can support it on the devices that support HLS natively.
I think you would have to transcode the H.264 file on the fly to get the effect you want. Subsonic is a program that reads your file structure, displays your videos and music in a web UI, transcodes the audio/video, and streams it, but it uses JW Player, not Video.js.
However, it is open source, so if you want to try to modify it, I'm sure that would be possible.
Does the ALAC format support live streaming on the iPhone? If ALAC-recorded audio is streamed to a server machine, will I be able to play back the audio chunks? Does the ALAC format support that?
Thank you.
Assuming you mean "Apple Lossless" audio...
I don't see why it wouldn't, but I don't know the details. You'll probably need to embed it in an MPEG transport stream instead of an MPEG-4 container (but then, I don't know how HTTP Live Streaming works either).
I don't think streaming lossless audio is sensible, though.
Streaming lossless audio is possible: we stream FLAC using Icecast, and it works beautifully. However, we are not using HTTP Live Streaming (HLS) to do it. We stream FLAC from the source generator to a number of servers, and they create HLS streams from there.
It is technically possible to mux ALAC into MPEG-TS (ffmpeg can do this) as well as play it back (using ffmpeg), but there isn't a format identifier for other clients. Adding this feature to HLS would be as easy as calling/writing Apple and asking them to add ALAC to this list:
http://www.smpte-ra.org/mpegreg/mpegreg.html
and to update their products accordingly. If you've purchased an Apple product less than 90 days ago, or you have AppleCare, give them a call; they have to work on the issue for you if you are covered. The more requests that get escalated to their engineers, the more likely they are to add support for ALAC in HLS.
Not sure how it works, but if you look at the source files at http://github.com/dropcam/dropcam_for_iphone you can see that they are using a combination of open source projects like FFmpeg, Live555, JSON, etc. Using Wireshark to sniff the packets sent from one of the public cameras that's viewable with the free "Dropcam For iPhone" app in the App Store, I was able to confirm that the iPhone was receiving H.264 video via RTP/RTSP/RTCP, and even RTMPT, which looks like maybe some of the stream is tunneled.
Maybe someone could take a look at the open source files and explain how they got RTSP to work on the iPhone.
Thanks for the info, TinC0ils. After digging a little deeper, I've read that they modified the Axis camera with custom firmware to limit the streaming to a single 320x240 H.264 feed, to better provide consistent-quality video over different networks and, as you point out, to be less of a draw on the phone's hardware, etc. My interest was driven by a desire to use my iPhone to view live video and audio from a couple of IP cameras that I own, without the jerkiness of MJPEG or the inherent latency of HTTP Live Streaming. I think Dropcam has done an excellent job with their hardware/software combo; I just don't need any new hardware at the moment.
Oh yeah, I almost forgot the reason for this post: the RTSP protocol DOES WORK ON THE IPHONE!
They are using open source projects to receive the frames and decode them in software instead of using hardware decoders. This works, but it runs counter to Apple's requirement that you use their HTTP streaming. It also requires greater CPU resources, such that it may not decode video at the desired fps/resolution on older devices, and/or it will decrease battery life compared to HTTP streaming.