I have C# code that encodes a video file to raw H.264 and streams it over UDP. When I receive that stream with omxplayer, it plays for about 30 seconds, then stops and displays 'have a nice day'. However, I can receive the same stream with ffplay and VLC (on Windows) without any problem.
I then tried streaming with VLC instead, and got the same issue: the stream stopped and omxplayer displayed 'have a nice day'. If anyone can help me figure this out, I would be very grateful.
Command used to receive the stream in omxplayer:
omxplayer udp://224.1.1.1:1234
I have attached a screenshot of omxplayer while it is receiving the stream. According to it, omxplayer's buffer drops to 0 after playing for some time. I tried changing the stream URL as given below:
omxplayer -s udp://224.1.1.1:1234?overrun_nonfatal=1
That fixed the stopping problem, but the stream no longer ran smoothly. Can you suggest a better solution?
Thank You!
So I have been trying for an (embarrassingly) long time to efficiently stream video from my Pi to my computer over the internet. Video encoding and streaming still look fuzzy to me, since most examples hide the things that happen behind the scenes.
Things I've tried:
OpenCV capture, encode as JPEG, send through a socket and decode - I get a working video feed, but data usage is crazy.
using the Picamera module, capture and send over a socket, receive and pipe into VLC - this also worked, data usage is low, but latency is even higher than with the first approach. Also, I want to display the video in my own PyQt UI
using the Picamera module, capture and send over a socket, pipe to FFmpeg, and try to use OpenCV to capture from stdout - I got "bad argument" errors from the cv2.VideoCapture(stdout) call.
All I want is an efficient way to transmit video with low bandwidth and latency using H.264, without pre-written protocols like RTMP or installing servers, then blast it onto my PyQt UI. Something like:
bytes = video.capture().encode('h264')
udp_socket.sendto(bytes, (ip, port))
And on the receiving side:
data = udp_socket.recvfrom(65534)
frame = data.decode()
update_frame()
The closest thing I got to this was with the first approach.
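The transport half of that pseudocode can be made concrete with plain sockets. Here is a minimal sketch (names are hypothetical; the actual H.264 chunks would come from whatever encoder is used, e.g. picamera or an ffmpeg pipe - this only shows the framing, with a 2-byte sequence number so the receiver can detect dropped datagrams):

```python
import socket
import struct

MAX_PAYLOAD = 65000  # keep each datagram under the UDP size limit

def send_chunk(sock, addr, seq, payload):
    """Prefix each datagram with a big-endian 2-byte sequence number."""
    sock.sendto(struct.pack('>H', seq & 0xFFFF) + payload, addr)

def recv_chunk(sock):
    """Return (seq, payload) for the next datagram."""
    data, _ = sock.recvfrom(MAX_PAYLOAD + 2)
    seq, = struct.unpack('>H', data[:2])
    return seq, data[2:]

# Sender side (encoded_chunks() is a stand-in for the camera/encoder):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# for seq, nal in enumerate(encoded_chunks()):
#     send_chunk(sock, ('192.168.1.50', 5000), seq, nal)
```

On the receiving side, a gap in the sequence numbers tells you a datagram was lost, at which point the decoder should be fed from the next keyframe; without that, UDP loss shows up as the kind of corruption that is hard to debug.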
Any help would be greatly appreciated.
I am playing a hardware accelerated video on the Raspberry Pi 3 using this simple pipeline:
gst-launch-1.0 playbin uri=file:///test/test.mp4
As soon as the video begins to play, any sound being played in parallel using ALSA begins to crackle (tested with gstreamer and mplayer). It's a simple WAV-file and I am using a USB audio interface.
The headphone jack already crackles even without an audio file playing (but that jack is very low quality, and I don't know whether that's a separate effect).
Playing the audio in the same pipeline as the video does not help. CPU load is only about 30% and there's plenty of free memory. I already overclocked the SD card. Playing two videos in parallel with omxplayer has no impact and the sound still plays well. But as soon as I start the pipeline above, the sound begins to crackle.
I tried "stress" to simulate high CPU load. This had no impact either, so CPU does not seem to be the problem (but maybe the GPU?).
This is the gstreamer pipeline to test the audio:
gst-launch-1.0 filesrc location=/test/test.wav ! wavparse ! audioconvert ! alsasink device=hw:1,0
GST_DEBUG=4 shows no problems.
I tried putting queues in different places, but nothing helped. Playing a video without audio tracks works a little better. But I have no idea where the resource shortage may lie, if there even is one.
It somehow seems like GStreamer is disturbing audio playback.
Any ideas where the problem may be are highly appreciated.
It seems the USB driver of my interface expects a very responsive system. I bought a cheap new USB audio interface with a bInterval value of 10 instead of 1, and everything works fine now. More details can be found here: https://github.com/raspberrypi/linux/issues/2215
Problem:
Streaming live audio via an Icecast mountpoint. On the server side, when the live show stops, the server reverts to playing a music playlist (the actual mountpoint stays /live). However, when the live stream stops, the audio player stops too. Dev tools show the request has been cancelled. The player must be HTML5, so no Flash.
Mountpoint: http://198.154.112.233:8716/
Stream: http://198.154.112.233:8716/live
I've tried:
Listening for the stream to end and telling the player to reconnect. However, none of the events in the jPlayer and MediaElement.js APIs fire when the stream is interrupted.
I'm currently contacting the server host to ask for advice on dealing with their behind-the-scenes playlist switcher.
I'd like to find a client-side solution to this. Could websockets / webrtc solve this problem by keeping a connection open?
Your problem isn't client-side; it's how you are handling your encoding. No client-side change can properly fix this.
In your stream configuration, the encoder uses files on disk as a backup stream. Unfortunately, it sounds like instead of re-encoding and splicing (and matching sample rate and channel count if needed), it is just sending the raw file data.
This works some of the time, as MPEG decoders are often tolerant of corrupt streams, and will re-sync. However, sometimes the stream is too broken, and the decoder gives up. The decoder will also often stop if there is a change in sample rate or channel count. (Bitrate changes are generally not a large problem.)
To fix your problem, you must contact your host.
Yes, this is unfortunately a problem if the playlist and the live stream do not use the same codec. Additional tools such as Liquidsoap have solved this problem for me, as well as providing many more features:
savonet.sourceforge.net
I'm trying to save an RTSP stream as 5-second .mov segments with VLC. First I tried openRTSP and ffmpeg, but both of them give incorrect output (missing index, etc.). I've read a lot about the VLC CLI, but haven't had any luck saving an RTSP stream as segments.
Using the VLC GUI I can both save segments and take snapshots (PNGs), but I need to do this via the CLI.
.mov files are not streamable. [The tools are right in saying the index file is missing.] I don't even know how you are sending them over RTSP; there is no RTP payloader I am aware of for the mov/mp4 format.
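For reference, cutting fixed-length segments out of an RTSP stream is usually done with ffmpeg's segment muxer. A sketch that just builds the command line (the RTSP URL is a placeholder; given the index problem described above, mp4 or MPEG-TS segment formats may behave better than .mov):

```python
def segment_cmd(rtsp_url, seconds=5, pattern='out%03d.mov'):
    """Build an ffmpeg argv that copies an RTSP stream into fixed-length segments."""
    return [
        'ffmpeg',
        '-rtsp_transport', 'tcp',     # TCP is usually more robust than UDP for RTSP
        '-i', rtsp_url,
        '-c', 'copy',                 # no re-encoding; just remux
        '-f', 'segment',
        '-segment_time', str(seconds),
        '-reset_timestamps', '1',     # each segment starts at t=0
        pattern,
    ]

# import subprocess
# subprocess.run(segment_cmd('rtsp://camera.example/stream'))
```

Because each segment is closed and finalized as a normal file, the muxer gets a chance to write the index, which is what a single interrupted recording lacks.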
I have a program on the server side that keeps generating a series of JPEG files, and I want to play these files in the client's browser as a video stream at a desired frame rate (this video should play while the new JPEG files are being generated). Meanwhile, I have a WAV file on hand, and I want to play it on the client side while the streaming video is playing.
Is there any way to do this? I have done plenty of research but can't find a satisfactory solution -- the ones I found are either just for video streaming or just for audio streaming.
I know mjpg-streamer at http://sourceforge.net/projects/mjpg-streamer/ can stream video in MJPG format from JPEG files, but it doesn't look like it can stream audio.
I am very new to this area, so a more detailed explanation would be extremely appreciated. Thank you so much!!!
P.S. a solution/library in C++ is preferred, but anything else would help as well. I am working on Linux.
The browser should be able to do this natively, no? Firefox certainly can, if you simply give it the correct URL of the streaming MJPEG source. The MJPEG stream must be properly formatted.
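"Properly formatted" here means multipart/x-mixed-replace framing. A minimal sketch of how each JPEG frame would be wrapped (the boundary string is arbitrary, but it must match the one declared in the response's Content-Type header):

```python
BOUNDARY = b'frameboundary'  # arbitrary; must match the Content-Type header

def mjpeg_part(jpeg_bytes):
    """Wrap one JPEG frame as a multipart/x-mixed-replace part."""
    return (b'--' + BOUNDARY + b'\r\n' +
            b'Content-Type: image/jpeg\r\n' +
            b'Content-Length: ' + str(len(jpeg_bytes)).encode() + b'\r\n\r\n' +
            jpeg_bytes + b'\r\n')

# The HTTP response preceding the parts would declare:
#   Content-Type: multipart/x-mixed-replace; boundary=frameboundary
# and the server then writes mjpeg_part(frame) for each newly generated JPEG,
# sleeping between frames to hit the desired frame rate.
```

Note this only covers the video half; the browser treats an MJPEG stream as an image, not a media element, so audio has to arrive through a separate channel (e.g. an HTML5 audio tag pointing at the WAV file).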
I figured it out. The proper way to do it is with ffmpeg, libav, and an RTMP server such as Red5.