Playing MP4 video via GStreamer with hardware acceleration on RPi4 - raspberry-pi

Is there an example of a GStreamer (gst-launch) pipeline that plays MP4 video with HW acceleration on a Raspberry Pi 4?
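A minimal sketch, assuming the MP4 contains H.264 video, a Raspberry Pi OS image, and the V4L2 codec elements from gstreamer1.0-plugins-good (the file name video.mp4 is a placeholder):

```
# Demux the MP4, parse the H.264 stream, decode it on the VideoCore
# hardware via the V4L2 stateful decoder, and display it (video only).
gst-launch-1.0 filesrc location=video.mp4 ! qtdemux ! h264parse \
    ! v4l2h264dec ! autovideosink
```

From a bare console, kmssink is usually faster than autovideosink; older images exposed the same hardware through the now-deprecated omxh264dec element instead.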

Related

Flutter plays H264 raw stream

In Flutter, I want to play video from a raw H.264 stream received over a WebSocket. On the web and on Android I can play it by using JMuxer inside an embedded web page, but that approach fails on the iOS player.
Therefore, I fell back to the method I previously used in JS: convert the H.264 to YUV through FFmpeg and then render it through WebGL. But I don't know how to convert H.264 to YUV through FFmpeg in Flutter.
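One possibility, as a sketch: packages such as ffmpeg_kit_flutter execute ordinary FFmpeg command lines from Dart, so the H.264-to-YUV step can be expressed as a plain CLI invocation (file names here are placeholders):

```
# Read a raw Annex-B H.264 elementary stream and write planar
# YUV 4:2:0 frames as raw video.
ffmpeg -f h264 -i input.h264 -pix_fmt yuv420p -f rawvideo output.yuv
```

For a live WebSocket feed the same arguments apply, but the input would come from a pipe rather than a finished file.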

Raspberry Pi camera H.264 codec licence

I have a Pi Camera Module v2 and I want to make a live video stream.
I have tried to read the H.264 license agreement but didn't understand it: do I need to buy a license for H.264 usage, or is it included in the camera's price?

Stream audio from an LM393 sound sensor with microphone on a Raspberry Pi

I'm trying to implement a baby monitor using a Raspberry Pi with an LM393 sound sensor + microphone, like this one: http://www.dx.com/p/lm393-sound-detection-sensor-module-black-221267#.WEP3oaLhCRs.
I want to stream the audio it receives when the sensor detects a louder noise.
What I'm wondering is how to use the microphone to stream the audio; I haven't found any info on this.
Any help will be welcome.
The answer is no: the LM393 can only be used as a sound detector, and it needs a proper microphone and amplifier in order to stream sound.
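Assuming the LM393 is replaced (or supplemented) by a USB microphone, or a mic on a USB sound card, a minimal GStreamer sketch for streaming the audio over the network might look like this; the ALSA device, host, and port are placeholders:

```
# Capture from the ALSA device, encode with Opus, and send as RTP over UDP.
gst-launch-1.0 alsasrc device=hw:1,0 ! audioconvert ! audioresample \
    ! opusenc ! rtpopuspay ! udpsink host=192.168.1.10 port=5000
```

The LM393's digital output can still be read on a GPIO pin and used to start and stop a pipeline like this when a loud noise is detected.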

FFmpeg with AudioUnit

What I have
My aim is to parse a media file using FFmpeg and provide video and audio playback, which I currently do successfully using OpenGL for video and AudioQueue for audio.
What I need to do
I need to switch from AudioQueue to the Audio Unit services, because Audio Units provide several useful features for audio manipulation.
Basically, I'm confused about how to integrate Audio Units into the FFmpeg run loop.
So I would like some references/samples where an Audio Unit is integrated with an FFmpeg media-playback loop, i.e., where media packets are extracted and pushed into a buffer that the Audio Unit can play.
Yes, I have already used an Audio Unit to play audio decoded with FFmpeg.
I took the Novocaine project from GitHub (https://github.com/alexbw/novocaine) and made some changes (mostly removing the code for input and OS X).
See the KxAudioManager class from the kxmovie project: https://github.com/kolyvan/kxmovie
If you plan to code for iOS, note that you will need to resample the sound to 44100 Hz, since many movies have audio streams with a 48000 Hz sample rate.
The best result I got was resampling via the swr_convert function from the libswresample library.
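As a rough illustration of that resampling step (a sketch only, not kxmovie's actual code; make_resampler and resample_frame are hypothetical helper names, and this uses the classic pre-FFmpeg-5.1 channel-layout API):

```c
#include <libavcodec/avcodec.h>
#include <libavutil/channel_layout.h>
#include <libswresample/swresample.h>

/* Build a resampler that converts whatever the decoder outputs
 * into 44100 Hz interleaved stereo S16, which iOS handles well. */
static SwrContext *make_resampler(const AVCodecContext *dec)
{
    SwrContext *swr = swr_alloc_set_opts(NULL,
        AV_CH_LAYOUT_STEREO, AV_SAMPLE_FMT_S16, 44100,          /* output */
        dec->channel_layout, dec->sample_fmt, dec->sample_rate, /* input  */
        0, NULL);
    if (swr && swr_init(swr) < 0)
        swr_free(&swr);
    return swr;
}

/* Convert one decoded frame; returns the number of samples written. */
static int resample_frame(SwrContext *swr, const AVFrame *frame,
                          uint8_t *out_buf, int max_out_samples)
{
    uint8_t *out[1] = { out_buf };
    return swr_convert(swr, out, max_out_samples,
                       (const uint8_t **)frame->data, frame->nb_samples);
}
```

The converted interleaved S16 samples would then be copied into the ring buffer that the Audio Unit's render callback drains.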

Timestamp on Live video after transcoding from RTSP stream to FLV

I have access to an RTSP stream from an IP camera, and the RTSP packets have video timestamps in the header.
If I transcode the video stream to FLV using VLC for the purposes of playing the video on the web, is it safe to assume that I will lose the timestamp information in the transcode process?
If so, is there any way around this?