I am trying to get the camera feed from a Blackmagic capture card into the MediaPlayer of the VLC plugin for Unity.
What I have done:
I can open the capture device with the VLC desktop application, so the camera and capture card work fine.
I can run the sample scene of the VLC plugin, which plays a video from a web URL; it works fine.
I searched through the LibVLCSharp source to try to understand a bit how it all works: https://code.videolan.org/videolan/LibVLCSharp/-/blob/master/src/LibVLCSharp/Media.cs
I am trying to modify UseRenderingPlugin.cs, the script that plays the video on a texture in the Unity scene, and specifically the line that chooses the media to be played.
The original line of code:
_mediaPlayer.Media = new Media(_libVLC, "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4", FromType.FromLocation);
And here is what I have tried so far (it doesn't work): I changed 'FromLocation' to 'FromPath' and replaced the URL with the MRL of the capture card, including the options, as reported by the VLC desktop application:
_mediaPlayer.Media = new Media(_libVLC, "dshow:// :dshow-vdev=Blackmagic WDM Capture :dshow-adev=Entrée ligne (Blackmagic DeckLink Mini Recorder 4K Audio) :dshow-aspect-ratio=16\\:9 :dshow-chroma= :dshow-fps=50 :no-dshow-config :no-dshow-tuner :dshow-tuner-channel=0 :dshow-tuner-frequency=0 :dshow-tuner-country=0 :dshow-tuner-standard=0 :dshow-tuner-input=0 :dshow-video-input=-1 :dshow-video-output=-1 :dshow-audio-input=-1 :dshow-audio-output=-1 :dshow-amtuner-mode=1 :dshow-audio-channels=0 :dshow-audio-samplerate=0 :dshow-audio-bitspersample=0 :live-caching=300 ", FromType.FromPath);
I would like to ask if anyone knows the right syntax to use DirectShow in that function, or can redirect me to a similar topic (I haven't been able to find one, but I apologize if I missed it), or tell me if I'm getting it all wrong.
Thank you so much for your time. It's the first time I'm using this plugin and LibVLCSharp, so please be patient with me :D
Thank you @mfkl for your help.
Here is what worked:
_mediaPlayer.Media = new Media(_libVLC, "dshow:// ", FromType.FromLocation );
And then add all the options like this:
_mediaPlayer.Media.AddOption(":dshow-vdev='Blackmagic WDM Capture'");
_mediaPlayer.Media.AddOption(":dshow-fps=50");
...
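Putting it together, the full change inside UseRenderingPlugin.cs looks roughly like this (a minimal sketch assembled from the snippets above; the device names are the ones from my setup, so replace them with whatever the VLC desktop app's "Open Capture Device" dialog reports for your hardware):

// Sketch of the working setup inside UseRenderingPlugin.cs.
// _libVLC and _mediaPlayer are the fields the sample script already declares.
var media = new Media(_libVLC, "dshow://", FromType.FromLocation);

// Device names below are specific to my Blackmagic card; adjust for yours.
media.AddOption(":dshow-vdev='Blackmagic WDM Capture'");
media.AddOption(":dshow-adev='Entrée ligne (Blackmagic DeckLink Mini Recorder 4K Audio)'");
media.AddOption(":dshow-fps=50");
media.AddOption(":live-caching=300");

_mediaPlayer.Media = media;
_mediaPlayer.Play();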
Related
I am developing a game for Android and iOS in Unity. I have to play a video in one scene of the game, and I use the VideoPlayer component for this. I serve the video locally using XAMPP, and the video I'm trying to play is in mp4 format. But when I start the game, the video does not play properly. I am not getting an error, but the video looks like the picture I've attached. I don't know what I'm doing wrong; can you help me? I'm also sharing the code I used and the related pictures.
using UnityEngine;
using UnityEngine.Video;

public class VideoStarter : MonoBehaviour { // wrapper class name is illustrative
    public VideoPlayer videoplayer;
    public string videoUrl = "urlgir"; // placeholder for the local XAMPP URL

    void Start() {
        videoplayer.url = videoUrl;
        videoplayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoplayer.EnableAudioTrack(0, true);
        videoplayer.Prepare();
    }
}
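One side observation about the snippet (not necessarily the cause of the artifacts): Prepare() only buffers the clip, so playback still has to be started explicitly, typically from the prepareCompleted callback. A minimal sketch of that pattern, reusing the videoplayer field above:

// Start playback once preparation has finished.
void Start() {
    videoplayer.url = videoUrl;
    videoplayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
    videoplayer.EnableAudioTrack(0, true);
    videoplayer.prepareCompleted += OnPrepared; // fires when the clip is ready
    videoplayer.Prepare();
}

void OnPrepared(VideoPlayer source) {
    source.Play(); // safe to start now that buffering is done
}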
I think I found something. I re-imported the video into Unity and got a warning after the import:
VFR warning: 1111 video frames have a different duration than expected 0.0333333s, ranging from 0s to 1.2771s.
D:/Program Files/Unity/xxx/Assets/Scenes/hp.mp4 (30FPS) may have variable frame rate (VFR), which is not supported. This may lead to incorrect timing in transcoded clip.
I think the video has an unsupported variable frame rate, so I can't play it either as a clip or via URL. Does anyone know about this warning? What do I need to do?
I am trying to create an Oculus Go app that needs to play mp4 videos in 360° with sound. I have followed countless tutorials, both found through Googling and on YouTube, and they all end with the same result: I can hear the sound from the video, but I can only see the camera background (which is black/brownish in my case). I have absolutely no idea what to do, and there is nothing on the internet (that I have been able to find) that helps.
My scene consists of the following:
- An OVRPlayerController containing an OVRCameraRig
- A VideoPlayer
I have attached a picture giving you an overview of my entire project and everything at play in this scene at this link --> https://imgur.com/a/HvIoAtD
Please, I would really appreciate any help. Thank you!
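For anyone hitting the same symptom (sound but no picture), the usual cause is that the decoded frames never reach anything the camera actually renders. One common Unity-only setup is to route the VideoPlayer into a RenderTexture used by a Skybox/Panoramic material. A minimal sketch, assuming the RenderTexture and the panoramic skybox material are created in the editor and assigned in the Inspector (the class and field names here are just illustrative):

using UnityEngine;
using UnityEngine.Video;

// Route a 360 mp4 into the skybox so the OVRCameraRig can see it.
public class Play360Video : MonoBehaviour
{
    public VideoPlayer videoPlayer;    // the VideoPlayer from the scene
    public RenderTexture videoTexture; // target for the decoded frames
    public Material panoramicSkybox;   // material using the Skybox/Panoramic shader

    void Start()
    {
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = videoTexture;
        panoramicSkybox.mainTexture = videoTexture; // feed video frames to the skybox
        RenderSettings.skybox = panoramicSkybox;
        videoPlayer.Play();
    }
}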
I've been playing around with Facebook AR, but I'm not sure how to script using the AudioModule. The tutorial on Facebook's page only explains how to do it with the patch editor, but I want to know how to script it as well.
Basically, the audio should play only while a video texture is playing. The video texture only starts after a 10-second timeout, so the audio should follow suit. Obviously it's sound.play(), but how do I get the audio itself?
I have a speaker -> audio : playback_controller_model0
Does using the AudioModule require anything, like var Audio = require("Audio");?
I did something like this:
var Audio = require("Audio");
var sound = Audio.getPlayBackController("playback_controller_model0");
sound.play();
It doesn't seem to be working, though, because the video texture script is being ignored.
Unfortunately, Facebook recently removed the option to script audio playback in favor of using the Patch Editor to control it. As of the latest version you can only use the Patch Editor to play audio, God knows why... But you can use the script-to-patches bridge to trigger playback from script, by sending messages to the playback controller in your patch graph.
Like so in script:
Patches.setPulseValue('PlayMySound', Reactive.once());
And in the Patches Editor, wire the From Script patch that receives the 'PlayMySound' pulse into the Play input of your audio playback controller.
Good luck!
I am currently using s3eVideo in the Marmalade SDK to play a video in my project after a button event. I tried to find a way to implement a slider bar (or something similar) for seeking back and forth in the video, but I am unsure whether this feature is even supported. Otherwise, is there a way to open a native video player outside the app and play the video there, with the seek feature I need?
Any help would be greatly appreciated.
There doesn't seem to be a way of finding out the length of the video, but s3eVideoSetInt(S3E_VIDEO_POSITION, timeInMilliseconds) should do the trick.
I guess it will depend on whether the frame index in the file is OK and whether the specific platform supports it. I only ever did play/stop video in Marmalade, so you may have to try calling this function while playing, while paused, etc., and see what works and what errors you get.
I am working on an iPhone app that needs to record a video automatically.
I used the MobileCoreServices framework and got the app into video mode, where tapping the record option starts capturing video. But I want it to be automatic: I should be able to record video without tapping the record option, i.e. as soon as the video mode comes up, it should start recording automatically.
Could anyone help?
You can look at UIImagePickerController's startVideoCapture method, which is used to start taking video from the camera; it is meant for when you aren't using the standard camera controls and provide an overlay view instead. Here is a reference: UIImagePickerController reference. If this is not enough for you, you might want to look into the AVFoundation framework, which gives you much more control over the video capturing process... hope that helps.