Best way to upload a binary file from Unreal Engine to Firebase Cloud Storage using Blueprints? - unreal-engine4

In my VR 'experience', the player records a voice answer to a question. How do I upload that file to Cloud Storage so other players can hear it? I am using Unreal Engine 4 Blueprints.

I started with the paid Firebase plugins (USD 50 and USD 75). The USD 50 plugin's Cloud Storage node crashes UE frequently, and crashes every time when packaging for Windows (Oculus Rift, or Oculus Quest via Air Link). The USD 75 plugin's Cloud Storage does not work when packaged for Windows. The solution that worked for me was the following two plugins:
.. Low Entry File Manager plugin (USD 40) to convert the binary file to a byte array
.. VaRest (free) to upload the byte array to Firebase Cloud Storage
Blueprint pasted below.
Please add your answer if there is a better way to do this.
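For context, the VaRest part of the Blueprint is essentially one HTTP request against Firebase Storage's REST endpoint. Here is a rough sketch of the equivalent request in TypeScript (the bucket name, object path, content type, and ID-token header are placeholders/assumptions; whether you need the token depends on your Storage security rules):

```typescript
import { readFile } from "node:fs/promises";

// Placeholders - replace with your own project values.
const BUCKET = "your-project.appspot.com";
const OBJECT_PATH = "answers/player1/question3.wav"; // path inside the bucket
const ID_TOKEN = "<firebase-id-token>";              // omit if your rules allow unauthenticated writes

async function uploadRecording(localPath: string): Promise<void> {
  const bytes = await readFile(localPath); // the byte array (Low Entry File Manager's role in UE4)

  // Simple (non-resumable) upload to the Firebase Storage REST endpoint.
  const url = `https://firebasestorage.googleapis.com/v0/b/${BUCKET}/o?name=${encodeURIComponent(OBJECT_PATH)}`;
  const res = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "audio/wav",
      "Authorization": `Firebase ${ID_TOKEN}`, // only needed if security rules require auth
    },
    body: bytes,
  });

  if (!res.ok) {
    throw new Error(`Upload failed: ${res.status} ${await res.text()}`);
  }
  console.log("Uploaded", OBJECT_PATH);
}

uploadRecording("./answer.wav").catch(console.error);
```

In the Blueprint, the same three pieces map onto VaRest nodes: the URL carrying the object name, the Content-Type header, and the byte array (from Low Entry File Manager) as the request body.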

Related

Azure Media Player source manifest

We are very small junior-school private tutors and have set up an online portal where students can log in and watch the daily video lectures. We have many videos uploaded to Azure Media Services, but we realized the encoding cost is high and not affordable. So I encoded a video using FFmpeg and generated the m4s segments, the audio file, and the .mpd (metadata) manifest using MP4Box locally.
I have copied all the files to Azure Blob Storage, and the blob storage has HTTPS access. Can I use the .mpd as the source URL for Azure Media Player?
E.g. an Azure Media Player source is //amssamples.streaming.mediaservices.windows.net/3b970ae0-39d5-44bd-b3a3-3136143d6435/AzureMediaServicesPromo.ism/manifest
but my manifest generated by MP4Box is
https://bb.sourceoftraining.companywebinternet.storage/ssj-ewrrer-2343s-ssssdf23/process_and_benifits.mpd
Or is there any other player I can use? I tried Shaka Player but was unable to show the Resolution and Playback speed settings.
Uploading pre-encoded MP4s works just fine. I suggest you download the latest version of the Azure Media Services Explorer tool for the v3 API. In there you can now upload an MP4 into a new asset and have it generate the client and server manifests needed for streaming. Just upload to a new, empty asset, then double-click on the asset to get to the tab for the files, and click the generate-manifests buttons.
That pre-generates the required manifest files needed for streaming an MP4 that is pre-encoded with closed 2-second GOPs. The tool pre-generates both the client and server manifest and saves them back into the asset to improve playback performance from the streaming server.
You can use Azure Media Player to play back DASH, Smooth, or HLS - but the technology it chooses to use for playback differs by platform. For example, depending on the browser version, OS, or mobile client, it will choose to load a different player tech or use the built-in OS player support.
https://learn.microsoft.com/en-us/azure/media-services/azure-media-player/azure-media-player-overview
For DASH content (.mpd) the AMP player chooses to use DASH on Windows, and on Android under specific conditions. It does this by detecting the platform and using the right tech combined with the /manifest(format=mpd-time-cmaf) format on the URL. You can learn more about how "dynamic packaging" works in AMS here - https://learn.microsoft.com/en-us/azure/media-services/latest/dynamic-packaging-overview
There are various "format" options on the streaming locator URL in AMS that provide different manifest formats back.
Smooth Streaming = /manifest
MPEG-DASH-CMAF = /manifest(format=mpd-time-cmaf)
HLS with CMAF = /manifest(format=m3u8-cmaf)
HLS v3 (TS) = /manifest(format=m3u8-aapl-v3)
Using one of those format options, you can use any third-party player that supports them: Shaka, HLS.js, ExoPlayer on Android, the iOS AVFoundation native player, Video.js, or even the 'adapter-player' noted by Jason above. Any player that supports the current HLS or DASH specifications should work.
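To make the mapping concrete, here is a small sketch (the streaming endpoint host and locator path are placeholders) showing how a single AMS locator yields each manifest flavour:

```typescript
// Placeholder streaming endpoint + locator path from an AMS streaming locator.
const base = 'https://myaccount-usea.streaming.media.azure.net/<locator-id>/video.ism';

const urls = {
  smooth:  `${base}/manifest`,                        // Smooth Streaming
  dash:    `${base}/manifest(format=mpd-time-cmaf)`,  // MPEG-DASH (CMAF)
  hlsCmaf: `${base}/manifest(format=m3u8-cmaf)`,      // HLS with CMAF
  hlsTs:   `${base}/manifest(format=m3u8-aapl-v3)`,   // HLS v3 (TS)
};

// Hand the DASH URL to Shaka/dash.js, the HLS URL to HLS.js or AVFoundation, etc.
console.log(urls);
```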
If you have School email addresses that you can use for yourself and your students the simplest solution would be to leverage capabilities from Microsoft Stream via the free O365 education plan - https://www.microsoft.com/en-us/microsoft-365/academic/compare-office-365-education-plans. Info on Microsoft Stream at https://www.microsoft.com/en-us/microsoft-365/microsoft-stream.
And to clarify the feedback Jason Pan just provided: while Azure Media Player doesn't support pointing directly at a .mpd file for playback, this is instead done by first creating the appropriate server manifest and then requesting the .mpd manifest via the format option in the URL that clients use to request content. Media Services will then dynamically create the appropriate manifest in response to the client request. See John's response for links to articles with additional detail.
If you use Shaka Player's UI library, you'll be able to display the Resolution and Playback speed settings.
See the Shaka UI library documentation and the Shaka Player demo.
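As a minimal sketch of the UI-library setup (it assumes the shaka-player.ui.js bundle and controls.css are already loaded on the page; the element IDs and manifest URL are placeholders), the 'quality' button exposes the resolution menu and 'playback_rate' exposes the speed menu:

```typescript
declare const shaka: any; // provided by the shaka-player.ui.js script loaded on the page

async function initPlayer(): Promise<void> {
  const video = document.getElementById('video') as HTMLVideoElement;
  const container = document.getElementById('video-container') as HTMLElement;

  const player = new shaka.Player(video);
  const ui = new shaka.ui.Overlay(player, container, video);

  // Put the quality (resolution) and playback-rate controls in the overflow menu.
  ui.configure({
    overflowMenuButtons: ['quality', 'playback_rate', 'captions', 'language'],
  });

  // Placeholder DASH manifest URL (e.g. an AMS locator with the mpd-time-cmaf format option).
  await player.load('https://example.com/video.ism/manifest(format=mpd-time-cmaf)');
}

document.addEventListener('DOMContentLoaded', () => { void initPlayer(); });
```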

Is it possible to upload a video file to an IBM Cloud Functions / OpenWhisk function and encode it?

We are developing a video streaming platform, and we want to encode videos into H.264 format after they are uploaded.
We decided to use IBM Cloud Functions / OpenWhisk to encode the video, but we have some doubts. Is it possible to upload a video file to an IBM Cloud Functions / OpenWhisk function and encode it? Is it supported, and how can it be done?
Yes, that should be possible.
I recommend checking out the "Dark Vision" app built on IBM Cloud Functions. You can upload videos, which are then split into frames, and the frames are processed with Visual Recognition. The source code for Dark Vision is available on GitHub.
In addition, you should go over the documented IBM Cloud Functions system limits to see if they match your requirements.
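As a rough sketch of what such an action could look like (this assumes a custom runtime image that bundles ffmpeg, and that the caller passes presigned object-storage URLs so no credentials live in the action; the names and parameters are illustrative, not the Dark Vision code):

```typescript
// OpenWhisk / IBM Cloud Functions Node.js action (compiled from TypeScript) that downloads a
// video, re-encodes it to H.264 with ffmpeg, and uploads the result via a presigned URL.
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';
import { readFile, writeFile } from 'node:fs/promises';

const run = promisify(execFile);

interface Params {
  sourceUrl: string;   // presigned GET URL of the uploaded video (placeholder)
  targetUrl: string;   // presigned PUT URL for the encoded output (placeholder)
}

export async function main(params: Params): Promise<{ ok: boolean }> {
  const input = '/tmp/input.mp4';
  const output = '/tmp/output.mp4';

  // Download the source video into the action's temporary storage.
  const res = await fetch(params.sourceUrl);
  await writeFile(input, Buffer.from(await res.arrayBuffer()));

  // Re-encode to H.264/AAC. ffmpeg must be available in the runtime image.
  await run('ffmpeg', ['-y', '-i', input, '-c:v', 'libx264', '-c:a', 'aac', output]);

  // Upload the encoded file via the presigned URL.
  await fetch(params.targetUrl, { method: 'PUT', body: await readFile(output) });

  return { ok: true };
}
```

Keep the system limits in mind: the action timeout, memory limit, and local disk space cap how large a video a single invocation can realistically encode.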

How to add videos to a Firebase database using Flutter

I need a way to add videos to a Firebase database. I was able to find a video where someone explained how to do this with native code, but I was wondering if there is a way to do this with Flutter.
This will not be possible (if you are sane).
The maximum data size of a string in the Realtime Database is 10 MB and it would be one hell of a torture to try to store videos in chunks of UTF-8 encoded strings.
Firebase offers Cloud Storage for Firebase, where you can easily store videos and other files.
The documentation is great for beginners with Cloud Storage and it should be easy to integrate it into your existing Realtime Database project.
If you compare GB stored of the Realtime Database and Cloud Storage on the pricing page, you will quickly realise that it would be insane to store videos in the Database instead of Cloud Storage.
The documentation on the Firebase website does not yet include Flutter, but the firebase_storage Flutter plugin is easy to use in combination with the official docs.
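For illustration, here is the upload-then-store-the-URL pattern in the Firebase Web SDK (TypeScript); the Flutter firebase_storage and firebase_database plugins expose equivalent calls, and the config and paths below are placeholders:

```typescript
import { initializeApp } from 'firebase/app';
import { getStorage, ref as storageRef, uploadBytes, getDownloadURL } from 'firebase/storage';
import { getDatabase, ref as dbRef, push, set } from 'firebase/database';

const app = initializeApp({ /* your Firebase config */ });
const storage = getStorage(app);
const db = getDatabase(app);

// Upload the video to Cloud Storage, then store only its download URL in the Realtime Database.
export async function uploadVideo(file: Blob, userId: string): Promise<string> {
  const fileRef = storageRef(storage, `videos/${userId}/${Date.now()}.mp4`);
  await uploadBytes(fileRef, file, { contentType: 'video/mp4' });

  const url = await getDownloadURL(fileRef);
  await set(push(dbRef(db, `videos/${userId}`)), { url, createdAt: Date.now() });
  return url;
}
```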

Facebook application to store data (MP3)

I would like to develop a Facebook app (using AS3) where the output is XML files and MP3 files. What is the recommended way of doing this?
From the small amount of research I did, I came up with these options so far:
convert the XML files to SQL tables and store the MP3s as blobs using Facebook SQL // is that allowed? what is the size limit for the table?
use Amazon's cloud service? // up to 5 GB free, but I would rather find a completely free option.
Thanks
Facebook doesn't have a storage facility. You can use Dropbox (2 GB), SkyDrive (5 GB), or LiveKive (5 GB); recently, Box.net started offering 50 GB free, but for Apple mobile devices only :(

Movie making using the Google Earth COM API

I am trying to programmatically generate a movie/video file using the Google Earth COM APIs (along with Google Earth Pro). Unfortunately, I could not find any COM APIs to automate the movie maker feature in Google Earth Pro.
Basically, my project idea is this: the client will provide tour information to the server, then some server-side service will launch Google Earth Pro locally on the server to export the tour video to a local file, which will then be streamed down to the client. So the client will not need to have the Google Earth plugin installed.
Also, as far as I know, generating a movie file using the Google Earth plugin is not possible (please correct me if I am wrong).
Can anyone point me to some solution?
The Google Earth COM API is being deprecated, and developers are being encouraged to use the Google Earth Plugin API instead.
Using the plug-in in an embedded application would easily allow you to achieve this as you could just capture the frames from the plug-in and then render them to a video file using your desired codec.
I have put together a library of controls that work with the Google Earth API in managed code. They may not be exactly what you need, but they should give some idea of the main principles (e.g. loading, geocoding, screen-grab, etc.).
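As a hedged sketch of the "render captured frames to a video file" step (independent of how you grab the frames from the plug-in), ffmpeg can stitch a numbered PNG sequence into an MP4; the paths, frame rate, and codec below are placeholder choices:

```typescript
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const run = promisify(execFile);

// Encode a directory of frames named frame0001.png, frame0002.png, ... into an MP4.
async function framesToVideo(frameDir: string, outFile: string, fps = 30): Promise<void> {
  // -framerate sets the input rate; libx264 + yuv420p gives a widely playable MP4.
  await run('ffmpeg', [
    '-y',
    '-framerate', String(fps),
    '-i', `${frameDir}/frame%04d.png`,
    '-c:v', 'libx264',
    '-pix_fmt', 'yuv420p',
    outFile,
  ]);
}

framesToVideo('./captures', './tour.mp4').catch(console.error);
```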