How to send a video stream to the cloud? - sockets

I want to send a live video stream to a server, perform facial recognition on that video, and get the result back to the client program. Where do I get a server? Can I use Windows Azure here? If so, can I also make a Python/C++ server program listen on a particular port?

You haven't talked about the client-side piece. Assuming you're in control of a client app, you could push the video to a blob, then drop a notification in an Azure queue for a background task to process the uploaded video fragment.
Instead of pushing directly to blobs, you could host a web service that accepts uploads; the web service would store the video fragment and then trigger a background processing task.
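As a rough sketch of that blob-plus-queue pattern, the upload side could look like this with the current azure-storage-blob and azure-storage-queue Python packages (which postdate this answer); the connection string, container name, queue name, and file name are all placeholders:

    # Sketch only: push a video fragment to a blob, then enqueue a notification.
    from azure.storage.blob import BlobServiceClient
    from azure.storage.queue import QueueClient

    conn_str = "<storage-connection-string>"

    # 1. Upload the fragment to blob storage.
    blob_service = BlobServiceClient.from_connection_string(conn_str)
    blob = blob_service.get_blob_client(container="video-fragments",
                                        blob="fragment-0001.mp4")
    with open("fragment-0001.mp4", "rb") as f:
        blob.upload_blob(f, overwrite=True)

    # 2. Tell the background worker which fragment to process.
    queue = QueueClient.from_connection_string(conn_str, "process-video")
    queue.send_message("fragment-0001.mp4")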
Running Python should be very straightforward: just upload the Python executable and any related modules, either with your Windows Azure deployment or in blob storage (then pull them down from blob storage and install them when the VM starts up). As for listening on ports, you can define up to 25 external-facing ports. You'd then have your Python app listen on the port you defined (tcp, http, or https).
More info on block and page blobs is here; Steve Marx posted this example for installing Python in your Web or Worker role.
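For the port-listening part, a minimal sketch using only Python's standard library; the port number (8765 here) stands in for whichever endpoint you defined, and any framing of the video stream is deliberately left out:

    # Minimal TCP listener sketch; 8765 is an example port, not a requirement.
    import socket

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", 8765))
    srv.listen(5)

    while True:
        conn, addr = srv.accept()
        with conn:
            # A real app would frame the incoming video (length prefix,
            # HTTP chunks, ...); here we just drain the bytes.
            while chunk := conn.recv(65536):
                pass  # hand `chunk` to your processing pipeline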

Related

How to do local and remote file storage in a Flutter app

I'm building a Flutter app that needs to open a binary file, display the content to the user, and allow them to edit and save it. File size would be between 10 KB and 10 MB. The file also needs to be in the cloud for sharing and for access from other devices. To minimise remote network egress charges and the user's mobile data charges, I'm envisaging that when the user saves the file it is saved locally rather than written to the cloud, and only written to the cloud when the user closes the file, at regular intervals, or after a period of no activity. To minimise network data charges I would like to keep a permanent local copy, and the remote copy of the file would have a small supporting file that identifies who last wrote the remote file and when. When the app starts, it checks whether its local copy is up to date by reading the supporting file. The data does not need high security.
The app will run on Android, iOS, the web, and preferably on the desktop, though I know the Google Firebase SDK for Windows is incomplete/unavailable.
Is Google Firebase Cloud Storage the easiest and best way to do this? If not, what is the easiest way?
Are there any cloud storage providers that don't charge for network egress, just for storage?
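To make the "supporting file" check concrete: the startup comparison could look roughly like the sketch below. The app itself would be in Dart, so this Python version is illustration only, and the metadata layout ("written_by"/"written_at") is an assumption, not a fixed format.

    # Illustration only: decide whether the local copy is stale by comparing
    # cached metadata with the freshly fetched remote supporting file.
    import json, os

    def local_is_current(local_meta_path: str, remote_meta_json: str) -> bool:
        if not os.path.exists(local_meta_path):
            return False  # no local copy yet, must download
        with open(local_meta_path) as f:
            local = json.load(f)
        remote = json.loads(remote_meta_json)
        return local.get("written_at") == remote.get("written_at")

Because the supporting file stays tiny, fetching it on every startup costs almost nothing compared to re-downloading a 10 MB file.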

How to read a file on a remote server from OpenShift

I have an app (Java, Spring Boot) that runs in a container on OpenShift. The application needs to reach a third-party server to read the logs of another application. How can this be done? Can I mount the directory where the logs are stored into the container? Or do I need to use some protocol to access the file remotely and read it?
The remote server is a normal Linux server. It runs an old application packaged as a jar, which writes its logs to a local folder. An application running in a pod (on Linux) needs to read this file and parse it.
There are multiple ways to do this.
If continuous access is needed:
A watcher with polling events (the WatchService API)
A stream buffer
A file observable with RxJava
In that case, creating an NFS share that exposes the remote logs, and mounting it as a persistent volume, is the better approach.
Otherwise, if the access is based on polling the logs at, for example, a certain time of day, one solution is an FTP client such as Apache Commons Net's FTPClient, or an SSH client with an SFTP implementation such as JSch, which is a pure-Java library (see the sketch below).
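The libraries named above are Java; purely to illustrate the SFTP pattern this answer describes, here is the same idea with Python's paramiko (in the Spring Boot app itself, JSch would play this role; host, credentials, and log path are placeholders):

    # Sketch of reading a remote log over SFTP; all names are placeholders.
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("legacy-server.example.com", username="reader",
                   password="secret")

    sftp = client.open_sftp()
    with sftp.open("/opt/legacy-app/logs/app.log") as remote_log:
        for line in remote_log:
            print(line.rstrip())  # replace with your parsing logic

    sftp.close()
    client.close()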

GStreamer plugin to post progress to REST service

For load balancing my GPU-based GStreamer tasks I would like to send the percentage of work done on the GPU side to a remote task scheduler service.
I am not that deep into GStreamer, so my question is:
Can this be done with already-available plugins (e.g. progressreport ! tcpserversink), or do I have to create a special progressreport plugin that posts its messages to an HTTP service?
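One route that avoids writing a plugin: run the pipeline from a small app, watch the bus for the element messages progressreport posts, and forward the percentage yourself. A hedged Python sketch (the scheduler URL is a placeholder, and the "percent" field name should be checked against your GStreamer version):

    # Sketch: forward progressreport's bus messages to a REST endpoint.
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    import requests

    Gst.init(None)
    pipeline = Gst.parse_launch(
        "filesrc location=in.mp4 ! decodebin ! progressreport ! fakesink")
    pipeline.set_state(Gst.State.PLAYING)

    bus = pipeline.get_bus()
    while True:
        msg = bus.timed_pop_filtered(
            Gst.SECOND, Gst.MessageType.ELEMENT | Gst.MessageType.EOS)
        if msg is None:
            continue
        if msg.type == Gst.MessageType.EOS:
            break
        s = msg.get_structure()
        if s is not None and s.get_name() == "progress":
            ok, percent = s.get_int("percent")  # check field name for your version
            if ok:
                # Placeholder endpoint for the task scheduler service.
                requests.post("http://scheduler.example/progress",
                              json={"percent": percent})

    pipeline.set_state(Gst.State.NULL)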

Starting a script from one server on another server

I've got a server (Raspberry Pi 3) which functions as a webserver.
Additionally, I've got a client (Raspberry Pi 3) with a camera connected.
I've created a script on the client which captures an image from the camera and then automatically sends it to a shared volume on the server (webserver).
With a simple <img> HTML tag I want to show the image on the website.
Right now I have to start the script that triggers the camera manually.
So my question is:
How can I start a script from the server (webserver) that automatically triggers the script on the client?
My idea was to use sshpass. Unfortunately, that did not work.
How would it work with sshpass, or what else would be a useful approach?
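One common way around sshpass: install the server's public key on the client (ssh-copy-id pi@client), after which no password is needed at all. A Python sketch of the server-side trigger, callable from whatever handles the web request (hostname and script path are placeholders):

    # Sketch: trigger the capture script on the client Pi over key-based SSH.
    import subprocess

    result = subprocess.run(
        ["ssh", "pi@camera-pi.local", "/home/pi/capture_image.sh"],
        capture_output=True, text=True, timeout=30)
    if result.returncode != 0:
        print("capture failed:", result.stderr)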

How to upload an image from a Raspberry Pi to a WAMP server (web server) automatically?

I have a camera attached to a Raspberry Pi which captures images and saves them to the Pi's storage. I want those images uploaded to a WAMP web server automatically, so I can access them from my Android phone.
Thanks
Well, in principle you need an API on the server that you call from the Pi, sending it the image. The API receives the image and handles storing it on the server.
On the Pi you would run some kind of script calling that API. You can trigger it whenever a newly captured image becomes available, or e.g. via a cron job, whichever fits better.
But depending on what you actually want to achieve (just transferring the images to a web drive), a "network drive" approach may be easier.
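A sketch of such a script on the Pi, using Python's requests; the server address and the upload.php endpoint are placeholders you would implement on the WAMP side:

    # Sketch: POST the latest capture to a placeholder endpoint on the server.
    import requests

    with open("/home/pi/captures/latest.jpg", "rb") as f:
        resp = requests.post(
            "http://192.168.1.20/upload.php",  # placeholder WAMP endpoint
            files={"image": ("latest.jpg", f, "image/jpeg")})
    resp.raise_for_status()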
You can do it in a few steps:
1. Set up the Apache web server.
2. Set up vsftpd or a similar FTP daemon on the same machine. Make sure the FTP server's upload folder is the Apache document root, e.g. /var/www/html.
3. Write a small script (/home/myname/transfer_files.sh) based on curl that transfers newly created files to the Apache web server via FTP:
curl -T image.jpg ftp://myftpsite.com/ --user myname:mypassword
4. Add a line to the crontab (/etc/crontab) so the script runs every minute:
*/1 * * * * root /home/myname/transfer_files.sh
This is not a perfect solution, but it will work and you don't need to do any coding.