Starting a script on one server from another server

I've got a server (Raspberry Pi 3) that functions as a web server.
Additionally, I've got a client (Raspberry Pi 3) with a camera connected.
I've created a script on the client which captures an image from the camera and then automatically sends it to a shared volume on the server (the web server).
With a simple <img> HTML tag I want to show the image on the website.
At the moment I have to start the script that triggers the camera manually.
So my question is:
How can I start a script from the server (the web server) that automatically triggers the script on the client?
My idea was to use sshpass. Unfortunately that did not work.
How would this work with sshpass, or what else would be a useful approach?
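For illustration, a minimal sketch of one way this could work, run on the web server: it assumes key-based SSH login to the client is already set up (or that sshpass is installed if a password must be used), and the client address and script path below are placeholders, not details from the question.

#!/usr/bin/env python3
# trigger_capture.py - run on the web server to start the capture script on the client.
# The client address and remote script path are assumptions for illustration.
import subprocess

CLIENT = "pi@pi-client.local"          # assumed address of the camera Pi
REMOTE_SCRIPT = "/home/pi/capture.sh"  # assumed path of the capture script

def trigger_capture() -> bool:
    # With key-based authentication no password prompt appears.
    # If sshpass must be used instead, prepend ["sshpass", "-p", "yourpassword"].
    result = subprocess.run(
        ["ssh", "-o", "BatchMode=yes", CLIENT, REMOTE_SCRIPT],
        capture_output=True, text=True, timeout=30,
    )
    return result.returncode == 0

if __name__ == "__main__":
    print("capture triggered" if trigger_capture() else "capture failed")

Hooked up behind the website (for example as a CGI script or a small Flask route), loading the page could then trigger a fresh image before it is displayed.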

Related

Scripted FTP Upload from Container

I am trying to upload a file from a container field to a location on FTP as a server-side script. I have been trying to use the Base Elements BE_FTP_Upload function, as I'm led to believe this works in a server script, but I simply cannot get it to work: the file does appear on the FTP site, but it's always blank, missing its content.
I should also add that the BE_Curl_Trace feedback shows a successful connection to the FTP server, so it seems to be my method of moving the file rather than a bad connection. Script attached. (Excuse the squiggles, data protection and what not.)
After all of this, simply changing "filewin:" to "file:" solved my problem. I am now exporting from FM to FTP via a scheduled server script :)

Trying to get a Raspberry Pi 3 to write and modify data in a PostgreSQL database on a separate server

As a side project I have been interested in energy consumption. I have written and run a program on a Raspberry Pi 3 which uses external hardware to gather data over Ethernet using Modbus TCP.
Within my program there is a data-logging feature that creates and saves a CSV file with the values collected for that day. Every day at midnight a new CSV file is created and marked with the new day's date. This CSV file is saved locally on the Raspberry Pi, and as it runs headless I've had to set up a cron job to move the files onto a thumb drive so that I can view and assess them.
The modification I am trying to make is this: I currently have a PostgreSQL database on a separate server, and I am trying to get the Raspberry Pi to connect to that database and populate it with data as soon as the Pi has recorded it.
I have searched the internet, both this site and many others, but most of what I have found is tutorials and guides on how to set the Raspberry Pi up as a PostgreSQL server, which is not what I want to achieve.
Any advice and help is greatly appreciated.
Carl
Update: the programming language I am using is Python 3.
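Since Python 3 is being used, here is a minimal sketch of the direct-connection approach the question describes, assuming the psycopg2 driver is installed on the Pi and the server accepts remote connections; the host, credentials, table and column names are placeholders, not taken from the question.

#!/usr/bin/env python3
# Sketch: push each logged reading straight into a remote PostgreSQL database.
# Host, credentials, table and column names are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="192.168.1.50",   # assumed address of the PostgreSQL server
    dbname="energy",       # assumed database name
    user="pi_logger",
    password="secret",
)

def log_reading(timestamp, watts):
    # Insert one row and commit so it is visible on the server immediately.
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO readings (logged_at, watts) VALUES (%s, %s)",
            (timestamp, watts),
        )
    conn.commit()

Note that this route requires the server to allow connections on port 5432 and a matching entry in pg_hba.conf, which is exactly what the answer below suggests avoiding.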
I would investigate using ssh. You can transfer the CSV and then invoke a script on the target machine, wait for a cron job, or just call pg_ctl at the remote command line. That avoids the need to set up a client on the Pi, open the firewall at port 5432, configure pg_hba.conf, and so on.
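A rough sketch of that ssh-based route in Python, assuming key-based SSH login to the server is configured, that psql is available there, and that a matching readings table already exists; the host name, paths, database and table names are placeholders.

#!/usr/bin/env python3
# Sketch: copy the day's CSV to the database server and import it there,
# so nothing on the Pi needs to talk to PostgreSQL directly.
import subprocess

SERVER = "carl@db-server.local"          # assumed SSH target
LOCAL_CSV = "/home/pi/logs/today.csv"    # placeholder path of the day's file
REMOTE_CSV = "/tmp/energy_import.csv"

# 1. Transfer the CSV with scp.
subprocess.run(["scp", LOCAL_CSV, SERVER + ":" + REMOTE_CSV], check=True)

# 2. Import it on the server with psql's \copy, so the load runs server-side.
import_cmd = 'psql -d energy -c "\\copy readings FROM \'{}\' CSV"'.format(REMOTE_CSV)
subprocess.run(["ssh", SERVER, import_cmd], check=True)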

How to upload an image from a Raspberry Pi to a WAMP server (web server) automatically?

I have a camera attached to a Raspberry Pi which captures an image that is saved locally on the Pi. I want that image to be uploaded to the WAMP web server automatically, so I can get it on my Android phone.
Thanks
Well, in principle you would need an API on the server that you call from the Pi and send the image to. It should receive the image and handle storing it on the server.
On the Pi you will have to run some kind of script that calls that API. You can trigger it whenever a newly captured image becomes available, or e.g. from a cron job, whichever fits better.
But depending on what you actually want to achieve - if it is just transferring the image(s) to a web drive - a "network drive approach" may be easier.
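A minimal sketch of that pattern, assuming the requests library is installed on the Pi and that the server exposes an upload endpoint; the URL, form-field name, and image path below are made up for illustration.

#!/usr/bin/env python3
# Sketch: POST the newest captured image to an upload endpoint on the web server.
# The URL and field name are assumptions; match them to whatever the server-side
# handler (e.g. a small PHP upload script under WAMP) actually expects.
import requests

UPLOAD_URL = "http://192.168.1.10/upload.php"   # assumed WAMP endpoint
IMAGE_PATH = "/home/pi/capture/latest.jpg"      # assumed image location

with open(IMAGE_PATH, "rb") as f:
    response = requests.post(UPLOAD_URL, files={"image": f}, timeout=15)

response.raise_for_status()
print("uploaded, status", response.status_code)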
You can do it in a few steps:
1. Set up the Apache web server.
2. Set up vsftpd or a similar FTP daemon on the same machine. Make sure the FTP upload folder is the Apache document root, for example /var/www/html.
3. Make a small script, e.g. /home/myname/transfer_files.sh, based on curl, which uploads newly created files to the Apache web server via FTP (a Python version is sketched after this list), for example:
curl -T /path/to/image.jpg ftp://myftpsite.com/ --user myname:mypassword
4. Add a line to the crontab (/etc/crontab) so the script is executed every minute:
*/1 * * * * root /home/myname/transfer_files.sh
This is not a perfect solution, but it works and you do not need to do any coding.
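For reference, a rough Python equivalent of such a transfer script, using only the standard-library ftplib and uploading each image just once; the host and credentials repeat the placeholders above, and the directories are made up.

#!/usr/bin/env python3
# Sketch: upload any image in the local capture folder that has not been sent yet.
# Host, credentials, and paths are placeholders for illustration.
import os
from ftplib import FTP

FTP_HOST = "myftpsite.com"
FTP_USER = "myname"
FTP_PASS = "mypassword"
LOCAL_DIR = "/home/pi/capture"
SENT_LIST = "/home/pi/capture/.sent"   # simple record of already-uploaded files

def load_sent():
    if not os.path.exists(SENT_LIST):
        return set()
    with open(SENT_LIST) as f:
        return set(line.strip() for line in f)

def main():
    sent = load_sent()
    with FTP(FTP_HOST) as ftp:
        ftp.login(FTP_USER, FTP_PASS)
        for name in sorted(os.listdir(LOCAL_DIR)):
            if not name.endswith(".jpg") or name in sent:
                continue
            with open(os.path.join(LOCAL_DIR, name), "rb") as f:
                ftp.storbinary("STOR " + name, f)   # lands in the FTP upload folder
            with open(SENT_LIST, "a") as log:
                log.write(name + "\n")

if __name__ == "__main__":
    main()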

Run a batch file on a remote PC *visibly* for the logged-on user

I've got a batch file, dmx2vlc, which plays a random video file through VLC player when called.
It works well locally, but I need this to happen on another machine on the network (it will be ad hoc), and the result (VLC playing the video) must be visible on the remote screen.
I've tried SSH, PowerShell and PsExec, but all of them seem to run the batch file and the player in the session of the command line, even when applying a patch to allow multiple logins.
So even when I do get the batch file to run, it is never visible on screen.
Using TeamViewer and the like is not an option, as I need to be able to call all this programmatically from my DMX program.
I'm not bound to calling the batch file directly; it would be sufficient if I could somehow trigger it to run.
Sadly, latency is a problem here, as we are talking about a lighting (hence DMX) environment.
Any hints would be greatly appreciated!
You can use PsExec if the remote system is XP, with the interactive parameter, provided you state the session to interact with; 0 would probably be the console (the person physically in front of the machine).
This has issues with Windows Vista and newer, as it pops up a prompt asking the user to change their display mode first.
From memory, you could create a scheduled task on the remote system pretty easily though, and as long as it's interactive the user should see it.
Good luck.
Try using the web interface. It is rather easy: VLC includes an HTTP server, and accessing particular URLs from a remote machine gives full control over VLC. Documentation can be found here.
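A small sketch of driving that interface from the controlling machine, assuming VLC on the remote PC was started with its HTTP interface enabled and a password set (e.g. vlc --extraintf http --http-password secret, default port 8080); the host, password, and video path below are placeholders.

#!/usr/bin/env python3
# Sketch: tell a remote VLC instance to play a file via its HTTP interface.
# VLC's web interface uses HTTP Basic auth with an empty username and the
# configured password; host, password, and file path are placeholders.
import requests

VLC_HOST = "http://192.168.1.20:8080"        # assumed address of the remote VLC
VLC_PASSWORD = "secret"
VIDEO = "file:///C:/videos/random_clip.mp4"  # placeholder video path

resp = requests.get(
    VLC_HOST + "/requests/status.xml",
    params={"command": "in_play", "input": VIDEO},
    auth=("", VLC_PASSWORD),
    timeout=5,
)
resp.raise_for_status()

The random-pick logic from the batch file could then live on the controlling machine, with VLC itself left running in the logged-on user's session on the remote PC.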

How to send a video stream to the cloud?

I want to send a live video stream to a server, perform facial recognition on that video, and get the result back to the client program. Where do I get a server? Can I use Windows Azure here? If so, can I also make a Python/C++ server program listen on a particular port?
You haven't talked about the client-side piece. Assuming you're in control of a client app, you could push the video to a Blob, then drop a notification in an Azure queue for a background task to process the uploaded video fragment.
Instead of directly pushing to blobs, you could host a web service that lets you push uploads, and the web service could store the video fragment and then trigger a background processing task.
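A rough sketch of that blob-plus-queue idea using the current Python storage SDKs (azure-storage-blob and azure-storage-queue), which post-date the Web/Worker-role setup described here; the connection string, container, and queue names are placeholders, and both the container and the queue are assumed to exist already.

#!/usr/bin/env python3
# Sketch: upload a video fragment to blob storage, then enqueue a message so a
# background worker knows there is a new fragment to run facial recognition on.
# Connection string, container name, and queue name are placeholders; the
# container and queue are assumed to have been created beforehand.
from azure.storage.blob import BlobServiceClient
from azure.storage.queue import QueueClient

CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
CONTAINER = "video-fragments"
QUEUE = "fragments-to-process"

def upload_fragment(path, blob_name):
    blob_service = BlobServiceClient.from_connection_string(CONN_STR)
    blob_client = blob_service.get_blob_client(container=CONTAINER, blob=blob_name)
    with open(path, "rb") as f:
        blob_client.upload_blob(f, overwrite=True)

    # Tell the background worker which blob to pick up.
    queue = QueueClient.from_connection_string(CONN_STR, QUEUE)
    queue.send_message(blob_name)

if __name__ == "__main__":
    upload_fragment("fragment_0001.mp4", "fragment_0001.mp4")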
Running Python should be very straightforward - just upload the Python executable and any related modules, either with your Windows Azure deployment or in blob storage (then pull them down from blob storage and install them when the VM starts up). As far as port listening goes, you can define up to 25 external-facing ports. You'd then have your Python app listen on the port you defined (either tcp, http, or https).
More info on block and page blobs here. Steve Marx posted this example for installing Python in your Web or Worker role.