Howdy, I am trying to run Matlab remotely on Windows via OpenSSH installed with Cygwin, but launching Matlab on Windows without the GUI seems to be impossible.
If I am logged in locally, I can launch matlab -nodesktop -nodisplay -r script, and Matlab will launch a stripped-down GUI and run the command.
However, this is impossible to do remotely via ssh, as Matlab needs to display the GUI.
Does anyone have any suggestions or work arounds?
Thanks,
Bob
Short story: is your script calling exit()? Are you using "-wait"?
Long story: I think you're fundamentally out of luck if you want to interact with it, but this should work if you just want to batch jobs up. Matlab on Windows is a GUI application, not a console application, and won't interact with character-only remote connectivity. But you can still launch the process. Matlab will actually display the GUI - it will just be in a desktop session on the remote computer that you have no access to. But if you can get it to do your job without further input, this can be made to work, for some value of "work".
Your "-r script" switch is the right direction. But realize that on Windows, Matlab's "-r" behavior is to finish the script and then go back to the GUI, waiting for further input. You need to explicitly include an "exit()" call to get your job to finish, and add try/catches to make sure that exit() gets reached. Also, you should use a "-logfile" switch to capture a copy of all the command window output to a log file so you can see what it's doing (since you can't see the GUI) and have a record of prior runs.
Also, matlab.exe is asynchronous by default. Your ssh call will launch Matlab and return right away unless you add the "-wait" switch. Check the processes on the machine you're sshing to; Matlab may actually be running. Add -wait if you want it to block until finished.
One way to do this is to use -r to call a standard job wrapper script that initializes your libraries and paths, runs the job, and does cleanup and exit. You'll also want to make a .bat wrapper that sets the -logfile switch to point to a file named with the job name, timestamp, and other info (a sketch of that wrapper follows the M-code below). Something like this at the M-code level:
    function run_batch_job(jobname)
        try
            init_my_matlab_library(); % by calling addpath(), javaclasspath(), etc.
            feval(jobname);           % assumes jobname is an M-file on the path
        catch err
            warning('Error occurred while running job %s: %s', jobname, err.message);
        end
        try
            exit();
        catch err
            % Yes, exit() can throw errors
            java.lang.System.exit(1); % scuttle the process hard to make sure the job finishes
        end
        % If your code makes it to here, your job will hang
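At the .bat level, something along these lines could work. This is only a rough sketch: the Matlab install path, log directory, and wrapper file name are assumptions you would adjust for your site.

    @echo off
    rem Rough sketch; paths and names here are assumptions, not a known-good setup.
    rem Usage: run_batch_job.bat <jobname>
    set JOBNAME=%1
    set LOGDIR=C:\matlab_jobs\logs
    if not exist "%LOGDIR%" mkdir "%LOGDIR%"
    rem Build a timestamp free of characters that are illegal in file names
    set STAMP=%DATE:/=-%_%TIME::=-%
    set STAMP=%STAMP: =0%
    rem -wait blocks until Matlab exits; -logfile captures the command window output
    "C:\Program Files\MATLAB\R2014a\bin\matlab.exe" -wait -nosplash -nodesktop ^
        -logfile "%LOGDIR%\%JOBNAME%_%STAMP%.log" ^
        -r "run_batch_job('%JOBNAME%')"

Your ssh call would then invoke the .bat with the job name as its argument, and -wait keeps the call blocked until the job exits.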
I've set up batch job systems using this style in Windows Scheduler, Tidal, and TWS before. I think it should work the same way under ssh or other remote access.
A Matlab batch system on Windows like this is brittle and hard to manage. Matlab on Windows is fundamentally not built to be a headless batch execution system; assumptions about an interactive GUI are pervasive in it and hard to work around. Low-level errors or license errors will pop up modal dialog boxes and hang your job. The Matlab startup sequence seems to have race conditions. You can't set the exit status of MATLAB.exe. There's no way of getting at the Matlab GUI to debug errors the job throws. The log file may be buffered and you lose output near hangs and crashes. And so on.
Seriously consider porting to Linux. Matlab is much more suitable as a batch system there.
If you have the money or spare licenses, you could also use the Matlab Distributed Computing toolbox and server to run code on remote worker nodes. This can work for parallelization or for remote batch jobs.
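If you do go that route, submitting work from M-code is fairly compact. A minimal sketch, assuming a cluster profile named 'MyMJSCluster' has already been configured and that your job lives in an M-file script called my_analysis_script (both names are invented here):

    c = parcluster('MyMJSCluster');      % load a configured cluster profile (name is an assumption)
    j = batch(c, 'my_analysis_script');  % run the M-file script on a remote worker
    wait(j);                             % block until the job finishes
    diary(j);                            % echo the worker's command window output locally
    delete(j);                           % remove the finished job from the cluster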
There are two undocumented hacks that reportedly fix a similar problem - they are not guaranteed to solve your particular problem, but they are worth a try. Both of them depend on modifying the java.opts file:
-Dsun.java2d.pmoffscreen=false
Setting this option fixes a problem of extreme GUI slowness when launching Matlab on a remote Linux/Solaris computer.
-Djava.compiler=NONE
This option disables the Java just-in-time compiler (JITC). Note that it has no effect on the Matlab interpreter JITC. It has a similar effect to running Matlab with the '-nojvm' command-line option. Note that this prevents many of Matlab's GUI capabilities. Unfortunately, in some cases there is no alternative. For example, when running on a remote console or when running pre-2007 Matlab releases on Intel-based Macs. In such cases, using the undocumented '-noawt' command-line option, which enables the JVM yet prevents the Java GUI, is a suggested compromise.
Using PuTTY, use ssh -X remote "matlab"; it should work.
I do know about the double hop issue. My scenario is: I have a script I want to run remotely that calls another script located on a network share that calls a third script located on a second network share in a different domain.
Currently what I am doing is using CredSSP (I've read there can be security issues, but this environment is not public facing) to pass credentials for the 1st network share that has script2. I do not have access to the computer in the second domain, so I cannot set up CredSSP on it. To work around that, inside script2 I am using the "net use" command on the third script's path so that the script can find it. I am then using "Copy-Item" to copy the third script onto the machine running script2 (the remote machine).
Up to this step, everything is working when I run script1. I can see script3 is copied over onto the remote machine. When script3 is called, it should make a web request that sends text to stdout (which I pipe to Out-File in script2). However, whenever I try to run the copy of script3 (located on the remote machine) from script2 (running on the remote machine) it does not seem to do anything. If I run script2 locally on the remote machine then it works fine (file is generated from script3's output).
Any ideas on why this won't work? I've tried running script3 using several variations of Invoke-Expression, Invoke-Command, Start-Process, and even trying to run it with cmd. I'm also having trouble getting output on what exactly is causing the issue (stdout and stderr are often empty when using the different commands). Am I missing some command or tool that might make this easier to troubleshoot? It almost seems like script3 is still running into a double-hop issue despite it only making a web request? And if it were running into that, I would have expected an error to be returned.
There may be a better design for doing what I'm trying to do. I'm fairly new to PowerShell and may be overcomplicating this.
Edit: I rewrote my scripts in Python and got it working.
I have a PowerShell script on the Chef server that needs to run on a remote Windows server. How can I run this PowerShell script from the Chef server on the remote Windows server?
Chef doesn't do anything like this. First, Chef Server can never remotely access servers directly; all it does is store data. Second, Chef doesn't really do "run a thing in a place right now". We offer workstation tools like knife ssh and knife winrm as simplistic wrappers, but they aren't made for anything complex. The Chef-y way to do this would be to make a recipe and run your script using the powershell_script resource.
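As a rough illustration of that last point, a recipe might contain something like the sketch below; the resource name and script path are placeholders, and it assumes the script has already been delivered to the node (for example with a cookbook_file resource, as covered in the other answer).

    powershell_script 'run my script' do
      code <<-'EOH'
        & 'C:\scripts\my_script.ps1'
      EOH
      action :run
    end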
Does it mean Chef is also running on the Windows server?
If yes, why not use PsExec from the Windows PsTools suite?
https://learn.microsoft.com/en-us/sysinternals/downloads/psexec
Here is my understanding of what you are trying to achieve. If I'm wrong then please correct me in a comment and I will update my answer.
You have a powershell script that you need to run on a specific server or set of servers.
It would be convenient to have a central management solution for running this script instead of logging into each server and running it manually.
Ergo, you either need to run this script in many places when a condition isn't met (such as a file being missing), or you need to run this script often, or you need this script to run with a certain timing relative to other processes you have going on.
Without knowing precisely what you're trying to achieve with your script, the best solution I know of is to write a cookbook and do one of the following:
If your script is complex, place it in your cookbook/files folder (assuming the script will be identical on all computers it runs on) or in your cookbook/templates folder (if you will need to inject information into it at write time). You can then write the .ps1 file to the local computer during a Chef converge with the following code snippet. After you write it to disk you will also have to call it with one of the commands in the next bullet.
Monomorphic file:
    cookbook_file '<destination>' do
      source '<filename.ps1>'
      <other options>
    end
Options can be found at https://docs.chef.io/resource_cookbook_file.html
Polymorphic file:
    template '<destination>' do
      source '<template.ps1.erb>'
      variables(<hash of variables and values>)
      <other options>
    end
Options can be found at https://docs.chef.io/resource_template.html
If your script is a simple one-liner, you can instead use powershell_script, powershell_out! or execute. powershell_out! has all the same options and features as the shell_out! command, with the added advantage that your converge will pause until it receives an exit status for the command, if that is desirable. The documentation on using it is a bit spottier, though, so spend time experimenting with it and googling.
https://docs.chef.io/resource_powershell_script.html
https://docs.chef.io/resource_execute.html
Whichever option you end up going with, you will probably want to guard your resource with conditions for when it should not run, such as when a file already exists, a registry key is set, or whatever else your script changes that you can check, for example as sketched below. If you truly want the script to execute on every single converge then you can skip this step, but that is a code smell and I urge you to reconsider your plans.
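A guarded resource might look roughly like this; the flag file and script paths are invented for the example.

    powershell_script 'configure widget service' do
      code <<-'EOH'
        & 'C:\scripts\configure_widget.ps1'
      EOH
      not_if { ::File.exist?('C:\scripts\widget_configured.flag') }
    end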
https://docs.chef.io/resource_common.html#guards
It's important to note that this is not an exhaustive list of how to run a powershell script on your nodes, just a collection of common patterns I've seen.
Hope this helped.
I've got a batch file dmx2vlc which will play a random video file through VLC-Player when called.
It works well locally but I need this to happen on another machine on the network (will be adhoc) and the result (VLC-Player playing the video) must be visible on the remote screen.
I've tried SSH, PowerShell and PsExec, but all of them seem to run the batch file and the player in the command-line session, even when applying a patch to allow multiple logins.
So even IF I get to run the batch file, it is never visible on screen.
Using Teamviewer and the like is no option as I need to be able to call all this programmatically from my dmx program.
I'm not bound to being able to call the batch directly, it would be sufficient for me if I could somehow trigger it to run.
Sadly latency is a problem here as we are talking about a lighting (thus dmx) environment.
Any hints would be greatly appreciated!
You can use PsExec with the interactive parameter if the remote system is XP, as long as you state the session to interact with; session 0 would probably be the console (the person physically in front of the machine). A sketch follows below.
This has issues with Windows Vista and newer, as it pops up a prompt asking the user to change their display mode first.
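The call looks roughly like this; the host name, credentials, and batch file path are placeholders, and the -i session number should be checked against your setup.

    psexec \\mediapc -i 0 -d -u mediapc\player -p secret cmd /c "C:\dmx\dmx2vlc.bat"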
From memory, you could create a scheduled task on the remote system pretty easily, though, and as long as it's marked interactive the user should see it; a sketch of that follows too.
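One way to wire that up, with made-up names, is to pre-create an interactive task on the remote machine and then trigger it on demand:

    rem Sketch only: host, credentials and paths are placeholders.
    rem Create the task once (interactive, runs in the logged-on user's session):
    schtasks /create /s mediapc /u mediapc\admin /p secret /tn "PlayDmxVideo" ^
        /tr "C:\dmx\dmx2vlc.bat" /sc once /st 00:00 /it /ru mediapc\player

    rem Fire it from the dmx program whenever the video should play:
    schtasks /run /s mediapc /u mediapc\admin /p secret /tn "PlayDmxVideo"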
Good luck.
Try using the web interface. It is rather easy: VLC can run an HTTP server, and accessing particular URLs from the remote machine gives full control over VLC. Documentation can be found here
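A rough sketch of the idea, assuming a reasonably recent VLC with the web interface enabled (host name, password, and file path are placeholders):

    rem On the playback machine: start VLC with its web interface enabled
    vlc --extraintf http --http-password dmxsecret --http-host 0.0.0.0

    rem From the controlling machine: queue and play a file (empty user name, password as above)
    curl -u :dmxsecret "http://mediapc:8080/requests/status.xml?command=in_play&input=file:///C:/videos/clip01.mp4"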
Is it possible to set up Matlab to run a specific script in the background when the user is NOT logged in? The script works fine on its own on a Windows Server 2008 machine with Matlab R2014a. It doesn't need a GUI for the script to complete, but I'm guessing that Matlab requires user-specific environments to be set. Is there a place where this can be set ahead of time, maybe?
I have tried "Task Scheduler" and it works just fine, but you have to set the setting to run only when that particular user is logged in or else nothing happens. The problem, of course, is the user session would require continuous monitoring in order to remain logged in (power outage, updates, etc.).
Has anyone dealt with this in the past? We've considered compiling it, but apparently there are certain functions and objects that the script uses (I didn't write it) that don't carry over during compilation.
Any thoughts or suggestions are welcome!
I've done some work for a client where we have an instance of MATLAB running continuously on a server, doing some stuff. The server occasionally fails (power outages, IT dept screw-ups etc), and it needs to be brought back up automatically.
Note that MATLAB does need to be run as a user for licensing reasons, so our MATLAB instance always runs under a designated account, with a license dedicated to running that instance continuously.
We have a Windows batch file to start up a suitable MATLAB instance, that contains a command similar to the following:
CALL matlab.exe -nosplash -nodesktop -sd "myStartupFolder" -r "myMATLABCommand"
We then have a scheduled task set up so that 5 minutes after that account logs in, the batch file runs, and we have Windows set up so that when Windows starts, that account is automatically logged in (I'm no Windows admin, but I think we had to do some weird stuff in order to enable that, such as adding the account to some special domain group, or giving the account special privileges - you may need to research that a little more).
Anyway, that solved the issue for us. If the server goes down and then recovers (perhaps IT bring it back up), the account is automatically logged in, the batch file runs, and the MATLAB instance is brought back up. If we need (rarely) to log in directly under that account without the task running, we have a 5 minute window to stop the scheduled task from running, which is no problem.
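For reference, the task side of that can be created with something like the following; the account, path, and delay are placeholders, and the automatic logon itself is configured separately (e.g. via the Winlogon registry values or netplwiz). Verify the trigger behaviour on your Windows version.

    rem Sketch only: account name and batch file path are placeholders.
    rem Run the startup batch file 5 minutes after the dedicated account logs on:
    schtasks /create /tn "RestartMatlabInstance" /tr "C:\matlab_jobs\start_matlab.bat" ^
        /sc onlogon /delay 0005:00 /ru MYDOMAIN\matlabsvc /it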
Hope that helps!
Unfortunately, and AFAIK, Matlab can only be started without a GUI on Linux (maybe on Mac OS X too?).
~$ cat /tmp/stackoverflow.m
s='stackoverflow';
length(s)
~$ ./R2013a/bin/matlab -nodisplay -nojvm -nodesktop -nosplash -r "run /tmp/stackoverflow.m, exit"
< M A T L A B (R) >
Copyright 1984-2013 The MathWorks, Inc.
R2013a (8.1.0.604) 64-bit (glnxa64)
February 15, 2013
To get started, type one of these: helpwin, helpdesk, or demo.
For product information, visit www.mathworks.com.
ans =
13
~$
However, Matlab itself cannot be invoked via a shebang (#!) line in a Bash script, so it always takes a workaround, for example the wrapper sketched below.
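A minimal sketch of such a wrapper (the Matlab install path is an assumption; pass the .m file to run as the first argument):

    #!/bin/bash
    # Minimal sketch: the Matlab path below is an assumption; adjust for your machine.
    MATLAB="$HOME/R2013a/bin/matlab"
    "$MATLAB" -nodisplay -nojvm -nodesktop -nosplash -r "run('$1'), exit"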
A better solution might be to run your Matlab instance continuously and write a daemon/script that runs your .m script on a schedule, for example.
A much better way is to use the Matlab Coder toolbox (if you have it) and compile a stand-alone binary from your .m file. This binary should be easy to run with Task Scheduler on Windows.
I have a command-line program that listens on a TCP port until the user types Q to exit. It works fine in a local PowerShell window, but when I try to run it on another machine using a PowerShell remote session, it just starts and quits. Is there a way to keep it running?
The remote script runs in a PowerShell instance that never becomes visible, so AFAICT it doesn't even get a console handle with which to read keyboard input.
You can take a look at the SysInternals utility - psexec. From my testing, that utility works for what you are trying to do.
Ensure you have PowerShell 3 or higher, since it adds support for disconnected sessions/background jobs.
Use a Remote Disconnected Session, described on TechNet, for example:
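A rough sketch of that approach (the computer name and program path are placeholders; requires PowerShell 3.0+ on both ends):

    # Start the program in a session that survives the client disconnecting.
    Invoke-Command -ComputerName RemotePC -InDisconnectedSession -SessionName TcpListener -ScriptBlock {
        & 'C:\tools\tcplistener.exe'   # placeholder path to the listening program
    }

    # Later, from this or another machine: reconnect to the session and collect any output so far.
    Get-PSSession -ComputerName RemotePC -Name TcpListener | Receive-PSSession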