Running a Scala program from the command line fails; pasting it into the interpreter succeeds - scala

When I paste a code fragment into the Scala interpreter, it works as expected, but when I attempt to run the same file using
scala ./name-of-file.scala
it prints
<my hostname>: <my hostname>
I am on Fedora 11, and the Scala version I am using is 2.7.7final.

Does running the following command work?
ping `hostname`
If it doesn't, that's most likely your problem.
You see, not only do Scala programs run on the JVM, but the Scala compiler itself runs on the JVM, and the JVM has a pretty steep startup time. So, when running scripts, Scala keeps a copy of the compiler running in the background as a daemon and talks to it through a TCP connection.
Alas, it gets the IP address by resolving the hostname, which means any computer whose hostname does not resolve to a valid IP address will have problems.
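If you want to see the failure from the JVM's side, here is a minimal sketch (the object name is arbitrary) that resolves the local hostname the same way the compilation daemon's connection does:
import java.net.{InetAddress, UnknownHostException}

object HostnameCheck {
  def main(args: Array[String]): Unit = {
    try {
      // InetAddress.getLocalHost looks up the machine's hostname and
      // resolves it to an IP address, which is what the daemon
      // connection needs.
      val addr = InetAddress.getLocalHost
      println(addr.getHostName + " resolves to " + addr.getHostAddress)
    } catch {
      // This is the failure mode described above; adding the hostname
      // to /etc/hosts usually fixes it.
      case e: UnknownHostException =>
        println("hostname resolution failed: " + e)
    }
  }
}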

Related

SSH command from within a Scala program with pure Scala API or Ammonite API

I would like to open an SSH connection and then execute some command on a distant machine from within Scala. Both the Scala API and Scala Ammonite provide ways to execute system commands. However, I am having problems with ssh. How do I run an ssh command, execute a command on the remote machine, and get the result back from within my Scala program?
EDIT1
I found the following in the post Problems using scala to remotely issue commands via ssh:
"ssh user@Main.local 'mkdir Desktop/test'".!
I wonder how this can be done with the Scala Ammonite API; I have not been able to find a way.
So far I have found what I wanted with the Scala API:
http://www.scala-lang.org/api/2.12.1/scala/sys/process/ProcessBuilder.html
e.g.
import scala.sys.process._
"ssh 192.168.0.103 ls IdeaProjects".lineStream
It works like a charm: executed from one of my machines to the other, it worked flawlessly. This requires passphrase-less ssh (key-based authentication) to be set up in advance. I am the same user on both machines; otherwise I would have had to specify the user as well.
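For reference, here is a slightly fuller sketch of the same sys.process approach; the user and host below are placeholders, and building the command as a Seq avoids shell-quoting surprises:
import scala.sys.process._

object RemoteCommand {
  def main(args: Array[String]): Unit = {
    // Hypothetical user and host; replace with your own.
    val user = "me"
    val host = "192.168.0.103"

    // Each argument is a separate Seq element, so no shell quoting is needed.
    val cmd = Seq("ssh", user + "@" + host, "ls", "IdeaProjects")

    // !! runs the process and returns its stdout as a single String;
    // it throws an exception if the exit code is non-zero.
    val output: String = cmd.!!
    println(output)
  }
}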
If someone could complement the answer and explain how to do it with the Ammonite API, that would be great.
You can use Cable to do your SSH tasks:
Cable supports ~/.ssh config as well as the global ssh client config.
Cable also supports jumping across networks and over bastion hosts.
Cable allows composing tasks within a host and across many different hosts.

Restart PlayConsole SBT server

I'm running a Scala Play Framework app using sbt run in the IntelliJ console. However, I exited the server using Ctrl+Z instead of Ctrl+D. Now, when I try sbt run again, I get the following exception:
java.net.BindException: Address already in use
The port is already in use, which means the previous server did not exit. If I try sbt run with a different port, e.g. sbt run 9999 instead of the default 9000, the server starts without any exception.
So, is there any way to restart or end previous session so that I will not get any binding failed exception if I run the project again?
You have another process already bound to the port you are using. You need to kill it or use another port.
You can list the processes that are using the port and then kill them:
use lsof -i:<portnumber> (e.g. lsof -i:9000 for Play's default port)
Then kill the process using that port: kill <PID>
Hope this helps!
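If you want to check programmatically whether the port has actually been released before restarting, here is a minimal sketch (object and method names are arbitrary) that simply tries to bind it:
import java.net.ServerSocket

object PortCheck {
  // Binding succeeds only if no other process holds the port.
  def isFree(port: Int): Boolean =
    try {
      val socket = new ServerSocket(port)
      socket.close()
      true
    } catch {
      case _: java.io.IOException => false
    }

  def main(args: Array[String]): Unit =
    println("port 9000 free: " + isFree(9000))
}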

Apache Spark master fails to start

I tried to run C:\Spark\spark-1.6.1-bin-hadoop2.6\sbin>start-master.sh, but the following error appeared.
I also noticed that there is a warning while running bin/spark-shell:
16/04/24 23:14:41 WARN : Your hostname, Pavilion resolves to a loopback/non-reachable address: fe80:0:0:0:0:5efe:c0a8:867%net14, but we couldn't find any external IP address!
http://localhost:8080 is also unavailable.
Can anybody please tell me where the mistake is, or did I miss any settings that are necessary to run the master properly?
The problem is that this script is not designed to be executed on a Windows machine. Please refer to the official Spark manual:
Note: The launch scripts do not currently support Windows
The rule of thumb is that only scripts ending in .cmd will run on Windows, while scripts ending in .sh are designed for Linux and Mac OS. And while it should be possible to manually start a Spark master on Windows, it's probably better to just run in local[*] mode unless you are creating a cluster of Windows machines; local[*] mode already fully utilizes the power of the local machine.
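For illustration, a minimal sketch of running Spark in local[*] mode from code, using the Spark 1.6-era API from the question (the app name and the computation are arbitrary):
import org.apache.spark.{SparkConf, SparkContext}

object LocalSparkApp {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark inside this JVM with one worker thread per
    // core, so no master or worker daemons need to be started.
    val conf = new SparkConf().setAppName("local-test").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Quick smoke test: sum the numbers 1..100 in parallel.
    println(sc.parallelize(1 to 100).sum())

    sc.stop()
  }
}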

PsExec fails to open the stdin, stdout and stderr named pipes

I am using PsExec to access two servers from my laptop. Access to one of the servers (running Windows 2008R2 Enterprise) works fine. Access to the other (Windows 2008R2 Standard) fails with an error (message below). My laptop has Windows 8.1.
Literature
I've verified that my systems match the requirements, as specified here
Investigating an earlier "access denied" error, I followed the steps outlined here
Commands
Steps taken:
net use \\<servername>\Admin$ /user:me * succeeds
dir \\<servername>\Admin$ lists the remote directory, as expected
PsExec.exe \\<servername> -e cmd is very slow, and then fails with the message:
Error establishing communication with PsExec service on <servername>:
The system cannot find the file specified.
To the best of my knowledge (though I'm no Windows expert), I am in the administrator group on both the client and server machine.
Analysis 1
After executing the command, psexesvc.exe is still running on the remote machine, so the command at least partially succeeds
Before each invocation, I stop and delete the psexesvc service, and remove the executable manually
Analysis 2
I used Wireshark to observe the communication between client and server. The following seems to happen:
File PSEXESVC.EXE gets successfully created and written
A "Bind" and a "Map" of SVCCTL
Delay of about 20 seconds (?)
Several SVCCTL calls (presumably resulting in the PSEXESVC service being created and started)
Named pipe PSEXESVC is opened and successfully written to
Named pipes for stdin, stdout and stderr cannot be found (STATUS_OBJECT_NAME_NOT_FOUND)
Trying to eliminate a permissions problem, I tried to verify that the named pipes actually don't exist, by running pipelist.exe (found here) on the server side. I did not actually see the pipes, but if they existed only for a very short time, I might not have run pipelist.exe frequently enough.
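To catch such short-lived pipes, one option is to poll from code. Here is a sketch under the assumption that listing the special \\.\pipe\ directory via java.io.File works on your Windows version (a known but undocumented trick, so treat it as an assumption):
import java.io.File

object PipeWatch {
  def main(args: Array[String]): Unit = {
    val pipeDir = new File("\\\\.\\pipe\\")
    // Poll rapidly so short-lived pipes are less likely to be missed;
    // stop with Ctrl+C.
    while (true) {
      val pipes = Option(pipeDir.listFiles()).getOrElse(Array.empty[File])
      pipes.filter(_.getName.toUpperCase.contains("PSEXESVC"))
           .foreach(p => println(System.currentTimeMillis() + " " + p.getName))
      Thread.sleep(100)
    }
  }
}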
Version numbers
PsExec version 2.11

Scala open ports

So I have been running a few Scala programs on Ubuntu, and I have noticed that whenever I run a program, for some reason Java starts listening on some random port and won't stop listening.
I am running Scala version 2.9.2 on Ubuntu 14.10.
Is this meant to happen and does it happen for anyone else?
To check open ports I am using
sudo netstat -anltp
lsof -i
Thanks