Jelastic communication between nodes - Perl

I have a Perl program that dynamically generates images based on a given width. I need to create a web service that takes image dimensions from the client, passes them to the Perl program to create the image, and then sends the image back to the client.
On the Jelastic cloud, I created 3 nodes:
Node 1: Tomcat (contains the Java code).
Node 2: MySQL (contains the database).
Node 3: CentOS VPS (contains the Perl code).
My questions:
Am I doing the right thing? If not, what is the best way to structure my program?
How can I call the Perl code (in node 3) from the Java service (in node 1), and return the generated image back to the client?

It sounds like a reasonable design. You would write something like this:
import java.lang.Runtime;

int width = 99;
try {
    Runtime runt = Runtime.getRuntime();
    Process proc = runt.exec(new String[] {
        "/usr/bin/perl", "/path/to/myperl.pl", Integer.toString(width)
    });
    proc.waitFor();
}
catch (Exception ioe) {
    ioe.printStackTrace();
}
Of course you'll have to adjust /usr/bin/perl to wherever your own perl executable really is, or you could invoke the shell to get it to search the PATH by using
runt.exec(new String[] { "/bin/bash", "-c", "perl /path/to/myperl.pl " + width });
As for how to get the image back to the client, you don't say much about how your Perl program works, but either you tell it where to write the file, or it decides for itself and tells you where it has put it afterwards.
If it's the former, then you presumably pass the path on the command line, so you just have to extend the call to runt.exec above to pass another parameter.
If it's the latter, then presumably the program prints to STDOUT where it has put the new file, and you need to read that stream from your Java code to collect the information. That would look like this in place of the proc.waitFor() call:
import java.io.*;

BufferedReader inp = new BufferedReader(
    new InputStreamReader(proc.getInputStream())
);
String line;
while ( ( line = inp.readLine() ) != null ) {
    // Process the output of the Perl code to get the file location
}
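If you then need to send the image back to the client, a minimal sketch (assuming the Servlet API on the Tomcat node, that the Perl script produces a PNG, and that imagePath was read from the loop above) might look like this:
import java.io.*;
import java.nio.file.*;
import javax.servlet.http.HttpServletResponse;

void sendImage(HttpServletResponse response, String imagePath) throws IOException {
    response.setContentType("image/png"); // adjust to whatever format the Perl script produces
    try (OutputStream out = response.getOutputStream()) {
        // copy the generated file's bytes straight into the HTTP response
        Files.copy(Paths.get(imagePath), out);
    }
}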


Stop huge error output from testing-library

I love testing-library, have used it a lot in a React project, and I'm trying to use it in an Angular project now - but I've always struggled with the enormous error output, including the HTML text of the render. Not only is this not usually helpful (I couldn't find an element, here's the HTML where it isn't); but it gets truncated, often before the interesting line if you're running in debug mode.
I simply added it as a library alongside the standard Angular Karma+Jasmine setup.
I'm sure you could say the components I'm testing are too large if the HTML output causes my console window to spool for ages, but I have a lot of integration tests in Protractor, and they are SO SLOW :(.
I would say the best solution would be to use the configure method and pass a custom function for getElementError which does what you want.
You can read about configuration here: https://testing-library.com/docs/dom-testing-library/api-configuration
An example of this might look like:
configure({
  getElementError: (message: string, container) => {
    const error = new Error(message);
    error.name = 'TestingLibraryElementError';
    error.stack = null;
    return error;
  },
});
You can then put this in any single test file or use Jest's setupFiles or setupFilesAfterEnv config options to have it run globally.
I am assuming you are running Jest with RTL in your project.
I personally wouldn't turn it off as it's there to help us, but everyone has their own way, so if you have your reasons, fair enough.
1. If you want to disable errors for a specific test, you can mock the console.error.
it('disable error example', () => {
  const errorObject = console.error; // store the state of the object
  console.error = jest.fn(); // mock the object
  // code
  // assertion (expect)
  console.error = errorObject; // assign it back so you can use it in the next test
});
2. If you want to silence it for all the tests, you could use the jest --silent CLI option. Check the docs.
The above might even disable the DOM printing that is done by RTL; I am not sure, as I haven't tried it, but if you look at the docs I linked, it says
"Prevent tests from printing messages through the console."
If the above doesn't work, you almost certainly have everything disabled except the DOM output. In that case you might look into react-testing-library's source code and find out what is used for those print statements. Is it a console.log? Is it a console.warn? Once you know, just mock it out like option 1 above.
UPDATE
After some digging, I found out that all testing-library DOM printing is built on prettyDOM();
While prettyDOM() can't be disabled, you can limit the number of printed lines to 0, and that would just give you the error message and three dots (...) below the message.
Here is an example printout I messed around with:
TestingLibraryElementError: Unable to find an element with the text: Hello ther. This could be because the text is broken up by multiple elements. In this case, you can provide a function for your text matcher to make your matcher more flexible.
...
All you need to do is to pass in an environment variable before executing your test suite, so for example with an npm script it would look like:
DEBUG_PRINT_LIMIT=0 npm run test
Here is the doc
UPDATE 2:
As per the OP's feature request on GitHub, this can also be achieved without injecting a global variable to limit the prettyDOM line output (in case it's used elsewhere). The getElementError config option needs to be changed:
dom-testing-library/src/config.js
// called when getBy* queries fail. (message, container) => Error
getElementError(message, container) {
  const error = new Error(
    [message, prettyDOM(container)].filter(Boolean).join('\n\n'),
  )
  error.name = 'TestingLibraryElementError'
  return error
},
The callstack can also be removed
You can change how the message is built by setting the DOM Testing Library message-building function with configure. In my Angular project I added this to test.js:
configure({
  getElementError: (message: string, container) => {
    const error = new Error(message);
    error.name = 'TestingLibraryElementError';
    error.stack = null;
    return error;
  },
});
This was answered here: https://github.com/testing-library/dom-testing-library/issues/773 by https://github.com/wyze.

How to compile documents and run Jshop2 in Eclipse?

I am a student from China who has just begun to study SHOP2.
My teacher told me to run JSHOP2 in Eclipse. I can now run the original zenotravel problem and generate the GUI and plans. Likewise, I want to feed other domains and problems to JSHOP2 and produce plans.
But the problem is that I don't know how to compile them. My teacher only asked me to run the main function in InternalDomain, but it doesn't succeed. Following is the original code:
public static void main(String[] args) throws Exception
{
//compile();
// compile(args);
//-- run the planning algorithm
run(args);
}
This code can run zenotravel. Then I put the domain and problem files, named pfile1 and tdepots respectively, into the SHOP2 folder and changed the code to:
{
compile(domaintdepots);
// compile(args);
//-- run the planning algorithm
run(args);
}
It warns "domainpdfiles cannot be resolved to a variable".
Or
//--compile();
compile(args);
//-- run the planning algorithm
//run(args);
It turns out:
"Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 0
at JSHOP2.InternalDomain.compile(InternalDomain.java:748)
at JSHOP2.InternalDomain.main(InternalDomain.java:720)"
Line 720 is in the main function above, and line 748 is in the compile function:
public static void compile(String[] args) throws Exception
{
    //-- The number of solution plans to be returned.
    int planNo = -1;

    //-- Handle the number of solution plans the user wants to be returned.
    if (args.length == 2 || args[0].substring(0, 2).equals("-r")) {
        if (args[0].equals("-r"))
            planNo = 1;
        else if (args[0].equals("-ra"))
            planNo = Integer.MAX_VALUE;
        else try {
            planNo = Integer.parseInt(args[0].substring(2));
        } catch (NumberFormatException e) {
        }
    }
Finally, following a friend's advice, I put the two PDDL files into the src folder and ran "java Jshop2.InternalDomain domaintdepots" at the CMD prompt, but an error appeared: "the main class Interdomain can't be found or loaded". But I have set the classpath accurately and the zenotravel planning can run. So how and where can I use the command?
And what should be written inside the brackets of "compile()" in Eclipse?
I am also not familiar with Java, so concrete instructions would be appreciated. Thanks a lot.
Please describe what you are trying to build, what it is supposed to do, and what the expected end result is.
If you do have a valid PDDL domain and problem file, you could try to load them into the online http://editor.planning.domains/ editor using the File > Load menu. Then press the Solve button and confirm which of the files is the domain and which is the problem. If the PDDL model is valid (and the underlying solver can handle the requirements), you will get a plan back.
If you are trying to build a software solution that needs a PDDL-based planning engine as one of its components, perhaps you could use one of the available implementations: https://nergmada.github.io/pddl-reference/guide/whatisplanner.html#list-of-planners
If you are trying to build your own planning engine in Java using the Eclipse IDE, you probably need a Java-based PDDL parser. Here is a tutorial on how to use pddl4j for that purpose:
https://github.com/pellierd/pddl4j/wiki/A-tutorial-to-develop-your-own-planner
If you need to use JSHOP2 in particular, it looks from its documentation (http://www.cs.umd.edu/projects/shop/description.html) that you indeed need to compile the domain and problem PDDL into Java code using the following commands:
java JSHOP2.InternalDomain domainFileName
java JSHOP2.InternalDomain -r problemFileName
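In Eclipse, the equivalent is to supply the same file name as a program argument in your Run Configuration. As a rough sketch based on the main() you posted (the file names are placeholders), calling the methods with an explicit args array amounts to the same thing:
public static void main(String[] args) throws Exception
{
    //-- equivalent of: java JSHOP2.InternalDomain domainFileName
    compile(new String[] { "domainFileName" });

    //-- equivalent of: java JSHOP2.InternalDomain -r problemFileName
    compile(new String[] { "-r", "problemFileName" });
}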
Edited on June 19th
Java package names (e.g. JSHOP2) and class names (InternalDomain) are case sensitive, so make sure you type them as per the documentation. That is probably why you are getting the "main class not found error".
It is difficult to say what line numbers 748 and 720 correspond to exactly, because in the GitHub repo https://github.com/mas-group/jshop2/blob/master/src/JSHOP2/InternalDomain.java the code is different from yours. Can you indicate in your question which lines those are exactly?
The makefile shows how to execute an out-of-the-box example in the distribution:
cd examples\blocks
java JSHOP2.InternalDomain blocks
java JSHOP2.InternalDomain -r problem300
Does that work for you?

Changing working directory in Scala [duplicate]

How can I change the current working directory from within a Java program? Everything I've been able to find about the issue claims that you simply can't do it, but I can't believe that that's really the case.
I have a piece of code that opens a file using a hard-coded relative file path from the directory it's normally started in, and I just want to be able to use that code from within a different Java program without having to start it from within a particular directory. It seems like you should just be able to call System.setProperty( "user.dir", "/path/to/dir" ), but as far as I can figure out, calling that line just silently fails and does nothing.
I would understand if Java didn't allow you to do this, if it weren't for the fact that it allows you to get the current working directory, and even allows you to open files using relative file paths....
There is no reliable way to do this in pure Java. Setting the user.dir property via System.setProperty() or java -Duser.dir=... does seem to affect subsequent creations of Files, but not e.g. FileOutputStreams.
The File(String parent, String child) constructor can help if you build up your directory path separately from your file path, allowing easier swapping.
An alternative is to set up a script to run Java from a different directory, or use JNI native code as suggested below.
The relevant OpenJDK bug was closed in 2008 as "will not fix".
If you run your legacy program with ProcessBuilder, you will be able to specify its working directory.
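A minimal sketch of that (the command and directory are placeholders):
import java.io.File;

ProcessBuilder pb = new ProcessBuilder("java", "-jar", "legacy-app.jar");
pb.directory(new File("/path/to/dir")); // working directory of the child process only
pb.inheritIO();                         // forward the child's stdout/stderr to this process
Process p = pb.start();
p.waitFor();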
There is a way to do this using the system property "user.dir". The key part to understand is that getAbsoluteFile() must be called (as shown below) or else relative paths will be resolved against the default "user.dir" value.
import java.io.*;

public class FileUtils
{
    public static boolean setCurrentDirectory(String directory_name)
    {
        boolean result = false;  // Boolean indicating whether directory was set
        File directory;          // Desired current working directory

        directory = new File(directory_name).getAbsoluteFile();
        if (directory.exists() || directory.mkdirs())
        {
            result = (System.setProperty("user.dir", directory.getAbsolutePath()) != null);
        }

        return result;
    }

    public static PrintWriter openOutputFile(String file_name)
    {
        PrintWriter output = null;  // File to open for writing

        try
        {
            output = new PrintWriter(new File(file_name).getAbsoluteFile());
        }
        catch (Exception exception) {}

        return output;
    }

    public static void main(String[] args) throws Exception
    {
        FileUtils.openOutputFile("DefaultDirectoryFile.txt");
        FileUtils.setCurrentDirectory("NewCurrentDirectory");
        FileUtils.openOutputFile("CurrentDirectoryFile.txt");
    }
}
It is possible to change the PWD using JNA/JNI to make calls to libc. The JRuby guys have a handy Java library for making POSIX calls called jnr-posix. Here's the Maven info.
As mentioned, you can't change the CWD of the JVM, but if you launch another process using Runtime.exec() you can use the overloaded method that lets you specify the working directory. This is not really for running your Java program in another directory, but in the many cases when you need to launch another program, like a Perl script for example, you can specify the working directory of that script while leaving the working directory of the JVM unchanged.
See Runtime.exec javadocs
Specifically,
public Process exec(String[] cmdarray, String[] envp, File dir) throws IOException
where dir is the working directory to run the subprocess in
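For example, a sketch of launching a Perl script with its own working directory (the paths are placeholders):
import java.io.File;

Process proc = Runtime.getRuntime().exec(
    new String[] { "perl", "script.pl", "arg1" }, // command and its arguments
    null,                                         // null = inherit the parent's environment
    new File("/path/to/workdir"));                // working directory for the child only
proc.waitFor();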
If I understand correctly, a Java program starts with a copy of the current environment variables. Any changes via System.setProperty(String, String) are modifying the copy, not the original environment variables. Not that this provides a thorough reason as to why Sun chose this behavior, but perhaps it sheds a little light...
The working directory is an operating system feature (set when the process starts).
Why don't you just pass your own system property (-Dsomeprop=/my/path) and use that in your code as the parent of your File:
File f = new File(System.getProperty("someprop"), myFilename);
The smarter/easier thing to do here is to change your code so that, instead of opening the file assuming it exists in the current working directory (I assume you are doing something like new File("blah.txt")), you build the path to the file yourself.
Let the user pass in the base directory, read it from a config file, fall back to user.dir if the other properties can't be found, etc. But it's a whole lot easier to improve the logic in your program than it is to change how environment variables work.
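For example, a minimal sketch (the property name base.dir is made up for illustration):
import java.io.File;

String baseDir = System.getProperty("base.dir", System.getProperty("user.dir"));
File f = new File(baseDir, "blah.txt");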
I have tried to invoke
String oldDir = System.setProperty("user.dir", currdir.getAbsolutePath());
It seems to work. But
File myFile = new File("localpath.ext");
InputStream openit = new FileInputStream(myFile);
throws a FileNotFoundException though
myFile.getAbsolutePath()
shows the correct path.
I have read this. I think the problem is:
Java knows the current directory with the new setting, but the file handling is done by the operating system, which unfortunately does not know the newly set current directory.
The solution may be:
File myFile = new File(System.getProperty("user.dir"), "localpath.ext");
It creates the File object as an absolute one, using the current directory known to the JVM. But that code has to exist in the class being used, so it requires changing reused code.
-- JcHartmut
You can use
new File("relative/path").getAbsoluteFile()
after
System.setProperty("user.dir", "/some/directory")
System.setProperty("user.dir", "C:/OtherProject");
File file = new File("data/data.csv").getAbsoluteFile();
System.out.println(file.getPath());
Will print
C:\OtherProject\data\data.csv
You can change the process's actual working directory using JNI or JNA.
With JNI, you can use native functions to set the directory. The POSIX method is chdir(). On Windows, you can use SetCurrentDirectory().
With JNA, you can wrap the native functions in Java binders.
For Windows:
private static interface MyKernel32 extends Library {
    public MyKernel32 INSTANCE = (MyKernel32) Native.loadLibrary("Kernel32", MyKernel32.class);

    /** BOOL SetCurrentDirectory( LPCTSTR lpPathName ); */
    int SetCurrentDirectoryW(char[] pathName);
}
For POSIX systems:
private interface MyCLibrary extends Library {
    MyCLibrary INSTANCE = (MyCLibrary) Native.loadLibrary("c", MyCLibrary.class);

    /** int chdir(const char *path); */
    int chdir( String path );
}
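Usage would then look roughly like this (a sketch; error handling omitted, and Native.toCharArray produces the NUL-terminated wide-character buffer the Windows call expects):
// POSIX: chdir returns 0 on success
int rc = MyCLibrary.INSTANCE.chdir("/some/directory");

// Windows: SetCurrentDirectoryW returns non-zero on success
int ok = MyKernel32.INSTANCE.SetCurrentDirectoryW(Native.toCharArray("C:\\some\\directory"));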
Another possible answer may depend on the reason you are opening the file. Is it a property file or a file that has some configuration related to your application?
If this is the case, you may consider loading the file through the classpath loader; this way you can load any file Java has access to.
If you run your commands in a shell, you can write something like "java -cp" and add any directories you want, separated by ":". If Java doesn't find something in one directory, it will try to find it in the other directories; that is what I do.
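For example, a minimal sketch of loading a resource from the classpath (the class and resource names are placeholders):
import java.io.InputStream;

InputStream in = MyClass.class.getClassLoader().getResourceAsStream("config.properties");
if (in == null) {
    // the resource was not found on the classpath
}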
Use FileSystemView
private FileSystemView fileSystemView;

fileSystemView = FileSystemView.getFileSystemView();
currentDirectory = new File(".");

// listing currentDirectory
File[] filesAndDirs = fileSystemView.getFiles(currentDirectory, false);
fileList = new ArrayList<File>();
dirList = new ArrayList<File>();
for (File file : filesAndDirs) {
    if (file.isDirectory())
        dirList.add(file);
    else
        fileList.add(file);
}
Collections.sort(dirList);
if (!fileSystemView.isFileSystemRoot(currentDirectory))
    dirList.add(0, new File(".."));
Collections.sort(fileList);

// change
currentDirectory = fileSystemView.getParentDirectory(currentDirectory);

Is it possible to use Log4perl to create per user log files?

I have a Mojolicious web app that uses Log4perl for logging. It is a multi-user application, and sometimes it is difficult to follow the various threads in the log file when more than one user is accessing the application. What I would like to do is have each user's activity (the population is under 25 users) logged to a separate file, e.g. ./log/userX.log, ./log/userY.log, etc.
I thought about using something like this in the config file:
log4perl.appender.MAIN.filename=sub { return get_user_filename(); }
but the logger is defined in the Mojolicious startup subroutine and the user isn't known until request time.
Another idea that seems more promising is to write a bridge route that creates an appender for the user and then assigns it to the logger. I could then cache the appender for later reuse (or destroy and recreate it).
I'll be playing with the second option, but if anyone has tried to do this before and wants to share their wisdom, I would appreciate it.
-- Update --
So in my bridge route I'm doing the following:
my $user = $self->req->headers->header('authuser'); # from apache
my $appender = Log::Log4perl::Appender->new(
'Log::Log4perl::Appender::File',
name => $user . "_file_appender",
filename => "/tmp/$user.log",
mode => "append",
);
$appender->layout($layout); # previously defined
$appender->level($loglevel); # again previously defined and omitted for brevity
Log::Log4perl::get_logger($user)->add_appender($appender);
$self->app->log(Log::Log4perl::get_logger($user));
I'm getting the following error, however:
Can't locate object method File:() in Log::Log4perl::Appender at /usr/local/share/perl/5.14.2/Log/Log4perl/Appender.pm line 282, <DATA> line 747.
/tmp/user.log is being created, though (zero length). This is the latest CPAN install of Log::Log4perl. Any ideas?
I liked Richard's comment and already wrote this up before you said you didn't want to go the DB route. So I include it for others. Aside: I think DB logging is pretty heavy handed sometimes and I personally would typically choose files too.
Somewhere at the top of the request/dispatch cycle (Catalyst example):
Log::Log4perl::MDC->put( "user", $ctx->user_exists ? $ctx->user->id : 0 );
Then in the Log4perl config:
log4perl.appender.toDBI = Log::Log4perl::Appender::DBI
log4perl.appender.toDBI.Threshold = INFO
log4perl.appender.toDBI.layout = Log::Log4perl::Layout::NoopLayout
log4perl.appender.toDBI.datasource = sub { "DBI:mysql:" . db_name_function() }
log4perl.appender.toDBI.attrs.f_encoding = utf8
log4perl.appender.toDBI.username = db_username
log4perl.appender.toDBI.password = s3cr37
log4perl.appender.toDBI.sql = INSERT INTO toDBI \
( user, file, line, message ) \
VALUES ( ?, ?, ?, ? )
log4perl.appender.toDBI.usePreparedStmt = 1
log4perl.appender.toDBI.params.1 = %X{user}
log4perl.appender.toDBI.params.2 = %F
log4perl.appender.toDBI.params.3 = %L
log4perl.appender.toDBI.params.4 = %m
As far as files go, I think it might be possible, but it won't be much fun, seems likely to introduce bugs, and is probably overkill. It is trivial to add something like user:{userid} to your logging and then grep/ack the log file for it to get exactly that user's request/log thread.

install4j: how can I pass command line arguments to a Windows service

I've created a Windows service using install4j and everything works, but now I need to pass command line arguments to the service. I know I can configure them at service creation time in the new service wizard, but I was hoping to either pass the arguments to the register service command, i.e.:
myservice.exe --install --arg arg1=val1 --arg arg1=val2 "My Service Name1"
or put them in the .vmoptions file, like:
-Xmx256m
arg1=val1
arg2=val2
It seems like the only way to do this is to modify my code to pick up the service name via exe4j.launchName and then load some other file or environment variables that hold the necessary configuration for that particular service. I've used other service creation tools for Java in the past, and they all had straightforward support for command line arguments registered by the user.
I know you asked this back in January, but did you ever figure this out?
I don't know where you're sourcing val1, val2 etc from. Are they entered by the user into fields in a form in the installation process? Assuming they are, then this is a similar problem to one I faced a while back.
My approach for this was to have a Configurable Form with the necessary fields (as Text Field objects), and obviously have variables assigned to the values of the text fields (under the 'User Input/Variable Name' category of the text field).
Later in the installation process I had a Display Progress screen with a Run Script action attached to it with some java to achieve what I wanted to do.
There are two 'gotchas' when optionally setting variables in install4j this way. Firstly, the variable HAS to be set no matter what, even if it's just to the empty string. So, if the user leaves a field blank (i.e. they don't want to pass that argument into the service), you'll still need to provide an empty string to the Run executable or Launch Service task (more on that in a moment). Secondly, arguments can't have spaces: every space-separated argument has to be on its own line.
With that in mind, here's a Run script code snippet that might achieve what you want:
final String[] argumentNames = {"arg1", "arg2", "arg3"};

// For each argument this method creates two variables. For example for arg1 it creates
// arg1ArgumentIdentifierOptional and arg1ArgumentAssignmentOptional.
// If the value of the variable set from the previous form (in this case, arg1) is not empty, then it will
// set 'arg1ArgumentIdentifierOptional' to '--arg', and 'arg1ArgumentAssignmentOptional' to the string arg1=val1
// (where val1 was the value the user entered in the form for the variable).
// Otherwise, both arg1ArgumentIdentifierOptional and arg1ArgumentAssignmentOptional will be set to empty.
//
// This allows the installer to pass both parameters in a later Run executable task without worrying about
// whether they're set or not.
for (String argumentName : argumentNames) {
    String argumentValue = context.getVariable(argumentName) == null ? null : context.getVariable(argumentName) + "";
    boolean valueNonEmpty = (argumentValue != null && argumentValue.length() > 0);

    context.setVariable(
        argumentName + "ArgumentIdentifierOptional",
        valueNonEmpty ? "--arg" : ""
    );
    context.setVariable(
        argumentName + "ArgumentAssignmentOptional",
        valueNonEmpty ? argumentName + "=" + argumentValue : ""
    );
}
return true;
The final step is to launch the service or executable. I'm not too sure how services work, but with the executable, you create the task then edit the 'Arguments' field, giving it a line-separated list of values.
So in your case, it might look like this:
--install
${installer:arg1ArgumentIdentifierOptional}
${installer:arg1ArgumentAssignmentOptional}
${installer:arg2ArgumentIdentifierOptional}
${installer:arg2ArgumentAssignmentOptional}
${installer:arg3ArgumentIdentifierOptional}
${installer:arg3ArgumentAssignmentOptional}
"My Service Name1"
And that's it. If anyone else knows how to do this better feel free to improve on this method (this is for install4j 4.2.8, btw).