I've submitted my job with the following command:
bsub -e error.log -o output.log ./myScript.sh
I have a question: why are the output and error logs only available once the job has ended?
Thanks
LSF doesn't stream the output back to the submission host. If the submission host and the execution host share a file system, and JOB_SPOOL_DIR is on that shared file system (the spool directory is $HOME/.lsbatch by default), then you should see the stdout and stderr there. After the job finishes, the files there are copied back to the location specified by bsub.
Check bparams -a | grep JOB_SPOOL_DIR to see whether the admin has changed the location of the spool directory. With or without the -o/-e options, while the job is running its stdout/stderr is captured in the job's spool directory. When the job finishes, the stdout/stderr is copied to the filenames specified by bsub -o/-e. The files in the spool directory are named $JOB_SPOOL_DIR/<jobsubmittime>.<jobid>.out and $JOB_SPOOL_DIR/<jobsubmittime>.<jobid>.err.
[user1@beta ~]$ cat log.sh
LINE=1
while :
do
echo "line $LINE"
LINE=$((LINE+1))
sleep 1
done
[user1@beta ~]$ bsub -o output.log -e error.log ./log.sh
Job <930> is submitted to default queue <normal>.
[user1@beta ~]$ tail -f .lsbatch/*.930.out
line 1
line 2
line 3
...
According to the LSF documentation, the behaviour is configurable:
If LSB_STDOUT_DIRECT is not set and you use the bsub -o option, the standard output of a job is written to a temporary file and copied to the file you specify after the job finishes.
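LSB_STDOUT_DIRECT lives in lsf.conf, so this is a cluster-wide change only the admin can make (and it generally requires the LSF daemons to be reconfigured before it takes effect). A sketch of what that setting would look like:

LSB_STDOUT_DIRECT=Y

With that set, bsub -o output.log writes directly to output.log as the job runs, so you can tail -f output.log in the submission directory instead of tailing the spool file.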
Related
I have tried to submit the script below to the HPC cluster:
#!/bin/bash
#PBS -N bwa_mem_tumor
#PBS -q batch
#PBS -l walltime=02:00:00
#PBS -l nodes=2:ppn=2
#PBS -j oe
sample=x
ref=absolute/path/GRCh38.p13.genome.fa
fwd=absolutepath/forward_read.fq.gz
rev=absolutepath/reverse_read.fq.gz
module load bio/samtools/1.9
bwa mem $ref $fwd $rev > $sample.tumor.sam && samtools view -S $sample.tumor.sam -b > $sample.tumor.bam && samtools sort $sample.tumor.bam > $sample.tumor.sorted.bam
However, the only output I get is $sample.tumor.sam, and the log file says:
Lmod has detected the following error: The following module(s) are unknown:
"bio/samtools/1.9"
Please check the spelling or version number. Also try "module spider ..."
It is also possible your cache file is out-of-date; it may help to try:
$ module --ignore-cache load "bio/samtools/1.9"
Also make sure that all modulefiles written in TCL start with the string
#%Module
However, when I run module avail, it shows that bio/samtools/1.9 is on the list.
Also, when I use module --ignore-cache load "bio/samtools/1.9", the result is the same.
If I try to continue working with the SAM file and run the command manually:
samtools view -b RS0107.tumor.sam > RS0107.tumor.bam
it shows:
[W::sam_read1] Parse error at line 200943
[main_samview] truncated file.
What's possibly wrong with the samtools module, or with the script?
I have a Perl script configured to run periodically via the Windows Task Scheduler.
Action: Start a program
Program: C:\Perl64\bin\perl.exe
Add arguments: script.pl config.json > output.txt 2>&1
or: script.pl config.json 2>&1 > output.txt
Start in: c:\path\to\scriptPL\
The program runs, but it receives either > or 2>&1 in $ARGV[1] instead of having its output redirected. When I run it from the command prompt, the redirection works.
What am I missing?
Output redirection may or may not work with the Task Scheduler. The workaround is to run your desired command (including output redirection) inside a batch file, and to call the batch file from Task Scheduler.
script.bat
----------
C:\Perl64\bin\perl.exe script.pl config.json > output.txt 2>&1
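Alternatively, since the redirection operators are interpreted by cmd.exe rather than by the Task Scheduler, it should also work to schedule cmd.exe itself and pass the full command line after /c (a sketch of the same task, not part of the original answer):

Program: C:\Windows\System32\cmd.exe
Add arguments: /c C:\Perl64\bin\perl.exe script.pl config.json > output.txt 2>&1
Start in: c:\path\to\scriptPL\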
Imagine I have an LSF file as follows:
#!/bin/sh
#BSUB -J X
#BSUB -o X.out
#BSUB -e X.err
...
Once it is run, the output appears in the current folder.
Now imagine I am in
~/code
I need the files to appear in
../cluster/
basically, go up one folder and from there into the folder cluster.
How should I do this within the LSF file?
You can put any relative or absolute path in #BSUB -[eo] <file>, e.g. #BSUB -e ../cluster/X.err. If you use a relative path, it's relative to the job's CWD. By default the job CWD is the job submission directory, but it can be changed by a number of different parameters. bjobs -l <jobid> shows the actual CWD.
What happens is that while the job is running, the stdout and stderr go to files in the job's spool directory (JOB_SPOOL_DIR, which defaults to $HOME/.lsbatch). After the job finishes, the contents of those files are copied to the pathnames specified in -[eo]. The copying is done on the execution host.
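For the layout described in the question (submitting from ~/code and wanting the files in ~/cluster), the header would therefore look like this:

#!/bin/sh
#BSUB -J X
#BSUB -o ../cluster/X.out
#BSUB -e ../cluster/X.err

Note that ../cluster should already exist when the job finishes, since that is when LSF copies the files there.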
I made a test script, test.qsub:
#!/bin/bash
#PBS -q batch
#PBS -o output.txt
#PBS -e Error.err
echo "hello world"
When I run qsub test.qsub, it generates neither the output.txt file nor the Error.err file. I also believe that the other options do not work either; I'd appreciate your help! It is said that you should configure torque.cfg, but in my installation the file was never generated, and it is not in /var/spool/torque.
Try "#PBS -k oe". This directs pbs to keep stdout and stderr.
When I run isql with a script file:
isql.exe -q -e -i %1 -o %~n1.log
Then in the output file I see the commands, but the errors from those commands appear on the screen while it runs.
The errors aren't written to the output file. Which switch should I use so that the errors are also written to the output file?
You have to use the -m(erge) command-line switch to send the error messages to the output file as well.
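The invocation from the question then becomes:

isql.exe -q -e -m -i %1 -o %~n1.log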