How to find a provider of a package in Yocto?

In Yocto, is there a bitbake or oe-pkgdata-util command that can tell me which recipe is the provider of a particular package? In other words, how can I find the PROVIDER of a particular package?

You can use the bitbake command with some filtering.
For example, if I know that my linux-example recipe provides the virtual/kernel package:
bitbake -e | awk -F '[_=]' '/PREFERRED_PROVIDER.*linux-example/ { print $3 }'
Result:
virtual/kernel
Or:
bitbake -e linux-example | grep ^PROVIDES
Result:
PROVIDES="linux-example virtual/kernel"
So you can check it either way: if the first command returns something, that recipe is the preferred provider of what it prints; and in the second command, look for a name in PROVIDES other than the recipe name itself.
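If you want both checks in one go, here is a minimal wrapper sketch (the recipe name linux-example is just a placeholder; run it from an initialized build directory, and note that bitbake -e without a target is slow because it dumps the whole global configuration):
#!/bin/bash
# Sketch: show what a recipe PROVIDES and which PREFERRED_PROVIDER_* entries select it
recipe="${1:-linux-example}"
# What the recipe itself claims to provide
bitbake -e "$recipe" | grep '^PROVIDES='
# Which PREFERRED_PROVIDER_* assignments point at this recipe
bitbake -e | grep '^PREFERRED_PROVIDER_' | grep "\"$recipe\""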

You can use oe-pkgdata-util lookup-recipe <package-name>.
For example
oe-pkgdata-util lookup-recipe libssl
outputs
openssl
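If you only know a file path rather than a package name, you can combine find-path with lookup-recipe; the path and outputs below are illustrative and will differ per build:
# Which package ships this file?
oe-pkgdata-util find-path /usr/lib/libssl.so.3
# e.g. libssl: /usr/lib/libssl.so.3
# Then map the package back to the recipe that built it
oe-pkgdata-util lookup-recipe libssl
# e.g. openssl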

Related

Samtools/hpc/truncated file

I have tried to submit the script below to an HPC cluster:
#!/bin/bash
#PBS -N bwa_mem_tumor
#PBS -q batch
#PBS -l walltime=02:00:00
#PBS -l nodes=2:ppn=2
#PBS -j oe
sample=x
ref=absolute/path/GRCh38.p13.genome.fa
fwd=absolutepath/forward_read.fq.gz
rev=absolutepath/reverse_read.fq.gz
module load bio/samtools/1.9
bwa mem $ref $fwd $rev > $sample.tumor.sam && samtools view -S $sample.tumor.sam -b > $sample.tumor.bam && samtools sort $sample.tumor.bam > $sample.tumor.sorted.bam
However, as output I only get $sample.tumor.sam, and the log file says:
Lmod has detected the following error: The following module(s) are unknown:
"bio/samtools/1.9"
Please check the spelling or version number. Also try "module spider ..."
It is also possible your cache file is out-of-date; it may help to try:
$ module --ignore-cache load "bio/samtools/1.9"
Also make sure that all modulefiles written in TCL start with the string
#%Module
However, when I run module avail it shows that bio/samtools/1.9 is on the list.
Also, when I use module --ignore-cache load "bio/samtools/1.9", the result is the same.
If I try to continue working with the SAM file and manually run
samtools view -b RS0107.tumor.sam > RS0107.tumor.bam
it shows
[W::sam_read1] Parse error at line 200943
[main_samview] truncated file.
What could be wrong with the samtools module, or with the script?
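As a side note, whatever the correct module name turns out to be, making the job abort on the first error avoids ending up with a half-written SAM file later in the pipeline. A minimal sketch of the guard lines, reusing the module name from the script above:
set -eo pipefail   # stop the job as soon as any command or pipeline stage fails
module load bio/samtools/1.9 || { echo "module load failed" >&2; exit 1; }
command -v samtools >/dev/null || { echo "samtools not on PATH" >&2; exit 1; }
command -v bwa >/dev/null || { echo "bwa not on PATH" >&2; exit 1; }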

Why is this docopt string not working either with or without optional args?

Here is the complete docopt string I used:
foo.
Usage:
foo [options] <file> -o <output>
foo --help | --version
Options:
-h, --help print this help message.
--target <target> target.
--version print the version.
According to the official parser, neither foo a -o b nor foo --target abc a -o b is parsed correctly. What could the possible reasons be? Any help would be appreciated.
I'm not entirely sure about the allowed combinations of options for your script, but here's something that should be close.
Just for fun, I wrote a script that has similar options to yours to test this out with the latest docopts.
I found it simplest to write just [options] in the main Usage section, and have all the options below as alternatives, with no specific combinations required.
I'm on macOS so I'm using bash 3.2 (with patched docopts.sh to fix some Mac issues). You can avoid some of the code in this script if you're on bash 4.x - see the commented-out section with --auto option and docopt_print_ARGS function. Currently you would need bash 4.x to avoid patching docopts.sh.
#!/bin/bash
#
# foo.
#
# Usage:
#   foo [options] <file>
#   foo --help | --version
#
# Options:
#   -t, --target <target>  target.
#   -o, --output <output>  output.
#   -h, --help             print this help message.
#   --version              print the version.
#
# bash 3.2 (patched docopts.sh) - include set -x to see the args easily
source docopts.sh
usage=$(docopt_get_help_string "$0")
set -x
eval "$(docopts -G ARGS -V "$VERSION" -h "$usage" : "$#")"
# On bash 4.x, replace preceding section with this, or use -A instead of -G above
# source docopts.sh --auto "$#"
# docopt_print_ARGS
This parses the Usage section OK and processes command lines such as:
foo --target a -o b file1
foo --target a --output b file1
foo --target a file1
Partial output with set -x to show args processed correctly:
$ ./foo --target a file1 --output b
...
++ ARGS_target=a
++ ARGS_output=b
++ ARGS_file=file1
++ ARGS_help=false
++ ARGS_version=false
Thanks for @RichVel's efforts. Yesterday I finally found the underlying (stupid) cause of this problem.
In the official online parser, the first part, i.e. the program name foo, shouldn't be included; --target abc a -o b works fine in the online example.
Regarding my question, the bug actually comes from the fact that docopt.rs stores --target abc in flag_target instead of arg_target.

Batch rename with command line

I have some files: file1.txt, file2.txt and I would like to rename them like this: file1.something.txt and file2.something.txt
I looked for some similar questions and came up with this:
for i in file*.txt; do echo mv $i file*.something.txt; done
but unfortunately the output is:
mv file1.txt file*.something.txt
mv file2.txt file*.something.txt
and therefore only 1 file is created.
Could please somebody help?
(I am using a macbook air, I am not sure if this is relevant)
Thank you very much
Try this:
rename -n 's/\.txt$/.something.txt/' *.txt
(remove the -n switch once your tests are OK)
There are other tools with the same name which may or may not be able to do this, so be careful.
If you run the following command (GNU)
$ file "$(readlink -f "$(type -p rename)")"
and you have a result like
.../rename: Perl script, ASCII text executable
and not containing:
ELF
then this seems to be the right tool =)
If not, to make it the default (usually already the case) on Debian and derivatives like Ubuntu:
$ sudo update-alternatives --set rename /path/to/rename
(replace /path/to/rename with the path of your Perl rename command).
If you don't have this command, install it through your package manager, or do it manually.
Last but not least, this tool was originally written by Larry Wall, Perl's dad.
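If you would rather avoid installing anything (macOS does not ship the Perl rename by default), a plain bash loop with parameter expansion also works; the echo is a dry run, remove it to actually rename:
for i in file*.txt; do
  echo mv "$i" "${i%.txt}.something.txt"
done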

how to retrieve a perl file using wget and execute it using a one-liner?

I'm looking to use wget to retrieve a perl file and execute it in one line. Does anyone know if this is possible/how I would go about doing this?
In order to use wget for this purpose, you would use the -O flag and give it the '-' character as an argument. From the manpage:
-O file
--output-document=file
Giving '-' as the "file" option to -O tells it to send its output to stdout, which can then be piped into the Perl command.
You can provide the -q flag as well to turn off wget's own warning and message output:
-q
--quiet
Turn off Wget's output.
This will make things look cleaner in the shell.
So you would end up with something like:
wget -qO - http://127.0.0.1/myscript.pl | perl -
For more information on I/O redirection take a look at this:
http://www.tldp.org/LDP/abs/html/io-redirection.html
Just download and pipe to perl
curl -L http://your_location.pl | perl -
You'll sometimes see code like this used to install modules, for example cpanm.
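For example, the cpanminus installer is commonly bootstrapped this way (this is the one-liner from the cpanm documentation; check the current instructions before running it):
curl -L https://cpanmin.us | perl - App::cpanminus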

how to print the progress of the files being copied in bash [duplicate]

I suppose I could compare the number of files in the source directory to the number of files in the target directory as cp progresses, or perhaps do it with folder size instead? I tried to find examples, but all bash progress bars seem to be written for copying single files. I want to copy a bunch of files (or a directory, if the former is not possible).
You can also use rsync instead of cp like this:
rsync -Pa source destination
This will give you a progress bar and an estimated time of completion. Very handy.
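If you prefer a single overall progress bar for the whole transfer instead of per-file output, rsync 3.1 and later also support --info=progress2:
rsync -a --info=progress2 source destination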
To show a progress bar while doing a recursive copy of files & folders & subfolders (including links and file attributes), you can use gcp (easily installed in Ubuntu and Debian by running "sudo apt-get install gcp"):
gcp -rf SRC DEST
Here is the typical output while copying a large folder of files:
Copying 1.33 GiB 73% |##################### | 230.19 M/s ETA: 00:00:07
Notice that it shows just one progress bar for the whole operation, whereas if you want a single progress bar per file, you can use rsync:
rsync -ah --progress SRC DEST
You may have a look at the tool vcp. That's a simple copy tool with two progress bars: one for the current file, and one for the overall progress.
EDIT
Here is the link to the sources: http://members.iinet.net.au/~lynx/vcp/
Manpage can be found here: http://linux.die.net/man/1/vcp
Most distributions have a package for it.
Here another solution: Use the tool bar
You could invoke it like this:
#!/bin/bash
filesize=$(du -sb "${1}" | awk '{ print $1 }')
tar -cf - -C "${1}" ./ | bar --size "${filesize}" | tar -xf - -C "${2}"
You have to go through tar, and it will be inaccurate for small files. Also, you must make sure that the target directory exists. But it is one way to do it.
My preferred option is Advanced Copy, as it uses the original cp source files.
$ wget http://ftp.gnu.org/gnu/coreutils/coreutils-8.32.tar.xz
$ tar xvJf coreutils-8.32.tar.xz
$ cd coreutils-8.32/
$ wget --no-check-certificate https://raw.githubusercontent.com/jarun/advcpmv/master/advcpmv-0.8-8.32.patch
$ patch -p1 -i advcpmv-0.8-8.32.patch
$ ./configure
$ make
The new programs are now located in src/cp and src/mv. You may choose to replace your existing commands:
$ sudo cp src/cp /usr/local/bin/cp
$ sudo cp src/mv /usr/local/bin/mv
Then you can use cp as usual, or specify -g to show the progress bar:
$ cp -g src dest
A simple Unix way is to go to the destination directory and run watch -n 5 du -s . You could make it prettier by showing it as a bar. This can help in environments where you have just the standard Unix utilities and no scope for installing additional tools. du -s is the key; watch just reruns it every 5 seconds.
Pros: works on any Unix system. Cons: no progress bar.
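For example, start the copy in one terminal and in another run something like this (the path is a placeholder):
watch -n 5 'du -sh /path/to/destination'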
To add another option, you can use cpv. It uses pv to imitate the usage of cp.
It works like pv, but you can use it to copy directories recursively.
You can get it here
There's a tool pv to do this exact thing: http://www.ivarch.com/programs/pv.shtml
There's an Ubuntu package for it in apt.
How about something like
find . -type f | pv -s $(find . -type f | wc -c) | xargs -i cp {} --parents /DEST/$(dirname {})
It finds all the files in the current directory, pipes the list through pv (giving pv an estimated size so the progress meter works), and then pipes that to cp with the --parents flag so the destination path matches the source path.
One problem I have yet to overcome is that if you issue this command
find /home/user/test -type f | pv -s $(find . -type f | wc -c) | xargs -i cp {} --parents /www/test/$(dirname {})
the destination path becomes /www/test/home/user/test/....FILES... and I am unsure how to tell the command to get rid of the '/home/user/test' part. That's why I have to run it from inside the SRC directory.
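One way around that, as a sketch, is to cd into the source directory in a subshell so the paths reaching cp --parents are already relative (the directories below are the ones from the example above):
(cd /home/user/test && \
  find . -type f | pv -s "$(find . -type f | wc -c)" | \
  xargs -I{} cp --parents {} /www/test/)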
Check the source code for progress_bar in my git repository below:
https://github.com/Kiran-Bose/supreme
Also try the custom bash script package supreme to see how the progress bar works with the cp and mv commands.
Functionality overview
(1)Open Apps
----Firefox
----Calculator
----Settings
(2)Manage Files
----Search
----Navigate
----Quick access
|----Select File(s)
|----Inverse Selection
|----Make directory
|----Make file
|----Open
|----Copy
|----Move
|----Delete
|----Rename
|----Send to Device
|----Properties
(3)Manage Phone
----Move/Copy from phone
----Move/Copy to phone
----Sync folders
(4)Manage USB
----Move/Copy from USB
----Move/Copy to USB
There is the command progress, https://github.com/Xfennec/progress, a coreutils progress viewer.
Just run progress in another terminal to see the copy/move progress. For continuous monitoring, use the -M flag.
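For example, you can start a copy in the background and watch just that process (the -m and -p flags are the ones documented in the progress README; adjust if your version differs, and the paths are placeholders):
cp -r /big/src /backup/dst &
progress -m -p $!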