I have this file
lrwxrwxrwx. 1 user user 32 Sep 20 15:43 SingletonLock -> user.hostname.com-22222
I need Perl to recognize that file and get its size. What can I use for this?
I have tried:
my $size = (lstat("/home/user/.config/google-chrome/SingletonLock -> user.hostname.com-22222"))[7];
print $size;
but the variable is empty and $! reports "No such file or directory".
The stat function? Or you might need lstat to get the link information, or readlink to read the name that the symbolic link points to.
Example of stat and lstat working:
$ echo "Petunia" > user.hostname.com-22222
$ ln -s user.hostname.com-22222 SingletonLock
$ ls -l user.* Singl*
lrwxr-xr-x 1 jleffler staff 23 Sep 26 20:24 SingletonLock -> user.hostname.com-22222
-rw-r--r-- 1 jleffler staff 8 Sep 26 20:24 user.hostname.com-22222
$ cat stat.pl
#!/usr/bin/env perl
use strict;
use warnings;
my @names = ( "user.hostname.com-22222", "SingletonLock" );
foreach my $file (@names)
{
    my ($dev,$ino,$mode,$nlink,$uid,$gid,$rdev,$size,
        $atime,$mtime,$ctime,$blksize,$blocks)
        = lstat $file;
    printf "lstat: %2d (%.5o) - %s\n", $size, $mode, $file;
    ($dev,$ino,$mode,$nlink,$uid,$gid,$rdev,$size,
        $atime,$mtime,$ctime,$blksize,$blocks)
        = stat $file;
    printf "stat: %2d (%.5o) - %s\n", $size, $mode, $file;
}
$ perl stat.pl
lstat: 8 (100644) - user.hostname.com-22222
stat: 8 (100644) - user.hostname.com-22222
lstat: 23 (120755) - SingletonLock
stat: 8 (100644) - SingletonLock
$
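For the readlink part mentioned at the top of this answer, a minimal sketch (assuming it is run in the same directory as the files created above):
#!/usr/bin/env perl
use strict;
use warnings;

# readlink returns the text the symlink points to; lstat gives the link's own size.
my $link = "SingletonLock";
my $target = readlink $link
    or die "readlink failed for $link: $!";
my $link_size = (lstat $link)[7];
print "$link -> $target (link length $link_size)\n";
# Expected here: SingletonLock -> user.hostname.com-22222 (link length 23)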
Give only the file name to lstat:
my $size = (lstat("/home/user/.config/google-chrome/SingletonLock"))[7];
If you only want the size of the file, you can use the -s file test operator in Perl:
print -s "/home/user/.config/google-chrome/SingletonLock";
or
$filesize = -s $filename;
Note that -s uses stat, so it follows the symlink and reports the size of the target; for the link's own length, use lstat as above.
Why does the Perl file test operator "-l" fail to detect symlinks under the following conditions?
System Info
john@testbed-LT:/temp2/test$ uname -a
Linux Apophis-LT 4.13.0-37-generic #42-Ubuntu SMP Wed Mar 7 14:13:23 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
john@testbed-LT:/temp2/test$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 17.10
Release: 17.10
Codename: artful
Perl Info
john@testbed-LT:/temp2/test$ perl -v
This is perl 5, version 26, subversion 0 (v5.26.0) built for x86_64-linux-gnu-thread-multi (with 56 registered patches, see perl -V for more detail)
Test Resources
john@testbed-LT:/temp2/test$ touch regular_file
john@testbed-LT:/temp2/test$ mkdir dir
john@testbed-LT:/temp2/test$ ln -s regular_file symlink
john@testbed-LT:/temp2/test$ ls -al
total 12
drwxrwxr-x 3 john john 4096 May 6 02:29 .
drwxrwxrwx 6 john john 4096 May 6 02:29 ..
drwxrwxr-x 2 john john 4096 May 6 02:29 dir
-rw-rw-r-- 1 john john 0 May 6 02:29 regular_file
lrwxrwxrwx 1 john john 12 May 6 02:29 symlink -> regular_file
Script Containing Failing "-l" Operator
john@testbed-LT:/temp2/test$ cat ~/.scripts/test.pl
#!/usr/bin/perl
use strict;
use warnings;
use Cwd 'abs_path';
my $targetDir = "/temp2/test";
opendir(DIR, $targetDir) || die "Can't open $targetDir: $!";
while (readdir DIR) {
    my $file = "$_";
    if($file =~ m/^\.{1,2}/) {
        next;
    }
    $file = abs_path($file);
    if(-l "$file") {
        print "Link: $file\n";
    }
    elsif(-d "$file") {
        print "Dir: $file\n";
    }
    elsif(-f "$file") {
        print "File: $file\n";
    }
    else {
        print "\n\n *** Unhandled file type for file [$file]!\n\n";
        exit 1;
    }
}
closedir(DIR);
Script Output
john@testbed-LT:/temp2/test$ perl ~/.scripts/test.pl
File: /temp2/test/regular_file
Dir: /temp2/test/dir
File: /temp2/test/regular_file
Problem I'm Trying to Solve
Note in the above output that the symlink (named "symlink") is not listed while the file, "regular_file," is listed twice (I want "symlink" listed -- the actual link and not the file it points to).
When I change ... if(-l "$file") ... to ... if(lstat "$file") ... in the script, again "symlink" is not listed while "regular_file" is listed twice, but they are being listed from within the block meant to catch symlinks, i.e.:
john@testbed-LT:/temp2/test$ perl ~/.scripts/test.pl
Link: /temp2/test/regular_file
Link: /temp2/test/dir
Link: /temp2/test/regular_file
Goal
The output I'm trying to achieve (which is faked below -- not actually generated by the script, but by hand) is:
john@testbed-LT:/temp2/test$ perl ~/.scripts/test.pl
File: /temp2/test/regular_file
Dir: /temp2/test/dir
Link: /temp2/test/symlink
...but not necessarily in that order (I don't care about the order of the listing).
Why is the above-shown script not achieving the above-stated goal (why is the "-l" operator not working)?
perldoc Cwd:
abs_path
my $abs_path = abs_path($file);
Uses the same algorithm as getcwd(). Symbolic links and relative-path components ("." and "..") are resolved to return the canonical pathname, just like realpath(3). On error returns undef, with $! set to indicate the error.
(Emphasis mine.)
If you want to see symlinks, don't use abs_path.
What you want to do instead is
$file = "$targetDir/$file";
i.e. prepend the name of the directory you read $file from.
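To see the effect concretely, a quick check (assuming it is run from /temp2/test with the test resources from the question):
use strict;
use warnings;
use Cwd 'abs_path';

# abs_path resolves the symlink, so the link's own name is gone.
print abs_path('symlink'), "\n";   # prints /temp2/test/regular_file, not .../symlink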
Additional notes:
opendir(DIR, $targetDir) || die "Can't open $targetDir: $!";
while (readdir DIR) {
my $file = "$_";
should be
opendir(my $dh, $targetDir) || die "Can't open $targetDir: $!";
while (my $file = readdir $dh) {
Why use bareword filehandles when you can just use normal variables (that are scoped properly)?
There's no reason to quote "$_" here.
Why first assign to $_ when you're just going to copy the string to $file in the next step?
Note in the above output that the symlink (named "symlink") is not listed while the file, "regular_file," is listed twice
Yeah, because you used abs_path to turn symlink into /temp2/test/regular_file. Get rid of that line.
By the way, you are missing
$file = "$targetDir/$file";
The only reason your program worked without it is because $targetDir happened to be the current working directory.
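Putting these notes together, a corrected version of the script would look something like this (a sketch based on the suggestions above, using the same directory as the question):
#!/usr/bin/perl
use strict;
use warnings;

my $targetDir = "/temp2/test";

opendir(my $dh, $targetDir) || die "Can't open $targetDir: $!";
while (my $file = readdir $dh) {
    next if $file =~ m/^\.{1,2}/;      # skip dotfiles, including . and ..
    $file = "$targetDir/$file";        # prepend the directory; no abs_path
    if (-l $file) {                    # check -l first, before -d/-f follow the link
        print "Link: $file\n";
    }
    elsif (-d $file) {
        print "Dir: $file\n";
    }
    elsif (-f $file) {
        print "File: $file\n";
    }
    else {
        print "\n\n *** Unhandled file type for file [$file]!\n\n";
        exit 1;
    }
}
closedir($dh);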
I want to get the list of file names present in the remote location.
I am using the below snippet in my Perl script.
my $command = "sftp -q -o${transferAuthMode}=yes -oPort=$sftpPort ${remoteUsername}\#${remoteHost} 2>\&1 <<EOF\n" .
"cd \"${remotePath}\"\n" .
"ls -l \n" .
"quit\n" .
"EOF\n";
my @files = `$command`;
When the number of files in the remote location is large (>500), not all of the file names are captured in @files.
When I manually do SFTP and list the files, all of them are listed, but I'm not getting the same through the script. Each time the size of @files is different. It occurs only when there is a large number of files.
I'm unable to find the reason behind this. Could you please help?
This can be achieved without requiring any additional modules. I tested this on my CentOS 7 server (a VM running on Windows).
My remote host details: I have ~2000 files in the remote host directory, on a CentOS 6.8 server.
%_gaurav@[remotehost]:/home/gaurav/files/test> ls -lrth|head -3;echo;ls -lrth|tail -2
total 7.9M
-rw-rw-r--. 1 gaurav gaurav 35 Feb 16 23:51 File-0.txt
-rw-rw-r--. 1 gaurav gaurav 35 Feb 16 23:51 File-1.txt
-rw-rw-r--. 1 gaurav gaurav 38 Feb 16 23:51 File-1998.txt
-rw-rw-r--. 1 gaurav gaurav 38 Feb 16 23:51 File-1999.txt
%_gaurav@[remotehost]: /home/gaurav/files/test>
Script output from LocalHost: please note that I am running your command sans the -o${transferAuthMode}=yes part. As seen below, the script is able to gather all results in an array, well over 500 entries.
I am printing the total number of entries and a few particular index numbers from the array to show the results, but give it a try with the Dumper line un-commented to see the full result.
%_STATION@gaurav * /root/ga/study/pl> ./scp.pl
Read 2003 lines from SCP command.
ArrayIndex: 2,3,1999,2000 contain:
[-rw-rw-r-- 0 501 501 36B Feb 16 23:51 File-58.txt]
[-rw-rw-r-- 0 501 501 37B Feb 16 23:51 File-129.txt]
[-rw-rw-r-- 0 501 501 38B Feb 16 23:51 File-1759.txt]
[-rw-rw-r-- 0 501 501 38B Feb 16 23:51 File-1810.txt]
%_STATION@gaurav * /root/ga/study/pl>
Script and its Working:
#!/usr/bin/perl
use strict ;
use warnings ;
use Data::Dumper ;
my $sftp_port=22 ;
my ($user, $host) = ("gaurav","192.168.246.137") ;
my $remote_path = '/home/gaurav/files/test' ;
my @result ; # To store result
my $command = "sftp -q -oPort=$sftp_port ${user}\@${host} 2>\&1 <<EOF\n"."cd $remote_path\nls -lrth\nquit\nEOF" ;
# open the command as a file handle, read output and store it.
open FH, "$command |" or die "Something went wrong!!\n" ;
while (<FH>) {
    tr/\r\f\n//d ;            # Remove any newline, carriage return or form feed.
    push(@result, "[$_]") ;
}
close FH ;
#print Dumper @result ;
# Just for printing a little bit of results from
# the array. Following lines can be deleted.
my $total = scalar @result ;
print "Read $total lines from SCP command.\n" ;
print "\nArrayIndex: 2,3,1999,2000 contain:\n
$result[2]
$result[3]
$result[1999]
$result[2000]
" ;
Another way: one could also get around this issue by making a shell script, calling it from the Perl script, and reading its output. Shown below are my shell script, which gets called by the Perl script, and the final output. This can be used as a quick technique when one doesn't have much time to write/formulate commands in Perl directly. You can use the qx style (shown below) in the earlier script as well.
Shell script "scp.sh"
%_STATION@gaurav * /root/ga/study/pl> cat scp.sh
#!/bin/bash
sftp -oPort=${1} ${2}@${3} 2>&1 <<EOF
cd ${4}
ls -l
quit
EOF
Perl Script "2scp.pl"
%_STATION@gaurav * /root/ga/study/pl> cat 2scp.pl
#!/usr/bin/perl
use strict ;
use warnings ;
use Data::Dumper ;
my $sftp_port=22 ;
my ($user, $host) = ("gaurav","192.168.246.137") ;
my $remote_path = '/home/gaurav/files/test' ;
# Passing arguments to the shell script using concatenation.
my $command = './scp.sh '." $sftp_port $user $host $remote_path" ;
my @result = qx{$command} ; # Runs the command and stores the result.
my $total = scalar @result ;
print "Read $total lines from SCP command.\n" ;
# End.
Output:
%_STATION@gaurav * /root/ga/study/pl> ./2scp.pl
Read 2004 lines from SCP command.
%_STATION@gaurav * /root/ga/study/pl>
Try it out and let us know.
Thanks.
ls -l
-rw-r--r-- 1 angus angus 0 2013-08-16 01:33 copy.pl
-rw-r--r-- 1 angus angus 1931 2013-08-16 08:27 copy.txt
-rw-r--r-- 1 angus angus 492 2013-08-16 03:15 ex.txt
-rw-r--r-- 1 angus angus 25 2013-08-16 09:07 hello.txt
-rw-r--r-- 1 angus angus 98 2013-08-16 09:05 hi.txt
I need only the read, write, access permission data as well as the file name.
#! /usr/bin/perl -w
@list = `ls -l`;
$index = 0;
#print "@list\n";
for (@list) {
    ($access) = split(/[\s+]/,$_);
    print "$access\n";
    ($data) = split(/pl+/,$_);
    print "$data";
    @array1 = ($data,$access);
}
print "@array1\n"
I have written this code to extract the read, write, access permission details and the file name corresponding to them.
I couldn't extract the filename, which is the last column.
Check Perl's stat: http://perldoc.perl.org/functions/stat.html
It's more robust and efficient than calling the external ls command:
use File::stat;
$sb = stat($filename);
printf "File is %s, size is %s, perm %04o, mtime %s\n",
$filename, $sb->size, $sb->mode & 07777,
scalar localtime $sb->mtime;
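If the goal is permissions plus the file name for every entry in a directory, a minimal sketch with the core stat builtin (the '.' directory is just a placeholder) avoids parsing ls output entirely:
#!/usr/bin/perl
use strict;
use warnings;

my $dir = '.';                              # placeholder: directory to list
opendir(my $dh, $dir) or die "Can't open $dir: $!";
while (my $name = readdir $dh) {
    next if $name =~ /^\.\.?$/;             # skip . and ..
    my $mode = (stat "$dir/$name")[2];      # element 2 of stat's list is the mode
    printf "%04o %s\n", $mode & 07777, $name;   # e.g. "0644 hello.txt"
}
closedir $dh;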
I think you have an error in line number 8 of your script. You are trying to split the line using the string "pl" as a delimiter which will only match the first line of your input and will not give you what I think you want.
I believe you should just split the whole line on white space and assign just the columns you want (number 1 and 8 in this case).
change your loop for this:
for my $filename (@list){
    chomp($filename);
    my ($access, $data) = (split(/\s+/, $filename))[0, 7]; # use a slice to get only the columns you want.
    print "$access $data\n";
}
Note: mpapec's suggestion to use stat would be better. I just wanted to let you know why your code is not working.
I have a folder with multiple .txt files in it. I want to search for a few strings in those text files and write the output to out.txt, with 5 lines above and 5 lines below each located string.
It is easier to do with grep:
grep -A 5 -B 5 'searchstring' *.txt > a.out
With Perl :-)
use strict;use warnings;
`grep -A 5 -B 5 'searchstring' *.txt > a.out`;
die "Something went wrong: $!" if $?;
If you insist on a Perl one-liner:
perl -n -e 'if (/searchStringHere/) {print "\n\n\n\n\n$_\n\n\n\n\n"}' *.txt
If the grep solution works for you, I consider it to be more elegant...
Update:
It just struck me that you might be a Windows user, so you don't have grep...
This code was not tested, as I don't have Perl installed on this machine, but it should work:
#!/usr/bin/perl
# usage: perl thisscript.pl *.txt > out.txt
use strict;
use warnings;

my @before;      # buffer of the (up to) 5 lines seen before the current one
my $after = 0;   # how many lines still to print after a match

while (<>) {
    if (/your string here/) {
        print @before, $_;    # print the saved lines plus the matching line
        @before = ();
        $after = 5;           # and queue up the next 5 lines
    }
    elsif ($after > 0) {      # still printing the 5 lines after a match
        print;
        $after--;
    }
    else {
        push @before, $_;               # remember the line for back-printing
        shift @before if @before > 5;   # keep only the last 5
    }
}
With my @dir = $ftp->ls() I can get the list of all directories, but which one is the latest? How can I filter for that one? I am using Windows, and those directories are on an FTP server.
Thanks
You'll get a quick and dirty hack for your carelessly worded question:
First: assuming you are using Net::FTP, you have to call $ftp->dir() and not $ftp->ls() to get the long directory listing.
Then try this:
use feature "say";
use Net::FTP;
use Date::Parse;
$ftp = Net::FTP->new("ftp", Debug => 0)
    or die "Cannot connect to some.host.name: $@";
$ftp->login("anonymous",'-anonymous@')
    or die "Cannot login ", $ftp->message;
$ftp->cwd("/pub")
    or die "Cannot change working directory ", $ftp->message;
@dir = $ftp->dir()
    or die "ls()/dir() failed ", $ftp->message;
#map {say } @dir;
#Now parse the array of strings that dir() returned
#magic numbers to find substring with modif-date
my $start = 44;
my $len = 10;
@dir = map {$_->[0]} sort {$b->[1] <=> $a->[1]} map {[$_, str2time(substr($_, $start, $len))] } grep {/^d/} @dir;
$latest = $dir[0];
This will work only for directories with this format
drwxr-xr-x 17 root other 4096 Apr 12 2010 software
but not with this (note: year missing)
drwxr-xr-x 36 root root 4096 Nov 29 09:14 home
The code will also ignore symbolic links such as this:
lrwxrwxrwx 1 root root 8 May 30 2011 i -> incoming
but it will give you a start.
The map{} sort{} map{} @array construct is called a "Schwartzian transform", and does most of the work.
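Broken into separate steps, that transform does the following (same logic as the one-liner above, just decomposed for readability):
# Keep only directory entries (long-listing lines starting with 'd').
my @dirs = grep { /^d/ } @dir;

# Decorate: pair each line with its parsed modification time.
my @decorated = map { [ $_, str2time(substr($_, $start, $len)) ] } @dirs;

# Sort by the parsed time, newest first.
my @sorted = sort { $b->[1] <=> $a->[1] } @decorated;

# Undecorate: keep just the original listing lines.
@dir = map { $_->[0] } @sorted;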
The string returned by $ftp->dir can vary depending on the type of FTP server you are accessing. The OS and user configs can also influence the format of the string, so parsing this string is likely to lead to problems even though it seems like a quick solution. It is much easier to use $ftp->mdtm($file), which returns the last modified date and time as epoch time. Simple!
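A minimal sketch of that approach (the host, login and path below are placeholders; note that many servers answer MDTM only for regular files, not for directories):
use strict;
use warnings;
use Net::FTP;

my $ftp = Net::FTP->new("ftp.example.com")        # placeholder host
    or die "Cannot connect: $@";
$ftp->login("anonymous", '-anonymous@') or die "Cannot login ", $ftp->message;
$ftp->cwd("/pub") or die "Cannot change working directory ", $ftp->message;

# Ask the server for each entry's modification time instead of parsing dir() output.
my ($latest, $latest_time) = (undef, -1);
for my $entry ($ftp->ls()) {
    my $mtime = $ftp->mdtm($entry);    # undef if the server cannot answer for this entry
    next unless defined $mtime;
    ($latest, $latest_time) = ($entry, $mtime) if $mtime > $latest_time;
}
print defined $latest ? "Latest: $latest\n" : "No timestamps available\n";
$ftp->quit;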