Perl chdir fails with a glob pattern - perl

I am trying to do cd in my perl script. I am using the below command:
chdir "/home/test/test1/test2/perl*";
The perl* pattern actually matches perl_0122_2044, but that name may vary.
The above chdir command is not doing cd to the path. Am I doing something wrong?

chdir does not expand * or other wildcard characters in its argument. Use glob (or something similar) to find a single directory, then chdir to that. For example, this changes directory to the first /home/test/test1/test2/perl* it finds:
my $dir = (glob "/home/test/test1/test2/perl*")[0];
# only change dir if a matching directory was found:
if (defined $dir && -d $dir) {
    # fail if we cannot change dir (or, even better, use autodie):
    chdir $dir or die "Could not change to $dir: $!";
}

chdir expects a path, not a wildcard. Use glob to expand the wildcard:
my ($dir) = glob "/home/test/test1/test2/perl*";
chdir $dir or die "$dir: $!";
If there are multiple expansions, the first one will be used.
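If you need a specific match rather than whatever comes first, remember that glob returns every match, so you can sort the list and pick deliberately. A minimal, self-contained sketch (the perl_* directory names below are invented for the demonstration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Scratch area with two candidate directories (names invented for the demo).
my $base = tempdir(CLEANUP => 1);
mkdir "$base/perl_0122_2044" or die $!;
mkdir "$base/perl_0500_1000" or die $!;

# glob returns all matches; sort and pick, e.g. the lexically last one.
my @matches = sort glob "$base/perl*";
my $dir = $matches[-1];
chdir $dir or die "Could not change to $dir: $!";
print "now in $dir\n";
```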

In a similar vein, in Raku glob is provided by a module: https://modules.raku.org/dist/IO::Glob

Related

chdir in perl using two strings: chdir "/home/$dir1/$dir2"

I have two strings: $dir1 and $dir2. I want to use chdir as shown below:
chdir "/home/$dir1/$dir2";
When I try this it doesn't change the directory; I checked, and the current working directory is the same as before.
Is there any way to do this?
Typically I write that as:
use v5.10;
use File::Spec::Functions;
my $dir = catfile( '/home', $dir1, $dir2 );
die "Dir <$dir> isn't a directory or doesn't exist" unless -e -d $dir;
chdir $dir or die "Could not change to <$dir>: $!";
Whenever you do something with the system, check the results to ensure it happened.
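One way to get that checking for free is the autodie pragma mentioned in the first answer: with it in effect, a failed chdir throws an exception instead of silently returning false. A minimal sketch (the nonexistent path is invented for the demo):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use autodie qw(chdir);   # chdir now dies on failure instead of returning false

# No "or die" needed; a failure raises an exception we can trap with eval.
eval { chdir '/no/such/dir' };
my $err = $@;
print "chdir failed as expected: $err" if $err;
```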
And, curiously, I didn't realize that Pearson's sample chapter of my book Effective Perl Programming is the one that covers stacked file test operators.

Why is opendir argument invalid?

This is perl 5, version 30, subversion 1 (v5.30.1) built for MSWin32-x64-multi-thread
Win10
cygwin
I can't figure out how to use opendir. Here is my example code:
sub test($) {
    my $dir = shift;
    opendir (DIR, $dir) || die "Couldn't open dir $dir $!";
}
sub main() {
    my $dir = `pwd`;
    test($dir);
}
Error message
Couldn't open dir /home/skidmarks/Projects/Perl
Invalid argument at ./test.py line .
pwd returns a Unix-formatted directory path ('/'). I have tried it with a Windows-formatted directory path ('\'). The only thing that works is to use a literal string for the path, e.g., "." or "some_directory_path".
Can't I use a variable in opendir for the path?
The qx (backticks) returns the newline as well, so you need chomp $dir;.
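To see the trailing newline problem concretely, here is a sketch (it shells out to the Unix pwd, so it assumes a Unix-like environment such as Cygwin):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Backticks capture the command's stdout verbatim, including the trailing newline.
my $dir = `pwd`;
my $before = length $dir;
chomp $dir;                 # strip the "\n"; now the path is usable in opendir
my $after = length $dir;
print "chomp removed ", $before - $after, " character(s)\n";
opendir my $dh, $dir or die "Couldn't open dir $dir: $!";
closedir $dh;
```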
Better yet, why not use Perl's facilities
use Cwd qw(cwd);
my $dir = cwd;
and now you don't have to worry about system commands and how exactly they return.
As the OP uses pwd from Cygwin, even once the linefeed is gone the obtained path is Unix-style, and this conflicts with the MSWin32 build of Perl (as reported when opening the directory). Using a portable tool (like Cwd above) with a Windows build of Perl should avoid such problems.
Or use a tool to convert paths, like cygpath. See this post
Try the following piece of code; it works well with Strawberry Perl.
Also, put a full path that contains spaces in double quotes, e.g. "c:\Program Files\Common Files".
If a directory name is not provided, the script will list the current directory.
Usage: perl script.pl "C:\Users\User_name"
use strict;
use warnings;
use feature 'say';

my $dir_name = shift || '.';

opendir(my $dir, $dir_name)
    or die "Couldn't open $dir_name: $!";

say for readdir($dir);

closedir $dir;
NOTE: Navigate in a Cygwin terminal to the target directory and issue the command pwd. A Perl script run under Cygwin will likely expect the path in this form.
The latest version of Cygwin was installed, and the slightly modified code below works fine.
NOTE: pwd is a Linux/UNIX command which produces an error in MS Windows, but works in Cygwin, which emulates a Linux/UNIX environment (binary-incompatible; programs require recompilation)
#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';

sub test($) {
    my $dir = shift;
    opendir(my $dh, $dir)
        or die "Couldn't open dir $dir $!";
    map { say } readdir($dh);
    closedir $dh;
}

sub main() {
    my $dir = `pwd`;
    chomp $dir;
    print "[$dir]\n";
    test($dir);
}

main();
A main function is not required in Perl (main() is the C/C++ entry point); normally the code looks like the following:
#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';

my $dir = `pwd`; # pwd is a UNIX/Linux command; it will give an error in MS Windows
chomp $dir;      # trim the \n at the end of $dir
say "DIRECTORY: [$dir]"; # let's check what we got
test($dir);

sub test {
    my $dir = shift;
    opendir(my $dh, $dir)
        or die "Couldn't open dir $dir $!";
    map { say } readdir($dh);
    closedir $dh;
}

Find file names with certain extensions

I want to search a directory for file names with any of the following extensions: .srt, .sub, .txt, .ass, .ssa.
I would appreciate any input.
The simplest way is to use glob:
chdir $directory;
my @files = glob '*.srt *.sub *.txt *.ass *.ssa';
Another way is to use readdir, but then you have to filter the files yourself, e.g. by using grep:
opendir my $dir_handle, $directory or die $!;
my @files = grep /\.(?:srt|sub|txt|ass|ssa)\z/, readdir $dir_handle;
And if you need to search through an entire directory tree, use File::Find.
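A self-contained File::Find sketch of that recursive search (the directory layout and file names below are invented for the demonstration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempdir);

# Build a small tree to search (names invented for the demo).
my $root = tempdir(CLEANUP => 1);
mkdir "$root/nested" or die $!;
for my $name ("$root/nested/movie.srt", "$root/notes.log") {
    open my $fh, '>', $name or die $!;
    close $fh;
}

# Collect every file in the tree whose extension is on the list.
my @found;
find(sub {
    push @found, $File::Find::name
        if -f && /\.(?:srt|sub|txt|ass|ssa)\z/i;
}, $root);

print "$_\n" for @found;
```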

Recursive directory traversal in Perl

I'm trying to write a script that prints out the file structure starting at the folder the script is located in. The script works fine without the recursive call, but with that call it prints the contents of the first folder and crashes with the following message: closedir() attempted on invalid dirhandle DIR at printFiles.pl line 24. The folders are printed and the execution reaches the last line, but why isn't the recursive call done? And how should I solve this instead?
#!/usr/bin/perl -w
printDir(".");

sub printDir {
    opendir(DIR, $_[0]);
    local(@files);
    local(@dirs);
    (@files) = readdir(DIR);
    foreach $file (@files) {
        if (-f $file) {
            print $file . "\n";
        }
        if (-d $file && $file ne "." && $file ne "..") {
            push(@dirs, $file);
        }
    }
    foreach $dir (@dirs) {
        print "\n";
        print $dir . "\n";
        printDir($dir);
    }
    closedir(DIR);
}
You should always use strict; and use warnings; at the start of your Perl program, especially before you ask for help with it. That way Perl will show up a lot of straightforward errors that you may not notice otherwise.
The invalid filehandle error is likely because DIR is a global directory handle and has already been closed by a previous execution of the subroutine. It is best to always use lexical handles for both files and directories, and to test the return code to make sure the open succeeded, like this
opendir my $dh, $_[0] or die "Failed to open $_[0]: $!";
One advantage of lexical file handles is that they are closed implicitly when they go out of scope, so there is no need for your closedir call at the end of the subroutine.
local isn't meant to be used like that. It doesn't suffice as a declaration, and you are creating a temporary copy of a global variable that everything can access. Best to use my instead, like this
my @dirs;
my @files = readdir $dh;
Also, the file names you are using from readdir have no path, and so your file tests will fail unless you either chdir to the directory being processed or append the directory path string to the file name before testing it.
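The fix is to join the directory and the bare name before testing. A minimal sketch using a throwaway directory (the file and subdirectory names are invented for the demo):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# A scratch directory with one file and one subdirectory.
my $path = tempdir(CLEANUP => 1);
mkdir "$path/subdir" or die $!;
open my $fh, '>', "$path/file.txt" or die $!;
close $fh;

opendir my $dh, $path or die "Failed to open $path: $!";
my (@files, @dirs);
for my $name (readdir $dh) {
    next if $name eq '.' or $name eq '..';
    my $full = "$path/$name";       # prepend the path before any file test
    push @files, $full if -f $full;
    push @dirs,  $full if -d $full;
}
closedir $dh;
print "files: @files\ndirs: @dirs\n";
```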
Use the File::Find module. The way I usually do this is with the find2perl tool that comes with Perl, which takes the same parameters as find and creates a suitable Perl script using File::Find. Then I fine-tune the generated script to do what I want. But it's also possible to use File::Find directly.
Why not use File::Find?
use strict; #ALWAYS!
use warnings; #ALWAYS!
use File::Find;
find(sub{print "$_\n";},".");

Win32 Perl - Telling the difference between files and folders using a passed directory argument

I'm writing a script in perl strawberry. The first thing it needs to be able to do is take a path argument and get a directory listing for that path, and it needs to be able to distinguish between files and folders. I read some tutorials on the subject and wrote the script below, but it only works when I give it the path that the script is currently residing in. If I give it any other path, the -f and -d tests don't work.
EDIT: Clarification: The script DOES put all the files and folders into @thefiles if I give it a path other than its own; it's just the -f and -d tests that don't work.
use Getopt::Long;
my $dir;
GetOptions('-d=s' => \$dir);
opendir(DIR, $dir) or die "BORKED";
@thefiles = readdir(DIR);
print DIR;
closedir(DIR);
@filez;
@dirz;
foreach $file (@thefiles) {
    if (-f $file) {
        push(@filez, $file);
    }
    if (-d $file) {
        push(@dirz, $file);
    }
}
print "files: @filez \n";
print "Directories: @dirz \n";
Here's a screenshot: http://i.stack.imgur.com/RMmFz.jpg
Hope someone can help and thanks very much for your time. :)
martin clayton told you the reason your code does not work.
Here is a way to fix it using map and some more modern Perl constructs:
use strict;
use warnings;
use Getopt::Long;
my $dir;
GetOptions('-d=s' => \$dir);
opendir my $dh, $dir or die "BORKED: $!";
my @thefiles = map { "$dir/$_" } readdir $dh;
closedir $dh;
my @filez;
my @dirz;
for my $file (@thefiles) {
    push @filez, $file if -f $file;
    push @dirz,  $file if -d $file;
}
print "files: @filez \n";
print "Directories: @dirz \n";
It's because the filetest -f and -d operators use a relative path unless you provide an absolute one. The readdir function will return the file (and subdirectory...) names found in the directory, but not the full paths.
From the docs:
If you're planning to filetest the return values out of a readdir, you'd better prepend the directory in question. Otherwise, because we didn't chdir there, it would have been testing the wrong file.