Basically, I'm trying to create a new directory with today's date, then create a new file and save it in that folder.
I can get all the steps working separately; however, the file never ends up saved inside the directory. This is what I'm working with:
mkdir($today);
opendir(DIR, $today) or die "Error in opening dir $today\n";
open SAVEPAGE, ">>", $savepage
or die "Unable to open $savepage for output - $!";
print SAVEPAGE $data;
close(SAVEPAGE);
closedir(DIR);
I've done a lot of searches to try and find an appropriate example, but unfortunately every word in the queries I've tried ("open/save/file/directory" etc.) gets millions of hits. I realise I could handle errors etc. better; that'll be the next step once I get the functionality working. Any pointers would be appreciated, cheers.
Just prefix the file to open with the directory name. There is no need for opendir.
mkdir($today);
open SAVEPAGE, ">>", "$today/$savepage";
Rather than using the fully-qualified filename all the time you may prefer to use chdir $today before you open the file. This will change the current working directory and force a file specified using a relative path or no path at all to be opened relative to the new directory.
In addition, using the autodie pragma will avoid the need to check the status of open, close etc.; and using lexical filehandles is preferable for a number of reasons, including implicit closing of files when the filehandle variable goes out of scope.
This is how your code would look.
use strict;
use warnings;
use autodie;
my $today = 'today';
my $savepage = 'savepage';
my $data = 'data';
mkdir $today unless -d $today;
{
chdir $today;
open my $fh, '>>', $savepage;
print $fh $data;
}
However, if your program deals with files in multiple directories then it is awkward to chdir backwards and forwards between them, and the original directory has to be saved explicitly otherwise it will be forgotten. In this case the File::chdir module may be helpful. It provides the $CWD package variable which will change the current working directory if its value is changed. It can also be localized like any other package variable so that the original value will be restored at the end of the localizing block.
Here is an example.
use strict;
use warnings;
use File::chdir;
use autodie;
my $today = 'today';
my $savepage = 'savepage';
my $data = 'data';
mkdir $today unless -d $today;
{
local $CWD = $today; # Change working directory to $today
open my $fh, '>>', $savepage;
print $fh $data;
}
# Working directory restored to its previous value
I can't seem to get this script to open from one directory and write to another. Both directories exist. I've commented out what I tried. The funny thing is it runs fine when I place it in the directory with the files to process. Here's the code:
use strict;
use warnings "all";
my $tmp;
my $dir = ".";
#my $dir = "Ask/Parsed/Html4/";
opendir(DIR, $dir) or die "Cannot open directory: $dir!\n";
my @files = readdir(DIR);
closedir(DIR);
open my $out, ">>output.txt" or die "Cannot open output.txt!\n";
#open my $out, ">>Ask/Parsed/Html5/output.txt" or die "Cannot open output.txt!\n";
foreach my $file (@files)
{
if($file =~ /html$/)
{
open my $in, "<$file" or die "Cannot open $file!\n";
undef $tmp;
while(<$in>)
{
$tmp .= $_;
}
print $out ">$file\n";
print $out "$tmp\n";
#print $out "===============";
close $in;
}
}
close $out;
The directories you use -- . and Ask/Parsed/Html4/ -- are relative paths, which means they are relative to your current working directory, and so it makes a difference where in the file system you are currently located when you run the script.
In addition, the files you are opening -- output.txt and $file -- have no path information, so Perl will look in your current working directory to find them.
There are a few ways to solve this.
You could cd to the directory where your files are before running the script, and open the directory as . as you currently do
You could achieve the same effect by calling chdir from within the script, which will change the current working directory and make the program ignore your location when you ran it
Or you could add an absolute directory path to the beginning of the file names, preferably using catfile from File::Spec::Functions
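As a rough sketch of that last option (the directory path here is only a placeholder), catfile joins a directory and a file name using the correct separator for the platform:
use strict;
use warnings;
use File::Spec::Functions 'catfile';

my $dir      = '/path/to/Ask/Parsed/Html4';    # assumed absolute path
my $out_name = catfile($dir, 'output.txt');    # /path/to/Ask/Parsed/Html4/output.txt

open my $out, '>>', $out_name or die "Cannot open $out_name: $!";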
However I would choose to use glob -- which works in the same way as command-line filename expansion -- in preference to opendir / readdir as the resulting strings include the path (if one was specified in the parameter) and there is no need to separately filter the .html files.
I would also choose to undefine the input record separator $/ to read the whole file, rather than reading it line-by-line and concatenating them all.
Finally, if you are running Perl 5.10 or later then it is simpler to use autodie rather than checking the success of every open, readline, close, opendir, readdir, closedir, etc.
Something like this
use strict;
use warnings 'all';
use 5.010;
use autodie;
my $dir = '/path/to/Ask/Parsed/Html4';
my @html = glob "$dir/*.html";
open my $out, '>>', "$dir/output.txt";
for my $file (@html) {
my $contents = do {
open my $in, '<', $file;
local $/;
<$in>;
};
print $out "> $file\n";
print $out "$contents\n";
print $out "===============";
}
close $out;
It is likely trying to access the files from wherever you are calling this from. If your files are located relative to the location of the script, use the following example to provide a full path:
use FindBin;
my $file = "$FindBin::Bin/Ask/Parsed/Html5/output.txt";
If your file is not relative to the script, provide the full path:
my $file = "/home/john.doe/Ask/Parsed/Html5/output.txt";
Note that readdir() only returns the file name. If you want to open it, prepend the directory,
eg
open my $in, "<", "$dir/$file" or die "Cannot open $file!\n";
Note that best practice says you should be using the three-parameter version of open; otherwise characters such as < or > at the start of the filename, or whitespace around it, will be interpreted as part of the mode.
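A small sketch of the difference (the filename here is made up):
use strict;
use warnings;

my $file = 'example.html';   # hypothetical filename

# Two-argument open: the mode is parsed out of the string, so a name that
# happens to start with '>' or '<', or has surrounding whitespace, changes
# how the file is opened.
open my $risky, "<$file" or die "Cannot open $file: $!";
close $risky;

# Three-argument open: mode and filename are separate arguments, so the
# filename is always taken literally.
open my $safe, '<', $file or die "Cannot open $file: $!";
close $safe;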
I need some help with file manipulations and need some expert advice.
It looks like I am making a silly mistake somewhere but I can't catch it.
I have a directory that contains files with a .txt suffix, for example file1.txt, file2.txt, file3.txt.
I want to add a revision string, say rev0, to each of those files and then open the modified files. For instance rev0_file1.txt, rev0_file2.txt, rev0_file3.txt.
I can append rev0, but my program fails to open the files.
Here is the relevant portion of my code
my $dir = "path to my directory";
my @::tmp = ();
opendir(my $DIR, "$dir") or die "Can't open directory, $!";
@::list = readdir($DIR);
@::list2 = sort grep(/^.*\.txt$/, @::list);
foreach (@::list2) {
my $new_file = "$::REV0" . "_" . "$_";
print "new file is $new_file\n";
push(@::tmp, "$new_file\n");
}
closedir($DIR);
foreach my $cur_file (<@::tmp>) {
$cur_file = $_;
print "Current file name is $cur_file\n"; # This debug print shows nothing
open($fh, '<', "$cur_file") or die "Can't open the file\n"; # Fails to open file;
}
Your problem is here:
foreach my $cur_file(<@::tmp>) {
$cur_file = $_;
You are using the loop variable $cur_file, but you overwrite it with $_, which is not used at all in this loop. To fix this, just remove the second line.
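With that line removed (and iterating over @::tmp directly rather than through <...>), the loop would look roughly like this; note that the names were pushed with a trailing newline, so they are chomped here as well:
foreach my $cur_file (@::tmp) {
    chomp $cur_file;          # names were pushed with "\n" appended
    print "Current file name is $cur_file\n";
    open(my $fh, '<', $cur_file) or die "Can't open the file '$cur_file': $!";
    # ... process the file here ...
}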
Your biggest issue is the fact that you are using $cur_file in your loop for the file name, but then you reassign it to $_, even though $_ won't have a value at that point. Also, as Borodin pointed out, $::REV0 was never defined.
You can use the move command from the File::Copy module to move the files, and you can use File::Find to find the files you want to move:
use strict;
use warnings;
use feature qw(say);
use autodie;
use File::Copy; # Provides the move command
use File::Find; # Gives you a way to find the files you want
use File::Basename; # Needed to split the found path into directory and file name
use constant {
DIRECTORY => '/path/to/directory',
PREFIX => 'rev0_',
};
my @files_to_rename;
find (
sub {
return unless /\.txt$/; # Only interested in ".txt" files
push @files_to_rename, $File::Find::name;
}, DIRECTORY );
for my $file ( @files_to_rename ) {
my $new_name = dirname($file) . '/' . PREFIX . basename($file); # Prefix the file name, keep its directory
move $file, $new_name;
$file = $new_name; # Updates @files_to_rename with new name
open my $file_fh, "<", $new_name; # Open the file here?
...
close $file_fh;
}
for my $file ( @files_to_rename ) {
open my $file_fh, "<", $file; # Or, open the file here? ($file now holds the new name)
...
close $file_fh;
}
See how using Perl modules can make your task much easier? Perl comes with hundreds of pre-installed packages to handle zip files, tarballs, time, email, etc. You can find a list at the Perldoc page (make sure you select the version of Perl you're using!).
The $file = $new_name is actually changing the value of the file name right inside the @files_to_rename array. It's a little Perl trick. This way, your array refers to the file even though it has been renamed.
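A tiny, self-contained illustration of that aliasing behaviour (the names are made up):
use strict;
use warnings;

my @names = ('file1.txt', 'file2.txt');
for my $name (@names) {
    $name = "rev0_$name";    # assigning to the loop variable changes @names itself
}
print "@names\n";            # prints: rev0_file1.txt rev0_file2.txt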
You have two choices of where to open the file for reading: you can rename all of your files first and then loop through again to open each one, or you can open each file right after you rename it. I've shown both places.
Don't use $:: at all. This is very bad form since it bypasses use strict; -- that is, if you're using use strict to begin with. The standard is not to use package variables (aka global variables) unless you have to. Instead, you should use lexically scoped variables (aka local variables) defined with my.
One of the advantages of my variables is that I don't really need the close command: the variable falls out of scope with each iteration of the loop and disappears entirely once the loop is complete. When the variable that contains the file handle falls out of scope, the file handle is automatically closed.
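For example, a minimal sketch of that implicit close (the filenames are assumed to exist):
use strict;
use warnings;
use autodie;

for my $name ('file1.txt', 'file2.txt') {
    open my $fh, '<', $name;        # lexical filehandle, autodie checks the open
    my $first_line = <$fh>;         # read the first line
    print $first_line if defined $first_line;
}                                   # $fh goes out of scope here and the file is closed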
Always include use strict; and use warnings; at the top of EVERY script, and use autodie; any time you're doing file or directory processing.
There is no reason why you should be prefixing your variables with :: so please simplify your code like the following:
use strict;
use warnings;
use autodie;
use File::Copy;
my $dir = "path to my directory";
chdir($dir); # Make easier by removing the need to prefix path information
foreach my $file (glob('*.txt')) {
my $newfile = 'rev0_'.$file;
copy($file, $newfile) or die "Can't copy $file -> $newfile: $!";
open my $fh, '<', $newfile;
# File processing
}
What you've attempted to store is the updated name of the file in @::tmp. The file hasn't been renamed, so it's little surprise that the code died because it couldn't find the renamed file.
Since it's just renaming, consider the following code:
use strict;
use warnings;
use File::Copy 'move';
for my $file ( glob( "file*.txt" ) ) {
move( $file, "rev0_$file" )
or die "Unable to rename '$file': $!";
}
From a command line/terminal, consider the rename utility if it is available:
$ rename file rev0_file file*.txt
I tried to run a simple copy of one file to another folder using Perl
system("copy template.html tmp/$id/index.html");
but I got the error: The syntax of the command is incorrect.
When I change it to
system("copy template.html tmp\\$id\\index.html");
The system copies another file to the tmp\$id folder.
Can someone help me?
I suggest you use File::Copy, which comes with your Perl distribution.
use strict; use warnings;
use File::Copy;
print copy('template.html', "tmp/$id/index.html");
You do not need to worry about the slashes or backslashes on Windows because the module will take care of that for you.
Note that you have to set relative paths from your current working directory, so both template.html as well as the dir tmp/$id/ needs to be there. If you want to create the folders on the fly, take a look at File::Path.
Update: Reply to comment below.
You can use this program to create your folders and copy the files with in-place substitution of the IDs.
use strict; use warnings;
use File::Path qw(make_path);
my $id = 1; # edit ID here
# Create output folder
make_path("tmp/$id");
# Open the template for reading and the new file for writing
open my $fh_in, '<', 'template.html' or die $!;
open my $fh_out, '>', "tmp/$id/index.html" or die $!;
# Read the template
while (<$fh_in>) {
s/ID/$id/g; # replace all instances of ID with $id
print $fh_out $_; # print to new file
}
# Close both files
close $fh_out;
close $fh_in;
I am new to Perl and trying to write text files. I can write text files to an existing directory no problem, but ultimately I would like to be able to create my own directories.
I am going to download files from my course works website and I want to put the files in a folder named after the course. I don't want to make a folder for each course manually beforehand, and I would also like to eventually share the script with others, so I need a way to make the directories and name them based on the course names from the HTML.
So far, I have been able to get this to work:
use strict;
my $content = "Hello world";
open MYFILE, ">C:/PerlFiles/test.txt";
print MYFILE $content;
close (MYFILE);
test.txt doesn't exist, but C:/PerlFiles/ does and supposedly typing > allows me to create files, great.
The following, however does not work:
use strict;
my $content = "area = pi*r^2";
open MYFILE, ">C:/PerlFiles/math_class/circle.txt";
print MYFILE $content;
close (MYFILE);
The directory C:/PerlFiles/math_class/ does not exist.
I also tried sysopen but I get an error when adding the flags:
use strict;
my $content = "area = pi*r^2";
sysopen (MYFILE, ">C:/PerlFiles/math_class/circle.txt", O_CREAT);
print MYFILE $content;
close (MYFILE);
I got this idea from the Perl Cookbook, chapter 7.1, Opening a File. It doesn't work, and I get the error message Bareword "O_CREAT" not allowed while "strict subs" in use. Then again, the book is from 1998, so perhaps O_CREAT is obsolete. At some point I think I will need to fork over the dough for an up-to-date version.
But still, what am I missing here? Or do the directories have to be created manually before creating a file in it?
Right, directories have to be created manually.
Use mkdir function.
You can check if directory already exists with -d $dir (see perldoc -f -X).
Use File::Path to create arbitrarily deep paths. Use dirname to find out a file's containing directory.
Also, use lexical file handles and three-argument open:
open(my $fd, ">", $name) or die "Can't open $name: $!";
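Putting those pieces together, a minimal sketch using the path from the question:
use strict;
use warnings;
use File::Basename;
use File::Path qw(make_path);

my $name = 'C:/PerlFiles/math_class/circle.txt';
my $dir  = dirname($name);              # C:/PerlFiles/math_class

make_path($dir) unless -d $dir;         # create any missing directories

open(my $fd, '>', $name) or die "Can't open $name: $!";
print $fd "area = pi*r^2";
close $fd;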
If you have a path to a file (for example, /home/bob/test/foo.txt) where each subdirectory in the path may or may not exist, how can I create the file foo.txt using "/home/bob/test/foo.txt" as the only input, instead of creating every nonexistent directory in the path one by one and finally creating foo.txt itself?
You can use File::Basename and File::Path
use strict;
use File::Basename;
use File::Path qw/make_path/;
my $file = "/home/bob/test/foo.txt";
my $dir = dirname($file);
make_path($dir);
open my $fh, '>', $file or die "Ouch: $!\n"; # now go do stuff w/file
I didn't add any tests to see if the file already exists but that's pretty easy to add with Perl.
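For instance, a simple existence test could be added just before the open; how you handle an existing file is up to you:
die "$file already exists\n" if -e $file;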
Use make_dir from File::Util
use File::Util;
my($f) = File::Util->new();
$f->make_dir('/var/tmp/tempfiles/foo/bar/');
# optionally specify a creation bitmask to be used in directory creations
$f->make_dir('/var/tmp/tempfiles/foo/bar/',0755);
I don't think there's a standard function that can do all of what you ask, directly from the filename.
But mkpath(), from the module File::Path, can almost do it given the filename's directory. From the File::Path docs:
The "mkpath" function provides a
convenient way to create directories,
even if your "mkdir" kernel call won't
create more than one level of
directory at a time.
Note that mkpath() does not report errors in a nice way: it dies instead of just returning zero, for some reason.
Given all that, you might do something like:
use File::Basename;
use File::Path;
my $fname = "/home/bob/test/foo.txt";
eval {
local $SIG{'__DIE__'}; # ignore user-defined die handlers
mkpath(dirname($fname));
};
my $fh;
if ($@) {
print STDERR "Error creating dir: $@";
} elsif (!open($fh, ">", $fname)) {
print STDERR "Error creating file: $!\n";
}