Remove the first line from each file in my directory - perl

How can I remove the first line from each file in my directory? This is my code.
Open the directory:
use strict;
use warnings;
use utf8;
use Encode;
use Encode::Guess;
use Devel::Peek;
my $new_directory = '/home/lenovo/corpus';
my $directory = '/home/lenovo/corpus';
open( my $FhResultat, '>:encoding(UTF-8)', $FichierResulat );
my $dir = '/home/corpus';
opendir (DIR, $directory) or die $!;
my @tab;
while (my $file = readdir(DIR)) {
next if ($file eq "." or $file eq ".." );
#print "$file\n";
my $filename_read = decode('utf8', $file);
#print $FichierResulat "$file\n";
push @tab, $filename_read;
}
closedir(DIR);
Open each file:
foreach my $val (@tab) {
utf8::encode($val);
my $filename = $val;
open(my $in, '<:utf8', $filename) or die "Unable to open '$filename' for read: $!";
Create and open a new file:
my $newfile = "$filename.new";
open(my $out, '>:utf8', $newfile) or die "Unable to open '$newfile' for write: $!";
Remove the first line:
my @ins = <$in>; # read the contents into an array
chomp @ins;
shift @ins; # remove the first element from the array
print $out @ins;
close($in);
close $out;
The problem: my new file is empty!
rename $newfile,$filename or die "unable to rename '$newfile' to '$filename': $!";
}
It seems correct, but the result is an empty file.

The accepted pattern for doing this kind of thing is as follows:
use strict;
use warnings;
my $old_file = '/path/to/old/file.txt';
my $new_file = '/path/to/new/file.txt';
open(my $old, '<', $old_file) or die $!;
open(my $new, '>', $new_file) or die $!;
while (<$old>) {
next if $. == 1;
print $new $_;
}
close($old) or die $!;
close($new) or die $!;
rename($old_file, "$old_file.bak") or die $!;
rename($new_file, $old_file) or die $!;
In your case, we're using $. (the input line number variable) to skip over the first line.
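If you would rather keep your per-file loop from the question, the same idea can be folded in directly: read the first line and throw it away, then copy the rest unchanged. A minimal sketch (untested, using the $in and $out handles from your loop):

my $discard = <$in>;          # read line 1 and ignore it
while ( my $line = <$in> ) {
    print {$out} $line;       # remaining lines keep their newlines, so no chomp is needed
}

This also avoids slurping each file into an array first.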

Related

Perl script only writes one row to output file

I wrote a script to open a file on the web and pull out all rows with "wireless" in the name. It writes the output to a different file, but it only records one line in the output file; it should be multiple lines.
#!\Perl64\eg\perl -w
use warnings;
use strict;
use LWP::Simple;
my $save = "C:\\wireless\\";
my $file = get 'http://dhcp_server.test.com/cgi-bin/dhcp_utilization_csv_region.pl?region=test';
open( FILE, '>', $save . 'DHCP_Utilization_test.csv' ) or die $!;
binmode FILE;
print FILE $file;
close(FILE);
open( F, "C:\\wireless\\DHCP_Utilization_test.csv" ) || die "can't opern file: $!";
my @file = <F>;
close(F);
my $line;
foreach $line (@file) {
chomp $line;
if ( $line =~ m/Wireless /g ) {
my ($ip, $rtr, $mask, $zip, $blc, $address, $city,
$state, $space, $country, $space2, $noc, $company, $extra,
$active, $used, $percent, $extra3, $nus, $construct
) = split( /,/, $line );
my $custom_directory = "C:\\wireless\\";
my $custom_filename = "wireless_DHCP.csv";
my $data = "$ip $mask $rtr $active $used $percent $nus $construct";
my $path = "$custom_directory\\$custom_filename";
open( my $handle, ">>", $path ) || die "can't open $path: $!";
binmode($handle); # for raw; else set the encoding
print $handle "$data\n";
close($handle) || die "can't close $path: $!";
}
}
I believe the problem is that you're on Windows: you save the file using :raw (because of the binmode call), and then reopen it using the default :crlf layer.
open( FILE, '>', $save . 'DHCP_Utilization_test.csv' ) or die $!;
binmode FILE;
print FILE $file;
close(FILE);
open( F, "C:\\wireless\\DHCP_Utilization_test.csv" ) || die "can't opern file: $!";
my @file = <F>;
close(F);
I therefore suspect that your @file array only contains one line for the entire file.
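If you want to confirm that theory before changing anything, a quick diagnostic sketch (my own addition, untested) is to count which line endings actually appear in the downloaded string before it is written out:

# Rough check of the line endings in the data returned by get()
my $crlf = () = $file =~ /\r\n/g;
my $lf   = () = $file =~ /(?<!\r)\n/g;
my $cr   = () = $file =~ /\r(?!\n)/g;
print STDERR "CRLF: $crlf, bare LF: $lf, bare CR: $cr\n";

If the data turns out to contain only bare CRs, readline has no \n to split on and returns the whole file as one line.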
You can probably also tighten your code to something like the following:
#!\Perl64\eg\perl
use strict;
use warnings;
use autodie;
use LWP::Simple;
my $url = 'http://dhcp_server.test.com/cgi-bin/dhcp_utilization_csv_region.pl?region=test';
my $datafile = "C:\\wireless\\DHCP_Utilization_test.csv";
my $wireless = "C:\\wireless\\wireless_DHCP.csv";
getstore( $url, $datafile );
open my $infh, '<', $datafile;
open my $outfh, '>>', $wireless;
while (<$infh>) {
chomp;
next unless /Wireless /;
my ($ip, $rtr, $mask, $zip, $blc, $address, $city,
$state, $space, $country, $space2, $noc, $company, $extra,
$active, $used, $percent, $extra3, $nus, $construct
) = split /,/;
print $outfh "$ip $mask $rtr $active $used $percent $nus $construct\n";
}
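One more detail worth noting (in both the original and the tightened version): wireless_DHCP.csv is opened with '>>', so every run appends to whatever is already there. If you would rather rebuild the file on each run, open it for truncation instead:

open my $outfh, '>', $wireless;   # '>' truncates, so each run starts with an empty file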

Can't write to the file

Why can't I write the output to the file?
It prints fine, but nothing is written to the file.
my $i;
my $regex = $ARGV[0];
for (@ARGV[1 .. $#ARGV]){
open (my $fh, "<", "$_") or die ("Can't open the file[$_] ");
$i++;
foreach (<$fh>){
open (my $file, '>>', '/results.txt') or die ("Can't open the file "); #input file
for (<$file>){
print "Given regexp: $regex\nfile$i:\n line $.: $1\n" if $_ =~ /\b($regex)\b/;
}
}
}
It's unclear whether your problem has been solved.
My best guess is that you want your program to search for the regex passed as the first parameter in the files named in the following parameters, appending the results to results.txt.
If that is right, then this is closer to what you need
use strict;
use warnings;
use autodie;
my $i;
my $regex = shift;
open my $out, '>>', 'results.txt';
for my $filename (@ARGV) {
open my $fh, '<', $filename;
++$i;
while (<$fh>) {
next unless /\b($regex)\b/;
print $out "Given regexp: $regex\n";
print $out "file$i:\n";
print $out "line $.: $1\n";
last;
}
}
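Assuming the program above is saved as, say, search.pl (the file name is my invention), it would be run like this:

perl search.pl 'some_word' file1.txt file2.txt
# matches are appended to results.txt in the current directory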

To replace a string and append a string in Perl in one file

I want to replace a line in my file, and after replacing it I want to append another line. As you can see here, I have to open and close the file twice. Can I do it by opening the file only once? Thanks
use strict;
use warnings;
open(FILE,"tmp1.txt") || die "Can't open file: $!";
undef $/;
my $file = <FILE>;
my @lines = <FILE>;
my @newlines;
foreach (@lines) {
$_ =~ s/hello/hi/g;
push(@newlines, $_);
}
close(FILE);
open(FILE, "> tmp1.txt ") || die "File not found";
print FILE @newlines;
close(FILE);
open(FILE,"tmp1.txt") || die "Can't open file: $!";
undef $/;
my $file = <FILE>;
my @lines = <FILE>;
my $first_line = "hi";
my $second_line = "sun";
my $insert = "good morning";
$file =~ s/\Q$first_line\E\n\Q$second_line\E/$first_line\n$insert\n$second_line/;
open(OUTPUT,"> tmp3.txt") || die "Can't open file: $!";
print OUTPUT $file;
close(OUTPUT);
Use the three-argument form of open, and open your file in read-write mode with +<.
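A minimal sketch of that suggestion (my own, untested), reusing tmp1.txt and the strings from your example: slurp the file through one +< handle, edit in memory, then rewind, write back, and truncate:

use strict;
use warnings;
open my $fh, '+<', 'tmp1.txt' or die "Can't open file: $!";
my @lines = <$fh>;                 # read the current contents
s/hello/hi/g for @lines;           # do the replacement in memory
# insert "good morning" between the "hi" line and the "sun" line
for my $i (0 .. $#lines - 1) {
    if ( $lines[$i] =~ /^hi$/ && $lines[$i + 1] =~ /^sun$/ ) {
        splice @lines, $i + 1, 0, "good morning\n";
        last;
    }
}
seek $fh, 0, 0;                    # rewind to the start of the file
print {$fh} @lines;                # write the edited contents back
truncate $fh, tell $fh;            # drop any leftover bytes from the old contents
close $fh;

The truncate call matters when the edited contents are shorter than the original; without it, stale bytes from the old file would remain at the end.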

Sort files by the dates in their names and create filehandles for each file in Perl

I have about 1000 files named like
CU20130404160033.TXT
CU20130405160027.TXT ....
CUYYYYMMDDHHMMSS.TXT
I need to append all the files into a single file, ordered by the dates in their names from min(date) to max(date).
How can I make the program efficient, sorting the files by their dates and creating the filehandles?
opendir (DIR, $Directory) or die $!;
@files = grep { (!/^\./) && -f "$Directory/$_" } readdir(DIR);
chdir($Directory);
#create an array of open filehandles.
@fh = map { open my $f, $_ or die "Cant open $_:$!"; $f } @files;
#create new file for output
open $out_file , ">$filename" or die "cant open new file $!";
Something like this should be helpful.
opendir (DIR, $dir);
@dir = readdir(DIR);
closedir(DIR);
#Sort file list by modification time and write to an output file
open(my $fh, ">", "output.txt") or die $!;
print $fh sort { -M "$dir/$b" <=> -M "$dir/$a" } @dir;
close $fh;
I propose this script:
use strict;
use warnings;
# 1. Set up the general values
my $directory = "... the dir ...";
my $out_file  = "... the out file ...";
# 2. Fill an array with the names of your files
my @files;
opendir(my $dh, $directory) or die $!;
while ( my $file = readdir($dh) ) {
    next if $file eq '.' or $file eq '..';
    push @files, $file;
}
closedir $dh;
# 3. Sort the array (the fixed-width timestamps in the names sort chronologically)
@files = sort { $a cmp $b } @files;
# 4. Open the target file
open my $ofh, '>', $out_file or die $!;
# 5. Iterate over each input file, open it,
#    and write its contents line by line to the target
foreach my $filename (@files) {
    open my $ifh, '<', "$directory/$filename" or die $!;
    while ( my $line = <$ifh> ) {
        print $ofh $line;
    }
    close $ifh;
}
# 6. Close the target
close $ofh;
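If you prefer to sort explicitly on the embedded date rather than relying on plain string order, a small variation (my own, untested) extracts the 14-digit timestamp from names like CU20130404160033.TXT and compares that instead:

# Sort on the timestamp embedded in the name ("//" needs Perl 5.10+)
@files = sort {
    my ($da) = $a =~ /^CU(\d{14})\.TXT$/i;
    my ($db) = $b =~ /^CU(\d{14})\.TXT$/i;
    ( $da // '' ) cmp ( $db // '' );
} @files;

For these fixed-width names the result is the same as the plain cmp sort above, but the intent is more obvious, and files that don't match the pattern sort first instead of causing warnings.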

Read an input file, match and remove data, and write the remaining lines to a new file

I am stuck trying to get this to write out the contents of the file. What I am trying to do is open an input file, filter out/remove the matched lines, and write the rest to a new file. Can someone show me how to do this properly? Thanks.
use strict;
use warnings;
use Text::CSV_XS;
my $csv = Text::CSV_XS->new ({ binary => 1 }) or
die "Cannot use CSV: ".Text::CSV_XS->error_diag ();
open my $fh, "<:encoding(UTF-16LE)", "InputFile.txt" or die "cannot open file: $!";
my @rows;
while (my $row = $csv->getline ($fh)) {
my @lines;
shift @lines if $row->[0] =~ m/Global/;
my $newfile = "NewFile.txt";
open(my $newfh, '>', $newfile) or die "Can't open";
print $newfh @lines;
}
$csv->eof or $csv->error_diag ();
close $fh;
Open the output file outside of the loop. As you read each line, decide if you want to keep it. If yes, write to output file. If not, don't do anything.
Something like the following (untested):
use strict;
use warnings;
use Text::CSV_XS;
my ($input_file, $output_file) = qw(InputFile.txt NewFile.txt);
my $csv = Text::CSV_XS->new ({ binary => 1 })
or die sprintf("Cannot use CSV: %s\n", Text::CSV_XS->error_diag);
open my $infh, "<:encoding(UTF-16LE)", $input_file
or die "Cannot open '$input_file': $!";
open my $outfh, '>', $output_file
or die "Cannot open '$output_file': $!";
while (my $row = $csv->getline($infh)) {
next if $row->[0] =~ m/Global/;
unless ( $csv->print($outfh, $row) ) {
die sprintf("Error writing to '%s': %s",
$output_file,
$csv->error_diag
);
}
}
close $outfh
or die "Cannot close '$output_file': $!";
close $infh
or die "Cannot close '$input_file': $!";
$csv->eof
or die "Processing of '$input_file' terminated prematurely";