Perl script to read an Excel file using CPAN modules

I am writing a Perl script to read data from an Excel file. The script is written and run in a Unix environment on the server, whereas the Excel file is on my desktop in Windows.
#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';
use Spreadsheet::Read;
my $workbook = ReadData ("C:/Users/tej/Desktop/Work.xlsx");
say $workbook->[1]{A1};
The output gives out a warning saying
Use of uninitialized value in say at..... line 10
And there is no other output printed. I just wrote a sample to read the A1 cell value from sheet 1; later I need to write logic to read particular values. For now, I need to fix this error so the script can read and print the Excel cell values. I'd appreciate any help. :)

I fixed the issue: the file was not accessible from the server, so I used Samba to map the Unix disk as a Windows network drive. But now I get a different error, which says: Parser for XLSX is not installed at.. Can someone help me resolve it?
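That error usually means Spreadsheet::Read has no backend for .xlsx files: it delegates .xlsx parsing to Spreadsheet::ParseXLSX, so installing that module from CPAN (e.g. `cpan Spreadsheet::ParseXLSX`) normally resolves it. A minimal sketch, assuming that module is installed and the Samba share is mounted at a placeholder path on the server:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';
use Spreadsheet::Read;    # needs Spreadsheet::ParseXLSX installed for .xlsx files

# The path below is a placeholder for wherever the Samba-mapped share is
# mounted on the Unix server; "C:/Users/..." is not visible from there.
my $workbook = ReadData('/mnt/share/Work.xlsx')
    or die "could not read workbook: $@";
say $workbook->[1]{A1} // 'cell A1 is empty';
```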

Related

Reading Xlsx from another Xlsx file

I have a few .xlsx files, say X.xlsx, Y.xlsx, and Z.xlsx, and I have embedded those three files in another workbook, say A.xlsx. Now I want to read the content of the three embedded files (X, Y, Z) through A.xlsx.
Can anyone help me with this?
Thanks in advance
This is easy on Windows if your target machine also has Microsoft Excel installed.
Use the Win32::OLE module to create an instance of Excel, open your master file A.xlsx, and then iterate over the OLEObjects collection of the worksheet that holds the embedded files:
#!perl
use strict;
use warnings;
use Win32::OLE 'in';

my $ex = Win32::OLE->new('Excel.Application') or die "oops\n";
my $Axlsx = $ex->Workbooks->Open('C:\\Path\\To\\A.xlsx');
my $i = 0;
# OLEObjects belongs to a worksheet, so walk the sheet holding the embedded files
for my $embedded (in $Axlsx->Worksheets(1)->OLEObjects) {
    $embedded->Object->Activate();
    $embedded->Object->SaveAs("test" . $i++ . ".xlsx");   # $i++ must be outside the string
    $embedded->Object->Close;
}
After saving them, you can treat them as normal Excel files. Alternatively, you can work directly with $embedded->Object, but as you haven't told us what exactly you need to do, it's hard to give specific advice.
See also Save as an Excel file embedded in another Excel file

Error with Term::ReadKey

I’m trying to run a script that checks genotyped data for imputation with HRC or 1000G using an imputation server; it can be found here. It is Perl-based and loads these packages/libraries:
use strict;
use warnings;
use File::Basename;
use Getopt::Long;
use IO::Uncompress::Gunzip qw(gunzip $GunzipError);
use Term::ReadKey qw/ GetTerminalSize /;
However, it throws an error Unable to get Terminal Size. The TIOCGWINSZ ioctl didn't work. The COLUMNS and LINES environment variables didn't work. The resize program didn't work. at /usr/lib64/perl5/vendor_perl/Term/ReadKey.pm line 362.
How do I solve this?
Ah. So the question is sort of solved. It turns out the script expects to be run in a terminal that stays open; that is to say, it is not designed to run under a submission system, which is what I was doing. I do need to run it through a submission system, because the computer cluster doesn't allow big jobs to run in an interactive terminal, and the data is just too big to sit waiting for the program to finish. Submitting it to a more powerful compute node makes more sense.
The developer of the script (the HRC or 1000G imputation preparation and checking tool) has kindly provided a version that does not require a terminal. He will put it online and/or provide it via email.
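One workaround worth noting (an assumption drawn from the error message itself, not from the thread): when there is no controlling terminal, the TIOCGWINSZ ioctl fails and Term::ReadKey falls back to the COLUMNS and LINES environment variables, so giving those sensible defaults before GetTerminalSize() runs can let the original script survive in a batch job:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Workaround sketch: under a batch scheduler there is no controlling
# terminal, so the TIOCGWINSZ ioctl fails. Term::ReadKey then falls back
# to the COLUMNS/LINES environment variables; defaulting them here gives
# GetTerminalSize() something to report. 80x24 are assumed defaults.
BEGIN {
    $ENV{COLUMNS} ||= 80;
    $ENV{LINES}   ||= 24;
}

# ... the rest of the checking script would follow here ...
print "fallback terminal size: $ENV{COLUMNS}x$ENV{LINES}\n";
```

Alternatively, export COLUMNS and LINES in the job submission script itself, before invoking perl.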
solved

How to find the age of the file in SFTP using perl?

I am connecting to an SFTP server and downloading a file using Perl. I want to download only a file that was created/modified an hour ago.
Below is the code snippet.
use strict;
use warnings;
use Net::SFTP::Foreign;

my $sftp_conn = Net::SFTP::Foreign->new('test.sftp.com', user => 'test', password => 'test123');
my $a1 = Net::SFTP::Foreign::Attributes->new();
my $a2 = $sftp_conn->stat('/inbox/tested.txt')
    or die "remote stat command failed: " . $sftp_conn->status;
$sftp_conn->get("/inbox/tested.txt", "/tmp");
Here I want to check when the file was last modified and calculate its age in hours.
You are on the right track. Calling ->stat on the connection object returns a Net::SFTP::Foreign::Attributes object. You can then call ->mtime on it to get the modification time.
my $attr = $sftp_conn->stat('/inbox/tested.txt')
or die "remote stat command failed: ".$sftp_conn->status;
print $attr->mtime;
There is no need to create an empty attributes object first, so you can delete the following line. You probably copied it from the SYNOPSIS in the docs, but that's just showing different ways of using the module.
my $a1 = Net::SFTP::Foreign::Attributes->new();
The mtime is a Unix timestamp: the SFTP protocol transfers it as seconds since the epoch, so you can compare it directly to time. A file modified at least an hour ago satisfies $attr->mtime <= time - 3600. If you ever need to handle a formatted date string instead, Time::Piece is a useful module that comes with the Perl core for parsing it.
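A sketch of the age check, assuming ->mtime returns a Unix epoch timestamp (the $mtime value below is a stand-in for $attr->mtime, so the snippet is self-contained):

```perl
use strict;
use warnings;

# Compute a file's age in hours from its mtime.
# $mtime is a stand-in here; in the real script it would be $attr->mtime.
my $mtime     = time - 7200;                # pretend the file is two hours old
my $age_hours = (time - $mtime) / 3600;

if ($age_hours >= 1) {
    print "file is at least an hour old - download it\n";
}
```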

Storing output of perl script to a file

I"m calling a perl script stored in one PC, for example with name machine1 from say machine2 using the command:
system("perl /CC/builds/123.pl\n");
Now, i need to get the log of the whole perl file executed to be stored in a say 123.txt file and created on machine1.Can any text file be opened in the perl file at the beginning which stored only output of the line executed?
Please help.
Thanks,
Ramki
I'm quite new at this and I'm not sure I understand your question, but why don't you use backticks instead of system? They let you store the output in a variable, and then you can do whatever you want with it.
About backticks: What's the difference between Perl's backticks, system, and exec?
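A sketch of that approach (the command below is a stand-in so the snippet is self-contained; in the question it would be perl /CC/builds/123.pl, and the log path would be a location on machine1):

```perl
use strict;
use warnings;

# Capture the command's output with backticks and write it to a log file.
# $^X is the perl interpreter currently running; the -e one-liner is a
# stand-in for the real "perl /CC/builds/123.pl" command from the question.
my $cmd    = qq{$^X -e "print qq(hello from the script\\n)"};
my $output = `$cmd`;
die "command failed with status $?" if $? != 0;

open my $log, '>', 'script_output.log' or die "cannot open log file: $!";
print {$log} $output;
close $log or die "cannot close log file: $!";
```

Note that backticks capture only STDOUT; to capture STDERR as well, append `2>&1` to the command.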

How can I read a continuously updating log file in Perl?

I have an application generating logs every 5 seconds. The logs are in the format below.
11:13:49.250,interface,0,RX,0
11:13:49.250,interface,0,TX,0
11:13:49.250,interface,1,close,0
11:13:49.250,interface,4,error,593
11:13:49.250,interface,4,idle,2994215
and so on for other interfaces...
I am working to convert these into the CSV format below:
Time,interface.RX,interface.TX,interface.close....
11:13:49,0,0,0,....
Simple so far, but the problem is that I have to produce the CSV online, i.e. as soon as the log file is updated, the CSV should be updated too.
What I have tried, to read the output and build the header:
#!/usr/bin/perl -w
use strict;
use File::Tail;

my $head     = ["Time"];
my $pos      = {};
my $last_pos = 0;

my $file = shift;
$file = File::Tail->new($file);
while (defined($_ = $file->read)) {
    # next if $_ =~ ...;   # placeholder for some filters
    my ($time, $interface, $count, $eve, $value) = split /[,\n]/, $_;
    my $key = $interface . "." . $eve;
    if (not defined $pos->{$key}) {
        $last_pos += 1;
        $pos->{$key} = $last_pos;
        push @$head, $key;
    }
    print join(",", @$head) . "\n";
}
Is there any way to do this using Perl?
Module Text::CSV will allow you to both read and write CSV format files. Text::CSV will internally use Text::CSV_XS if it's installed, or it will fall back to using Text::CSV_PP (thanks to Brad Gilbert for improving this explanation).
Grouping the related rows together is something you will have to do; it is not clear from your example how the source data maps to the output.
Making sure that the CSV output is updated is primarily a question of ensuring that you have the output file line buffered.
As David M suggested, perhaps you should look at the File::Tail module to deal with the continuous reading aspect of the problem. That should allow you to continually read from the input log file.
You can then use the 'parse' method in Text::CSV to split up the read line, and the 'print' method to format the output. How you combine the information from the various input lines to create an output line is a mystery to me - I cannot see how the logic works from the example you give. However, I assume you know what you need to do, and these tools will give you the mechanisms you need to handle the data.
No-one can do much more to spoon-feed you the answer. You are going to have to do some thinking for yourself. You will have a file handle that can be continuously read via File::Tail; you will have a CSV structure for reading the data lines; you will probably have another CSV structure for the written output; you will have an output file handle that you ensure is flushed every time you write. Connecting these dots is now your problem.
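The pivot itself can be sketched in core Perl (the sample lines are taken from the question; in the real script File::Tail would supply them continuously, and Text::CSV would replace the bare split/join for robust quoting):

```perl
use strict;
use warnings;

# Pivot "time,interface,index,event,value" lines into one header row and
# one data row per timestamp, with a column per interface.event key.
my @lines = (
    "11:13:49.250,interface,0,RX,0",
    "11:13:49.250,interface,0,TX,0",
    "11:13:49.250,interface,1,close,0",
);

my @head = ("Time");
my (%row, $time);
for my $line (@lines) {
    my ($t, $interface, $idx, $eve, $value) = split /,/, $line;
    $time = $t;
    my $key = "$interface.$eve";    # append ".$idx" if the same event can
                                    # occur on several interface indexes
    push @head, $key unless exists $row{$key};
    $row{$key} = $value;
}
$time =~ s/\.\d+\z//;               # 11:13:49.250 -> 11:13:49, as in the example
print join(",", @head), "\n";
print join(",", $time, map { $row{$_} } @head[1 .. $#head]), "\n";
```

For the example lines this prints a `Time,interface.RX,interface.TX,interface.close` header followed by `11:13:49,0,0,0`; in the streaming version you would print the header once and flush the output handle (e.g. `$| = 1`) after each data row.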