I am trying to list a file on a remote host using the Net::FTP Perl module. Later I want to download it with the get command.
Is it possible to list a particular file using the ls command?
The syntax I am using is:
$remote_dir = "/home/user/test_dir/";
$ftp->cwd($remote_dir);
$file_name = "test.txt";
print $ftp->ls($file_name);
This file exists on the remote server; I verified that manually. But the listing does not happen.
Is $ftp->ls() only for listing directory contents, or can it also list a particular file in a directory, as I tried above?
Is $ftp->ls() only for listing directory contents, or can it be used to list a particular file in a directory?
From the documentation:
ls ( [ DIR ] )
Get a directory listing of DIR, or the current directory.
Thus, it is clearly documented to list a directory, not a specific regular file.
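Since ls is documented to operate on directories, one workaround is to list the directory and filter for the file name yourself. A minimal sketch in Python with ftplib (used here as a stand-in for Net::FTP; the host and credentials are placeholders):

```python
from ftplib import FTP

def names_from_listing(entries):
    """Normalize a directory listing: some servers return full paths,
    others bare file names, so keep only the last path component."""
    return [entry.rsplit("/", 1)[-1] for entry in entries]

def remote_file_exists(ftp, remote_dir, file_name):
    """List the directory (what ls is documented to do), then test membership."""
    return file_name in names_from_listing(ftp.nlst(remote_dir))

# Usage sketch (host and credentials are placeholders):
# ftp = FTP("ftp.example.com")
# ftp.login("user", "password")
# print(remote_file_exists(ftp, "/home/user/test_dir", "test.txt"))
```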
Related
I'm using the command ls *Pattern*Date* Files.txt to get a list of files from FTP into a local text file.
I'm now required to get multiple patterned files (the Date and Pattern may come in a different order). But when I add another line, ls *Date*Pattern* Files.txt, it overwrites Files.txt and I lose the first set of files.
Is there a command that appends to the file list rather than creating a new one?
You cannot append a listing to a file with ftp.
But you can merge multiple listings in PowerShell. I assume you run ftp from PowerShell, based on your use of the powershell tag.
In ftp script do:
ls *Pattern*Date* files1.txt
ls *Date*Pattern* files2.txt
And then in PowerShell do:
Get-Content files1.txt,files2.txt | Set-Content files.txt
(based on How do I concatenate two text files in PowerShell?)
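For comparison, the same merge can be sketched in Python (the file names are the same placeholders as above); it reads each listing in order and writes one combined file, like Get-Content | Set-Content:

```python
from pathlib import Path

def merge_listings(parts, merged):
    """Concatenate several listing files, in order, into one output file."""
    combined = "".join(Path(part).read_text() for part in parts)
    Path(merged).write_text(combined)
    return combined
```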
How would it be possible to get a list with all directories inside a specific directory of a Cloud Storage bucket using an App Engine application in PHP?
You can try two approaches to list all the directories inside a specific directory in a bucket:
Use the code snippet from the PHP docs samples on the Google Cloud Platform GitHub and modify it so that the list_objects_with_prefix function also passes a delimiter, not only a prefix. I have written such a function in Python in this SO topic; you can use it as a reference. Here the prefix needs to be the name of the parent directory, e.g. 'my_directory/', and the delimiter is simply '/', indicating that we want to end our search at elements ending with '/' (hence, directories).
Use the gsutil ls command to list objects in a directory from within PHP. You will need the shell_exec function:
$execCommand = "gsutil ls gs://bucket";
$output = shell_exec($execCommand);
$output will be a string in this case, and it will also contain file names if any are present in the parent directory.
This SO topic might also be informative; there the question was to list the whole directory (files included).
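The prefix/delimiter behaviour described in the first approach can be emulated over a plain list of object names, without any Cloud Storage client. This pure-Python sketch shows why delimiter '/' yields exactly the "subdirectories" under a prefix:

```python
def list_directories(object_names, prefix="", delimiter="/"):
    """Emulate a prefix+delimiter listing: return the 'subdirectories'
    directly under prefix, derived from flat object names."""
    dirs = set()
    for name in object_names:
        if not name.startswith(prefix):
            continue
        rest = name[len(prefix):]
        # Anything with a delimiter after the prefix lives in a subdirectory
        if delimiter in rest:
            dirs.add(prefix + rest.split(delimiter, 1)[0] + delimiter)
    return sorted(dirs)
```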
Okay, so I want to know how I would go about this: using grep to locate .txt files named "cocacola1", "cocacola2", and "cocacola3", and then copying them to another directory. That is, searching for files named "cocacola", even if the file name contains other characters, and then copying them to another directory/location.
You can just use Unix find. Assuming the files you're searching for are in 'source' and you want to copy them to 'destination':
find source -name '*cocacola*' -exec cp {} destination \;
I put the wildcard '*' before and after cocacola since you said other characters might exist in the file name.
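If find is not available, the same search-and-copy can be sketched in Python ('source' and 'destination' are the same placeholders as above):

```python
import shutil
from pathlib import Path

def copy_matching(source, destination, pattern="*cocacola*"):
    """Recursively copy files whose names contain 'cocacola',
    like find source -name '*cocacola*' -exec cp {} destination \\;"""
    dest = Path(destination)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for path in Path(source).rglob(pattern):
        if path.is_file():
            shutil.copy2(path, dest / path.name)
            copied.append(path.name)
    return sorted(copied)
```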
I am using the Net::SFTP::Foreign module to connect to the SFTP server, and I can make the connection successfully.
I would like to walk every directory and subdirectory on the SFTP server to fetch some files. Is that possible?
And is there any way to differentiate a file from a directory using this module?
Use the find method to locate the entries, then use the get method to download them. From the documentation:
$sftp->find($path, %opts)
$sftp->find(\@paths, %opts)
Does a recursive search over the given directory $path (or directories @paths) and returns a list of the entries found, or the total number of them in scalar context.
Every entry is a reference to a hash with two keys: filename, the full path of the entry; and a, a Net::SFTP::Foreign::Attributes object containing the file's atime, mtime, permissions and size.
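On the file-vs-directory question: the permissions field of those attributes carries Unix mode bits, so a directory can be recognized from its mode. A sketch of that check in Python (stat is the standard mode-bit helper; the entry list here is illustrative):

```python
import stat

def is_directory(mode):
    """A directory is identified by the S_IFDIR bits of its permissions/mode."""
    return stat.S_ISDIR(mode)

def split_entries(entries):
    """Partition (name, mode) pairs into directories and regular files."""
    dirs = [name for name, mode in entries if is_directory(mode)]
    files = [name for name, mode in entries if not is_directory(mode)]
    return dirs, files
```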
Suppose I have a directory structure like
C:\Users\Desktop\abc\d
I want to rar archive the abc folder so that the structure of rar is:
abc\d
When I try to archive using PowerShell, WinRAR replicates the full path inside the archive, like:
\Users\Desktop\abc\d
I don't want the full path to be created inside the archive.
Here's the script:
https://gist.github.com/saurabhwahile/50f1091fb29c2bb327b7
What am I doing wrong?
Use the command line:
Rar.exe a -r -ep1 Test.rar "C:\Users\Desktop\abc"
Rar.exe is the console version of WinRAR, stored in the same directory as WinRAR.exe. You can also use this command line with WinRAR.exe if you want to watch the compression process in a graphical window.
a is the command and means add files to archive.
-r is a switch to recursively add all files and subdirectories including empty subdirectories to the archive.
-ep1 is another switch, which results in excluding the base directory from the stored names.
For this command line the base directory is "C:\Users\Desktop\" and therefore the created archive Test.rar contains only abc and all files and subdirectories in directory abc which is what you want.
One more hint: Using the command line
Rar.exe a -r -ep1 Test.rar "C:\Users\Desktop\abc\"
results in all files and subdirectories of directory abc being added to the archive, but without the directory name abc itself being stored in the archive. The trailing backslash makes this difference.
In other words: with switch -ep1, everything up to the last backslash in the file/directory specification is not added to the archive.
For more information about available switches see the text file Rar.txt in the program files directory of WinRAR.
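That -ep1 rule ("everything up to the last backslash is not stored") can be illustrated with a small sketch; it also shows why the trailing backslash changes what gets stored. This emulates the naming behaviour only, not the archiver itself:

```python
def stored_path(file_path, archive_arg):
    """Emulate -ep1: drop everything up to and including the last backslash
    of the file/directory argument given on the Rar.exe command line."""
    base = archive_arg.rsplit("\\", 1)[0] + "\\"
    return file_path[len(base):] if file_path.startswith(base) else file_path
```

With the argument "C:\Users\Desktop\abc" the stored names begin with abc\; with "C:\Users\Desktop\abc\" the abc prefix itself is dropped.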