Unzip a file from several large compressed folders from the command line - PowerShell

I have several large zipped folders in my cloud storage drive. I want to transfer a specific file from each of the zipped folders to my local hard drive (I can't copy all of them since I don't have enough space). Is there a way to do this using the command line/cmd or PowerShell? I am using Windows 10 (Build 18362).
The file name is the same in every archive, so I was hoping I could write a loop to do this.
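One way is to open each archive with the .NET ZipFile class (available to Windows PowerShell 5.1 on Windows 10) and extract only the matching entry. A minimal sketch, assuming the synced zips sit in D:\CloudDrive and the wanted file is report.csv (both are placeholders, substitute your own):

Add-Type -AssemblyName System.IO.Compression.FileSystem

$zipFolder = 'D:\CloudDrive'   # hypothetical folder holding the zips
$fileName  = 'report.csv'      # hypothetical name of the file to extract
$dest      = 'C:\Extracted'
New-Item -ItemType Directory -Path $dest -Force | Out-Null

Get-ChildItem -Path $zipFolder -Filter *.zip | ForEach-Object {
    $zip = [System.IO.Compression.ZipFile]::OpenRead($_.FullName)
    try {
        # Find the first entry with the wanted name, anywhere in the archive
        $entry = $zip.Entries | Where-Object { $_.Name -eq $fileName } | Select-Object -First 1
        if ($entry) {
            # Prefix with the archive name so the copies don't overwrite each other
            $target = Join-Path $dest "$($_.BaseName)_$fileName"
            [System.IO.Compression.ZipFileExtensions]::ExtractToFile($entry, $target, $true)
        }
    }
    finally {
        $zip.Dispose()
    }
}

Note that if the cloud drive only syncs files on demand, opening an archive may still force a download of the whole zip; the extraction itself, though, writes only the one file locally.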

Related

Strange behavior of some zip files while extracting (7zip)

I have a problem with unzipping files. Usually I use a PowerShell script that I run from a SQL Server database (stored procedure), and it unzips my files the way I want (the database shouldn't be an issue).
Main part of script:
exec '"for %i in ("'+@path+'\*.zip") do "C:\Program Files\7-Zip\7z.exe" x "%i" -o"'+@to+'""'
But once in a while there is a .zip file that, when run through this script, extracts files with badly encoded names. To prevent that, I have to manually open each .zip file and click Extract in the 7-Zip GUI, which produces the file names correctly.
This manual way of doing things is very, very slow, because it needs to be done on a server, and God forbid I am on VPN at home. So let's say I have 5 zip files (3 GB): doing it manually takes me hours, while running the script on the server takes about 2 minutes.
So I am just trying to find out how to unzip these with the script. What is the difference between my script and manual extraction?
The root cause is likely files that were created with a setup that doesn't match the codepage of your Windows environment. Are any of the zip files publicly available? If so, can you provide a link?
To deal with these problem files, you need to know what encoding was used for the filenames. If the 7-Zip GUI can handle these files automatically, there must be an option to tell it what encoding to use.
Once you know the encoding, you can pass the matching codepage to 7z with the -mcp switch so the filenames are decoded correctly.
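For example, a minimal PowerShell sketch of the same extraction loop with the codepage forced; the paths and code page 852 below are placeholders, substitute the encoding the problem archives were actually created with:

Get-ChildItem 'C:\data\*.zip' | ForEach-Object {
    # -mcp tells 7-Zip which code page the archived filenames use
    & 'C:\Program Files\7-Zip\7z.exe' x $_.FullName '-oC:\out' '-mcp=852'
}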
Modern zip files don't have this issue because they store filenames in UTF-8.

Is it possible to run Perl in Google Drive

I have uploaded my entire Perl directory to Google Drive, including perl.exe, /lib, Perl scripts, and data files.
Is it possible to run perl.exe on the Perl scripts, using the data files, within Google Drive?
If so, where can I find out how to do it?
Google Drive is a file storage system.
It is not a server that can run applications.

How to download the latest files from S3 to a local folder using a PowerShell script, including a file corruption check

I need to download the latest files from an AWS S3 bucket using a PowerShell script, and I also need to know how to detect file corruption while downloading.
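A minimal sketch using the AWS Tools for PowerShell module, with some stated assumptions: your AWS credentials are already configured, "latest" means modified within the last day, and the objects were uploaded in a single part (multipart ETags are not plain MD5 hashes, so the checksum comparison below would not apply to them). The bucket name and paths are placeholders:

Import-Module AWS.Tools.S3

$bucket = 'my-bucket'            # hypothetical bucket name
$dest   = 'C:\Downloads\s3'
$cutoff = (Get-Date).AddDays(-1) # treat 'latest' as modified in the last day

Get-S3Object -BucketName $bucket |
    Where-Object { $_.LastModified -gt $cutoff } |
    ForEach-Object {
        $local = Join-Path $dest ($_.Key -replace '/', '\')
        New-Item -ItemType Directory -Path (Split-Path $local -Parent) -Force | Out-Null
        Read-S3Object -BucketName $bucket -Key $_.Key -File $local | Out-Null

        # Corruption check: compare the local MD5 against the object's ETag
        $md5 = (Get-FileHash -Path $local -Algorithm MD5).Hash.ToLower()
        if ($md5 -ne $_.ETag.Trim('"')) {
            Write-Warning "Checksum mismatch for $($_.Key); download it again"
        }
    }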

Copy Directories Recursively From FTP Server Using Perl

I need to write a Perl script that has to log in to an FTP server and download all the sub-directories and their contents to the local machine. The version of Perl on the FTP server is 5.8.8 and I can't upgrade it. One method is to create the directories on the local machine and then copy each file. I was wondering if there is any command to copy a directory and its contents. Is it possible to "tar" the directory to save space?
Thanks,
Amit.
There is Net::FTP::Recursive. I haven't tried it but it seems to fit your requirements.
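A minimal usage sketch, with placeholder host, credentials, and paths (untested, based on the module's documented interface; it subclasses Net::FTP, so it should work on Perl 5.8.8 once installed):

use strict;
use warnings;
use Net::FTP::Recursive;

# Connect and log in (placeholder host and credentials)
my $ftp = Net::FTP::Recursive->new('ftp.example.com')
    or die "Cannot connect: $@";
$ftp->login('user', 'password') or die 'Login failed: ' . $ftp->message;

# Mirror the remote tree into the current local directory
$ftp->cwd('/remote/dir') or die 'cwd failed: ' . $ftp->message;
chdir '/local/dir' or die "chdir failed: $!";
$ftp->rget();    # recursive get: recreates sub-directories locally

$ftp->quit;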

Locate compressed files on servers

I would like to create a PowerShell script that generates a report showing all compressed files/folders on remote servers. By compressed files I mean files compressed with the built-in Windows (NTFS) compression, not zip archives. But I am having a hard time figuring out how to locate the compressed files. Should I go with WMI, or is there a better way?
Thanks
Frank
The FileInfo/DirectoryInfo classes from .NET (all of this is easily available to PowerShell) will give you the file attributes, which include the Compressed attribute when a file is NTFS-compressed.
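A minimal PowerShell sketch of that approach, using placeholder server names and the administrative c$ share:

$servers = 'SERVER01', 'SERVER02'      # hypothetical server names

foreach ($server in $servers) {
    Get-ChildItem -Path "\\$server\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Where-Object { $_.Attributes -band [IO.FileAttributes]::Compressed } |
        Select-Object @{n='Server';e={$server}}, FullName, Length |
        Export-Csv -Path "compressed_$server.csv" -NoTypeInformation
}

Walking an entire drive over a UNC path is slow; running the same Get-ChildItem filter locally on each server via Invoke-Command would scale better if PowerShell remoting is enabled.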